Sunday, December 10, 2017

Linguistic subtlety and grammar

In my previous post, I talked about the importance of proper grammar, punctuation, and spelling—trying to recognize the great value of a good command of these conventions, which facilitate communication, while also trying to keep the focus on the ideas, not the formalities. Basically, the argument was that errors in convention don’t matter if they don’t interfere with the communication of ideas (and that people who complain about errors in punctuation and grammar are often annoying when they emphasize minor grammatical points at the expense of the ideas being communicated).

Language is, or at least can be, extremely subtle in expressing significant difference, and the attention of the reader would be well spent exploring the subtleties, where the important difficulties lie, rather than attending to conventions.  

To reiterate the importance of punctuation, grammar, and spelling, it should be noted that the conventions themselves contribute to the subtlety—presence or absence of a comma can often have a significant influence on the meaning of a sentence, for example.  

But there are also times when the crucial questions are not problems with grammar, but rather small linguistic differences that carry significant differences in ideas. The Roman Catholic Church has recently moved to alter the Lord’s Prayer. The change is linguistically minor, from one phrase that is grammatically sound to another phrase that is also grammatically sound.

The traditional English phrasing was “lead us not into temptation,” and the new recommended phrasing is “do not let us fall into temptation.” The conceptual difference of interest to the Catholic Church is the difference, roughly, between the Pied Piper and a lifeguard—the difference between actively luring people and aiding only when people go too deep (metaphorically). 

For me, such linguistic differences and their influence on the concepts being described are harder to notice when I’m focusing my attention on grammatical issues. And I definitely notice that people who spend their time correcting grammar, and proving how well they know grammar, often miss the point of what is written. I remember once seeing a professional writer comment on the difference between US and UK conventions regarding the use of the words “that” and “which,” and someone responding, “the rule is easy: here’s how you use ‘that’ and ‘which’…” Yup, you’re real proud that you know that grammatical “rule,” but you totally missed the point: that “rule” isn’t actually a rule, but rather is specific to the US context. (Actually, even in the US context, that “rule” is often viewed as a suggestion; see what The Elements of Style has to say about “that” and “which.”)


In short, the important stuff in writing isn’t the grammar. The ideas are what matter; grammar is only important as a tool to help communicate. And people who focus on grammar and miss the actual ideas are annoying.

Monday, December 4, 2017

On the importance of proper grammar

I was reading a book recently that quoted the 18th-century scientist Sir Joseph Banks and, commenting on Banks’s spelling, said in a footnote: “Despite his expensive education, [he] had managed somehow to avoid the basics. His disdain for grammar, spelling, and punctuation give his writings a magnificent immediacy.” The footnote, to me, characterizes an attitude that I find quite annoying. (I’m not going to give a source because I have no interest in criticizing the author, only the way that grammar is approached.)

As an editor, I value proper grammar and style highly. Despite the high value I place on grammar, I am tempted to correct it as little as possible. To me, grammar serves a larger purpose: a writer wants and needs to communicate clearly, and proper spelling, punctuation, and grammar aid in clear communication. When I read as an editor, my major concern—often my only concern—is with the ideas that the author is trying to communicate: what are they? Are they clear? Can I understand what the author is trying to accomplish? Can I paraphrase the author’s purpose in a way that would satisfy the author and also encompass all the issues that I see in the work?

The attitude that annoys me is when people just cannot stop themselves from correcting other people’s grammar. As a matter of my job as an editor, of course, I am often called on to fix people’s grammar. But some people just want to fix other people’s grammar for no particular purpose, except, possibly, to boost their own ego by proving that they know the rules of writing better than others. This is especially annoying when an individual complains about a supposed grammatical error that isn’t really an error.

This is very much the case for the footnote I quote above: the author hides his disdain for Banks behind the backhanded compliment of a “magnificent immediacy,” his claim that Banks “avoid[ed] the basics,” and his attribution of a “disdain” for grammar, spelling, etc. Banks’s writings were bestsellers; they were works of great influence, largely responsible for Banks’s elevation to the prestigious position of president of The Royal Society (The President, Council and Fellows of the Royal Society of London for Improving Natural Knowledge), the English scientific organization, a position he held for over 40 years. To criticize their grammar seems profoundly irrelevant, even if we don’t take into account the historical linguistic context.

In criticizing Banks, the author does not take into account the history of the English language. Written English is guided by convention, and in some cases by specific style guidelines. But in the middle of the 18th century, those conventions were not set in stone. There was, in fact, great variation in the spelling used by different authors. In 1754, the Earl of Stanhope complained that it was “a sort of disgrace to our nation, that hitherto we have had no… standard of our language; our dictionaries at present being more properly what our neighbors the Dutch and the Germans call theirs, word-books, than dictionaries in the superior sense of that title.” Samuel Johnson’s dictionary was first published the next year, but it had hardly established a uniform convention by the time Banks was writing during his 1768–1771 voyage with Captain James Cook. The fluidity of English at that point in time is evident in Laurence Sterne’s The Life and Opinions of Tristram Shandy, Gentleman, which was published between 1759 and 1767 and uses quite unconventional English. Sterne, too, was a bestseller, and he is still regarded as a significant figure in the history of English literature.
The less settled the conventions, the less appropriate it is to complain about someone diverging from them. To anyone who knows a little of the history of English usage, the footnote disparaging Banks’s writing exemplifies the misplaced interest in grammar that I find annoying.

Today, the English language has far more settled conventions than it did in the middle of the 18th century, but even today, there are people who want to correct when they just shouldn’t. The obvious example is people who correct other people’s grammar in webpage comments: really, who cares whether the commenter made a grammatical error? If the grammar is so bad that the thought is incoherent, sure. And if you want to insult someone, sure, pick on their grammar (it’s not egregious to complain about the grammar of the common “your an idiot/moron” or “your stupid”).

As a consulting editor who works with graduate students, I get particularly annoyed with professors who spend their time focusing on grammar when, in my opinion, they should be focused on the ideas. Yes, it is within the purview of a professor to correct grammar, but the primary job of a professor is to teach higher-level subject matter.


Proper grammar and punctuation and spelling help a writer communicate to an audience. They are crucial tools in communicating. But the idea is the important part. If the idea comes through clearly, then any individual grammatical error is essentially irrelevant with respect to the larger purpose of the written work—or at least, it seems that way to me.  

Thursday, November 30, 2017

Benching Eli Manning

I generally try to write things that can at least provide some reflection on the issues of scholarly writing, but this doesn't have that. I grew up in New York, and as long as I’ve been a football fan, I’ve rooted for the Giants. I’m not as enthusiastic a fan as I was when younger, but the Super Bowl wins in the 2007 and 2011 seasons brought me a good deal of pleasure. Eli Manning, of course, was crucial in those Super Bowl games, making great plays when the game was on the line.

Eli Manning is the best quarterback in team history and may go into the Hall of Fame. He has started 210 consecutive games—the second-longest such streak in NFL history (edging past his brother Peyton, who started 209 consecutive games). He’s still playing about as well as he always has, although his stats are down because the rest of the team is not playing well.

Manning is getting benched for the next game, breaking his streak. The move has generally been panned, with lots of people saying it’s a bad decision, and many insisting that it’s the end of Manning’s career with the Giants. Maybe it is, but I agree with the decision to bench Manning at present—though my reasoning is not, apparently, identical to that of the Giants’ coach.

Coach McAdoo has decided to start Geno Smith. This is not a good decision, in my opinion. Geno Smith is not as good an NFL quarterback as Eli. And we’ve seen a good deal of Geno—30 NFL starts, over 850 NFL pass attempts. Geno is not some dude who has barely had a chance. Geno has had years of opportunity to impress. Maybe Geno’s coaches have all been wrong, but who has Geno Smith impressed? This is his fifth season; how many surprises does Geno have for us? There is no particularly compelling reason to start Geno over Eli, except, maybe, that you want to keep Eli from getting sacked so often. Or that you want to lose some games—tanking is an option here.

It is hard on Eli to get benched, of course. But a lot of the difficulty comes from the way it was handled, too. They could have said: “You’re getting sacked a ton, our record stinks, and we want to see if the young guy is any good. Next year you’re our starter, and maybe for a few years, but we need to start thinking about our next QB.” That doesn’t feel good for Eli, sure, but at least it’s not a commentary on his play. It’s just a smart decision with respect to evaluating the state of the team.

The Giants are having a terrible season, and to fix the problems, they need to see what they have. They need to assess young players who haven’t had the time to play. In particular, they need to assess the quarterback on whom they spent a 3rd round pick last year, Davis Webb. A 3rd round pick is a very valuable asset in the NFL. If they don’t ever play Webb, then that’s just a wasted pick. If they do play Webb, and he plays well, he’s suddenly a highly valuable asset that the Giants can use. Maybe they keep him to groom him as a starter a couple of years from now if Manning flames out, or as a trade asset in the way the Patriots used Garoppolo. Maybe Webb looks bad, which might motivate the Giants to take one of the highly regarded quarterbacks who will be available in the upcoming draft—again to groom as Manning’s backup for a season or two.

Eli Manning will be 37 when next season starts. He’s not going to play forever. The Giants suck right now, Manning is getting hammered behind a bad offensive line (he’s been sacked more in 11 games this year than he was all last year), and the Giants need to assess the quarterback they drafted last year. That’s not a reflection on Manning’s quality or ability, it’s just a realistic assessment of what the Giants need to do to start preparing for future seasons, because they’re surely not going to the playoffs this year. If I ran the Giants, I would tell Eli that he’s my quarterback until he starts playing badly, but that right now he’s sitting so I can see whether my valuable 3rd round draft pick (Davis Webb) is worth anything that I can use to help the team win next year.

Saturday, November 25, 2017

The "Problem" of Similar Work

One problem that many graduate students face is that they have started on a project and then discover a book or article that is very close to what they have done.

Recently I received an email that said: “I have just found a book that makes a large number of the same arguments I was planning to make. I am having a bit of a rethink on my basic proposal, and will take longer than I planned.” I think the “will take longer” part of this is one of the most common stumbling blocks, and I think it can generally be resolved by looking at the similar work differently.

If you want to do original work, finding a work similar to what you intended can be seen as a block, as something that prevents you from doing what you wanted because what you wanted to do will no longer be original. There is, possibly, some loss in prestige in following work that someone else has already done, but this does not prevent you from doing original work that supplements or complements the already-published work. But finishing a project is a primary concern, and the existence of a published work that is very similar to what you hoped to do is actually a boon in terms of designing a project and getting it accepted.

Every similarity with some other work is something that you can cite in support of your own work. Instead of asserting a point yourself, you can make that assertion in combination with a citation, which makes the assertion more acceptable to most academic readers. The greater the similarity, the greater the strength of the foundation for your own work. When you read a work that is similar to yours, you can profit from that work if you can find one question about the work that you can turn into a good research project that you would be willing to do.

All scholarly works have some limits—some conclusion that may have interesting unexamined implications, some premise that had been defended or explained poorly, some side issue that hasn’t been examined, some point where you disagree with the work. All you have to do is find one place where you think the work is limited, and you can do some sort of study that addresses the limitation. There are even times when attempting to replicate an experiment or study can be valuable.

If you can find such a single point, you can build a study using the same theoretical framework as the work that was similar to what you wanted to do, which saves you a lot of work in explaining the motivations and theoretical foundations of your own project (which are often stumbling blocks).

When you use a lot of a specific work, you can get the additional rhetorical benefit of speaking positively of other scholars: you present your work as an attempt to cooperate with and build on work that you respect. If you frame your work in that positive cooperative relationship with the similar work, you will not be perceived as contrarian, even if you do choose to challenge one aspect of the similar work.

Sunday, November 19, 2017

New Review of My Book on Amazon UK

It's always pleasing to get a new review for my book, Getting the Best of Your Dissertation (at least so far--I'll see how I feel about that after I get slammed in a review for the first time).  This one is on Amazon UK, and I don't know if it will ever migrate to the regular US Amazon.

A very sensible and readable book, packed with good advice for doctoral students
I purchased the Kindle book because I wanted to review Dr Harris's ideas before speaking with him, and found it so useful that I have put in an order for the paperback as well. I'm based in the UK, and the book's advice is slanted towards the US system, but not overly so, and most of the discussion of topic selection, etc. is equally applicable over here. The fundamental rationale behind doctoral level study is pretty much universal, and that is what this book addresses.

If you want to know how to "survive" the "ordeal" of a doctoral degree, then this is probably not the book for you, but if you want sensible advice and an explanation of "what" you are being asked to do, "why" you are being asked to do it (dissertation tasks are not - only - the sadistic tendencies of your professors, they do serve a purpose), and most importantly "how" to do the various elements of a PhD, then I wholeheartedly recommend this book.

Part of getting the best out of your dissertation is the enjoyment to be found in the process of studying, and Dr Harris emphasises this factor. He doesn't hide the facts about the hard work required, but demonstrates how a change of attitude about this aspect can help you to work more effectively, faster, and to produce a better piece at the end.

The book covers questions over all aspects of study, from getting onto a programme to finishing your dissertation, and I cannot recommend it highly enough.

Tuesday, November 14, 2017

Words and Things

Writing is a process that involves a lot of learning…so much that what one wants to say can get sidetracked.
I picked the title of this post because I wanted to talk about the gap between things in the world and the words that are used to refer to them. But the title seemed familiar, and a quick search showed that there is, in fact, a famous book titled Words and Things, by the philosopher Ernest Gellner. A brief perusal of Wikipedia suggests that my concern is not the same as Gellner’s, but there is a relationship between them. Still, I don’t want to talk about Gellner’s ideas.

What I want to talk about is the gap between things in the world and the words that we use to describe them.  I have been thinking of this both with respect to a common issue that causes trouble for academics: the question of genres and how to write about genre issues, and also the question of race, which is getting a lot of attention in the U.S. press, for obvious reasons.

My main concern is that the words are not the things, and I think that dangerous effects come from assuming identity between the words and things.  This is especially a concern for the damage of over-generalization, especially the use of stereotypes.
Genre and race are social constructions: they’re lenses through which people can see the world, but close examination of the ideas will reveal that drawing the boundaries of categories like these is more a matter of choice than a matter of reality. The words get used as people see fit, but those usages do not necessarily adhere to any objective standard that is beyond dispute. Yes, of course, if we look at individual examples (whether people or artifacts), we can easily see gross differences: this man has dark skin and this man has light skin; this piece of writing has rhymes and verses, while that one has prose narrative. And yes, these gross differences can be used to characterize large groups for whom those differences hold true. Sometimes it can be very useful to hold on to such generalizations.
But sometimes those generalizations become burdens. I suppose that these burdens depend on the context, but in general, the issue at hand is that what any term means is not objectively definable, nor is the meaning of any term the same for all people.

My concern for genre was prompted by a paper draft I was reading recently that spent a lot of effort on defining a genre and discussing the different theoretical concerns for the genre. The problem for the academic writer trying to use genre is that it is very easy to slip into genre debates, and there is little clear way to end them. The alternative for the academic writer is to avoid relying on genre terms (and other sweeping generalizations) and to focus on specific things: for example, a specific work, or a specific characteristic of certain works. By focusing on the specific issues in the world—the things to be described, rather than the words chosen to describe them—there is no ground for debates that grow out of different ideas of what a word means.
These genre debates are often a real danger to graduate students, in the sense that they can really delay the development of good research, but this is a relatively insignificant concern compared to concerns about race.

My concern for race was sparked by a number of different articles I was reading recently, all of which made gross generalizations about race, despite the clear intention of the articles to reveal and disrupt the systemic patterns of racial discrimination present in the US. One article (https://www.thedailybeast.com/why-do-white-people-feel-discriminated-against-i-asked-them) quoted Tim Wise: “Whites have ALWAYS felt that we were being discriminated against every time there was evidence of black or brown progress.” With all due respect to Mr. Wise, I think he should speak for himself. He has no idea what all white people think. I feel absolutely safe in saying that in any large group of people, there will be a variety of opinions and ideas. Personally, I don’t feel that black or brown progress means that I am being discriminated against, even if black or brown progress erodes my white privilege. Personally, I feel that black and brown progress shows a move towards the kind of society that I would like to live in, one in which all people have real opportunities, and where success is more dependent on personal traits than on parentage.
In this society that is characterized by such great divisiveness, I think that generalizations about groups tend to expand the divisions in society. Assuming that someone thinks or feels a certain thing on the basis of some gross generalization (e.g., “white vs. black”) dehumanizes the individual. If the hope is to eliminate racism or other divisive patterns of thought, then there is benefit in trying to avoid such gross generalizations: reducing people to nothing more than avatars of some category that you have constructed in your understanding of the world reduces your chance of cooperating with actual people.
Whether someone gets counted as white or black depends on context—and that means that categories like white and black can be fluid. In the movie The Commitments the protagonist says “Do you not get it, lads? The Irish are the blacks of Europe. And Dubliners are the blacks of Ireland. And the Northside Dubliners are the blacks of Dublin. So say it once, say it loud: I'm black and I'm proud.”  But Irish, in the context of the US in the 21st century, are most definitely not black. 
Such things can be much more personal—one can imagine, for example, the Commitments character being “black” while in Dublin, and then traveling to the US and immediately becoming white. Do we then assume that that person will feel the same way about discrimination against blacks as a white person born and raised in a Southern family with roots tracing back to the Confederacy and beyond?

Using words to focus on one thing can also obscure focus on others: I recently read an article that argued that all white people are racist, because all white people have experienced the privileges of being white. (This does make the kind of over-generalization of which I was speaking—does a Dubliner experience “white privilege” if he or she lives in a context where he or she is at the bottom of the social order?) In the case of this article, the word “racism” was used to describe a certain thing: the experience of white privilege. But this use of the word “racism” obscures another use of the word—racism as an attitude of racial superiority. And that attitude is a crucial one. Yes, whites all experience white privilege, but do all whites share racist attitudes? Using “racism” to talk about people who have benefitted from white privilege obscures the fact that some white people think that black people are inferior, while other white people do not. A white child of 3 years of age has benefitted from white privilege, but I think it unlikely that a 3-year-old can have any meaningful sense of racial superiority. More personally, since I believe that race is a social construction, I don’t think it’s meaningful to speak of racial superiority because “race” isn’t inherent in people: how can one race be superior to another if races don’t exist?


My concern here is for the use of words and for the danger of using common simple words to describe complex things in the world. If we reduce a work of art to a genre—a “novel,” classical music, etc.—then we can miss important details. Reducing people to a concept—“black,” “white,” “racist,” “woke”—obscures the complexity of people and limits chances to work together. Trying to focus on the thing in the world may require more words—it’s more complicated to say “people who have benefitted from structural inequalities that are often based on visual cues like the light color of their skin” than it is to say “whites”—but such careful definitions avoid making simplistic assumptions about people and their attitudes.

Friday, November 10, 2017

Unscrupulous man admits to lack of morals

Yesterday, in a tweet about trade relations with China, Donald Trump said:

"How can you blame China for taking advantage of people that had no clue? I would've done same!" (https://twitter.com/realDonaldTrump/status/928769154345324544)

That's right. He admits to "taking advantage of people" and seems to think it's a good idea. In some situations it is ok to take advantage of people: I don't blame sports teams for trying to exploit opponents' weaknesses or mistakes. But the president of the United States? That's a lousy way to govern within the nation. And, unfortunately, his policy choices certainly suggest his willingness to take advantage of the people who voted for him, by cutting services that help his voters, and giving massive tax breaks that help himself and his family.

It's a lousy way to conduct international diplomacy, too. How can any government deal with his administration and think that there's any cooperative spirit in the negotiations? Trump has made it clear that he will try to take advantage of people.

It's one thing to resist being taken advantage of. It's completely another to go out and try to take advantage of people--especially those who are lacking knowledge (i.e., "have no clue," to use Trump's words). Saying "I take advantage of those lacking the knowledge I have" is hardly taking the moral high ground. Unless morality is measured in money.