
This blog was linked to a while ago in a Reddit discussion of uninterested and disinterested. (My opinion on them is that uninterested is restricted to the “unconcerned” meaning, while disinterested can mean either “unconcerned” or “impartial”, and that’s an opinion based on both historical and modern usage. In fact, despite the dire cries that people are causing the two words to smear together, it actually looks like the distinction between them is growing over time.)

The reason I bring this up again is that one of the Redditors was proposing that having a strong distinction could make sense, because:

“Some people draw a distinction between disinterested and uninterested. There is nothing to lose and perhaps subtlety to be gained by using that distinction yourself. Therefore observing the distinction should always be recommended.”

But I’ve already asked my question about this in the title: is there really nothing to lose? Is there no cost to maintaining a strict distinction between words? Or, more generally, is there no cost to maintaining a grammar rule?

Well, in a myopic sense, no, there’s nothing much to lose by having the rule. In the case of uninterested and disinterested, it would be hard to argue that not being able to use disinterested to mean “unconcerned” is a substantial loss. It can be done, though: I, for instance, am a great lover of alliteration, and as a result, I like to have synonyms with as many different initial letters as possible. There’s a cost, small though it may be, to not having disinterested available as I’m constructing sentences. But that’s a triviality.

A more substantial consequence is that it introduces a discontinuity in the historical record. If we decide that from now on disinterested only means “impartial”, then historical and current uses of the “unconcerned” sense will be opaque to people taught the hard-and-fast rule. That’s problematic because, despite the belief of some people that this is an illiterate usage, it’s actually common even among good writers. This, again, isn’t a big problem; we regularly understand misused words, especially ones whose intended meanings are very close to their actual meanings. Saying that we can’t have a rule of grammar because sometimes it isn’t followed is the sort of whateverism that people accuse descriptivists of, not a reasonable concern.*

No, the true cost is a higher-level cost: the overhead of having another distinction. This might also seem trivial. After all, we have tons and tons of usage rules and distinctions, and a lexical distinction like this is really little more than remembering a definition. But let me illustrate my point with an example I recently saw on Tumblr (sorry for the illegibility):

The distinction here is well-established: affect is almost always the verb, effect almost always the noun.** Yet here we see that it is costly to maintain the distinction. First, it’s costly to remember which homophone goes in which role. Second, it’s costly to make an error, as people may mock you for it. Third, it’s very easy to get it wrong, as the replier did here.

If there were really no downside to adding an additional rule, we’d expect to see every possibly useful distinction be made. We’d expect, for instance, to have a clear singular/plural second-person distinction in English (instead of just you). I’d expect to see an inclusive/exclusive first-person plural distinction as well, as I sometimes want to establish whether I’m saying we to include the person I’m speaking to or not. The subjunctive wouldn’t be disappearing, nor would whom.

But all distinctions are not made. In fact, relatively few of the possible distinctions we could make at the word level are made. And that suggests that even if the reasons I’ve listed for not maintaining a lot of distinctions aren’t valid, there must be something that keeps us from making all the distinctions we could make.

So next time someone says “there oughta be a rule”, think about why there isn’t. Rules aren’t free, and only the ones whose benefits outweigh their costs are going to be created and maintained. The costs and benefits change over time, and that’s part of why languages are forever changing.


*: Of course, if the distinction is regularly violated, then it’s hardly whateverist to say that it doesn’t exist.

**: Affect is a noun in psychology, effect a verb meaning “to cause” that is largely reviled by prescriptivists.

Let me talk about something that I feel like I’ve been circling around for some time, but never quite directly addressed. It’s a common pattern among grammar grousers: playing up other people’s questionable usages as symptomatic of a larger disease while playing down one’s own as a clever subversion of stodgy English. Whereas the complainant’s usages are all justified by improving the language or enlivening the prose or just plain sounding right, the scorned writer’s usages are utterly unjustified — not because the complainant has considered possible justifications and found none of them sufficient, but rather because it is simply self-evident that an error is an error.*

Thus we see Salon’s Mary Elizabeth Williams writing a screed against sentential hopefully, but then absolving herself for using stabby and rapey. I find both of those to be worse than the targets of her ire — especially rapey, the jokey tone of which I find borderline offensive. Crucially, though, even as I reject her words, I can see why she likes them; it’s just that for me, their benefits don’t outweigh their downsides. Williams, on the other hand, seems to ignore any potential upsides to the usages she dislikes. When she says rapey, she sees it as the considered usage of a professional writer, an improvement on the language. When you write sentential hopefully, it’s because you can’t be bothered to think about your usage and the effects it could have on the language.

Similarly, I got into a short Twitter war with a follower who tweeted that she wanted to send copies of education majors’ grammatical errors to future employers. I pointed out that the follower (whose Twitter name is “Grammar Nazi”, about which the less said the better) had questionable usages in her bio:

“A soon to graduate English major whose biggest turn on is good grammar.”

In my grammar, there’re three missing hyphens, but she responded to me noting this with “I’m sure you’re aware compounding is a grey area. Rules may be generally agreed upon, but no official guidelines exist.” Such “generally agreed-upon” rules were probably settled enough for the tweeter to treat as errors had others broken them, but because she’s doing it, it’s okay. Her choice to go against the standards is justified, because she sees the justification. The education majors’, with their justifications left implicit, probably wouldn’t be.**

This forgiveness extends, of course, to include other people whose viewpoint the writer is sympathetic to. Kyle Wiens, who wrote that Harvard Business Review piece on his intolerance for grammar errors in his hiring practices, had a couple of questionable usages in the piece — nothing too bad, but things that would violate a true Zero Tolerance stance. Another blogger quoted some of the piece and added:

“Ignoring the one or two grammatical glitches within the quoted text (they may be the result of a message that was delivered orally, rather than in written form), the message […] should be taken to heart. If you write poorly, you tell your reader: I haven’t changed. My education hasn’t made me better, it hasn’t touched my core. […] I’m certainly not looking to have excellence be part of my personal brand – it’s too hard and too time consuming.”

The blogger seeks out an explanation for Wiens’s errors that diminishes the errors, but then chooses an explanation for everyone else’s that diminishes the writers.

We all do this to some extent. The most prominent example for me is when I come home from work and find a pile of dishes in the sink from my roommates. “C’mon guys, you can’t be bothered to do the dishes?” I wonder to myself and to anyone I talk to over the next few days. Yet I’ve just realized that I forgot to finish the dishes this morning before going to campus. Somehow I can’t muster the same indignation at myself as I have toward my roommates, because I had an excuse. (And I’ll tell you it as soon as I figure it out.)

Sure, it’s fair to give known-good writers more leeway than known-bad ones. But every error has a cause, and every usage a rationale. Don’t decide ahead of time that someone can’t be wrong or can’t be right.

*: This isn’t unique to grammar by any means; half of politics is explaining away your side’s missteps while playing up the other side’s.

**: By the way, you may wonder if I’m not doing exactly what I oppose here by complaining about a minor error that some people do not see as an error. On that, two points. One, hyphenating phrases that are used as adjectives (especially more-than-two-word phrases) is about as standard a rule of punctuation as one can find. Similarly with hyphenating a phrasal verb in its nominal form. Two, not that she needs to justify herself to me, but she doesn’t explain any reason why she’s breaking the rule, so as far as I can tell, she’s breaking the rule just to break it — hardly appropriate behavior for an otherwise hard-liner.

I’ve mentioned my fondness for compiling historical grammatical errors as a reminder that we are not, in point of fact, destroying what used to be a perfect language. Previously, I’d found unnecessary quotation marks in a 1960 World Series celebration, it’s for its in a 1984 John Mellencamp video, and an apostrophe incorrectly marking a plural in a famous 1856 editorial cartoon. But these were all punctuation-based errors. Today’s is a proper grammatical error, and one that people full-throatedly bemoan nowadays.

I found this error by admitting to myself that I am secretly an old man, and coming to terms with it by spending much of the summer sitting in parks, reading books on naval history and international relations. One of them, Nathaniel Philbrick’s Sea of Glory, tells the story of the U.S. Exploring Expedition, which discovered Antarctica and created the country’s first accurate naval charts for the Pacific islands. It’s a good book, and it turned great when it offered up two interesting old quotes four pages apart.

In the first, the Expedition is approaching Fiji and takes on another pilot due to the many coral reefs in the area:

“Wilkes felt it necessary to secure yet another experienced pilot at Tonga named Tom Granby. ‘You will find when we get to the Islands,’ Wilkes assured Granby, ‘that I know as much about them as you do.’ Granby smiled. ‘You may know all about them on paper,’ he replied, ‘but when you come to the goings in and goings out, you will see who knows best, you or myself.'”

Myself here is clearly non-standard, as no first-person pronoun has appeared anywhere in the sentence. The standard rule for reflexives, known as Principle A in Government and Binding theory, and discussed in pretty much every introductory syntax class, is that a reflexive must be bound in its governing category. Or, to say it in a more theory-agnostic and somewhat looser way, the coreferent of the reflexive (I/me for myself) has to appear within the smallest clause that contains the reflexive, and structurally “above” the reflexive. The syntactic specifics depend on which syntactic theory you’re adhering to, but luckily they don’t really matter here; there’s no possible coreferent anywhere within the sentence, so any standard definition of Principle A will label the sentence ungrammatical.
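To make the loose version of the rule concrete, here’s a deliberately crude sketch in Python — my own toy, not a real binding-theory implementation. It ignores clause boundaries and tree structure entirely and just checks for a person-matched pronoun earlier in the flat word string, which is nonetheless enough to flag Granby’s sentence:

```python
# A crude approximation of the loose statement of Principle A above:
# a reflexive needs a person/number-matched pronoun somewhere earlier
# in the same sentence. Real binding theory needs a parse tree and
# governing categories; this toy scans a flat list of words.
ANTECEDENTS = {
    "myself": {"i", "me"},
    "ourselves": {"we", "us"},
    "yourself": {"you"},
    "himself": {"he", "him"},
}

def crude_principle_a(sentence):
    """Return False if a reflexive appears with no earlier matching coreferent."""
    words = [w.strip(".,!?;:'\"").lower() for w in sentence.split()]
    for i, w in enumerate(words):
        if w in ANTECEDENTS and not (ANTECEDENTS[w] & set(words[:i])):
            return False
    return True

print(crude_principle_a("I have seldom seen myself so helpless"))       # True
print(crude_principle_a("you will see who knows best, you or myself"))  # False
```

Even this blunt check gets Granby’s quote right: no I or me precedes his myself, so the reflexive is unlicensed under the standard rule.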

Turning from this syntactic jungle to the Fijian jungle, a few pages later the Expedition lands on an island and hikes to its peak:

“Almost two years at sea had left them ill-prepared for such a demanding hike. ‘I have seldom witnessed a party so helpless as ourselves appeared,’ Wilkes wrote, ‘in comparison with the natives and white residents, who ran over the rocks like goats.'”

Again, it’s obvious that this is a non-standard usage, since no first-person plural noun phrase appears in the sentence to justify the reflexive.

Now, I’ve been marking these as non-standard rather than incorrect, and there’s a reason for this that is more than a desire to be non-judgmental. These supposedly erroneous uses of reflexives are widespread — so much so that I’d argue they’re at least borderline acceptable in many people’s forms of Informal Spoken English. That means that they ought to be explainable, that there ought to be some option in the rules of English that allows you to consider these uses acceptable without having to change much else in the language. I’m going to speculate for the rest of this post, so feel free to bail out here.

But before you bail, let me just brag about where I get to read.

Here’s my idea, which I don’t think is novel.* Reflexives are allowed only when, in some sense, there’s a sufficiently salient coreferent for the reflexive. Salience is standardly assessed syntactically, meaning that a coreferent appears structurally above the reflexive, and close enough to remain salient when the reflexive appears. But there is pragmatic salience as well, for people and things who haven’t been explicitly mentioned but remain prominent in the discourse all the same. And what is more pragmatically salient than the speaker? In both of these cases, it seems that the speaker is thinking of themselves as sufficiently salient to trigger the reflexive.

My intuition is that there are more instances of inappropriate reflexives for first person (myself, ourselves) than second person (yourself), and more of either than for third person (himself, herself, itself, themselves). I did a quick corpus search on COCA for sentence-initial As for *self, and the intuition wasn’t fully borne out; as for myself was the most common, but combined as for him/herself showed up almost as often (64 to 60), and as for yourself only registered one instance. So maybe I’m totally off-base on the specifics.** But something is going on that allows so many people to view reflexives as standard in positions that we don’t expect to see them, and like it or not, that needs explained.
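For what it’s worth, the kind of tally I ran can be approximated on any plain-text corpus with a few lines of code. This is a generic sketch of my own; COCA itself is queried through its own web interface, and a serious search would use part-of-speech tags rather than a raw regex:

```python
import re
from collections import Counter

# Sentence-initial "As for Xself" patterns, grouped by grammatical person.
# (?m)^ anchors at line starts, a rough stand-in for sentence starts.
PATTERNS = {
    "first": re.compile(r"(?m)^As for (?:myself|ourselves)\b"),
    "second": re.compile(r"(?m)^As for (?:yourself|yourselves)\b"),
    "third": re.compile(r"(?m)^As for (?:himself|herself|itself|themselves|themself)\b"),
}

def count_reflexives(text):
    """Tally sentence-initial 'As for Xself' hits per grammatical person."""
    return Counter({person: len(pat.findall(text)) for person, pat in PATTERNS.items()})

sample = (
    "As for myself, I was unconvinced.\n"
    "As for himself, he never said.\n"
    "As for myself, I'll stay.\n"
)
print(count_reflexives(sample))
```

On a real corpus you’d point this at the corpus files instead of a toy string, but the per-person tally is the same idea as the COCA numbers above.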


*: If you know of any references to discussions about this issue, please share. I’m not primarily a syntactician, and didn’t see anything in a cursory search of the literature, but I really doubt this discussion hasn’t been had before.

**: I think the as for *self construction may be a special case. Most of the third-person uses look to be about how some third party views themself, and while one can state one’s own introspections and speculate about a third party’s, it’s a little bit weird to tell someone their own introspections. That could artificially deflate the second-person counts.

I think the best explanation of this construction may be as an indicator that we are switching mental spaces, if you’re familiar with that theory. Saying as for Xself establishes a new mental space focused on X and their inner workings or opinions, rather than the more generic mental space of the rest of the conversation. Sorry, I’m really going down a rabbit hole here.

A woman drove past me recently in a car with a license plate holder reading “ALUMNI — BOSTON COLLEGE”. It’s a perfectly standard thing to have on one’s car (although BC was a bit of a surprise given that I’m in San Diego), but it also presented a minor choice point in my day. I could either think of it as totally unremarkable and move on, or I could fret over its grammaticality.*

It looked like this, except mounted on a car instead of floating in a featureless void.

The problem with the license plate holder is a minor one that you might easily never notice if you’re unfamiliar with Latin. I was unaware of it until college, and even then it was perhaps only because I went to a school so fond of Latin as a scholarly language that our degrees were not BAs but ABs (Artium Baccalaureus instead of Bachelor of Arts) and our diplomas were written entirely in Latin.**

Anyway, the problem is that alumni is, at least in Latin, plural. Furthermore, it’s masculine (or mixed-gender). For a single graduate, the Latinally accurate form would be alumnus for a male or alumna for a female. And for multiple female graduates, the Latinally accurate form would be alumnae.

I imagine many of you readers already knew that, but maybe you didn’t. If I’m being perfectly honest, I wish I didn’t. Why? Because I can’t help noticing it. I suspect that a majority of the English-speaking population doesn’t think that alumni has even the hint of inherent plurality about it. I’m looking at the Corpus of Contemporary American English right now, and there are 70 hits for “an alumni”, 61 of them in writing.*** That’s more common than “an alumna” and “an alum”, and only 29 hits fewer than “an alumnus”. Quite simply, singular alumni is standard in all but the most formal of Englishes, and I’m not sure it’s non-standard even there.

Why is singular alumni standard? Because it fits better with English. We don’t really like gender on our nouns (at least not anymore — Old English was fond of it). And we don’t really care about adjusting the plurality of borrowed words, especially not from Latin — see agenda or stamina. Rather than having to remember a fairly idiosyncratic gender/number system, it’s easier to treat alumni as a base singular form with a zero-plural, just like strong ol’ Germanic words like sheep or fish. And it saves university bookstores from having to stock four different license plate holders.
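The bookstore’s problem can be put in code. A toy sketch (function names are my own) contrasts the Latin system, which needs a four-way lookup keyed by gender and number, with the emerging English system, a single invariant zero-plural form:

```python
# The four Latin forms, keyed by (gender, number).
# "masculine" also covers mixed-gender groups, as in Latin.
LATIN_FORMS = {
    ("masculine", "singular"): "alumnus",
    ("feminine", "singular"): "alumna",
    ("masculine", "plural"): "alumni",
    ("feminine", "plural"): "alumnae",
}

def latin_alumn(gender, number):
    """Look up the Latinally accurate form -- four cases to remember."""
    return LATIN_FORMS[(gender, number)]

def english_alumn(gender=None, number=None):
    """The emerging English system: one base form with a zero-plural,
    like sheep or fish -- gender and number are simply ignored."""
    return "alumni"

print(latin_alumn("feminine", "plural"))  # alumnae
print(english_alumn())                    # alumni
```

One function needs a lookup table and a speaker who remembers it; the other is a constant. That asymmetry is, in miniature, the cost argument from the first post above.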

[EX-CUSE: Syracuse Alumni]

It’s a tangent, but this pun is almost enough to make me wish I had gone to Syracuse.

To return to the point of the opening paragraph, I can’t, much as I’d like to, stop myself from correcting singular alumni. It’s not even like it’s a choice, or a conscious decision — I see singular alumni, and my brain says “alumnus” or “alumna”. That much is automatic.

Where the choice comes in is whether I say something about it or judge people for it. In almost every situation, I don’t. For most people, singular alumni is simply acceptable. For many of the rest, it’s okay when used in a reasonable situation (such as when you don’t know the gender of the person buying the item). It’s only in very formal or very edited English (or around close friends who I think will be interested) that I would raise the issue. In other situations, bringing it up would just seem like an attempt to show off my passing familiarity with Latin, which would be an especially pathetic boast.

This is not linguistic whateverism. I’m not saying that editing is stupid or that nothing should be corrected. Editing, I can’t stress enough, is critical. But my point is that for all of you who insist that, say, it’s for its kills you and you can’t stop yourself from correcting it: yes, you can. We’re not beasts; we have self-control. When it’s something trifling, or in an ephemeral setting, or clearly not indicative of a larger ignorance of the language, you can and should let it pass. You’ll be happier for it, and you might even see a drop in your overall peevishness levels.

*: This is a false dichotomy; there is clearly a third way — to base a blog post upon it, thereby spending far more effort than if I had been content to simply complain about its grammaticality. Given that I’m going to berate that choice as a foolish use of one’s time, I’m aware of the irony in mine.

**: In fact, we are so enamored of traditional uses of Latin that to this day the salutatorian of the class delivers their graduation speech entirely in Latin. The graduating seniors are given a copy of the speech in both Latin and English, with the Latinate portion marked for where to laugh, cheer, applaud, etc. I don’t think the rest of the audience is given this cheat sheet, thereby creating the illusion that we all speak Latin fluently enough to understand it in oratorical form.

I know, it sounds stupid and pretentious and ridiculous, and it is. But it was also great silly fun to overlaugh at something incomprehensible, sort of like being a member of a studio audience clapping at “APPLAUSE” signs must be. I highly recommend you petition your alma mater to do the same.

***: Many of these are in noun-noun compounds like “an alumni club” or “an alumni trustee”, where the grammatical number of alumni is unclear. Though my original intuition is that it’s being thought of as plural in these cases, English does tend to disprefer plural first nouns in noun-noun compounds (cf. mousetrap, cowcatcher, leafblower). Also, if one were to replace alumni in these compounds with some standardly pluralized noun like student, it’d be “student club”, not “students club”. Thus, I’m inclined to think of these examples as further, though weaker, evidence of singular alumni usage.

About The Blog

A lot of people make claims about what "good English" is. Much of what they say is flim-flam, and this blog aims to set the record straight. Its goal is to explain the motivations behind the real grammar of English and to debunk ill-founded claims about what is grammatical and what isn't. Somehow, this was enough to garner a favorable mention in the Wall Street Journal.

About Me

I'm Gabe Doyle, currently a postdoctoral scholar in the Language and Cognition Lab at Stanford University. Before that, I got a doctorate in linguistics from UC San Diego and a bachelor's in math from Princeton.

In my research, I look at how humans manage one of their greatest learning achievements: the acquisition of language. I build computational models of how people can learn language with cognitively-general processes and as few presuppositions as possible. Currently, I'm working on models for acquiring phonology and other constraint-based aspects of cognition.

I also examine how we can use large electronic resources, such as Twitter, to learn about how we speak to each other. Some of my recent work uses Twitter to map dialect regions in the United States.


