I like reading the Economist. I had a subscription to it back in college, when someone moved away from my dorm without sending the magazine a forwarding address. (I also had subscriptions to Newsweek, Time, and Us Weekly that way.) I think they have insightful analysis of economic things that I otherwise don’t know very much about.
But I also think they have a completely insane style guide. For instance, did you ever think about the fact that under the circumstances is strange? I sure didn’t. But the Economist has, and even demands that their editors remove it. After all,
“Circumstances stand around a thing, so it is in, not under, them.”
Etymologically, they are right. Circumstance comes from Old French circum-, circonstance, which comes from Latin circumstantia, which meant “standing around, surrounding condition” (all this is from the OED, not me). But then, circumstance wasn’t originally pluralizable, since its original English meaning was “that which surrounds”. Pluralizing the original English circumstance would be akin to saying “I saw the outsides of my house”.
Look, etymological reasoning is never a solid reason for a prescription. Bradshaw of the future had a wonderful post that I always like to refer to that discusses the etymological fallacy, and a while ago I listed a set of words whose meanings have changed, spitting in the face of etymology. The fact that a word meant something in Latin, or Old French, or even Old English does not mean that it means the same thing now. And sure enough, circumstance doesn’t mean “that which surrounds” anymore. In fact, here are the first seven definitions of circumstance from the OED that are not considered obsolete:
- The logical surroundings or ‘adjuncts’ of an action; the time, place, manner, cause, occasion, etc., amid which it takes place
- An adverbial adjunct.
- ‘The adjuncts of a fact which make it more or less criminal; or make an accusation more or less probable.’
- The ‘condition or state of affairs’ surrounding and affecting an agent.
- The external conditions prevailing at the time.
- Condition or state as to material welfare.
- Circumstantiality of detail.
The definitions go on, but they move further and further away from the original definition of “that which surrounds”. Notice that only two of the listed definitions even mention surrounding, and they apply only metaphorically. So the original meaning is pretty well lost here.
Now note further that while it might not make a whole lot of sense to say under with any of these definitions, it doesn’t make any more sense to say in either. They’re all metaphorical meanings, with abstract surroundment, so prepositions of position are all “logically” weird. (Such metaphorical usages are why the whole “logic of language” argument tends to break down, as Emily discussed in her guest post.)
Furthermore, under the circumstances has been attested for many, many years; the OED first attests it in 1665. That said, the OED has, since at least 1893, claimed a usage distinction between in and under the circumstances: “Mere situation is expressed by ‘in the circumstances’, action affected is performed ‘under the circumstances’.” But the Merriam-Webster Dictionary of English Usage (among others) dismisses this distinction as cryptic and unrelated to actual usage. Both in and under the circumstances are commonly used, and there is no reason to avoid either one aside from personal prepositional preference.
Or, to put it in a format that the Economist will recognize:
“SIR – Your prohibition against under the circumstances is based on an etymological fallacy. I assume for consistency’s sake that you also write stamina are, since stamina is plural in Latin. Or, more relevantly to your publication, I assume you use laissez-faire economics only when describing someone else’s economy to them because it comes from the second person plural conjugation of French laisser.”
Summary: Under the circumstances is fine. So is in the circumstances. Use them as you see fit.
*: A perhaps interesting footnote: here I am defending under the circumstances, and yet it appears I haven’t ever used the phrase on this blog. I have used in with circumstances three times, though as in some/appropriate/certain circumstances as opposed to in the circumstances. Using it with the still sounds funky to me.
I feel like this past month more and more people have mentioned to me their belief that languages either do or should strive to be logical. On the one hand, this is an obvious point. A more logical language is a more learnable language, and since language is passed down from generation to generation, we expect that exceedingly difficult-to-acquire portions of a language will be eventually lost by this process. That’s fairly uncontroversial and is known as “regularization” in linguistics. But the problem is that the logic of language is generally opaque. It’s not the same as the logic of mathematics or the logic of argumentation, so it’s hardly obvious what it means for a language to be logical. I’d wanted to make a post about this, but I was having trouble saying what I meant to say. Thankfully, my labmate, Emily Morgan, ended up saying some great stuff about it in a comment elsewhere. She’s been kind enough to elaborate on those thoughts here. Without further babbling from me, here’s a guest post from her.
When linguists speak out against prescriptivism, one question we get asked is why we care so much about it. This post is an attempt to answer that question.
To begin with, it’s important to point out that linguists generally aren’t blanketly opposed to prescriptivism; rather, we’re opposed to uninformed or misinformed prescriptivism. So for example, I’m very much in favor of standard spelling and punctuation use, but with the understanding that these are more or less arbitrary conventions–not because I believe that these particular conventions are better than any others. Prescriptivist rules often come with supposed justifications, but under further scrutiny those justifications frequently don’t hold water. In particular, many rules are justified on the basis of some “logical” argument. The problem with that is that it’s easy to construct arguments that sound logical for certain cases, but don’t follow the bigger-picture logic of how language works. To give an analogy from mathematics, I could make a pseudo-logical argument that because we count …8, 9, 10… then the next number after 18, 19 should be 110. Of course, given an understanding of how the decimal system works, that’s nonsense. But without that broader understanding, it would sound logical. So bringing this back to language, if someone tries to argue, for example, that You drive too slow is incorrect because slow is an adjective not an adverb, that sounds logical under the simplified view that slow is an adjective while slowly is an adverb. But in the bigger picture, we find that slow can be used either as an adjective or an adverb–and has had both uses for hundreds of years.
That bigger-picture argument puts a lot of weight on descriptive generalizations about how native speakers use their language. I think it’s important to understand why linguists so often use arguments like these, which are based on descriptions of what native speakers do. The underlying reason is that language is a natural phenomenon, and our goal as linguists is to understand how it works. And to do so, we call upon all the empirical tools of science, and our primary source of data is the way that people actually do use language. Now, I recognize that how people do use language and how people should use language are not inherently the same thing. But I think that any claims about how people should use language need to be grounded in a solid understanding of what language is. And I think that many prescriptivists fundamentally misunderstand this. Language is not an ideal system that we as individual speakers are trying to draw upon or conform to. Language is something that we as a community of speakers collectively create and reinvent each time we speak. So any statement that we make about language is inextricably rooted in a descriptive generalization about what that community does. Even the most fundamental notions of grammar—things like the division of utterances into words, or the grouping of words into parts of speech—are not a priori assumptions about how communication should work: rather, they’re based on our empirical understanding of how speakers treat language.
So in the bigger picture, why do we linguists care about all of this? There’s a lot of reasons, but I think the most fundamental is that there’s hugely widespread misunderstanding of a topic that we care a lot about, and we feel a professional obligation to set the record straight. In the worst case, baseless prescriptions like “don’t split infinitives” or “don’t end a sentence with a preposition” actually lead to worse writing, as people learn to go through contortions to avoid what are actually perfectly standard grammatical constructions. In milder cases, people just waste their time trying to remember rules like the supposed distinctions between that/which and less/fewer, which are mostly harmless when followed, but equally harmless when violated. Additionally, as Gabe discussed recently, these shibboleths distract from the true pleasure of studying language, which is an amazingly rich and fascinatingly complicated system—but instead of being exposed to the excitement of unsolved questions in linguistics, people are instead being drilled on arbitrary and unnecessary rules. To draw another analogy to math, it’s the same sort of regret I feel for people who had poor math instruction early in school, and end up hating all things number-related, without ever seeing the beauty of abstraction that comes out in higher-level math. (If you are one of those number-haters, feel free to substitute your own favorite discipline or activity, and consider that sense of “But you don’t understand!” that you feel when someone misunderstands it or dislikes it for no good reason.)
Finally, I want to clarify that in arguing for more permissive, less prescriptive attitudes towards grammar, we are not trying to convince people to use language in ways that sound unnatural to them. As native speakers, we all have intuitions about what sounds right and what sounds wrong. Gabe can say “needs done”, but to me that sounds unnatural, and so I never use it myself. One underlying assumption to the linguist’s descriptive approach to language, which we probably don’t stress enough, is that there can be more than one right way to say something, and the fact that we are describing variation between speakers does not mean that we expect to find the same variation within all individual speakers. So no one is trying to convince you to say “needs done” if it sounds wrong to your ear—we’re only trying to convince you not to be upset if someone else does use it. As a caveat, I recognize that this position gets more complicated when thinking about English as a Second Language instruction, or when teaching people who have grown up speaking a dialect that deviates in major ways from Standard English, in which cases it’s obviously valuable to discuss what standards exist and what cultural implications they bear. But even in these cases, the fundamental ideas remain unchanged: we should acknowledge variation as natural, and any usage advice needs to be based on factually grounded descriptions of that variation.
Apparently sentential adverbs are a secret. An open secret, of course, which explains why almost everyone knows about them and uses them regularly. Everyone, of course, except prescriptivists. I already talked about this regarding prescriptivists’ insistence that hopefully can’t be used as a sentential adverb, but now I’ve come across it again in the belief that most importantly can’t be used as a sentential adverb, as in (1a), and that instead most important should be employed (1b):
(1a) Most importantly, you want to intrigue students [...]
(1b) Most important, you want to intrigue students.
When I read that, I thought they were putting me on. (1b) sounds awfully awkward to me. If I were editing someone and they came to me with this sentence, I would immediately suggest that most importantly was surely what they meant. If they insisted on using the adjectival form, I’d want something stronger than a comma to separate it from the rest of the sentence; I think I’d want to use a colon.
So why do people disagree with my exquisite punctuative tastes? What’s their argument for the adjective? It’s an intriguing one: the sentential modifying most important is said to derive from what is most important, as in sentence (2):
(2) “His color is very good, and what is most important, he is himself, just as much himself in color as he was in pen and ink.”
The claim is that the modern form most important is an elided version of the longer what is most important. Now, that strikes me as something of a just-so story; if that sort of elision is standard with what’s more important, why don’t we also see it attested with other similar constructions, such as what’s most interesting or what’s more notable? One possibility is that what’s more important is more frequent than the other constructions; evidence for this hypothesis comes from the Google n-gram corpus, in which there are far more examples of what is more important than any other single what is more X:
what is more important: 31740
what is more interesting: 5795
what is more likely: 4566
what is more difficult: 2413
what is more surprising: 2189
(and so on)
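To make the frequency argument concrete, here’s a minimal sketch (in Python, with the n-gram counts quoted above hard-coded, purely for illustration) of the comparison being made: what is more important isn’t just the most common of these constructions, it outnumbers its nearest competitor several times over.

```python
# Google n-gram counts quoted above, hard-coded for illustration
counts = {
    "what is more important": 31740,
    "what is more interesting": 5795,
    "what is more likely": 4566,
    "what is more difficult": 2413,
    "what is more surprising": 2189,
}

# Find the most frequent construction and compare it to the runner-up
top = max(counts, key=counts.get)
runner_up = sorted(counts.values(), reverse=True)[1]
ratio = counts[top] / runner_up
print(f"'{top}' is {ratio:.1f}x more frequent than the runner-up")
```

Run on these counts, the ratio comes out around 5.5 to 1, which is the sort of frequency asymmetry that could plausibly drive elision in one construction but not the others.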
And some of these other adjectives do behave like important:
(3) Even more surprising, he has put his scholarly findings in “popular” form [...]
So maybe the elision story isn’t a just-so story after all. And even if it is, sentential most important is well-attested in the Oxford English Dictionary and on the Internet:
(4a) What were these quasi-stellar objects and, perhaps even more important, how were they giving off so much energy? [OED, 1964]
(4b) Most important, he never wavered from his driving principles [...]
And as such, I am willing to accept most important as standard for people who are not me. But what of most importantly? Well, the secret of sentential adverbs is simply that there’s nothing wrong with them either. Certainly you’d sound quite mad if you said what’s most importantly, but that’s fine, because that’s not where most importantly comes from. Most importantly is just a sentence-modifying adverbial phrase like any other:
(5a) Most importantly, he wants to focus on moving Provo residents past the campaign [...]
(5b) Clearly, he wants to focus on moving Provo residents past the campaign
(5c) Oddly, he wants to focus on moving Provo residents past the campaign
(5d) Luckily, he wants to focus on moving Provo residents past the campaign
(5e) Frankly, he wants to focus on moving Provo residents past the campaign
(The last two sentential adverbs have been attested in the OED since 1717 and 1847, respectively.) In none of (5b-e) could the adverb be converted to an adjective.
More importantly probably arose independently of what’s more important, either as a regularization of the sentential adjective more important into a sentential adverb, or through some separate lineage. And I say “regularization” here only because sentence-modifying adjectives like most important (and most surprising) are outliers; most sentence-modifying phrases are adverbial.
Lastly, I’m told by the MWDEU that the bare adjective important cannot be used as a sentential modifier, even though more important can. That strikes me as very strange; after all, what is important is no less valid than what is more important, right? Instead, importantly must be used in that situation.
So prescriptivists holding the “most important, not most importantly” view are asserting that importantly is only valid if it is unmodified, while important is only valid if it is modified. That seems to me an odd stance to take, especially compared to the simpler explanation that importantly is valid whether or not it’s modified.
Summary: more important and more importantly are both valid sentence-modifying phrases, although I personally would only use the latter. Importantly is also a valid sentential modifier, although oddly important is not.
Every time National Grammar Day comes around, I’m struck with a spot of dread. Any of my friends or acquaintances might, at any moment, spring upon me and shout “Hey! It’s totally your day! So don’t you hate when people use the passive voice, since you’re all into grammar?” And then I will be forced, as the crabby old coot I am, to meet their well-meaning inquiry with the level of vitriol normally reserved for a hairdresser who’s decided to treat your head as a testing ground for a new theory of hair design. “No,” I’ll shout, “that’s not it at all! I love the passive, I love variation! Grammar isn’t about telling people what they can’t say; it’s about finding out what people do say, and why they say it!” And through that outburst, my Facebook friend count will be reduced by one.
My problem with National Grammar Day (and most popular grammarians in general) is that it suggests that the best part of studying language is the heady rush of telling people that they shouldn’t say something. But if you really study language, you know that there’s so much more to it than that. Each time March 4th comes and goes, we’re missing an opportunity to show people how wonderful the field of linguistics is. So if you’ll permit me to steal a moment, let me show you the two papers that really made me fall in love with the field.
The first is from Murray, Frazer, and Simon: “Need + Past Participle in American English”, which is the first in a series of three papers on the Midwestern/Appalachian construction needs done (e.g., this article needs re-written, my cat needs washed). This paper made me realize how deep the rabbit-hole of colloquial and dialectal speech goes. (Sadly, you need a subscription to JSTOR to read it.)
The second paper is the one that launched me into the exciting world of alternation studies, Bresnan & Nikitina’s “On the Gradience of the Dative Alternation”. (This paper has since been superseded by revised versions, but I think this draft is still the best version for an alternations newbie.) If you ever have the chance, take a look at these papers. Maybe they won’t do anything for you, but then again, maybe they will, and maybe you’ll understand why I think so many celebrants of National Grammar Day are missing the point.
On to the meat of the post. As you might remember from last year, my favorite way to celebrate National Grammar Day is by debunking popular grammar myths. Here’re 10 facts about the English language that run counter to the rubbish that pedants prescribe. The first eight are from the last year of posts here at Motivated Grammar. The last two are from other sites. Explanations and justifications for the statements below are found by following the links, so if you disagree, please don’t grouse to me that I must be wrong until after you’ve read the reasons why you are.
Singular they is standard English. What’s wrong with the sentence Everyone celebrates today in their own way? Historical usage, contemporary usage, the usage of revered writers, acceptance by language authorities, analogous constructions, and issues of ambiguity all agree: absolutely nothing.
Slow is an adverb. It has been used as such for years, for centuries even. Shakespeare, Milton, and Thackeray all used adverbial slow, so it’s even fine with the literary set and style manuals. You may resume drinking Dr Pepper if you so choose.
People are using hopefully correctly. Hopefully has two distinct usages, one a regular adverb meaning “in a hopeful manner”, and the other a sentence-modifying adverb meaning approximately “I hope” or “With any luck”. The latter usage has been unreasonably derided, because it is a sentential adverb and it is a new meaning for an old word. But neither of those complaints is valid, especially since…
The meanings of words can and do change over time. Hopefully isn’t the only word with a new-meaning stigma; prescriptivists often vilify words that have sprouted new meanings. But this is a very standard part of the English language. In fact, not only hopefully, but also of course, snack, naturally, enthusiasm, and quarantine have all changed their meanings over time.
You can eat healthy food. This meaning was fine for 300 years, and then Alfred Ayers came along and declared it wrong. Of course, it was he who was wrong, but his edict has stuck around at the edges of prescriptivism ever since.
I’m good is good. Every once in a while, someone gives me guff about my careful avoidance of the phrase I’m well when I am asked how I am. There’s nothing wrong with I’m well, but it isn’t what I mean to say. There is also nothing wrong with I’m good, and it is what I mean to say.
Between and among differ not in number, but in vagueness. The rule that between can only be used with two items, and among with more than two, is specious. The real tendency of English favors between when the connections are conceptualized as being specifically between individuals, and among when the connections are more vague and collective.
An invite is informal, but hardly wrong. It’s a minor point, of course, but the noun has been around for 500 years. I mention this post mostly because there was a great discussion in the comments about the psychology of prescription.
And from others:
Strunk & White’s The Elements of Style isn’t a good grammar reference book. From Geoff Pullum. While Strunk & White are able dispensers of style advice, they drop the ball in their grammatical advice, and unfortunately, that’s what people use them for. Pullum explains why the 50th anniversary of the book should have been met not with celebrations, but with shaking heads.
Choosing between which and that is more interesting than you’d think. It’s nearing five years old now, but Arnold Zwicky posted about his understanding of different contexts in which which and that can be used as relativizers in a relative clause. It’s much more interesting and rewarding than just saying that which is to be limited to non-restrictive clauses. It’s also much more accurate.
Want more debunked myths? 10 more are available on last year’s post! See why 10 items or less, different than, and alright are all right. Want still more, preferably in fewer-than-140-character chunks? Follow Motivated Grammar on Twitter.
[Update 03/04/2011: For National Grammar Day 2011, I've listed another 10 grammar myths, addressing topics such as Ebonics, gender-neutral language, and center around.]
[Update 03/04/2012: And again for 2012. Ten more myths, looking at matters such as each other, anyways, and I'm good.]