
A few posts ago, I was talking about the sentential-modifier meaning of hopefully, or, in non-linguist speak, the hopefully in this sentence:

(1) Hopefully I’ll be able to escape from the dungeon this afternoon.

This is not the original meaning of hopefully, which was “in a hopeful manner”. Although the original meaning seems to have lost prominence in recent years (and has almost completely fallen out of at least my lexicon), it’s still in use:

(2) “‘A whip isn’t a weapon,’ he replied hopefully.”

But as soon as the perception arises that a new meaning is edging the old one out, prescriptivists see the word as a battleground for the language, and lift their skinny fists like antennas to heaven, crying out for someone to aid them in their quest to return the word to its original, unsullied state. And you know what? On its face, that might seem like a reasonable stance; after all, we don’t want to open the floodgates and allow any word to mean anything, right? From there it seems a slippery slope to the Humpty Dumpty position on language, named for the following exchange in “Through the Looking-Glass”:

“I don’t know what you mean by ‘glory,’” Alice said.
Humpty Dumpty smiled contemptuously. “Of course you don’t — till I tell you. I meant ‘there’s a nice knock-down argument for you!’”
“But ‘glory’ doesn’t mean ‘a nice knock-down argument,’” Alice objected.
“When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean — neither more nor less.”

But this slope is not nearly as slippery as prescriptivists would have you believe. There is a world of difference between Humpty’s singular declaration that glory means “a nice knock-down argument” and the acknowledgment that a meaning that has been in common usage for almost 80 years (sentential hopefully) is by now a proper meaning of the word. Maybe you don’t believe me, or still don’t feel entirely comfortable with new meanings. I wouldn’t blame you; the underlying conviction that a word’s original meaning is its one true meaning is so common that it has a name: the etymological fallacy.

So let’s look at some examples of words whose common and well-accepted meanings are really quite different from their original meanings. None of these meanings, as far as I’m aware, is controversial. They all represent substantial changes from their original meanings. And the English language has not fallen into whateverism as a result. Keep these in mind the next time you’re about to object to a newer usage just because it’s new, whether it be hopefully, anxious, nauseous, or something else entirely. All the definitions are based on the Oxford English Dictionary, 2nd edition, online version (http://www.oed.com). And many thanks to the commenters on the earlier post, who offered suggestions for some of the best words below.


Back in high school, I used to read etiquette guides. In fact, I read them and took extensive notes, because I was going to be somebody, and somehow I got the idea in my head that impeccable etiquette was a crucial part of that.  It was a simple error I’d made, mistaking a need for “good manners” for a need for “good etiquette”.  I worked on this for probably two or three years, and now I can’t tell you a single rule I read out of an etiquette book.  Why?  Because there was absolutely no discernible method or pattern to the rules of etiquette.  In search of a pattern, I even studied the history of etiquette guides in college, spending Saturday afternoons up on the third floor of the University Library pulling out books that hadn’t been borrowed since 1943, containing advice on the use of calling cards and the place of good etiquette in a world of horseless carriages. I certainly enjoyed it, but I’d be reluctant to say I really learned anything.

I was reminded of this period of my life when, at the used bookstore, I chanced upon a copy of Miss Manners’ Guide to Excruciatingly Correct Behavior.  I opened up to a random page in the middle, and was happy to find out that it was about eating, something I happen to know a little bit about.  Luckily, this portion of the book is available online, so you can follow along! The question posed to Miss Manners is a simple one:

“How do you eat spaghetti with a [fork and] spoon?”

To which Miss Manners icily replies that eating spaghetti with a fork and spoon is “outrageous”, because

“A fork is the only utensil that may be used to eat spaghetti while anyone is looking.”

Miss Manners delves slightly into the details of how to eat spaghetti using only the fork — plant the fork on the plate, twirl, present to mouth — and closes with a tart reprimand that allowing the ends of the spaghetti to fall back onto the plate after a bite would be unthinkable, and that the only acceptable solution is to slurp the remnants into your mouth.  (Of course, she doesn’t use the proletarian term slurp, but rather dances around it by suggesting that the eater inhale.)

To this, an agitated reader responds:

“[Your proposed method is] Proper, perhaps, for a Roto-Rooter man. The correct way to eat spaghetti is with a fork and a soup spoon. […] One cannot eat spaghetti properly without a soup spoon.  Shame on you.”

(Why, by the way, a soup spoon?  Why is there no such restriction on the fork?)  And, of course, Miss Manners replies with this convincing counter-argument:

“In the civilized world, which includes the United States and Italy, it is incorrect to eat spaghetti with a spoon.  The definition of ‘civilized’ is a society that does not consider it correct to eat spaghetti with a spoon.”

[Image: two dandies conversing. Caption: “I say, Eustace! That man eats spaghetti, yet he uses a spoon!” “Edward, you have espied a true scoundrel!”]

So, to recap, the entire debate consists of three points: 1) Miss Manners asserts that using a spoon is unacceptable; 2) A reader asserts that not using a soup spoon is unacceptable; 3) Miss Manners counters that using a spoon (any sort) is uncivilized.

It is a fruitless argument in which both sides insist that the boundary of acceptability is exactly where they say it is, without a single piece of evidence in their favor, each implying that their claim is self-evidently true, heedless of the fact that their opponent considers it self-evidently false.  No evidence is given, no argumentation advanced, nothing.  And then, just in case you couldn’t pick up on the subtle connection I’m trying to make between etiquette mavens and language mavens, Miss Manners underscores the point by changing the very definition of a word (civilized) so that it appears to support her claim. (Much as grammaticasters misuse educated to mean “agreeing with me”.)

I think you can see why I stopped analyzing etiquette advice in my free time. But, why, again, did I replace it with analyzing arguments over grammar?  The evidence presented here suggests it is because I am stupid.

Many grammarians go about their days maligning ambiguity. Don’t use while when you mean although, they say, because it’s ambiguous. Don’t use since in place of because either, they say. And so on. If they were right, then everyone would be confused by these two sentences:

(1a) Since I eat the right foods in the right combinations, I’m not focused on calorie restriction.
(1b) The Oscar-winning director tells the story of Venezuela’s “peaceful revolution” since Chavez came to power in 1998 […]

But people aren’t confused, because the clauses readily disambiguate since. (1a) uses the present habitual I eat, which prevents the “ever since” meaning from making sense. (1b) uses the simple past came, which would allow for either meaning, but there’s nothing in the sentence that a “because” clause could attach to, so the “ever since” meaning is the relevant one.

In general, these concerns about ambiguity are actually concerns about potential ambiguity: cases where someone intentionally misreading the sentence, or not paying much attention to it, could get the wrong meaning.* These situations usually don’t result in actual ambiguity for reasonable readers. That’s not to say there are never real ambiguities, only that they’re usually much less of a problem than prescriptivists claim. Consider (2), where since really could go either way:

(2) In a second term, Carter might have moved the course of government toward the left, but since Reagan won the election the nation’s political movement has been toward the right instead.

When it’s important that the reader get exactly the meaning you intend, it’s worth the effort to remove ambiguity, and at those times you’d want to, for instance, replace since in (2) with because or ever since. When the distinction is either obvious or unimportant, there’s no reason to change anything. The problem is that trying to make language completely unambiguous often comes at the cost of readability and comprehension:

(3) Upon such default, and at any time thereafter, Secured Party may declare the entire balance of the indebtedness secured hereby, plus any other sums owed hereunder, immediately due and payable without demand or notice, less any refund due.

That’s legalese: an officiously precise form of the English language that is borderline incomprehensible to those not trained in its tortuous wendings. Although there is little ambiguity in (3), it’s very difficult to extract the meaning, and the sentence seems bloated. But just try shortening or clarifying the above sentence without re-introducing an ambiguity, and you’ll see the difficulty: languages are not built for precision. In fact, ambiguity in language is not a bug, but a feature. This is a point nicely summarized by Frederick Newmeyer in a paper that I otherwise disagree with heartily, Grammar is grammar and usage is usage (PDF):

“The transmission rate of human speech is painfully slow […] less than 100 bits per second—compared to the thousands that the personal computer on one’s desk can manage. A consequence is that speakers have to pack as much as they can into as short a time as they can, leading to most utterances being full of grammatical ambiguity […] For that reason, humans have developed complex systems of inference and implicature, conveyed meanings, and so on. […] Stephen Levinson phrased it beautifully: ‘[I]nference is cheap, articulation expensive, and thus the design requirements are for a system that maximizes inference’ (Levinson 2000:29).”

[Emphasis mine.] Ambiguity is useful: an ambiguous sentence can convey the necessary information just like an unambiguous one, but in fewer words. The reader, listener, or whoever you’re directing your language to is then able to use their knowledge of context and implicatures to determine the appropriate interpretation (this is the “inference” process). A great example of this (again from the Newmeyer paper, but originally from Martin, Church, & Patel 1987) is (4), which has 455 possible parses, many of which yield different meanings.

(4) List the sales of products produced in 1973 with the products produced in 1972.

And yet, given a bit of context, and some knowledge of what one is trying to do in the situation in which this sentence is uttered or written, you can pretty quickly figure out which potential meaning is best. Trying to make the sentence perfectly unambiguous would only drown the reader in words.
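If you want to watch the parses multiply for yourself, here’s a minimal sketch using Python’s NLTK library (assuming you have nltk installed; the toy grammar and sentence are my own illustrative inventions, not the grammar behind the 455-parse figure). Because each prepositional phrase can attach either to a noun phrase or to the verb phrase, a chart parser finds several distinct trees for a single string:

```python
import nltk

# A toy grammar where a prepositional phrase (PP) can attach to either
# a noun phrase or the verb phrase -- the classic source of structural
# ambiguity in sentences like (4). (Illustrative only.)
grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    VP -> V NP | VP PP
    NP -> Det N | NP PP | 'I'
    PP -> P NP
    Det -> 'the'
    N  -> 'sales' | 'products' | 'year'
    V  -> 'list'
    P  -> 'of' | 'in'
""")

parser = nltk.ChartParser(grammar)
sentence = "I list the sales of the products in the year".split()

# Enumerate every tree the grammar licenses for this one string.
parses = list(parser.parse(sentence))
print(len(parses), "parses")
for tree in parses:
    print(tree)
```

With only two prepositional phrases, this toy already yields a handful of trees; each additional modifier multiplies the count, which is how an unremarkable-looking sentence like (4) climbs to 455. Human readers prune almost all of those trees from context without ever noticing.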

Summary: Pick your battles against ambiguity. Where ambiguity is truly detrimental, put forth the effort to clarify, to root out plausible ambiguities and remove them. Where ambiguity is tolerable, it can be better to leave it in to keep from exhausting yourself and your audience.

[If you’re interested in more on potential vs. effective ambiguity, Arnold Zwicky had a post on Language Log from 2008 discussing this topic. Now that I look at his post again, I’ve realized that most of what I said here, he already said there, plus more.]

*: I know some of you in the audience are editors, and a few editors have explained to me that their job consists in part of idiot-proofing writing. That means making the text as easy on the reader as possible, and assuming that the reader will fall into whatever garden paths and other meaning pitfalls are available. Removing potentially ambiguous wordings might be seen as a step in this task. That’s a fair counterpoint, but it doesn’t compel a change by itself; each change must still be weighed against its costs. Avoiding an ambiguity that requires the reader to wantonly misinterpret is less crucial than avoiding one that’s easy to fall into.

What does momentarily mean?  It’s a bone of contention for many prescriptivists, who insist that it must mean “for a moment”, not “in a moment”.  It’s a common enough debate to have appeared in an episode of Sports Night, when Dana (the show’s executive producer) begins discussing this point in the midst of preparing for that night’s show. (Video of the exchange here, if the embedded bit below doesn’t work.)

DANA: Momentarily does not mean “in a moment.”
DAVE: Here’s 2 dissolving to 3.
DANA: Thank you. It means “for a moment.”
JEREMY: Yes.
DANA: That makes me crazy.
JEREMY: We’ve been wondering what the source was.
DANA: Let’s see a graphic for Seattle.
CHRIS: Coming.
DANA: It means “for a moment,” not “in a moment.”
CHRIS: Seattle’s up.
DANA: On the plane when they say “We’ll be landing momentarily,” I call over a flight attendant, and I tell them, “if we land momentarily, it won’t give the passengers enough time to get off the plane.”
JEREMY: And once safely inside the airport, how long do they usually detain you for questioning?
DANA: Well, they know me by now.

But is Dana correct?  If Sports Night had been set in the 1830s, she might have been.  But in our modern world, she is not.

Let’s go through a quick history of momentarily, from the Merriam-Webster Dictionary of English Usage.  Momentarily is first attested in 1654, with the “for a moment” meaning.  Two other meanings, “instantly” and “at every moment”, popped up in the 18th century.  The newest meaning, “in a moment”, is first attested by the Oxford English Dictionary in 1869.  Interestingly, the MWDEU notes that momentarily — with any of these meanings — was used only rarely until the early 20th century, when its usage picked up.  The “for a moment” sense became popular first, and the “in a moment” sense followed shortly thereafter.  (The other two meanings never hit the big time.)

This popularity lag is probably the source of the modern concern that “for a moment” is the more original, more pure sense, and “in a moment” the interloper.  It doesn’t help that the “in a moment” meaning is “chiefly North American” (according to the OED), which prescriptivists generally interpret as meaning “a boorish American misusage”.  But the truth is that both meanings are more than 140 years old.  If you’re concerned about ambiguity, take heart in the fact that it’s unlikely that the two meanings will be confused:

(1a) You will be sent to the new Environmantal [sic] Laboratory site momentarily or you may click here.
(1b) […] the Pacific breezes momentarily gave way to a brisker wind.

That’s not to say that they could never be confused, because they can if you leave out the context:

(2a) I will visit your house momentarily (as I’m only a few blocks away).
(2b) I will visit your house momentarily (since I have to hurry to another engagement).

But context usually offers the necessary disambiguation. And if you were really that concerned about ambiguity, there’d be a lot of words more common than momentarily that you’d have to avoid. (For instance, did my use of common in the last sentence mean “not rare” or “undistinguished”? I don’t know myself.)

Lastly, both usages are accepted as standard by the MWDEU and the Columbia Guide to Standard American English. That said, “in a moment” isn’t without its detractors; the American Heritage Dictionary’s usage panel is a holdout, with only 41% of the panel accepting it. But all that means is that 59% of the panel is uninformed.

Summary: Momentarily can mean either “for a moment” or “in a moment”.  Both meanings are over 140 years old, and both date back to before the word momentarily was common.  Allowing for both meanings doesn’t introduce much ambiguity.
