It’s a funny, auto-antonymous kind of fact that those who consider themselves ardent language lovers are quite often also language haters. Too often, prescriptive grammar myths are celebrated as an outward signal of virtue, class, and erudition, which can make some people fairly anxious about using their own language. Pet peeves are often inconsistent judgments about the so-called misuses and abuses of written conventions in the spelling, punctuation, style, and usage of standard language (none of which are actual elements of grammar). Many grammar lovers seem to expect language to be ever logical, meaningful, and efficient, and find themselves disappointed by the topsy-turvy, nonsensical language we hear in the wild.

Take, for example, the strange bunch of words we call intensifiers. They’re words of “fevered invention” that are used “for impressing, praising, persuading, insulting, and generally influencing the listener’s reception of the message.” They’re emphatic, hyperbolic, colorful words that are generally supposed to boost and amplify meaning, not take meaning away, and they do it with style: frightfully well; very cool; wicked smart.

Yet Strunk & White’s famed guide The Elements of Style has a less flattering view of poor qualifiers, those degree words that seemingly have lost their semantic way:

“Rather, very, little, pretty—these are the leeches that infest the pond of prose, sucking the blood of words.”

Put like that, they do seem rather semantically useless, because those words have little meaning on their own. But is a word leached of its lexical meaning really totally meaningless?

Language Changes. Literally.

For example, a strange thing happened to “very,” which used to mean “true” or “real,” as in the Middle English “He was a verray parfit gentil knyght.” Once it was regularly used as an intensifier, it somehow stopped being a meaningful word. Style guides might implore us not to use it (in standard written contexts at least), but it’s still linguistically relevant as one of the most popular intensifiers in both British and American English, according to a study by linguistics scholars Rika Ito and Sali Tagliamonte. Interestingly, there’s evidence that the use of “very” is declining among younger speakers, replaced by a very similar, upcycled intensifier that once littered early eighteenth-century letters (as well as internet memes of the mid-aughts): “really.”

This analogous semantic path, by which words meaning “real” and “true” later come to be used as intensifiers, perhaps links up with another famous, most viciously hated leech: the word “literally.” I seem to hear the distant gnashing of teeth just typing the word. Many people believe that, like the Highlander, there can be only one “literally”: the one that strictly means what it says, referring only to the exact conventional sense of words, e.g., “I am literally on fire” (i.e., I am actually undergoing combustion).

The fact that people can use “literally” not to mean anything in actual fact at all may literally make your blood boil, even if some of those people are esteemed authors such as Jane Austen, F. Scott Fitzgerald, Charlotte Brontë, Mark Twain, and James Joyce, whom we praise for their language skills when not scrutinizing their poor life choices in the realm of grammar. Considering the social stigma (despite the past centuries of this very same usage), many of the grammatically anxious among us prefer to avoid using “literally” altogether, which some arbiters of language style might point to as yet more evidence of its superfluity.

The danger is that if we allow for some fearsome, figurative reading of “literally,” then it’s a word—nay, an abomination—that becomes its own opposite. A world where things can be strictly true yet also only virtually true is a world we logically can’t have, lest, like matter and antimatter colliding, these dimensions destroy each other. Except that it doesn’t really matter, because this happens in English all the time. As we’ve already seen, language is not logical. Words can, over time, turn themselves inside out and become their own opposites: auto-antonyms like “cleave” (to stick together or to split apart), “dust” (to remove fine particles or to sprinkle them on), and “draw” (drawing the curtains can open or close them). We don’t generally have a problem with those words.

“Literally,” by the way—not to mention by the letter (the Latin “littera”)—comes to us already metaphorical at its heart. “Literal” and “literally” are not actually about the letters of the alphabet, even if we’re translating literally (i.e. word for word). In spirit, if not to the letter, “literally” is about letters standing in for knowledge—what we think of as the lexical meaning of words.

But that’s beside the point. Down with this sort of thing, the argument goes, if linguistic clarity, logic, and meaning have been lost, no matter whether the despised usage has a long, established history or whether there are plenty of writers (a.k.a. your problematic faves) who have ignored the strongly worded advice of their copy editors. We might note, though, that if F. Scott Fitzgerald had said of Jay Gatsby that “he really glowed” instead of “he literally glowed,” even though Gatsby didn’t, in real life, glow, we probably wouldn’t be in such a tizzy about it, largely because no one judges us for using “really” in this way.

Grammaticalization

This will likely not convince you if you simply have a visceral hatred of “literally” and how it’s often used. And that’s really okay. Language changes, as many of us know (but may not like). Words can die. At least, they sometimes lose their meanings so thoroughly that it’s like we’re looking into a dictionary of complete strangers—I don’t even know who you are anymore, whispers your neighborhood language pedant bitterly to these turncoat words. Yes, “literally” crazily ended up with an illogically opposite meaning that some people just find completely unacceptable. But it didn’t start out being used with the “wrong,” figurative meaning at all; it has long been used simply for emphasis, as an intensifier, just like “really” and “very.”

This happens through a remarkable process called grammaticalization, when lexical words living otherwise meaningful lives are suddenly forced to get a job as a part of grammar, often losing, changing, or adding meanings as a result. English scholar Paul J. Hopper uses the example of “Amy manages to get a salary increase every year,” where the verb “manage” acts more like an auxiliary verb than one conveying its conventional lexical sense of managing something, like an office. Similarly, “the fact that” no longer has to be followed by a true, factual statement (another popular pet peeve) but is often used syntactically as a complementizer, as in this real sentence that was not an awkward confession, no matter what you heard on the radio: “My opponent has charged me with the fact that I used illegal campaign funds.” So the fact that words may end up losing their meaning, or turn out to convey very little in the way of lexical meaning, doesn’t necessarily make them useless or bloodsucking leeches in the pond of your prose—indeed, they may have a more important grammatical part to play than we first expect.

Grammaticalization can happen through analogy, when groups of words with related meanings (like “real,” “true,” “literal”) undergo the same change. Metaphor certainly also plays a big part in changing how we view words and invent other uses for them. And when it comes to intensifiers, according to Ito and Tagliamonte, there’s a particular group of speakers who have historically been blamed or lauded (depending on whom you talk to) for being terribly good at inventing and popularizing intensifiers: women. In 1754, Lord Chesterfield noted that women had turned the word “vastly” into an intensifier, having overheard a lady weirdly describe a very small snuff box as “vastly pretty, because it is so vastly little.” (And people complain about “literally”!) Scholars of the past noted that ladies (like small children) “are notoriously fond of hyperbole.” In 1922, Otto Jespersen clearly linked the development of intensifiers with female speakers: “The fondness of women for hyperbole will very often lead the fashion with regard to adverbs of intensity.”

So this linguistic fashion of the ladies plays a part in how we get the different intensifiers we all use today, in standard language as well as in slang. Oddly, modern intensifiers can often be bad words, made frightfully, terribly, horribly, awfully good. How did this trend of bad words becoming good intensifiers come to be?

When Bad Words Go Good

Curiously, bad words can often develop good meanings (amelioration) and vice versa (pejoration). Although amelioration is generally the rarer of the two, it certainly seems to happen a lot in slang these days, when strongly marked negative words like “bad,” “sick,” “ill,” “dope,” “wicked,” and of course “the shit” (not to be confused with its plural, because less is more) for some reason acquire more positive senses to mean “cool” in informal usage. Linguists encounter a lot of wistful regret for those halcyon days gone by when “awesome” used to mean terrifyingly awe-inspiring, whereas now it can be used in passing to mean something blandly good. It’s awfully confusing for a lot of people.

The use of certain intensifiers is a major signal of group identification. Intensifiers are constantly changing. They’re often initially very strong words that work well to express emphasis or hyperbole. The most popular intensifier in Old and Early Middle English was “swīþe,” which meant “strong” and then came to mean “extremely/very” before it was completely replaced in the thirteenth century by other intensifiers such as “well,” “full,” and “right,” some of which are still in use in certain dialects today. Other intensifiers are words that also convey semantic extremes, such as “perfectly,” “absolutely,” “completely,” and “utterly.”

But familiarity breeds contempt. As these strong words are used in weaker contexts, or more widely outside of the group in standard language, they become less useful as an indicator of the group. The more familiar the words, the less expressively emphatic they are, and new styles and trends need to be found. As words lose their potency, the new ones not only have to be strong, emphatic, and extreme, they also have to be original and attention-grabbing, perhaps even shocking. So it’s not surprising that negative words, some of them even taboo, like “terribly,” “horribly,” “dead,” “wicked,” “mad,” “crazy,” “bloody,” and of course “fucking,” can develop into informal intensifiers. It’s dead weird.

In the end, we all have our linguistic preferences, and that’s perfectly valid as long as we understand it comes down to personal taste. If we only consider correct the strict and sterile rules laid out by the guardians of linguistic taste, those gatekeepers who tidy up the language of everyone except those already notable enough to break the rules, we may just miss out on the richness, versatility, and inventive fun of the language of the people. Even bad words have a chance to be good.

Resources

JSTOR is a digital library for scholars, researchers, and students. JSTOR Daily readers can access the original research behind our articles for free on JSTOR.

Language in Society, Vol. 32, No. 2 (Apr. 2003), pp. 257-279. Cambridge University Press.
Annual Review of Anthropology, Vol. 25 (1996), pp. 217-236. Annual Reviews.
Language, Vol. 67, No. 3 (Sep. 1991), pp. 475-509. Linguistic Society of America.
American Educational Research Journal, Vol. 6, No. 2 (Mar. 1969), pp. 271-286. American Educational Research Association.