Some of these words might make you beet red. But are the people who use them always misinformed, or is there more to the story?
Whether we like it or not, the English language is changing and will continue to change. Even though it descended from Germanic ancestors with a great many rules (anyone who has slogged through German grammar knows the type), we’ve all but done away with many of them for the sake of convenience.
Sometimes, we add rules about how a word should behave—what it means, how to make it plural—only to have that word change entirely. But languages are defined by their speakers. So no matter how “wrong” the new change might seem, we just have to grin and bear it. Or argue to the ends of the Earth to stand on the side of correctness. That works, too.
This might seem a little vague, so we’ll look at some examples. Here are 10 controversial words that might not be as wrong as you think.
10 Octopi

It’s a common thought, and even a common teaching, that “octopi” is the correct plural. There’s a historical reason for that. But the most correct plural isn’t really “octopi” or “octopodes,” whatever certain socially inclined grammarians (the kind who haven’t learned that etymology makes poor small talk) might tell you.
The most “correct” plural of “octopus” is actually “octopuses.”
This is because the word “octopus” is not Latin, so the correct plural isn’t “octopi.” The word isn’t Greek either, though, so it isn’t “octopodes.” The word “octopus” is actually a relatively recent Latinized Greek coinage, attributed to Carl Linnaeus, the man we call the “father of taxonomy” (the fancy system for naming living things), in the 1700s.
Given that somewhat mixed origin, it’s far safer to use the standard English system of pluralization rather than turn to the Greek or Latin.
On the other hand, these plural forms aren’t entirely incorrect either, just more controversial. The word “octopus” could be thought of as a descendant of the Latin word polypus, whose plural is polypi. So from a certain perspective, “octopi” could be correct.
And Linnaeus himself used the plural octopodes, even while clearly understanding polypi to be the plural formed from the original word. It seems no matter which plural form you take, somebody, somewhere, will think to correct you for it.
9 Ornery

The word “ornery” is a different beast. Though it’s quite difficult to spell, it hasn’t picked up an unconventional spelling, and the differences in pronunciation are insignificant. Instead, this is a word that has taken on a different, regional definition that isn’t listed in most dictionaries.
When using this word, many people from the South or Midwest in the United States imagine that they’re describing a rowdy or mischievous child. It could be thought of as an endearing or affectionate term.
But this word can also cause a bit of a cultural clash. It’s often used negatively to refer to somebody who is disagreeable, stubborn, or easily upset. There are examples from across the Internet’s history of people discovering the word’s “true” meaning and then heading to forums and other discussion sites to talk about their experiences.
Although major dictionaries don’t define this alternate use of the word, some sites, like Wiktionary, do. Wiktionary isn’t the most credible authority on any subject, though, so it may be a while before we see the popular regional variant of “ornery” officially recognized.
8 Literally

The misuse of this word is a common complaint. For some who use it, it’s a verbal tic, as in “I literally can’t even right now.” For others, it’s a form of hyperbole, as in “I feel literally on top of the world right now.” But whatever the case, the complaint goes, “literally” should never be used to convey anything except an entirely, unquestionably, literally true claim.
Or that would have been true six years ago, anyway. The word “literally” was used so often in a figurative sense that the Oxford English Dictionary literally added that definition in 2011.
What naysayers of this sense of the word may find interesting is that “literally” has been used as a figure of speech for over 250 years. Even Mark Twain used the word in its more cringeworthy sense in The Adventures of Tom Sawyer.
Despite its addition to the dictionary, “literally,” as used in a figurative sense, is still not standard English. It would be a bad idea to include that usage in a dissertation, but only those who enforce standard English on the general population would correct you for using “literally” in a casual environment.
7 Ain’t

This list wouldn’t be complete without the word “ain’t,” even if it’s fairly common knowledge that “ain’t” very much is a word. Less commonly known is that “ain’t” has appeared in dictionaries for over 250 years, albeit as “slang,” and became a proper, non-slang (but still nonstandard) word in 1993. So the old adage, “ain’t ain’t a word and I ain’t gonna say it,” isn’t entirely accurate.
Actually, “ain’t” evolved from an earlier contraction, “an’t” or “amn’t.” For a good portion of English history, the word was used unironically, even by writers like Tennyson and Swift.
Then the word took on too many uses, also representing “is not” and then “has not.” After that, a wave of prescriptivists of the 18th and 19th centuries—people who promoted certain grammatical practices while discouraging others—decided that “ain’t” had become an abomination, a sort of chimera. These groups of people are also part of the reason that contractions, in general, are discouraged in essays and more formal writing.
Today, “ain’t” is an acceptable informal word. Literally everybody in the English-speaking world knows what it means, and there’s not much reason to guilt people for using it. It’s doubtful that it will ever lose its stigma in formal circles, though.
6 Alright

The most common “educated” reaction to this word is to mark it with a red pen and tell the writer to update their spell-checker. But interestingly, this spelling has some backing, even though more formal circles will slap a writer on the wrist for using it.
The combination of the words “all” and “right” into a compound word has been done throughout history. Even in Old English, we had the compound phrase eallriht. One of the most popular songs by The Who is “The Kids Are Alright,” and authors like James Joyce used the word, too.
Despite that history of use, the word often draws a scowl from editors, many of whom hate it. Using the word “alright” in writing is an act of rebellion. “There’s simply no need for two words that serve basically the same function,” they would say.
It’s alright to use the word, then, but only for those who like to live dangerously. It may not be standard now, but there is a push to make it so.
5 Hopefully

The most common meaning of this word slipped into the English language somewhat quietly. At one point, the word referred to an action that was done in a “hopeful” manner. A person could look on a situation hopefully or look at the sky hopefully. But they could not say, “Hopefully, this situation will end favorably for me.” Surprisingly, the change to the word was first documented as late as the 1960s.
But even though AP style finally accepted the word in this sense, a select few still hold onto the idea that it isn’t correct. The reason for the controversy is that the meaning of “hopefully,” when used as a sentence adverb, can’t be directly understood from the sentence. “Hopefully” does not translate to “it is hopeful that.” “Hopefully” instead modifies an implied subject, translating to “I hope that” or “we hope that.”
However, this argument against the word has always been tenuous. Words like “sadly” or “thankfully” work the same way, with the meaning simply implied from the context. Despite their similar construction, nobody bats an eye at their use, possibly because “hopefully” in its new sense is such a recent inclusion while words like “sadly” predate it by quite a long time.
4 Irregardless

Defending this word is certainly not a popular thing to do. Any writer who did so would surely have to issue this type of disclaimer: “I’m sorry in advance. Please don’t send a lynch mob after my family, and please keep the comment section civil.”
The word’s origin is unclear, but it’s probably a blend of “regardless” and “irrespective,” an attempt to sound educated that ultimately failed. Irregardless, the word has been around for quite a while, first appearing as far back as 1795. It’s one of the more controversial words in our language, and despite much insistence to the contrary, it is in fact a word, with its own dictionary entry and everything.
Like “hopefully,” though, the meaning of “irregardless” doesn’t follow literally. The “ir-” and “-less” both mean the same thing, forming a kind of redundant double negative that should technically mean the opposite of what is intended.
Chances are, if it ever gains acceptance as anything other than nonstandard, it will be as an idiom. It wouldn’t be the first time a word or phrase turned figurative thanks to a linguistic misunderstanding. (See also “I could care less,” another linguistic irritant that’s gained traction over the years.)
3 Peruse

This word frequently appears on lists of words you’re probably using incorrectly, but there’s more to the story than that. Like “literally,” “peruse” is host to a controversial double meaning due to common misuse.
Unlike “literally,” the new definition addressing the “misuse” only cropped up recently, sometime late in the last century. A small handful of dictionaries added definitions in the ’80s and ’90s in response, but not every dictionary agrees with the new meaning yet.
The primary meaning (paraphrased) for the word “peruse” is “to read or look at carefully or thoroughly.” Its original meaning is synonymous with a word like “examine” or “inspect.”
By contrast, the new meaning for the word is “to skim, to look through in a casual or selective manner, or to look at.” These definitions, upon close inspection, are opposites, with one suggesting a thorough examination and the other a few simple glances.
It’s like the difference between reading Harry Potter and On the Origin of Species. One perusal involves a close look into the sociopolitical framework of the wizarding world, while the other probably involves Wikipedia and a lot of caffeine.
Some would say that this dual meaning is actually harmful and could cause confusion. On the other hand, a word with two potential, opposite meanings is not altogether unheard-of.
It’s called a contronym, an autoantonym, or a Janus word, and these words can often reflect the more fun or interesting aspects of our language. “To bolt” can mean “to run away,” but it can also mean “to hold in place.” “Fine” can mean “perfect,” or it can mean “only adequate.” “Sanction” can mean “to approve” or “to penalize.” It might be fun trivia, depending on whether a game of Scrabble sounds like a good time to you.
The original confusion may have arisen because “peruse” sounds like “browse” or because the prefix “per-” can mean either “through” or “entirely.” In the original definition, it’s the latter. “Peruse” literally translated to “to thoroughly use” or “to exhaust”—as in “to look at every aspect of something.”
The confusion could have also happened due to simple ignorance. People might have heard the word and thought it meant “to read” rather than “to examine.” Then they started using the word to sound more intellectual, not realizing that they were creating an entirely new definition.
2 Fulsome

Of the words on this list, “fulsome” is probably used the least. That infrequency is probably one of the reasons that people use it in a “wrong” sense when they use it at all.
By far, its most common use is in the expression “fulsome praise,” which would seem like a good thing. The word sounds positive—drawing mental associations to “full” and “wholesome.” At one point, this was exactly what the word meant.
Then Samuel Johnson, considered the father of the English dictionary, went and goofed everything up. He and Noah Webster thought that the word “fulsome,” which had held a mostly positive connotation for hundreds of years (meaning “copious” or “abundant”), drew its roots from the word “foul.” In their defense, they were probably only cataloging how the word was used in their lifetimes, and the Old English words ful (essentially “foul”) and full do look similar.
In any case, “fulsome” gained its negative connotation. However, many have fought and continue to fight for its original use, even as far back as 1868.
The word simultaneously retains both definitions, and even former President Barack Obama has used it in the much older, positive sense. He came under some scrutiny for that, and for misusing “enormity,” a word that just missed this list. “Enormity,” too, may take on an entirely new definition because people incorrectly associate it with size.
“Fulsome praise” usually means “disgustingly over-the-top and insincere praise,” but some have assumed its meaning to be more positive. Historically, they’re not entirely wrong.
1 Self-deprecating And Deprecate
The holdouts against this word are few and far between, but they do exist. In an archaic sense, “to deprecate” meant “to pray against or ward off” or “to disapprove of earnestly.” A word like “self-deprecating” was once viewed as incorrect; the “correct” word was “self-depreciating” (meaning literally “to lower the value of yourself”).
But “self-depreciating” is falling into disuse, kept alive largely by people who want to maintain a notion of correctness because they view “self-deprecating” as wrong. They are right that the meaning of “self-deprecating” does not literally follow, but even standard English has come to acknowledge the word as proper.
The other place where this controversy emerges is in the software industry, where the word “deprecate” has come to mean “to mark as obsolete and discouraged from use.” This meaning has also been accepted, but some say it doesn’t logically follow, as commenters on programming blogs occasionally point out.
For whatever reason, the term “deprecate” became associated with computers, even when the association doesn’t logically follow. “Surely, when a feature becomes obsolete,” some argue, “it makes more sense that it’s been ‘depreciated,’ not ‘deprecated.’ ” (Even this text editor denies that “depreciated” is a word.)
A big part of why “deprecate” has historically been used where “depreciate” might make more sense is likely that “deprecate” is simply easier to say and rolls off the tongue. In any event, software developers aren’t likely to stop using it, and “self-depreciating” is far less popular than “self-deprecating.”
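For the curious, here is a minimal sketch of how the software sense of “deprecate” looks in practice. The function names old_greet and greet are invented for the example; the sketch assumes Python, where the standard warnings module is the usual way to flag a deprecated function without breaking existing callers:

```python
import warnings


def greet(name):
    """The current, supported function."""
    return f"Hello, {name}!"


def old_greet(name):
    """Deprecated: kept for backward compatibility; use greet() instead."""
    warnings.warn(
        "old_greet() is deprecated; use greet() instead",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this function
    )
    # The deprecated function still works; it just nudges callers to migrate.
    return greet(name)


print(old_greet("Ada"))  # still returns "Hello, Ada!" while emitting a warning
```

In other words, a deprecated function isn’t broken or removed; it’s marked as disapproved of, which is exactly the older sense of “deprecate” at work.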
Gannon Kendrick earned his English degree in 2014 and has since then studied animation, programming, music, and game development, intending to one day tie it all together.