The other day, this blog post appeared in my Facebook feed with the title: “This Surprising Reading Level Analysis Will Change the Way You Write.”
Once you get past the clickbait title, it’s a pretty good post. The reading level analysis the post is talking about is called the Flesch-Kincaid readability test, a method developed in the 1970s for evaluating the difficulty of a text. Basically, it analyzes a text for complexity and assigns it a reading level.
Just for kicks, I fed a brief, randomly-selected chapter from my newest book, Whisper Blue, into the analyzer over at ReadabilityScore.com and clicked analyze. My score? A whopping 4.1. Fourth grade reading level. Ha! Shows what an erudite elitist I am! The analysis of Text Quality said I had:
—5 sentences with 30 or more syllables
—20 sentences with 20 or more syllables
—3 words with 4 or more syllables, and
—no words exceeding 12 letters.
And this, really, is ALL that Flesch-Kincaid analyzes. Long sentences and long words get you a higher grade level score; short words and short sentences get you grammar school scores. A reductive metric if ever I saw one, but Flesch-Kincaid was never meant to be more than a very rough guide. There’s really no judgment involved. It’s simply a crude measure of complexity. And you know who else wrote at a fourth grade level? Ernest Hemingway. How about Cormac McCarthy? Fifth grade. Likewise Jane Austen. Tolstoy? Fitzgerald? Stephen King? Yup. They all, apparently, wrote for middle schoolers.
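The arithmetic behind the grade is no secret: the Flesch-Kincaid grade level is just a weighted sum of average sentence length and average syllables per word. Here’s a minimal Python sketch of it; the syllable counter is a crude vowel-group heuristic of my own (real analyzers use pronunciation dictionaries), so the exact numbers will differ from any given site’s:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels, dropping a trailing
    # silent 'e'. Real analyzers use pronunciation dictionaries.
    word = word.lower()
    if word.endswith("e") and not word.endswith(("le", "ee")):
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def flesch_kincaid_grade(text):
    # Grade = 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)
```

Notice that nothing in the formula looks at meaning. Feed it short words in short sentences and the grade drops, no matter how subtle the writing is.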
Obviously, these sorts of grades tell us a lot more about the test than they do about the texts. And the whole thing points out some of the difficulties involved in using any kind of standardized approach to evaluating creative work. For example, the analyzer at ReadabilityScore.com also includes a tally of three of the best-known No-Nos all writers should avoid: use of the passive voice, adverbs, and clichés. My score was 7 passives, 1 cliché, and a whopping 68 adverbs!
Yoiks! I suck! Only, I don’t really. In the first place, much of the text I selected is dialogue. And normal human speech is riddled with adverbs and passive constructions, not to mention clichés. And in the second place, most of these adverbs weren’t bad adverbs, and most of the passive constructions weren’t even passive.
Example—the site flagged the following phrases as passive:
“maybe I shouldn’t be encouraging him.”
“Everything was going to be all right.”
“One minute I was dismissing the whole thing as mumbo jumbo, the next I was withdrawing a hundred dollars from the ATM, just in case.”
These aren’t passive. Apparently, when the search algorithm sees a construction like “was going” or “was dismissing” or “be encouraging,” it mistakes auxiliary verb constructions (here, the progressive, which expresses continuing action) for passive ones.
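My guess at what’s happening under the hood (an assumption about how such tools work, not the site’s actual code) is a pattern that flags any form of “to be” followed by another verb form. A sketch of that mistake, and of a slightly tighter rule:

```python
import re

# Naive pattern: any form of "to be" followed by a verb ending in
# -ing or -ed. This catches real passives but also progressives.
NAIVE_PASSIVE = re.compile(
    r"\b(am|is|are|was|were|be|been|being)\s+\w+(?:ing|ed)\b", re.I)

# Tighter rule: passive voice needs "be" plus a past participle, so
# "be" followed by an -ing form (progressive aspect) should not match.
# Still imperfect: it misses irregular participles ("was written")
# and flags predicate adjectives ("was tired").
STRICTER = re.compile(
    r"\b(am|is|are|was|were|be|been|being)\s+\w+ed\b", re.I)

progressive = "Everything was going to be all right."
passive = "The report was finished by noon."
```

The naive pattern flags both sentences; the tighter one lets the progressive through. Even the tighter rule is wrong often enough that no regex substitutes for actually parsing the sentence.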
As for my copious use of adverbs, here are a few examples (the site flagged there, probably, just, and now):
“There you go.”
“That’s probably exactly what happened.”
“That old fake had us jumping around like a couple of citified rubes, but it was all just a show!”
“You are not going anywhere three nights from now!”
Adverbs are not evil. Yes, it’s awful when writers overuse those dreaded -ly constructions, especially in dialogue tags. But adverbs expressing place (there) or time (now) really don’t get my critical dander up. I suppose probably is modifying exactly (adverbs can modify other adverbs), but then why didn’t they flag exactly, which is an adverb of manner? And I’m not at all sure which verb, adjective, or adverb just is supposed to be modifying in the third example.
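I suspect the adverb counter is doing something equally blunt: match any word ending in -ly, plus a fixed list of common non-ly adverbs, with no sense of what the word actually modifies. That’s a hypothetical reconstruction, not the site’s real method, but it reproduces the behavior:

```python
import re

# Hypothetical word-list flagger, sketched to show why blind matching
# overcounts; this is not the site's actual method.
COMMON_ADVERBS = {"there", "here", "now", "then", "just", "not", "very"}

def flag_adverbs(text):
    flagged = []
    for word in re.findall(r"[a-z']+", text.lower()):
        # "-ly" matching also catches nouns and adjectives
        # like "family" or "friendly".
        if word.endswith("ly") or word in COMMON_ADVERBS:
            flagged.append(word)
    return flagged
```

This flags there in “There you go” and just in “it was all just a show,” exactly as the site did, and it would flag family in “My family agreed” just as cheerfully. Counting word forms is not the same as judging how words function.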
Needless to say, this isn’t very useful as a writing tool, but there are a lot of sites like readability-score.com out there. Usually they let you try them out for free—paste some text and have them identify the alleged problems. Then, after you’ve used up your share of free samples, you can sign up and pay for membership. I’m not sure what readability-score.com charges for membership, but even as a free service, it seems slightly overpriced. And it isn’t simply that it misidentifies the passive voice—I’ve seen human editors do the same.
The problem is more one of attitude. Grammar and syntax have rules, but there are endless subtleties. Even if the website correctly spotted adverbs and passive constructions, the suggestion that adverbs are always wrong, or that the passive voice is always a weak choice, is simplistic at best. You should be aware of what you are doing at all times and make good choices, but following boilerplate suggestions for improving your prose is only going to produce boilerplate prose. Good writing is clear, evocative, and surprising—and no algorithm is going to make that happen for you.
(By the way, if you’d like to check it out for yourself, my paranormal thriller about voodoo and cyber-ghosts and the mass hysteria of crowds is available at Amazon, Kobo, and other places. I think you’ll like it. No matter what your reading level is.)