Wine scores, and why they don’t work (still)
Hedges CMS White is the kind of wine we don’t see enough of in the United States — a well-made, everyday wine at a reasonable price. It’s a blend of sauvignon blanc, chardonnay and marsanne, and I’ve been drinking it (along with its companion, CMS Red) for years. The suggested retail price is $14, but it usually costs less than that, and I picked up a previous vintage over the weekend for $10.
The wine was as crisp and fresh as always, and even a little more interesting than usual, with more of a stony finish and less fruit (probably because it was a year older). Is it white Burgundy? Nope. But, as the Wine Curmudgeon always points out, it’s not supposed to be.
Which brings us to the Hedges’ entry in CellarTracker, the blog’s unofficial wine tracking software. One of CellarTracker’s most fun features is the public tasting notes, where you can see what other people have to say about your wine. The public notes also encourage wine scores, which I tolerate for two reasons: first, because the software is so good otherwise, and second, because it allows me to point out the fallacy of wine scores.
One drinker gave the CMS White 77 points, the kind of score reserved for a wine you wouldn’t even use to get drunk. Two others gave it 90 points, which is about as good a score as this style of wine is going to get. But my favorite score was an 86, from a person who noted that the wine was “over the hill and starting to fall apart.” Yet, somehow, it still got an 86, which is a fine score for almost any kind of wine.
Those four scores demonstrate everything that is wrong with wine scores. Here are four people, drinking the exact same thing, who come to almost completely different conclusions. And one person, who finds the wine flawed, still gives it a high score. How can anyone reading this be anything but confused? And yet wine scoring is considered the be-all and end-all of wine criticism. It’s enough to make the Wine Curmudgeon take to his wine closet and refuse to come out.
Please note that I’m not criticizing the people who gave the scores. Each is entitled to his or her opinion. I’m criticizing the system that forces them to give scores. They have different palates, different approaches to wine, and different ideas about what the wine is supposed to taste like. The 77 person described the Hedges as bland, which I can see if he or she prefers heavily oaked, high-alcohol California wines. If, on the other hand, you prefer lighter, less extracted wines, as I do, then the wine isn’t bland. It is (as I wrote) crisp and fresh.
At best, as this exercise demonstrates, scoring is wildly inconsistent. At worst, it’s even more confusing than no score at all. Look at the Hedges scores in CellarTracker, and tell me how anyone can decide to buy the wine based on what the scores say.
Image courtesy of Jacksonville Wine Guide, using a Creative Commons license