
Tag Archives: wine scores

Wine Spectator: If you can’t buy it, we won’t review it

Wine writing, and what's wrong with it

The Wine Spectator, in a stunning reversal of policy, announced today that it will only review wines that people can buy, ending a decades-long practice of critiquing wine made in such small quantities that there were never any for sale.

“Frankly, when we started to think about it, it seemed kind of silly to review wines that weren’t in stores,” said a magazine spokeswoman. “Yes, there was a certain cachet to reviewing wines in the Spectator where the producer only made three cases, because it showed how much better we were than everyone else. Because we are much better than everyone else. But, in the end, we are a wine review magazine, and if our readers can’t buy the wines we review, there isn’t much reason for us to exist, is there?”

The new availability policy, said the spokeswoman, was based on the one used by legendary Internet blogger Jeff Siegel, the Wine Curmudgeon. Siegel, who declined to be interviewed for this story, uses what he calls general availability: He only reviews wines that consumers can find in a quality wine shop in a medium-sized city. Said the spokeswoman: “Considering how much fun he makes of us, and that he has no credibility because he is an Internet blogger, Siegel’s policy seems quite practical. Just don’t tell him we stole it.”

Reaction from the wine world was immediate:

• A host of cult wines in the Napa Valley, whose production rarely exceeds 100 cases each, announced plans to increase the amount of wine they make so they can be reviewed. “If we’re not in the Spectator, what’s the point of making wine?” asked one winery owner, a Silicon Valley zillionaire. “It’s not like I care about the wine. I just want my friends to be jealous when they see my wine, which they can’t buy, got a 99.”

• Several other wine magazines said they would follow suit, although the Wine Advocate said it would use availability in China as its threshold. “Listen, when you pay as much money for the Advocate as we did,” said a co-owner, “you really don’t care if anyone can buy the wine in Omaha.”

• The country’s largest retailers, including Costco and Walmart, made plans for special Wine Spectator sections in their wine departments, now that the Spectator would review most of the wine that they carry. “They’re already selling some wine for us with their scores and shelf talkers,” said one retailer. “So why not just get rid of the pretense and let them do all the work?”

More April 1 wine news:
Supreme Court: Regulate wine writing through three-tier system
Gov. Perry to California: Bring your wineries to Texas
California secedes from U.S. — becomes its own wine country

 

Chateau Bonnet Blanc and why scores are useless

winerant

Chateau Bonnet is the $10 French wine that is one of the world’s great values and has been in the Hall of Fame since the first ranking in 2007. It has always been varietally correct, impeccably made, cheap, and delicious. The 2012 Bonnet blanc, which I had with dinner the other night, made me shake my head in amazement. How could a cheap white wine that old still be so enjoyable?

What more could a wine drinker want?

A lot, apparently, if a couple of the scores for the 2012 on CellarTracker (the blog’s unofficial wine inventory app) are to be believed. The Chateau Bonnet blanc scored 80 points from someone who said the label was ugly and 83 points from a Norwegian, and that a Norwegian was using points shows how insidious scores have become.

The irony is that the tasting notes for the low scores were quite complimentary. The 80-point review mentioned “crisp dry tones and pleasant blend of melon flavours,” while the 83 described herbs, minerals, and citrus; neither noted any off flavors or flaws. Yet, given those scores, the Bonnet blanc was barely an average wine, hardly better than the grocery store plonk I regularly complain about on the blog.

Which it’s not. Those two wine drinkers are allowed to score the wine as low as they like, and they’re allowed to dislike it. That’s not the problem. The problem is the lack of consistency: someone else gave the Bonnet blanc a 90, citing minerality and lime zest, mostly the same description as the low scores. Yet a 90 signifies an outstanding wine. How can a wine that three people describe the same way get such different scores?

Because scores are inherently flawed, depending as they do on the subjective judgment of the people giving the scores. If I believed scores and I saw the 80 or the 83, I’d never try the Chateau Bonnet blanc, even if I liked melon flavors or minerals and citrus. Which is the opposite of what scores are supposed to do. And that they now do the opposite of what they’re supposed to do means it’s time — past time, in fact — to find a better way.

For more on wine scores:
Wine scores, and why they don’t work (still)
Wine competitions and wine scores
Great quotes in wine history: Humphrey Bogart

Can we use wine back labels to figure out wine quality?

winenews

Mark Thornton: “The words — and not what they mean — on wine back labels are a clue to wine quality.”

Because, finally, someone has discovered a way to measure the relationship between what’s written on wine back labels and the quality of the wine.

The breakthrough came from a Harvard Ph.D. student named Mark Thornton, who took data from 75,000 wines in the Wine.com inventory, and compared what was written on their back label — and not what the words meant — with ratings from the site’s users and from wine critics.

The findings? That certain words appear on the back labels of wines of lesser quality, while others appear on the back labels of wines with higher ratings. Thornton told me he knows this isn’t perfect, given how scores and wine ratings work, and he wants to improve that part of the study. In addition, he wants to refine the way his software decides which words to analyze, perhaps eliminating regions and better understanding phrases, like “grilled meats” instead of “grilled” and “meats.”
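In spirit, this kind of word-versus-rating comparison is simple to sketch. The snippet below is a minimal illustration with invented back-label text and scores, not Thornton’s actual method or data: for each word that appears on a label, it averages the ratings of the wines that used it.

```python
from collections import defaultdict

# Invented (label text, rating) pairs, purely for illustration.
wines = [
    ("crisp grapefruit and herb notes, pairs with pasta", 82),
    ("handcrafted with powerful black fruit", 91),
    ("soft, great value red for pasta night", 80),
    ("minerals, citrus and lime zest", 90),
]

totals = defaultdict(lambda: [0, 0])  # word -> [sum of ratings, count]
for label, rating in wines:
    # Count each word once per label, ignoring commas.
    for word in set(label.replace(",", " ").split()):
        totals[word][0] += rating
        totals[word][1] += 1

# Average rating of the wines whose labels contain each word.
avg = {word: total / count for word, (total, count) in totals.items()}
print(avg["pasta"], avg["handcrafted"])
```

With these made-up numbers, “pasta” lands on lower-rated labels and “handcrafted” on higher-rated ones, the same pattern the study reports. A real version would also need the phrase handling and region filtering Thornton describes.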

What does matter is that Thornton’s work is apparently the first time anyone has done this kind of research, making it as revolutionary as it is helpful in deciphering the grocery store wall of wine.

Thornton, whose parents teach at Cal State Fresno, says this study interested him because it’s about wine, which he likes, and because it ties into his Ph.D. research, which deals with how we describe things. One of the concepts this study takes into account is called “naive realism,” in which we assume that what we sense must be true for everyone, when it obviously isn’t. Which dovetails neatly with wine.

Thornton’s findings confirm many of my suspicions about wine back labels, as well as how critics use descriptors. The word clouds on his site summarize the results; I’ve set them up so you can see them more easily here for the consumer ratings and here for the critic ratings.

These are among the highlights of the study:

• Restaurant food pairings or terms like pasta appear on the labels of the lowest-rated wines. Thornton says this may well be because the wine doesn’t have any wine-like qualities to recommend it.

• Words used to describe sauvignon blanc — grapefruit, herb, clean — show up on the critics’ lowest-rated white wines. This is not surprising, given that sauvignon blanc has always garnered less respect from the Winestream Media than chardonnay.

• A location on the back label seems to indicate lower quality white wine; “handcrafted” is in the higher quality word cloud. For reds, “value” and “soft” are poor-quality words, while “powerful” and “black,” probably used to describe black fruit, imply higher quality. Handcrafted is especially interesting, since it doesn’t mean anything in terms of wine production.

Finally, a word about prices, which is also part of the study. Thornton divided the consumer ratings into five price ranges, and there was little difference in perceived quality between the first three ranges. In other words, you get more value buying the cheapest wine. Shocking news, yes?
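That price-range comparison can be illustrated the same way. This is a rough sketch with invented prices and ratings, not the study’s data: bucket the wines into price ranges, then average the consumer ratings in each bucket.

```python
# Invented (price, consumer rating) pairs, purely for illustration.
wines = [(8, 86), (12, 87), (18, 86), (35, 88), (70, 92), (15, 85), (9, 87)]

# Five hypothetical price ranges (lower bound inclusive, upper exclusive).
ranges = [(0, 10), (10, 20), (20, 40), (40, 80), (80, 1000)]
buckets = {r: [] for r in ranges}
for price, rating in wines:
    for lo, hi in ranges:
        if lo <= price < hi:
            buckets[(lo, hi)].append(rating)
            break

# Average rating per price range (skipping empty buckets).
averages = {r: sum(v) / len(v) for r, v in buckets.items() if v}
for (lo, hi), avg in sorted(averages.items()):
    print(f"${lo}-${hi}: {avg:.1f}")
```

In this made-up data, the three cheapest ranges average within a point and a half of each other, which is the pattern behind the “buy the cheapest wine” conclusion.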

The critic price-value rankings were even more bizarre. The worst value came from wines that got scores in the mid-90s, while wines in the high 90s (and even 100) were less expensive, and the best value wines were around 90 points. Thornton says he isn’t quite sure why this is true, though it may have something to do with critic bias. My explanation is simpler: Wine scores are inherently flawed.
