Wine Ratings vs Lucky Rock
Full disclosure: While this blog draws upon certain research, in the end the content represents the subjective opinion of the author, and should be taken as such...
At Lucky Rock, we have had a…let’s say complicated relationship with wine ratings. Like an attractive but out-of-our-league woman (or man, or gender non-binary person), we have sought out critical reviews of our wines and have succeeded many times, and have also been spurned a few times. We’ve briefly mentioned our scores before, but it was a recent review we received from a fairly acclaimed wine blog (name redacted) that made us want to take a deeper dive into the murky world of wine reviews.
What happened was we received our first-ever B- (82 pts) rating, for our 2018 County Cuvee Pinot Noir. Now, B- isn’t the worst possible score out there; in fact, we would have killed for a B- in high school. But it was the description that accompanied the grade that confused us. The reviewer described the wine as “Yuuge!”, quoting a certain ex-Commander in Chief, and said that our Pinot Noir would appeal to people who love Cabernet Sauvignon and Malbec. This contrasted with our own perception of the 2018 Lucky Rock Pinot, an opinion backed by multiple other wine reviewers (Wine.com) who described the wine as “medium-bodied” (certainly far, far from “Yuuge”). How could the perceptions of two or more “esteemed” wine publications be so different? Is there a right or wrong answer to wine evaluations? What is love? (Baby don’t hurt me.) And, most importantly, how can you, the consumer, make the best-educated decision about buying wine based on scores or reviews?
In the words of Eminem, “Let’s talk about it.”
While most of you are probably familiar with the standard rubrics of points, stars, grades, smiley faces, pigs, and other symbols used to rate wine quality, it is tough to pin down exactly what they mean. Originally established by Robert Parker Jr. and his publication “The Wine Advocate,” scores took on a life of their own, and now every publication, blog, or self-important “wine aficionado” online uses some kind of scale to declare whether a certain wine is good or not. The Advocate’s 100-point system (which is basically just a 20-point system, since any wine under 80 points is considered “unremarkable” and therefore rarely purchased) is the “gold standard” and was tremendously influential in the 70s and 80s. But regardless of the influence, it was basically Mr. Parker’s opinion. It would have been simpler if the ratings were just phrased like this: “Robert Parker Jr., a guy who has tasted a lot of Bordeaux wines (his specialty), says this wine is really dang good compared to his previous experiences.” We certainly think it would be less pretentious. Instead, these pseudo-mathematical grades are used to define an incredibly subjective product like wine. Just another way for the wine industry to make a simple thing more complicated, and therefore more exclusive.
Wines can be tasted on their own or in comparison with others (commonly as part of a competition). Neither option is foolproof, and both are often skewed by the point of reference (not to mention by the reviewer’s habits, like how many cups of coffee they drank at breakfast; some reviewers were known to smoke a pack of cigarettes between tastings). If a reviewer tastes a wine on its own, they use their prior experiences with wines of the same type (same region, same grape, same year) as a benchmark and decide where the wine stands. If they are tasting a line-up, they inadvertently use the other wines in the line-up as the comparison. In these cases, we typically find that the heavier, more robust, tannic wines overwhelm the lighter, more ethereal wines, which is probably not the most realistic evaluation.
Are wine scores reliable?
Essentially, the answer is the same one your college philosophy professor might have given: “it depends.” Every critic, rater, reviewer, and so on is just a person with a mouth and an opinion. This person has preferences, interests, and their own motivations. One critic might only review low-alcohol wines; put a higher-alcohol wine in front of them, and they might balk and say the wine is total garbage. Another might think wines lacking the richer body that alcohol can give are simple and unremarkable. Who is right?
In a way… We are not conspiracy theorists, but there can undoubtedly be some foul play. Most winemakers believe there is not a strong connection between a wine’s reviews and its actual quality. There is, however, a direct correlation between a wine’s reviews and its price, or the relationship between the winery, winemaker, or owner and the publication. So some reviewers may have ulterior motives for giving one wine a high score while marking another down. A simple example: Winery X releases a wine, and an important critic gives it an extremely high score. Less important critics see this review and decide they need to get in on it, so they all mirror the famous critic’s score, and the wine becomes super popular and sells out. Winery X then takes the money from those sales and puts it toward advertising with the critics, while also reducing the quality of the wine (using oak chips instead of barrels, or subpar grapes, etc., to cut corners). Now the critics are even more likely to give Winery X’s next wine high scores, even if it isn’t as good as the previous one. In this example, the critics’ self-interest and the winery’s clout create a cycle of potentially misleading reviews. This might sound cynical, but it happens more than you think.
It is true that point scores are mainly aimed at beginner consumers. If a person is new to wine, they lack the point of reference to judge quality for themselves and will look to other sources for validation on a purchase; after all, we are human. Wine costs money, and because there are so many variables involved (grape, vintage, region, winemaking, etc.), there is a legitimate worry about buying “bad” wine (or FOMO on a good one). Scores help calm the new consumer down, basically saying, “Hey, this one guy who knows a lot more than you says this wine is really good, so at least it can’t possibly be bad.” The best way to avoid the pitfalls of wine scores is to be adventurous. Drink wine with ratings, drink wine without ratings, and make up your own mind about the wines you like and dislike; practice makes perfect. You will find that some critics and wine blogs have taste preferences similar to yours. Since they taste quite a lot of wine, you can start trusting their reviews more than you would other critics’.
We believe that wine is more pleasant when enjoyed without pretension. Having fun with wine, while still making high-quality wines with intent, is our motto, and we certainly won’t let the critics or the media take that away. Lucky Rock will be loved by some and disliked by others, and we are totally okay with that. All we are saying is that someone in a twill jacket who “stans” (is a huge fan of) $100 French wines, or a New York City hipster who rolls up his beanie and skinny jeans and drinks only natural wines that taste like mouse droppings and feet (check out our YouTube video to see what that’s like!), shouldn’t make the decision for you. We would love to have you taste our wine and let us know what you think.