Jimmy Watson Winner: Casella Premium Cab/Sauv 2003!!!
Posted: Fri Aug 13, 2004 8:39 am
Oh no!!
TORB wrote: JWT ~~~~~ yawn! A great marketing opportunity, but once again let me ask: who in their right mind would give a trophy for a building that is only half finished? Or can you imagine the Archibald giving a trophy to the best unfinished portrait, with half the subject missing? No! Yet the wine industry continues to do it every year.
Perhaps Thorpie can get a gold medal for being first at the 800 metre mark in the 1500?
Watch that analogy. Perkins was accorded the 800m world record on his way to winning a 1500m race a couple of years back.
Sean wrote: TORB wrote: ... wine judging is subjective.
Apparently show judges have to taste wines blind so that there is some objectivity in the process. The glasses are lined up on white-clothed tables with just a number identifying them, and the wines are presented in flights based on variety or wine style. The judges taste the wines, give them a score and then the results are compared. The French Brunet system is usually used, which is a twenty-point system: basically 3 points for condition and colour, 7 points for nose and 10 points for palate. Medals are awarded to the better wines and, after a bit of consultation, the trophy for best wine is given. I assume this is how the JWT is done: the best wines are lined up, fresh samples are called for, and there is a "taste-off" by the judges to decide a winner.

That's really the only time I can think of when a scoring system might be required, but even then it is far from perfect. Wine judges have enough experience to recognize some wines, and they probably have preferences for certain kinds of wines anyway. This came up at some of the Shiraz Challenges, where it wasn't all that difficult to see the link between SA winemakers sitting as judges and SA wines generally doing better than the rest. Up until 2002 there were public tastings of the top 60 shiraz held at various wineries around Nagambie Lakes. There was always a bias toward the SA wines, and by the end of the day's tastings I felt there was a sameness to the wine style as well. In 2002 the format for the public tasting was changed to the Wine Australia event in Sydney. I always wondered just how representative it really was, and whether the judges were out of step not so much with what shiraz drinkers liked as with all the variation in style there is now.

In his Odyssey book Halliday gives an insight into how the judges see this. On the first day of the 2002 event Halliday, John Duval and Stephen Henschke (Trevor Mast had to drop out because the sale of Mount Langi had to be finalised) had to taste some 217 shiraz and decide on the top 60, which would be tasted again the next day: "We are greatly relieved at the end of the tasting of the 59 wines sifted out from yesterday, with 10 receiving gold medal points, 29 silver and 11 high bronze medals. There are an additional 88 bronze medal wines selected from the first day's judging. The Visy Board Trophy for the Best Wine goes to 1999 Tatachilla Foundation Shiraz, with 1999 Ingoldby Reserve Shiraz coming in second. Third place, and the Hahn's Haulage Trophy for Best Shiraz under $25, goes to 1999 Montrose Mudgee Black Shiraz. In addition to these, 1997 Wynns Coonawarra Michael, 2001 Ballast Stone Estate, 1999 Fox Creek Reserve, 1999 Fullers Barn, 2001 Western Range, 2000 Hanging Rock Heathcote and 2000 Punter's Corner Spartacus all win golds. The public choice, made before our tasting, was for the Punter's Corner Spartacus and for once there was agreement between the judges and the public."

I have since wondered whether the so-called objectivity of wine scores is there because wine judges don't want to be embarrassed by what wine wins, or whether it just highlights the rorts in the wine show system.
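To make the arithmetic of that twenty-point system concrete, here is a minimal sketch. The 3/7/10 split is as described above; the medal cut-offs (bronze 15.5, silver 17.0, gold 18.5) are the values typically quoted for Australian shows and are my assumption, not something stated in this thread, and the function names are just for illustration.

```python
# Sketch of the 20-point show scoring discussed above.
# Component caps (3 / 7 / 10) are from the post; medal cut-offs are assumed
# typical Australian show values, not taken from the thread.

def score_wine(condition_colour: float, nose: float, palate: float) -> float:
    """Sum the three judged components into a total out of 20."""
    assert 0 <= condition_colour <= 3, "condition and colour capped at 3"
    assert 0 <= nose <= 7, "nose capped at 7"
    assert 0 <= palate <= 10, "palate capped at 10"
    return condition_colour + nose + palate


def medal(total: float) -> str:
    """Map a 20-point total to a medal band (assumed thresholds)."""
    if total >= 18.5:
        return "gold"
    if total >= 17.0:
        return "silver"
    if total >= 15.5:
        return "bronze"
    return "no medal"


if __name__ == "__main__":
    total = score_wine(condition_colour=2.5, nose=6.0, palate=9.0)
    print(total, medal(total))  # 17.5 silver
```

On this reading, a trophy "taste-off" only needs the totals to shortlist the top wines; the final choice is still the judges comparing glasses side by side.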
You could argue, I suppose, that shows like this aren't the same as the capital city shows where professional wine judges are used, not just winemakers. But they're a small cognoscenti made up of a few genuinely trained wine judges plus a mix of wine writers, wine auctioneers and winemakers who all regularly do the circuit at these wine shows. The idea behind the wine show system is that it promotes quality in the tradition of the old agricultural shows. Maybe it still does that, but all too often it is much more a marketing exercise for wine companies, and many well-crafted wines made by small wineries don't enter because of that.

When you think about it, it makes sense that the big wine companies invest so heavily in the wine show system. A row of medals on a bottle means a lot more sales in the supermarkets and big chain stores where their wines dominate and most of their wines are sold. The most blatant example of this was the Rosemount Traditional that won the Jimmy Watson Trophy in 2002. Unlike all the other trophies given out at the Melbourne Show, this one is awarded to a one-year-old red still in the barrel. I still haven't had one bottle that tastes anything like the rapturous reviews it was given. It is a big-volume wine and bottle variation is a factor, I think. Also, and not for the first time, I suspect the best bottles went to the critics and industry types, and for the rest of us it is the luck of the draw.

The problem with a wine show (or any big tasting) is that certain characteristics will stick out: over-oaking, over-the-top tannins, high alcohol levels, over-extracted flavours. These things can be regarded as faults or as positives depending on what the judges like. In recent times the big show-style SA shiraz have been very successful at wine shows, but for the average punter who might want to drink them in the next year or two they are often undrinkable. As others have said before, blind tasting of large numbers of wines at wine shows hasn't been all that successful. It's the reason why a lot of wine is so high in oak and alcohol and why show-style wines often lack drinkability. Just as bad has been the pendulum-like reaction by wine judges against these wines: in the last year or two a raft of cool-climate shiraz and shiraz blends are in favour really just because they are different. With this emphasis on wine style and all the copycats it encourages, all a wine needs to do is hit the right notes with the judges.

The trick with all this is that wineries know the wine judges and what is fashionable, and send in wines that will do well at these shows. A more underhanded method is that specially prepared wines are sent into wine shows while the wine that goes out into the bottleshops can be totally different. Responsibility for what happens at these wine shows is diluted, because the expectation and pressure on the judges is that they come to a consensus on which wines are awarded the top medals, and that the winner is not a matter of chance but one that has, in a sense, been chosen. The point is that the show system is a contrived way to judge a wine. Even more of a problem is all the marketing BS that happens after that.
This brings me to the power of wine critics and just how reliable their reviews are. Often these so-called reviews are little more than PR, written after the critics have been to a junket put on by the wine company or distributor, a release tasting or some industry function. Not only is there no blind tasting, but all sorts of intangibles come into play no matter how objective or independent the critics make out they are. A big factor is "don't bite the hand that feeds you", and often I think wine critics won't say anything unless it is something positive. A lot of wine is reviewed at home as well, but in that case it is a sample bottle (or two) sent to the critic, and then it's just a matter of whether or not a review of the wine is published. It is a dubious arrangement no matter how you look at it. Even more dubious are the wine magazine tastings, where the same wines turn up year after year and somehow, by coincidence, advertising space is given to the wines that have been favourably judged as well.

We don't drink wine the way it is judged, and maybe the opinion of a wine that we have drunk over the course of an evening is a much better one anyway. At a tasting, blind or not, you are at best making a quick assessment of a wine. (I don't know if wine critics use blind tasting to do their reviews, but my guess is generally they don't.) Knowing the label and some of the background actually helps that process. If you have a good tasting ability and the wine has had every opportunity to show its best, then really it's just a matter of honesty, isn't it? I think if I were going to recommend a wine to someone else, it should be a wine that I have drunk once or twice and that I have paid for myself, because we are more wary and demanding about things we pay for than things we get for free. And human nature being what it is, subjectivity (or prejudice) rather than objectivity can take over.

With wine critics the wine is probably known so that it can be compared with other vintages or similar wines, and often the review takes account of the wine's history and style. Is it as good as previous vintages? Has the style changed? How does it compare with other similar wines? These are probably much more useful questions than just asking if it is any good. But if they're not tasting blind, critics still have to juggle the problem of how they can be influenced: not just by the wine label and the circumstances around the tasting, but also by their own history and how accountable they will be. If a critic has praised a wine for some time or talked up a winery, you can see that a bad wine could be a problem for him; likewise a wine that has been given a high score and that, subsequently, even the critic realizes is not such a good wine. It is not just about skill and craft, but honesty as well. They have to block out everything else and just zero in on what's in the glass. Often you just get a lot of me-too reviews, especially when wines are released. Far too much emphasis is placed on their scores anyway, and critics are fully aware that those scores are going to be bandied about in advertisements. Unfortunately I wonder if a few critics are a bit self-serving and look at other scores and jump on the bandwagon, or just outscore everyone else to get attention.