
Score creep

Posted: Mon Feb 23, 2015 10:18 pm
by dingozegan
A fellow wine lover and I were talking recently about the wine scoring of critics. Neither of us really follows critics' wine scores much, but it's a topic that's often discussed among wine lovers. We talked about the apparent score creep of critics: the idea that the average of all the scores given by an individual critic has slowly crept upwards over the years (you might say that what was a Halliday 90 would now be a Halliday 95). I said that I would fully expect the scores of the vast majority of critics to have 'crept' in their publications, not just Halliday's. But you would never really know unless you actually looked at the scores in detail. Some analysis followed, and these are some of the results.

For those interested/so inclined there are notes on the data and analysis below.

Some might prefer a probability density function (basically, that classic 'bell curve' you get for all kinds of natural phenomena), but I prefer this variation:
scores1.gif


The above figure shows the cumulative percentage of all scores that are equal to or greater than a given score value. For example, of all the wine scores published in the Wine Companion in 2005, about 42% of those wines scored 90 or above, but only 18% scored 93 or above.
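
For anyone who wants to reproduce this sort of curve from their own set of scores, here's a rough Python sketch of the calculation (the function name and the example scores are made up for illustration; it's not the actual dataset or code used):

    # Rough sketch: for a list of published whole-number scores from one year,
    # work out the percentage of wines scoring at or above each value.
    from collections import Counter

    def cumulative_at_or_above(scores, lo=80, hi=100):
        counts = Counter(scores)
        total = len(scores)
        curve = {}
        running = 0
        for s in range(hi, lo - 1, -1):   # walk down from 100 to 80
            running += counts.get(s, 0)   # add wines scoring exactly s
            curve[s] = 100.0 * running / total
        return curve

    # Made-up scores, just to show the shape of the output
    example_scores = [88, 89, 90, 90, 91, 92, 93, 94, 95, 96]
    curve = cumulative_at_or_above(example_scores)
    for score in sorted(curve):
        print(f"{score}: {curve[score]:.0f}% scored {score} or above")

Plot the same calculation for two different years on one set of axes and you get a figure like the one above.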

There definitely appears to have been score creep between 2005 and 2009, for both The Wine Front and Wine Companion.

(There's a distinct kink in the curve on Wine Companion scores between about 91 and 94 for both 2005 and 2009. That's weird...)

But how do these publications compare to other critics?

Here's The Wine Advocate in 2006, Gary Vaynerchuk in 2009 (deserves a shout-out as a critic), and the fellow wine lover I was talking with about scoring :-).
scores2.gif


Where's the most score creep happening?
scores3.gif


The Wine Front scores have crept more than the Wine Companion scores. (And again, while there's a nice bell curve for The Wine Front, the Wine Companion is all weird around scores of 90 to 94.)

Notes: Scores sourced from the Wine Companion database and The Wine Front database. The analysis does not distinguish between individual critics, only the scores published. Only whole-number scores were used (not ranges), and scores recorded as "x+", where x is the score, were considered equivalent to a score of x. The Wine Advocate score range is 50 to 100, which is different from the 80 to 100 range of the others.
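
In code terms, the score-cleaning rule above amounts to something like this (a hypothetical helper written for illustration, not the actual script used):

    # Normalise published scores per the notes: "x+" is treated as x,
    # whole numbers are kept, and anything else (e.g. ranges) is dropped.
    def normalise_score(raw):
        raw = raw.strip()
        if raw.endswith('+'):
            raw = raw[:-1]
        return int(raw) if raw.isdigit() else None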

Re: Score creep

Posted: Mon Feb 23, 2015 10:36 pm
by TiggerK
Hmm, that's interesting indeed.... thanks for all the effort you've put into that dingozegan (and shrewdwinelover ;-)

Follow up comments are going to be fascinating...


Re: Score creep

Posted: Mon Feb 23, 2015 10:53 pm
by Diddy
Would be really interesting to see some more recent datasets... I've been into wine in a serious sense for about 3 years and the creep during that time seems substantial.

That being said, you can't discount the significance of vintage variation; was 2009 a better vintage on average than 2005 hence the creep?

To make it really scientific, you'd probably need to look at vintages rated comparably by the critic.

Re: Score creep

Posted: Mon Feb 23, 2015 11:16 pm
by dave vino
Diddy wrote:Would be really interesting to see some more recent datasets... I've been into wine in a serious sense for about 3 years and the creep during that time seems substantial.

That being said, you can't discount the significance of vintage variation; was 2009 a better vintage on average than 2005 hence the creep?

To make it really scientific, you'd probably need to look at vintages rated comparably by the critic.


You could probably dib, dib, dib these into a spreadsheet to get a basic average for the 2 years?

https://www.langtons.com.au/Content/pdf ... eChart.pdf

I also reckon if you did the last 3 years the score creep on Halliday would be even more pronounced, 2011 notwithstanding.

Re: Score creep

Posted: Mon Feb 23, 2015 11:50 pm
by Polymer
Well, 2005 and 2009 would have represented the published dates not the vintages. So 2005 would represent mostly 2003/2004 with some 2002. 2009 would be mostly 2007/2008 with some 2006.

Interesting analysis and pretty much goes with what people thought they were seeing...

Hard to say if score creep is intentional or not but it is a perfectly natural reaction to scoring higher...You get rewarded for scoring higher..places quote you, put you on shelf talkers..your brand grows....you get sent more wines to review..you get asked to go here or there..whatever....Scoring lower gets you nowhere...

Re: Score creep

Posted: Tue Feb 24, 2015 12:07 am
by Diddy
Polymer wrote:Well, 2005 and 2009 would have represented the published dates not the vintages. So 2005 would represent mostly 2003/2004 with some 2002. 2009 would be mostly 2007/2008 with some 2006.

Interesting analysis and pretty much goes with what people thought they were seeing...

Hard to say if score creep is intentional or not but it is a perfectly natural reaction to scoring higher...You get rewarded for scoring higher..places quote you, put you on shelf talkers..your brand grows....you get sent more wines to review..you get asked to go here or there..whatever....Scoring lower gets you nowhere...


Fair call on the vintage vs published date. The more I think about it, that would be a great way to prove/disprove the 'score creep' hypothesis - compare similar vintage wines across a number of publication years.

Re: Score creep

Posted: Tue Feb 24, 2015 9:01 am
by maybs
Nice one. Good to see we aren't all going crazy :)

Re: Score creep

Posted: Tue Feb 24, 2015 12:28 pm
by pstarr
When did Gary Walsh shut down Winorama and move across to Winefront? Are you comparing Campbell Mattinson's scoring in the 2005 sample to a 2009 sample with different reviewers?

Re: Score creep

Posted: Tue Feb 24, 2015 8:19 pm
by TiggerK
pstarr wrote:When did Gary Walsh shut down Winorama and move across to Winefront? Are you comparing Campbell Mattinson's scoring in the 2005 sample to a 2009 sample with different reviewers?


Yeah, that's probably a factor of course; maybe it just reflects CM's slightly lower scores overall. Perhaps he shows less of a regular tendency for a big pointy hypefest, but every reviewer is guilty of it; it's what makes/keeps them known.

http://www.winefront.com.au/winorama-the-wine-front-announcement/#.VOxBUC7p9AM

Re: Score creep

Posted: Wed Feb 25, 2015 5:15 am
by rossmckay
I agree that scores appear higher in Australia than in other places and that they seem to be creeping higher. I'm happily in 86-89 land when drinking Bordeaux and Burgundy and love those 90-93 splurges when the opportunity arises. Italian and Spanish wines seem to offer higher scores per dollar, so I regularly drink 90+ wines from there.

However, in this instance are there other variables at play here? Hanging around economists too long (damn their logical brains) makes me want to investigate further.

The quality of vintages over the sample time has been raised.

Another might be the number of reviews that are published versus not published. For example, if you taste 1000 wines and only publish the 50 that score 95+, then all your published scores would be 95+, even though 95% of the wines tasted scored 94 or below.
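
To put some toy numbers on that (completely made up, just to show the selection effect in Python):

    # Toy illustration of the publish-only-the-best effect (made-up numbers).
    import random

    random.seed(1)
    tasted = [random.randint(85, 97) for _ in range(1000)]   # 1000 wines tasted
    published = [s for s in tasted if s >= 95]                # only the 95+ wines get published

    print(len(published), "published out of", len(tasted), "tasted")
    print("95+ share among published scores:", sum(s >= 95 for s in published) / len(published))
    print("95+ share among all wines tasted:", round(sum(s >= 95 for s in tasted) / len(tasted), 3))

The published distribution looks stellar even though most of what was tasted sat well below 95.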

Has a similar comparison been made about wine show results?

The only way to truly know would be for all wines made to be scored by the same source over a sufficiently long period of time to remove any bias.

A great piece of work btw.

Re: Score creep

Posted: Wed Feb 25, 2015 11:17 am
by Duncan Disorderly
Interesting graphs, although one imagines that comparing two years worth of data does not a trendline make, particularly with the vintage on vintage variability of wine.

In regard to your last graph: noting the evenness of The Wine Front's curve, would it be fair to say that they simply reviewed more wines in 2009 than 2005, given that the high point of the graph is at 90, which is not a particularly high score by Australian standards? Moreover, by that metric the real bracket creep is in the Wine Companion, as their curve lacks evenness and shows greater percentage differences at the higher scores.

Re: Score creep

Posted: Wed Feb 25, 2015 2:14 pm
by Polymer
As far as comparing the two points in time...It is possible it wouldn't be representative...but it definitely makes it look like that is the case....People keep talking about vintage variation but the comparison is between the 2005 and 2009 publications...So we'd cover probably 2002/2003/2004 for 2005...and 2006/2007/2008 for 2009. If you look at Australia as a whole, would you say there is a huge difference between those groupings of years? I'd say probably not...

But also, I'm not sure we're reading that right because of how the bottom graph is...Maybe Dinzo can clarify...

If the 2005 publication is EVERYTHING published by them up to that point..and 2009 is EVERYTHING published by them at that point (which also includes all the data from 2005) then the data is even more damning and really, we're just taking the next added vintages (2005, 2006, 2007, 2008) into the mix...Which makes the bottom graph make more sense because if we were comparing only the published scores from the two years, we'd have both positive and negative % change...whereas only a positive % graph just shows how many more wines in each score have been added..I actually think this is probably the case and makes an even stronger argument IMO.

Re: Score creep

Posted: Thu Feb 26, 2015 3:09 pm
by Gary W
Problem with that is a few things:
a) we don't review nearly everything we taste - I tend to select the better wines - thus you will get a higher average. The Halliday website has many scores, brief reviews. I would not mind doing this, but it's a function of time v (fiscal) reward at present.
b) in 2005 they are all CM's reviews - none of mine - and back in the day he would do a broader mix of scores. I also bring in a lot of pretty high end imports...skews upwards.
c) now we have MB, who is eclectic in what he reviews, but much of his Australian wine work goes to other publications.
d) as an aside, I reckon CM is a bit more generous than me in the scoring department, MB more parsimonious. Sometimes when I go, I go, that being said, not really THAT many big things.

Here's a list of ALL the 97 point rated wines from V2004 - 2014 - Author initials on right. It would represent roughly 8 or 9 years of reviews, based on vintage. 45 wines over thousands and thousands of reviews. I did the percentages one time, it was very low.

    Wine name / Rated / Price / Drink / Author
    1. Bass Phillip Reserve Pinot Noir 2012 97 350 2017 - 2028+ GW
    2. Bindi Block 5 Pinot Noir 2008 97 92 2014 - 2022 CM
    3. Bond Estates ‘Vecina’ 2008 97 550 2018 - 2028+ GW
    4. Castagna Sparkling Syrah 2005 – 3rd Disgorgement 97 ? 2009 - 2021 CM
    5. Chateau Latour 2004 97 1280 2024 - 2044 GW
    6. Clonakilla Shiraz Viognier 2005 97 78 2011 - 2020 CM
    7. Clonakilla Shiraz Viognier 2007 97 80 2009 - 2025 CM
    8. Cullen Diana Madeline 2012 97 115 2015 - 2042 GW
    9. Cullen Kevin John Chardonnay 2006 97 70 2008 - 2016 GW
    10. Dalwhinnie Moonambel Shiraz 2007 97 60 2010 - 2020 CM
    11. Didier Dagueneau Les Jardins De Babylone 2005 97 340 2014 - 2029 MB
    12. Domaine de la Romanée-Conti La Tâche 2006 97+ 2200+ 2022 - 2040+ GW
    13. Domaine Raveneau Chablis les Clos 2006 97 2012 - 2030+ GW
    14. Giaconda Chardonnay 2010 97 ? 2015 - 2021+ CM
    15. Giaconda Warner’s Vineyard Shiraz 2004 97 70 2008 - 2018+ GW
    16. Grampians Winemakers Reserve Shiraz 2008 97 2012 - 2023 CM
    17. Harlan Estate Proprietary Red Blend 2007 97 850 2017 - 2037 GW
    18. Henschke Mount Edelstone Shiraz 2004 97 93 2013 - 2024 CM
    19. Hillcrest Vineyard Premium Cabernet Sauvignon 2005 97 50 2007 - 2025 GW
    20. Hoddles Creek Estate 1er Pinot Noir 2013 97 45 2016 - 2028+ GW
    21. Houghton Gladstones Cabernet Sauvignon 2010 97 79 2014 - 2030+ GW
    22. Houghton Jack Mann Cabernet 2004 97 105 2014 - 2030 GW
    23. Howard Park Abercrombie Cabernet Sauvignon 2012 97 113 2019 - 2039 CM
    24. Howard Park Cabernet Sauvignon Merlot 2004 97 75 2011 - 2024 GW
    25. Hunter Valley Semillon 2013 pre-release tasting May13 97 - MB
    26. Leeuwin Estate Art Series Chardonnay 2005 97 80 2008 - 2015+ GW
    27. Leeuwin Estate Art Series Chardonnay 2006 97 96 2012 - 2018 CM
    28. Marius Symphony Shiraz 2012 97 45 2016 - 2032+ GW
    29. Mayford Tempranillo 2013 97 36 2017 - 2026+ CM
    30. Olivier Bernstein Chambertin ‘Clos de Beze’ 2009 97 650 2021 - 2039+ GW
    31. Penfolds Bin 707 Cabernet Sauvignon 2005 97 175 2022 - 2040 CM
    32. Penfolds Bin 707 Cabernet Sauvignon 2006 97 175 2015 - 2030 CM
    33. Penfolds Grange Shiraz 2006 97 599 2020 - 2040 CM
    34. Penfolds RWT Shiraz 2005 97 160 2015 - 2025 CM
    35. Produttori del Barbaresco 2007 – Single Vineyard Wine Wrap 97 125 2012 - 2042+ MB
    36. Punch Lance’s Vineyard Close Planted Pinot Noir 2005 97 80 2007 - 2020 CM
    37. Pyramid Valley Vineyards ‘Field Of Fire’ Chardonnay 2009 97+ 95 2012 - 2022 MB
    38. Standish Wine Co The Standish Shiraz 2012 97 95 2014 - 2039 MB
    39. Stella Bella Serie Luminosa Cabernet Sauvignon 2011 97 75 2019 - 2030 CM
    40. The Wanderer Upper Yarra Pinot Noir 2008 97 50 2012 - 2019 CM
    41. Tyrrell’s 4 Acres Shiraz 2007 97 34.50/46 2017 - 2037+ GW
    42. Vietti Villero Barolo Riserva 2006 97+ 616 2016 - 2036+ GW
    43. Vincent Paris La Geynale Cornas 2010 97+ 130 2020 - 2040 GW
    44. Weingut A Christmann Idig GG Riesling 2013 97 120 - MB
    45. Wendouree Cabernet Malbec 2012 97 45 2018 - 2042 GW

http://www.winefront.com.au/search/?l=0 ... earch&c=50

Not a bad list of wines, really.

PS. If anything, as a reaction to all this stuff, I reckon both MB and myself are marking HARDER, across the board.

Cheers
GW

Re: Score creep

Posted: Thu Feb 26, 2015 3:14 pm
by Gary W
Duncan Disorderly wrote:Interesting graphs, although one imagines that comparing two years worth of data does not a trendline make, particularly with the vintage on vintage variability of wine.

In regard to your last graph: noting the evenness of The Wine Front's curve, would it be fair to say that they simply reviewed more wines in 2009 than 2005, given that the high point of the graph is at 90, which is not a particularly high score by Australian standards? Moreover, by that metric the real bracket creep is in the Wine Companion, as their curve lacks evenness and shows greater percentage differences at the higher scores.


That is precisely right, to my interpretation. And yes, confirmed, the number of reviews on WF has jumped dramatically. 859 reviews Dec14 alone, for example. 1,147 Jan14 alone. Average is around 300 a month now. Back in 2009, around 150 per month. Before, that less than 100 quite often.

Re: Score creep

Posted: Thu Feb 26, 2015 3:31 pm
by Gary W
And while I'm here, one last thing.

Our 'gold medal' score has stayed at 94 points, which was modeled on the 'gold standard', as it were, of James Halliday.
In the last couple of books, specifically the last one, he notes on scoring that, to move in line with Aus wine shows, gold is now 95, not 94. Thus all the scores get shuffled up 1-2 points to meet this new scale. Talk about the tail wagging the dog here. The shows SHOULD have just adopted JH's old scale. I sat on a panel with him discussing this up in the Hunter. But JH, being an affable and kind man, adjusted up to match the shows. Anyway, there you go. There's a jump in trend of 1-2 points directly over one year. You can reference this in his current book, in the pages preceding the reviews.

Also, on WF, someone (not me) did an analysis of score creep on WC over the last few books - in the brackets. I won't publish it here out of courtesy, because I got into trouble for it last time...

Re: Score creep

Posted: Thu Feb 26, 2015 3:41 pm
by pstarr
Good stuff Gary.

Re: Score creep

Posted: Thu Feb 26, 2015 3:41 pm
by Gary W
Polymer wrote:
You get rewarded for scoring higher..places quote you, put you on shelf talkers..your brand grows....you get sent more wines to review..you get asked to go here or there..whatever....Scoring lower gets you nowhere...


First part, yes. None of us at WF care about that. Long since given up. We are regularly replaced by Tyson and WC in the quoting department, aside from when we are close (or higher), but often we are quoted because we are most timely with the reviews. Also, I'm acutely conscious of the image of a consumer having, say $50, and trying to work out where to spend it. If they are paying for (or just going on) my review/opinion, I really want them to LOVE that wine. Our currency is joy and satisfaction, not cursing and someone saying 'this is shit and over-rated'. i.e. we do not write for the wineries. Also, I'm dropped from heaps of wineries because they don't like me, or I've given them a bit of a touching up...

Second - we hardly ever go to 'lunches' and junkets. Too busy working, and they don't pay money - and it takes up time. I can't stand long hosted wine lunches. I have always (as all of us do) turned down hosted offers to the Australian Open, Bledisloe (in NZ), Test Cricket (Adelaide), helicopters, private jets - you fkn name it! etc. May work for others, but I think it's unrelated and compromising. Rare to see the WF boys at any events; in fact, we are the lost boys... Do go to trade tastings and sit down and review 20, 30, 40 wines in a sitting, maybe eat a few crackers, then leave.

Just setting a few things out. Hope it's seen as enlightening rather than defensive/aggressive.

Re: Score creep

Posted: Thu Feb 26, 2015 3:49 pm
by Polymer
Gary - So what is your general opinion about score creep? Do you think it is happening? And if so, what do you attribute it to? Do you feel Australian critics, on average, score higher than their counterparts in other areas? I think I remember reading something about the scores being meant to be viewed in relation to other Australian wine critics and not from a global perspective.

Definitely appreciate the responses..not the least bit defensive/aggressive...

Re: Score creep

Posted: Thu Feb 26, 2015 3:55 pm
by Gary W
It's happening. Locally, for sure. Globally, also.
Why? Counter-intuitively (at least to me), it seems to INCREASE popularity with consumers. I think the 'general public' are not as finely tuned, or as interested, as the small percentage of people who subscribe to review sites/visit forums etc. But we do what we do, and I'd like to keep it a bit more tame. 95 points to me means 'blow your socks off', not just 'a very good wine'.
Australian writers score too highly, yes. But then, Parker gave a lot of this really gross, oaky, acid adjusted SA shiraz 98/99/100 points, way back when :) Also, LBP for the Wine Advocate is not exactly shy on the scoring front...
Jancis just seems to give everything 16 :)

Re: Score creep

Posted: Thu Feb 26, 2015 5:17 pm
by Polymer
Parker did score some wines highly..but not every wine...and while I certainly don't agree with his scores, he made it very clear what type of wine he liked and scored them accordingly, and I feel he is relatively consistent in what he's looking for...

One other question...Do you feel any of the critics compromise their journalistic integrity and release some higher scores for the extra press? You've mentioned that isn't important to Winefront, but do you feel any of the other critics do that? You don't have to name them, just as a general question...I'm not saying they accept gifts or incentives from wineries to give better scores...It is more: do you think they sometimes get a bit loose on the scoring because they know it'll garner them extra shelf talkers, quotes, etc.? I guess this is the same thing I've said in a statement that you said yes to, but I guess I'm just flat out asking the question :).

Re: Score creep

Posted: Thu Feb 26, 2015 5:21 pm
by Gary W
Yes. I think it's apparent around the time of Penfolds Bin releases...leading with Grange.

Re: Score creep

Posted: Thu Feb 26, 2015 10:36 pm
by dingozegan
Appreciate the feedback, thanks, and glad to see the discussion.

Some good points raised about the analysis too. It was just a bit of fun - not a scientifically rigorous study - so there are obviously plenty of holes in it. For a start, there's the issue that the numbers of wines reviewed (and published) per year are quite different, which brings the whole statistical significance of the analysis into question.

The comparison of 2005 and 2010 was chosen for the 5 years separation and the (larger) quantity of data available for each year. Dates quoted in the figures are tasting dates, not vintage dates, and multiple taster scores are included (for The Wine Front).

The figure only suggests score creep between 2005 and 2010 - and, I agree, that's not enough to confirm a 'creep' trend. I'm not actually so sure the trend would hold for other tasting/publication years. It's potentially confounded by the fact that there's no account made for wines being of different vintages (as others have noted) - and the average vintage between different critics might be different for the same publication/tasting year (i.e., most of the wines scored in 2009 by critic A could be 2007 wines while most of those scored by critic B might be 2008).

Comparing similar vintages across different tasting dates would be a good way to go. Originally I thought comparing one multiple-year range to another multiple-year range (like 2000-2005 versus 2006-2010) would be a good way to go (especially if the number of wines reviewed in each of those two datasets could be similar), but the available databases were too small for that... so it would be back to data collection/collation... given some time and being so inclined, it might be interesting to look at other variables too...
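
For what it's worth, the matched-vintage idea could be roughed out something like this (a Python sketch only; the record fields 'vintage', 'tasted' and 'score' are assumptions about how the data would be collated, not the actual database structure):

    # Rough sketch of the matched-vintage comparison: for wines of the same
    # vintage, compare the average score they received in two tasting years.
    # Records are assumed to look like {'vintage': 2004, 'tasted': 2005, 'score': 92}.
    from statistics import mean

    def mean_score_by_vintage(records, tasting_year):
        by_vintage = {}
        for r in records:
            if r['tasted'] == tasting_year:
                by_vintage.setdefault(r['vintage'], []).append(r['score'])
        return {v: mean(scores) for v, scores in by_vintage.items()}

    def creep_between(records, year_a, year_b):
        a = mean_score_by_vintage(records, year_a)
        b = mean_score_by_vintage(records, year_b)
        # only vintages reviewed in both years, so vintage quality is held roughly constant
        return {v: b[v] - a[v] for v in a.keys() & b.keys()}

A consistently positive difference across the shared vintages would point to creep that isn't just vintage variation.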

Just to answer some questions:
pstarr wrote:When did Gary Walsh shut down Winorama and move across to Winefront? Are you comparing Campbell Mattinson's scoring in the 2005 sample to a 2009 sample with different reviewers?

The Wine Front scores are for all reviewers (perhaps not stated clearly enough in small text in original post).
Duncan Disorderly wrote:Noting the evenness of The Wine Front's curve would it be fair to say that they simply reviewed more wines in 2009 than 2005, noting that the high point of the graph is at 90 which is not a particularly high score

In the database used it's the other way around: there are about 3300 scores for 2005 and about 2200 for 2009. The shape of the curve is another story. The fact that the Halliday scores aren't normally distributed (especially around scores of 91 to 93) is, IMO... strange ("manipulated"?), irrespective of the issues raised about the analysis. The distribution's like that for all scores from 2005-2010, BTW.
rossmckay wrote:The quality of vintages over the sample time has been raised. Another might be the number of reviews that are published versus not published?

Sure, but then, it is about score creep in published scores anyway.
rossmckay wrote:Has a similar comparison been made about wine show results?

That would be interesting. I've seen it for some small-scale show results (mainly in the USA) but I've never seen any similar analysis done on the scores of critics. If anyone here has, I'd be interested to hear.
Polymer wrote:If the 2005 publication is EVERYTHING published by them up to that point..and 2009 is EVERYTHING published by them at that point (which also includes all the data from 2005) then the data is even more damning and really

The figures are for scores from wines tasted in 2005 only, and scores from wines tasted in 2009 only. Perhaps not a very fair comparison, as previously mentioned (need more data).

Re: Score creep

Posted: Fri Feb 27, 2015 10:27 am
by pstarr
"3300 scores for 2005 and about 2200 for 2009".

This seems strange. I'd have expected more wines tasted in 2009 than 2005.

Re: Score creep

Posted: Fri Feb 27, 2015 11:21 am
by phillisc
Gary, thanks so much for your input and time here...nice to see people who do this for a living post their honest reflections.
I am of the firm belief that more wineries in this country need a touch up, a reality check, call it what you will.

I don't taste/purchase enough wines a year to be excessively worried by points, score creep, who does what with what scale etc. etc.

As a small time punter what does worry me is how certain wineries 'excessively' trade on a GW, CM or particularly, a JH score.
If it comes in at 97 then it should automatically be a $100+ wine, or the tiresome quote is wheeled out time and time again...'this should be 3 times the money'.
Equally if a wine collects a fist full of trophies at a show...it will often be marketed as exceptional and therefore "worth" lots and lots of money.
Funny, at next show same wine 'might' get a gold only.

I guess from my point of view, not much can be done about this other than: one, don't buy said wine and the winery might get the message; or two, buy wines where writers such as yourself have reviewed them with an overall balanced perspective, so that wineries do not end up being continually seduced by high scores and thus increasing their prices by ridiculous amounts each year.

It would be great if there was a position where a writer has reviewed a wine, the winery takes this in good faith, but it is just one opinion...no more no less.
Great when wines 'can quietly speak for themselves'. They might shout loudly one day and be mute the next...just like a full or empty trophy cabinet.

Anyway, I probably have not made much sense here, but that's my 2c.

Cheers
Craig

Re: Score creep

Posted: Fri Feb 27, 2015 12:14 pm
by Gary W
pstarr wrote:"3300 scores for 2005 and about 2200 for 2009".

This seems strange. I'd have expected more wines tasted in 2009 than 2005.


All the ones marked as 2005 are actually 2002, 2003, 2004, 2005, where we loaded all the stuff done in the old 'WineFront Monthly' PDF to the site (well, I did), so that is actually 4 years of reviews. About 98% of it is CM's work.

As a result, 2009 is lower as it represents exactly one year. This is only CM and GW, no MB.

Re: Score creep

Posted: Fri Feb 27, 2015 12:15 pm
by Gary W
phillisc wrote:Gary, thanks so much for your input and time here...nice to see people who do this for a living post their honest reflections.
I am of the firm belief that more wineries in this country need a touch up, a reality check, call it what you will.




Cheers
Craig


Most of the wineries take it as 'a review' and move on. The bright ones do anyway. That being said, no one really enjoys being told their children are ugly...

Re: Score creep

Posted: Fri Feb 27, 2015 5:01 pm
by kaos
I reckon we're all a bit like junkies - a fix now just doesn't get me off the way a fix did when I was twenty. If you give me a gold medal standard wine now (now that I have drunk loads of good stuff and my tastes have developed) then I would say good wine. If you had given me the same standard wine twenty years ago I would have been effusive in its praise and would probably have gone all hyperbolic. The result is that we all probably tend to see the same score given to the same standard of wine as marking too highly as time goes by, when in fact it may be the same and our perceptions are what have changed. On a different tack, it could be that wines have improved.....

Re: Score creep

Posted: Fri Feb 27, 2015 8:57 pm
by GraemeG
I thought about a new thread, but thought I'd post here anyway, since it's evolved into a bit of a critics discussion...
I have to say, I'm increasingly impressed by what seems - consistently - to me to be the 'take no prisoners' attitude of Jeremy Oliver. I long ago gave up any thought of subscribing to any 'website' of his, but his annual guides are - despite their far more limited scope - of far more value than JH's WC.
I'm old enough to remember JO going out on a limb and calling Henschke for brett when the 98 reds came out; and I've never had a top notch Edelstone or Cyril from that vintage; basically, he was right as far as I'm concerned. He trashed Yarra Yering's 2007 and 2009 releases (and having tasted a very flat 07 No 1, I'd back him here too), and, it would seem, has been dropped by them from supply of future tasting samples, at least to judge by their absence from his 2015 guide.

So here's a tip; Bannockburn will go soon too. He's been ripping their wines for a number of editions now; I haven't tasted them, but I do rather appreciate it when someone experienced is actually prepared to say "these aren't as good as they used to be/should be" or similar. Redman might be in a similar category. In the 2015 guide he's got stuck into Moss Wood (again), and even the sacred cow of Cullen; I get the impression that even Mount Mary is being re-assessed in the brutal light of recent developments. He's also been forthright about what he obviously saw as the decline in Howard Park; indeed, despite the 'wannabe' nature of his 'five-star wine' list, it's interesting to see the wines he's prepared to drop from it!

Look, I don't always agree with what he says, but it's good to see all his tasting experience being dealt 'both ways' so to speak, and that he doesn't just stay shtum when he thinks some winery has lost the plot.
I think the whole "we only publish notes of the best wines; the ones you should buy; don't worry about the wines we don't mention" attitude (rarely so crudely expressed though) is the ultimate cop-out.

Hell, if I can post even cursory notes of wines I think are crap on Cellar-Tracker on an amateur basis, the least professionals can do is record some kind of thought on all the wines they taste; purely for reader calibration purposes if nothing else.
cheers,
Graeme

Re: Score creep

Posted: Fri Feb 27, 2015 10:06 pm
by Polymer
kaos wrote:I reckon we're all a bit like junkies - a fix now just doesn't get me off the way a fix did when I was twenty. If you give me a gold medal standard wine now (now that I have drunk loads of good stuff and my tastes have developed) then I would say good wine. If you had given me the same standard wine twenty years ago I would have been effusive in its praise and would probably have gone all hyperbolic. The result is that we all probably tend to see the same score given to the same standard of wine as marking too highly as time goes by, when in fact it may be the same and our perceptions are what have changed. On a different tack, it could be that wines have improved.....


You think they've improved so much that we're all butting up against 100 points for every wine?

I have no doubt things have improved so the very low end is actually drinkable..and no doubt there are better ways to adjust and save a wine...and there are fewer flawed wines as well...but great wines are great wines..and decent wines are decent wines...Older great bottles are still great. I'm not a fan of points but just for argument's sake...a very high scoring wine made many years ago is still a very high scoring wine now...If the competition has improved as you say it might have, that shouldn't be the case..but it is..

If you look at how scoring used to be done..90 used to be a really good wine...wines in the high 80s were still very good...now, anything below 90 is considered complete plonk and in Australia, if it doesn't have at least a 93 or 94 from at least someone, it probably isn't any good at all..

But if we assume all wine has gotten better...would it be safe to say the high end is better then? So where do those wines have to go score wise? Are we now on a 105 point scale because all the wines are better? If a 98 point wine is the norm, what is a 100 point wine? Or if you don't feel the high end has gotten any better, when you try these wines that have "improved" and a really fantastic bottle of wine...you don't see any daylight between the two? Maybe for some it is a crack but I still see a valley...

I'm also not saying what you said can't be true...There is no doubt what we consider "good" changes...but I think a lot of wine geeks try a lot of stuff too and if you put everything in relation to other wines there should be a scale..that scale just seems off.....

The 100 point scale really went from a 30 point scale (70+) to a 20 point scale (80+) to a 10 point scale (90+) and it is nearly a 5 point scale in Australia (95+). I just think it has more to do with promotion and keeping wineries sending you wine (for smaller guys).

Haha..I'm not even sure why I care..for the most part when people talk scores all I hear is 'buzzz bzzzzzzz'...although to be fair, when I hear 98, 99, 100 points, do I take notice? Sure..even if I'm pretty sure that wine is nowhere near that good, I do pay attention and I give the person that scored it some notice (branding), and I'll want to try the wine because hey, someone who knows what they're doing liked it, A LOT. So the very thing that I dislike I'm supporting...and to be fair, I've yet to have a wine that was 98/99/100 by any critic that I thought wasn't a good wine. I might not have liked it as much as they did and might have been disappointed because expectations (or the price tag) were so high, but they've all been good wines..

Re: Score creep

Posted: Sat Feb 28, 2015 1:31 am
by Tom A
GraemeG wrote:I thought about a new thread, but thought I'd post here anyway, since it's evolved into a bit of a critics discussion...
I have to say, I'm increasingly impressed by what seems - consistently - to me to be the 'take no prisoners' attitude of Jeremy Oliver. I long ago gave up any thought of subscribing to any 'website' of his, but his annual guides are - despite their far more limited scope - of far more value than JH's WC.
I'm old enough to remember JO going out on a limb and calling Henschke for brett when the 98 reds came out; and I've never had a top notch Edelstone or Cyril from that vintage; basically, he was right as far as I'm concerned. He trashed Yarra Yering's 2007 and 2009 releases (and having tasted a very flat 07 No 1, I'd back him here too), and, it would seem, has been dropped by them from supply of future tasting samples, at least to judge by their absence from his 2015 guide.

So here's a tip; Bannockburn will go soon too. He's been ripping their wines for a number of editions now; I haven't tasted them, but I do rather appreciate it when someone experienced is actually prepared to say "these aren't as good as they used to be/should be" or similar. Redman might be in a similar category. In the 2015 guide he's got stuck into Moss Wood (again), and even the sacred cow of Cullen; I get the impression that even Mount Mary is being re-assessed in the brutal light of recent developments. He's also been forthright about what he obviously saw as the decline in Howard Park; indeed, despite the 'wannabe' nature of his 'five-star wine' list, it's interesting to see the wines he's prepared to drop from it!

Look, I don't always agree with what he says, but it's good to see all his tasting experience being dealt 'both ways' so to speak, and that he doesn't just stay shtum when he thinks some winery has lost the plot.
I think the whole "we only publish notes of the best wines; the ones you should buy; don't worry about the wines we don't mention" attitude (rarely so crudely expressed though) is the ultimate cop-out.

Hell, if I can post even cursory notes of wines I think are crap on Cellar-Tracker on an amateur basis, the least professionals can do is record some kind of thought on all the wines they taste; purely for reader calibration purposes if nothing else.
cheers,
Graeme


I agree Graeme. There is something I find very refreshing about JO. Thankfully I find my palate aligns best with his too. Maybe it is just the format of the book he uses, but I do appreciate the way wines get re-evaluated and marked accordingly. It is useful having back vintages easily accessible, even if just a score. Henschke for brett, Howard Park ex John Wade, Jasper Hill went from a number "1" wine down to like "3", Cape Mentelle from a "10" wine down and then back up again. It makes sense too. Winemakers change, wineries change, wines change. I think it has been mentioned before, but when Oliver has a high score wine it is generally a safer bet than most. Shame he seems absolutely hopeless when it comes to a usable website.

Winefront doing a few cellar visits too, like that. Useful for the stuff that heads there and needs reminding to get back out again. I have to admit the most useful thing about a high score for me is when it relates to a new wine/winery that I'm not familiar with and makes me want to try it. I'm always keen to try new stuff so at least the scores/notes give a bit of direction to wines I don't know.

Good work dingo, nice thread, thanks for the input too GW.
Cheers
TA