Golf Club Atlas

GolfClubAtlas.com => Golf Course Architecture => Topic started by: Don Mahaffey on December 02, 2017, 11:28:07 AM

Title: How we evaluate courses
Post by: Don Mahaffey on December 02, 2017, 11:28:07 AM
 We’re a data-driven bunch.

We are almost unanimous in naming the greats, Cypress, Pine Valley, The Old Course…we have some outliers, but for the most part we generally agree on the greatest in the world.
But why do we think they are great? The experience? The golf? We break ‘em down, develop all these metrics, then apply them to other courses we want to grade.
What other art form uses this process to critique art? And does it really work in golf?
For example, we know we like playing TOC, we like how it makes us FEEL. But how do you measure that? We try, and that means we must define that feeling analytically so we can try to recreate it. Then we talk about width and wild greens, firm and fast turf, and if a course has those things, done well, we might grade it high. But does breaking down the parts, taking those parts and applying them elsewhere, come anywhere close to recreating the sum we have at TOC?
Kyle Harris’ post about green size got me thinking. In a million years, I could never tell you which greens are largest, or smallest, at Cypress Point. If I were given a similar site to work on, and I wanted to recreate the feeling of playing Cypress, would it do me good to go measure the greens and use that data when I build the new course? I don’t think so. Some “experts” might like it, people who study such things, but does doing something like that really create the experience you get when you play the great golf course you borrowed from?
It’s the sum that matters to me. How the parts are arranged to reach that sum is the key, IMO, and that isn’t so easily borrowed.
 
Title: Re: How we evaluate courses
Post by: Kyle Harris on December 02, 2017, 11:38:39 AM
We’re a data-driven bunch.

We are almost unanimous in naming the greats, Cypress, Pine Valley, The Old Course…we have some outliers, but for the most part we generally agree on the greatest in the world.
But why do we think they are great? The experience? The golf? We break ‘em down, develop all these metrics, then apply them to other courses we want to grade.
What other art form uses this process to critique art? And does it really work in golf?
For example, we know we like playing TOC, we like how it makes us FEEL. But how do you measure that? We try, and that means we must define that feeling analytically so we can try to recreate it. Then we talk about width and wild greens, firm and fast turf, and if a course has those things, done well, we might grade it high. But does breaking down the parts, taking those parts and applying them elsewhere, come anywhere close to recreating the sum we have at TOC?
Kyle Harris’ post about green size got me thinking. In a million years, I could never tell you which greens are largest, or smallest, at Cypress Point. If I were given a similar site to work on, and I wanted to recreate the feeling of playing Cypress, would it do me good to go measure the greens and use that data when I build the new course? I don’t think so. Some “experts” might like it, people who study such things, but does doing something like that really create the experience you get when you play the great golf course you borrowed from?
It’s the sum that matters to me. How the parts are arranged to reach that sum is the key, IMO, and that isn’t so easily borrowed.

Don,

The idea behind my post was that it took me years, and in some cases hundreds of plays, to realize the variance in the size of the greens. I also realized it was only when the outliers engendered a negative review that something like green size was even mentioned!

Therefore, your point about Cypress Point is exactly the kind of experience I had - it was only in retrospect that I realized my favorites had some very well done variably-sized putting surfaces. So well done that I didn't even think about it.

As I am beginning to tell my more experienced staff members: "If I don't notice it, you probably did it right."
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 02, 2017, 12:19:12 PM
Don:
you emphasized the right word, IMO - feel.
How do we quantify or evaluate any personal experience?
Years ago, a "10" reflected one young man's feelings about a given golf course; and, like most people at that age, he couldn't care less what anyone else felt about it.
Years later, despite that (now older) young man's reminders that all such evaluations are subjective, that "10" has become the gold standard - as if it's the truth, or a fact, or objectively provable and repeatable.
If I ever get a chance to play some of these great courses, I wonder if I'll be able to experience them with fresh eyes, to actually feel them in a personal/subjective way - unaffected by the 'objectification' (in ratings and magazines and here) and the external evaluations of the experience. 
I think I can do some of that now, with the little known courses. I had so much fun and such a great day of golf and was so impressed by a 1930 Stanley Thompson course I recently played — and later was very surprised to learn that ‘objectively’ it’s not on anyone’s list of top quality courses.
Peter   
Title: Re: How we evaluate courses
Post by: Tom_Doak on December 02, 2017, 06:14:31 PM
I think the most important thing is to evaluate a course for what it does, rather than what it doesn't do.  All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.


For me the best object lesson of the last few years was to see the Himalayan Golf Club in Nepal.  Normally I'm a fan of firm and fast conditions, but those are impractical in that setting with such limited resources.  Instead, I found a golf course with narrow landing areas, and semi-shaggy fairways that limited bounces and roll - a different solution that was perfectly melded to local needs.  I don't believe that any professional architect (including myself) would have come up with such a practical solution.
Title: Re: How we evaluate courses
Post by: Sean_A on December 02, 2017, 08:33:27 PM
I think the most important thing is to evaluate a course for what it does, rather than what it doesn't do. 

Si.  The problem with ranking criteria is we are forced to look for features (and then evaluate what we looked for) rather than see what is there.  Even so, what bloody difference does it make if a course is great or good?  If this is the boiled-down reason for playing the game, I am not interested.  As Don states, the enjoyment of a course really comes down to how it makes one feel (and I would add think).  The thing is, though, no two people share identical feelings about a course.  I am quite happy to understand that course B is not as good as course A, but I like course B more anyway.  People get caught up in these quality debates when we are usually nitpicking among the very best.  Why...because of how courses make people feel and think.

Ciao
Title: Re: How we evaluate courses
Post by: Ally Mcintosh on December 03, 2017, 03:45:44 AM
Completely agree it's down to an overall feel above everything else. Ranking categories are a complete sham.


However, you have to evaluate WHY it makes you feel that way (as a designer) if you are going to learn anything at all from it. If you are just playing golf, less so.
Title: Re: How we evaluate courses
Post by: Thomas Dai on December 03, 2017, 04:39:05 AM
"Do I want to go back?" would seem a pretty good basis for evaluation.
Atb
Title: Re: How we evaluate courses
Post by: Sean_A on December 03, 2017, 05:08:09 AM
"Do I want to go back?" would seem a pretty good basis for evaluation.
Atb


There are tons of courses I want to play again.  A more important question is which courses would I be willing to pay to play again....or in other words, which courses have my full attention.  The number significantly drops at this point, but I am sure that, to a certain degree, not being willing to pony up for an additional game is a comment on how you feel/think about the quality and style of the course.


Ciao
Title: Re: How we evaluate courses
Post by: PCCraig on December 03, 2017, 09:04:33 AM
All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.



I absolutely love the above statement. Just terrific. I am going to have to noodle it a bit.
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 03, 2017, 10:12:03 AM
Pat - yes, that was a terrific line. So too was the Tom-Sean exchange about seeing what's there instead of looking for what's not.

(Brought to mind a Par 4 I know. As a "short par 4" it's meh; but as a "par 4 that just happens to be short" it's actually an interesting and nuanced golf hole.)

That might be a very good thread: What does your favourite/best golf course *not* have?
Title: Re: How we evaluate courses
Post by: Dave McCollum on December 03, 2017, 11:00:19 AM
Several times I’ve tried to comment on my feelings about ratings and given up each time.  One anecdotal feeling I think I’ve noticed is that when I’ve done my own research about where I wanted to play and picked my own courses, my “wow” factor is slightly elevated, indicating a predisposed bias before playing, or perhaps just a bit more knowledge going in.  I really don’t have much experience with others picking courses for me.  I did a couple of weeks in Ireland/N. Ireland: the first week as an arranged tour, the second on my own.  I didn’t like two of the courses I played the first week as much as the others, or as much as any of the courses I picked myself.  They also happened to be the hardest (Waterville, Euro Club).  So, for me, my suspicion is that not only is my playing experience subjective, it begins developing before I ever get there.  I suppose there is a special joy when built-up expectations don’t disappoint.
Title: Re: How we evaluate courses
Post by: Rich Goodale on December 03, 2017, 11:24:42 AM
Great post Don.


Vis-à-vis Art vs. Golf, you can merely observe the former, but you observe and interact with the latter.  This is why I play 20+ different golf courses/year and only visit the Louvre and Musée d'Orsay every 5+ years or so.  It's all about form and function.  All art has form that one gets or doesn't get (or does or doesn't like), whereas all golf has a very strict function.  9 or 18 holes.  All 100-600 yard holes.  Fairways and greens.  3-6 hours per round.  Walk or pull or ride.  Sharing a beverage or three and chatter in the clubhouse, or driving home to change nappies or watch football (American or ROTW).

I've been fortunate enough to play many of the "great" courses in my life, but very few of these do I care to revisit, per se.  These days it is all (to me) who I am playing with rather than where I am playing, whether it be Cypress Point or Auchterderran.

All golf courses are interesting, some more interesting than others.

Rich
Title: Re: How we evaluate courses
Post by: Jack Carney on December 03, 2017, 04:25:54 PM
Magazine categories are interesting attempts to put criteria into words, very hard to do. If we all created a system they would all be different - still good attempts. One category that we all allude to but don't define - and none of the magazine or rating systems define it either - is fun! We all like fun courses, and fun remains outside these systems, but we refer back to it in one way or another. That's why we like course B regardless of it being rated lower than course A. It's more fun to play - why? Again, difficult to put into words! Just MO.
Title: Re: How we evaluate courses
Post by: Tom_Doak on December 03, 2017, 05:30:30 PM
All of the rules and checklists propose that there are certain aspects of a course that are critical to its success, but ultimately the best courses stand out because they are different and violate everyone's silly rules.



I absolutely love the above statement. Just terrific. I am going to have to noodle it a bit.


You don't have to look much farther than the differences between The Old Course (or North Berwick) and Pine Valley.  They are almost two opposite poles, with all of the lesser courses falling somewhere in the boring middle.
Title: Re: How we evaluate courses
Post by: Steve Lang on December 03, 2017, 05:45:24 PM
 8)   The boring middle??????????????  Surely you jest.
Title: Re: How we evaluate courses
Post by: Rich Goodale on December 03, 2017, 06:34:00 PM
8)   The boring middle??????????????  Surely you jest.


Hopefully Tom was jesting, given that virtually all of his courses would be in the "boring middle," by his definition.
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 03, 2017, 07:16:57 PM
8)   The boring middle??????????????  Surely you jest.
Hopefully Tom was jesting, given that virtually all of his courses would be in the "boring middle," by his definition.

Just a guess, of course - but I don't think Tom was joking as much as exaggerating to make a point.

From reading books and course profiles and the rankings, I've (tentatively) concluded:

That if you don't understand what makes for greatness, your own work will never stand the test of time. But if you don't know how to tailor that greatness for the time & place in which you live, you won't have much of your own work to begin with.

But "boring" isn't the right word for that very fine line. "Measured" might be a bit closer, it seems to me - but then again not really.

Peter 
Title: Re: How we evaluate courses
Post by: Steve Lang on December 03, 2017, 09:46:42 PM
 8)  I was expecting... "stop calling me Shirley!"
Title: Re: How we evaluate courses
Post by: Tom_Doak on December 04, 2017, 06:14:50 AM
8)   The boring middle??????????????  Surely you jest.


In math terms, I was saying that those two courses are several standard deviations from the norm, at either end of the extreme -- Pine Valley is islands of fairway with severe hazards all around, while St. Andrews is all fairway, just punctuated by some deep bunkers.


The norm, the courses that follow all of the rules, are boring in my opinion.  You may think they're fair to play, but there is no point in traveling to see them, because there is nothing different about them.  To me, the great courses are the ones that demand you go see them, because there is something different about them.  So I try to stay out of "the boring middle".
Title: Re: How we evaluate courses
Post by: Jeff_Brauer on December 04, 2017, 08:53:32 AM

As I said somewhere, golfers tend to judge a hole (or courses) on difficulty, beauty or uniqueness, depending on golf skill and personality type.


IMHO, I agree it's mostly feel, and would bet that when a rater goes to his ballot, he makes the numbers match his gut feel, even if the point system is designed to make him think twice and be objective.


Since Cypress was mentioned, I will say that it's among my favorite courses for beauty, but Lanny Wadkins hates it because it's too short and easy.  A magazine did a tour pro survey and they basically said the same thing, ranking it lower.


So for me, my preferences are for beauty, a few unique holes, and then difficulty.  Others may disagree, and c'est la vie, non?
Title: Re: How we evaluate courses
Post by: Ed Brzezowski on December 04, 2017, 09:26:10 AM
Reading this topic reminded me of the part in Dead Poets Society about evaluating poetry on an X-Y axis. Only when done this way can a poem be properly evaluated.

Tastes change over the years, as does one's playing ability. You start seeing things as you get older that were not as relevant when you could muscle a drive past them. The greats will always stand out, as they should, but can a change in playing ability change your thoughts?

Great topic.
Title: Re: How we evaluate courses
Post by: Ira Fishman on December 04, 2017, 09:49:00 AM
Ed,


There is no question that a change (hmm, decline) in my playing ability has affected the way I evaluate courses.  But I think for the better.  When younger, I paid little attention to, and therefore did not appreciate, green contours as they affected both chipping and putting.  Now that the short game is one of the few things I can improve with practice and focus, I devote more time and attention to evaluating and admiring green complexes.


Ira
Title: Re: How we evaluate courses
Post by: David Wuthrich on December 04, 2017, 12:08:21 PM

I'm not a data person.


When people ask, I answer Blondes, Brunettes and Redheads!


They are all different, they are all wonderful but some people prefer one over the other.


I happen to like them all !


There are difficult courses, beautiful courses, architecturally interesting courses, etc.


I try to judge each in its own category.
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 07, 2017, 05:32:22 PM
Golf is by definition (see the rule book) a very data-intensive sport played in definite categories. Success at the game is not determined by how you feel or by the beauty of your swing. Rather, it's determined by cold, hard numbers: how many strokes did you take against your handicap or against your opponent?

Obviously, you can play without counting and revel in the beauty of a course and the challenge of certain shots. You could find more satisfaction in sinking a curling 10-footer for a 10 than in a tap-in for birdie. But I suspect that neither the rulebook nor golf courses were made with that type of player in mind. Historically, going back as far as you like, golf was always about numbers.

That doesn't mean that judging golf courses must be done by numbers, just that it seems logical. If someone finds a better way, then I'm all ears. But the only alternative seems to be "don't judge, it doesn't work", which is legitimate, but not very courageous.

Ulrich
Title: Re: How we evaluate courses
Post by: Sean_A on December 07, 2017, 05:44:44 PM
I am not sure the rules of golf and evaluating courses are analogous...if so...how? 


My belief is numbers guys fit the numbers to how they feel and think about courses.  Just about any list author/editor will do an eye test after all is said and done, and if something doesn't look right they will make adjustments.  I tried the numbers gig for a while as an experiment and found that too often stuff didn't pass the eye test (which I consider far more important than any set of data).  It got to the point where I figured the system was broken, not the assigned numbers.  I never came close to figuring out how to make the system fit the eye test.  I just tried it again for another magazine and came up with some interesting results which I didn't buy, but that was mainly because of categories I didn't think mattered or could easily be wrapped into the larger picture as small-beer stuff.

Ciao
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 07, 2017, 09:06:42 PM
Just thinking of Don's original post, and the flip-side (maybe) of decades' worth of analysis and evaluation:

on the one hand, across the board the art-craft of gca has never been better: an industry chock-full of passionate, experienced, talented, educated, technically proficient, committed, responsible and practical professionals, with every mechanical and (sufficient) financial resource at their disposal -- and the work itself always good-to-very good-to excellent

but on the other hand: gone is the accidental, the mad-cap, the unintended, the savant, the slowly-evolving, the beginners, the constrained, the tentative and experimental -- with the work itself characterized by countless missteps, and by long-ago-plowed-under failures, but also by several instances of totally unique and enduring greatness (flaws and all)

Ah, well, maybe it's inevitable and in the natural order of things, i.e. you win some, you lose some.
Of course, maybe it's nothing of the sort - just a romantic's rendering of history via a false narrative

But many others (almost infinitely more qualified than me) have praised TOC to the moon while simultaneously recognizing that it would never be built today. Lots of reasons for that, I suppose -- but could one reason be the decades' worth of analysis and evaluation that we all value so highly?
     
Title: Re: How we evaluate courses
Post by: Ian Andrew on December 08, 2017, 08:38:03 AM

A long time ago my Dad was teaching me about water colours. I struggled with over-painting. He got me to paint less and less until I was down to just a few strokes. He explained that was the essence of what's necessary to add colour to a composition. He then told me I would need decades to figure out how to add four or five more strokes to achieve what I really wanted to paint.


Excellence is not perfection. In fact perfection is actually fairly cold and distant, when presented in most art forms. Excellence lies in those last four or five strokes/choices being charming and engaging rather than misplaced.


Excellence lies in the interaction of multiple elements creating an emotional response in the observer, or in our case the player.
You really think we can mathematically define that ... I don't.
Title: Re: How we evaluate courses
Post by: George Pazin on December 08, 2017, 12:11:44 PM

A long time ago my Dad was teaching me about water colours. I struggled with over-painting. He got me to paint less and less until I was down to just a few strokes. He explained that was the essence of what's necessary to add colour to a composition. He then told me I would need decades to figure out how to add four or five more strokes to achieve what I really wanted to paint.


Excellence is not perfection. In fact perfection is actually fairly cold and distant, when presented in most art forms. Excellence lies in those last four or five strokes/choices being charming and engaging rather than misplaced.


Excellence lies in the interaction of multiple elements creating an emotional response in the observer, or in our case the player.
You really think we can mathematically define that ... I don't.


Fantastic post, best on the thread, imho.


I think one problem most people, especially smart people, have is a tendency to overthink or over-analyze everything. More is always better, seemingly.


My own personal standard for golf courses is simple: do they ask interesting questions of the golfer? If the question is merely "what's my yardage to this or that?", I don't find that particularly compelling. The really special courses ask interesting questions on almost every shot (which also means that they accommodate most every shot as well).
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 08, 2017, 05:46:58 PM
Well, if golf courses are to be judged on a purely emotional basis, then what sense does it make to rank them at all? Emotions are highly subjective and half of them aren't down to the golf course, but to personal dispositions on the day of play.

And if you answer: "well, it makes no sense", then why are you taking part in this thread? :)

As an aside, I have yet to see anyone writing course reviews without using numbers to establish some kind of ranking :)

Ulrich
Title: Re: How we evaluate courses
Post by: Ian Andrew on December 08, 2017, 06:09:16 PM
I didn’t say don’t rank them, but stop trying to quantify it like it’s science. It’s art.


When I go to a museum - something I do enjoy doing when I travel - I don’t try to apply science to why I connect with a painting. I just do.


But I may look at how the composition was created or find out more on the subject matter to deepen my connection. Or just learn something.


I still like lists because they provide ideas on what to see. They gave me a great place to begin learning.
Title: Re: How we evaluate courses
Post by: Sean_A on December 08, 2017, 08:30:43 PM
Well, if golf courses are to be judged on a purely emotional basis...
Ulrich


I don't think this is an accurate representation.  Yes, how a course makes one feel is important, but also how it makes one think is at least as important. 

At the end of the day, one will do what one will do.  A formula/math-based approach to decode how you feel and think is what makes you comfortable.  I notice you seem to focus on quirk, shot values, scenery and flow.  Three of those categories are not something I would personally lean heavily on in making a determination about course quality.  More importantly, all the criteria are subjectively important for you.  At some point, it is worthwhile recognizing that to each his own is perfectly fine, and that because someone can back up rankings with numbers doesn't mean the system is more valid or better.  The numbers are simply a function of how you feel and think.

Ciao
Title: Re: How we evaluate courses
Post by: Ally Mcintosh on December 09, 2017, 04:33:44 AM
I think Sean is correct.


I liked Ian's post but don't agree with it 100%. Whilst it is primarily art, there is an element of science also.


I do believe there is some objectivity in evaluating courses. Despite thinking that it's first and foremost how a course makes you feel (and think) on a primal level that determines how good it is.
Title: Re: How we evaluate courses
Post by: Tom_Doak on December 09, 2017, 11:44:17 AM


I liked Ian's post but don't agree with it 100%. Whilst it is primarily art, there is an element of science also.


I do believe there is some objectivity in evaluating courses. Despite thinking that it's first and foremost how a course makes you feel (and think) on a primal level that determines how good it is.


You're halfway there, then.  You just have to let go of the side of the pool now :)  Trust me on this, your work will get much better once you stop trying to rationalize it.
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 09, 2017, 12:23:00 PM
I’ve often thought about this:

In an art-craft that isn’t wholly improvised, how much planning and rationalizing is necessary and beneficial, and how much of it is actually diminishing and counter productive (especially if you’re not playing it safe or satisfied with the merely good)?

In other words: when do you let go of the side of the pool? Too early and you (likely) drown, though on the other hand you might set a record; too late, and you’ll (likely) survive to swim another day, but you’ll never stand out from the crowd or on the top step of the podium.

Reminds me of the long departed hierarchy of values thread. Not everyone wants or needs most of all to stand on that top step; but if that’s your number one goal it sure does seem like a leap of faith is called for.

Watched an excellent documentary on John Coltrane the other night. He was making terrific music and an excellent living with giants like Miles and Monk - but that wasn’t his goal. And so he leaped, and for him it was literally a leap of faith. And can you believe it? “A Love Supreme” actually became a revered (not too surprising) and best selling (very surprising) album!

I think there’s a hunger out there for true, real, brave and inspiring greatness. But not enough of us ever take the leap.

Title: Re: How we evaluate courses
Post by: Ally Mcintosh on December 09, 2017, 01:42:28 PM
Perhaps you boys are defining art and science differently to me. Please don't confuse letting go of the side of the pool (i.e. being brave) with the belief that GCA is all art.


I design by feel (art). Occasionally, it does me well to make sure that my eye isn't deceiving me by running a few checks (science). Tom, I suspect you do exactly the same.


As for evaluating courses, I do this by how they make me feel on an emotional level. But an element of objectivity kicks in based on certain golfing attributes that the course may have. You have to know what elements help to make a good golf course in order to constructively evaluate it. Otherwise everyone's opinion should hold the same weight. And The K-Club would be Ireland's No.1 course.
Title: Re: How we evaluate courses
Post by: Tom_Doak on December 09, 2017, 04:53:26 PM
Ally:


If your definition of science is whether I check greens with a transit, yes I do; it's amazing to see how even the best shapers can be fooled by their own eyes.  And at some point I do a scorecard so I will know how short a course we are building in case the client has questions.


But calling any of the conventional wisdom about golf design "science" is demeaning to actual science.  Many golf course architects have had their own theories and formulas, but none have ever been mathematically proven true.
Title: Re: How we evaluate courses
Post by: Ally Mcintosh on December 09, 2017, 05:10:54 PM
Ally:


If your definition of science is whether I check greens with a transit, yes I do; it's amazing to see how even the best shapers can be fooled by their own eyes.  And at some point I do a scorecard so I will know how short a course we are building in case the client has questions.


But calling any of the conventional wisdom about golf design "science" is demeaning to actual science.  Many golf course architects have had their own theories and formulas, but none have ever been mathematically proven true.


Valid point. My definition of science was probably wrong or maybe just flippant.


I just don't want people to think that everything is creative and that there is no element of engineering involved. If I have a criticism of myself, it is that I spend too much time on the former and sometimes too little on the latter.
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 09, 2017, 06:08:02 PM
Designing/building golf courses is a craft and it does have an element of art, no doubt. But evaluating / ranking golf courses? That's a different beast altogether.

I don't think that any system is superior to another, because it is based on numbers or has the "right" categories. But I do believe there is a value in committing yourself to something. Be that a scale from 1 to 10 or a fixed set of categories, but there should be something to make your work comparable, to allow others to judge where you are coming from and hold you accountable.

Those who believe in a primarily feel-based approach should not forget that emotions are a positive thing only 50% of the time. The other half is hating a course or getting bored to death by it - and are you really going to put those words out there? I believe criticism is important, but you should do it responsibly and give a factual appraisal that adheres to some kind of professional or personal standard. And that standard should be out there for others to look at.

And don't worry, no matter how formal you try to be, emotions will still influence you more than enough :)

Ulrich
Title: Re: How we evaluate courses
Post by: Sean_A on December 09, 2017, 06:33:34 PM
Ulrich

Don't confuse a number system with facts. If you want to critique a course you hit the highs and lows. There is no need to assign numbers using a subjective formula and that in no way makes a critique more valid. If anything I would say numbers are a scam because they are used in a subjective manner that at least on the surface is trying to pass for objectivity.

Ciao
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 09, 2017, 07:22:46 PM
Coincidental to Sean's post, I'd just been thinking about the value (and relevance to this thread) of his own, personal rating system, ie
3*  Don't miss for any reason
2*  Plan a trip around this course
1*  Worth the expense of an overnight stay
R    Worth a significant day trip (no more driving than it takes to play and have drinks)
r    A good fall back on course/trip filler
NR Not recommended

This approach is more telling (it seems to me) and more useful (as a guide to someone like me) than the traditional numerically-based systems (e.g. Top 100) where each course gets a score and then is ranked accordingly -- and where the difference in the scores between the 1st and, say, 15th courses is very often equal to or even greater than the difference between *all* the other courses on the list combined.
(E.g. if No. 1 scores a 9.4 and No. 15 a 7.8 -- a difference of 1.6 points -- it sure seems that between No. 16 and No. 50 you'll find that same 1.6-point difference.)

Now, this is not to disparage those lists or even the 'criteria' involved in rating courses (we've already done that endlessly and ad nauseam -- and I very much enjoyed Ben's Michigan list and Brian's Ohio list, especially the write-ups). Instead it's to note:

a) that, while like Sean's approach these kinds of lists do very well at highlighting the best of the best (about 10%), unlike Sean's list they do a relatively poor job of guiding a would-be traveler on deciding which course among the remaining 90% (presumably all good-to-very-good courses) is appreciably better or more interesting than any other; and to note
b) that this difference between the two approaches suggests something important about the topic at hand, i.e. how we evaluate golf courses

What it suggests is that the decades-long approach to course evaluation (ie one that consciously or not uses the same basic approach/set of expectations for every single course being evaluated) has led to the vast majority of (even very good) golf courses being virtually indistinguishable in terms of quality and interest.   

I wonder what the potential impact might've been on gca if, 60 years ago, the magazines and the rest of us had adopted the Arble Scale, which seems to preference *uniqueness* and *exceptional quality* above all else.  There wouldn't have been much place to hide, or even much room for nuance for any new course that was built:

Golfers might plan an entire trip (and pay for an overnight stay) simply to play course X; or they might make a long drive to play it; or they might skip it all together! That would've gotten one’s attention... :)
Title: Re: How we evaluate courses
Post by: Sean_A on December 09, 2017, 09:22:41 PM
Pietro

Thank you, the approach is meant to serve as a guide for travellers so it is good to get affirmative feedback.  My approach is a take on Doak's and heavily leans on the Michelin Guide...obviously. I don't think there is nearly enough distinction between courses to warrant a 1-10 scale....though I expect I am in the minority on this issue.  Perhaps the most important aspect of the approach is that I don't make a final judgement on the quality of a course.  This is in the main due to my belief that course quality is but one of at least a few reasons why people travel to play golf or why they choose to play certain courses.  From personal experience, if a course is of a certain minimal standard which I admit is a purely subjective matter, I can like it just as much as the best courses in the world.  In other words, course quality is not enough to guarantee that I will like it.  The enjoyment of a game and course revolves around so much other than the quality of a course that I never felt the need to worry about a best list...even if I could figure out how to go about it in a satisfactory manner. 

Ciao
Title: Re: How we evaluate courses
Post by: George Pazin on December 10, 2017, 01:48:42 PM
If anything I would say numbers are a scam because they are used in a subjective manner that at least on the surface is trying to pass for objectivity.

Ciao


I love this thought.
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 10, 2017, 05:21:35 PM
Sean,

I don't understand. You are using numbers. Tom Doak is using numbers. Everyone is using numbers. Can you point me to any reviewer that is not using numbers?

And yet you say numbers are a scam?

Ulrich

PS: Apparently you even have categories: "the Arble Scale, which seems to preference *uniqueness* and *exceptional quality* above all else" - at least that is the perceived opinion.
Title: Re: How we evaluate courses
Post by: Sean_A on December 10, 2017, 07:57:50 PM
Sean,

I don't understand. You are using numbers. Tom Doak is using numbers. Everyone is using numbers. Can you point me to any reviewer that is not using numbers?

And yet you say numbers are a scam?

Ulrich

PS: Apparently you even have categories: "the Arble Scale, which seems to preference *uniqueness* and *exceptional quality* above all else" - at least that is the perceived opinion.

Ulrich

No, I don't use numbers in the same way you do....so far as I can tell.  My numbers are merely labels for convenience.  I don't run through a series of categories valued 1-10 (or whatever) with weighted values etc etc to determine a final formula number grade.  Besides, I am not ranking courses by quality so going through a hullabaloo such as that would be silly.  I don't give a fig if Muirfield is better than North Berwick.  I am offering my opinion to golfers willing to travel as to which courses they should consider playing.

Ciao
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 11, 2017, 05:44:47 AM
Well, we are making progress here :)

So you are in fact using numbers, but the debate is on the correct way of using numbers.

You seem to prefer to assign one number arbitrarily, whereas I assign them arbitrarily in four categories and take the average of those. So the world of difference between our systems appears to be whether to assign one number arbitrarily or four and whether to use a scale of 1-6 or 1-10.
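(To illustrate with made-up numbers: a course might get, say, a 7 for quirk, 6 for shot values, 8 for scenery and 5 for flow, which averages out to (7 + 6 + 8 + 5) / 4 = 6.5 on my scale.)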

Ulrich
Title: Re: How we evaluate courses
Post by: Sean_A on December 11, 2017, 05:55:30 AM
Well, we are making progress here :)

So you are in fact using numbers, but the debate is on the correct way of using numbers.

You seem to prefer to assign one number arbitrarily, whereas I assign them arbitrarily in four categories and take the average of those. So the world of difference between our systems appears to be whether to assign one number arbitrarily or four and whether to use a scale of 1-6 or 1-10.

Ulrich

No, my numbers are a shortcut.  There is no significance to the label in and of itself.  It is simply easier to write 'r' rather than 'a good fall back on course/trip filler'.  So far as I can tell, Doak does the same thing.  You on the other hand use weighted categories whereby the numbers are indications of quality and are used to determine a final number quality score.  I think you know there is quite a difference  ;)

Ciao
Title: Re: How we evaluate courses
Post by: John Kirk on December 12, 2017, 05:52:00 PM
I'll bump this back to the top.

In the persistent search to find thoughtful ways to describe how golf courses should be evaluated, I will add the following thoughts.  I have not played Cypress or Pine Valley, but I have played approximately one-third of the most coveted courses in the U.S.

-  A great golf course should give the player a sense of place.  Whether it is the oak dotted coastal plains of south Texas, or the mixed deciduous forests of coastal southern Oregon, a great golf course should remind the player of where they are.  In the case of St. Andrews, Scotland, the sense of place is largely defined by the town itself.

-  With rare exceptions, a great golf course should have short, logical green to tee walks, and provide a compelling walk through the golfing park.

-  A great golf course provides a grand variety of shots that stimulate the player's senses in a positive fashion.  The shots offered consistently evoke excitement and anticipation, as opposed to evoking dread or fear.


Regarding the "math" discussion, any numbers used to evaluate a course are an approximation of an analog scale.  It's not really mathematical at all.  And finally, I know a few people here who are math whizzes, the kind of people who probably scored (or actually did score) 800 on their math SAT when they were kids.  Some or all of them have been, or still are, course raters.  All of them treat course rating as an analog, intuitive exercise.  None of them are category-driven; they just go by a general sense of how good a course is.
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 12, 2017, 06:28:36 PM
Sean,

I honestly don't see the difference between what you describe as your way of rating courses and what I do. We are both using numbers as shortcuts for a narrative and we are both saying 3 is better than 2.

We are also both rating quality, only we may have different definitions for quality in golf courses. I haven't heard yours, but it appears to have something to do with trip planning.

Personally, I don't believe in second-guessing people which number would be worth a detour for them or even worth planning an entire trip around and which number they would see as "not recommended". As a player I would ask the rater this: not recommended for who and in which circumstances? I occasionally find myself playing courses that suck, but if I am itching to play golf that day and those are the courses available that day, then I will play those courses that suck and I will have fun.

I'm thinking that for me an 8 is better than a 5, but I try not to make any blanket recommendations. I've seen players hating courses that I praised and vice versa.

Ulrich
Title: Re: How we evaluate courses
Post by: Mark_Fine on December 12, 2017, 06:55:44 PM

With all due respect, it always boils down to numbers!  Whether you are judging paintings in a museum, deciding which is your favorite ski resort, discussing the best restaurants in town, talking about the prettiest girl at the dance, determining who is the class valedictorian, or deciding which golf courses are the best in Scotland - it all comes down to numbers!  Do we walk around the Louvre in Paris saying that painting is a 7 and that one is a 9?  No, but if someone asked us on the way out which ones were our favorites, many of us would have an answer.  If asked why we liked them so much we would hopefully be able to articulate the reasons for our preferences.  Stating what our favorite golf courses are is no different. 


I never really understood why there is so much resistance to the concept of numbers.  It is what it is.  Human nature loves to rank things and put things in some kind of order, and that order is all determined by certain criteria and preferences.  And in the process we give values to those criteria and preferences (whether we want to admit it or not).  We may or may not write down a numerical value for each criterion, but at the end of the day we make a determination as to our favorites (our ranking) based on those combined values. 


When Peter says he uses a 3-r scale, it is just another numerical scale.  If he was pressed to narrow his scale down even further (e.g. what are the best of the must-see courses that he calls 3s?) he would be forced to figure out a way to determine what they are.  If he said he couldn't, then some would argue he is just not as knowledgeable about his subject matter as he needs to be.  I don't know if Peter is married or not but if he is not, maybe it is because he was never able to narrow that list down either  ;D   


Ran once told me that he could tell me why his 42nd ranked course is better than his 43rd ranked course.  Frankly, I would struggle to do that.  I actually like the Doak Scale and I can get down to what is an 8.5 vs an 8, but that is where, if I am honest with myself, I get tapped out.  After that level of scrutiny, it is more a coin flip as to which 8.5 is better than another 8.5. 


There is nothing wrong with numbers.  It is more about "what formulates those numbers" that one ends up with.
Mark
Title: Re: How we evaluate courses
Post by: Sean_A on December 12, 2017, 07:32:24 PM
Sean,

I honestly don't see the difference between what you describe as your way of rating courses and what I do. We are both using numbers as shortcuts for a narrative and we are both saying 3 is better than 2.

We are also both rating quality, only we may have different definitions for quality in golf courses. I haven't heard yours, but it appears to have something to do with trip planning.

Personally, I don't believe in second-guessing people which number would be worth a detour for them or even worth planning an entire trip around and which number they would see as "not recommended". As a player I would ask the rater this: not recommended for who and in which circumstances? I occasionally find myself playing courses that suck, but if I am itching to play golf that day and those are the courses available that day, then I will play those courses that suck and I will have fun.

I'm thinking that for me an 8 is better than a 5, but I try not to make any blanket recommendations. I've seen players hating courses that I praised and vice versa.

Ulrich

Ulrich

I don't have any idea of what your narrative is for an 8 or 5.  My system offers a fairly clear narrative as to whether or not I think a course is worth the effort, time and money to play.  Additionally, I give a modest picture tour with brief descriptions to enable the reader to judge if my summary is accurate or worthwhile.  I accept that we all can't agree. 

I am not rating quality regardless of how our views may differ.  If I were focusing on a quality list my main criteria would be

-routing: quality of the walk, views, green sites and natural features

-the site: terrain and quality of soil/grasses

-the greens: firm and true or otherwise appropriate to the climate and design

-man-made features

I wouldn't bother with the house, cost, likely pace of play, history etc...important aspects for many people when it comes to choosing where to play...but a quality list isn't about that.

My list of best courses would look very different to my favourites and different again to the recommendations.  As I said previously, I don't care how good a course is once a certain level of quality is achieved. 

My recommendations are for the general population of golfers who are willing to travel...the scale makes this obvious.  This population can choose to take or leave the recommendations.  We all play crap courses here and there, but do you recommend others to play those crap courses or rate the courses highly because you had fun?

We shall have to agree to disagree about the use of numbers. I emphatically do not use a number system or formula.  Trying to do that when evaluating what is actually in the ground instead of what I want to be in the ground does not, in my experience, work very well.  Or maybe a course is exceptional in one area and poor in another.  I may decide the exceptional far outweighs the poor and recommend the course higher...or conversely decide the poor aspects outweigh the exceptional elements.  Each course is different and treated as unique.

Mark - I don't really care if you use numbers.  As you say, it seems to be human nature.   

Ciao
Title: Re: How we evaluate courses
Post by: Mark_Fine on December 12, 2017, 10:00:05 PM
Sean,
You stated "your system offers a fairly clear narrative as to whether or not you think a course is worth the effort, time and money to play."

I presume this system results in a list of courses that meet your criteria (does it not)?  I would also speculate that that list of courses would be compelling and enjoyable to play.  So if I asked you to rank that list of 10 or 20 or 100 courses that your system generates, could you do it?  Or would they all be worth the exact same effort, time and money to play? 

I think you use numbers more than you might want to admit.  You just don't write them down  ;)
Mark


Title: Re: How we evaluate courses
Post by: John Kirk on December 12, 2017, 11:15:50 PM
Ran once told me that he could tell me why his 42nd ranked course is better than his 43rd ranked course.  Frankly, I would struggle to do that.  I actually like the Doak Scale and I can get down to what is an 8.5 vs an 8, but that is where, if I am honest with myself, I get tapped out.  After that level of scrutiny, it is more a coin flip as to which 8.5 is better than another 8.5. 


There is nothing wrong with numbers.  It is more about "what formulates those numbers" that one ends up with.
Mark

Hi Mark,

At some level you have proved my point with this paragraph.  Ran can tell you why he likes the 42nd best course better than the 43rd, but in all likelihood, they will be assigned the same number as a rating.  The digital system is an approximation of an analog rating methodology.

I like the Doak scale too, and I love to rate courses and songs with numbers or stars.  I like discrete groups of like-rated things.
Title: Re: How we evaluate courses
Post by: Sean_A on December 13, 2017, 03:51:09 AM
Sean,
You stated "your system offers a fairly clear narrative as to whether or not you think a course is worth the effort, time and money to play."

I presume this system results in a list of courses that meet your criteria (does it not)?  I would also speculate that that list of courses would be compelling and enjoyable to play.  So if I asked you to rank that list of 10 or 20 or 100 courses that your system generates, could you do it?  Or would they all be worth the exact same effort, time and money to play? 

I think you use numbers more than you might want to admit.  You just don't write them down  ;)
Mark

Mark

No, there is no list other than all the courses I play.  If a course isn't good enough I simply give it an NR.  But to be honest this doesn't happen often because I take my own advice and avoid courses which aren't very good  8)  There is no need or desire on my part to rank the courses...the recommendation speaks for itself.  When it gets down to brass tacks it doesn't matter a tosh what recommendation I give to the big guns...they come self-recommended, if you follow.  It is the lesser courses where I can provide a service.  Perhaps golfers looking for better value, more experience, an aversion to tourists etc will use the information to their benefit.  A few taps of the fingers will find days' worth of info and press on Muirfield, not so much for a great deal of courses worthy of attention and time.   

Ciao
Title: Re: How we evaluate courses
Post by: Mark_Fine on December 13, 2017, 07:28:12 AM
Sean,
I understand that you personally don't like to make lists or rank courses.  But by your own admission you have at least a few lists - courses you have played, courses you won't play, and courses you played and then gave an NR. You have to admit you wouldn't know what to play, or what was maybe worth playing, if someone else didn't provide those lists/rankings for you.

All rankings/reviews are subjective.  No one will argue with that.  And they all are based on some preconceived criteria (good or bad) which is also subjective.  And the only way to come up with ANY kind of list is to give some kind of numerical value to that criteria.  You are doing the same with your three lists. If a course is worth playing it gets a value, if it is not it gets a different value and if you play it and then decide it should not have been played it gets yet another different value.  You could call them each A, B and NR if you want but the reality is they have each been valued in your own way. 

Golf Digest used to rank their Top 100 courses alphabetically early on.  That was kind of like Peter's scale, saying these are all the top courses worth playing.  Then they moved to dividing them up even further, from which is #1 to which is #100.  Their list also used to be the "Toughest" courses, but thankfully they got away from that criterion. 

No one's criteria are perfect and never will be.  They are all just lists that have been organized based on numerical weighted values 😊
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 13, 2017, 08:00:27 AM
Sean,

you may not produce a ranking list explicitly, but if we take all your reviews and sort them by your numbers, then there is your ranking. So you are doing all the legwork. If you were to just write textual and pictorial reviews and not assign any number, then I would buy your argument, because you would explicitly make it very hard for the reader to compare courses.

Numbers are for relative comparisons, but by themselves they are not mathematics. They are just stand-ins for labels like "great", "good", "mediocre", "bad". If you wonder in your head whether you should spend your vacation in area A with one great course and two mediocre ones or rather in area B with three good courses, then the mathematics start. But you (the reader of the ranking list) are doing the mathematics and you are, for example, subconsciously assigning a 7 to the great course, 5s to the good ones and 3s to the mediocre ones. Then you add it all up and arrive at a conclusion - that's your mathematics, whether you do them algebraically or symbolically.
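(To finish that arithmetic with the made-up scores above: area A works out to 7 + 3 + 3 = 13 and area B to 5 + 5 + 5 = 15, so under an equal weighting the three good courses win - whereas a golfer who privately weights one great course as double would get 17 against 15 and reach the opposite conclusion.)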

And perhaps we can agree that doing these mathematics is not the business of the rater, because he cannot know the individual's weightings.

cheers,

Ulrich
Title: Re: How we evaluate courses
Post by: George Pazin on December 13, 2017, 01:13:13 PM
Ulrich -


In reading your posts and Sean's, my own interpretation of what Sean says is that his list is HIS list. By relying on masses of subjective numbers, others are trying to pass off their own list as A list, as opposed to THEIR list.


But maybe that's just me...


I think few, if any, of us are as objective as we believe. I know I certainly am not the least bit objective. My criteria are my own, no one else's, and few would likely agree with them.
Title: Re: How we evaluate courses
Post by: Kalen Braley on December 13, 2017, 02:00:49 PM
I think what Sean is trying to say, which I agree with, is...


When you make a purely subjective list that you somehow rank 1 to 100, others may come along and infer it's actually objective, as well as make god knows what other judgements of the courses based purely on their relative numerical ranking.  This is the bs part, especially when talking about the top 10 rated courses in the world.


No one could ever prove objectively why CPC should be ranked higher than Pine Valley or vice versa....
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 13, 2017, 02:13:00 PM
I got us sidetracked by suggesting that Sean preferenced uniqueness and exceptional quality. I don’t know why I noted the second value; I think my pen ran away with itself (and I confused my appreciation for the choices with his rationale for making them). But I noted the first so as to say that Sean’s were recommendations based on a personal and subjective preference, not evaluations of specific qualities/features.

More generally, I assume that what Sean is saying (to Mark and Ulrich) is that he doesn’t evaluate — numerically or otherwise — the constituent parts (eg green sites, shot values) of a golf course, but instead tries to focus on the whole, ie the whole experience.

And if that whole experience is enjoyable for him, he then shares his belief that it is “worth the expense of an overnight stay”...the short-form code for which is a 1*.
Title: Re: How we evaluate courses
Post by: MikeJones on December 13, 2017, 03:30:58 PM
 

We are almost unanimous in naming the greats, Cypress, Pine Valley, The Old Course…we have some outliers, but for the most part we generally agree on the greatest in the world.



In a phrase: 'herd mentality'.


I'm sure there are hundreds of great courses that do not register on people's radar here because they are by lesser-known architects or in areas of less natural beauty. It's easier to appreciate and comment on well-known courses because there is more information on them and people are much more likely to have played them.

Title: Re: How we evaluate courses
Post by: Sean_A on December 13, 2017, 03:55:00 PM
George, Kalen and Pietro..for sure...I don't even try to pass off my recommendations as objective...which is what ranking numbers, weighted categories etc are all about...trying to inject an element of validity where none exists.  Validity comes from people trusting another person's judgement most of the time...that is as good as it will ever get in this game...and most have people they trust in this way when it comes to courses, wine, books, music and on and on.  Granted, this trust is often on a more personal level, but nothing is perfect. I am merely attempting to provide a different perspective to the run-of-the-mill rankings which reproduce at least 85% of the same courses over and over. 

Ciao
Title: Re: How we evaluate courses
Post by: Mark_Fine on December 13, 2017, 04:08:55 PM
But Sean, you said there are courses you won't even bother to go see?  I have played a lot of golf courses and I can't think of one that I didn't learn at least something from playing.  Maybe it was just one hole that was worth seeing or one design idea or maybe it was what not to do  ;)


I can't imagine you would swing by Mullen Nebraska and not be happy that someone told you to make sure you see Sand Hills when you are out that way.  Good thing it made it on at least someone's list  ;D
Title: Re: How we evaluate courses
Post by: Sean_A on December 13, 2017, 04:14:11 PM
But Sean, you said there are courses you won't even bother to go see?  I have played a lot of golf courses and I can't think of one that I didn't learn at least something from playing.  Maybe it was just one hole that was worth seeing or one design idea or maybe it was what not to do  ;)


I can't imagine you would swing by Mullen Nebraska and not be happy that someone told you to make sure you see Sand Hills when you are out that way.  Good thing it made it on at least someone's list  ;D

Mark

My recommendations are not about learning something.  They are more about what the course can provide as an experience, a day out...enjoyment...and if that experience is worth the effort and expense. 

Ciao
Title: Re: How we evaluate courses
Post by: Kalen Braley on December 13, 2017, 04:21:29 PM
I think Tom D once said years ago that a "better" ranking system would be something like


In Alphabetic order:
The top 10,
The next 15
The next 75...


That way people could still get an idea of the best of the best without quibbling over 3rd vs 6th.


P.S.  As far as I'm concerned, I think the vast majority of folks would be thrilled to death with playing any course in the top 100, instead of "it was only #80"
Title: Re: How we evaluate courses
Post by: Mark_Fine on December 13, 2017, 05:18:22 PM
Kalen,
And what you said is for the most part how most Top 100 lists should be looked at.  Anything in the Top 100 or Top 200 is probably worth at least one round (even by Sean's criteria)  ;D   Yet many on this site will argue FOREVER that they can't believe a course that is 52nd is ranked higher than a course which is ranked 73rd  ???    If it is one person's list, then it is very understandable, but when it is a collection of opinions, it is not worth arguing about (though it can be fun), which is what makes the lists a lightning rod for debate! 
Mark
Title: Re: How we evaluate courses
Post by: Peter Pallotta on December 13, 2017, 05:48:12 PM
On the other hand, those who’ve played 100s of courses and work in the industry could knock off the false humility and tell the rest of us what they really think!  :)
And I mean what they *really* think, in their heart of hearts, and using not the pedestrian and utilitarian standards of the day, but the timeless and transcendent ones!
Sure, sure, I know — it’s all subjective (wink wink) and courses serve various purposes and types of golfers (ahem) and we have no idea (no, none at all) what environmental or budgetary restrictions the architect was working under.
Yes, yes — we all have valid tastes and opinions...until you experts get behind closed doors and start telling it like it is  :P
Title: Re: How we evaluate courses
Post by: Ulrich Mayring on December 13, 2017, 07:09:02 PM
I don't think anyone is trying to pass off his rankings as objective. I do however believe that some people perceive some rankings as objective because of a "star factor". And there may be some "stars", who are not always enthusiastically objecting to that :)

Subjective rankings are the only useful rankings, but they are even more useful, when the rater puts his biases out in the open. I'm all for strong opinions, but without full disclosure they tend to be self-serving.

Ulrich Mayring