At 11:54 AM 1/6/98 -0800, Roger wrote:
>What does the following quote from your recent email mean to you?:
>"Only 3 of 46 studies reviewed in 1971 showed a significant relationship
>between GPA and performance; half the studies showed no correlation."
I know that many of us who have been involved in some kind of grading
system are having to wrestle with the questions you have raised. For one
thing, your question,
>Would all the studies have used similar or consistent performance measures?
is, I think, really at the core of how those of us in academic situations
use grades. The studies done ten or so years ago showed that there _weren't_
consistent performance measures being used in grading at the time. And even
today, I am not sure how consistent college grading is with any performance
measures. Ohmer, whom I cited in my last posting, seemed well aware of the
complexity of the grade "problem":
"There is certainly no single recommendation for improving the grading
game," Ohmer wrote; "the magnitude of the grading system, the complexities
of subject matter, smugly recalcitrant faculties, and the varying aims of
colleges and universities signal the necessity for several approaches if we
hope to make even a dent in the academic armour surrounding the grade issue."
There were other indicators of problems with grading consistency. From
1965 to 1980, average GPAs rose nationwide while SAT scores fell. By 1990,
as I recall, U.S. high school students had the highest confidence in their
ability to do math of any industrialized nation, yet posted some of the
lowest scores on standardized math tests.
Although employers and graduate schools still seem to give a certain weight
to the meaning of GPAs, many (most?) faculty still have no con