Whose Metrics? 

October 12, 2009 

SAN FRANCISCO -- Accountability was much discussed here last week at the
annual meeting of the Association of Community College Trustees, and
everyone agreed - in theory - that institutions should have to report on
measures that demonstrate their quality or lack thereof. There were
presentations on, and general support expressed for, the effort started by
the Bill & Melinda Gates Foundation and the Lumina Foundation for Education
-- announced just before the meeting -- to develop a voluntary national
accountability system for community colleges.

But what should the measures be? Two educators from Oregon started a
discussion of their state's accountability by asking the trustees in the
audience how they could tell if their institutions were successful. The
answers were varied and full of nuance, with trustees talking about the
different groups their institutions serve and the range of goals for
students and communities. Then the Oregonians asked the trustees if they
thought their legislators understood those complicated measures. The crowd
laughed and some suggested that maybe 10 percent did, if that.

The aim of the question wasn't to dispute the need of state policy makers to
have measures they understand and trust. Rather, Oregon's community college
leaders have decided they need measures beyond what the state is asking for
if they are to meet the state goals.

Camille Preus, commissioner of the Oregon Department of Community Colleges
and Workforce Development, discussed intense efforts over the last three
years in her state to create the right measures.

The larger context in Oregon -- as in the United States -- is a goal of
increasing college attainment. President Obama has set a goal of all
Americans obtaining at least one year of postsecondary education. In Oregon,
legislators, the governor, and the state boards that oversee education have
agreed on a goal, by 2025, of having 40 percent of adults hold a bachelor's
degree, 40 percent an associate degree or a certificate recognized by
employers, and the remaining 20 percent at least a high school diploma.

To get there, the Legislature in 2007 -- a year in which community colleges
were given a healthy increase in state support -- also agreed on a set of
"key performance measures" for the colleges, requiring reporting by each of
the 17 community colleges on those measures, and giving them year-by-year goals.

This "disaggregation of data," Preus said, strengthened the power of the
data but also raised problems. It became impossible for colleges to hide any
weaknesses. But she also said that there was potential for unfairness as
legislators could look at the 17 colleges and want to know "why college X
isn't doing what college Y is doing."

She gave as an example the legislatively set goal for the percentage of
students at each Oregon community college who transfer the following year
to an institution in the Oregon university system (this year's goal is 15.2
percent). Meeting such a goal is much easier, she noted, for the community
colleges that are in the same community as four-year institutions than it is
for her most remote institution, which is 250 miles from a university.

Among the other measures adopted by the lawmakers as key indicators:

*	Percentage of students enrolled in either remedial or ESL programs
who complete them. (The goal for 2010 is 63.7 percent, up from 47 percent.)
*	The percentage of students in nursing who complete the program.
(Next year's goal is 73.7 percent.)
*	The number of professional/technical degrees awarded each year.
(Next year's target is 5,101.)
*	The percentage of students in associate degree programs who earn
associate degrees. (The target for next year is 31.6 percent.)

The measures are almost all "outcome measures," in the parlance of
accountability. Preus said that there was nothing wrong with that, and that
these measures were important. "But these were the legislators' measures,
not ours," she said.

What the measures prompted was an in-depth study (with outside consulting
help) on what was actually going on at the community colleges, and the
results were in some ways disturbing, she said. Some of what was discovered
wouldn't have shown up in the state-required reporting, but raised real
questions about the ability of the state's education system to meet the
goals it set for itself.

For example, she said that the study drew attention to the relatively high
educational attainment of those who move to Oregon, higher in fact than the
state's averages -- a good sign about the state's ability to attract talent,
but also a sign that it wasn't reaching desired levels of educational
attainment for those who are growing up there.

"We found lots of areas where we are not as good as we thought we were," she
said.

A series of meetings involving the community colleges led to a sense that to
meet the broader goals set by the state, new measures were also needed. Many
of these are "process point" measures, which report on students' progress
prior to their end goal, rather than outcome measures. And yet Preus said these
measures are the ones that identify the ways individual colleges need to
change what they are doing, in a way that the state-set goals do not.

So the community colleges have now agreed to collect (and discuss
internally) the following "student success indicators":

*	High school students enrolling directly into college.
*	The percentages of students enrolling at college and non-college
levels of work in math, reading, and writing.
*	The credits earned each year toward an associate of arts degree.
*	The credits earned each year toward a career or technical degree.
*	Semester-to-semester persistence rates.
*	Fall-to-fall persistence rates.
*	The percentage of GED students who advance to the next level of
their programs.
*	GED fall-to-fall persistence.
*	The percentage of English as a Second Language students who advance
to college-level work.

Laura Massey, director of institutional effectiveness at Portland Community
College, said that much of this data already existed at each college (as it
did for the state's measures), but was not necessarily being shared in
consistent ways.

Preus said that now that the colleges have agreed on these indicators, they
are going to collect and compare the data. Eventually, she said, goals will
likely be agreed upon for each of those measures, but this may be a process
led by the colleges, not the legislators. Asked if the goals might vary from
college to college, Massey (the institutional representative of the two)
nodded enthusiastically and Preus agreed that would likely be the case.

Another stage of the process, she said, was talking to legislators about the
new measures -- and not doing so in any way that suggests a lack of
commitment to the state-set measures.

In meetings with legislators, Preus said that she stresses that the colleges
are committed to the state agenda, and are in fact making progress as well
as reporting the required information every semester. "I think that gets the
colleges some credibility," she said. It's also important to remember, she
added, that the legislative priorities deserve respect.

But she is also talking about the additional measures, and why the colleges
are collecting that information. While the colleges could have tried --
without their own measures -- to have noted what was lacking in the state
measures, Preus doubts that would have worked. "You are going to max out in
your 30-second elevator talk before you explain," she said.

- Scott Jaschik

© Copyright 2009 Inside Higher Ed

Related Stories

*	Remediation Worries and Successes, October 9, 2009
*	Half-Learned, October 9, 2009
*	Crowding Out For-Profit Colleges, October 8, 2009
*	Served, Yes, But Well-Served?, October 8, 2009
*	Community College Accountability, October 7, 2009




Community Colleges Praised for Technology Efforts

The e.Republic Center for Digital Education and Converge Magazine last week
named the most technology-savvy community colleges in the country, based on
a recent survey of community college officials. Montgomery County Community
College, in Pennsylvania, received the highest marks among colleges with
more than 7,500 students, a category whose top finishers also included two
colleges from Virginia and two from Maryland. Laramie County Community
College in Wyoming took the top spot among mid-sized institutions (3,000 to
7,500 students), while Panola College in Texas won the small-college
category. The survey asked respondents to assess the community colleges
based on several metrics, including their use of distance education,
available technology training, and the extent to which they had integrated
Web 2.0 tools. The full rankings are available on the center's Web site.





