Subject: Common Core
From: Norman Stahl <[log in to unmask]>
Reply-To: Open Forum for Learning Assistance Professionals <[log in to unmask]>
Date: Fri, 6 Apr 2012 15:43:12 -0400
Content-Type: text/plain
Parts/Attachments: text/plain (63 lines)

Please pay attention to anything that comes your way about the Common Core...

ED WEEK
Published Online: April 5, 2012

Common-Core-Test Group Gives Higher Ed. Voting Rights
By Catherine Gewertz
Alexandria, Va.

A group of states that is designing tests for the common academic standards has taken a key step to ensure that the assessments reflect students’ readiness for college-level work: It gave top higher education officials from its leading states voting power on test-design questions that are closest to the heart of the college-readiness question.
At its quarterly meeting on April 3, the governing board of the Partnership for Assessment of Readiness for College and Careers, or PARCC, voted unanimously to give members of its advisory committee on college readiness voting power on four issues: how to describe the expected performance levels on the tests, who will set the cutoff scores for the tests, what evidence will be used to decide the cutoff scores, and, crucially, what the cutoff scores will be.
The move puts the highest-ranking officials from one college or university system in most of PARCC’s 24 member states at the voting table, alongside its governing board—the K-12 schools chiefs from each member state—when it comes to the most pivotal questions about crafting tests that reflect college readiness.
Richard M. Freeland, the commissioner of higher education in Massachusetts and co-chairman of PARCC’s college-readiness advisory committee, told the governing board that getting an active voice in the test-shaping process was something “we enthusiastically endorse and are happy to put our energy behind.”
The consortium is “taking a huge step in operationalizing” a definition of college readiness that reflects higher education’s expectations, Mitchell D. Chester, the commissioner of K-12 education in Massachusetts and the chairman of PARCC’s governing board, told the meeting participants.
Support Pivotal
PARCC’s decision illustrates the importance that states are placing on higher education’s embrace of the common-standards tests as proxies for college readiness. Colleges and universities pledged support to the idea. But their willingness to actually use the final tests as proxies for readiness—to let students skip remedial work and go right into entry-level, credit-bearing courses—is considered pivotal to the success of the common-standards initiative, which rests on the idea that mastery of those expectations will prepare students for college study.
“This verges on being historic,” said David T. Conley, an Oregon researcher widely known for his work to define college readiness. “In the U.S., on this scope and scale, it’s unprecedented to have this level of partnership between postsecondary systems and high school on a measurement of readiness.”
PARCC and another group of states, the SMARTER Balanced Assessment Consortium, have $360 million in federal Race to the Top money to design assessment systems for the Common Core State Standards. The standards, which cover English/language arts and mathematics, have been adopted by 46 states and the District of Columbia.
When the U.S. Department of Education offered test-design funding to groups of states, in April 2010, it asked for assessment systems that can serve many purposes. Those include measuring student achievement as well as student growth, judging teacher and school performance, offering formative feedback to help teachers guide instruction, and providing gauges of whether students are ready—or are on track to be ready—to make smooth transitions into college and good jobs.
Leaders of both consortia recognize that much is riding on the support of higher education, since the common-standards initiative rests on the claim that mastery of the standards—and passage of tests that embody them—indicate readiness for credit-bearing entry-level coursework. If colleges decline to use the tests to let students skip remedial work, that could undermine the claim that the tests reflect readiness for credit-bearing study.
That thinking was woven through the Education Department’s initial invitation to the states to band together to design the tests. To win grants in that competition, the consortia had to show that they had enlisted substantial support from their public college and university systems. Both did so.
The Challenge of Consensus
Whether those higher education systems maintain their support for the final tests remains to be seen, however. Skeptics have noted that getting states’ K-12 systems and their diverse array of college and university systems to agree on cutoff scores that connote proficiency in college-level skills, for instance, will be challenging.
“This cut-score thing is going to be a nightmare,” Chester E. Finn Jr., the president of the Thomas B. Fordham Institute, a Washington think tank, said at an August 2010 meeting of the National Assessment Governing Board, which sets policy for the National Assessment of Educational Progress, or NAEP. “I’m trying to envision Georgia and Connecticut trying to agree on a cut score for proficiency, and I’m envisioning an argument.”
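
To make the cut-score problem concrete, here is a minimal sketch of how cut scores turn a scale score into a performance level, and why two states with different cut scores can classify the same student differently. The numeric cut scores, state labels, and function names below are invented for illustration; they are not anything PARCC has adopted.

# Illustrative only: these cut scores are invented, not PARCC's.
# A "cut score" is the minimum scale score counted as reaching a level.
from bisect import bisect_right

def performance_level(score, cuts, labels):
    # bisect_right counts how many cut scores the student reached,
    # which indexes directly into the ordered level labels.
    return labels[bisect_right(cuts, score)]

LEVELS = ["Below Basic", "Basic", "Proficient", "Advanced"]

cuts_state_a = [300, 350, 400]  # hypothetical, stricter "Proficient" bar
cuts_state_b = [290, 330, 385]  # hypothetical, more lenient bar

score = 340
print(performance_level(score, cuts_state_a, LEVELS))  # -> Basic
print(performance_level(score, cuts_state_b, LEVELS))  # -> Proficient

The same score of 340 comes back "Basic" under one state's hypothetical cut scores and "Proficient" under the other's; a single common cut score would force exactly the kind of argument described above.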
PARCC’s college-readiness committee will not only vote on test-design issues; it already plays an active role in the consortium’s strategy to engage higher education colleagues in dialogue about the assessment and to enlist their support, PARCC officials said. The consortium’s higher education leadership team, which includes additional college and university leaders, also plays a leading role in that dialogue and engagement.
The SMARTER Balanced Assessment Consortium’s nine-member executive committee includes two higher education representatives with full voting power: Charles Lenth, the vice president for policy analysis and academic affairs for the State Higher Education Executive Officers, a Boulder, Colo.-based group, and Beverly L. Young, the assistant vice chancellor of academic affairs for the California State University system.
In addition, the consortium has appointed higher education representatives from each member state to provide input into test development and coordinate outreach to colleges and universities in their states. Higher education representatives also take part in 10 “work groups” that focus on key issues, such as psychometrics, technology, and accessibility and accommodations.
Mr. Conley, who advises the SMARTER Balanced group, said it is important to have higher education representatives at the table during test design to create a shared concept of the skills necessary to college success and how to measure those on a test. But he cautioned that those ideas must also have the support of college faculty members—not just their leadership—if the idea of shared standards is to succeed.
The consortium’s governance structure “is designed to ensure input from higher education through representation on the executive committee, collaboration with higher education state leads, and participation in state-led work groups,” said consortium spokesman Eddie Arnold.
Bumpy Road Ahead
Discussion at the PARCC governing board meeting offered hints about the difficulty of getting consensus on critical issues of test design.
Soliciting feedback from board members, Mary Ann Snider, Rhode Island’s chief of educator quality, asked how many performance levels they thought the tests should have: three, four, five, or some other number. Most states voted for four levels, largely mirroring the current practice in most PARCC states. Ms. Snider asked when indicators of being “on track” for college readiness should first appear on test results: in elementary, middle, or high school. Most members voted for elementary school.
She also asked whether the tests should show only how well students have mastered material from their current grade levels, or how well they’ve mastered content from the previous grade level, too. Responses came back deeply divided.
That question attempted to explore an important part of the dialogue about the new assessments: how to design them so they show parents, teachers, and others how students are progressing over time, rather than provide only a snapshot of a given moment. But the prospect of having a given grade’s tests reflect students’ mastery of earlier grades’ content raised some doubts on the board.
“If I’m a 5th grade teacher, am I now responsible for 4th grade content in my evaluation?” asked James Palmer, an interim division administrator in student assessment at the Illinois state board of education.
Some board members noted that indicators of mastery of the previous year’s content would be helpful in adjusting instruction. But others expressed doubt about whether a summative test was the best way to do that. Perhaps, they said, that function is better handled by other portions of the planned assessment system, such as its optional midyear assessments.
Gayle Potter, the director of student assessment in Arkansas, said it’s important to give parents and teachers information about where students are in their learning. But she also said she worried about “giving teachers mixed signals” about their responsibility for lower grades’ content.
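
As a purely hypothetical illustration of the reporting question the board debated, a score report that separated current-grade from prior-grade mastery might be computed as below. The item grades, responses, and function name are invented; nothing here reflects an actual PARCC design decision.

# Hypothetical sketch: summarize percent correct separately for the
# grade level each test item targets, so a 5th grader's report shows
# current-grade (5) and prior-grade (4) mastery side by side.
def split_report(responses):
    by_grade = {}
    for item_grade, correct in responses:
        total, right = by_grade.get(item_grade, (0, 0))
        by_grade[item_grade] = (total + 1, right + int(correct))
    return {g: round(100 * r / t) for g, (t, r) in by_grade.items()}

# (grade the item tests, answered correctly?)
responses = [(5, True), (5, False), (5, True), (4, True), (4, False)]
print(split_report(responses))  # -> {5: 67, 4: 50}

Whether such split reporting would help teachers adjust instruction or simply blur accountability is the open question the board members were raising.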

Special coverage on the alignment between K-12 schools and postsecondary education is supported in part by a grant from the Lumina Foundation for Education, at www.luminafoundation.org.

Norman Stahl
[log in to unmask]


~~~~~~~~~~~~~~~
To access the LRNASST-L archives or User Guide, or to change your
subscription options (including subscribe/unsubscribe), point your web browser to
http://www.lists.ufl.edu/archives/lrnasst-l.html

To contact the LRNASST-L owner, email [log in to unmask]
