Thank you everyone for your suggestions!  Your ideas have been very
helpful and I look forward to more conversations on program assessment.

Thanks,
angela

-----Original Message-----
From: Open Forum for Learning Assistance Professionals
[mailto:[log in to unmask]] On Behalf Of Diana Calhoun Bell
Sent: Wednesday, September 24, 2008 12:26 PM
To: [log in to unmask]
Subject: Re: Assessment of the Learning Center impact on an institution

Angela,
In addition to what Sal wrote, some at my institution thought that the
good students would self-select into our programs, so to test this idea
we gather the ACT scores and high school GPAs of our participants and
compare them to those of the general student population. What we found
is that students who participate in our programs have a lower average
ACT score than the general population and virtually the same GPA.
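If you want to run the same comparison on your own data, a minimal
sketch in Python follows; the file and column names are only
placeholders for however your institutional extract is laid out:

    import pandas as pd
    from scipy import stats

    # Hypothetical extract: one row per student, with a 0/1 flag for
    # program participation. Column names are illustrative.
    df = pd.read_csv("students.csv")   # act_score, hs_gpa, participant
    users = df[df["participant"] == 1]
    nonusers = df[df["participant"] == 0]

    for col in ["act_score", "hs_gpa"]:
        # Welch's t-test (unequal variances), participants vs. everyone else
        t, p = stats.ttest_ind(users[col].dropna(),
                               nonusers[col].dropna(), equal_var=False)
        print(f"{col}: participants {users[col].mean():.2f} vs. "
              f"population {df[col].mean():.2f} (p = {p:.3f})")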
Hope this helps,
diana bell

> Angela,
>
> My background is in program evaluation, so I do have a few suggestions
> for you. First, document the goal and objectives of your program. I'll
> use our SI Program as an example: the goal of SI is to help students
> become independent learners. Our objectives (how we measure our
> efforts) are as follows: 1) improve student performance within
> targeted courses--more As, Bs, and Cs; 2) reduce the rate of attrition
> within targeted courses--fewer Ds, Fs, and withdrawals; and 3)
> ultimately increase retention and graduation rates at our University.
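> A quick sketch of how objectives 1 and 2 can be tabulated, assuming a
> simple grade roster (file and column names here are illustrative, not
> from our actual system):
>
>     import pandas as pd
>
>     # Hypothetical roster: one row per student per targeted course.
>     grades = pd.read_csv("grades.csv")  # course, si_participant, grade
>
>     grades["abc"] = grades["grade"].isin(["A", "B", "C"])
>     grades["dfw"] = grades["grade"].isin(["D", "F", "W"])
>
>     # ABC and DFW rates for SI participants vs. non-participants,
>     # broken out by course (means of booleans are rates).
>     print(grades.groupby(["course", "si_participant"])[["abc", "dfw"]].mean())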
>
> Second, develop a documentation plan that states what you are
> evaluating and why. The why is critical because it outlines your
> expected evaluation results/outcomes and how you intend to use them.
>
> Again, using our SI Program as an example, I wanted to present
> empirical data and provide assurance of the impact and value of
> Supplemental Instruction, so I conducted a summative evaluation of our
> SI Program's pilot year. Conducted immediately following the
> implementation of a program, summative evaluations analyze expected
> program outcomes in an effort to "sum up" program results. In essence,
> a summative evaluation answers the question: did the program do what
> it set out to do?
>
> While my summative evaluation focused solely on one service (SI), it
> could be expanded to include others as well. The key is to use a host
> of program variables. A summative evaluation should encompass a
> variety of tools and measurement techniques to compare data (e.g.,
> reactions and accomplishments) of participant and non-participant
> groups. It sounds like you already have some good data. The next step
> may be to develop additional instruments (e.g., surveys or
> questionnaires) to collect evidence from other areas of the program.
> Once you have collected and recorded all your data, the fun
> part--analysis and interpretation--can begin.
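> For the retention objective, one analysis step at that stage is a
> logistic regression of retention on participation, controlling for
> incoming preparation so self-selection does not drive the result. A
> sketch, again with made-up file and column names:
>
>     import pandas as pd
>     import statsmodels.formula.api as smf
>
>     # Hypothetical merged file: retained (0/1), si_participant (0/1),
>     # and an incoming-preparation control such as ACT score.
>     data = pd.read_csv("retention.csv")
>
>     # Does SI participation predict retention once ACT is controlled?
>     model = smf.logit("retained ~ si_participant + act_score", data=data).fit()
>     print(model.summary())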
>
> The task of evaluating a program is not easy, and it takes an
> incredible amount of time and effort. Having said that, the rewards
> are well worth it. There are tons of reference materials on program
> evaluation (what it is and how to do it). With some more time, I could
> come up with some authors and books. I'm sure your college has some of
> these references in the library stacks.
>
> sal
>
>
>
> -----Original Message-----
> From: Open Forum for Learning Assistance Professionals
> [mailto:[log in to unmask]] On Behalf Of Angela Nadeau
> Sent: Wednesday, September 17, 2008 2:59 PM
> To: [log in to unmask]
> Subject: Assessment of the Learning Center impact on an institution
>
> Hello All,
>
> I am the coordinator of a small Learning Center which provides
> tutoring, facilitated study groups, and academic coaching.  The center
> just opened a year ago, and I would like to assess the impact that the
> Learning Center has had on the institution.  We have a lot of data in
> terms of usage statistics; however, the center was designed to improve
> retention and support the "soft skills" of academe.  We have also
> looked at pass rates for students who used the tutoring services;
> however, that hasn't shown a correlation.
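> (A correlation check of that sort can be run as below -- a sketch with
> made-up column names, pairing each student's visit count with a 0/1
> pass flag:)
>
>     import pandas as pd
>     from scipy import stats
>
>     # Hypothetical per-student file: tutoring visit count, passed (0/1).
>     usage = pd.read_csv("usage.csv")
>
>     # Point-biserial correlation between visits and passing the course.
>     r, p = stats.pointbiserialr(usage["passed"], usage["visits"])
>     print(f"r = {r:.3f}, p = {p:.3f}")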
>
> Has anyone designed assessments based on the institutional impact of
> academic support programs?  For instance, how do you measure and/or
> correlate academic support to retention and persistence?  Is this sort
> of assessment even possible?
>
> Any thoughts and suggestions would be greatly appreciated!
>
> Thank you,
> Angela
>
>  ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>
> Angela Nadeau, M.A., LCPC
> Learning Center Coordinator
> York County Community College
> 112 College Drive
> Wells, Maine 04090
> (207) 646-9282 ext. 274
> email: [log in to unmask]
> website: http://www.yccc.edu/learningCenter/
>
>
>


Dr. Diana C. Bell
Academic Resource Center Director
136 Madison Hall
University of Alabama in Huntsville
Huntsville, AL  35899
256.824.3142

"We cannot learn something without eating it, yet we cannot really learn
it either without letting it eat us."
                              (Peter Elbow)


~~~~~~~~~~~~~~~
To access the LRNASST-L archives or User Guide, or to change your
subscription options (including subscribe/unsubscribe), point your web browser to
http://www.lists.ufl.edu/archives/lrnasst-l.html

To contact the LRNASST-L owner, email [log in to unmask]