Hi,

Assessment in this field is always a difficult topic because so many factors tie into students’ academic success.

Data that I track and collect includes the following (a sketch of how these rollups might be computed appears after the list):
- Total number of student visits, with breakdowns by class, gender, major, college within the school, and subject for which students seek tutoring
- Number of visits per student, overall and by class, in a given semester
- Number of visits per course for high D, F, and W classes, since this data is the springboard for launching focused tutoring, SI, and similar support
- Final grades students earned in the courses for which they sought tutoring, including the breakdown of number of visits for that course (If you have the ability, it’s also interesting to look at correlations between final grade outcomes and when the student committed to regular tutoring visits. Obviously we can make an argument for better outcomes if they start coming for tutoring early rather than waiting until final exams have almost begun.)
- Retention: are students who regularly use tutoring services still enrolled a year later, two years later, and so forth? What percentage of them are still enrolled at a given point? What percentage of them graduate from your institution?
- Class visits you conduct for promotional purposes: what percentage of students in the class try out your services for the first time after that visit?
- We would look closely at spikes in utilization after different promotional programs as well. If we run a program out on the quad and then see a 15% jump in utilization in the next couple weeks, we presume there’s a connection and that this is an event we want to repeat.
- Look at the same tracking data in relation to attendance, frequency, and final course grade for focused programs such as SI and embedded tutoring.
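
If your software only gives you raw exports rather than canned reports, these rollups are straightforward to compute yourself. Here is the minimal sketch in Python/pandas promised above; the file and column names (visit_log.csv, class_year, grade_points, and so on) are hypothetical placeholders, not any particular vendor’s schema.

import pandas as pd

# Hypothetical export of the visit log from your scheduling software.
visits = pd.read_csv("visit_log.csv", parse_dates=["visit_date"])

# Total visits with breakdowns by class year, gender, major, college, and subject.
for dim in ["class_year", "gender", "major", "college", "subject"]:
    print(visits.groupby(dim).size().sort_values(ascending=False))

# Visits per student and per course in a given semester.
per_student = visits.groupby(["semester", "student_id"]).size()
per_course = visits.groupby(["semester", "course"]).size()

# Correlation between visit count and final grade in that course,
# assuming grades on a 0-4 scale are merged in from a second export.
grades = pd.read_csv("final_grades.csv")  # student_id, course, grade_points
merged = (visits.groupby(["student_id", "course"]).size()
                .rename("n_visits").reset_index()
                .merge(grades, on=["student_id", "course"]))
print(merged["n_visits"].corr(merged["grade_points"]))

# Timing of commitment: week of first visit vs. final grade.
first_week = (visits.assign(week=visits["visit_date"].dt.isocalendar().week)
                    .groupby(["student_id", "course"])["week"].min()
                    .astype("int64").rename("first_visit_week").reset_index())
timing = first_week.merge(grades, on=["student_id", "course"])
print(timing["first_visit_week"].corr(timing["grade_points"]))

A negative correlation in that last number would support the argument that starting tutoring early, rather than just before finals, goes with better outcomes (an association, of course, not proof of causation).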

I also track data on the individual tutors and divisions I supervise. That information is primarily for my own use in discussing potential areas of improvement with individual tutors.

Additionally, we collect feedback, ideally following individual sessions and certainly at the end of the term, from students who used our services. Areas for which we want to measure satisfaction include whether:
- the student feels the session was beneficial,
- the student believes the tutor listened carefully and respectfully,
- the tutor provided thoughtful guidance,
- the student left the session with a better understanding of the content,
- the student is likely to recommend the tutor, and
- the student is likely to recommend tutoring via that department/program,
along with open-ended questions about the best part of the session and where there was room for improvement. I try to collect both quantitative and qualitative data because we are periodically asked to hand it over to justify maintaining or increasing funding for the department.
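
As a rough illustration of how that feedback might be summarized for a funding report, here is a short sketch along the same lines as the one above; the Likert items (scored 1-5) and open-ended fields use hypothetical column names.

import pandas as pd

surveys = pd.read_csv("session_surveys.csv")
likert_items = ["beneficial", "listened_respectfully", "thoughtful_guidance",
                "better_understanding", "recommend_tutor", "recommend_program"]

# Quantitative: mean score and response count per item, overall and per tutor.
print(surveys[likert_items].agg(["mean", "count"]).round(2))
print(surveys.groupby("tutor")[likert_items].mean().round(2))

# Qualitative: collect the open-ended responses for periodic reporting.
comments = surveys[["best_part", "room_for_improvement"]].dropna(how="all")
comments.to_csv("qualitative_feedback.csv", index=False)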

I hope this information is helpful.

P.S. Most of the scheduling and reporting software on the market can run reports for the data points I noted above.

Sincerely,
Debbie Malewicki, President
Integrity 1st Learning Support Solutions, LLC
446A Blake Street – Suite 101
New Haven, CT 06515
(203) 800-4100
[log in to unmask]
www.Integrity1stLSS.com
Facebook: @Integrity1stLSS

From: Open Forum for Learning Assistance Professionals <[log in to unmask]> on behalf of Rafalson, Lisa <[log in to unmask]>
Sent: Tuesday, July 7, 2020 12:58:19 PM
To: [log in to unmask] <[log in to unmask]>
Subject: Re: Assessment and Evaluation
 

James,
I appreciate the information you provided and will review the resource.
I have thought about a similar analysis comparing, by course, the GPA of students who were tutored vs. those who were not. The inability to establish causality is an obvious problem, as are the multiple confounding variables.
Thanks so much,
Lisa


Lisa Rafalson, PhD

Assistant Vice President for Academic Affairs

Associate Professor, Health Services Administration

American Council on Education Fellow 2018-19

D’Youville College

Office: 716-829-8489

Fax: 716-829-7780

she/hers/her


From: Open Forum for Learning Assistance Professionals <[log in to unmask]> on behalf of Jamie Bondar <[log in to unmask]>
Sent: Tuesday, July 7, 2020 11:16 AM
To: [log in to unmask] <[log in to unmask]>
Subject: Re: Assessment and Evaluation
 

Hi Ruth,

 

There is a growing body of literature on this subject. I would use Norton & Agee’s “Assessment of Learning Assistance Programs: Supporting Professionals in the Field” as a starting point.

 

There really is no easy answer to the assessment question. It’s very difficult to prove a causal relationship between tutoring and retention or graduation, even though that relationship is quite intuitive. If a student utilizes tutoring and then passes the course for which they received tutoring, can we say for sure that the tutoring is what caused the student to pass? A host of other factors make such a claim difficult: the student could have attended office hours, changed their study habits, gotten help from a friend, or come from a privileged demographic group with economic advantages such as parents who attended college or more time to study because they don’t have to work.

 

Regarding quantitative data, there are many gifted statisticians in the learning center community who have tried/are trying to solve this problem by coming up with very elaborate methods to control for the types of external factors that I mention above, but I don’t think there’s any one silver bullet.

 

I’m currently working with my university’s office of institutional research and assessment on a model in which we’ll compare outcomes of students who utilize tutoring vs. those who don’t, by GPA cohort. For example, among students with GPAs between 3.0 and 3.5, how did tutored students do in a given class vs. untutored students? The idea is that this type of segmenting minimizes the effect of some of the external factors I mention above by comparing students with their GPA peers, but a host of problems still remain with this model. For example, how many times does a student need to be tutored in order to see a tutoring effect?
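
To make the banding concrete, here is a rough sketch of that kind of comparison in Python/pandas. Everything in it (file name, columns, GPA cut points, visit-count "dose" bands) is a hypothetical illustration rather than the model we are actually building.

import pandas as pd

df = pd.read_csv("course_outcomes.csv")  # student_id, prior_gpa, tutored, grade_points

# Segment students into prior-GPA bands so tutored and untutored
# students are compared against their GPA peers.
df["gpa_band"] = pd.cut(df["prior_gpa"],
                        bins=[0, 2.0, 2.5, 3.0, 3.5, 4.0],
                        labels=["<2.0", "2.0-2.5", "2.5-3.0", "3.0-3.5", "3.5-4.0"])

# Mean final grade and headcount for tutored vs. untutored within each band.
comparison = (df.groupby(["gpa_band", "tutored"], observed=True)["grade_points"]
                .agg(["mean", "count"]).round(2).unstack("tutored"))
print(comparison)

# The open "dose" question: a crude cut by visit count, if available.
if "n_visits" in df.columns:
    df["dose"] = pd.cut(df["n_visits"], bins=[0, 1, 3, 6, 100],
                        labels=["1", "2-3", "4-6", "7+"])
    print(df.groupby(["gpa_band", "dose"], observed=True)["grade_points"]
            .mean().round(2))

The last block gestures at the dose question: even a crude cut by visit count shows how the answer shifts depending on where you draw the threshold.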

 

Qualitative data is much easier to come by (student surveys being the most obvious option), but this type of assessment is less empirical because it relies on students’ assessment of their own learning and is anecdotal at best.

 

Assessment is one of my biggest challenges as a learning center director. My background is as an English professor and writing center person, so my training is definitely not in statistics and educational assessment, but I am now a student of these subjects. I’m very interested to hear what others are doing and if you know of more helpful articles or resources.

 

Best,

 

 

Jamie

Jamie P. Bondar

Director, Tutoring & Peer-to-Peer Success Services

Center for Learning & Academic Success

Senior Lecturer, English Department

Suffolk University

Boston, MA

617.573.8571

he/him/his

Suffolk.edu/clas

@JPBondar

From: Open Forum for Learning Assistance Professionals <[log in to unmask]> On Behalf Of Fries, Ruth A
Sent: Monday, July 6, 2020 4:23 PM
To: [log in to unmask]
Subject: Assessment and Evaluation

 

Good afternoon,

 

I hope you all had a wonderful holiday weekend.

 

We are in the process of evaluating our Student Learning Outcomes and the data we collect to measure these outcomes.  Would you be willing to share your Student Learning Outcomes and how you measure them?  What data do you collect and analyze to show the impact your centers have on retention and graduation? 

 

Thank you so much for your input into these very important areas. 

 

Have a great day,

Ruth

 

 


Ruth Fries, MAED

Director, Disability Services/Academic Achievement

Adjunct Professor, School of Education

P: 651-628-3241  | F: 651-628-2065
[log in to unmask]  |   www.unwsp.edu

Responsibility | Belief | Developer | Harmony | Connectedness

 

Equipping Christ-centered learners and leaders 

to invest in others and impact the world.

 

 

~~~~~~~~~~~~~~~ To access the LRNASST-L archives or User Guide, or to change your subscription options (including subscribe/unsubscribe), point your web browser to http://www.lists.ufl.edu/archives/lrnasst-l.html To contact the LRNASST-L owner, email [log in to unmask]