The Assessment Diaries: Beyond Satisfaction

Desalina Allen

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I almost waited to write this post until my assessment of our recent Dining for Success (DFS) etiquette dinner was complete. Almost. I wanted to write about the learning demonstrated by our pre- and post-assessments only after I was sure they actually showed that students had learned something. Then I realized that I promised to provide a realistic overview of what it's like to work with assessment day-to-day.

You know how sometimes when you start something new, more experienced people say things like “it’s not so scary” or “you’ll get the hang of it in no time”?  I may be guilty of saying these exact things when introducing assessment to others.  But I have a confession: The assessment of this event is scaring me.

Our DFS program has always been a hit with employers and students. How do we know they like it? We give out a post-event survey that basically measures satisfaction with the event (and allows students to rate their overall learning).

The truth is, how could you not like an event like this? Students get a great (oftentimes free) meal at a popular local restaurant, a chance to network, and tons of dining and interview tips. This is why moving away from a satisfaction survey is so scary – students are generally satisfied with our events and it's rewarding (and easy) to share a summary of these surveys (95% of students reported that they would recommend this event to friends!).

The problem is that, as educators, we care about more than satisfaction. We want students to walk away having learned something from our events, and learning can be challenging to measure. So, in an effort to make sure students were actually walking away with new information, we prioritized the most important topics, introduced more structured activities to teach them, and provided enhanced training for our employers and staff.

In assessment lingo, we set learning goals! Here they are:

Students will be able to…

  • Identify the proper table arrangements at a formal dinner (including placement of silverware, bread plate, water and wine glass)

  • List two guidelines regarding what to order during a mealtime interview

  • List three appropriate discussion topics for a networking event

  • List three topics to avoid discussing during a networking event

  • List appropriate ways to follow up with professionals after events

To evaluate these goals, we measured students' current level of knowledge with a pre-event survey sent out with registration confirmations: you can view it here. Then, at the end of the event, we had students fill out a nearly identical paper survey and encouraged input from employers and career services staff. We also asked them ONE satisfaction question (because, hey, satisfaction is also important).

We are still tabulating the students' responses, and it's nerve-wracking. I'm hoping I can share some really great improvements in their knowledge, but there is always a risk that this won't show up clearly in the survey results.
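
For anyone curious about the number-crunching side, here is a minimal sketch of how pre- and post-survey responses could be compared for each learning goal once the answers are keyed in. The file names and column layout are assumptions for illustration only, not our actual survey export.

```python
# Minimal sketch (not our actual tabulation): compare pre- vs. post-survey
# results per question. File names and column names are hypothetical.
import csv
from collections import defaultdict

def percent_correct(path):
    """Return {question: percent of respondents answering correctly}.

    Assumes a CSV with one row per student response, a 'question' column,
    and a 'correct' column containing 'yes' or 'no'.
    """
    totals = defaultdict(int)
    correct = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["question"]] += 1
            if row["correct"].strip().lower() == "yes":
                correct[row["question"]] += 1
    return {q: 100 * correct[q] / totals[q] for q in totals}

pre = percent_correct("dfs_pre_survey.csv")    # hypothetical export of the registration survey
post = percent_correct("dfs_post_survey.csv")  # hypothetical export of the paper survey

for question in sorted(pre):
    change = post.get(question, 0) - pre[question]
    print(f"{question}: {pre[question]:.0f}% -> {post.get(question, 0):.0f}% ({change:+.0f} points)")
```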

Since this is the first time we've assessed this event with pre- and post-surveys, I'm sure there will be changes we need to make to the process. I'll be sharing the results and what we learned in a follow-up post, but I would love for readers to share their experiences setting and evaluating learning goals. Has it worked for you? Have you evaluated programs this way? Any tips for pre- and post-surveys? What were the results? Any feedback on the learning goals or survey?