The Assessment Diaries: Beyond Satisfaction Follow-Up

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

The results are in!  I recently shared assessment plans for our Dining for Success etiquette dinner.  For the first time, we moved away from a satisfaction survey to a pre-dinner and post-dinner skills assessment, and, as I shared in my previous post, I was a little nervous about the results.  Here is what we found:

Section One:  Understanding Formal Place Settings

Let’s face it.  We could all use a refresher on how not to steal your future boss’s bread plate, and our students were no different.  Before and after the event, they were asked to identify each numbered plate, cup, and piece of silverware in a photo of a formal place setting.

Then, at the beginning of the event, we had all utensils, plates, and glasses piled in the center of the table and asked each student to arrange their place setting.  We noticed a bit of uncertainty during the activity, and our employer volunteers often stepped in to help, which tells us that students were not completely clear on formal place settings.

That experience conflicts with what we found via the assessment: we didn’t see much of a difference between the pre- and post-event results.  In fact, most students correctly identified the items both times (with the #6 dinner fork and #5 salad fork being confused just a few times).  We did see a slight drop in the number of blank responses, which could be interpreted to mean that students felt more certain about formal place settings after the event.
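For readers who want to run this kind of comparison themselves, here is a minimal sketch in Python of how the blank and correct counts could be tallied.  It assumes responses were exported to CSV with one column per numbered item; the file names, column names, and two-item answer key are hypothetical stand-ins, not our actual survey export:

```python
import csv

# Hypothetical answer key for the numbered items in the place-setting photo
# (only two entries shown); column names like "item_5" are stand-ins for
# whatever your survey tool exports.
ANSWER_KEY = {"item_5": "salad fork", "item_6": "dinner fork"}

def tally(path):
    """Count blank and correct responses per item in one survey export."""
    blanks = {item: 0 for item in ANSWER_KEY}
    correct = {item: 0 for item in ANSWER_KEY}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for item, answer in ANSWER_KEY.items():
                response = (row.get(item) or "").strip().lower()
                if not response:
                    blanks[item] += 1      # left blank = unsure
                elif response == answer:
                    correct[item] += 1     # matched the answer key
    return blanks, correct

# Compare the two surveys side by side.
for label, path in [("pre", "pre_dinner.csv"), ("post", "post_dinner.csv")]:
    blanks, correct = tally(path)
    print(label, "blanks:", blanks, "correct:", correct)
```

A drop in the blank counts between the two files is exactly the “slight drop in blank responses” pattern described above.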

Section Two:  Appropriate vs. Inappropriate Table Topics

Students were asked to list three appropriate topics to discuss at mealtime interviews or networking events, as well as three topics to avoid.  During the event, we provided employer volunteers with a list of suggestions and encouraged them to supplement it based on their experience.

On the pre- and post-event surveys, students were instructed to leave questions blank if they did not know the answer. Comparing responses revealed a significant increase in the number of students who answered these questions after the event.  We also noticed that a wider variety of more detailed topics was listed in the post-event surveys.  For example, students most often listed “career,” “food,” and “hobbies” in the pre-dinner survey, while post-dinner responses included things like “the professional’s background,” “the industry,” “new projects,” and “current events.”
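Free-text answers like these are messier to compare, but a rough tally is still possible.  Below is a small, hypothetical Python sketch that counts how many students answered at all and how varied their topics were; again, the file and column names are placeholders rather than our real export, and real responses would need more normalization than a simple trim-and-lowercase:

```python
from collections import Counter
import csv

def topic_counts(path, columns=("topic_1", "topic_2", "topic_3")):
    """Tally free-text topics and count students who answered at all."""
    counts, answered = Counter(), 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Normalize lightly: trim whitespace, lowercase, drop blanks.
            topics = [(row.get(c) or "").strip().lower() for c in columns]
            topics = [t for t in topics if t]
            if topics:
                answered += 1
            counts.update(topics)
    return answered, counts

pre_answered, pre_counts = topic_counts("pre_dinner.csv")
post_answered, post_counts = topic_counts("post_dinner.csv")
print(f"Students answering: {pre_answered} pre vs. {post_answered} post")
print(f"Distinct topics: {len(pre_counts)} pre vs. {len(post_counts)} post")
print("Most common post-event topics:", post_counts.most_common(5))
```

A jump in both the answered count and the number of distinct topics is the pattern we saw here.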

Section Three: Ordering Food

While guests were only offered two entrée options, employer volunteers were encouraged to share basic guidelines regarding how and what to order during sit-down dinners or interviews.  Almost all of the pre-survey responses revolved around not ordering food that is too messy or difficult to eat.  Post-survey results again provided more breadth and detail.  Students mentioned avoiding “smelly” food, considering price, and following the lead of the interviewer/host.  One student even suggested not ordering meat if your host is a vegetarian…discuss!

Section Four: Following Up

How should students follow up with an individual after a networking event or mealtime interview?  Turns out, most students already understood the basics (insert career counselor sigh of relief here).  On the pre-event survey, many students responded that you should send a follow-up thank-you via e-mail (or, in some cases, USPS); after the event, however, students included details like “within 24-48 hours” and mentioned LinkedIn for the first time.

What We Learned

Overall, we were happy with the improvements we saw between the pre- and post-event surveys.  And, of course, we found that 97 percent of students were satisfied with the event!  Here are a few key takeaways and thoughts regarding the survey for next year’s event:

  • The table setting question may not have accurately measured students’ level of comfort with formal dining before and after the event.  The way the image was laid out may have been too simple.  For future surveys, we are considering having students draw a diagram or place items around a plate to more accurately reflect our table setting activity.

  • Students understand the basics regarding discussion topics, ordering, and following up after events, but the activities and discussions gave them a broader, more anecdotal understanding of how to navigate mealtime events and interviews.

  • We will consider measuring different skills/content areas each year.  Our event also included activities around introducing yourself and handling sticky situations, which were not assessed in the pre- or post-event surveys.  It would be interesting to see how students’ understanding of these topics changed as a result of the event.

The Assessment Diaries: Beyond Satisfaction

Desalina Allen

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I almost waited to write this post until my assessment of our recent Dining for Success (DFS) etiquette dinner was complete.  Almost. I wanted to write about the learning demonstrated via our pre- and post-assessments only after I was sure they actually showed that students had learned something. Then I realized that I had promised to provide a realistic overview of what it’s like to work with assessment day to day.

You know how sometimes when you start something new, more experienced people say things like “it’s not so scary” or “you’ll get the hang of it in no time”?  I may be guilty of saying these exact things when introducing assessment to others.  But I have a confession: The assessment of this event is scaring me.

Our DFS program has always been a hit with employers and students.  How do we know they like it?  We give out a post-event survey that basically measures satisfaction with the event (and allows students to rate their overall learning).

The truth is, how could you not like an event like this? Students get a great (oftentimes free) meal at a popular local restaurant, a chance to network, and tons of dining and interview tips. This is why moving away from a satisfaction survey is so scary – students are generally satisfied with our events, and it’s rewarding (and easy) to share a summary of these surveys (95 percent of students reported that they would recommend this event to friends!).

The problem is that, as educators, satisfaction isn’t all we care about.  We want students to walk away having learned something from our events, and learning can be challenging to measure. So, in an effort to make sure students were actually walking away with new information, we prioritized topics of importance, introduced more structured activities to teach those topics, and provided enhanced training for our employers and staff.

In assessment lingo: we set learning goals!  Here they are:

Students will be able to…

  • Identify the proper table arrangement at a formal dinner (including placement of silverware, bread plate, and water and wine glasses)

  • List two guidelines regarding what to order during a mealtime interview

  • List three appropriate discussion topics for a networking event

  • List three topics to avoid discussing during a networking event

  • List appropriate ways to follow up with professionals after events

To evaluate these goals, we measured students’ current level of knowledge with a pre-event survey sent out with registration confirmations (you can view it here). Then, at the end of the event, we had students fill out a nearly identical paper survey and encouraged input from employers and career services staff.  We also asked them ONE satisfaction question (because, hey, satisfaction is also important).

We are still tabulating the students’ responses, and it’s nerve-wracking.  I’m hoping I can share some really great improvements in their knowledge, but there is always a risk that the learning won’t show up clearly in the survey results.

Since this is the first time we’ve approached the assessment of this event with pre- and post-surveys, I’m sure there will be changes we need to make to the process.  I’ll be sharing the results and what we learned in a follow-up post, but I would love for readers to share their experiences setting and evaluating learning goals.  Has it worked for you? Have you evaluated programs this way? Any tips for pre- and post-surveys? What were the results? Any feedback on the learning goals or survey?