The Assessment Diaries: Quick and Qualitative

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) or require staff buy-in, training, and socialization (like the Resume Rubrics). Just last week, I decided at the very last minute that I wanted to assess a networking presentation for international students…last minute, as in, 20 minutes before the event. This exercise is proof that assessment doesn't have to take hours and hours of your time. Sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.

I'll invoke my earlier promise to be honest when sharing my experiences with assessment, and this post is no different. I'd like to say I was happy with these results, but instead I was disappointed to find that I had probably assessed the wrong learning goal. I started from the fact that I wanted students to gain a more nuanced understanding of networking. Here's what I did:

Twenty minutes before the presentation I grabbed some colorful paper: yellow would be used for my pre-assessment and pink for the post-assessment. This color choice was not at all based on any carefully planned and research-supported theory that bright paper makes people happy; in fact, I did it simply to make sure I could keep the two "surveys" separate.

At the beginning of the event, I asked the students to spend two minutes writing about networking. It could have been their definition of networking or just words that came to mind; grammar and complete sentences were not necessary. I then repeated the exercise at the end of the event.

I could have just looked through and summarized key trends from each sample, but I decided to get fancy: I transcribed the text and entered it into Wordle, a tool that generates word clouds.
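Wordle did the heavy lifting here, but the core of the technique is just counting word frequencies across the responses, with common filler words dropped. Here's a minimal stand-alone sketch in Python; the sample responses and the stopword list are made up for illustration:

```python
import re
from collections import Counter

# Common filler words to ignore, the way word-cloud tools drop stopwords
STOPWORDS = {"the", "a", "an", "to", "of", "and", "is", "in", "with", "for", "about"}

def top_words(responses, n=5):
    """Return the n most frequent words across free-text responses."""
    words = re.findall(r"[a-z']+", " ".join(responses).lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(n)

# Hypothetical pre-workshop responses (not the actual student data)
pre = [
    "Networking is meeting people to build relationships",
    "Talking to people about jobs and making connections",
]
print(top_words(pre, 3))  # "people" comes out on top in this sample
```

The words that surface most often are exactly the ones a word cloud renders largest, so comparing the top words pre versus post gives the same signal as eyeballing the two clouds.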

Here's the pre-workshop Wordle:

[Pre-workshop word cloud image]

And the post:

[Post-workshop word cloud image]

While the results show that I focused on the importance of relationships, I don't think I can claim that students gained a more in-depth understanding of networking. What I did learn is that students seemed to already have a handle on the definition of networking, so perhaps I should have assessed their comfort level with actually putting networking into practice!

While this wasn't my most successful assessment attempt, I do think this technique can be great when you are trying to compare students' knowledge of more difficult-to-assess topics (think professionalism, diversity, self-awareness).

Would you try it?

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: Beyond Satisfaction

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development

I almost waited to write this post until my assessment of our recent Dining for Success (DFS) etiquette dinner was complete. Almost. I wanted to write about the learning demonstrated via our pre- and post-assessments only after I was sure the results actually showed that students had learned something. Then I realized that I had promised to provide a realistic overview of what it's like to work with assessment day to day.

You know how, when you start something new, more experienced people sometimes say things like "it's not so scary" or "you'll get the hang of it in no time"? I may be guilty of saying these exact things when introducing assessment to others. But I have a confession: the assessment of this event is scaring me.

Our DFS program has always been a hit with employers and students. How do we know they like it? We give out a post-event survey that essentially measures satisfaction with the event (and allows students to rate their overall learning).

The truth is, how could students not like an event like this? They get a great (oftentimes free) meal at a popular local restaurant, a chance to network, and tons of dining and interview tips. This is why moving away from a satisfaction survey is so scary: students are generally satisfied with our events, and it's rewarding (and easy) to share a summary of these surveys (95% of students reported that they would recommend this event to friends!).

The problem is that, as educators, we don't care only about satisfaction. We want students to walk away having learned something from our events, and learning can be challenging to measure. So, in an effort to make sure students were actually walking away with new information, we prioritized topics of importance, introduced more structured activities to teach those topics, and provided enhanced training for our employers and staff.

In assessment lingo: we set learning goals!  Here they are:

Students will be able to….

  • Identify the proper table arrangements at a formal dinner (including placement of silverware, bread plate, water and wine glass)

  • List two guidelines regarding what to order during a mealtime interview

  • List three appropriate discussion topics for a networking event

  • List three topics to avoid discussing during a networking event

  • List appropriate ways to follow up with professionals after events

To evaluate these goals, we measured students' current level of knowledge with a pre-event survey sent out with registration confirmations: you can view it here. Then, at the end of the event, we had students fill out a nearly identical paper survey, and we encouraged input from employers and career services staff. We also asked them ONE satisfaction question (because, hey, satisfaction is also important).
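The analysis behind a pre/post design like this boils down to comparing, for each learning goal, the share of respondents who answered correctly before and after the event. A minimal sketch with made-up numbers (the actual survey data isn't shown in this post):

```python
def knowledge_gain(pre_correct, post_correct, n_pre, n_post):
    """Percentage-point change in the share of respondents answering correctly."""
    return round(100 * (post_correct / n_post - pre_correct / n_pre), 1)

# Hypothetical counts of correct answers per learning goal (illustrative only)
goals = {
    "table arrangements": (12, 38),
    "what to order": (20, 41),
    "discussion topics": (25, 40),
}
N_PRE, N_POST = 50, 45  # respondents to the pre- and post-surveys

for goal, (pre_c, post_c) in goals.items():
    print(f"{goal}: {knowledge_gain(pre_c, post_c, N_PRE, N_POST):+.1f} pts")
```

Reporting the change per goal, rather than one overall score, also points to which topics the structured activities taught well and which need rework next time.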

We are still tabulating the students' responses, and it's nerve-wracking. I'm hoping I can share some really great improvements in their knowledge, but there is always a risk that learning won't show up clearly in the survey results.

Since this is the first time we've approached the assessment of this event with pre- and post-surveys, I'm sure there will be changes we need to make to the process. I'll share the results and what we learned in a follow-up post, but I would love for readers to share their experiences setting and evaluating learning goals. Has it worked for you? Have you evaluated programs this way? Any tips for pre- and post-surveys? What were the results? Any feedback on the learning goals or survey?