The Assessment Diaries: Beyond Satisfaction Follow-Up

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

The results are in!  I recently shared assessment plans for our Dining for Success etiquette dinner.  We moved away from a satisfaction survey to a pre-dinner and post-dinner skills assessment for the first time and, as I shared in my previous post, I was a little nervous about the results.  Here is what we found:

Section One:  Understanding Formal Place Settings

Let’s face it.  We could all use a refresher on how not to steal your future boss’ bread plate, and our students were no different.  Before and after the event they were asked to identify each plate, cup, and piece of silverware in this photo:

Then, at the beginning of the event we had all utensils, plates, and glasses piled in the center of the table and asked each student to organize their place setting. We noticed a bit of uncertainty during the activity and our employer volunteers stepped in often to help, which tells us that students were not completely clear about formal place settings.

This experience conflicts with what we found via the assessment. We didn’t see much of a difference between pre and post results. In fact, most students correctly identified the items (with #6 dinner fork and #5 salad fork being confused just a few times).  We did see a slight drop in the number of blank responses, which could be interpreted to mean that students felt more certain about formal place settings after the event.

Section Two:  Appropriate vs. Inappropriate Table Topics

Students were asked to list three appropriate topics to discuss at mealtime interviews or networking events, as well as three topics to avoid.  During the event, we provided employer volunteers with a list of suggestions and encouraged them to supplement it based on their experience.

On the pre and post surveys, students were instructed to leave questions blank if they did not know the answer. Comparing responses revealed a significant increase in the number of students who answered these questions after the event.  We also noticed that a wider variety of more detailed topics were listed in the post surveys.  For example, students most often listed “career,” “food,” and “hobbies” in the pre-dinner survey, while post-dinner survey responses included things like “the professional’s background,” “the industry,” “new projects,” and “current events.”
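
The kind of pre/post comparison described above can be sketched in a few lines of code.  This is only an illustration: the response lists below are invented, not our actual survey data.

```python
# Hypothetical sketch of comparing pre- and post-event survey responses.
# The response data here is invented for illustration only.

pre = ["career", "food", "", "hobbies", "", "", "career", ""]
post = ["the industry", "new projects", "current events",
        "the professional's background", "hobbies", "career", "", "food"]

def blank_count(responses):
    """Count responses left blank (students who didn't know the answer)."""
    return sum(1 for r in responses if not r.strip())

def distinct_topics(responses):
    """Count the distinct non-blank topics listed."""
    return len({r for r in responses if r.strip()})

print("Blanks:", blank_count(pre), "->", blank_count(post))
print("Distinct topics:", distinct_topics(pre), "->", distinct_topics(post))
```

Fewer blanks and a wider variety of distinct topics in the post column would correspond to the two improvements we observed.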

Section Three: Ordering Food

While guests were only offered two entrée options, employer volunteers were encouraged to share basic guidelines regarding how and what to order during sit-down dinners or interviews.  Almost all of the pre-survey responses revolved around not ordering food that is too messy or difficult to eat.  Post-survey results again provided more breadth and detail.  Students mentioned avoiding “smelly” food, considering price, and following the lead of the interviewer/host.  One student even suggested not ordering meat if your host is a vegetarian…discuss!

Section Four: Following Up

How should students follow up with an individual after a networking event or mealtime interview?  Turns out, most students already understood the basics (insert career counselor sigh of relief here).  On the pre-event survey, many students responded that you should send a follow-up thank you via e-mail (or, in some cases, USPS); after the event, however, students included details like “within 24-48 hours” and mentioned LinkedIn for the first time.

What we learned

Overall, we were happy with the improvements we saw between the pre and post-event surveys.  And, of course, we found that 97 percent of students were satisfied with the event!  Here are a few key takeaways and thoughts regarding the survey for next year’s event:

  • The table setting question may not have accurately measured students’ level of comfort with formal dining before and after the event.  The way the image was laid out may have been too simple.  For future surveys, we are considering having students draw a diagram or place items around a plate to more accurately reflect our table setting activity.

  • Students understand the basics regarding discussion topics, ordering, and following up after events, but the activities and discussions gave them a broader, more anecdotal understanding of how to navigate mealtime events and interviews.

  • We will consider measuring different skills/content areas each year.  Our event also included activities revolving around introducing yourself and handling sticky situations that were not assessed in the pre- or post-event surveys.  It would be interesting to see how students’ understanding of these topics changed as a result of the event.

Fixated on “First Destinations”

A post by NACE Guest Blogger, Kevin Grubb.
Assistant Director at Villanova University’s Career Center.
Twitter: @kevincgrubb
LinkedIn: http://www.linkedin.com/in/kevingrubb
Blog: “social @ edu”.

That’s my official meditation for today at the NACE conference.  This morning, I attended a session hosted by the NACE First Destination Task Force, where we discussed what’s been happening at the association and beyond with our increasingly critical surveys about where our graduates go after they leave our institutions.  With national attention being paid to these data and the numbers in the spotlight more often than ever, there’s no doubt this is a hot topic for career services attendees at the conference.  Here’s a breakdown of the session and some commentary from one of your faithful bloggers.

NACE released a position statement about First Destination surveys in July 2012, and we kicked off the session with a review of the principles laid out in that statement.  The short version:

  • Post-graduate success is the mission of the entire institution, not just career services
  • All graduates of an institution should be tracked in these surveys
  • Career services should have a central role in collecting this information
  • Outcomes should be inclusive, not just about immediate employment
  • Human subject & institutional research protocols should be observed when collecting information
  • Data may come from various reliable sources
  • Data collection should be ongoing, with final collection efforts completed 6-9 months after graduation
  • Data should be reported in aggregate and should protect individual confidentiality
  • Outcome data should consider: response rates, academic program breakdown of data, job titles, employers, salary data, and further academic study (what program and what institution)

The NACE Task Force is working on a version of a standardized first destination survey that can be used by all institutions.  The Task Force’s plan is to have all institutions using this survey for the graduating class of 2014.  So, with that in mind, the Task Force needed to go quite a bit beyond what was set forth in the position statement.  Namely:

  • There would need to be a core set of questions asked universally and consistently
  • There would need to be established definitions for standard measures (i.e., defining what “full-time employment” really means)
  • There would need to be an agreed-upon time frame for data collection
  • There would need to be suggested response rate requirements to ensure that the data reported are statistically valid and reliable

This is all no small order.  What about entrepreneurs?  What about graduates in the summer, the fall, or schools on different academic calendars?  How can we standardize all of this?  Questions about the intricacies of this are abundant, and rightfully so.

The Task Force was ready to share a bit about where they are in the process, so here’s what was learned.

New Language for First Destination Surveys

  • Perhaps we can lay the “p” word to rest?  The suggestion is to call it “career outcomes” rather than “placement.”
  • Recognizing that information about post graduate career outcomes comes from various sources (not just our surveys), the suggestion is to consider “knowledge rates” rather than “response rates.”  For instance, say a faculty member or employer lets a career services office know a student was hired and reports job title & employer information.  That’s knowledge, not a “response.”
  • When the data collection period ends, we can “close the books.”  Ongoing data collection can and should happen after graduation, and the profession should consider counting early, mid and later in academic year graduates (not just traditional “Spring” grads).  However, knowing that spring graduation is the largest for a majority of institutions, we can consider closing the books six months after that date, which is approximately December 1.  NACE would consider reaching out for information by the end of December, and then could share aggregate data in January to legislators, those involved in public policy, and those in trends reporting.
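
The “knowledge rate” idea above is essentially an arithmetic shift in the denominator’s complement: count every graduate whose outcome is known from any reliable source, not just survey respondents.  A minimal sketch, with entirely hypothetical figures (not NACE’s actual methodology):

```python
# Knowledge rate vs. response rate, with invented example numbers.
# Outcome knowledge can come from several sources, not just surveys.

graduates = 400
survey_responses = 220   # graduates who answered the survey themselves
employer_reports = 35    # outcomes reported by employers
faculty_reports = 15     # outcomes reported by faculty
linkedin_lookups = 30    # outcomes verified via LinkedIn profiles

response_rate = survey_responses / graduates
knowledge_rate = (survey_responses + employer_reports
                  + faculty_reports + linkedin_lookups) / graduates

print(f"Response rate:  {response_rate:.0%}")
print(f"Knowledge rate: {knowledge_rate:.0%}")
```

In this invented example the survey alone yields 55%, but the knowledge rate of 75% lands inside the Task Force’s suggested 65-85% range, which is exactly why the distinction matters.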

Suggestions for type and amount of information to collect

  • The Task Force suggested a knowledge rate range between 65% and 85%.  This is meant to serve as an initial guidepost and should help us find a workable range that is achievable, valid, and reliable.  Over time, as we develop this, the suggested knowledge rate range may increase.
  • The outcome measures to be provided include (this is not the whole picture): the percentage of graduates employed full-time, those pursuing further study, those still seeking employment, and those not seeking employment.  While information should be collected for both undergraduate and graduate students, it should also be reported separately for the undergraduate and graduate levels.
  • For the employment category, examples of information to collect include: job title, employer, and salary (both base salary & guaranteed first-year compensation, which includes signing bonuses)
  • For the further study category, the name of the academic program and the institution should be collected
  • If a student is working and pursuing further study, it is suggested that the data be categorized by the graduate’s primary pursuit.

A few more dimensions the Task Force is considering:

  • A way to measure a graduate’s satisfaction with their outcome?  Meaning: is this where they wanted to be?
  • For those who are reported as being employed full-time, is the employment related to their degree?
  • For now, the further study category is intended for those who are pursuing a graduate degree.  What about other types of study?  Certification programs?  Those who want to earn another undergraduate degree?

Suffice it to say, there are still many questions about this process yet to be answered.  But I think I can safely say there is agreement that this is important work that needs doing.  It’s a challenge, no doubt.  Life doesn’t fit into defined categories easily, and it follows that neither do one’s career plans.  At a time when many want to know, “Is college worth it?”, these first destination data points can be key indicators of a piece of the puzzle that is the answer to that question.