An Insider’s Look at First-Destination Surveys

Vanessa Newton

Vanessa Newton, Program Analyst, University of Kansas
Twitter: https://twitter.com/vlnewt
LinkedIn: www.linkedin.com/in/vanessaliobanewton
Blog: www.wellnessblogging.com


(Part 1 of 4 on early adoption of the NACE First-Destination Survey Standards.)

When the NACE First-Destination Survey Standards and Protocols were released early this year, I went through the full continuum of emotions. Happiness? Check. Worry? Yup. Frustration? You betcha! I had all the feelings. But when I settled in to figure out how to implement these new standards and protocols, I learned a few things along the way. So today is the first post in a four-post series written by me and my colleague, Katrina Zaremba, communications coordinator, giving you an inside look at how the University Career Center at the University of Kansas (KU) is implementing these new standards.

First things first (no, this isn’t a reference to a slightly annoying song): if you have not read the standards and protocols, I highly recommend you do that first. I’ll wait. Go on…oh, you already read them? Well then… Ok, let’s get into the top three things that we changed or implemented at KU based on the standards and protocols, shall we?

One of the first things I did was change our survey. Previously, we just asked whether graduates were employed full time or part time, attending graduate school, seeking employment, or not seeking employment. The additions made to this question excited me greatly. I loved the phrasing of “continuing education” versus “attending graduate school,” since some of our graduates were not going on to graduate school but rather pursuing additional coursework or a certificate. Adding the additional categories was an easy change, and I think it will be interesting to see the data we get back and how it differs from previous years.

The second thing I needed to do was change how we defined our graduating class. We previously defined them as December, May, and August graduates, and now we define them as August, December, and May graduates. I’ll admit, it is a small change and a relatively easy one to make, but I really appreciated that the standards defined the graduating class. Small change, big impact, in my opinion.

Finally, we implemented “knowledge rate” last year, but we now have an ambitious goal of reaching a 65 percent knowledge rate. We had a roughly 19 percent response rate from the surveys we sent to students, and then bumped our knowledge rate up to roughly 40 percent by gathering information from LinkedIn and other reputable sources. (For example, some of our staff knew the graduates personally, or the university paper wrote a story on where graduates go after they leave KU; two of the three graduates featured hadn’t responded to the survey and we couldn’t find them on LinkedIn…success!) We stay active and alert for any information about graduates and where they are going after they leave.
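To make the distinction between response rate and knowledge rate concrete, here is a minimal sketch of the arithmetic. The class size and counts below are hypothetical, chosen only so the percentages match the approximate figures mentioned above; the actual KU numbers are not in this post.

```python
# Knowledge rate = graduates with a known outcome / total graduates,
# regardless of how the outcome was learned (survey response, LinkedIn,
# staff knowledge, news stories, etc.). The response rate counts only
# direct survey responses.

def knowledge_rate(survey_responses, other_verified, class_size):
    """Return the knowledge rate as a fraction of the graduating class."""
    known_outcomes = survey_responses + other_verified
    return known_outcomes / class_size

# Hypothetical graduating class of 4,000:
responses = 760           # ~19% answered the survey directly
verified_elsewhere = 840  # outcomes found via LinkedIn and other sources

rate = knowledge_rate(responses, verified_elsewhere, 4000)
print(f"Knowledge rate: {rate:.0%}")  # prints "Knowledge rate: 40%"
```

The takeaway is that every verified outcome counts toward the knowledge rate, which is why sources beyond the survey itself can move an office from a ~19 percent response rate toward a 65 percent knowledge-rate goal.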

So there it is: changing the survey, defining our graduating class, and implementing a knowledge rate, plus keying into ways you can reach that 65 percent. These are small steps you can take to ease into implementing the standards and protocols at your school.

Stay tuned for more posts from Katrina and me—we have a great series planned, giving you an inside look at our marketing, data analysis, and reporting, and providing some afterthoughts once the first-destination season has finished.

Feel free to use the comment section to leave your feedback and tips as well. Let’s open the conversation and share our stories! If there is interest, we may even do a bonus Q & A post in regard to first-destination surveys!

For more information on first-destination surveys, see the Advocacy section of NACEWeb.


3 thoughts on “An Insider’s Look at First-Destination Surveys”

  1. Vanessa, this is fantastic! I went through similar emotions implementing the standards at my University, too. However, once you find what works for your institution, it really does make life MUCH easier! Plus, better statistics help to garner more attention and support from others on campus.

  2. Vanessa ~
    Thank you for sharing your story. As we work to adopt the new NACE Standards and Protocols, I feel a similar range of emotions. Although we did not need to change how we defined our graduating class and are fortunate enough to have a solid continual knowledge rate, I have invested in changing our survey and am now working on implementing the changes in reporting. I look forward to reading your blogs to follow about your next steps in this process!

  3. We’ve been modifying our alumni survey process here, and this was a great post about other process changes. We also start with December for our graduation window. I’d be interested in hearing about any “bumps” along the way with cleaning / verifying some of the knowledge data and any processes that were developed to be more consistent with weird cases. We had pulled data from LinkedIn, which is a bit messier than other data sources. I also sent our graduating class to the National Student Clearinghouse for subsequent enrollment for further knowledge data.

    Looking forward to other posts in this series.
