The Assessment Diaries: Beyond Satisfaction


A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I almost waited to write this post until my assessment of our recent Dining for Success (DFS) etiquette dinner was complete. Almost. I wanted to write about the learning demonstrated via our pre- and post-assessments after I was sure they actually showed that students learned something. Then I realized that I promised to provide a realistic overview of what it’s like to work with assessment day-to-day.

You know how sometimes when you start something new, more experienced people say things like “it’s not so scary” or “you’ll get the hang of it in no time”? I may be guilty of saying these exact things when introducing assessment to others. But I have a confession: the assessment of this event is scaring me.

Our DFS program has always been a hit with employers and students. How do we know they like it? We give out a post-event survey that basically measures satisfaction with the event (and allows students to rate their overall learning).

The truth is, how could you not like an event like this? Students get a great (oftentimes free) meal at a popular local restaurant, a chance to network, and tons of dining and interview tips. This is why moving away from a satisfaction survey is so scary: students are generally satisfied with our events, and it’s rewarding (and easy) to share a summary of these surveys (95% of students reported that they would recommend this event to friends!).

The problem is that, as educators, satisfaction isn’t all we care about. We want students to walk away having learned something from our events, and learning can be challenging to measure. So, in an effort to make sure students were actually walking away with new information, we prioritized topics of importance, introduced more structured activities to teach these topics, and provided enhanced training for our employers and staff.

In assessment lingo: we set learning goals!  Here they are:

Students will be able to….

  • Identify the proper table arrangements at a formal dinner (including placement of silverware, bread plate, and water and wine glasses)

  • List two guidelines regarding what to order during a mealtime interview

  • List three appropriate discussion topics for a networking event

  • List three topics to avoid discussing during a networking event

  • List appropriate ways to follow up with professionals after events

To evaluate these goals, we measured students’ current level of knowledge with a pre-event survey sent out with registration confirmations: you can view it here. Then, at the end of the event, we had students fill out a nearly identical paper survey and encouraged input from employers and career services staff. We also asked them ONE satisfaction question (because, hey, satisfaction is also important).
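Once both surveys are tabulated, the comparison itself is simple arithmetic: for each learning goal, compare the share of students answering correctly before and after the dinner. Here is a minimal sketch in Python, using entirely hypothetical numbers (our real results aren’t in yet):

```python
# Percent of students answering each learning-goal item correctly,
# before and after the event (hypothetical numbers for illustration)
pre_correct = {
    "table arrangements": 40,
    "what to order": 55,
    "discussion topics": 60,
    "topics to avoid": 50,
    "follow-up": 45,
}
post_correct = {
    "table arrangements": 85,
    "what to order": 80,
    "discussion topics": 90,
    "topics to avoid": 75,
    "follow-up": 70,
}

# Report the change, in percentage points, for each learning goal
for goal in pre_correct:
    change = post_correct[goal] - pre_correct[goal]
    print(f"{goal}: {pre_correct[goal]}% -> {post_correct[goal]}% ({change:+d} points)")
```

Even a simple table like this is far more informative than a satisfaction score alone, because it ties the event directly back to the learning goals.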

We are still tabulating the students’ responses, and it’s nerve-wracking. I’m hoping I can share some really great improvements in their knowledge, but there is always a risk that this won’t show up clearly in the survey results.

Since this is the first time we’ve approached the assessment of this event with pre- and post-surveys, I’m sure there will be changes we need to make to the process. I’ll be sharing the results and what we learned in a follow-up post, but I would love readers to share their experiences setting and evaluating learning goals. Has it worked for you? Have you evaluated programs this way? Any tips for pre- and post-surveys? What were the results? Any feedback on the learning goals or survey?

The Assessment Diaries: 5 Questions to Ask Before Creating a Survey


In addition to some of the methods I’ve already mentioned, surveys can be a great way to collect both quantitative and qualitative information from students, employers, and other key career services stakeholders. There are definitely questions you should ask yourself before deciding that a survey is the right collection method, but I’ll save those for another post.

For now, let’s assume you are dead set on surveying and you just don’t want it to go badly wrong.

Here are five questions to ask yourself before you start designing and distributing your survey:

What information do I absolutely need to collect? Consider whether you already have access to accurate student information, such as major, department, and graduation date, before asking for it in your survey. If you do, you can ask for a student ID and match up the two sets of information. Many online survey platforms also allow you to upload a list of survey recipients and send each one a customized hyperlink, so you don’t need to collect name and contact information. When we survey, we rarely ask for school, major, or graduation date because we often have this information via our career services management system and/or registrar records. Two or three fewer questions, now that’s exciting.

What is your population? When you review your results or write your report, what is the group you are trying to describe? Will it be students who attended a resume seminar (more specifically: a resume seminar on December 13, or any resume seminar throughout the year)? Is it all juniors, or only juniors who have completed summer internships? Having a clear understanding of your population will help you answer the next question:

How many responses do I need? Depending on your survey method, budget, and population size, you may not get responses from everyone. This is OK: statistics allows you to describe your population without having data from everyone. This chart is really helpful – find the approximate size of your population in the far-left column, then find the corresponding number of responses necessary to describe that population. For example, if you are trying to describe a population of 25,000 undergraduate students, you may only need between 700 and 10,000 responses, depending on how precise and confident you want your estimates to be. You should also be sure that there is no systematic difference between those who did and did not respond to your survey. For example, if all of your responses came from people who attended a particular event, your results may be skewed, as these people may differ from the total population. Finally, do some benchmarking and check past reports to get an idea of what response rate is considered reasonable. In the example above, a 40 percent response rate (10,000/25,000) may be acceptable for a student satisfaction survey but not for your annual first-destination survey.
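If you’d rather compute these numbers than read them off a chart, the standard tool is Cochran’s sample-size formula with a finite-population correction. A quick sketch in Python – note that the exact figures in any published chart may differ slightly depending on the confidence level and margin of error it assumes:

```python
import math

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's sample-size formula with a finite-population correction.

    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative assumption about how varied responses will be.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# For a population of 25,000 undergraduates:
print(required_sample_size(25_000, margin_of_error=0.05))  # ~380 responses at +/-5%
print(required_sample_size(25_000, margin_of_error=0.02))  # ~2,200 responses at +/-2%
```

Notice how quickly the required number grows as you tighten the margin of error: that’s the trade-off behind the wide “700 to 10,000” range.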

How will I collect this information? Websites like SurveyMonkey offer free accounts, and many institutions have licenses for software such as Qualtrics (my platform of choice). Of course, there is always the old-fashioned paper-and-pencil method, which is still a very effective way to collect information. Career services professionals may also check to see if their existing career services management system offers surveying features (Symplicity’s NACElink system offers this as an add-on).

Will multiple methods be required to achieve the desired number of responses? Using one method of surveying may not be enough to achieve your target response rate or get the information you need. Consider using a combination of paper forms, online surveying, phone surveying, in-person interviews, and even online research. My fellow NACE guest blogger, Kevin Grubb, mentioned that the new NACE position statement on first-destination surveys will now use the term “knowledge rate” instead of “response rate,” as we often collect information from faculty, employers, and even LinkedIn research to learn about our students’ career outcomes.

What do you think? Add your thoughts in the comments section!

The Assessment Diaries: It’s Not Just Data


 

I have to admit that I’m pretty left-brained when it comes to my work. In fact, the thought of spending a quiet afternoon in front of Microsoft Excel, coffee in hand, warms my heart (did I mention that I love coffee?).


It’s for that reason that when I first started learning about assessment, I often equated it with data collection, as I’m sure many others do as well. Don’t get me wrong: it’s important to know how many and what types of students are using your services. But, in addition to those metrics, it’s also valuable to think about demonstrating your office’s success using qualitative information. As J.K. Rowling said, “there’s always room for a story that can transport people to another place.”

So what exactly is qualitative information? Basically, anything other than numerical data. It’s been on my mind because it seems that lately we have received quite a few requests for student success stories.  This isn’t surprising – stories supplement, support and strengthen the metrics we already share – and, unlike me, not everyone finds joy in looking at pie charts all day.


Here are some examples of ways you can collect and organize qualitative information and how these methods support your assessment objectives:

  • Focus Groups or Advisory Boards:  These two methods are great ways to better understand your students’ needs. They work well if you’ve sent out a survey and want help explaining some of the findings, or if you feel (like many of us do) that your students are suffering from survey fatigue and won’t respond to one more request. Focus groups tend to be brought together once around a specific topic, whereas advisory boards may meet throughout the academic year. In both cases, be thoughtful about who you invite to the table (Do you want students from a particular background or school? Is it open to everyone, or might you want to conduct interviews first?). You’ll also want to think critically about who should facilitate: consider both staff members and unbiased professionals who are specially trained. Either way, be sure to document the planning, take notes or transcribe, and be ready to plan follow-up actions based on what you learn.

  • Word Association Exercises (Pre and Post):  Have students write down or share words they associate with a particular topic before and after an event or presentation to help measure whether your core message came across. For example, in a seminar on interviewing, students may start the session offering words like “scary” or “questioning” and end by sharing words like “preparation,” “practice,” or “conversation.” Keep track of the terms shared and use an application like Wordle to look at the pre and post results side by side.

  • Observation:  You don’t need to bring in a team of consultants every time you need an external perspective. Consider asking a trusted career services professional to attend your career fair, observe a workshop, or review your employer services offerings and provide written feedback and suggestions. Offer your expertise on another topic in exchange to avoid paying a fee. Keep notes on the changes you implement based on the observation.

  • Benchmarking:  There are many reasons to benchmark. For assessment purposes, knowing what other schools are doing and how they compare to you helps give others context. Being able to say that your program is the first of its kind, or that it’s modeled on an award-winning one developed by a colleague, may make more of an impact when combined with your standard student satisfaction survey results.

  • Staff:  We are all lucky enough to receive the occasional thank-you note or email from a student who has really benefited from the programs and resources provided by the career center. Come up with a standardized way to quickly track those students. It could be something as easy as a document on a shared drive or even a flag in your student management system. Be sure to ask students’ permission, saying something like, “I’m so happy to hear our mock interview meeting helped you land that internship! We are always looking for students who are willing to share their positive experiences. Would you be comfortable sharing this information in the future should we receive a request?”
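As a side note on the word-association exercise above: before pasting terms into a tool like Wordle, a simple tally already shows the shift at a glance. A minimal sketch in Python, using hypothetical words from an interviewing seminar:

```python
from collections import Counter

# Hypothetical words students shared before and after an interviewing seminar
pre_words = ["scary", "questioning", "scary", "nervous", "scary"]
post_words = ["preparation", "practice", "conversation", "practice"]

# Normalize to lowercase so "Scary" and "scary" count as one term
pre_counts = Counter(word.lower() for word in pre_words)
post_counts = Counter(word.lower() for word in post_words)

# Side-by-side view of how the vocabulary shifted
print("Before:", pre_counts.most_common())
print("After: ", post_counts.most_common())
```

Seeing “scary” dominate the pre list and disappear from the post list is exactly the kind of qualitative evidence that complements a satisfaction score.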

I’m sure there are many more ways to collect this type of information – please leave your questions and share your own experiences below!

The Importance of Social Media and Measuring ROI in Career Services Practices

A post by Guest Blogger, Heather Tranen
Associate Director, Global Communications & Strategic Outreach, NYU Wasserman Center for Career Development
Twitter: @htranen
LinkedIn: http://www.linkedin.com/in/heathertranen

 

 

Social media continues to grow in scope and power. There are so many platforms out there, and our students are all over them. To this generation, it’s almost as if things don’t actually happen unless they are filming, photographing, tweeting, or status-updating them.

Gen Y overshares and hyperconsumes content in the online space.

“They take technology for granted. They live through social media. They want the world their way, and they want it now.” – Forbes on Gen Y

As career services professionals, we need to shape our communication strategies to both speak their language and teach them to become fluent in the language of the professional world. Through social media, we can engage students in a space where they are comfortable, and then draw them into our office to connect with the tangible resources they need to be successful after college – a bait and switch of sorts.

These days, most understand that social media is here to stay. However, many still question whether there is value in it. Therefore, measuring ROI is crucial. Knowing the difference between vanity and actionable metrics is extremely important!

Vanity Metrics: It’s always nice to have a large following and lots of fans to make us feel super important and liked. These vanity metrics are often how supervisors judge whether we are doing a good job, and yes, they matter. However, who are these individuals following or liking us? Are they strangers, or connections who are actually engaging with and utilizing our resources?

Actionable Metrics: What really matters is whether our campaign translated into “performance” outcomes. Who retweeted us? Who became more aware of our resources and came to the office to use them? These are the questions we should all ask when engaging with students in the social media space.

Metrics and ROI are becoming increasingly important in higher education. I recommend looking at platforms like Hootsuite, Twitonomy, Klout, and Facebook’s page admin tools to help you gather a valuable measurement of your engagement in the online space. Correlating the timing of your social media messaging with spikes in attendance or counseling requests also serves as a more indirect way of showing the impact of your social media practices, and proving you are social media all-stars!

The Assessment Diaries: An Introduction

The vampires are doing it….

And even that Carrie from Sex and the City (don’t pretend you aren’t watching)….

So why not chronicle the exciting, interesting and sometimes challenging task of Assessment in Career Services?

Assessment is indeed a hot topic, thanks to increasing pressure from parents, students, the media, accreditors, and the government for institutions to demonstrate student outcomes and learning. Many of us have attended a workshop or presentation providing an overview of assessment or strategies for writing learning goals and objectives. But what happens when you get back to the office and have to put that theory into practice? How do you deal with a lack of time, ever-changing technology, and the need to motivate others to help you?

While assessment is formally part of my role at NYU’s Wasserman Center for Career Development, it’s still really difficult to devote time and energy to the medium- and long-term planning and brainstorming necessary to set up our office for future success. Student counseling, employer outreach, program planning, and daily troubleshooting often seem like higher priorities when I look at my email in the morning. But assessment does need to be a priority, and nothing feels better than having accurate information to share when you get that call from a dean, faculty member, parent, or colleague requesting data or a success story.

With this series, I hope to provide a realistic perspective on what it’s like not only to think about these strategies, but to enact them day-to-day. The more I learn about and implement assessment strategies, the more I notice ways we can improve how we collect, share, and evaluate information. I’ll share my own tips, mistakes, and challenges, and I may even reach out to professionals in and outside of the career services world for advice and commentary. I look forward to your comments, feedback, and ideas.
