The Assessment Diaries: Quick and Qualitative

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) and/or require staff buy-in, training and socialization (like the Resume Rubrics).  Just last week, I decided super last minute that I wanted to assess a networking presentation for international students…last minute, as in, 20 minutes before the event. This exercise is proof that assessment doesn’t have to take hours and hours of your time—sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.

I need to invoke my earlier promise to be honest when sharing my experiences with assessment, and this post is no different. I'd like to say I was happy with these results, but instead I was disappointed to find that I probably assessed the wrong learning goal. My starting point was that I wanted students to gain a more nuanced understanding of networking. Here's what I did:

Twenty minutes before the presentation, I grabbed some colorful paper: yellow would be used for my pre-assessment and pink for the post-assessment. This color choice was not at all based on some carefully planned, research-supported theory that bright paper makes people happy; in fact, I did it to make sure I could keep the two “surveys” separate.

At the beginning of the event, I asked the students to spend two minutes writing about networking. It could be their definition of networking or just words that came to mind; grammar and complete sentences were not necessary. I then did the same thing at the end of the event.

I could have just looked through and summarized key trends from each sample, but I decided to get fancy, transcribe the text, and enter it into Wordle, a tool that generates word clouds.
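
Wordle is a point-and-click web tool, but the same step is easy to script. Below is a minimal sketch using the open-source Python wordcloud package (my own substitution for illustration, not the tool used above); the file names are hypothetical.

```python
# Sketch: generate pre/post word clouds from transcribed responses.
# Requires the open-source "wordcloud" package (pip install wordcloud).
from wordcloud import WordCloud

def make_cloud(text_path, image_path):
    # Read the transcribed two-minute responses and render a word cloud image.
    with open(text_path, encoding="utf-8") as f:
        text = f.read()
    WordCloud(width=800, height=400, background_color="white").generate(text).to_file(image_path)

make_cloud("pre_responses.txt", "pre_cloud.png")    # yellow-paper responses
make_cloud("post_responses.txt", "post_cloud.png")  # pink-paper responses
```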

Here’s the Pre-Workshop Wordle

[Pre-workshop word cloud]

And the Post:

[Post-workshop word cloud]

While the results show that I focused on the importance of relationships, I don’t think I can claim that students gained a more in-depth understanding of networking. What I did learn is that students already seemed to have a handle on the definition of networking, so perhaps I needed to assess their comfort with actually going out and networking!

While this wasn’t the most successful assessment attempt, I do think a quick pre/post writing exercise can be great when you are trying to compare students’ knowledge of harder-to-assess topics (think professionalism, diversity, self-awareness).

Would you try it?

Read more of Desalina Allen’s blogs on assessment!

NACE Flash Poll: Is “Career” in Your Institution’s Curriculum?

NACE Ambassador Kevin Grubb
Assistant Director at Villanova University’s Career Center.
Twitter: @kevincgrubb
LinkedIn: http://www.linkedin.com/in/kevingrubb
Blog: “social @ edu”

One of the latest trends in career services is the establishment of a career or professional development class embedded in the curriculum. Courses may be required or optional, credit-bearing or not. With the importance of career outcomes rising for colleges and universities, such courses are one possible solution for providing career education to all students.

NACE blog readers, is “career” in your institution’s curriculum? Share your answer in this poll and tell us about your career course in a comment. What do you teach and how do you do it?

For more information on this topic, check out NACE’s Career Course Syllabi.

The Assessment Diaries: Rubric Roundup

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I recently wrote about the problem with asking students to assess their own learning. In a nutshell—studies show we are not able to accurately measure our own learning and that we even tend to overestimate what we have learned.

This concept can definitely be applied to resumes. Even the best resume presentation or one-on-one review isn’t always enough to really teach students what makes an excellent resume versus a merely OK one. We already know this anecdotally: when students come back for two, three, or four reviews and haven’t yet mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot use students’ self-reported learning after these events or meetings as evidence that they really learned.

As career services professionals, we could critique resumes in our sleep.  I know I’ve easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes it makes more sense for us as experts to evaluate them.

Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation criteria more transparent and clear to students, and they can be a useful tool in training new staff members.

When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools.  I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback.  Several revisions later, we had a document that made explicit what we look for in a resume.  More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.

Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.).  If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop and then encourage them to come back with changes and evaluate it again.  Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
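
If your rubric scores end up in a spreadsheet, a few lines of code can handle both the reporting and the before/after comparison. Here is a minimal sketch in Python with pandas; the file name, column names, and rubric categories are hypothetical, not the actual NYU Wasserman rubric.

```python
# Sketch: summarize rubric scores by class year and measure change
# between a first and second review. All names here are hypothetical.
import pandas as pd

scores = pd.read_csv("resume_rubric_scores.csv")  # one row per reviewed resume

# How are students doing as a whole, broken out by class year?
print(scores.groupby("class_year")[["formatting", "content"]].mean())

# Did individual students improve between review rounds?
by_round = scores.pivot_table(index="student_id",
                              columns="review_round",  # values "first" / "second"
                              values="content")
improvement = by_round["second"] - by_round["first"]
print(improvement.describe())
```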

Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU.  Do you use rubrics at your institution?  If so, please share them in the comments section!

Examples:
NACE (Resume), ReadWriteThink (Resume and Cover Letter), Amherst Career Center (Resume), Illinois State (Resume), Liberty University (Online Resume)

NYU Wasserman:

Don’t miss “The Mystery of the Resume Writing Assessment” Part 1 and Part 2.

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: The Mystery of the Resume Writing Assessment (Part 2)

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

When we last left off, you were shocked that your post-seminar survey results could have been so misleading. Students reported that they had learned the basics of resume writing but, when you followed up with an in-person meeting with one of your attendees, it was obvious that the tips and guidelines you provided had not been applied.

Have you ever created or taken a survey with a question or questions like the ones below?

This seminar improved my understanding of resume writing basics:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

I learned something from this seminar:
True/False

As a result of this seminar, I now understand what employers look for in a resume:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

What is the problem here? Well, if you are simply looking for evidence that students believe they have learned something from your event, there is no problem at all. But if you are trying to collect evidence that students actually learned something, well then…

Why? Because studies show* that students are not able to accurately measure their own growth or learning.  Not only do they incorrectly estimate growth, they tend to overestimate it.  It makes sense, right? If someone asks you after a presentation or a class if you learned something, how do you really know if you did?  

As a result, we cannot use students’ self-reported growth as evidence of growth. Instead, we have to use other assessment methods to really prove they learned something. How? By doing a pre- and post-assessment of student knowledge (like I did for our etiquette dinner) and comparing results, or by coming up with a standardized way to evaluate resumes (via a rubric) and looking at the change over time.

Last year, one of our learning goals was to ensure that students were learning career-related skills like resume writing. We did away with our post-seminar surveys and instead created resume rubrics to use with students. I’ll be sharing that experience in my next few posts, along with helpful resources if your office is looking to create its own resume rubrics!
*Thank you to Sonia DeLuca Fernandez, our Director of Research and Assessment for Student Affairs, for this article that can be found in Research and Practice in Assessment, Volume 8.

The Assessment Diaries: The Mystery of the Resume Writing Assessment (Part 1)

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

In career services, most of us are used to facilitating workshops that teach skills to our students or alumni. The topic could be leadership, networking, career research, or social media and the job search. Oftentimes, after these events we send out surveys to determine just how much students learned. We ask if students feel more comfortable with the topic and understand some of the key takeaways. We may even throw in a satisfaction question or two.

Today,  I want you to imagine that you’re getting ready to facilitate one of those workshops and the topic is: Resume writing!  Don’t get too excited….

You know how, when you start a presentation, especially one you’ve done often, you almost immediately get a sense of how the audience will respond? Sometimes you walk in and students are just waiting for you with that expression on their faces that tells you that even if Eddie Murphy were giving this presentation, they might sleep through the entire thing?

Well, on this day you experience the exact opposite. Students are eager, smiling, even awake. They raise their hand when you ask for input and they actually laugh at your pathetic resume jokes (that you’ve managed to add just to keep yourself interested). You talk about clarity of format, keeping it to a page, customizing it for each position and you look around only to see heads nodding vigorously.

After the presentation you review the post event surveys. Students are giving you high marks across the board: they now understand resume basics, they feel they can apply these concepts to their own resumes, they even write comments about how great of a presenter you are.

That night, you check your e-mail and find a very sweet request from one of the participants: she notes that she learned a lot from the presentation but wants to come in tomorrow for a quick resume review, just to make sure everything is OK before she applies to a position. You reply “Sure!” thinking to yourself, “This should take only 15 minutes.”

Fast forward to tomorrow.  The student is seated in front of you.  As she reaches into her backpack to pull out her resume, your view switches to slow motion.  Suddenly, you catch a glimmer of light bouncing off of the object she’s taking out….

…..wait

…what the

….is that

….is that a staple??  

So, obviously this is a HUGE exaggeration (cue sarcastic snickers), but what went wrong here? Didn’t you talk about page length? Weren’t you clear about editing out non-relevant content? Surely you touched on including pictures. How could it be that after all of your hard work and intuition the student just didn’t get the point?  What about all of your positive survey results? Could they have misled you?

Stay tuned for part 2 of The Mystery of the Resume Writing Assessment where I’ll discuss the post-event assessment.  In the meantime…any guesses, comments, or thoughts on why this approach doesn’t always work? Leave them in the comments section below!

The Assessment Diaries: Beyond Satisfaction Follow-Up

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

The results are in!  I recently shared assessment plans for our Dining for Success etiquette dinner.  We moved away from a satisfaction survey to a pre-dinner and post-dinner skills assessment for the first time and, as I shared in my previous post, I was a little nervous about the results.  Here is what we found:

Section One:  Understanding Formal Place Settings

Let’s face it.  We could all use a refresher on how not to steal your future boss’ bread plate, and our students were no different.  Before and after the event they were asked to identify each plate, cup, and piece of silverware in this photo:

Then, at the beginning of the event we had all utensils, plates, and glasses piled in the center of the table and asked each student to organize their place setting. We noticed a bit of uncertainty during the activity and our employer volunteers stepped in often to help, which tells us that students were not completely clear about formal place settings.

This experience conflicts with what we found via the assessment. We didn’t see much of a difference between pre and post results. In fact, most students correctly identified the items (with #6 dinner fork and #5 salad fork being confused just a few times).  We did see a slight drop in the number of blank responses, which could be interpreted to mean that students felt more certain about formal place settings after the event.

Section Two:  Appropriate vs. Inappropriate Table Topics

Students were asked to list three appropriate topics to discuss at mealtime interviews or networking events, as well as three topics to avoid. During the event, we provided employer volunteers with a list of suggestions and encouraged them to supplement it based on their experience.

On the pre and post surveys, students were instructed to leave questions blank if they did not know the answer. Comparing responses revealed a significant increase in the number of students who answered these questions after the event.  We also noticed that a wider variety of more detailed topics were listed in the post surveys.  For example, students most often listed “career,” “food,” and “hobbies” in the pre-dinner survey, while post-dinner survey responses included things like “the professional’s background,” “the industry,” “new projects,” and “current events.”

Section Three: Ordering Food

While guests were only offered two entrée options, employer volunteers were encouraged to share basic guidelines regarding how and what to order during sit-down dinners or interviews. Almost all of the pre-survey responses revolved around not ordering food that is too messy or difficult to eat. Post-survey results again provided more breadth and detail. Students mentioned avoiding “smelly” food, considering price, and following the lead of the interviewer/host. One student even suggested not ordering meat if your host is a vegetarian…discuss!

Section Four: Following Up

How should students follow up with an individual after a networking event or mealtime interview? Turns out, most students already understood the basics (insert career counselor sigh of relief here). On the pre-event survey, many students responded that you should send a follow-up thank you via e-mail (or, in some cases, USPS); after the event, however, students included details like “within 24-48 hours” and mentioned LinkedIn for the first time.

What we learned

Overall, we were happy with the improvements we saw between the pre and post-event surveys.  And, of course, we found that 97 percent of students were satisfied with the event!  Here are a few key takeaways and thoughts regarding the survey for next year’s event:

  • The table setting question may not have accurately measured students’ level of comfort with formal dining before and after the event.  The way the image was laid out may have been too simple.  For future surveys, we are considering having students draw a diagram or place items around a plate to more accurately reflect our table setting activity.

  • Students understand the basics regarding discussion topics, ordering, and following up after events, but the activities and discussions gave them a broader, more anecdotal understanding of how to navigate mealtime events and interviews.

  • We will consider measuring different skills/content areas each year.  Our event also included activities revolving around introducing yourself and handling sticky situations that were not assessed in the pre- or post-event surveys.  It would be interesting to see how students’ understanding of these topics changed as a result of the event.

The Assessment Diaries Poll: Does your office actively seek candidates with assessment-related skills?

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Today, I’m taking a break from sharing my assessment experience and looking to the NACE community for some feedback. I’ve already alluded to some of the skills that are important to develop when working with assessment but have included more details here.

Assessment-Related Skills/Competencies:

  • Familiarity with assessment design, including writing learning goals

  • Experience conducting qualitative research via focus groups, interviews, or benchmarking

  • Knowledge of survey methodology and survey software

  • Ability to analyze quantitative information using Microsoft Excel or other statistical software such as SPSS or Stata

  • Comfort summarizing and reporting qualitative and quantitative research findings to audiences with various backgrounds

These skills and competencies can definitely be learned (I’m still working on them myself) but my guess is that they aren’t everyone’s cup of tea. Which leads me to my question: Does your office actively seek candidates with assessment-related skills?  This could mean including them in a job description, creating a position with official assessment responsibilities, or screening for these skills via the resume review process.

Please respond and include any comments below:

P.S. I found this really thorough overview of assessment skills via ACPA

The Assessment Diaries: Beyond Satisfaction

Desalina Allen

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I almost waited to write this post until my assessment of our recent Dining for Success (DFS) etiquette dinner was complete. Almost. I wanted to write about the learning demonstrated via our pre- and post-assessments only after I was sure the results actually showed that students had learned something. Then I realized that I promised to provide a realistic overview of what it’s like to work with assessment day-to-day.

You know how sometimes when you start something new, more experienced people say things like “it’s not so scary” or “you’ll get the hang of it in no time”?  I may be guilty of saying these exact things when introducing assessment to others.  But I have a confession: The assessment of this event is scaring me.

Our DFS program has always been a hit with employers and students. How do we know they like it? We give out a post-event survey that basically measures satisfaction with the event (and allows students to rate their overall learning).

The truth is, how could you not like an event like this? Students get a great (oftentimes free) meal at a popular local restaurant, a chance to network, and tons of dining and interview tips. This is why moving away from a satisfaction survey is so scary: students are generally satisfied with our events, and it’s rewarding (and easy) to share a summary of these surveys (95% of students reported that they would recommend this event to friends!).

The problem is that, as educators, we care about more than satisfaction. We want students to walk away having learned something from our events, and learning can be challenging to measure. So, in an effort to make sure students were actually walking away with new information, we prioritized topics of importance, introduced more structured activities to teach these topics, and provided enhanced training for our employers and staff.

In assessment lingo: we set learning goals!  Here they are:

Students will be able to….

  • Identify the proper table arrangements at a formal dinner (including placement of silverware, bread plate, water and wine glass)

  • List two guidelines regarding what to order during a mealtime interview

  • List three appropriate discussion topics for a networking event

  • List three topics to avoid discussing during a networking event

  • List appropriate ways to follow up with professionals after events

To evaluate these goals, we measured students’ current level of knowledge with a pre-event survey sent out with registration confirmations: you can view it here. Then, at the end of the event, we had students fill out a nearly identical paper survey and encouraged input from employers and career services staff. We also asked them ONE satisfaction question (because, hey, satisfaction is also important).

We are still tabulating the students’ responses and it’s nerve-racking. I’m hoping I can share some really great improvements in their knowledge, but there is always a risk that this won’t show up clearly in the survey results.

Since this is the first time we’ve approached the assessment of this event with pre- and post-surveys, I’m sure there will be changes we need to make to the process. I’ll be sharing the results and what we learned in a follow-up post, but I would love for readers to share their experiences setting and evaluating learning goals. Has it worked for you? Have you evaluated programs this way? Any tips for pre- and post-surveys? What were the results? Any feedback on the learning goals or survey?

The Assessment Diaries: 5 Questions to Ask Before Creating a Survey

Desalina Allen

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

In addition to some of the methods I’ve already mentioned, surveys can be a great way to collect both quantitative and qualitative information from students, employers, and other key career services stakeholders. There are definitely questions you should ask yourself before deciding that a survey is the right collection method, but I’ll save those for another post.

For now, let’s assume you are dead set on surveying and you just don’t want to end up like this guy:

[Animated gif, courtesy of GifBin.com]

Here are five questions to ask yourself before you start designing and distributing your survey:

What information do I absolutely need to collect? Consider whether you already have access to accurate information on students, like major, department, and graduation date, before asking those questions in your survey. If you do, you can ask for a student ID and match up the two sets of information. Many online survey platforms also allow you to upload a list of survey recipients and send each one a customized hyperlink, so you don’t need to collect name and contact information. When we survey, we rarely ask for school, major, or grad date because that information is regularly updated via our career services management system and/or registrar records. Two or three fewer questions, now that’s exciting.
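
As one illustration of that matching step, here is a minimal sketch in Python with pandas that joins survey responses to registrar data on student ID; the file and column names are hypothetical.

```python
# Sketch: match survey responses to records you already have, so the
# survey can skip questions like major and graduation date.
# File and column names are hypothetical.
import pandas as pd

responses = pd.read_csv("event_survey_responses.csv")  # includes "student_id"
registrar = pd.read_csv("registrar_extract.csv")       # student_id, major, grad_date

merged = responses.merge(registrar, on="student_id", how="left")
print(merged.groupby("major")["satisfaction_rating"].mean())
```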

What is your population? When you review your results or write your report, what is the group that you are trying to describe? Will it be students who attended a resume seminar (more specifically: a resume seminar on December 13, or any resume seminar throughout the year)? Is it all juniors, or only juniors who have completed summer internships? Having a clear understanding of your population will help you answer the next question, which is:

How many responses do I need? Depending on your survey method, budget, and population size, you may not get responses from everyone. That is OK: statistics allows you to describe your population without having data from everyone. This chart is really helpful: find the approximate size of your population in the far left column, then find the corresponding number of responses necessary to describe that population. For example, if you are trying to describe a population of 25,000 undergraduate students, you may only need between 700 and 10,000 responses, depending on how much certainty and precision you need. You should also be sure that there is no systematic difference between the people who did and did not respond to your survey. For example, if all of your responses came from people who attended a particular event, your results may be skewed, as these people may differ from the total population. Finally, do some benchmarking and check past reports to get an idea of what response rate is considered reasonable. In the example above, a 40 percent response rate (10,000/25,000) may be acceptable for a student satisfaction survey but not for your annual first-destination survey.
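
If you don’t have the chart handy, the numbers behind it come from a standard sample-size calculation. Here is a minimal sketch of Cochran’s formula with a finite population correction (my own illustration; the chart referenced above likely uses different confidence levels, so its numbers will not match exactly).

```python
# Sketch: required sample size via Cochran's formula with a finite
# population correction, at 95% confidence and a conservative p = 0.5.
import math

def required_sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Example: a population of 25,000 undergraduates.
for e in (0.05, 0.03, 0.01):
    print(f"±{e:.0%} margin of error -> {required_sample_size(25000, e)} responses")
# Roughly 379, 1,024, and 6,939 responses, respectively.
```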

How will I collect this information? Websites like SurveyMonkey offer free accounts, and many institutions have licenses for software such as Qualtrics (my platform of choice). Of course, there is always the old-fashioned paper-and-pencil method, which is still a very effective way to collect information. Career services professionals may also check to see whether their existing career services management system offers surveying features (Symplicity’s NACElink system offers this as an add-on).

Will multiple methods be required to achieve the desired number of responses? Using one method of surveying may not be enough to achieve your target response rate or get the information you need. Consider using a combination of paper forms, online surveys, phone surveys, in-person interviews, and even online research. My fellow NACE guest blogger, Kevin Grubb, mentioned that the new NACE position statement on first-destination surveys will use the term “knowledge rate” instead of “response rate,” since we often gather information about our students’ career outcomes from faculty, employers, and even LinkedIn research.

What do you think? Add your thoughts in the comments section!

The Assessment Diaries: It’s Not Just Data

Desalina Allen

A post by NACE Guest Blogger, Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina


I have to admit that I’m pretty left-brained when it comes to my work. In fact, the thought of spending a quiet afternoon in front of Microsoft Excel, coffee in hand, warms my heart (did I mention that I love coffee?).


It’s for that reason that, when I first started learning about assessment, I often equated it with data collection, as I’m sure many others do as well. Don’t get me wrong, it’s important to know how many and what types of students are using your services. But, in addition to those metrics, it’s also valuable to think about demonstrating your office’s success using qualitative information. As J.K. Rowling said, “there’s always room for a story that can transport people to another place,” and who wouldn’t want advice from someone who lives in a house like this:

So what exactly is qualitative information? Basically, anything other than numerical data. It’s been on my mind because it seems that lately we have received quite a few requests for student success stories.  This isn’t surprising – stories supplement, support and strengthen the metrics we already share – and, unlike me, not everyone finds joy in looking at pie charts all day.


Here are some examples of ways you can collect and organize qualitative information and how these methods support your assessment objectives:

  • Focus Groups or Advisory Boards:  These two methods are great ways to better understand your students’ needs.  They function well if you’ve sent out a survey and want help explaining some of the findings or if you feel (like many of us do) that your students are suffering from survey fatigue and won’t respond to one more request.  Focus groups tend to be groups brought together one time around a specific topic whereas advisory boards could meet throughout the academic year.  In both cases, be thoughtful about who you invite to the table (Do you want students from a particular background or school? Is it open to everyone or might you want to conduct interviews first?).  You’ll also want to think critically about who should be facilitating.  Consider both staff members and unbiased professionals who are specially trained.  Either way, be sure to document the planning, take notes/transcribe, and be ready to plan follow-up actions based on what you learned.

  • Word Association Exercises (Pre and Post):  Have students write down or share words they associate with a particular topic before and after an event or presentation to help measure whether your core message came across.  For example, in a seminar on interviewing, students may start the session offering words like “scary” or “questioning” and end by sharing words like “preparation,” “practice,” or “conversation.”  Keep track of the terms shared and use an application like Wordle to look at the pre and post results side by side (see the sketch after this list for one simple way to tally the terms).

  • Observation:  You don’t need to bring in a team of consultants every time you need an external perspective.  Consider asking a trusted career services professional to attend your career fair, observe a workshop or review your employer services offerings and provide written feedback and suggestions. Offer your expertise on another topic to avoid paying a fee.  Keep notes on changes you have implemented based on the observation.

  • Benchmarking:  There are many reasons to benchmark.  For assessment purposes, knowing what other schools are doing and how they compare to you helps give others context.  Being able to say that your program is the first of its kind, or that it is modeled on an award-winning one developed by a colleague, may make more of an impact when combined with your standard student satisfaction survey results.

  • Staff:  We are all lucky enough to receive the occasional thank-you note or e-mail from a student who has really benefited from the programs and resources provided by the career center.  Come up with a standardized way to quickly track those students.  It could be something as easy as a document on a shared drive or even a flag in your student management system.  Be sure to ask students’ permission, saying something like, “I’m so happy to hear our mock interview meeting helped you land that internship!  We are always looking for students who are willing to share their positive experiences; would you be comfortable sharing this information in the future should we receive a request?”
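
For the word association exercise above, here is a minimal sketch of tallying pre and post terms side by side using only the Python standard library; the file names are hypothetical.

```python
# Sketch: compare pre/post word-association terms side by side.
# File names are hypothetical.
from collections import Counter

def term_counts(path):
    # One response per line or space-separated terms; lowercase everything.
    with open(path, encoding="utf-8") as f:
        return Counter(f.read().lower().split())

pre, post = term_counts("pre_terms.txt"), term_counts("post_terms.txt")
for term in sorted(set(pre) | set(post)):
    print(f"{term:15} pre={pre[term]:3} post={post[term]:3}")
```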

I’m sure there are many more ways to collect this type of information – please leave your questions and share your own experiences below!