The Assessment Diaries: Quick and Qualitative

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) and/or require staff buy-in, training, and socialization (like the Resume Rubrics). Just last week, I decided super last minute, as in 20 minutes before the event, that I wanted to assess a networking presentation for international students. This exercise is proof that assessment doesn't have to take hours and hours of your time; sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.

I should invoke my earlier reminder that I promised to be honest when sharing my experiences with assessment, and this post is no different. I'd like to say I was happy with these results, but instead I was disappointed to find that I had probably assessed the wrong learning goal. My starting point was that I wanted students to gain a more nuanced understanding of networking. Here's what I did:

Twenty minutes before the presentation I grabbed some colorful paper: yellow would be used for my pre-assessment and pink for the post-assessment. This color choice was not based on any carefully planned, research-supported theory that bright paper makes people happy; I simply wanted to make sure I could keep the two "surveys" separate.

At the beginning of the event, I asked the students to spend two minutes writing about networking. It could have been their definition of networking or just words that came to mind; grammar and complete sentences were not necessary. I then repeated the same exercise at the end of the event.

I could have just looked through and summarized key trends from each sample, but I decided to get fancy, transcribe the text, and enter it into Wordle, a tool that generates word clouds.
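(If you'd rather script this step than paste text into a website, here's a quick sketch using the open-source Python wordcloud package; that's an assumption on my part, not the tool I used, and the sample responses below are made up.)

```python
# A minimal sketch, assuming the open-source "wordcloud" package
# (pip install wordcloud). The sample responses are invented.
from wordcloud import WordCloud

pre_text = "networking business cards talking jobs connections meeting people"
post_text = "relationships people genuine relationships follow up conversations"

for label, text in [("pre", pre_text), ("post", post_text)]:
    # Words that appear more often in the responses are drawn larger.
    cloud = WordCloud(width=800, height=400, background_color="white").generate(text)
    cloud.to_file(label + "_workshop_wordcloud.png")
```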

Here's the pre-workshop Wordle:

[Pre-workshop word cloud image]

And the Post:

[Post-workshop word cloud image]

While the results show that I focused on the importance of relationships, I don't think I can claim that students gained a more in-depth understanding of networking. What I did learn is that students already seemed to have a handle on the definition of networking, so perhaps I should have assessed their comfort level with actually doing it!

While this wasn't my most successful assessment attempt, I do think the technique can be great when you are trying to compare students' knowledge of harder-to-assess topics (think professionalism, diversity, self-awareness).

Would you try it?

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: Rubric Roundup

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I recently wrote about the problem with asking students to assess their own learning. In a nutshell: studies show we are not able to accurately measure our own learning, and we even tend to overestimate what we have learned.

This concept can definitely be applied to resumes. Even the best resume presentation or one-on-one review isn't always enough to really teach students what makes an excellent resume versus a merely OK one. We already know this anecdotally: when students come back for two, three, or four reviews and still haven't mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot use students' self-reported learning after these events or meetings as evidence that they really learned.

As career services professionals, we could critique resumes in our sleep. I know I've easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes, it makes more sense for us, as the experts, to evaluate them.

Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation criteria transparent to students and can be a useful tool in training new staff members.

When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools.  I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback.  Several revisions later, we had a document that made explicit what we look for in a resume.  More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.
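(If it helps to see the idea in miniature, here's a hypothetical slice of what those level definitions might look like written out as a simple data structure; these criteria and descriptions are invented for illustration and are not our actual NYU rubric.)

```python
# Hypothetical rubric excerpt: each criterion maps performance levels
# to a short description of what that level looks like on the page.
rubric = {
    "bullet points": {
        "excellent": "Start with strong action verbs and quantify results",
        "good": "Use action verbs, but results are rarely quantified",
        "needs improvement": "List job duties with no verbs or outcomes",
    },
    "formatting": {
        "excellent": "Consistent fonts, spacing, and dates; fits one page",
        "good": "Mostly consistent, with a few spacing or date slips",
        "needs improvement": "Inconsistent layout that distracts the reader",
    },
}
```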

Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.).  If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop and then encourage them to come back with changes and evaluate it again.  Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
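(If you record the ratings somewhere structured, those questions become easy to answer. Here's a minimal sketch in Python; the 1-to-3 scale, criteria, and scores are all hypothetical.)

```python
# Hypothetical ratings on a 1 (needs improvement) to 3 (excellent) scale,
# recorded for one student before and after a resume workshop.
pre_scores = {"bullet points": 1, "formatting": 2, "tailoring": 1}
post_scores = {"bullet points": 3, "formatting": 3, "tailoring": 1}

# Which topics were still difficult to grasp after the workshop?
stuck = [topic for topic, score in post_scores.items() if score <= pre_scores[topic]]
print("Improved:", [t for t in pre_scores if t not in stuck])
print("Still difficult:", stuck)
```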

Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU.  Do you use rubrics at your institution?  If so, please share them in the comments section!

Examples:
NACE (Resume), ReadWriteThink (Resume and Cover Letter), Amherst Career Center (Resume), Illinois State (Resume), Liberty University (Online Resume)

NYU Wasserman:
[NYU Wasserman resume rubric attachment]

Don’t miss “The Mystery of the Resume Writing Assessment” Part 1 and Part 2.

Read more of Desalina Allen’s blogs on assessment!