The Assessment Diaries: Quick and Qualitative

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) and/or require staff buy-in, training and socialization (like the Resume Rubrics).  Just last week, I decided super last minute that I wanted to assess a networking presentation for international students…last minute, as in, 20 minutes before the event. This exercise is proof that assessment doesn’t have to take hours and hours of your time—sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.

I need to invoke my earlier promise to be honest when sharing my experiences with assessment, and this post is no different. I’d like to say I was happy with these results, but instead I was disappointed to find that I had probably assessed the wrong learning goal. I started from the goal that I wanted students to gain a more nuanced understanding of networking. Here’s what I did:

Twenty minutes before the presentation I grabbed some colorful paper—yellow would be used for my pre-assessment and pink for the post assessment. This color choice was not at all based on any carefully planned and research-supported theory that bright paper makes people happy; in fact, I did it to make sure I could keep the two “surveys” separate.

At the beginning of the event, I asked the students to spend two minutes writing about networking. It could have been their definition of networking or just words that came to mind; grammar and complete sentences were not necessary. I then did the same thing at the end of the event.

I could have just looked through and summarized key trends from each sample, but I decided to get fancy, transcribe the text, and enter it into Wordle, a tool that generates word clouds.
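
If you want to try the same step without Wordle, here is a minimal sketch of the idea in Python using the open-source wordcloud and matplotlib packages. The input file names are hypothetical stand-ins for the transcribed pre- and post-workshop responses, not files from the original exercise.

```python
# Sketch: build pre/post word clouds from transcribed responses.
# Assumes: pip install wordcloud matplotlib, and that the transcriptions
# have been saved to the (hypothetical) text files named below.
from wordcloud import WordCloud
import matplotlib.pyplot as plt

def cloud_from_text(text, title, outfile):
    """Generate and save a word cloud image from one batch of responses."""
    wc = WordCloud(width=800, height=400, background_color="white").generate(text)
    plt.figure(figsize=(10, 5))
    plt.imshow(wc, interpolation="bilinear")
    plt.axis("off")
    plt.title(title)
    plt.savefig(outfile, bbox_inches="tight")
    plt.close()

# Hypothetical transcriptions of the yellow (pre) and pink (post) sheets
pre_text = open("pre_responses.txt").read()
post_text = open("post_responses.txt").read()

cloud_from_text(pre_text, "Pre-workshop", "pre_wordcloud.png")
cloud_from_text(post_text, "Post-workshop", "post_wordcloud.png")
```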

Here’s the Pre-Workshop Wordle

[Pre-workshop word cloud image]

And the Post:

[Post-workshop word cloud image]

While the results show that I focused on the importance of relationships, I don’t think I can claim that students gained a more in-depth understanding of networking. What I did learn is that students seemed to already have a handle on the definition of networking, so perhaps I needed to assess how comfortable they felt actually doing it!

While this wasn’t my most successful assessment attempt, I do think the approach can be great when you are trying to compare students’ knowledge of harder-to-assess topics (think professionalism, diversity, self-awareness).

Would you try it?

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: Rubric Roundup

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I recently wrote about the problem with asking students to assess their own learning. In a nutshell—studies show we are not able to accurately measure our own learning and that we even tend to overestimate what we have learned.

This concept can definitely be applied to resumes. Even the best resume presentation or one-on-one review isn’t always enough to really teach students what makes an excellent resume versus a merely OK one. We already know this anecdotally: when students come back for two, three, or four reviews and still haven’t mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot use students’ self-reported learning after these events or meetings as evidence that they really learned.

As career services professionals, we could critique resumes in our sleep. I know I’ve easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes, it makes more sense for us, as the experts, to evaluate them.

Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation techniques more transparent to students and can be a useful tool in training new staff members.

When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools.  I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback.  Several revisions later, we had a document that made explicit what we look for in a resume.  More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.

Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.).  If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop and then encourage them to come back with changes and evaluate it again.  Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
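
If you record rubric scores in a spreadsheet, even a very small analysis script can answer those questions. Below is a hypothetical sketch using pandas; the column names, the 1-4 scale, and the sample rows are assumptions for illustration, not the actual NYU Wasserman rubric fields.

```python
# Sketch: summarize rubric scores overall, by class year, and pre/post a workshop.
import pandas as pd

# Hypothetical rubric scores (higher = stronger resume)
scores = pd.DataFrame([
    {"student_id": 1, "class_year": "Sophomore", "stage": "pre",  "overall_score": 2},
    {"student_id": 1, "class_year": "Sophomore", "stage": "post", "overall_score": 3},
    {"student_id": 2, "class_year": "Senior",    "stage": "pre",  "overall_score": 3},
    {"student_id": 2, "class_year": "Senior",    "stage": "post", "overall_score": 4},
])

# How are students doing as a whole, and by class year?
print(scores.groupby(["class_year", "stage"])["overall_score"].mean())

# Did individual students improve after the resume intervention?
change = scores.pivot(index="student_id", columns="stage", values="overall_score")
print((change["post"] - change["pre"]).describe())
```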

Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU.  Do you use rubrics at your institution?  If so, please share them in the comments section!

Examples:
NACE (Resume), ReadWriteThink (Resume and Cover Letter), Amherst Career Center (Resume), Illinois State (Resume), Liberty University (Online Resume)

NYU Wasserman:

Don’t miss “The Mystery of the Resume Writing Assessment” Part 1 and Part 2.

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: The Mystery of the Resume Writing Assessment (Part 2)

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

When we last left off, you were shocked that the survey results from your resume writing seminar could have been so misleading. Students reported that they had learned the basics of resume writing, but when you followed up with an in-person meeting with one of your attendees, it was obvious that the tips and guidelines you provided had not been applied.

Have you ever created or taken a survey with a question or questions like the ones below?

This seminar improved my understanding of resume writing basics:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

I learned something from this seminar:
True/False

As a result of this seminar, I now understand what employers look for in a resume:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

What is the problem here? Well, if you are simply looking for evidence that students believe they have learned something from your event, there is no problem at all. But if you are trying to collect evidence that students actually learned something, well then…

Why? Because studies show* that students are not able to accurately measure their own growth or learning.  Not only do they incorrectly estimate growth, they tend to overestimate it.  It makes sense, right? If someone asks you after a presentation or a class if you learned something, how do you really know if you did?  

As a result, we cannot use students’ self-reported growth as evidence of growth. Instead, we have to use other assessment methods to really prove they learned something. How? By doing a pre- and post-assessment of student knowledge (like I did for our etiquette dinner) and comparing results, or by coming up with a standardized way to evaluate resumes (via a rubric) and looking at the change over time.

Last year, one of our learning goals was to ensure that students were learning career-related skills like resume writing. We did away with our post-seminar surveys and instead created resume rubrics to use with students. I’ll be sharing that experience in my next few posts, along with helpful resources if your office is looking to create its own resume rubrics!

*Thank you to Sonia DeLuca Fernandez, our Director of Research and Assessment for Student Affairs, for this article that can be found in Research and Practice in Assessment, Volume 8.