You’ll find this blog by Bless Vaidian—and more from the NACE Blog Team—at http://community.naceweb.org/blogs/bless-vaidian/2017/10/09/data-collection-toward-a-100-percent-knowledge-rate
Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) and/or require staff buy-in, training and socialization (like the Resume Rubrics). Just last week, I decided super last minute that I wanted to assess a networking presentation for international students…last minute, as in, 20 minutes before the event. This exercise is proof that assessment doesn’t have to take hours and hours of your time—sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.
I need to invoke my earlier promise to be honest when sharing my experiences with assessment, and this post is no different. I’d like to say I was happy with these results, but instead I was disappointed to find that I had probably assessed the wrong learning goal. I started with the fact that I wanted students to gain a more nuanced understanding of networking. Here’s what I did:
Twenty minutes before the presentation I grabbed some colorful paper—yellow would be used for my pre-assessment and pink for the post assessment. This color choice was not at all based on any carefully planned and research-supported theory that bright paper makes people happy; in fact, I did it to make sure I could keep the two “surveys” separate.
At the beginning of the event, I asked the students to spend two minutes writing about networking. It could have been their definition of networking or just words that come to mind; grammar and complete sentences not necessary. I then did the same thing at the end of the event.
I could have just looked through and summarized key trends from each sample, but I decided to get fancy, transcribe the text, and enter it into Wordle, a tool that generates word clouds.
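Wordle does the weighting for you, but if you want the underlying counts, the same pre/post comparison can be tallied in a few lines. Here is a minimal sketch in Python using made-up student responses (not our actual data):

```python
from collections import Counter
import re

def word_freq(responses):
    """Tally word frequencies across free-text responses (lowercased, punctuation stripped)."""
    words = re.findall(r"[a-z']+", " ".join(responses).lower())
    stop = {"the", "a", "an", "of", "to", "and", "or", "is", "it"}  # minimal stop list
    return Counter(w for w in words if w not in stop)

# Hypothetical transcribed responses
pre = ["Networking is meeting people to find a job",
       "Talking to people, connections"]
post = ["Building relationships over time",
        "Relationships, follow up, mutual benefit"]

print(word_freq(pre).most_common(3))
print(word_freq(post).most_common(3))
```

The `most_common` counts are exactly what a word cloud visualizes by size, so you can sanity-check the cloud or skip the graphic entirely.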
Here’s the Pre-Workshop Wordle
And the Post:
While the results show that I focused on the importance of relationships, I don’t think I can claim that students gained a more in-depth understanding of networking. What I did learn is that it seems like students already had a handle on the definition of networking, so perhaps I needed to assess their comfort level actually knowing how to network!
While this wasn’t the most successful assessment attempt, I do think this technique can be great when you are trying to compare students’ knowledge of topics that are more difficult to assess (think professionalism, diversity, self-awareness).
Would you try it?
One of the latest trends in career services is the establishment of a career or professional development class embedded in the curriculum. Courses may be required or optional, credit bearing or not. With the importance of career outcomes rising for colleges and universities, these courses are one possible way to provide career education to all students.
NACE blog readers, is “career” in your institution’s curriculum? Share your answer in this poll and tell us about your career course in a comment. What do you teach and how do you do it?
For more information on this topic, check out NACE’s Career Course Syllabi.
I recently wrote about the problem with asking students to assess their own learning. In a nutshell—studies show we are not able to accurately measure our own learning and that we even tend to overestimate what we have learned.
This concept can definitely be applied to resumes. Even the best resume presentation or one-on-one review isn’t always enough to really teach students what makes an excellent vs. just an OK resume. We already know this anecdotally: when students come back for two, three, or four reviews and haven’t yet mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot use students’ self-reported learning after these events or meetings as evidence that they really learned.
As career services professionals, we could critique resumes in our sleep. I know I’ve easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes it makes more sense for us as experts to evaluate them.
Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation criteria more transparent to students and can be a useful tool in training new staff members.
When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools. I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback. Several revisions later, we had a document that made explicit what we look for in a resume. More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.
Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.). If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop and then encourage them to come back with changes and evaluate it again. Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
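If your rubric scores live in a spreadsheet, aggregating them takes only a few lines. Here is a minimal sketch assuming a hypothetical three-criterion rubric scored 1 (needs improvement) to 3 (excellent); the criteria and numbers are illustrative, not our actual rubric:

```python
from statistics import mean

# Hypothetical rubric scores: one dict per reviewed resume,
# scored 1 (needs improvement) to 3 (excellent) on each criterion.
reviews = [
    {"format": 2, "content": 1, "customization": 1},
    {"format": 3, "content": 2, "customization": 1},
    {"format": 3, "content": 3, "customization": 2},
]

# Average score per criterion across all reviews
averages = {c: mean(r[c] for r in reviews) for c in reviews[0]}
print(averages)
```

The criteria with the lowest averages flag the topics worth extra time in your next workshop; the same calculation on pre- and post-intervention scores shows whether a student improved.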
Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU. Do you use rubrics at your institution? If so, please share them in the comments section!
When we last left off, you were shocked that the survey results from your resume writing seminar could have been so misleading. Students reported that they had learned the basics of resume writing but, when you followed up with an in-person meeting with one of your attendees, it was obvious that the tips and guidelines you provided had not been applied.
Have you ever created or taken a survey with a question or questions like the ones below?
This seminar improved my understanding of resume writing basics:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree
I learned something from this seminar:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree
As a result of this seminar, I now understand what employers look for in a resume:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree
What is the problem here? Well, if you are simply looking for evidence that students believe they have learned something from your event, there is no problem at all. But if you are trying to collect evidence that students actually learned something, well then…
Why? Because studies show* that students are not able to accurately measure their own growth or learning. Not only do they incorrectly estimate growth, they tend to overestimate it. It makes sense, right? If someone asks you after a presentation or a class if you learned something, how do you really know if you did?
As a result, we cannot use students’ self-reported growth as evidence of growth. Instead, we have to use other assessment methods to really prove they learned something. How? By doing a pre- and post-assessment of student knowledge (like I did for our etiquette dinner) and comparing results, or by coming up with a standardized way to evaluate resumes (via a rubric) and looking at the change over time.
Last year, one of our learning goals was to ensure that students were learning career-related skills like resume writing. We did away with our post-seminar surveys and instead created resume rubrics to use with students. I’ll be sharing that experience in my next few posts, along with helpful resources if your office is looking to create its own resume rubrics!
*Thank you to Sonia DeLuca Fernandez, our Director of Research and Assessment for Student Affairs, for pointing me to this article, which can be found in Research and Practice in Assessment, Volume 8.
In career services, most of us are used to facilitating workshops that teach our students or alumni skills. The topic could be leadership, networking, career research, or social media and the job search. Oftentimes, after these events we send out surveys to determine just how much students learned. We ask if students feel more comfortable with the topic and understand some of the key takeaways. We may even throw in a satisfaction question or two.
Today, I want you to imagine that you’re getting ready to facilitate one of those workshops and the topic is: Resume writing! Don’t get too excited….
You know how when you start a presentation, especially one you’ve given often, you almost immediately get a sense of how the audience will respond? Sometimes you walk in and students are just waiting for you with that expression on their faces that tells you that even if Eddie Murphy were giving this presentation, they might sleep through the entire thing?
Well, on this day you experience the exact opposite. Students are eager, smiling, even awake. They raise their hands when you ask for input and they actually laugh at your pathetic resume jokes (that you’ve managed to add just to keep yourself interested). You talk about clarity of format, keeping it to a page, and customizing it for each position, and you look around only to see heads nodding vigorously.
After the presentation you review the post event surveys. Students are giving you high marks across the board: they now understand resume basics, they feel they can apply these concepts to their own resumes, they even write comments about how great of a presenter you are.
That night, you check your e-mail and find a very sweet request from one of the participants: she notes that she learned a lot from the presentation but wants to come in tomorrow for a quick resume review, just to make sure everything is OK before she applies to a position. You reply “Sure!” thinking to yourself, “this should take only 15 minutes.”
Fast forward to tomorrow. The student is seated in front of you. As she reaches into her backpack to pull out her resume, your view switches to slow motion. Suddenly, you catch a glimmer of light bouncing off of the object she’s taking out….
….is that a staple??
So, obviously this is a HUGE exaggeration (cue sarcastic snickers), but what went wrong here? Didn’t you talk about page length? Weren’t you clear about editing out non-relevant content? Surely you touched on including pictures. How could it be that after all of your hard work and intuition the student just didn’t get the point? What about all of your positive survey results? Could they have misled you?
Stay tuned for part 2 of The Mystery of the Resume Writing Assessment where I’ll discuss the post-event assessment. In the meantime…any guesses, comments, or thoughts on why this approach doesn’t always work? Leave them in the comments section below!
The results are in! I recently shared assessment plans for our Dining for Success etiquette dinner. We moved away from a satisfaction survey to a pre-dinner and post-dinner skills assessment for the first time and, as I shared in my previous post, I was a little nervous about the results. Here is what we found:
Section One: Understanding Formal Place Settings
Let’s face it. We could all use a refresher on how not to steal your future boss’ bread plate, and our students were no different. Before and after the event they were asked to identify each plate, cup, and piece of silverware in this photo:
Then, at the beginning of the event we had all utensils, plates, and glasses piled in the center of the table and asked each student to organize their place setting. We noticed a bit of uncertainty during the activity and our employer volunteers stepped in often to help, which tells us that students were not completely clear about formal place settings.
This experience conflicts with what we found via the assessment. We didn’t see much of a difference between pre and post results. In fact, most students correctly identified the items (with #6 dinner fork and #5 salad fork being confused just a few times). We did see a slight drop in the number of blank responses, which could be interpreted to mean that students felt more certain about formal place settings after the event.
Section Two: Appropriate vs. Inappropriate Table Topics
Students were asked to list three appropriate topics to discuss at mealtime interviews or networking events, as well as three topics to avoid. During the event, we provided employer volunteers with a list of suggestions and encouraged them to supplement it based on their experience.
On the pre and post surveys, students were instructed to leave questions blank if they did not know the answer. Comparing responses revealed a significant increase in the number of students who answered these questions after the event. We also noticed that a wider variety of more detailed topics were listed in the post surveys. For example, students most often listed “career,” “food,” and “hobbies” in the pre-dinner survey, while post-dinner survey responses included things like “the professional’s background,” “the industry,” “new projects,” and “current events.”
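The blank-response comparison is easy to automate once responses are transcribed. Here is a quick sketch with hypothetical data (not our actual survey results), where `None` marks a question left blank:

```python
# Hypothetical pre/post responses to "list an appropriate table topic";
# None means the student left the question blank.
pre  = ["career", None, "food", None, None, "hobbies"]
post = ["the industry", "current events", None, "new projects",
        "the professional's background", "hobbies"]

def answered_rate(responses):
    """Fraction of non-blank responses."""
    return sum(r is not None for r in responses) / len(responses)

print(f"pre:  {answered_rate(pre):.0%}")   # 50%
print(f"post: {answered_rate(post):.0%}")  # 83%
```

A jump in the answered rate between surveys is the same signal we read by hand: more students felt confident enough to attempt the question after the event.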
Section Three: Ordering Food
While guests were only offered two entrée options, employer volunteers were encouraged to share basic guidelines on how and what to order during sit-down dinners or interviews. Almost all of the pre-survey responses revolved around not ordering food that is too messy or difficult to eat. Post-survey results again provided more breadth and detail. Students mentioned avoiding “smelly” food, considering price, and following the lead of the interviewer/host. One student even suggested not ordering meat if your host is a vegetarian…discuss!
Section Four: Following Up
How should students follow up with an individual after a networking event or mealtime interview? Turns out, most students already understood the basics (insert career counselor sigh of relief here). On the pre-event survey, many students responded that you should send a follow-up thank-you via e-mail (or in some cases, USPS); however, after the event students included details like “within 24-48 hours” and mentioned LinkedIn for the first time.
What we learned
Overall, we were happy with the improvements we saw between the pre and post-event surveys. And, of course, we found that 97 percent of students were satisfied with the event! Here are a few key takeaways and thoughts regarding the survey for next year’s event:
The table setting question may not have accurately measured students’ level of comfort with formal dining before and after the event. The way the image was laid out may have been too simple. For future surveys, we are considering having students draw a diagram or place items around a plate to more accurately reflect our table setting activity.
Students understand the basics regarding discussion topics, ordering, and following up after events, but the activities and discussions gave them a broader, more anecdotal understanding of how to navigate mealtime events and interviews.
We will consider measuring different skills/content areas each year. Our event also included activities revolving around introducing yourself and handling sticky situations that were not assessed in the pre- or post-event surveys. It would be interesting to see how students’ understanding of these topics changed as a result of the event.