Data Collection Toward a 100 Percent Knowledge Rate

Bless Vaidian, Director, Career Counseling for Pace Career Services – Westchester, and Founder, Career Transitions Guide
Linkedin: https://www.linkedin.com/in/blessvaidian
Twitter: https://twitter.com/BlessCareers
Blog: http://careertransitionsguide.com

Is a 100 percent knowledge rate possible with a first-destination survey? That’s to be determined each year and with each effort. Due diligence requires universities to make every effort to achieve a 100 percent knowledge rate for all of their students. The task of collecting and reporting data is a huge undertaking entrusted to many career offices. Whether you are trying to meet the NACE deadline for data collection or your own office deadline, creating a systematic approach and incorporating “best practices” into your work makes capturing career outcomes more manageable.

Lay the Foundation

It’s essential to be able to analyze data with ease, as well as to know ahead of time what questions to include in your outreach to students. Follow the suggestions outlined by NACE for your database fields and match them to your first-destination survey. Bring in your school’s technology department to help create the database, as well as the electronic surveys that capture the responses fed into it. Once that’s done, you can draw up a timeline for when, where, and how you will collect data. Cap-and-gown surveys, employer surveys, surveys to the campus community, classroom visits, social media searches, follow-up student surveys, calls, and e-mails all have to be systematically laid out on that timeline. Learn assessment best practices by attending conferences and events to see how others are capturing information. Make sure you use the NACE resources on the topic, and talk to Ed Koc, NACE’s Director of Research, Public Policy, and Legislative Affairs, or his great team if you have questions. Koc is offering a webinar on the first-destination initiative in early January for NACE members. A solid foundation and plan of action will serve you well in the long run.
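If your technology department asks what that database might look like, here is a minimal sketch using Python’s built-in sqlite3 module. The field names are illustrative assumptions, not the official NACE standards; map them to your own survey and to the current NACE first-destination guidelines before building anything real.

import sqlite3

# Illustrative first-destination table; field names are assumptions, not
# the official NACE schema. Adjust to match your survey before real use.
conn = sqlite3.connect("first_destination.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS first_destination (
        student_id      TEXT PRIMARY KEY,
        graduation_term TEXT,   -- e.g. 'May 2015'
        college         TEXT,   -- school or college within the university
        major           TEXT,
        outcome         TEXT,   -- employed, continuing education, military,
                                -- service program, still seeking, etc.
        employer        TEXT,
        job_title       TEXT,
        salary          REAL,
        grad_school     TEXT,
        source          TEXT,   -- cap-and-gown survey, faculty report, employer, LinkedIn...
        date_collected  TEXT
    )
""")
conn.commit()
conn.close()

Recording the source and date of each response also makes it easy to calculate your knowledge rate at any point in the cycle and to see which outreach channels are actually producing data.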

Designate a Point Person

If the college community knows that career outcome information has to be sent to a designated individual within their school, then more outcomes can be captured. Often university staff members possess career outcome information and never pass it on to career services. The human resources and admissions departments within your school may have first-destination information on numerous students who were hired or who went on to graduate school at your institution. The designated point person should monitor the first-destination survey numbers, solicit information from university sources consistently, and create a strategy for follow-up with graduates. At bigger schools it takes many people, numerous efforts, and even call centers to capture the data. Still, designate one expert to manage the whole process, set the timeline, and be the “face” of the initiative in order to drive the results.

It’s Not a Career Services Issue, It’s a University Issue

Helping students find opportunities and creating a path for successful outcomes is not just a career services goal. Higher education is a partnership of many units working collaboratively to ensure retention and capture every student’s career outcome. Long before first-destination surveys go out, data collection for career services really starts with building relationships across the campus community. Meetings with the university community to build bridges, foster relationships, and outline the process are crucial. Students share career outcome information with professors, academic advisers, financial aid representatives, leaders of student organizations, and college staff. These sources become vital to the collection process and have to be included in the journey.

Keep the Community Vested

It is essential to make survey efforts and progress visible to the campus community. Every dean, faculty member, and university staff member should know what the career office does. Career outcome and knowledge rate information should be shared with college partners on a regular basis in infographics, charts, and reports. If others understand what goes on behind the scenes and where the numbers stand, they will be more apt to assist with first-destination information. It also keeps departments interested and looking forward to the next update.

Mandate Attendance 

Universities that promote, encourage, or even mandate attendance at career services events and one-on-one meetings with a career counselor can create more successful outcomes. Students who have worked with career offices feel more comfortable sharing career outcomes, and they should be told that post-graduate follow-up will take place after graduation. Career services also helps students find pre-professional experience through internships that build resumes and lead to full-time offers, and it offers networking opportunities every semester with employers and alumni who have job leads. Increased student engagement with career centers increases the knowledge rate, and it also increases outcomes. It’s a simple formula.

Multiple outreach efforts to graduating seniors, college partners, and employers are made throughout the year to track career outcomes. I would love to hear your school’s best practices and ideas for reaching that “100 percent knowledge rate.” Wishing each of you success in reaching your university’s goal and capturing outcomes.

The Assessment Diaries: Quick and Qualitative

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

Some of the assessment activities I have shared take time to develop (like the Pre/Post Dining Etiquette Survey) and/or require staff buy-in, training and socialization (like the Resume Rubrics).  Just last week, I decided super last minute that I wanted to assess a networking presentation for international students…last minute, as in, 20 minutes before the event. This exercise is proof that assessment doesn’t have to take hours and hours of your time—sometimes a quick pre/post writing exercise can give you insight into what needs to be changed about a program.

I promised earlier to be honest when sharing my experiences with assessment, and this post is no different. I’d like to say I was happy with these results, but instead I was disappointed to find that I probably assessed the wrong learning goal. My starting point was that I wanted students to gain a more nuanced understanding of networking. Here’s what I did:

Twenty minutes before the presentation I grabbed some colorful paper—yellow would be used for my pre-assessment and pink for the post assessment. This color choice was not at all based on any carefully planned and research-supported theory that bright paper makes people happy; in fact, I did it to make sure I could keep the two “surveys” separate.

At the beginning of the event, I asked the students to spend two minutes writing about networking. It could have been their definition of networking or just words that came to mind; grammar and complete sentences were not necessary. I then did the same thing at the end of the event.

I could have just looked through and summarized key trends from each sample, but I decided to get fancy, transcribe the text, and enter it into Wordle, a tool that generates word clouds.
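If you would rather script that step than paste text into a website, here is a minimal sketch in Python, assuming the transcribed responses are saved as plain-text files named pre_responses.txt and post_responses.txt. It uses the open-source wordcloud package (pip install wordcloud) rather than Wordle itself.

from wordcloud import WordCloud

for label in ("pre", "post"):
    # Read the transcribed free-writing responses for this survey.
    with open(f"{label}_responses.txt", encoding="utf-8") as f:
        text = f.read()
    # One cloud per survey so the two can be compared side by side.
    cloud = WordCloud(width=800, height=400, background_color="white").generate(text)
    cloud.to_file(f"{label}_wordcloud.png")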

Here’s the Pre-Workshop Wordle:

[pre-workshop word cloud image]

And the Post:

[post-workshop word cloud image]

While the results show that I focused on the importance of relationships, I don’t think I can claim that students gained a more in-depth understanding of networking. What I did learn is that students already seemed to have a handle on the definition of networking, so perhaps I needed to assess how comfortable they felt actually doing it!

While this wasn’t the most successful assessment attempt, I do think this approach can be great when you are trying to compare students’ knowledge of harder-to-assess topics (think professionalism, diversity, self-awareness).

Would you try it?

Read more of Desalina Allen’s blogs on assessment!

NACE Flash Poll: Is “Career” in Your Institution’s Curriculum?

NACE Ambassador Kevin Grubb
Assistant Director at Villanova University’s Career Center.
Twitter: @kevincgrubb
LinkedIn: http://www.linkedin.com/in/kevingrubb
Blog: “social @ edu”

One of the latest trends in career services is the establishment of a career or professional development class embedded in the curriculum. Courses may be required or optional, and credit bearing or not. With the importance of career outcomes rising for colleges and universities, such a course is one possible way to provide career education to all students.

NACE blog readers, is “career” in your institution’s curriculum? Share your answer in this poll and tell us about your career course in a comment. What do you teach and how do you do it?

For more information on this topic, check out NACE’s Career Course Syllabi.

The Assessment Diaries: Rubric Roundup

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

I recently wrote about the problem with asking students to assess their own learning. In a nutshell—studies show we are not able to accurately measure our own learning and that we even tend to overestimate what we have learned.

This concept can definitely be applied to resumes. Even the best resume presentation or one-on-one review isn’t always enough to really teach students what makes an excellent resume versus a merely OK one. We already know this anecdotally: when students come back for two, three, or four reviews and haven’t yet mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot use students’ self-reported learning after these events or meetings as evidence that they really learned.

As career services professionals, we could critique resumes in our sleep.  I know I’ve easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes it makes more sense for us as experts to evaluate them.

Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation criteria more transparent to students and can be a useful tool in training new staff members.

When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools.  I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback.  Several revisions later, we had a document that made explicit what we look for in a resume.  More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.

Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.).  If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop and then encourage them to come back with changes and evaluate it again.  Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
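If you do track scores over time, a small pandas sketch like the one below can turn raw rubric scores into a before-and-after report. The column names (student_id, class_year, criterion, score, review) are hypothetical, not the actual NYU Wasserman fields; adjust them to your own rubric.

import pandas as pd

# Expected columns: student_id, class_year, criterion, score, review ("pre" or "post")
scores = pd.read_csv("resume_rubric_scores.csv")

# Average score per rubric criterion, before vs. after the intervention.
by_criterion = scores.pivot_table(
    index="criterion", columns="review", values="score", aggfunc="mean"
)
by_criterion["change"] = by_criterion["post"] - by_criterion["pre"]
print(by_criterion.sort_values("change"))

# The same comparison broken out by class year.
print(scores.pivot_table(index="class_year", columns="review",
                         values="score", aggfunc="mean"))

The criteria with the smallest gains are the ones worth more time in the next workshop.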

Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU.  Do you use rubrics at your institution?  If so, please share them in the comments section!

Examples:
NACE (Resume), ReadWriteThink (Resume and Cover Letter), Amherst Career Center (Resume), Illinois State (Resume), Liberty University (Online Resume)

NYU Wasserman:

Don’t miss “The Mystery of the Resume Writing Assessment” Part 1 and Part 2.

Read more of Desalina Allen’s blogs on assessment!

The Assessment Diaries: The Mystery of the Resume Writing Assessment (Part 2)

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

When we last left off, you were shocked that your post-seminar survey results on resume writing could have been so misleading. Students reported having learned the basics of resume writing but, when you followed up with an in-person meeting with one of your attendees, it was obvious that the tips and guidelines you provided had not been applied.

Have you ever created or taken a survey with a question or questions like the ones below?

This seminar improved my understanding of resume writing basics:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

I learned something from this seminar:
True/False

As a result of this seminar, I now understand what employers look for in a resume:
Strongly Disagree/Disagree/Neutral/Agree/Strongly Agree

What is the problem here? Well, if you are simply looking for evidence that students believe they have learned something from your event, there is no problem at all. But if you are trying to collect evidence that students actually learned something, well then…

Why? Because studies show* that students are not able to accurately measure their own growth or learning.  Not only do they incorrectly estimate growth, they tend to overestimate it.  It makes sense, right? If someone asks you after a presentation or a class if you learned something, how do you really know if you did?  

As a result, we cannot use students’ self-reported growth as evidence of growth. Instead, we have to use other assessment methods to really show they learned something. How? By doing a pre- and post-assessment of student knowledge (like I did for our etiquette dinner) and comparing results, or by coming up with a standardized way to evaluate resumes (via a rubric) and looking at the change over time.

Last year, one of our learning goals was to ensure that students were learning career-related skills like resume writing. We did away with our post-seminar surveys and instead created resume rubrics to use with students. I’ll be sharing that experience in my next few posts, along with helpful resources if your office is looking to create its own resume rubrics!
*Thank you to Sonia DeLuca Fernandez, our Director of Research and Assessment for Student Affairs, for sharing this article, which can be found in Research and Practice in Assessment, Volume 8.

The Assessment Diaries: The Mystery of the Resume Writing Assessment (Part 1)

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

In career services, most of us are used to facilitating workshops that teach our students or alumni skills. The topic could be leadership, networking, career research, or social media and the job search. Oftentimes, after these events we send out surveys to determine just how much students learned. We ask whether students feel more comfortable with the topic and understand some of the key takeaways. We may even throw in a satisfaction question or two.

Today,  I want you to imagine that you’re getting ready to facilitate one of those workshops and the topic is: Resume writing!  Don’t get too excited….

You know how when you start a presentation, especially one you’ve done often, you pretty immediately get a sense of how the audience will respond?  Sometimes you walk in and students are just waiting for you with that expression on their face that tells you even if Eddie Murphy were giving this presentation they might sleep through the entire thing?

Well, on this day you experience the exact opposite. Students are eager, smiling, even awake. They raise their hand when you ask for input and they actually laugh at your pathetic resume jokes (that you’ve managed to add just to keep yourself interested). You talk about clarity of format, keeping it to a page, customizing it for each position and you look around only to see heads nodding vigorously.

After the presentation you review the post-event surveys. Students are giving you high marks across the board: they now understand resume basics, they feel they can apply these concepts to their own resumes, and they even write comments about how great a presenter you are.

That night, you check your e-mail and you have a very sweet request from one of the participants: she notes that she learned a lot from the presentation but wants to come in tomorrow for a quick resume review, just to make sure everything is OK before she applies to a position. You reply “Sure!” thinking to yourself, “this should take only 15 minutes.”

Fast-forward to the next day. The student is seated in front of you. As she reaches into her backpack to pull out her resume, your view switches to slow motion. Suddenly, you catch a glimmer of light bouncing off the object she’s taking out….

…..wait

…what the

….is that

….is that a staple??  

So, obviously this is a HUGE exaggeration (cue sarcastic snickers), but what went wrong here? Didn’t you talk about page length? Weren’t you clear about editing out non-relevant content? Surely you touched on including pictures. How could it be that after all of your hard work and intuition the student just didn’t get the point?  What about all of your positive survey results? Could they have misled you?

Stay tuned for part 2 of The Mystery of the Resume Writing Assessment where I’ll discuss the post-event assessment.  In the meantime…any guesses, comments, or thoughts on why this approach doesn’t always work? Leave them in the comments section below!

The Assessment Diaries: Beyond Satisfaction Follow-Up

Desalina Allen

Desalina Allen, Senior Assistant Director at NYU Wasserman Center for Career Development
Twitter: @DesalinaAllen
LinkedIn: www.linkedin.com/in/desalina

The results are in!  I recently shared assessment plans for our Dining for Success etiquette dinner.  We moved away from a satisfaction survey to a pre-dinner and post-dinner skills assessment for the first time and, as I shared in my previous post, I was a little nervous about the results.  Here is what we found:

Section One:  Understanding Formal Place Settings

Let’s face it.  We could all use a refresher on how not to steal your future boss’ bread plate, and our students were no different.  Before and after the event they were asked to identify each plate, cup, and piece of silverware in this photo:

Then, at the beginning of the event we had all utensils, plates, and glasses piled in the center of the table and asked each student to organize their place setting. We noticed a bit of uncertainty during the activity and our employer volunteers stepped in often to help, which tells us that students were not completely clear about formal place settings.

This experience conflicts with what we found via the assessment. We didn’t see much of a difference between pre and post results. In fact, most students correctly identified the items (with #6 dinner fork and #5 salad fork being confused just a few times).  We did see a slight drop in the number of blank responses, which could be interpreted to mean that students felt more certain about formal place settings after the event.

Section Two:  Appropriate vs. Inappropriate Table Topics

Students were asked to list three appropriate topics to discuss at mealtime interviews or networking events, as well as three topics to avoid. During the event, we provided employer volunteers with a list of suggestions and encouraged them to supplement it based on their experience.

On the pre and post surveys, students were instructed to leave questions blank if they did not know the answer. Comparing responses revealed a significant increase in the number of students who answered these questions after the event.  We also noticed that a wider variety of more detailed topics were listed in the post surveys.  For example, students most often listed “career,” “food,” and “hobbies” in the pre-dinner survey, while post-dinner survey responses included things like “the professional’s background,” “the industry,” “new projects,” and “current events.”

Section Three: Ordering Food

While guests were only offered two entrée options, employer volunteers were encouraged to share basic guidelines on how and what to order during sit-down dinners or interviews. Almost all of the pre-survey responses revolved around not ordering food that is too messy or difficult to eat. Post-survey results again provided more breadth and detail. Students mentioned avoiding “smelly” food, considering price, and following the lead of the interviewer/host. One student even suggested not ordering meat if your host is a vegetarian…discuss!

Section Four: Following Up

How should students follow up with an individual after a networking event or mealtime interview? It turns out most students already understood the basics (insert career counselor sigh of relief here). On the pre-event survey, many students responded that you should send a follow-up thank-you via e-mail (or in some cases, USPS); after the event, however, students included details like “within 24-48 hours” and mentioned LinkedIn for the first time.

What we learned

Overall, we were happy with the improvements we saw between the pre- and post-event surveys. And, of course, we found that 97 percent of students were satisfied with the event! Here are a few key takeaways and thoughts regarding the survey for next year’s event:

  • The table setting question may not have accurately measured students’ level of comfort with formal dining before and after the event.  The way the image was laid out may have been too simple.  For future surveys, we are considering having students draw a diagram or place items around a plate to more accurately reflect our table setting activity.

  • Students understand the basics regarding discussion topics, ordering, and following up after events, but the activities and discussions gave them a broader, more anecdotal understanding of how to navigate mealtime events and interviews.

  • We will consider measuring different skills/content areas each year.  Our event also included activities revolving around introducing yourself and handling sticky situations that were not assessed in the pre- or post-event surveys.  It would be interesting to see how students’ understanding of these topics changed as a result of the event.