I recently wrote about the problem with asking students to assess their own learning. In a nutshell: studies show we are unable to accurately measure our own learning, and we even tend to overestimate what we have learned.
This concept definitely applies to resumes. Even the best resume presentation or one-on-one review isn’t always enough to teach students what separates an excellent resume from a merely OK one. We already know this anecdotally: when students come back for two, three, or four reviews and still haven’t mastered some of the basics, it demonstrates just how complex marketing yourself on paper can be. Thus, we cannot treat students’ self-reported learning after these events or meetings as evidence that they actually learned.
As career services professionals, we could critique resumes in our sleep. I know I’ve easily reviewed thousands of resumes in my five short years working in career development! For this reason, when we want an accurate understanding of how well our students are marketing themselves via their resumes, it makes more sense for us, as the experts, to do the evaluating.
Enter resume rubrics. Rubrics standardize how we define and measure something. They also make our evaluation criteria more transparent to students, and they can be a useful tool for training new staff members.
When I started developing a rubric for NYU Wasserman, I found it extremely helpful to look at examples from NACE and other schools. I then created a draft and brought it first to my assessment team and then to staff as a whole for feedback. Several revisions later, we had a document that made explicit what we look for in a resume. More specifically, we defined what makes an excellent resume vs. a good resume vs. one that needs improvement.
Once you have your rubric, you can track and report on how your students are doing as a whole (or by class year, major, etc.). If you have enough time and patience, you can also follow a student’s progress over time or after a specific resume intervention. For example, evaluate a student’s resume before a workshop, encourage them to come back with changes, and then evaluate the revised version. Did they improve? Which topics were still difficult to grasp? Might you need to spend more time addressing those during the workshop?
Below you will find some examples of resume rubrics that I have found helpful, as well as the rubric we use at NYU. Do you use rubrics at your institution? If so, please share them in the comments section!