Stop and measure the roses

I’ve just read a very interesting report called Stop and Measure the Roses by Aminder Nijjar. This report looks at how higher education careers services measure their effectiveness and success. I had lunch with Aminder the other day and we sat and set the careers world to rights for a couple of hours, so I was ready and primed to read her report. So what, apart from a hilarious title, does this report have to offer us?

‘Impact’ is the buzzword for today’s blog and it is this that Stop and Measure the Roses is concerned with. As Robert Partridge (University of York, Careers Service) asks in the report, “How do I know my service is doing a good job? And what constitutes a ‘good job’?” For those of us who work in the public sector and do something ‘soft’ like supporting learning, advising on careers or generally helping people, how can we prove that we are worth something? In my dark hours I like to run little thought experiments where I close down all of the universities or ship all of the country’s careers advisors off to an island. What would happen? How long would it be before anyone… noticed? …cared? …or before the fabric of civilisation fell apart and left the denizens of the private sector looking over a smoking ruin? Run the experiments yourself and let me know the answers you arrive at.

If we make widgets, we can measure our productivity by how many widgets we make. If we offer careers advice, somehow it doesn’t seem enough to measure us by how much advice we’ve given. Has any of it been followed? Has anyone ended up happier or more productive as a result? Do they feel that we have helped them? Will they even remember our names? These are some alternative measures of impact, but what Stop and Measure the Roses manages to do is to collect a wide variety of current practice together and to start to gesture towards some more sophisticated ways of evaluating what we do and measuring its impact.

Unfortunately, the current ways in which careers services are measured and valued are usually fairly crude. Typically, university senior management ask careers services to justify what they do either in terms of volume of widgets (in this case usually the number of students seen, companies attending careers fairs or something similar) or, more usually, in terms of the university’s showing in the Destinations of Leavers from Higher Education (DLHE) survey. For those of you who haven’t had the opportunity to be part of the DLHE, it probably merits some description. The DLHE is an annual survey of graduates. It basically involves asking everyone who has graduated what they are doing now (six months or so later). Universities put a huge amount of effort and resource into making the DLHE happen. It undoubtedly provides lots of useful data, but its co-option as a key metric in the university league tables means that it moves from being a useful bit of analysis about higher education’s impact on the labour market to becoming a stick for VCs to beat careers services with. The response to this usually goes somewhere along the following lines:

  • The DLHE is crude. Small variations in results can have a big impact on placing in the league tables. Some of these variations can be explained by margin of error or by different ‘interpretations’ of how to code the same job.
  • The DLHE is limited. It only tells us the first destination. This isn’t really a good measure of people’s careers. It may be a better measure of where they live or where the university is based, e.g. if your university is in London it is a lot easier for people to walk out of their final exam and get a job that is, broadly speaking, a graduate job.
  • The DLHE is not (only) the careers service’s problem. On its own, a higher education careers service has a pretty minimal chance of having an impact on the DLHE. If academic departments ignore the issues and simply direct the odd student to the careers service, the battle is lost. Having an impact on the DLHE requires a whole-institution response.
  • The DLHE is prescriptive. Or at least the way it is interpreted in the league tables is. The league tables make some very clear assumptions about what the ideal career of a graduate should be. We might not all be happy to sign up to these assumptions.

Don’t get me wrong, I love the DLHE. I think that having accurate labour market data is essential for careers work and I think that it is a legitimate question for students to ask “what did people who graduated from this institution go on to do?” However, its transformation into a metric of success is at least a little problematic and in need of some qualification. This is essentially what this report is trying to do.

If you want a radical thought, I’d ask: do we need to do the DLHE every year? Would it be more useful to redirect the resources that currently go into the DLHE to some of the other impact measures that are listed in this report? A biennial DLHE might free people up to provide a more rounded picture of the work of careers services.

Other impact measures that services are currently using include:

  • User satisfaction
  • Income generation
  • Reputation/press coverage
  • Comparison to other services or standards such as MATRIX
  • Level of engagement in/partnership with academic schools
  • Alternative surveys that measure other things e.g. international students

These are all fine, but to me at least they suffer from the same flaws as the DLHE. They miss out some of the key parts of what careers education thinks it is about, i.e. the engagement of participants in personal reflection and a process of lifelong learning. I’d like to think more about how we capture these profound and personal impacts as well as the instrumental and external ones. To be fair, the report also notes that there is lots of room for us to come up with new and different ways of measuring impact. Aminder also includes a very useful appendix detailing some of the comments and ideas about appropriate impact measures that were given by the careers services that she spoke to.

The report is rich with ideas that I’ve only touched upon here and is well worth a read. But any further thoughts on impact measures would be much appreciated.