I’ve just read a very interesting report called Stop and Measure the Roses by Aminder Nijjar. This report looks at how higher education careers services measure their effectiveness and success. I had lunch with Aminder the other day and we sat and set the careers world to rights for a couple of hours, so I was ready and primed to read her report. So what, apart from a hilarious title, does this report have to offer us?
‘Impact’ is the buzzword for today’s blog and it is this that Stop and Measure the Roses is concerned with. As Robert Partridge (University of York, Careers Service) says in the report, “How do I know my service is doing a good job? And what constitutes a ‘good job’?” For those of us who work in the public sector and do something ‘soft’ like supporting learning, advising on careers or generally helping people, how can we prove that we are worth something? In my dark hours I like to run little thought experiments where I close down all of the universities or ship all of the country’s careers advisors off to an island. What would happen? How long would it be before anyone… noticed? …cared? …or before the fabric of civilisation fell apart and left the denizens of the private sector looking over a smoking ruin? Run the experiments yourself and let me know the answers you arrive at.
If we make widgets, we can measure our productivity by how many widgets we make. If we offer careers advice somehow it doesn’t seem enough to measure us by how much advice we’ve given. Has any of it been followed? Has anyone ended up happier or more productive as a result? Do they feel that we have helped them? Will they even remember our names? These are some alternative measures of impact, but what Stop and Measure the Roses manages to do is to collect a wide variety of current practice together and to start to gesture towards some more sophisticated ways of evaluating what we do and measuring its impact.
Unfortunately, the current ways in which careers services are measured and valued are usually fairly crude. Typically university senior management ask careers services to justify what they do either in terms of volume of widgets (in this case usually number of students seen, companies attending careers fairs or something similar) or more usually in terms of the university’s showing in the Destinations of Leavers from Higher Education (DLHE) survey. For those of you who haven’t had the opportunity to be part of the DLHE it probably merits some description. The DLHE is an annual survey of graduates. It basically involves asking everyone who has graduated what they are doing now (six months or so later). Universities put a huge amount of effort and resource into making the DLHE happen. It undoubtedly provides lots of useful data, but its co-option as a key metric in the university league tables means that it moves from being a useful bit of analysis about higher education’s impact on the labour market to become a stick for VCs to beat careers services with. The response to this usually goes somewhere along the following lines:
The DLHE is crude. Small variations in results can have a big impact on placing in the league tables. Some of these variations can be explained by margin of error or by different ‘interpretations’ of how to code the same job.