Why people don’t use Web 2.0 – thoughts prompted by #cfbweb2

I attended an event (#cfbweb2) last week organised by the HEA subject centre in Bioscience. This may sound like a rather odd place for someone with only a C in GCSE Biology to hang out, but the event was designed to get bio-scientists involved in Web 2.0 and I was wondering whether there were any ideas that I could steal for the careers field.

 

One of the problems with Web 2.0 is that most people perceive it to be primarily a technical problem. For those who haven’t dabbled, a reasonable assumption is that it is some kind of difficult bit of software that you have to learn. As the term “Web 2.0” is so vague it probably scares off people who either think “it is all so complicated that I’ll never understand it” or “I haven’t got time to get into all that right now!”. However, this set of objections proceeds from a misunderstanding. Web 2.0 does not really pose any technical problems (all of the technology is pretty simple to use); what it does pose is a conceptual and social problem. What is Web 2.0 and who can I use it with?

 

Web 2.0 does not really introduce any dramatically new technologies to those who have seen websites and emails. What it does do is suggest very different ways of using that technology. It pushes us to think of our working lives as a conversation, to embed our actions in a social context and ultimately to recognise and embrace the performative aspects of what we say and do. Web 2.0 is a socialisation of work and learning and as such represents a huge conceptual leap for most people.

 

Before Web 2.0, you went on a course and learnt a new skill or bit of information. Now you are being encouraged to reflect on it, write a blog and share your learning. This has the potential to deepen your own learning but also to pass it on to others and then extend it as they comment and feedback. I’m not going to try and write a summary of the implications of Web 2.0 here, I’m just trying to make the point that it is essentially a social change rather than a technological one. We are being asked to act differently rather than just use different tools. We are being asked to change the way we lead our lives rather than (as most technology promises) to make our lives easier.

 

For those who are sold on Web 2.0 (like myself) this is a wonderful brave new world. It promises to be both more fun and more productive than the old unsocial way of doing things. It offers huge benefits for professional communities where the annual conference need no longer be the one point in the year that you talk to others like you. You can be in constant communication with your peers – ideas can be developed, practice shared and new developments chewed over and responded to. The many are undoubtedly more powerful than the few and a technology that enables us to tap into this should be enormously popular with bio-scientists, careers advisers or any other professional community.

 

However, for most people the world of Web 2.0 is a distant and strange one. As they move closer to it, it appears to be filled with banal chatter, blatant self-publicity and endless geek in-jokes (and that is just my Twitter stream). Events like #cfbweb2 are designed to get people past this – evidencing the value through a lived experience. However, it is incredibly difficult to manufacture these penny-drop moments. We get caught up in technology and tools and find it difficult to develop the social experience within the confines of a training day. People come away having tried the tools, but not really having experienced the value of Web 2.0.

 

So one explanation as to why most people aren’t convinced by Web 2.0 is that it is difficult to explain. However, this probably doesn’t capture the whole reason. Web 2.0 requires a dramatic shift in the way people conduct their working lives. They have to spend more time reflecting, communicating and performing in order to gain the benefit. They have to conceive of themselves in social rather than individualistic terms and trust that the community will advance their own interests. This is a lot of change for most people, and we know that change is difficult.

 

However, it is also worth thinking about who makes this change most easily. These are the people who are likely to be the nodes around which the Web 2.0 world develops. I’m developing a hypothesis that Web 2.0 activity is likely to correlate with high extraversion and openness (see OCEAN). This means that we might have to talk about it in very different ways for different personalities. Simply making a rational case for adoption is not likely to be enough. For different kinds of people, different Web 2.0 technologies might be the hook. For example, at #cfbweb2 there was a demo of CiteULike, which is a social citation tool. It seems likely to me that this will appeal to people with much lower extraversion than those who take to blogging. Is building Web 2.0 about finding the right gateway drug for a whole host of different people?

 

So the question is will Web 2.0 ever break through into the majority or is it destined to be confined to a noisy, geeky bunch of socially active, extravert early adopters?


My memory

My memory is bad. I know everyone says that, but it is. Actually, so is yours. Your memory is also bad; all of our memories are bad. We forget stuff that we once knew, we combine things together and we even make up new memories of things that never happened. Your memory is all you’ve got to verify that you are you and that the way you see the world is accurate, but it is unreliable. This poses a few problems for us as individuals, but it poses huge problems for us as researchers. How can we be sure that anything anyone tells us is true?

Why am I worrying about this? Well, I’ve been reading The Wisdom of Crowds (I’ll blog on it soon) in which Surowiecki talks about both the Challenger and Columbia Space Shuttle disasters. I realised that, despite the fact that they happened 17 years apart, I’d joined these two together into a single memory. I had a really strong memory of the Challenger disaster: I can remember talking about it at school, jokes in poor taste and endless news coverage. I couldn’t, however, remember the Columbia crash at all. I don’t know why. I watch the news, I’m vaguely interested in the space programme, so why would I forget this? I’ve worked out where I was, who I was with and what I was doing during the month it happened. Wikipedia has restored my knowledge of the event, but not my memory. Actually, I’ve even generated a memory, but I don’t think that it is right. I think that I’m misremembering. I’ve now spent so much time over the last 24 hours trying to remember this that I’ve got no faith at all in my current memory.

So am I losing my memory? Probably not; oral historians have been talking about memory for years. They’ve had to defend themselves against the ink-and-paper brigade who say “how can you trust memory, it isn’t written down?”. One answer to this is “how can you trust paper, it isn’t the actual event”, but that only takes us so far. Memory, oral historians would argue, is interesting because of its subjectivity rather than despite it. Memory reveals what people have found a cognitive way to preserve; their telling of their history is in dialogue with their present, it is influenced by those around them and it is often self-serving and biased. However, this is all data that is ripe for analysis. Memory isn’t the best way to get the text of a speech given by Winston Churchill or the number of left boots produced by the Soviet Union between 1970 and 1974, but it is a very good way to explore the social and political implications of those things: to examine how they were received and dealt with, and how they interact with our present.

Alessandro Portelli has written extensively about this (for example in The Order Has Been Carried Out). Memory is untrustworthy because it serves personal and political purposes; Portelli’s histories of Fascist Italy show how memories are mapped onto political and historical narratives. What society would like to marginalise is more difficult to remember; when our memories work with the dominant narrative they are stronger and more confidently expressed. This is not a reason not to use memories in research, but it is a reason to be careful about what you are told. I’ve found this myself when interviewing people about their PhDs: a common narrative is usually constructed in which the supervisor was either wonderful or entirely absent. The polarity of experience becomes greater with distance, but also with the retelling of the story, especially retelling in a context where others are telling very similar stories. Narrative tropes are swapped backwards and forwards, and “your story” is influenced both in terms of style and remembered facts. Memories are the stuff of which much social science research is built, but they are slippery things that we should approach with care.

If you want more of this sort of stuff have a look at the Oral History Reader.

And if you’ve got any ideas about why I forgot the Columbia crash, let me know…

Occupational versatility and planned procrastination

David Peck (iCeGS associate) and John Mariott (Ask iCeGS) have tracked down a copy of Alec Rodger’s inaugural lecture. The lecture was entitled ‘Occupational Versatility and Planned Procrastination’ and was delivered to Birkbeck College on the 16th October 1961.

 

It makes for fascinating reading on a whole host of levels. Rodger is often seen as the leading advocate of the talent-matching approach in career guidance. This approach is generally out of vogue, and so Rodger tends to get overlooked. I’m not an advocate of talent matching, but I still found a lot to interest me here. Most interesting is Rodger’s attempt to situate guidance within the context of economic planning. His starting point is not how we can help people to live happy, productive lives, but rather what the economy needs and how we can steer people in that direction. As he puts it:

 

The need for vigilance in the avoidance of waste will continue to be great, not only for the common good but also for the sake of the individuals who may be wasted.

 

Economy first, people second. This is not to say that Rodger was unfeeling or uninterested in individuals: the pen portraits he paints of the tribulations of those who have been poorly matched reveal a strong sympathy for those round pegs who have been jammed into square holes.

 

His grappling with the big picture of the labour market is particularly interesting because a key problem that he perceives is the coming maturity of the baby boom generation. As we now struggle with what to do as the boomers leave the labour market, it is extremely instructive to note that there was equal apprehension about them joining it.

 

This is just one of many areas where Rodger’s assessment of the issues in the labour market makes us aware that continuity is just as important as change in society. The following sentence in particular sends its echoes down the decades:

 

Whatever the Chancellor may manage to do through his bank rate manipulations, credit squeezes, pay pauses, hire purchase restrictions and the like, economic competition and technological change will soon make it necessary for us to build up as much occupational versatility as we can muster.

 

So economic uncertainty, global competition and technological change make it essential that individuals are equipped to switch careers and to retool their occupational identity and their skills. Sound familiar?

 

The fact that this rhetoric of continuous change and the need for an increasingly flexible workforce seems pretty unchanged after almost 50 years makes me at least a little sceptical about its empirical basis. Is the world of work really changing so much? If it needed to change in 1961 in roughly the same way as it needs to change now, why are we still introducing ideas about the need for occupational flexibility and mobility as if they were new?

 

The need for flexibility would superficially appear to present a major problem for the talent matching approach. If we can test someone for their aptitudes and then correlate these aptitudes with a job we surely have a problem if the job they were matched to is no longer needed by the labour market. What if you no longer match any of the opportunities?

 

Obviously the talent matchers never promised everyone a perfect match. Talent-matching approaches stress the importance of strong labour market knowledge so that guidance fits with what is actually available: best fit was what was on offer rather than a perfect match. However, Rodger recognises the weakness of this matching approach. Sometimes the round pegs won’t come out of the round holes they were first put into. His solution is to allow people to defer the decision (planned procrastination), giving them a broader training and a wider range of aptitudes that can then be used to match them to a variety of places in the labour market.

 

This suggestion that training should be broader is coupled with Rodger’s attack on the apprenticeship model of training, and makes me wonder whether the current vogue for vocationally focused training and apprenticeships will offer quite the panacea that we are promised. To his credit, Rodger problematised his own call for flexibility, arguing that academics should investigate the relative merits of high levels of specialisation and of flexibility in occupational training.

 

Rodger is clear, however, that “the use of scientific specialists, obtained through planned procrastination or otherwise” is essential in the provision of guidance. The challenging task for these specialists is then to find opportunities that enable people to enter the labour market and utilise their skills and aptitudes without making them inflexible and unable to move into alternative occupations should the labour market need them to do so.

 

Would any/many contemporary careers advisers claim that they are really doing anything radically different to this? 

Barrie Hopson

We had a great iCeGS annual lecture yesterday featuring Barrie Hopson. When we announced that Barrie was going to be giving the lecture we were overwhelmed with careers practitioners who were keen to catch up with what he was up to. People remembered him from Build Your Own Rainbow and Lifeskills and were wondering what he’d been doing for the last few years. Barrie has been working outside the careers field for a while and mentioned that he’d variously been asked “Barrie Hopson, I thought you were dead” and “Didn’t you used to be Barrie Hopson?”

However, Barrie has recently returned to the careers fold and has been writing very engagingly on Portfolio Careers and The Pluses of Being 50 Plus. His work is well worth revisiting if you haven’t looked at it for a while, and he is one of the few authors who really gets the point of a blog. As he said, he can have a new idea the day after the book is published and get it straight out there.
I won’t try and summarise his lecture in full, not least because we are planning to publish it as an iCeGS occasional paper. However he did set out a very interesting typology of careers that I think might be useful for those who are thinking about their own career.

Careers, Barrie argued, can be seen as working through four patterns:
  1. The ladder: This is the conventional upwardly mobile career that moves through a limited number of organisations in a logical and planned fashion.
  2. Serial careers: Describes a pattern where people move from one field to another; it is characterised by changes of direction and movement.
  3. Lifestyle careers: Where people focus on their work-life balance, organising work around other things like home, family and hobbies.
  4. Portfolio careers: Where people manage a number of jobs, incomes and interests simultaneously.

I thought that this was a nice set of options. Whether they are really different patterns of career or just different stages is questionable, as is whether the typology works equally well across cultures and classes. There is clearly more to think about, but there is certainly a possibility that the world of work is becoming more decentralised and less focused around large employers. Can any economists tell me if this is true? I’d also be interested in hearing portfolio workers’ opinions about whether it is a good thing or not.

I’ll post as soon as we’ve got Barrie’s paper up on the iCeGS site and you can engage with it all on a deeper level than my memory of yesterday is going to allow.

The politics of guidance

I’ve just been reading Tony Watts’ chapter on ‘Socio-political ideologies in guidance’ in Rethinking Career Education and Guidance. In this article Watts sets out a typology of guidance ideologies that I found extremely useful. Watts sets up two axes based on where the focus of the intervention is.

              Focus on society               Focus on the individual
Change        Radical (social change)        Progressive (individual change)
Status quo    Conservative (social control)  Liberal (non-directive)

So approaches to guidance can be divided up by whether you are focused on the social context or the individual and on whether you want to change the thing that you are focused on. So in detail this leaves you with the following categories.

  • Liberal: Guidance that is focused on the individual and pursues a non-directive approach. Individuals are supported to make decisions, but their decision making is not challenged.
  • Conservative: Guidance that serves the current needs of society, e.g. matching the labour force to capital’s needs. The process of guidance is about steering people into places where they can be socially and economically useful.
  • Progressive: Guidance that encourages and supports individuals to exceed the role that they and those around them might have imagined. This might involve challenging their sense of what they are good at or fit for.
  • Radical: Guidance that encourages individuals to challenge the social and economic conditions that are constraining their choice. This might move people beyond thinking about what they can do and get them thinking about why they and those like them can’t do other things.

I think that I’ll save my thoughts on where I sit and what I think guidance’s role should be for another post. I found this a very useful conceptualisation of the possible roles. Does it work for everyone else?

Where would you put yourself?