There’s an abundance of great new talks up on the TED web site, following the most recent conference in February 2010. One gem was delivered by Daniel Kahneman, a Nobel prize winner in behavioural economics.
The riddle of experience vs memory
Early in the talk, an example is given to demonstrate the difference between what we experience and what we choose to remember:
A man described how he had been listening to a glorious symphony. At the very end, there was a dreadful screeching sound – “It ruined the whole experience”. But it hadn’t. What it had ruined was the memory of the experience.
The talk centres on the difference between what we remember and what we actually experienced, and its impact on our happiness. Six years ago, Dan Gilbert, author of ‘Stumbling on Happiness’, delivered a very similar talk. His approach came from the other side – what we expect to experience versus what we actually experience. He challenged the idea that we’ll be miserable if we don’t get what we want or things don’t go as planned.
Why are we happy?
A powerful example from this talk:
Given a choice between winning the lottery or becoming a paraplegic within the next 12 months, which would make you happier? When we simulate this, the choice seems obvious. The reality, taken from real-world data, is that both lottery winners and paraplegics are happy. Winning or losing in any situation has far less impact than people expect it to have…
Whilst both talks focus on self, our flawed assumptions about happiness can have worse consequences when we apply our assumptions to somebody else. We think we can imagine life in another’s shoes. Both talks above demonstrate that we cannot.
Hidden Project Requirements…
Back in 2003 I was presenting to the SharePoint product group, providing customer feedback from beta testing of what was to become SharePoint Portal Server 2003 (SPS 2003). The first version of SharePoint (SPS 2001) had plenty of shortcomings that led to a massive re-write. But re-writing involved eliminating a number of features completely, and they were parked for later release (some have still yet to reappear…). One of my slides went along the following lines:
“Last year, customers were complaining how bad the workflow is in SPS 2001 …Now that you’ve removed it, they’re saying it’s great and want it back”
People didn’t hate workflow – they loved the idea of it (the simulation) and hated how it worked (the actual experience), to the point that nobody had a good word to say about it. But removing the feature completely caused all sorts of headaches at the time (the memory was suddenly a lot rosier).
The type of projects I work on usually involve introducing technology that will change the way people work. But change goes in both directions. People’s behaviour will influence how effective (or not) the technology is. Hence the interest in behavioural economics.

At the start of a project I’ll often hear comments like: ‘Users will never use this feature’, ‘They won’t work that way’ or ‘They don’t need to know…’ But we never know for certain what will happen until people actually start using the technology. It’s why I prefer to get organisations to prototype ideas before going for full-scale project implementations. After the initial statements about what users do and don’t do and will and won’t like, projects often head down the road of ‘Now that I see it, it isn’t what I want’ or ‘Didn’t know it could do that…’ or ‘Never thought that would be useful…’

Prototypes offer a glimpse into the actual experience – you wear the shoes instead of remembering the worn-out pair or trying to imagine what a different pair would feel like to walk in.
This Monday’s Start the Week programme on Radio 4 included an interesting discussion about amateurism during World War II, or as it was titled: ‘The dodgy dossier that fooled Hitler’. The short version (I’d encourage you to listen to the podcast, details at the end of the post):
In 1943, allied troops were in North Africa waiting for orders to attack in Europe. If you looked at a map, it was pretty obvious where the attack would start – Sicily. To try and gain the upper hand, an elaborate hoax was put in place to convince Hitler that instead of Sicily, the attack was going to begin from Greece in the Eastern Mediterranean and Sardinia in the West. This involved procuring a dead body in London, covering up the fact that the man had died of poisoning to make it look like he had died in an air crash, and dropping him in the sea to float ashore at a specific location in Southern Spain where intercepted messages suggested a particular German secret agent was operating. The false documents planted on the body would, with luck, be discovered by said agent, identified as real battle plans, and passed up the chain of command to the very top.
The whole idea sounds like some ridiculous plot in a work of fiction. There are far too many variables and dependencies that could go wrong. And worst of all, if the German secret agent was not fooled by the fake documents, it would confirm beyond doubt that Sicily was the real location and likely double Hitler’s efforts there. In short, the plan had as much chance of making matters worse as making them better.
The plan worked.
Listen to the podcast to hear more about it, including the observation that “although World War II claimed more lives than any other conflict in history, finding the right dead body was incredibly difficult…” – it’s a great conversation. But what’s interesting, and the reason for this post, was a comment made towards the end of the story:
“If Churchill hadn’t been such an enthusiast for this sort of operation and given them full rein… In a way it’s a celebration of amateurism, they were allowed to think whatever they wanted and try it out.”
An Admiral commented about the plan: “You can rely on the enemy’s ‘yesmanship’ and ‘wishfulness’”
How many leaders today would be prepared to take such a leap of faith? The preference is to rely on statistics and follow standard procedures rather than ideas and instincts. A simple example was reported this week. Somebody tweeted that they were going to blow up their local airport. When discovered by the police, they were arrested under the Terror Act, had their phone and laptop confiscated, received a lifetime ban from said airport, and were suspended from work until it is decided whether or not they will be prosecuted. The missing piece of context from this story: just before the alleged bomb threat, the person had been tweeting their frustration with the snow and how it was ruining their holiday plans because the local airport was closed. It was a stupid joke in the current climate. But really, how long should it have taken for someone to decide whether this was a serious terrorist threat, versus following the standard ‘send in the cavalry’ procedure? Our officials are becoming yes-folk. And that puts us at more risk, not less…
The danger in relying on process and statistics at the expense of ideas and instincts is you risk missing the threat in front of your eyes. Perhaps we should bring a bit of amateurism, or humanism, back into official processes.
For the rest of this week (until January 25th) you can download a copy of the programme via iTunes or listen using the BBC’s iPlayer.