In April 2021, Microsoft Research hosted a two-day workshop introducing their platform for situated intelligence (PSI). The workshop was recorded and the videos are available on YouTube; they are also embedded here with notes.
Big data continues to be a hot topic and we are increasingly seeing data-driven decisions and processes replace expert opinions in everyday activities. Indeed, one designer quit Google with the following comment:
I had a recent debate over whether a border should be 3, 4 or 5 pixels wide, and was asked to prove my case. I can’t operate in an environment like that.
Trouble is, Google was able to prove that using data over instinct when deciding between 41 shades of blue for text-based links led to an annual increase of $200 million in advertising revenue. But whilst some decisions may be purely data-driven, most remain dependent on how the data is interpreted. And interpretation can be heavily influenced or manipulated by the environment, politics and language used.
A recent psychological research study showed that playing a game with a different avatar influenced behaviour afterwards. Those who played with the Superman avatar (context: hero saving the day) were kinder in later decisions than those who played with the Voldemort avatar (context: evil world destroyer).
In 2004, an experiment conducted at Stanford University (recently reported in The Atlantic) showed the influence of language on game play. Using the classic Prisoner’s Dilemma, one group were told they were playing ‘The Community Game’ and one group were told they were playing ‘The Wall Street Game’. Two-thirds of those playing ‘The Community Game’ chose to co-operate and share the rewards. Two-thirds of those playing ‘The Wall Street Game’ chose not to and focused on personal gain.
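The Stanford result is striking precisely because the underlying game never changed; only its name did. For readers unfamiliar with the Prisoner's Dilemma, here is a minimal sketch of its classic payoff structure (the specific numbers are illustrative, not from the study):

```python
# Classic Prisoner's Dilemma payoffs (illustrative values):
# each entry is (row player's sentence, column player's sentence) in years.
PAYOFFS = {
    ("cooperate", "cooperate"): (1, 1),   # both stay silent: light sentences
    ("cooperate", "defect"):    (3, 0),   # the defector walks free
    ("defect",    "cooperate"): (0, 3),
    ("defect",    "defect"):    (2, 2),   # mutual betrayal: both punished
}

def outcome(a, b):
    """Return the sentences handed to players choosing moves a and b."""
    return PAYOFFS[(a, b)]

# Defecting is individually rational whatever the other player does,
# yet mutual cooperation beats mutual defection for both players.
assert outcome("defect", "cooperate")[0] < outcome("cooperate", "cooperate")[0]
assert sum(outcome("cooperate", "cooperate")) < sum(outcome("defect", "defect"))
```

The dilemma: the payoffs push each individual towards defection even though both would be better off co-operating, which is exactly the tension the 'Community' versus 'Wall Street' framing tipped one way or the other.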
A simple shift in language can influence decisions and behaviour, often without participants even realising. The subject of behavioural economics is not new. But combined with big data, its role in deliberately influencing decisions will continue to advance.
- Why Google has 200 million reasons to put engineers over designers – The Guardian, February 2014
- It matters which avatar you choose when gaming – Harvard Business Review, February 2014
- These two words will make you more selfish – The Atlantic, October 2013
- The Behavioural Insights Team – UK Cabinet Office web site
Flickr image: Optical illusion kindly shared by The Lex Talionis. It’s impossible to see both states of an optical illusion simultaneously. You have to make a choice about how you interpret what you think you see…
“Don’t confuse precision with accuracy. I can be wrong to 5 decimal places…” Dr Tim O’Neil
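The distinction in that quote is easy to show with numbers: an estimate can be quoted to many decimal places (precise) and still be far from the true value (inaccurate). A quick illustrative sketch, with made-up figures:

```python
true_value = 100.0

estimate_precise = 103.14159   # quoted to 5 decimal places, but ~3% off
estimate_accurate = 100.0      # crude-looking, but spot on

def error(estimate, truth=true_value):
    """Distance from the true value -- this is what accuracy measures."""
    return abs(estimate - truth)

# Precision (decimal places quoted) says nothing about accuracy
# (closeness to the truth).
assert error(estimate_precise) > error(estimate_accurate)
```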
The image above is a great demonstration of how rubbish we are at predictions, particularly when they involve innovations. As part of their annual survey, Gartner asked CIOs which technology company had been most influential over the last 10 years, and which would be over the next 10.
First of all, there isn’t much context to go on from this slide. Influential to the organisation or in general? Internally or externally? Enterprise, consumer or both? You would have to assume both given the prominence of Apple alongside more traditional enterprise players like SAP yet no sign of Facebook.
Talking of Apple. Imagine if the same question had been asked in 1997, the year Apple was verging on bankruptcy and Michael Dell recommended shutting it down. Ten years later, Apple was on its second wind, but only in the world of digital music. The iPhone had not yet been announced and the iPad was still three years away. What would CIOs have predicted then? I'm betting Apple would still have barely registered. Today, more than two-thirds of Apple's revenue is from products released since 2007 and they are considered the most influential company over the past 10 years. And everyone thinks they're done on the innovation front. That's humans for you. We're great at hindsight.
The correct answer would be 'I don't know'. 'Others' is the closest option on the slide. It may come from a company that does not yet exist. But it is just as likely to come from an established player. IBM has been doing rather well establishing enterprise social tools in large corporates (the Lotus brand is in danger of finding its second wind) and leading externally with the 'Smarter Cities' initiative. Amazon has just announced the ability to host virtual desktops on Amazon Web Services (AWS). Microsoft will have a new CEO in the next 12 months. Who knows what the technology landscape is going to look like in 10 years' time? One thing's for sure: there's a storm already brewing between enterprise and consumer worlds. Would you say IT doesn't matter? 😉
p.s. I'll add my 2p's worth. I'm surprised Salesforce and LinkedIn didn't register on the slide… presumably they're in the 'Others' bucket.
Social Media Analytics are a big area of growth… but beware the wrong focus. A lot of analytics can be like looking in the rear-view mirror whilst driving. Tells you where you’ve just been, allows you to make some course adjustments safely but you really want to keep most of your focus on the road ahead.
Instead, think more in terms of awareness. Real-time analytics can be hugely beneficial if it makes you aware of a conversation taking place that you should be involved in. That means monitoring the various channels, currently including Facebook, Twitter, LinkedIn, Pinterest, Google+… and having a social media strategy that enables you to respond effectively to whatever the monitoring brings up. By all means have your regular reviews, rank people’s contributions if you must. But don’t let the bells and whistles distract you from what matters.
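The awareness step above boils down to something simple: watch the stream, flag posts that match the conversations you care about, and route them to a human who can respond. A minimal sketch of that matching step (the watch terms and posts are invented for illustration):

```python
# Minimal sketch: flag posts mentioning terms your team should respond to.
# Real monitoring tools add sentiment, deduplication and channel APIs on top.
WATCH_TERMS = {"acme support", "acme outage", "#acmefail"}

def needs_response(post: str) -> bool:
    """True if the post mentions any term on the watch list."""
    text = post.lower()
    return any(term in text for term in WATCH_TERMS)

stream = [
    "Loving the new Acme update!",
    "Is anyone else seeing an ACME outage right now?",
]
alerts = [p for p in stream if needs_response(p)]
assert alerts == ["Is anyone else seeing an ACME outage right now?"]
```

The hard part isn't the matching; it's having a strategy that says who responds, how quickly, and in what tone once the alert fires.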
The Economist has an article ‘Don’t lie to me, Argentina’ explaining why they are removing a figure from their indicators page:
Since 2007 Argentina’s government has published inflation figures that almost nobody believes. These show prices as having risen by between 5% and 11% a year. Independent economists, provincial statistical offices and surveys of inflation expectations have all put the rate at more than double the official number…
What seems to have started as a desire to avoid bad headlines in a country with a history of hyperinflation has led to the debasement of INDEC, once one of Latin America’s best statistical offices…
We see no prospect of a speedy return to credible numbers. From this week, we have decided to drop INDEC’s figures entirely.
Whilst we often talk about how statistics can always provide the answers people are looking for (hence the popular quote used as the title for this post), there is another angle to consider – are the underlying numbers telling the truth? It is a critical question when decisions are based on increasingly complex calculations that are then converted into summary data visualisations to assist decision-making. Was the original source data automatically scraped from systems or keyed in by people? Just how balanced is that scorecard..?
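To put the scale of that discrepancy in perspective: small differences in an annual inflation rate compound into very large differences in prices over a few years. Using illustrative figures from the ranges quoted above (10% official versus double that from independent economists):

```python
def cumulative_inflation(annual_rate, years):
    """Total price increase after compounding an annual rate for N years."""
    return (1 + annual_rate) ** years - 1

official    = cumulative_inflation(0.10, 5)  # 10% a year for 5 years
independent = cumulative_inflation(0.20, 5)  # double the official rate

# After five years the cumulative gap is far more than double:
# roughly a 61% official rise versus a 149% independent one.
assert round(official, 2) == 0.61
assert round(independent, 2) == 1.49
```

Which is why a statistics office quietly shaving a few points off the annual figure ends up debasing every downstream calculation built on it.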
One of the current hot trends on the Internet is the emergence of ‘Big Data’ – being able to scrape massive quantities of information automatically generated, such as the search and surfing habits of everyone who ever logged into Facebook… and then analysing for patterns. One of the potential attractions is eliminating human error – or influence (in terms of truthfulness) – over the underlying data sources. Doesn’t solve the challenge of influence through gaming of the system but that’s perhaps a post for another day.
If you are interested in the use and abuse of statistics, there’s an excellent short book that walks through historical examples of where statistics simply don’t work – The Tyranny of Numbers: Why counting won’t make us happy, by David Boyle – Click Here for an old book review I wrote. And naturally, the book is listed on Amazon.
Related blog posts:
A few years ago, I published an infographic showing the history of SharePoint, to help decipher the different twists, turns and acquisitions that influenced what went into (and out of) SharePoint. (May get round to doing an update on that sometime…)
A related product has also had a few twists and turns of its own – PerformancePoint. The clue is in the name, it’s in the same family of products as SharePoint and originally targeted performance management solutions. Here’s its life story so far…
Back in 2001, business intelligence and performance management were quite hot topics but became overshadowed by the rise of the portal. An early market leader was ProClarity and most people thought Microsoft would acquire it. Instead they purchased Data Analyzer, owned by a ProClarity partner.

In the same year, Microsoft acquired Great Plains, a provider of business applications to small and medium-sized organisations. Included with the acquisition was FRx Forecaster, which had been acquired by Great Plains the previous year.
Data Analyzer remained available as a desktop product for a while before disappearing. Some of the technology merged into what would become Microsoft’s first performance management server product: Business Scorecard Manager 2005 (BSM – naturally, not to be confused with the British School of Motoring if you’re reading this in the UK 🙂 )
BSM enabled you to define key performance indicators (KPIs) and then create scorecards and dashboards to monitor and analyse performance against targets. The product included web parts that could display those KPIs, scorecards and dashboards on a SharePoint site. It even had a little bit of Visio integration producing strategy maps (a key component of an effective business scorecard). BSM was a classic v1 product: difficult to install, basic capabilities and limited adoption by organisations.
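The core of what BSM offered (a KPI comparing actual performance against a target and rolling up to a traffic-light status) is a simple idea. A minimal sketch of that threshold logic; the bands here are illustrative, not BSM's actual implementation:

```python
def kpi_status(actual, target, warn_band=0.10):
    """Classic traffic-light KPI: green at or above target,
    amber within the warning band below it, red otherwise."""
    if actual >= target:
        return "green"
    if actual >= target * (1 - warn_band):
        return "amber"
    return "red"

# e.g. a sales KPI with a target of 100 units
assert kpi_status(105, 100) == "green"
assert kpi_status(95, 100) == "amber"
assert kpi_status(80, 100) == "red"
```

A scorecard is then little more than a grid of these statuses across KPIs, and a dashboard the visual layer on top; the hard work in products like BSM was wiring them to live data sources.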
In 2006, Microsoft finally acquired the company it should have bought in the first place – ProClarity, which had a desktop and server product. The products were available standalone and some of the technology integrated into the replacement for BSM – PerformancePoint Server 2007 (PPS). Also integrated into PPS was a new forecasting capability based on the FRx Forecaster.
PPS was effectively two products – a Monitoring Server and a Planning Server. The Monitoring Server included a revamped Dashboard Designer with improvements to the core monitoring and analysis capabilities – KPIs, reports, scorecards and dashboards. It also leveraged corresponding web parts available in SharePoint Server 2007 Enterprise Edition. The Planning Server included a new Planning Business Modeler that enabled multiple data sources to be mapped and used to plan, budget and forecast expected performance. The Planning Server proved particularly problematic to configure and use…
In 2009, Microsoft announced that PerformancePoint Server was being discontinued. The Monitoring Server elements were to be merged into future releases of SharePoint (and anyone licensed for SharePoint Server 2007 Enterprise Edition was immediately given access to PerformancePoint Server 2007 as part of that license). The source code for the Planning Server elements was released under restricted license as a Financial Planning Accelerator, ending its life within Microsoft. The FRx technology returned to the Dynamics product range.
In 2010, SharePoint Server 2010 was released and the Enterprise Edition includes the new PerformancePoint Service complete with dashboard and scorecarding capabilities but no planning options. This year also saw the release of Management Reporter which offers both monitoring and planning capabilities with direct integration into the various Dynamics products. And a new BI tool was released – PowerPivot for Excel, an add-in that enables you to create pivot tables and visualisations based on very large data sets. A trend worth keeping an eye on…
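The core idea behind a pivot table (the feature PowerPivot scales to very large data sets) is just aggregating a fact table by a pair of dimensions. A language-agnostic sketch in plain Python, with an invented sales table for illustration:

```python
from collections import defaultdict

# Tiny illustrative fact table: (region, product, revenue)
sales = [
    ("North", "Widgets", 120.0),
    ("North", "Gadgets",  80.0),
    ("South", "Widgets",  50.0),
    ("South", "Widgets",  70.0),
]

def pivot_sum(rows):
    """Sum revenue by (region, product) -- the essence of a pivot table."""
    totals = defaultdict(float)
    for region, product, revenue in rows:
        totals[(region, product)] += revenue
    return dict(totals)

table = pivot_sum(sales)
assert table[("South", "Widgets")] == 120.0
assert table[("North", "Gadgets")] == 80.0
```

PowerPivot's contribution was doing this in-memory over millions of rows from inside Excel, rather than the logic itself, which is as old as the spreadsheet.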
Going forward, Microsoft has business intelligence and performance management solutions in two camps: the Office and SharePoint platform that can provide a front-end to business applications and data sources of all shapes and sizes; and the Dynamics Product range that provides end-to-end business applications for small- to medium-sized organisations (and divisions within larger organisations). Dynamics can also leverage SharePoint as its front-end, just like any other business application.
SQL Server continues to provide the core foundation for all data-driven solutions – offering its own database capabilities as well as warehousing and integration with other ODBC-compliant data sources plus the reporting and analysis services on which BI solutions are built. SharePoint provides the web front-end for information and data-driven solutions amongst other things, like search, collaboration etc… Office continues to provide desktop tools as well as web-based versions that integrate with SharePoint. Excel now has its sidekick PowerPivot (wish they’d named that one PivotPoint…), Visio continues to be, well, Visio – one of the few acquisitions to keep its original name intact. And also worth a mention are Bing Maps and MapPoint, which provide location-specific visualisations. I originally wrote that MapPoint was discontinued. But did a search to check when it stopped being available only to find it alive and well as MapPoint 2010… hey ho!
You’d be right to think this performance management roadmap has looked a little rocky. What’s interesting to note is there is a Corporate Performance Management team within the Dynamics group, whilst Business Intelligence messaging barely mentions it, focusing instead on subsets of performance management – reporting and analysis.
If you are a performance management purist, you will likely be disappointed with the capabilities offered by PerformancePoint, much in the same way a taxonomy purist will gripe at the limitations within Managed Metadata. Both are services within SharePoint 2010 that help manage and visualise information – they are part of a platform, as opposed to specialist niche solutions that will typically offer a more comprehensive feature set. But if you want to start improving how everyone interacts with information and data as part of daily decisions and activities, a platform is a pretty good place to begin, requiring fewer skills and resources to get started.
Final note: All the above comments are based on my own opinions and observations. They do not represent any Microsoft official statements from the past, present or future 🙂 Have to mention on this sort of post as it covers the period of time I worked at Microsoft.
- Microsoft Business Intelligence (BI) web site
- PerformancePoint Services team blog
- Dynamics Corporate Performance Management (CPM) team blog
- MapPoint web site (only BI-related product that doesn’t get a mention on the BI web site)
- New features in PerformancePoint Services 2010 – blog post by PPS team
- Management Reporter: What’s next – roadmap into 2012+ (more planning planned)
- Microsoft shutters PerformancePoint and Planning Server gets an extension – RedmondMag
Related blog posts
I try to use trains a lot (even more so now, thanks to rising fuel prices) and there is one pattern in particular that really irritates me. Regardless of train operator, it appears all ticket collectors have been on the same ‘How to reduce your passenger levels’ training course.
The announcement overheard today, as the train was about to depart the station, went along these lines:
“If you have an Advanced Saver ticket, it must be for this train. The departure time will be printed on your ticket. If your ticket is for a different time it will not be valid. You will be required to purchase a new ticket for this journey if you choose to stay on the train.”
In other words, get off the damn train if you bought a cheap ticket and it wasn’t for this time.
Now, to be fair, the rules are pretty clear when you purchase Advanced Saver tickets. But here’s the irony. This train was the last one before rush hour started. There were all of 5 people in my carriage. Why oh why would the train operator want to throw people off an empty train? It creates the double-whammy of saving nothing (the reduced weight is unlikely to make a dent on the amount of fuel used to run the train) and potentially adding to over-crowding on the next train, exacerbated by a bunch of pissed off customers.
By all means, have the rule. But for goodness sake, allow the ticket collectors to use their brains. If the train is empty, turn a blind eye. Gently remind the passenger about the rules and make it clear that an exception can be made this time only because the train isn’t full. You have happy customers and more space available on the next train, which might be busier than yours. If the train is full, enforce the rule to the letter. That’s only fair to those who have paid for tickets specifically for this train.
I love what technology can do to make systems better. But oh so often, there are simple changes you can make to improve services, sales and profits. And they cost nothing at all.