Aligning investment with reality

HeartBeat

An interesting article appeared at the end of last week, highlighting a shift in thinking about IT projects within the UK government:

Gov.uk was launched quickly and iteratively, with a new simplicity that has resulted in a website containing fewer than 10% of previous separately hosted pages and is set to save as much as £70m on previous arrangements.

A new online system for people to apply for Power of Attorney on behalf of others had taken 10 working days to procure, 24 days to build and code a prototype alpha system for live testing and a beta was due to go live in two months, he said. The whole project had been commissioned from a small business using the G-Cloud for around £50,000 a year, compared with a quote obtained from one large existing provider of £4m set up plus £1.8m per year

There are two points at which the cost of delivering an IT project should drop significantly below the established norm: at the tail end of a previous disruptive innovation, and at the beginning of a new one.

hype-curve

The hype curve of innovation applies to just about any industry. A new concept is invented. It starts small and is highly specialised but creates demonstrable value to customers. Such success never goes unnoticed and demand begins to grow. That brings competition and rival proprietary solutions. To begin with, more value is created, usually at an increasing pace as different companies come up with more innovative features to compete with one another. But then a tipping point is reached and, for a short while, everything gets a bit chaotic and messy. Hidden costs emerge, problems arise, competitors get acquired and solutions are suddenly discontinued or dramatically altered. Out of the disruption comes a new demand – standardisation that allows for continuity and economies of scale. And so the market settles down into slow growth, cheaper solutions and small incremental improvements. Until a new disruption comes along.

Building traditional web sites for publishing content is at the end of a 15-year innovation cycle. The standards for designing and formatting web content have become so well established that even Microsoft has just about embraced them within the latest versions of the Internet Explorer web browser. Most popular public-facing web sites now follow familiar conventions for navigation and page layout. Consolidating multiple government departmental web sites under a single umbrella gov.uk site makes absolute sense and should save a lot of money.

Using agile approaches to software development has grown in popularity in recent years. The goal is to do ‘just enough’ design to build a working solution, then quickly tweak and iterate based on actual usage patterns rather than predicted requirements. It requires far less ‘up-front’ investment thanks to much shorter planning cycles and usually results in far better user adoption rates. But it doesn’t guarantee a cheaper solution over time; that will depend on the iterations and ongoing development.

Applying an agile approach to business systems is at the early stages of the innovation hype cycle. Some solutions are simply brilliant, but growth in competition means some are not. The disruption and hidden costs are yet to emerge and it’s a little early to be celebrating dramatic savings in annual operating costs. I have already seen one government project that I know is so under-costed, it will take the supplier in question into bankruptcy unless they are able to renegotiate down the line. Yes, the bigger systems integrators were insanely expensive in their quote for what was needed. But insanely cheap is a short-lived improvement.

A comment was made by Tom Loosemore, deputy director of the Government Digital Service (GDS), which is responsible for the projects quoted above:

“We don’t talk to IT departments other than to ask what legacy can offer”

That’s not a healthy comment and was not well received at the conference where it was made. Today’s IT legacy is just yesterday’s innovations gone stale. I think the GDS could look to the automotive industry for how to better embrace IT as part of doing business. The current approach may be saving a lot of money in the short term (and that’s an understandable driver in the current economic climate) but there are going to be consequences.

Seeking innovative ways to use technology to solve business problems is a good approach. But assuming it is a panacea for all IT projects is not.


Platform or Application?

Platforms are often criticised when compared to applications for delivering IT solutions. They rarely offer the same richness of features and can often be more complicated to set up. But that doesn’t mean they are without their own benefits.

Retiring old equipment

rusty old car

IDC has a report out showing the increase in IT costs for managing older hardware and software, covered by a recent Computerworld article. It’s sponsored by Microsoft so, yes, the numbers are unsurprising. But they are also quite believable.

  • The magic milestone is after the three-year mark, when “costs begin to accelerate” because of additional IT and help desk time, and increased user downtime due to more security woes and time spent rebooting
  • IT labor costs jump 25% during year four of a PC’s lifespan, and another 29% in year five, while user productivity costs climb 23% in year four and jump 40% during year five. Total year five costs are a whopping 73% higher than support costs of a two-year-old client (see the rough arithmetic after this list)
  • Organizations reported that they spent 82% less time managing patches on Windows 7 systems than they did on Windows XP, 90% less time mitigating malware, and 84% less help desk time
  • Windows 7 users wasted 94% less time rebooting their computers and lost 90% less time due to malware attacks.
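
As a rough sanity check on those year-on-year jumps, here’s a quick back-of-the-envelope calculation. It simply compounds the reported percentage increases on a notional baseline cost (my simplification for illustration, not IDC’s methodology, and the baseline figures are arbitrary):

    # Rough illustration only: compound the reported year-on-year increases
    # on a notional baseline. The baseline values are arbitrary; only the
    # percentage jumps come from the report coverage quoted above.
    baseline_labour = 100.0        # notional IT labour cost before the jumps
    baseline_productivity = 100.0  # notional user productivity cost

    labour_year5 = baseline_labour * 1.25 * 1.29               # +25% then +29%
    productivity_year5 = baseline_productivity * 1.23 * 1.40   # +23% then +40%

    print(f"Labour cost after year five: {labour_year5:.0f} "
          f"(~{labour_year5 - baseline_labour:.0f}% above baseline)")
    print(f"Productivity cost after year five: {productivity_year5:.0f} "
          f"(~{productivity_year5 - baseline_productivity:.0f}% above baseline)")
    # Labour ends up roughly 61% higher and productivity roughly 72% higher,
    # which is broadly consistent with the 73% total figure quoted above.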

It’s part of Microsoft’s campaign to get computers running Windows XP replaced. And so they should be. You rarely see a company car more than three years old, let alone one in service for over a decade. This is another area that will, in time, benefit from the transition to cloud computing and apps that work across multiple devices. One of the obstacles to replacing old desktops in large companies has been a dependency on bespoke applications that prove hard to migrate. That’s one trend to be glad to see the back of.

An associate who runs an IT support company for small businesses has recently introduced a new service plan for his clients. Support costs now increase for all equipment over five years old, and it has spurred many customers into action. IT systems can become like a pair of comfy old slippers – we hang on to them for far too long after they outlive their usefulness. The comparison with managing cars and their servicing/MOT costs has encouraged many to look at a more regular cycle to keep technology current and useful to business activities, instead of just sticking with what they are used to regardless of the wear and tear. It’s not all good news for Microsoft, though: some are now looking at a possible switch to Apple and others…

Video interview about Olympic IT

BCS interview image

The British Computer Society/Chartered Institute for IT has posted an interview with the Metropolitan Police’s Director of IT, Stephen Whatson, who’s been tasked with the IT infrastructure for the Olympics this year. It includes some interesting comments about the preparation and the decisions made. Video embedded below (Flash player required).

 

Source: BCS – Video interview: Olympic IT, Apr 2012

Architect or Gardener?

When planning for a new system and its governance requirements, consider whether the system is transactional and in need of specifics versus collaborative and in need of guidance. Architect one and landscape the other.


What really matters on a tablet

Touch a tablet

…is how responsive the screen is to interaction: visual, touch and motion. That’s where Apple is succeeding and where others, Android- and Windows-based devices, are failing to compete. It also needs to be ‘Instant On’ like a mobile phone, but most have at least cottoned on to that.

— Update 30th March —

Steven Sinofsky has just posted details about Touch hardware and Windows 8 on the Windows 8 blog, expanding on information first shared during the Build conference last October. We attended that conference and have a Samsung Windows 8 tablet prototype in our R&D lab. Its touch sensitivity is certainly far superior to previous Windows devices we’ve worked with, but still not as good as the iPad (v1) that I’m still using for day-to-day tablet activities.

Somebody has added a great comment to Steven Sinofsky’s post:

Is there any support for great track pad for the laptops ? Many people (programmers, businesses) have the traditional laptops (may be with touch screen in future) but having a great quality touch with same gesture support will be THE feature for Windows PCs going forward

Spot-on question. My primary hardware devices at the moment are an Apple MacBook Air, Apple iPad (v1) and Apple iPhone 3GS. All three have comparable sensitivity in the touch department. No, the MacBook Air does not have a touch-screen, but the trackpad is so good that I no longer use a mouse. The same cannot be said for any Windows-based laptops I’ve worked with. Microsoft needs to get better at linking the experience (hate using that word but can’t think of a better one) across different form factors. A tricky challenge when you’re dependent on many different hardware vendors. And given Microsoft is not great at achieving this across software, or even their own web sites, the odds of that happening are not great.

— original post —

Ryan Block at gdgt outlines why he thinks the new iPad retina-display specs are a big deal:

The core experience of the iPad, and every tablet for that matter, is the screen. It’s so fundamental that it’s almost completely forgettable. Post-PC devices have absolutely nothing to hide behind. Specs, form-factors, all that stuff melts away in favor of something else that’s much more intangible. When the software provides the metaphor for the device, every tablet lives and dies by the display and what’s on that display.

So when a device comes along like the iPad that doesn’t just display the application, but actually becomes the application, radically improving its screen radically improves the experience. And when a device’s screen is as radically improved as the display in the new iPad, the device itself is fundamentally changed

Whilst the article emphasises the new retina-display introduced with the latest iPad, the same can be said for other sensory inputs and outputs – the sensitivity of the screen to touch (for swiping, input etc.) and reaction of apps to motion.

The first mobile phone I used that involved a pure touch-based user interface, i.e. no physical keyboard, was the HTC Hero running Google’s Android OS. For me it was a step change in how I used a phone and I loved it from the start. The ability to quickly swipe across screens and retrieve or view different data, whether it was to check emails, find a contact, follow a map, send a Tweet… it was a jump in productivity for me. Until…

… the iPad launched.

Having always been a fan of tablets and frustrated by the lack of progress, it was an easy decision to get one and see if the device was worthy of the hype. I still have it two years later and it’s an integral part of my daily work. I no longer have the HTC Hero, because once I started using the iPad, the way I touched screens altered. The iPad was far more responsive (read: reacted to a much lighter touch) than the HTC Hero. All of a sudden, I’d go to swipe the screen on the phone and it wouldn’t respond; I’d have to swipe again, but harder. It was nothing compared to the lack of sensitivity on Windows touch-enabled phones, but it was enough to be annoying.

Then there’s the thought behind motion on Apple devices. I was delighted the first time I moved the iPhone from my ear to look at the screen (on a dreaded automated call that required keypad input) – the keypad automatically appeared. On the HTC Hero, I was forever accidentally cancelling calls because it didn’t do that: you had to push a button to reactivate the screen, and I’d invariably press the button that ended the call. Doh!

That’s why, two years later, I now also have an iPhone, albeit the ageing 3GS model. Everything about how it responds to my actions trumps the alternatives I’ve tried. That’s the challenge facing Apple’s rivals. Tablets will, in one form or another, become a standard part of the typical workplace in the coming years, and Apple’s devices are setting the bar for what people have become used to. Alternatives need to either be a lot cheaper or do something fundamentally different that the iPad can’t.


PerformancePoint – A brief history

A few years ago, I published an infographic showing the history of SharePoint, to help decipher the different twists, turns and acquisitions that influenced what went into (and out of) SharePoint. (I may get round to doing an update on that sometime…)

A related product has also had a few twists and turns of its own – PerformancePoint. The clue is in the name: it’s in the same family of products as SharePoint and originally targeted performance management solutions. Here’s its life story so far…

PerformancePoint History

Back in 2001, business intelligence and performance management were quite hot topics but became overshadowed by the rise of the portal. An early market leader was ProClarity and most people thought Microsoft would acquire it. Instead, Microsoft purchased Data Analyzer, owned by a ProClarity partner. In the same year, Microsoft acquired Great Plains, a provider of business applications to small and medium-sized organisations. Included with the acquisition was FRx Forecaster, which Great Plains had acquired the previous year.

Data Analyzer remained available as a desktop product for a while before disappearing. Some of the technology merged into what would become Microsoft’s first performance management server product: Business Scorecard Manager 2005 (BSM – naturally, not to be confused with the British School of Motoring if you’re reading this in the UK 🙂 )

BSM enabled you to define key performance indicators (KPIs) and then create scorecards and dashboards to monitor and analyse performance against targets. The product included web parts that could display those KPIs, scorecards and dashboards on a SharePoint site. It even had a little bit of Visio integration producing strategy maps (a key component of an effective business scorecard).  BSM was a classic v1 product: difficult to install, basic capabilities and limited adoption by organisations.
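
As a rough mental model of the hierarchy described above (an illustrative sketch only, not BSM’s actual object model or API), the relationship between KPIs, scorecards and dashboards looks something like this:

    # Hypothetical sketch of the KPI -> scorecard -> dashboard hierarchy.
    # The class names and fields are made up for illustration.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class KPI:
        name: str
        target: float
        actual: float

        def status(self) -> str:
            # Simple traffic-light style assessment against the target
            return "on track" if self.actual >= self.target else "off track"

    @dataclass
    class Scorecard:
        name: str
        kpis: List[KPI] = field(default_factory=list)

    @dataclass
    class Dashboard:
        name: str
        scorecards: List[Scorecard] = field(default_factory=list)

    # Example: a sales scorecard surfaced on a dashboard (in BSM's case,
    # rendered on a SharePoint site via web parts)
    revenue = KPI("Quarterly revenue", target=1_000_000, actual=950_000)
    sales = Scorecard("Sales", kpis=[revenue])
    dashboard = Dashboard("Executive overview", scorecards=[sales])
    print(revenue.status())  # off track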

In 2006, Microsoft finally acquired the company it should have bought in the first place – ProClarity, which had a desktop and a server product. The products remained available standalone, and some of the technology was integrated into the replacement for BSM – PerformancePoint Server 2007 (PPS). Also integrated into PPS was a new forecasting capability based on FRx Forecaster.

PPS was effectively two products – a Monitoring Server and a Planning Server. The Monitoring Server included a revamped Dashboard Designer with improvements to the core monitoring and analysis capabilities – KPIs, reports, scorecards and dashboards. It also leveraged corresponding web parts available in SharePoint Server 2007 Enterprise Edition. The Planning Server included a new Planning Business Modeler that enabled multiple data sources to be mapped and used to plan, budget and forecast expected performance. The Planning Server proved particularly problematic to configure and use…

In 2009, Microsoft announced that PerformancePoint Server was being discontinued. The Monitoring Server elements were to be merged into future releases of SharePoint (and anyone licensed for SharePoint Server 2007 Enterprise Edition was immediately given access to PerformancePoint Server 2007 as part of that license). The source code for the Planning Server elements was released under restricted license as a Financial Planning Accelerator, ending its life within Microsoft. The FRx technology returned to the Dynamics product range.

In 2010, SharePoint Server 2010 was released, and its Enterprise Edition included the new PerformancePoint Services, complete with dashboard and scorecard capabilities but no planning options. That year also saw the release of Management Reporter, which offers both monitoring and planning capabilities with direct integration into the various Dynamics products. And a new BI tool was released – PowerPivot for Excel, an add-in that enables you to create pivot tables and visualisations based on very large data sets. A trend worth keeping an eye on…

Going forward, Microsoft has business intelligence and performance management solutions in two camps: the Office and SharePoint platform that can provide a front-end to business applications and data sources of all shapes and sizes; and the Dynamics Product range that provides end-to-end business applications for small- to medium-sized organisations (and divisions within larger organisations). Dynamics can also leverage SharePoint as its front-end, just like any other business application.

Microsoft Business Intelligence and Performance Management tools

SQL Server continues to provide the core foundation for all data-driven solutions – offering its own database capabilities as well as warehousing and integration with other ODBC-compliant data sources, plus the reporting and analysis services on which BI solutions are built. SharePoint provides the web front-end for information and data-driven solutions, amongst other things like search, collaboration etc… Office continues to provide desktop tools as well as web-based versions that integrate with SharePoint. Excel now has its sidekick PowerPivot (wish they’d named that one PivotPoint…), and Visio continues to be, well, Visio – one of the few acquisitions to keep its original name intact. Also worth a mention are Bing Maps and MapPoint, which provide location-specific visualisations. I originally wrote that MapPoint was discontinued, but a search to check when it stopped being available found it alive and well as MapPoint 2010… hey ho!

You’d be right to think this performance management roadmap has looked a little rocky. What’s interesting to note is that there is a Corporate Performance Management team within the Dynamics group, whilst the Business Intelligence messaging barely mentions performance management, focusing instead on subsets of it – reporting and analysis.

If you are a performance management purist, you will likely be disappointed with the capabilities offered by PerformancePoint, much in the same way a taxonomy purist will gripe at the limitations within Managed Metadata. Both are services within SharePoint 2010 that help manage and visualise information – they are part of a platform, as opposed to specialist niche solutions that will typically offer a more comprehensive feature set. But if you want to start improving how everyone interacts with information and data as part of daily decisions and activities, a platform is a pretty good place to begin, requiring fewer skills and resources to get started.

Final note: all the above comments are based on my own opinions and observations. They do not represent any official Microsoft statements from the past, present or future 🙂 I have to mention this on a post like this, as it covers the period of time I worked at Microsoft.



APIs and Future Business Models

Fabulous presentation walking through the history of commerce in the 20th Century and why APIs and developers will increasingly be involved in the successful business models of the 21st Century.

 The more information matters to your business, the more important APIs will become to leverage that information and improve services to your customers. Access to great development skills is rapidly becoming a competitive advantage…

Evolving web business models

There is an outstanding presentation on Slideshare explaining why all web-based businesses need to be evolving their business models to leverage APIs more than their own web sites. Found via Twitter, but I can’t find who originally shared it as Tweetville is running amok with a ‘0 followers’ discussion at the moment. And this presentation is too good to get lost in the stream.

To summarise the presentation:

  • Darwin observed that the finches lived in a very remote location, meaning the variations had to compete with each other to survive. The finches you see today are the winners
  • At the start of the 20th Century, retail business was primarily local within villages, towns and cities, selling direct to people. With the evolution of suburbia, we saw the shift from the corner shop to the shopping mall, with each mall containing mostly the same retail brands – business went from direct to indirect. The big brands at the end of the 20th Century were the winners
  • At the start of the 21st Century, web-based business was ‘local’ to the web site, selling direct to visitors. With the evolution of social networks and mobile devices, we are seeing a shift from visiting the corner-shop equivalent web site to the mall equivalent – lots of businesses hosted on the same web platform, be it micro-applications on your mobile phone or applications in widgets on a social networking site. To be one of those applications means using APIs (application programming interfaces). How important is it?

80% of web-based traffic will be coming from beyond the browser…

If you are doing business online, you need developers who understand APIs.
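
To make that last point a little more concrete, here’s a minimal sketch of what exposing part of a business through an API might look like – a hypothetical product catalogue endpoint returning JSON, written with the Flask micro-framework. The endpoint paths, data and framework choice are all illustrative assumptions, not anything taken from the presentation:

    # Minimal, hypothetical example of exposing business data via an API
    # using Flask (pip install flask). Everything here is illustrative only.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # In a real business this would come from the product database
    PRODUCTS = {
        1: {"name": "Widget", "price_gbp": 9.99},
        2: {"name": "Gadget", "price_gbp": 24.50},
    }

    @app.route("/api/products")
    def list_products():
        # Mobile apps, social widgets and partner sites can all consume this
        # same endpoint - the 'beyond the browser' traffic described above.
        return jsonify(PRODUCTS)

    @app.route("/api/products/<int:product_id>")
    def get_product(product_id):
        product = PRODUCTS.get(product_id)
        if product is None:
            return jsonify({"error": "not found"}), 404
        return jsonify(product)

    if __name__ == "__main__":
        app.run()

Run locally, the same JSON becomes available to a mobile app, a widget on a social network or another web site – exactly the kind of ‘beyond the browser’ traffic the presentation is predicting.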