Mobile and movement matters


Summary: Location-aware applications continue to be one of the biggest growth areas for mobile scenarios. Putting motion into the mix will bring about a fresh round of innovative solutions.

So the new iPhone announcements this week have been met with a resounding ‘meh’ from tech pundits and Wall Street. Whenever people start saying Apple has got it all wrong, there are usually some valuable titbits in the details. And whilst a lot of focus has been on the colourful new iPhone 5C range that appears to be merging with the iPod Nano, the interesting innovation is unsurprisingly in the more expensive iPhone 5S.

The iPhone 5S includes a new capability called M7, a ‘motion coprocessor’. A motion coprocessor can track and log the motion of the phone with minimal battery use. This is significantly more efficient than the current methods that require apps to be running continuously in the background to record activities. Instead, it becomes possible for apps to simply read the log files when they are launched. Whilst the focus is on the impact to fitness apps, that’s just one narrow market. The world of work is increasingly going mobile and being able to adapt alerts, feedback and notifications based not just on location but also on current movements opens up new possibilities.
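The read-the-log-at-launch pattern is worth spelling out. Here is a hypothetical sketch of it, not Apple’s actual API (on iOS 7 the real interface is Core Motion’s CMMotionActivityManager): a low-power log accumulates samples while the app is suspended, and the app queries only the window it missed when it next launches. The names `MotionLog`, `MotionSample` and `steps_missed` are invented for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class MotionSample:
    timestamp: datetime
    activity: str   # e.g. "stationary", "walking", "running"
    steps: int

class MotionLog:
    """Stands in for the coprocessor's always-on, low-power log."""
    def __init__(self):
        self._samples = []

    def append(self, sample):
        self._samples.append(sample)

    def samples_since(self, since):
        # The app asks only for the window it missed while suspended.
        return [s for s in self._samples if s.timestamp >= since]

def steps_missed(log, last_seen):
    """Called once at app launch -- no background process required."""
    return sum(s.steps for s in log.samples_since(last_seen))

# The coprocessor logs activity while the app is not running...
log = MotionLog()
t0 = datetime(2013, 9, 13, 9, 0)
log.append(MotionSample(t0, "walking", 120))
log.append(MotionSample(t0 + timedelta(hours=1), "running", 300))
log.append(MotionSample(t0 + timedelta(hours=2), "stationary", 0))

# ...and the app catches up on launch by reading the backlog.
print(steps_missed(log, t0 + timedelta(minutes=30)))  # 300
```

The point of the design is in that last line: the app does no work while the phone is in a pocket; the battery cost of tracking is paid once, by the dedicated low-power chip.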

Whilst a lot of people talk about the visual and physical design of Apple devices, what impresses me more is the thought and detail that goes into designing everyday interactions. An example I frequently give is that, when you are on a phone call, the simple act of moving the phone will reactivate the display with the keypad shown. It’s a simple thing. The chances are, if you are moving the phone away from your ear to look at it, you are talking to an automated system that requires you to key in something. If the display remains in sleep mode, you have to first activate it. On my previous phone (running Android), you had two buttons to pick from and a 50:50 chance of cancelling the call instead. The devil is always in the details.

Location-aware applications continue to be one of the biggest growth areas for mobile scenarios. Putting motion into the mix will bring about a fresh round of innovative solutions. And I am guessing we can expect to see the hardware move beyond the mobile phone to wearable devices, if/when Apple finally produces the long-rumoured iWatch. Google is already heading down that path with Google Glass. Yes, other companies have recently announced smart watches. But I’ve yet to see one a non-geek would want to wear. Apple continues to be the leader in designing for the mainstream and ignoring the technical and financial experts who think they know better.

Flickr image at the start of this post: The Operating Table, kindly shared by Blake Danger Bentley. I was looking for something to symbolise extended battery life but this was far too fun to not use instead.

Related blog posts

What really matters on a tablet


…is how responsive the screen is to interaction: visual, touch and motion. That’s where Apple is succeeding and others, Android- and Windows-based devices, are failing to compete. It also needs to be ‘Instant On’ like a mobile phone, but most have at least cottoned on to that.

— Update 30th March —

Steven Sinofsky has just posted on the Windows 8 blog details about Touch hardware and Windows 8, expanding on information first shared during the Build conference last October. We attended that conference and have a Samsung Windows 8 tablet prototype in our R&D lab. The touch sensitivity is certainly far superior to previous Windows devices we’ve worked with, but still not a match for the iPad (v1) that I’m still using for day-to-day tablet activities.

Somebody has added a great comment to Steven Sinofsky’s post:

Is there any support for great track pad for the laptops ? Many people (programmers, businesses) have the traditional laptops (may be with touch screen in future) but having a great quality touch with same gesture support will be THE feature for Windows PCs going forward

Spot on question. My primary hardware devices at the moment are an Apple MacBook Air, Apple iPad (v1) and Apple iPhone 3GS. All three have comparable sensitivity in the touch department. No, the MacBook Air does not have a touch-screen, but the trackpad is so good that I no longer use a mouse. The same cannot be said for any Windows-based laptops I’ve worked with. Microsoft needs to get better at linking the experience (hate using that word but can’t think of a better one) across different form factors. A tricky challenge when you’re dependent on many different hardware vendors. And given Microsoft is not great at achieving this across software, or even their own web sites, the odds of that happening are not great.

— original post —

Ryan Block at gdgt outlines why he thinks the new iPad retina-display specs are a big deal:

The core experience of the iPad, and every tablet for that matter, is the screen. It’s so fundamental that it’s almost completely forgettable. Post-PC devices have absolutely nothing to hide behind. Specs, form-factors, all that stuff melts away in favor of something else that’s much more intangible. When the software provides the metaphor for the device, every tablet lives and dies by the display and what’s on that display.

So when a device comes along like the iPad that doesn’t just display the application, but actually becomes the application, radically improving its screen radically improves the experience. And when a device’s screen is as radically improved as the display in the new iPad, the device itself is fundamentally changed.

Whilst the article emphasises the new Retina display introduced with the latest iPad, the same can be said for other sensory inputs and outputs – the sensitivity of the screen to touch (for swiping, input etc.) and the reaction of apps to motion.

The first mobile phone I used that involved a pure touch-based user interface, i.e. no physical keyboard, was the HTC Hero running Google’s Android OS. For me it was a step change in how I used a phone and I loved it from the start. The ability to quickly swipe across screens and retrieve or view different data, whether it was to check emails, find a contact, follow a map, send a Tweet… it was a jump in productivity for me. Until…

… the iPad launched.

Having always been a fan of tablets and frustrated by the lack of progress, it was an easy decision to get one and see if the device was worthy of the hype.  I still have it 2 years later and it’s an integral part of my daily work.  I don’t still have the HTC Hero.  Because once I started using the iPad, the way I touched screens altered. The iPad was way more responsive (read: reacted to a much lighter touch) than the HTC Hero. All of a sudden, I’d go to swipe the screen on the phone and it wouldn’t respond. I’d have to swipe again, but harder.  It was nothing compared to the lack of sensitivity on Windows touch-enabled phones but it was enough to be annoying.

Then there’s the thought behind motion on Apple devices. I was delighted the first time I moved the iPhone from my ear to look at the screen (on a dreaded automated call that required keyboard input) – the keypad automatically appeared. On the HTC Hero, I was forever accidentally cancelling calls because it didn’t do that, you had to push a button to reactivate the screen and I’d invariably press the button that ended the call. Doh!

That’s why 2 years later, I now also have an iPhone, albeit the ageing 3GS model. Everything about how it responds to my actions trumps the alternatives I’ve tried. That’s the challenge facing Apple’s rivals. Tablets will, in one form or another, become a standard part of the typical workplace in the coming years. And Apple’s devices are setting the bar for what people have become used to. Alternatives need to either be a lot cheaper or do something fundamentally different that the iPad can’t.

Related blog posts

Why design means compromise

Catching up on podcasts, I was recently listening to ‘An hour with Bill Buxton’ recorded at Microsoft’s Mix conference in 2010. Bill Buxton is Principal Researcher at Microsoft Research and an early pioneer in human-computer interaction.
