Mobile and movement matters


Summary: Location-aware applications continue to be one of the biggest growth areas for mobile scenarios. Putting motion into the mix will bring about a fresh round of innovative solutions.

So the new iPhone announcements this week have been met with a resounding ‘meh’ from tech pundits and Wall Street. Whenever people start saying Apple has got it all wrong, there are usually some valuable titbits in the details. And whilst a lot of the focus has been on the colourful new iPhone 5C range, which appears to be merging with the iPod Nano, the interesting innovation is, unsurprisingly, in the more expensive iPhone 5S.

The iPhone 5S includes a new chip called the M7, a ‘motion coprocessor’ that can track and log the motion of the phone with minimal battery use. This is significantly more efficient than current methods, which require apps to run continuously in the background to record activities. Instead, apps can simply read the logged data when they are launched. Whilst the focus has been on the impact on fitness apps, that’s just one narrow market. The world of work is increasingly going mobile, and being able to adapt alerts, feedback and notifications based not just on location but also on current movement opens up new possibilities.
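For the developers reading, the shift looks roughly like the sketch below. It uses Core Motion’s CMMotionActivityManager in Swift; take it as an illustration of the ‘read the log on launch’ idea rather than official Apple sample code, and note that the 24-hour window and the activity names summarised are my own choices.

import Foundation
import CoreMotion

// Rough sketch: instead of keeping the app alive in the background,
// ask the motion coprocessor for the activity it logged while the app was closed.
let activityManager = CMMotionActivityManager()

func summariseRecentActivity() {
    guard CMMotionActivityManager.isActivityAvailable() else {
        print("No motion activity data available on this device")
        return
    }
    let now = Date()
    let yesterday = now.addingTimeInterval(-24 * 60 * 60)
    // One query at launch returns the last 24 hours of logged movement.
    activityManager.queryActivityStarting(from: yesterday, to: now, to: .main) { activities, error in
        guard let activities = activities, error == nil else { return }
        let walkingSamples = activities.filter { $0.walking }.count
        let drivingSamples = activities.filter { $0.automotive }.count
        print("Walking samples: \(walkingSamples), driving samples: \(drivingSamples)")
    }
}

The point is that the app does no background work at all; the coprocessor has been keeping the log regardless.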

Whilst a lot of people talk about the visual and physical design of Apple devices, what impresses me more is the thought and detail that goes into designing everyday interactions. An example I frequently give: when you are on a phone call, the simple act of moving the phone away from your ear reactivates the display with the keypad already showing. It’s a simple thing. The chances are, if you are moving the phone away from your ear to look at it, you are talking to an automated system that requires you to key in something. If the display remains in sleep mode, you have to activate it first. On my previous phone (running Android), you had two buttons to pick from and a 50:50 chance of cancelling the call instead. The devil is always in the details.

Location-aware applications continue to be one of the biggest growth areas for mobile scenarios. Putting motion into the mix will bring about a fresh round of innovative solutions. And I am guessing we can expect to see the hardware move beyond the mobile phone to wearable devices, if/when Apple finally produces the long-rumoured iWatch. Google is already heading down that path with Google Glass. Yes, other companies have recently announced smart watches. But I’ve yet to see one a non-geek would want to wear. Apple continues to be the leader in designing for the mainstream and ignoring the technical and financial experts who think they know better.



Flickr image at the start of this post: ‘The Operating Table’, kindly shared by Blake Danger Bentley. I was looking for something to symbolise extended battery life but this was far too much fun not to use instead.

Mobile relationships and lifelines

Mobile use in Kwali

An interesting article over on the Harvard Business Review blog network looks at the rising use of mobile devices for providing healthcare and financial advice to women in developing countries. One of the challenges faced: how to build relationships through a medium that offers no visual cues.

And a great soundbite explains how one company is tackling traditions that can inhibit the use of mobile phones for women in some cultures:

“A lot of big organizations will say we need to get every woman a phone. But it’s not that simple. Women can get in trouble for having a phone — what we’re trying to do at FrontlineSMS is to make phones such useful tools that it becomes a financial liability for the family to not let the woman have a phone.”

I can’t add much to the article and recommend reading it in full. I’m sharing it here because I love stories about new technologies making a positive difference in the world.



Flickr image ‘User Testing in Kwali’ kindly shared by The Reboot

Phones and Accessibility

When the iPhone was first announced in 2007, the one aspect I thought could be a bad idea for at least one group of people was the lack of any keypad. The all-glass interface seemed distinctly unfriendly because you couldn’t feel the keys – at the time, an essential way for people with poor or no sight to use a phone.

But there have been other developments in that time too. Below is an amazing video of a blind person using his iPhone to take pictures and post them to Facebook. Keypad interaction not required. But what’s most beautiful is the comment ‘I never had a camera before’. It’s a powerful example of how technology can be a great leveller and bring equality in so many different ways.

Video source: Kottke.org – How blind people use Instagram

Can apps be open and trusted?

Summary: Technical experts may criticise Apple’s strict approach to the app store and what apps are allowed on it. But never underestimate the human need to feel in control of, and trust, personal devices such as mobile phones. Walled gardens have their benefits.

One of the arguments for open source versus proprietary closed source software platforms has been that ‘many eyes’ developing and testing the code will result in a more secure and stable platform with fewer risks or bugs. Also, that there will be less opportunity for ‘vendor lock-in’, meaning better choice and control for customers.

Any discussion about open vs closed systems will often include polarized views. In practice, each has its benefits.

I prefer to drive privately-manufactured cars, trusting that the vendor will have completed sufficient testing to guarantee the safety of the car within certain driving parameters. I expect my car to cope with a certain amount of road surface water, a necessary requirement living in the UK, but I know it is not designed to be a boat. The fuel I put in the car is also likely to be privately manufactured, but delivered in a standard format. And I prefer the UK’s public roads – the ‘open source’ part of this analogy – expecting them to be mostly of a suitable width and quality to drive on. Private roads can be an entirely different experience – some better, some worse. And the better ones usually come with additional or hidden fees.

The same can be said about software. I don’t mind using proprietary software (the vehicle) but I prefer the data and communications (the fuel and the road) to be in open or standard formats. The reason for this ramble is a recent series of articles about mobile phones and criticism directed towards the closed nature of different vendors’ app stores.

One news article – Microsoft balks at Apple’s 30% fee – describes how updates to Microsoft’s SkyDrive app for iOS have been rejected by Apple for not complying with the app store guidelines, causing problems for the app and any other apps that integrate with it (SkyDrive is a cloud-based file storage system). The article focused on the subscription model as the cause of the issue. Whilst money may have had something to do with it, I don’t think it is the only reason. Apple has very strict criteria for how apps are installed and updated. From the content of the article, it seems the issue is with apps having features that link to external web sites. The risk is that such links could lead to updates taking place outside of the app store, making it possible to bypass all review processes and restrictions.

Another article – Google’s Android malware scanner detects only 15% of malicious code in test – describes the concern with methods that allow apps to be updated outside the relevant app store. How easy is it for someone to distribute an app containing malicious code? Very easy, it would seem. App stores aren’t completely immune, but how easy or difficult it is to release naughty apps depends heavily on the app store’s review process.

It is possible that the Microsoft-Apple app store squabble is just about who gets what cut of what fees. And technical experts may criticise Apple’s closed approach to installing apps on the iPhone. But we should never underestimate the importance of feelings such as trust and control. Those feelings matter to people, and all the more so when it comes to personal mobile devices. Walled gardens have their benefits.


Flickr image courtesy of Martin Cathrae

Can tablets enhance productivity?

Picture by ThomasThomas, via Flickr

…no, this post is not about the ‘swallow with water to augment human capabilities’ kind. But the answer is the same.

One of the biggest criticisms of tablet devices such as the iPad is that they are only good for consuming content, not for creating it. And it’s a fair argument, particularly for high-end content creation activities such as video editing and software programming. However, those are not everyday activities for the majority of people in work. For most people, content creation involves Microsoft Office or something similar: word processing, spreadsheets and presentations.

Even for the everyday content creation tasks, a tablet struggles to compete with a traditional computer, desktop or portable. My work kit comprises a MacBook Air and an iPad. If I could only take one on a work trip, it would be the MacBook Air. On holiday, however, the iPad wins, with a slim keyboard in the luggage just in case. Creating a detailed proposal is hard work on an iPad. But I am increasingly using the two together: I find it easier to research and create visual layouts and concepts on the iPad, while the MacBook Air wins when typing is required.

But that’s an argument about content creation, not productivity. Just how many of the documents being created every day in the workplace help improve productivity? How many reports get read from start to finish and used to make an informed decision, versus justifying decisions already made? Maybe it’s time to start asking how much unstructured content really needs to be created, versus updating forms and applications on the go.

Whilst digital technologies continue to transform how we create, share and consume content, communicate with others and make decisions before acting, activities in most workplaces continue along traditional lines. People walking around with printed files that are out of date before they even get to the meeting. Notes being filed and forgotten. Hierarchy trumping evidence.

For tablets to have a real impact in the workplace from a productivity standpoint requires a rethink about what activities really matter in the workplace.

Thanks to ThomasThomas for the Flickr image used in this post

What really matters on a tablet

Touch a tablet

…is how responsive the screen is to interaction: visual, touch and motion. That’s where Apple is succeeding and others, Android- and Windows-based devices, are failing to compete. It also needs to be ‘Instant On’ like a mobile phone, but most have at least cottoned on to that.

— Update 30th March —

Steven Sinofsky has just posted details about touch hardware and Windows 8 on the Windows 8 blog, expanding on information first shared during the Build conference last October. We attended that conference and have a Samsung Windows 8 tablet prototype in our R&D lab. The touch sensitivity is certainly far superior to previous Windows devices we’ve worked with. But it still doesn’t match the iPad (v1) that I’m using for day-to-day tablet activities.

Somebody has added a great comment to Steven Sinofsky’s post:

Is there any support for great track pad for the laptops ? Many people (programmers, businesses) have the traditional laptops (may be with touch screen in future) but having a great quality touch with same gesture support will be THE feature for Windows PCs going forward

Spot on question. My primary hardware devices at the moment are an Apple MacBook Air, Apple iPad (v1) and Apple iPhone 3GS. All three have comparable sensitivity in the touch department. No, the MacBook Air does not have a touch-screen, but the trackpad is so good that I no longer use a mouse. The same cannot be said for any Windows-based laptop I’ve worked with. Microsoft needs to get better at linking the experience (I hate using that word but can’t think of a better one) across different form factors. A tricky challenge when you’re dependent on many different hardware vendors. And given Microsoft is not great at achieving this across its own software, or even its own web sites, the odds of that happening are not great.

— original post —

Ryan Block at gdgt outlines why he thinks the new iPad retina-display specs are a big deal:

The core experience of the iPad, and every tablet for that matter, is the screen. It’s so fundamental that it’s almost completely forgettable. Post-PC devices have absolutely nothing to hide behind. Specs, form-factors, all that stuff melts away in favor of something else that’s much more intangible. When the software provides the metaphor for the device, every tablet lives and dies by the display and what’s on that display.

So when a device comes along like the iPad that doesn’t just display the application, but actually becomes the application, radically improving its screen radically improves the experience. And when a device’s screen is as radically improved as the display in the new iPad, the device itself is fundamentally changed

Whilst the article emphasises the new retina display introduced with the latest iPad, the same can be said for other sensory inputs and outputs – the sensitivity of the screen to touch (for swiping, input etc.) and the reaction of apps to motion.

The first mobile phone I used that involved a pure touch-based user interface, i.e. no physical keyboard, was the HTC Hero running Google’s Android OS. For me it was a step change in how I used a phone and I loved it from the start. The ability to quickly swipe across screens and retrieve or view different data, whether it was to check emails, find a contact, follow a map, send a Tweet… it was a jump in productivity for me. Until…

… the iPad launched.

Having always been a fan of tablets and frustrated by the lack of progress, I found it an easy decision to get one and see if the device was worthy of the hype. I still have it two years later and it’s an integral part of my daily work. I don’t still have the HTC Hero, because once I started using the iPad, the way I touched screens altered. The iPad was way more responsive (read: reacted to a much lighter touch) than the HTC Hero. All of a sudden, I’d go to swipe the screen on the phone and it wouldn’t respond. I’d have to swipe again, but harder. It was nothing compared to the lack of sensitivity on Windows touch-enabled phones, but it was enough to be annoying.

Then there’s the thought behind motion on Apple devices. I was delighted the first time I moved the iPhone from my ear to look at the screen (on a dreaded automated call that required keypad input) – the keypad automatically appeared. On the HTC Hero, I was forever accidentally cancelling calls because it didn’t do that: you had to push a button to reactivate the screen, and I’d invariably press the button that ended the call. Doh!

That’s why, two years later, I now also have an iPhone, albeit the ageing 3GS model. Everything about how it responds to my actions trumps the alternatives I’ve tried. That’s the challenge facing Apple’s rivals. Tablets will, in one form or another, become a standard part of the typical workplace in the coming years. And Apple’s devices are setting the bar for what people have become used to. Alternatives need to either be a lot cheaper or do something fundamentally different that the iPad can’t.
