Data stolen from Ashley Madison shows that weak passwords remain the most popular. Simply guessing six basic number sequences, in order from 1 – 5 to 1 – 0, would net 225,000 accounts.
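Read literally, that claim fits in a few lines of Python. This is only a sketch: it assumes "from 1 – 5 to 1 – 0" means the sequences "12345" through "1234567890", and the cracked-password counts below are invented for illustration, not figures from the breach analysis.

```python
# The six "basic number sequences": "12345", "123456", ... "1234567890"
# (assuming that is what "from 1-5 to 1-0" means).
sequences = ["".join(str(d % 10) for d in range(1, n + 1)) for n in range(5, 11)]

# Hypothetical tally of cracked passwords (password -> number of accounts);
# these counts are invented for illustration only.
cracked = {"123456": 120000, "12345": 48000, "password": 39000, "1234567": 10000}

# Sum the accounts whose password is one of the six trivial sequences.
accounts_netted = sum(n for pw, n in cracked.items() if pw in sequences)
print(sequences)
print(accounts_netted)
```

With real counts from the leak, the same three-line tally is how you arrive at a headline figure like 225,000.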
Summary: Technical experts may criticise Apple’s strict approach to the app store and what apps are allowed on it. But never underestimate the human need to feel in control of, and trust, personal devices such as mobile phones. Walled gardens have their benefits.
One of the arguments for open source versus proprietary closed source software platforms has been that ‘many eyes’ developing and testing the code will result in a more secure and stable platform with fewer risks or bugs. Also, that there will be less opportunity for ‘vendor lock-in’, meaning better choice and control for customers.
Any discussion about open vs closed systems will often include polarized views. In practice, each has its benefits.
I prefer to drive privately-manufactured cars, trusting that the vendor will have completed sufficient testing to guarantee the safety of the car within certain driving parameters. I expect my car to cope with a certain amount of road surface water, a necessary requirement when living in the UK, but I know it is not designed to be a boat. The fuel I put in the car is also likely to be privately manufactured, but delivered in a standard format. I prefer open source/public roads in the UK, expecting them to be mostly of a suitable width and quality to drive on. Private roads can be an entirely different experience – some better, some worse. And the better ones usually come with additional or hidden fees.
The same can be said about software. I don’t mind using proprietary software (the vehicle) but I prefer the data and communications (the fuel and the road) to be in open or standard formats. The reason for this ramble is a recent series of articles about mobile phones and criticism directed towards the closed nature of different vendors’ app stores.
One news article – Microsoft balks at Apple’s 30% fee – describes how updates to Microsoft’s SkyDrive app for iOS have been rejected by Apple for not complying with the app store guidelines, causing problems for the app and any other apps that integrate with it (SkyDrive is a cloud-based file storage system). The article focused on the subscription model as the cause of the issue. Whilst money may have had something to do with it, I don’t think it is the only reason. Apple has very strict criteria for how apps are installed and updated. From the content of the article, it seems the issue is with apps having features that link to external web sites. The risk is that such links could lead to updates taking place outside of the app store, making it possible to bypass all review processes and restrictions.
Another article – Google’s Android malware scanner detects only 15% of malicious code in test – describes the concern with methods that allow apps to be updated outside the relevant app store. How easy is it for someone to distribute an app containing malicious code? Very easy, it would seem. App stores aren’t completely immune, but how easy or difficult it is to release naughty apps depends largely on the app store’s review process.
It is possible that the Microsoft-Apple app store squabble is just about who gets what cut of what fees. And technical experts may criticise Apple’s closed approach to installing apps on the iPhone. But we should never underestimate the importance of feelings such as trust and control. Those feelings matter to people, and all the more so when it comes to personal mobile devices. Walled gardens have their benefits.
- Microsoft balks at Apple’s 30% fee, leaving SkyDrive and apps that integrate with it in the lurch on iOS – TheNextWeb, Dec 2012
- Google’s Android malware scanner detects only 15% of malicious code in test – The Verge, Dec 2012
Flickr image courtesy of Martin Cathrae
If you want to protect your intellectual property, worry less about online security controls and more about loyalty. If your employees care, they are less likely to share with outsiders.
In the past week, Microsoft has changed its standard terms of service agreements. As reported by The Verge:
Microsoft’s revised policy allows the company to access and display user content across all of its cloud properties. Whereas the previous version of the TOS granted Microsoft the right to appropriate user content “solely to the extent necessary to provide the service,” the terms now state that this content can be used to “provide, protect and improve Microsoft products and services.”
Commenters on the article noted that this was a somewhat hypocritical move. When Google made a similar change to their terms of service just six months ago, Microsoft took out adverts in major newspapers to spread a little FUD*. Covered by the IdeaLab at the time, Microsoft felt the need to advise everyone:
Google is in the midst of making some unpopular changes to some of their most popular products. Those changes, cloaked in language like “transparency,” “simplicity,” and “consistency,” are really about one thing: making it easier for Google to connect the dots between everything you search, send, say or stream while using one of their services.
But, the way they’re doing it is making it harder for you to maintain control of your personal information. Why are they so interested in doing this that they would risk this kind of backlash? One logical point: Every data point they collect and connect to you increases how valuable you are to an advertiser.
Hypocrisy aside, and pity the Microsoftie who has to keep a straight face explaining the about-turn, the changes in terms of service are no surprise. Enabling content to be integrated across services does offer the potential to improve the services and yes, also the potential to earn more money from advertising. A necessary factor when offering ‘free’ services to consumers. Somebody always pays.
Dropbox is a popular online file sharing tool and has also come in for criticism. A recent article by Varonis highlighted that Dropbox holds the keys to encrypt and decrypt your data on their servers (their emphasis, not mine). They have to, both for feature reasons – the file sharing element – and for legal reasons. What does this mean?
This means that a Dropbox employee could theoretically view (or steal) your data
O! M! G!*
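The underlying point – whoever holds the key can read the data – fits in a few lines. This is a toy sketch only: XOR stands in for real encryption (never use it as an actual cipher), and the function and values are invented for illustration.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' for illustration only -- NOT real encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# You encrypt a document and upload only the ciphertext...
key = secrets.token_bytes(32)
ciphertext = xor_cipher(b"my secret document", key)

# ...but a provider that stores the key next to the data (as Dropbox must,
# to support sharing and other server-side features) can decrypt it at will:
recovered = xor_cipher(ciphertext, key)
assert recovered == b"my secret document"
```

If you hold the key yourself and upload only ciphertext, the provider's employees see noise; the trade-off is that server-side features like sharing and previews stop working, which is exactly why Dropbox keeps the keys.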
Before you start worrying about Dropbox’s employees, look closer to home… A recent study by the Ponemon Institute found (my comments in brackets…):
- 90% of organisations in the study had experienced leakage or loss of sensitive or confidential documents over the past 12 months
- 71% of respondents say that controlling sensitive or confidential documents is more difficult than controlling records in databases (surprised it wasn’t higher than that)
- 70% of respondents say that employees, contractors or business partners have access to sensitive or confidential documents even when access is not a job or role-related requirement
- 63% of respondents do not believe they are effective at assigning privilege (permissions) to [manage] access to sensitive or confidential documents
So to summarise, most organisations do not have adequate controls to manage their intellectual property when it is in document form, regardless of where it is stored. If that’s the case, accept a simple fact. If a document exists, at some point you may lose control of it. The terms of service for online storage are the least of your worries.
So what’s a business to do?
Whilst I would not suggest throwing out the security controls – and it sounds like some organisations could do with improving them – I would encourage putting more effort (and investment) into making sure employees care. People who feel loyal to a cause will protect that cause.
Over this last weekend, Lewis Hamilton grumpily tweeted a picture of the McLaren Formula 1 team telemetry sheet, showing his and his team mate Jenson Button’s performance during qualifying. Reported today in The Times:
Christian Horner, the Red Bull team principal, could not contain his mirth as he claimed his engineers were poring over data that is usually restricted only to McLaren’s drivers and race engineers. Hamilton deleted the tweet but it was too late.
What security system could have prevented that? A photo of a print-out shared via Twitter by someone paid an awful lot of money to win races in part based on the intellectual property held in said photo. But who was having a particularly crappy weekend with the team, again… Naturally the PR machine is now in full throttle (pun intended) and McLaren claims the data loss is no big deal.
How employees feel about the company will have a far bigger influence on maintaining control of your data than any security system, digital or physical. The ‘vibe’ of the office matters more than most people realise, for so many different reasons – productivity gains, collaborative working, knowledge sharing and yes, protecting intellectual property from prying eyes.
- Updated service agreement allows Microsoft to integrate content across cloud services – The Verge, September 2012
- Marco Arment on Dropbox: Don’t use it for anything valuable – Varonis, July 2012
- What Facebook Knows – MIT Technology Review, July 2012
- 2012 Confidential Documents at Risk Study – The Ponemon Institute, August 2012
- Lewis Hamilton leaves a trail of debris after a Tweet too far – The Times, September 2012 (sub req’d)
- Lewis Hamilton Tweet has not caused us much harm [says] McLaren – BBC News, September 2012
* FUD = Fear, Uncertainty and Doubt. What competitors like to create about rivals in customers’ minds
* OMG = Oh My God/Goodness, depending on your religious slant, often delivered with a twist of sarcasm.
In the past couple of weeks there has been a series of articles raising concerns about the amount of personal data being published to online social networks and the potential for it to be used with ill intent.
There are two different scenarios people should consider before sharing personal information:
- Would I mind if a complete stranger knew that information?
- Do I mind what any of my ‘friends’ do with the information?
If the answer is Yes to either question, think twice before putting that personal information online at all. That’s not to say sharing is inherently good or bad. But once you have shared information with anyone, you have lost control of it. If you answered ‘No’ to question two, you have effectively answered ‘No’ to question one as well: your friends can pass the information on to complete strangers.
Here is a simple scenario using Facebook. In the image above, the green buddy is you. The blue buddies are your ‘friends’. The red buddies represent everyone else with Internet access.
You set up your privacy settings so that only friends can see your personal information. Anyone who is on Facebook but not a friend will only see your name, nothing else. That’s your decision. Sounds sensible. Sounds under control.
But if one of your friends decides to share information with their friends or third party applications, they may hand over your personal information as well. It can be done in complete innocence and with good intentions – ‘I want to send birthday cards to my friends’, ‘Are any of my friends nearby to meet up with?’, ‘I’m interested in this group, I’ll add my friends to it as well’, ‘Has anybody in my network bought this <insert name of any item>?’ In the right context, all great stuff. But information about you has now been handed over to and stored somewhere beyond your control. The same applies to every application or web site that you allow to connect to your Facebook profile. Do you read all the terms and conditions, the notes about agreeing to data being stored indefinitely or granting access to other third parties?
It is not just you who decides how secure your personal information is. If you decide to share it with them, all your friends get to decide too. As do all the apps and web sites you connect to. And if you’re one of Facebook’s social butterflies, everyone gets to decide.
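The transitive exposure described above can be modelled with a toy friend graph. Everything here – the names, the graph and the helper function – is invented for illustration.

```python
# Hypothetical friend graph: you share your profile with friends only.
friends = {
    "you":   {"alice", "bob"},
    "alice": {"you", "carol"},
    "bob":   {"you", "dave"},
}

def app_can_see(user, graph):
    """Profiles handed over when `user` grants an app access to their friend list."""
    return {user} | graph[user]

# Your own privacy settings never came into it: when Bob connects a
# birthday-card app, the app receives data about Bob's friends -- including you.
assert "you" in app_can_see("bob", friends)
```

The decision point that matters (`app_can_see("bob", ...)`) belongs to Bob, not to you – which is the whole problem with per-friend privacy settings.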
This doesn’t mean you should head straight to Facebook and switch everything off (too late for existing content anyway) but if you are going to participate in online social networks and care about what happens to your personal data, it’s a good idea to keep track of privacy settings and changes to policies.
If you’re not paying for a product, you’re not the customer, you are the product being sold. – Andrew Lewis
For Facebook and every application/advertising tool that uses it, it is in their best interests to get you to share your personal information. They will make it as easy and seamless to do as possible. And many will make it difficult or inconvenient to change those default settings to be more private. So think long and hard about what you want to share with anyone. And question whether having different privacy settings for ‘everyone’ versus ‘friends’ actually means anything. A simpler (and more reliable) approach is to either share something with nobody or share with everybody.
A hassle, yes. But massive online social networks are still a young concept on the Internet meaning lessons will be learned the hard way. And everyone with a Facebook account can count themselves as one of the testers.
- Selling you on Facebook – Wall Street Journal, Apr 2012
- Selling digital fear – TechCrunch, Apr 2012 (response to WSJ article)
- This creepy app is a wakeup call for Facebook privacy – Cult of Mac, Apr 2012
- Facebook: Tracking your web activity even after you log out? – PCMag, Sep 2011
There’s lots of news about the latest release of classified documents on Wikileaks. If you want to have a peek, The Guardian has a great visualisation to get started.
I had a mooch. Everything I scanned through was thoroughly boring. As is the case with most information, even the classified stuff.
Over the past 10 years, I’ve worked with numerous government organisations. When discussing intranets, collaborative sites and knowledge management systems, one of the most frequent concerns is how to secure access to information and prevent the wrong eyes from seeing it. It is no small irony that the first ever monetary fines applied by the Information Commissioner’s Office (ICO) this month were for breaches that had nothing to do with networks.
The first was a £100,000 fine against Hertfordshire County Council for two incidents of faxing highly sensitive personal information to the wrong people. The second was a £60,000 fine against an employment services company for the loss of a laptop.
Here’s another example. A fair few years ago, I was in Luxembourg to present at an EU event. The night before the meeting I was in my hotel room when an envelope appeared under the door. Assuming it was details about the event, I opened it and pulled out the documents. The first hint that the documents might not be to do with the event was seeing Restricted stamped across the top of the first page. The second indication was, when scanning the content, it became apparent the documents were something to do with nuclear weapons facilities across Europe. By that point, I looked at the front of the envelope to discover that it wasn’t addressed to Miss S Richardson (i.e. me) but was instead addressed to <insert very senior military rank I can’t remember> Richardson. A rather terse conversation took place in the hotel reception as the documents were forwarded to their rightful recipient.
All three examples above were security breaches due to stupid human error. None involved networks or bypassing security systems. But only one required legal intervention – the faxing of content to the wrong people that was both legally confidential and highly sensitive.
And that’s the rub. Most security leaks don’t matter. A few years ago, some idiot in the UK tax office HMRC downloaded oodles of personal details to a CD and then lost it in the post. If there has been a bout of serious identity theft as a result, I haven’t heard about it. Ditto for a more recent breach that managed to send addresses and bank details to the wrong people. More people have been affected by a cock-up in the calculation of tax due to incorrect data than from identity theft due to lost data.
Most of the content on Wikileaks is embarrassing to its targets (usually governments and/or large corporations) rather than dangerous. Yes, there are exceptions, such as the failure to redact personally incriminating information from documents, which could put lives in danger. But they are the exception rather than the norm we tend to assume when documents are classified as confidential.
One of the recommendations I give to clients looking to improve the use and value of their intranets is to devalue information. Make it easier to access. The confidentiality of most content is over-rated. Its importance and usefulness to other people is often under-rated.
For most organisations, content falls into one of three categories:
- Legal – fines and prison (though that is rare) may result from failing to protect legal documents
- Sensitive – contains information that could put at risk or be damaging to an individual or organisation
- Everything else
There’s no arguing over legal documents. No prizes for guessing that most content falls (or should fall) under Everything Else. And sensitive… whilst some is easily justified (such as research into a new prototype that you wouldn’t want your competitors knowing about), an awful lot is considered sensitive purely to avoid embarrassment or conflict. Sometimes people should question verbalising their opinions, let alone putting them in writing… And if you work in government, for goodness sake don’t save it on your laptop!
One closing quote whilst on the subject of securing information. Whilst it refers to anonymity, it equally applies to trying to hide information from public view, which too often appears to be the reason for confidential classifications: (apologies, I can’t recall the source)
Providing a level of anonymity is great for play but prevents accountability
References and examples:
- US Embassy cables: browse the database – The Guardian, Nov 10
- Did WikiLeaks ‘Cablegate’ result from sharing too much information? – Harvard Business Review, Nov 10
- ICO fines Hertfordshire £100,000 – Kable, Nov 10
- HMRC sends private data by accident – publicservice.co.uk, May 10
- UK families put on fraud alert – BBC News, Nov 07
- A million people face tax bills of up to £5 after recalculations – The Guardian, Sept 10
- More than 1,000 government laptops lost or stolen – The Guardian, March 08
- New batch of terror files left on train – The Independent, June 08
- More than 60,000 devices left in cabs during last six months – BBC News, Sept 08