
Is the use of machines to evaluate and predict human behaviour a de-humanising act? Should a birth-to-death record of our actions be used in judgement throughout life? How about through generations…?


China is currently piloting a ‘social credit’ system with a view to rolling it out to the full population by 2020. Specific details of the system are not public knowledge, but a variety of articles have revealed some insights into what it might mean for the lives of citizens. A low score, or a failure to complete expected activities, could lead to being prevented from travelling on planes or being limited to certain travel classes on trains. A high score could bring rewards such as discounts on utility bills or being able to borrow more money at preferential interest rates. Play too many online video games? Perhaps your internet access should be limited… Could your parents’ behaviour be indicative of your future actions? Perhaps it should be used to determine who goes to university…

But it isn’t just China. It turns out your chance of getting a job just about anywhere could increasingly be influenced by online social media. The latest system to be called out for breaking the rules on Twitter and Facebook is a service offering an ‘automated baby-sitter ranking system’: it mines the internet for data about prospective baby-sitters to produce a risk rating for their likely behaviour. For example, candidates are scored from 1 to 5 (where 1 is the best and 5 is the worst) on a range of factors such as how likely they are to take drugs, to be a bully, or to have a bad attitude…

One example provided in The Washington Post explained how, despite clearing criminal background checks, comments from other parents and a face-to-face interview, a candidate’s prospects were thrown into doubt when an AI scored her at 2 instead of 1 on some measures. With no information or context about what she might have done to miss the perfect score, the parents began to doubt her suitability:

“Social media shows a person’s character…
so why did she come in at a 2 and not a 1?”

Whilst the use of a social credit system is a worrying development, the response of the parent is just as chilling. What have we become? That comment means a scale of 1 to 5 isn’t even required; a simple binary – Yes or No – answer will do. Because anything less than a perfect score and you are out!

There are a host of issues with this trend. For starters, using behaviour analytics to profile individuals based on their digital data traces is barely better than crystal-ball gazing. It lacks the context needed to explain behaviour, given that most of it occurs in social settings. What does a score of 2 instead of 1 out of 5 on some obscure measure of ‘respect’ even mean? Will I be judged on the content of this post? Has it got too many question marks in it? Maybe that makes me an uncertain individual who has difficulty making decisions????? (or suggests someone with sarcastic tendencies…).

Second, the use of AI in scenarios that can affect the health, well-being and prospects of any individual needs the same level of rigour we demand from traditional statistical methods. What is the confidence level in the result? What is the range of variance or potential for error? Can the result be challenged? What data was used? Is it robust, representative and reliable? A lack of convincing answers to any of those questions is cause to doubt the usefulness of the AI* in question.
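
To make that point concrete, here is a minimal, purely illustrative sketch (in Python, using made-up scores rather than anything from the baby-sitter service) of the most basic of those questions: how much uncertainty sits behind a headline score?

```python
# Minimal, illustrative sketch only: hypothetical per-factor risk scores for a
# candidate (1 = best, 5 = worst), not real data from any scoring service.
import math
import statistics

scores = [1, 2, 1, 1, 2]

mean = statistics.mean(scores)    # the headline score a parent might see
stdev = statistics.stdev(scores)  # sample standard deviation
n = len(scores)

# Rough 95% confidence interval for the mean using a normal approximation;
# with this few data points a t-based interval would be wider still.
margin = 1.96 * stdev / math.sqrt(n)

print(f"headline score: {mean:.2f}")
print(f"approx. 95% interval: {mean - margin:.2f} to {mean + margin:.2f}")
# With only five noisy inputs the interval is already about a point wide on a
# 1-to-5 scale, so treating '2 rather than 1' as a hard fact is shaky at best.
```

Nothing exotic there; it is exactly the sort of error bar we would demand of any traditional measurement before letting it decide someone’s prospects.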

Third, have we forgotten what it is to be human? Are personality traits fixed genetically at birth? Are all our actions pre-destined? Is there no free will whatsoever in this universe? Maybe… I hope not. Because the consequences of that belief are even more chilling than a nation running a cradle-to-grave social credit system. Do we really desire a world where everybody is identical? Achieving the same perfect score on every personality trait that can be measured? We should be careful what we wish for…

It’s interesting that at the same time as people seem happier to trust an opaque algorithm over human cognition, there is a growth in courses promoting meditation and mindfulness as a counterbalance to the pressures of life. We create those pressures when we demand perfection, in ourselves and in others.

There is so much potential for AI to enhance and extend our cognitive capabilities. But that opportunity will be lost if we forget what it is to be human in the process.

References

* AI in this context means any non-human algorithmic process used to determine a response.


Featured image: istockphoto (licensed for this site only, not for reuse)

Join the conversation! 2 Comments

  1. Reminds me that you need to be careful about what you measure because people will find a way to game the system – even if the system is life.

  2. That they will. It is a little depressing that we choose to copy the binary world of machines rather than embrace diversity, acknowledge we’re all fallible and forgive past mistakes.

