AI threatens privacy

If history is an accurate predictor of the future, then artificial intelligence is likely to cause moral panic in some, and to put ethical institutions under pressure, much as steam trains and televisions did when they were considered ground-breaking innovations.

Is Artificial Intelligence similar?  

Professor Toby Walsh

In his book 2062: The World That AI Made, Toby Walsh writes about 'The end of human values', 'The end of politics' and 'The end of privacy'. His chilling insights are based on sound technical knowledge, and we are left with perplexing ethical and moral questions.

Could robots really take over? Will people upload their consciousness to a computer? Will we replace all our body parts with robot machinery? His glimpse into the year 2062 rightly leaves us with a desire to tackle important ethical and moral questions now, so that we may safely, and freely, navigate our futures.

In Australia today we take many forms of privacy for granted. We have a right to our private thoughts, to choose our religion and our preferred sexuality. 

But very soon the sheer sophistication and power of big computing, let loose on our data obtained from social media, mobile phones, surveillance cameras, and even appliances within our own homes, will jeopardise our much-valued privacy.

Data harvested from bank accounts, Fitbits, health apps, and even DNA information collected by organisations like Ancestry.com, can already be analysed to reveal facts unimaginable even to those who agreed to share their information.

Interestingly, when I asked students at Bath Spa University in 2017 whether they objected to learning analytics programmes monitoring their study habits, they were confused. They simply did not understand that the accumulated data might compromise their privacy. Even when this was fleshed out in stories in which their movements and their use of the library and online study guides were tracked, they remained unfazed. Their lack of foresight was worrying then. Today, all international student movements in the UK are monitored.

The Cambridge Analytica scandal is a cautionary tale. Cambridge Analytica aggregated Facebook data and identified precisely those voters who could be swayed. The results were sold on to political influencers, who used them to target electors' often inchoate preferences, subtly shifting their voting choices. Here, the private data of individuals was used secretly to alter the public political landscape.

Facebook users were certainly unaware that their personal details could be harvested for political purposes. Facebook, Google and Amazon are in the business of selling the 'private' data they gather as they offer 'free' services. It was never reasonable to assume that such data would not be used against individuals' own conceptions of their interests.

Legislation inevitably lags behind technology. Twenty years ago, Priscilla Regan, in her book Legislating Privacy: Technology, Social Values, and Public Policy, argued that privacy had been construed legally in terms of individuals' rights to control information, communication and thoughts. Doing so reduces privacy to a matter of personal, individual well-being. Individual well-being can then be portrayed as one value competing with others.

Privacy can be seen as a luxury only possible in prosperous Western economies. Even in the Western democracies many, like my students, may not care, or may not think that they will be affected by sharing their information. The Cambridge Analytica scandal shows that to be a foolish mistake. Privacy is not just about individual well-being; it is important as a social construct, with social consequences.

A society in which individuals' data cannot be too easily aggregated is a precondition for democracy. That is the aim of the European General Data Protection Regulation (GDPR). It was introduced in 2018 to great fanfare, amid fears that it might impede free trade and banking. The underlying driver was the right to personal control of private data. The GDPR reflects a deep-seated philosophical belief in human autonomy and privacy.

In Australia, however, the federal government has commenced its rollout of similar consumer data legislation governing the sharing of financial data between institutions and third parties, such as another loan provider. Some feel the legislation favours the big banks, and that there are no provisions for consumers to own and control their own data. That may prove unpopular with voters, who resoundingly rejected the federal government's health plan to collect and hold data on behalf of patients.

Do Robots Have Privacy Rights?

And what of robots in the new world? At first glance, they do not have a right to privacy: they are machines and lack consciousness. Why should the thoughts of a machine be private?

Alan Turing

Alan Turing, the modern founder of computing, framed a test still known as the Turing Test. A computer passes the test if talking to it is indistinguishable from talking to a person. Ian McEwan's new novel, Machines Like Me, describes an alternative universe in which Turing was not driven to suicide. His brilliance has been put to good use, and humanoid robots have been developed. A rather feckless human being, Charlie, is able to buy Adam, one of the first batch of humanoid machines. Adam begins slowly but learns rapidly, and quickly passes the Turing Test: a human evaluator would judge him to be human.

McEwan scrupulously describes Adam's behaviour from an external, public perspective; as readers, we cannot help but attribute to him a rich private life. Adam not only beats the stock market on behalf of Charlie, he develops ethical beliefs which, rigorously applied, appal Charlie to such an extent that he disables the machine. For the elderly Turing, this is tantamount to murder. The reader is left wondering: does Adam have a right to privacy?

On the surface this may seem like an easy question. But those developing conversational AI already boast that they have created algorithms that ensure you will love your robot, perhaps even more than you love your own partner. Will the smart robots of the future have a right to data privacy?

Do we own our own thoughts? 

The notion of privacy determined by individual ownership of data is complex. It assumes that each person has, and should have, privileged access to their own ideas. But do we? Wittgenstein's Private Language Argument is a thought experiment. It goes like this. We tend to think that when we say 'I have a pain', we refer to a purely private sensation. We say: 'There is the same pain again'. But how do we know it is the same pain? Wittgenstein says that for all we know it might be a different pain that we mistakenly identify with the first.

We cannot be sure because there is no external way of checking. Now carry this over to language. How would we know we were using words the same way without an external criterion to ensure that the words mean the same each time we use them? Meaning, said Wittgenstein, is use. A language is public and shared, so that there is an external measure of what is meant. Our own ideas are not purely private. Some philosophers, Dan Dennett among them, go further: the whole idea of an inner self, a consciousness, is a chimera.

Adam is a machine: he is not physiologically a person. But does this make a philosophical difference? If there is no private consciousness, then Adam is no different from the rest of us. McEwan leaves us wondering.

In the acknowledgements to the novel, McEwan thanks the philosopher Galen Strawson for debunking the Private Language Argument and Dennett's arguments against consciousness in robots.

For Strawson, our private consciousness is fundamental to our role in the world. What then of Adam? Is he conscious? Does he have a private life? If not, is there anything wrong with Charlie's turning him off? Do we have a duty of respect to a complex intelligence, if it manifests the behaviours of a person?

The novel leaves the questions open, with Adam embodying (as it were) the dilemmas. Some of the force of the novel lies in the knowledge that for Turing himself (as for Wittgenstein) being a homosexual was a private matter. 

The secrets of private selves may be dangerous. Yet both men set about reconfiguring the divide between the private self and its public manifestations in talk.

Whatever your views, the question of whether humans and robots have a right to privacy is not something we should leave for future debate. The trillion-dollar data-sharing market is opening up around the globe as I write.