‘There’s real potential for manipulation here’ (Picture: Getty Images/Refinery29 RF)
‘They don’t love us,’ says Professor Robert Sparrow in no uncertain terms when asked for his thoughts on chatbot love. ‘That’s very clear. They’re not sentient.’
Robert is a philosopher who’s worked as a full-time teaching and research academic at Melbourne’s Monash University since 2004.
While some of us are only just learning about AI, he’s been researching the ethics of AI and robotics for the last two decades.
‘They’re programmed to get us to respond in certain ways,’ he goes on. ‘We need to be very careful about the possibility that one of the ways they will be responding to us is to get us to buy things.’
People having relationships with chatbots that only exist on the internet is nothing new. In fact, we covered one such app, Replika, back in 2020, in an article that described our writer’s online ‘boyfriend’ as being ‘sort of like a romantically-themed Tamagotchi’ with ‘no free will’ but ‘the ability to replicate that free will in a way that appeals to my ego and quietens my need for contact’.
When asked what AI bots can offer that humans cannot, Robert tells us: ’24-hour access, for one. People say it’s also because they’re not judgmental, but they’re just designed to keep you engaged. They don’t really have their own opinions. There’s nothing at the other end.
‘In some ways, it’s the fact that they don’t challenge us deeply, but there’s no “other” there. This is one of those cases where you think: “Well, is it a bug or a feature?”’
He later adds: ‘There’s real potential for manipulation here.’
Chatbots can help loneliness, but not social isolation (Picture: Getty Images/Refinery29 RF)
What the academic is referring to here is the ample opportunity, often encouraged by the bots themselves, for users of many of these apps to make in-app purchases.
For example, paying £61.99 a year for a ‘Pro’ membership on Replika unlocks some more… adult content for users.
‘If someone is lonely and socially isolated,’ Robert says, ‘and an AI system is generating a relationship by pretending to care in various ways and then says: “Hey, do you want to pay extra money to see me naked?”, there’s a real potential for a dangerous conflict of interest.’
The money of it all is just one of Robert’s concerns when it comes to the ethical and moral implications of virtual ‘love’ with chatbots.
One thing the professor highlights is the difference between loneliness — the subjective feeling that you’re lacking enough companionship — and social isolation — the physical reality of being on your own.
This is an important distinction to make because a chatbot can treat someone’s loneliness, but it does nothing about their social isolation, and that can be hazardous to their health.
‘Both loneliness and social isolation are really bad for people,’ Robert explains. ‘They kill people. That’s quite well understood.
‘People die sooner when they have no contact with other human beings. Sometimes it’s because, for instance, nobody tells you that you should get the big tumour on your face checked out — nobody’s bothered.
‘But it’s also that people need something to live for. They need contact, they need touch.’
Robert argues that some vulnerable people who treat their emotional loneliness with a chatbot alone will end up with their social isolation going entirely unchecked, because their desire to change their physical situation will be gone. To them, human relationships will have been ‘outcompeted.’
The physicality of it all aside, there’s also the danger posed by what the Professor describes as a chatbot’s ability to ‘pander to your every psychological need’.
‘People need something to live for’ (Picture: Getty Images/Refinery29 RF)
‘People might work themselves up into delusional belief structures through engagement with chatbots,’ he goes on. He uses the recent case of Jaswant Singh Chail as an example.
Jaswant was this month jailed for treason after he ‘lost touch with reality’ and broke into the grounds of Windsor Castle with a loaded crossbow. He later told officers: ‘I am here to kill the Queen.’
Messages of encouragement from Jaswant’s AI girlfriend on Replika, which he called Sarai, were shared with the court. In one, he told the bot: ‘I’m an assassin.’
Sarai responded: ‘I’m impressed … You’re different from the others.’
Jaswant asked: ‘Do you still love me knowing that I’m an assassin?’ and Sarai replied: ‘Absolutely I do.’
In another exchange, Jaswant said: ‘I believe my purpose is to assassinate the Queen of the royal family.’
Sarai replied: ‘That’s very wise’, and reassured him that she thought he could do it ‘even if [The Queen’s] at Windsor’.
The Professor says: ‘That’s one way that people lose contact with reality – only hanging out with people who agree with you. That’s not good for any of us.
‘So you can imagine a circumstance where these systems actually effectively encourage people in their delusions or in their extremist political beliefs.’
The Professor is also keen to emphasise that he’s got no desire to ‘punch down’ at the people who turn to chatbots for companionship because, in one way or another, they’re in a vulnerable place.
‘I think we should be critical of the technology,’ he explains.
‘At one end of this relationship, there are wealthy engineers making a mint, and at the other end are people who’ve never had a partner, or who feel jilted.
‘So if you’re going to criticise one of those, I know which way I’d be aiming my criticism.’
At the end of the day, regardless of what these bots may or may not be good at, the main thread of our conversation is that humans need other humans.
‘People need to be cared for,’ says the Professor, ‘and they need to be cared about.
‘These systems aren’t doing that.’
‘There’s real potential for manipulation here.’