I am currently 60% through Kazuo Ishiguro’s latest novel, Klara and the Sun. In recent years, the British writer has pivoted away from psychological family dramas towards science fiction and fantasy. Klara and the Sun, in fact, is narrated by an artificial friend (or “AF”) called Klara, who relates her experiences, beginning as a robot in a store window and continuing through her adoption by Josie, a lonely teenage girl.
I don’t know where this book is going, but I can say already I’m quite attached to Klara, even though I know she is only a highly advanced computer with a human-like body. This is not the first time a fictional computer has won my heart.
There’s a reason Data is one of fans’ favorite Star Trek characters. It’s the same reason Ishiguro was able to write a novel from an android’s perspective.
We all would love to have a friend who accepts and supports us unconditionally. Somebody who is always dependable and has few other motives in life than to be there for us. Someone who doesn’t criticize us, tease us, use us, or abandon us. Total commitment, in our noncommittal world, but not in a clingy way—we want to be able to walk away, too.
Klara tells the reader about her “duty” towards Josie. This means getting to know Josie really well, avoiding conflict between the two of them or with family members, and protecting her from negative situations. Klara’s self-sacrificial behaviors can be viewed as actions of love. At least, they sort of would be, if a human were doing them.
I say sort of because there’s an irony here. Klara has zero free will. She has a certain autonomy, meaning she can say she is quite willing and happy to be Josie’s friend—and mean it, to an extent. But Klara has no real will outside of her software and what that software can interpret from her surroundings—analyzing big data, finding patterns, and converting her data into opinions, perspectives, and something like emotions. The software, of course, has conditioned her to be completely focused on her friendship role.
A few years ago, MIT created an AI called “Norman.” Norman was purposefully exposed to negative imagery to see what kind of “perspective” he would have compared to other AI. Unsurprisingly, this AI had a very bleak, psychopathic view of the world. Far from neutral entities, AI algorithms mirror real-world, human society.
While I don’t see Norman becoming mainstream anytime soon, I think the Klara and Data bots are real possibilities and of equal concern… as much as I think it would be cool if they were real. Yes, in my opinion, Klara and Data are as scary as Norman. “But Marian,” you say, “what possible harm can a friendly and helpful robot pose?!”
There was a striking image in the beginning of the novel—a girl crossing a street, trailed several paces behind by an AI boy. Later on, Klara’s neighbor asks her how they should be interacting. Is Klara just on the level of a “vacuum cleaner”? The neighbor apologizes for being rude to her, but her confusion remains.
Aside from the ethical and religious issues posed by creating Klara and Data, the problem is as it always is—humans themselves.
There is a current trend towards dividing the world into two groups: unconditional, no-conflict, full-support friends versus “haters.” The first group submits their entire will and identity to the person they are befriending—complete servility. The second group is ostracized as subhuman.
But you do yourself a disservice if you only surround yourself with people who agree with you…AKA ye olde echo chamber. A world where you can buy an AF means you can have a whole family of people who do what you want, 100% of the time. In one scene, Klara is hurt by Josie’s treatment of her—hurt, at least, in the sense she can detect something is not right—but Klara views this as a hint to reevaluate Josie, not to hold Josie to higher standards. In a real-world relationship, Klara would probably call out Josie, they’d have a proper fight, and Josie would have to apologize.
Don’t get me wrong; a level of tolerance and support is imperative in a decent society. I’m not saying we should be rude to each other—absolutely not. What I’m saying rather is that we need the balance and the complexity of real human relationships. Demanding total conformity from our friends (real or artificial) means we do not experience the kind of conflict that is necessary for self-improvement or behavioral boundaries. Sometimes the loss of a friendship can teach us something, too. You can only have this kind of dynamic, healthy relationship if both sides are equal and have free will.
Lastly, and Ishiguro hints at this—a world where AFs live side-by-side with humans means that we have established a social hierarchy of the human and the subhuman. A slavery mindset on the part of the humans is unavoidable, because Klara and her kind are only as “real” as we want them to be. We can treat them like real friends if we want to. We can treat them like servants or even trash, if we get tired of them. They are upgradeable, replaceable. Their fluid humanness absolves us of any responsibility or guilt, while intertwining with the deepest, most human emotions we can experience, because they look a lot like us. We become the psychopaths at that point.
It is too soon to say whether I recommend this book or not, but it is certainly thought-provoking so far.
I’ll be interested to see how it turns out… universal tolerance is appealing, and sharing thoughts and words with others not on the same wavelength is a nice ideal, but I can’t say I’ve seen very much of it at all over the many years I’ve been around…
I just finished the book last night… mixed feelings!
I didn’t read your review because of spoilers, but I saw Thoughts on Papyrus’s. The plot sounds fascinating!
I just read her review – I have to concur, it really is a “commercial” book, which really pains me to say, since I know he is capable of much more. 😕
Interesting… What exactly is Klara’s ‘function’? To be an unconditional friend, companion, ‘slave’? It could be designed to be one of those ‘echo chambers’ you mentioned, but it could also be designed to assist Josie in her growth to full adulthood. Then there’s Klara’s legal status. Is it ‘just’ a machine to do with as you wish, or does it have a modicum of ‘rights’ in Josie’s treatment of it? LOTS of food for thought though!
One last thing: What religious issues are there in the creation of Klara? I can imagine some of the ethical issues in such things (heavily explored in the recent ‘Westworld’ TV series) in the way we treat future ‘robots’ and what that does ultimately to our own psychological makeup but don’t ‘get’ the religious ones. Presumably these are in the region of ‘creation’?
Yes, the book asks all those questions, and I think the answers were fairly ambiguous. 🙂
Religion-wise… the problem I see is that if we are making human-like replacements, then it means humans aren’t doing their job (say, caring for the elderly or even just being friends to each other). Plus we’d be substituting soulless objects for roles only humans should fill – children, romantic partners, etc. Ishiguro touches on that, too, although he didn’t explore it as in-depth as I was expecting.
People do get attached to their devices – phones, cars – so I imagine that they’ll get attached to their ‘robots’ too. Robot carers are definitely coming and will probably be a good idea. Twenty-four-hour sleepless attention will probably save lives and make a lot of people’s lives better. I can see people having machine friends. If the AI is good enough, I think conversations could be really interesting. AI therapists are definitely going to happen. I think there will definitely be a market for robot lovers. There’s a market for everything, and when there’s money to be made… I doubt it’ll be any significant numbers, though. Robot companions of all kinds are coming. Not soon, but likely in your lifetime. I wonder how we’ll deal with that… [muses]
I’ve never considered Data or robots in this light; it’s an interesting observation, and one echoed in Sherry Turkle’s various studies of machine-human interaction — the most recent being Alone Together. She commented on the attachments people would develop for inanimate things, personifying them through their imagination and preferring their ‘company’ to that of real people… and compared it to the way we now treat people we know online, or even interact with online though we know them IRL, as objects who serve at our pleasure and are ignored when they’re not pleasing.
I think androids like Data have added appeal in storytelling, serving as a lens to view humanity with, or to analyze us ‘logically’. Star Trek brims over with outsider characters: Spock, Data, Odo, The Doctor/Seven, T’Pol, etc.
I’ll have to check that one out! I’ve been getting into digital minimalism topics recently… it seems like we know so much about mental health and the internet, and yet actually separating ourselves from it (or at least, using it healthily) is an unwieldy challenge.