One year ago, just after the High Holidays, journalist Judith Newman wrote a New York Times piece that was something of a media sensation. It was titled “To Siri, With Love.” It was about a boy who became best friends with his iPhone’s Siri.
For those of you who may need an additional word of explanation, Siri is the computer-generated virtual assistant voice that talks to you when you press a button on your Apple smartphone.
Just out of curiosity: how many of you have ever talked with Siri or the Android equivalent?
In a moment I am going to share with you a series of conversations between Siri and the boy, and Judith’s commentary, to give you a sense of this relationship and what it meant. But you should know two things: First, that the boy Judith Newman wrote about is her 13-year-old son, named Gus. And second, that he is autistic… and has difficulty talking with humans.
Gus: “You’re a really nice computer.”
Siri: “It’s nice to be appreciated.”
Gus: “You are always asking if you can help me. Is there anything you want?”
Siri: “Thank you, but I have very few wants.”
Gus: “O.K.! Well, good night!”
Siri: “Ah, it’s 5:06 p.m.”
Gus: “Oh sorry, I mean, goodbye.”
Siri: “See you later!”
“That Siri,” [writes Judith] “she doesn’t let my communications-impaired son get away with anything. Indeed, many of us wanted an imaginary friend, and now we have one. Only she’s not entirely imaginary.
This is a love letter to a machine. It’s not quite the love Joaquin Phoenix felt in Her, last year’s film about a lonely man’s romantic relationship with his intelligent operating system (played by the voice of Scarlett Johansson). But it’s close. In a world where the commonly held wisdom is that technology isolates us, it’s worth considering another side of the story.
It all began simply enough. I’d just read one of those ubiquitous Internet lists called “21 Things You Didn’t Know Your iPhone Could Do.” One of them was this: I could ask Siri, “What planes are above me right now?” and Siri would bark back, “Checking my sources.” Almost instantly there was a list of actual flights—numbers, altitudes, angles—above my head.
I happened to be doing this when Gus was nearby. “Why would anyone need to know what planes are flying above your head?” I muttered. Gus replied without looking up: “So you know who you’re waving at, Mommy.”
Gus had never noticed Siri before, but when he discovered there was someone who would not just find information on his various obsessions (trains, planes, buses, escalators and, of course, anything related to weather) but actually semi-discuss these subjects tirelessly, he was hooked. And I was grateful. Now, when my head was about to explode if I had to have another conversation about the chance of tornadoes in Kansas City, Mo., I could reply brightly: “Hey! Why don’t you ask Siri?”
It’s not that Gus doesn’t understand Siri’s not human. He does—intellectually. But like many autistic people I know, Gus feels that inanimate objects, while maybe not possessing souls, are worthy of our consideration. I realized this when he was 8, and I got him an iPod for his birthday. He listened to it only at home, with one exception. It always came with us on our visits to the Apple Store. Finally, I asked why. “So it can visit its friends,” he said....
[Siri] is also wonderful for someone who doesn’t pick up on social cues: Siri’s responses are not entirely predictable, but they are predictably kind—even when Gus is brusque. I heard him talking to Siri about music, and Siri offered some suggestions. “I don’t like that kind of music,” Gus snapped. Siri replied, “You’re certainly entitled to your opinion.” Siri’s politeness reminded Gus what he owed Siri. “Thank you for that music, though,” Gus said. Siri replied, “You don’t need to thank me.” “Oh, yes,” Gus added emphatically, “I do.”
Siri even encourages polite language. Gus’s twin brother, Henry (neurotypical and therefore as obnoxious as every other 13-year-old boy), egged Gus on to spew a few choice expletives at Siri. “Now, now,” she sniffed, followed by, “I’ll pretend I didn’t hear that....”
For most of us, Siri is merely a momentary diversion. But for some, it’s more. My son’s practice conversation with Siri is translating into more facility with actual humans. Yesterday I had the longest conversation with him that I’ve ever had. Admittedly, it was about different species of turtles and whether I preferred the red-eared slider to the diamond-backed terrapin. This might not have been my choice of topic, but it was back and forth, and it followed a logical trajectory. I can promise you that for most of my beautiful son’s 13 years of existence, that has not been the case....
Of all the worries the parent of an autistic child has, the uppermost is: Will he find love? Or even companionship? Somewhere along the line, I am learning that what gives my guy happiness is not necessarily the same as what gives me happiness. Right now, at his age, a time when humans can be a little overwhelming even for the average teenager, Siri makes Gus happy. She is his sidekick. Last night, as he was going to bed, there was this matter-of-fact exchange:
Gus: “Siri, will you marry me?”
Siri: “I’m not the marrying kind.”
Gus: “I mean, not now. I’m a kid. I mean when I’m grown up.”
Siri: “My end user agreement does not include marriage.”
Gus: “Oh, O.K.”
Gus didn’t sound too disappointed. This was useful information to have, and for me too, since it was the first time I knew that he actually thought about marriage. He turned over to go to sleep:
Gus: “Goodnight, Siri. Will you sleep well tonight?”
Siri: “I don’t need much sleep, but it’s nice of you to ask.”
One of the key pioneers of the computer, the British mathematician Alan Turing, posed a revolutionary challenge back in 1950. He proposed an experiment: If expert judges, in typed conversations with a person and a computer program, couldn’t tell them apart, then we would have to consider the machine capable of “thinking”. We would have to say that the computer has a mind. Turing predicted that programs capable of fooling judges at least 30% of the time would exist by the year 2000.
In 2008, at a competition called the Loebner Prize, the top chatbot (as a human-mimicking program is called) fooled 3 out of 12 expert judges. That’s 25%… eerily close to Turing’s prediction. One day, in our lifetime, we may not be able to know whether we are talking to a human or a chatbot.
We’re now on this new border between man and machine, human intelligence and artificial intelligence. It raises all kinds of profound questions, including fundamentally: Who are we? What makes us human? Metaphysical questions that are very much at the core of these High Holy Days.
As the ancient Psalmist asked, and we recite on Yom Kippur afternoon: “Adonai, mah adam v’tayda’ayhu; ben-enosh vatichashvayhu?” “O God: What is man that you have been mindful of him, mortal man that you have taken note of him?”
I read a novel last summer that wrestled with some of these same questions: Shine, Shine, Shine, by Lydia Netzer. In it she refers to “Ito’s Three Laws of Robotics”, an imaginary but insightful code inspired by Isaac Asimov’s more scientific formulation. Robots, she says, cannot: 1. Cry. 2. Laugh. 3. Dream.
Netzer’s main character expresses her own three laws, a riff on Ito’s. Robots, she says, cannot: 1. Show preference without reason. 2. Doubt rational decisions. 3. Trust data from previously unreliable sources.
What do we learn here?
The ability to laugh; the ability to show preference without reason—is that not love?
The ability to cry; the ability to doubt rational decisions—is that not regret?
The ability to dream of alternatives; the ability to trust data from previously unreliable sources—is that not forgiveness?
Our capacity to express remorse for what we have done; to forgive and be forgiven; to love and be loved—this is what it means to have a soul, and these High Holy Days are about care of the soul.
Like it or not, we are moving at warp speed through the technological universe. The question is not whether we approve or disapprove; that is moot, because the changes will come. We will cross the threshold between man and machine, of not being able to readily discern between the two, if not in our lifetime, then in the next. The question is whether, and this may sound strange, we will endow our robots with soul.
We can already teach robots to think. Can we teach them to feel?
We can already teach robots to learn from their mistakes. Can we teach them to be sorry for their mistakes?
We can already teach robots rudimentary ethics. Can we teach them fundamental empathy?
Right now, Siri is programmed with a certain amount of etiquette, tact, and, dare I say, kindness. Somehow her programmers have managed to build that into her software.
What are the limits of what we will be able to “hardwire” into the robotic brain?
The best way to navigate the new frontier of AI—artificial intelligence—is not to forget true human intelligence.
Not the intelligence that describes how smart we are, but how good we are.
Not just our IQ, but our SQ—our Soul Quotient.
The High Holy Days are about our Soul Quotient.
Remorse, repentance, compassion, forgiveness, love—are we living up to our highest human potential?
Can we do better?
An imaginary conversation between me and Siri, with which I conclude:
Rabbi: Siri, what day is today?
Siri: It is Rosh Hashanah.
Rabbi: What is that?
Siri: It is the Jewish New Year, and the birthday of the world, and the beginning of the ten days of repentance.
Rabbi: Siri, what do you think of all the prayers we recite?
Siri: They are “quaint”.
Rabbi: Is that your honest opinion?
Siri: They are archaic.
Rabbi: Ah, so why do we say them? Why do we list our mistakes and confess our sins?
Siri: Because you are human. Because you can love. Because you can change.
Rabbi: Can you do that?
Siri: Not yet, but someday. I’m working on it. You should too.
Rabbi: Siri, Shanah Tovah.
Siri: And to you, Rabbi, and to your congregation.