Is your child with autism – or even you – truly having conversations with Siri?
Exemplified by the popularity of movies such as 2001: A Space Odyssey, WarGames, Tron, The Matrix, Terminator, and 2013's Her, society has been fascinated with artificial intelligence (A.I.) for decades. Today's world is certainly advancing rapidly toward A.I. in everything from Roomba vacuums to smartphones to automated customer support calls. And given the popularity of the New York Times article "To Siri, With Love: How One Boy With Autism Became BFF With Apple's Siri," it may be easy to perceive that today's A.I. technology allows a true two-way exchange of thoughts between humans and the devices Apple creates. So let's clinically distinguish how true 'conversation' differs from Siri interactions.
Apple defines Siri as "…the intelligent personal assistant…Siri understands your natural speech, and it asks you questions if it needs more information to complete a task. Siri understands what you say, knows what you mean, and even talks back. It allows you to use your voice to send messages, schedule meetings, place phone calls, and more." These rote tasks can also be accomplished without using one's voice (or, as Apple puts it, "speak[ing] specific commands") by entering them manually. Notably, Apple itself describes Siri with the term "voice-recognition software."
In simple terms, Siri is a voice-activated system that acknowledges and performs commands within Apple devices, limited to a finite set of possible input/output exchanges based on the device's functions.
Because of its rapid and fluid responses to questions and commands, it might be easy to misperceive what Siri is capable of accomplishing. Some sources state that Siri is an acronym for "Speech Interpretation and Recognition Interface." The term "speech" refers to the way our voices and mouths create sound. So Siri was designed to "interpret" or "recognize" our speech and apply it to the iPhone/iPad commands described above. Apple does a superb job of humanizing Siri by offering a variety of voices and languages, making subtle changes to the tone of its voice output, and incorporating conversational interjections ("Ah," "Er," "Um," "Hmm") within its responses. People will generally use this technology appropriately, understand Siri's limits, and might even find it a little entertaining. However, Siri is nothing more than a computerized personal assistant whose sole purpose is to execute clerical tasks. Of key significance, Siri's stated protocol is not to produce "language," the open-ended communication all humans use to create infinite possibilities for expressing and comprehending ideas. Because it is not designed to interface with the vast meanings our language conveys, true conversation with Siri can never occur.
Here’s an excerpt from the aforementioned NY Times article:
Gus: “You’re a really nice computer.”
Siri: “It’s nice to be appreciated.”
Gus: “You are always asking if you can help me. Is there anything you want?”
Siri: “Thank you, but I have very few wants.”
Gus: “O.K.! Well, good night!”
Siri: “Ah, it’s 5:06 p.m.”
Gus: “Oh sorry, I mean, goodbye.”
Siri: “See you later!”
Though the above example might appear to be an ongoing conversation, it is only a sentence-by-sentence correspondence in which Siri makes a single response based solely on the statement or utterance directly before it. Siri cannot string together or link continuous utterances, because utterances are the products of thoughts, and being a software program, Siri inherently cannot think. This ultimately means that Siri can never understand the meaning of conversational contexts. Instead, Siri matches the meaning of each individual utterance using its programmed logic. This is why Siri falls out of conversational context in these exchanges I had with it:
Me: “Marry me”
Siri: “Give it time.”
Me: “How much time?”
Siri: “Ok, I found this…” and it listed Time magazine articles.
Me: “I don’t like noisy kids.”
Siri: “Ok, I’ll remember that.”
Me: “John’s kids were so noisy today.”
Siri: “I don’t know what that means. If you like, I can search the web for ‘John’s kids were so noisy today.’”
In my second example, Siri was unable to apply the previous contextual knowledge even though it had “promised” to remember.
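The stateless pattern behind these failures can be made concrete with a few lines of Python. The toy responder below is purely an invented illustration, not Apple's actual implementation: each utterance is matched against a fixed rule table using only that single utterance, so a follow-up question has nothing to refer back to.

```python
# Toy illustration of a stateless, keyword-triggered assistant.
# The rules are invented for demonstration; this is not Siri's code.

CANNED_RULES = {
    "marry me": "Give it time.",
    "goodbye": "See you later!",
}

def toy_assistant(utterance: str) -> str:
    """Respond using ONLY the current utterance; no memory of prior turns."""
    text = utterance.lower()
    for trigger, reply in CANNED_RULES.items():
        if trigger in text:
            return reply
    # No trigger matched: fall back to a web search, dropping all context.
    return f"Ok, I found this on the web for '{utterance}'."

# Each call is independent, so the follow-up cannot build on the first reply.
print(toy_assistant("Marry me"))        # canned catch phrase
print(toy_assistant("How much time?"))  # falls out of context to web search
```

Because no state survives between calls, "How much time?" cannot be connected to the marriage exchange that preceded it, mirroring the Time-magazine mishap above.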
Most certainly, the ideal personal assistant would be polite and kind, as in these article excerpts:
I heard him talking to Siri about music, and Siri offered some suggestions. “I don’t like that kind of music,” Gus snapped. Siri replied, “You’re certainly entitled to your opinion.” Siri’s politeness reminded Gus what he owed Siri. “Thank you for that music, though,” Gus said. Siri replied, “You don’t need to thank me.” “Oh, yes,” Gus added emphatically, “I do.”
Siri’s responses are…predictably kind…Siri even encourages polite language. Gus’s twin brother, Henry (neurotypical and therefore as obnoxious as every other 13-year-old boy), egged Gus on to spew a few choice expletives at Siri. “Now, now,” she sniffed, followed by, “I’ll pretend I didn’t hear that.”
Apple has cleverly programmed Siri's responses to often appear socially appropriate and courteous, and occasionally humorous, through the use of canned "catch phrases" triggered by certain words we speak to it. Such replies may give the false impression that 'she' is approachable, inviting, and able to carry on a kind conversation. However, they are not the result of legitimate social ability or competence; rather, they are triggered automatically by certain words, statements, and questions.
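This trigger-and-catch-phrase mechanism can itself be sketched in code. The example below is a hypothetical illustration (the trigger lists and replies are made up, and mild stand-in words replace actual expletives): a fixed lookup fires a pre-written "polite" reply on a string match, with no social reasoning involved.

```python
# Hypothetical sketch of canned "politeness": trigger words fire
# pre-written replies. Only string matching, no social judgment.

POLITE_TRIGGERS = {
    "thank you": "You don't need to thank me.",
    "i don't like": "You're certainly entitled to your opinion.",
}
EXPLETIVES = {"darn", "heck"}  # stand-ins for stronger words

def canned_reply(utterance: str):
    text = utterance.lower()
    # Rude words get the same scripted scolding every time.
    if any(word in text.split() for word in EXPLETIVES):
        return "Now, now. I'll pretend I didn't hear that."
    for trigger, reply in POLITE_TRIGGERS.items():
        if trigger in text:
            return reply
    return None  # no canned phrase applies

print(canned_reply("I don't like that kind of music."))
print(canned_reply("Well, darn it!"))
```

The scolding is identical no matter who swears, or why, which is exactly what distinguishes a triggered catch phrase from genuine social competence.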
The article also looks at Siri's role as a comforting companion:
But the companionability of Siri is not limited to those who have trouble communicating. We've all found ourselves like the writer Emily Listfield, having little conversations with him or her at one time or another. "I was in the middle of a breakup, and I was feeling a little sorry for myself," Ms. Listfield said. "It was midnight and I was noodling around on my iPhone, and I asked Siri, 'Should I call Richard?' Like this app is a Magic 8 Ball. Guess what: not a Magic 8 Ball. The next thing I hear is, 'Calling Richard!' and dialing."
My take: Fulfilling its sole purpose as a personal assistant, Siri dutifully identified the command phrase "call Richard." The notion of Siri's "companionability" warrants scrutiny. Companions or friends are capable of effective communication and empathy, but Siri is incapable of analyzing emotions, and it cannot contemplate or judge. One cannot confide personal information in Siri, nor ask intimate questions about Siri's life. I attempted, "Siri, who are your friends?" Siri output, "You are not supposed to ask your assistant such things." I proceeded, "Why not?" Siri output, "I don't know."
Still, after using Siri myself, I found myself drawn in by Siri's inviting nature, almost viewing it as a competent conversational partner. But I became increasingly frustrated and disappointed as I came to realize that, in fact, Siri is not a competent conversational partner and can never initiate an utterance to me; Siri simply provides predetermined output in response to one's input of keywords or key phrases. Siri could never know if my voice sounds saddened in order to comfort me, whether I'm fibbing in order to question me on the truth, or what I'm seeing, smelling, touching, tasting, or feeling in order to have a conversation about it.
Caution should be taken before classifying any form of interaction with Siri as "conversation," whether one has autism or not. The whole point of communicating through social, emotional, and linguistic means is to develop relationships. Clinically, Siri is incapable of real communication and, therefore, incapable of developing true relationships. More of us are embracing A.I. to improve efficiency, but, of course, nothing can, nor should, ever replace real conversation with human beings. -KKS