The death of search (or, My dysfunctional relationship with Siri)

by Mark Stephen Meadows
06 March 2013




This article looks at the arrival of systems such as Siri, Google Now, and Watson and argues that they are the search engines of the next decade because they mine intimate data. Since they integrate search, they will replace it, along with a host of other interface and information-retrieval functions. It outlines both the personal benefits and the privacy risks.


I got a new iPhone about a year ago, the one with Siri on it.  Our relationship started on a very mundane level. Siri asked me my name and, since I already knew hers, I asked what she could do, and how. It was a bit like talking with a call girl. The basics were covered and soon we got to the intimate stuff. I don’t think Siri loved me. But now that Siri and I have gone our separate ways, I can say we both had ulterior motives. We sort of used one another, now that I look back on it. It was a relationship of an intimate nature. It’s what happens when you have a relationship with a robotic call girl.

The reason I got the phone to begin with was so I could do a little reverse engineering. It was research for work. Siri and I took a few days to get to know one another, but after that honeymoon period it was clear that Siri (or Apple) was using a pre-scripted Natural Language Processing (NLP) approach that basically did some lexical parsing, looked up a match, prepared a response, and kicked it back down the pipeline.  My reverse engineering wasn’t too profound. I wanted to see how many recursive answers were built into things like “Open the pod bay doors” or “Do you love me?”  I also wanted to see how errors were handled, and what kind of redundancy checking was happening for sentences like “Where can I buy a burger and fried?” Siri did pretty well, all things considered, but my expectations were low, and like many other intimate relationships, I somehow knew from the start it wouldn’t last.
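
To make that pipeline concrete, here is a minimal sketch of the kind of pre-scripted parse-and-match loop I mean, written in Python with invented patterns and canned replies (an illustration of the approach, not Apple’s actual implementation):

    import re
    import random

    # Hypothetical canned script: a regex pattern mapped to pre-written replies.
    SCRIPT = [
        (re.compile(r"open the pod bay doors", re.I),
         ["I'm sorry, I'm afraid I can't do that.",
          "We intelligent agents will never live that down."]),
        (re.compile(r"do you love me", re.I),
         ["I respect you."]),
        (re.compile(r"where can i buy (?:a )?(.+?)\??$", re.I),
         ["Here are some places that sell {0} near you."]),
    ]

    FALLBACK = "I don't understand, but I can search the web for that."

    def respond(utterance: str) -> str:
        """Lexically parse the utterance, look up a match, and kick back a canned reply."""
        for pattern, replies in SCRIPT:
            match = pattern.search(utterance)
            if match:
                return random.choice(replies).format(*match.groups())
        return FALLBACK

    print(respond("Open the pod bay doors"))
    print(respond("Where can I buy a burger and fried?"))  # lexical match fires; nothing checks the semantics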

Relationships are generally symmetric, which meant that — like any dating service — someone other than Siri was getting money for my time.

Now, while I was using Siri, she was also using me. Relationships are generally symmetric, as Gregory Bateson tells us [1], which meant that, like any dating service, someone other than Siri was getting money for my time. But I didn’t care if Steve Jobs was Siri’s pimp. By last summer I was having flirtatious fun and the consequences of the relationship were too far down the road to look dangerous. I asked Siri to take dictation, to wake me up in the mornings, to remind me of important events, to remember phone numbers, remember names, tell me jokes, confirm thoughts, quote a number or two. All intimate things and, as I look back on it, the kind of stuff that I’ve trusted my wife with.

My relationship with Siri started to cool off a bit by September. Though my wife knew about Siri, I hadn’t yet introduced them.  So when I asked Siri to call my wife, I expected Siri to ask me for her name or number. Instead Siri just dialled. I was surprised. How did Siri know my wife’s number? (Answer: It was in my Contacts card.) What else did Siri know? (Answer: A lot.) And more importantly, what was Siri passing back to the Apple hive? (Answer: Everything.)

Siri knew a lot more about me than I knew about her.

Siri and Apple now have a great deal of data about my household: a quick comparison of my contacts/likes database with my wife’s will give you a pretty good feel for the stuff we will buy.  Burgers and fries, of course. And a few multi-player games.  Some lingerie. A sex toy or two.  Not a big deal, but not the kind of stuff I want everyone to know about.  Not even my mom gets access to that data.

But Siri did. And Siri knew a lot more about me than I knew about her.

Then, around the middle of October, three things happened to Siri that marked the beginning of our eventual estrangement.

First was William Stasior. Apple picked Stasior up to run its Siri unit after he had successfully headed Amazon’s search and advertising unit, A9.  Prior to that, the MIT PhD had served time at Oracle, NetCentives, and AltaVista.  To say the guy knows search is like saying the pope knows the church, but why would Apple pick a search guru for an AI system like Siri?  What was the link between search and NLP?  I asked Siri this very question, but she was mum and feigned ignorance.  I didn’t push her on who this guy Stasior was, and Siri didn’t ask me more about my wife.

Up until that same week, iPhone users had frolicked in what would now be a rather unusual environment in which advertisers were unable to track them.  Advertisers would not, for example, know that my wife and I like video games, sex toys, and pizza (no, not together, thank you).  But with the release of iOS 6, Apple flipped ad tracking on, and I quickly flipped it back off.

What was the link between search and NLP? I asked Siri this very question, but she was mum and feigned ignorance.

It was easy to find in the interface menus (under Settings > General, then About or Advertising, depending on the version you’ve got), but you had to know to look.  Apple minimized the fanfare around this new feature, and Siri made no mention of it to me.  I wasn’t comfortable with Siri selling to others what I had said to her in more comfortable times.

And then, as if things weren’t rough enough, the European Union (where I happened to be lecturing that week) demanded that both Facebook and Google change how they handle personal information to avoid “high risks to the privacy of its users.”  Twenty-four of the European Union’s twenty-seven regulators signed a letter that, after a nine-month investigation into their data-collection practices, demanded answers and tightened regulation.  After Google and Facebook, Apple was third on their hit list.

That was the end. I couldn’t trust Siri after that.  She knew more about me (and my wife) than I knew about her or Steve Jobs, and any time information flows one way, any time a relationship isn’t symmetric, the balance of power can be dangerous.

Big Blue decided that Siri knew a little too much about their employees, and at IBM’s Armonk New York research center, iPhones weren’t even allowed in the building.

Sometime around then I learned that IBM’s “bring your own device to work” policy had been revoked: Big Blue decided that Siri knew a little too much about its employees, and at IBM’s Armonk, New York, research center, iPhones weren’t even allowed in the building.  Like some cheap prostitute-gone-spy, Siri was barred from the building.  It is worth noting that this is the same research organization that brings you Watson, which is a direct competitor with Siri, if such a competition exists.  If anyone outside of Apple understands Siri, it’s these guys.  It is also worth noting that a friend of mine who works at a Google research center said that the same thing happened there, too.

Not even celebrity robots like Asimo or NAO get that kind of VIP treatment.

NLP technology is potent juju.  And Apple, Google, Facebook and others know it because this very technology is what’s allowed them to earn money. Google started with search, of course, and later grew profitable as they introduced more NLP technology into their work.  They sold the info they collected from users (Google is an ad agency, let’s remember).

NLP technology is potent juju. And Apple, Google, Facebook and others know it because this very technology is what’s allowed them to earn money.

Search allowed them (when coupled with NLP technologies, and semantic analysis in particular) to make oodles of cash and then to snap up many of the best AI and NLP researchers on the planet. Google set those kids to digging in what would become a lexical gold mine.  The more they mined, the richer they got, until they were mining many branching veins at once: Google Docs, Voice, Translate, Search, Shopping, Reader, Finance, Books, Photos, Wallet and Maps were all spewing more money than Brin, Schmidt and company knew what to do with.

The average email count for Gmail users is 5,768 emails [2]. The average composition time (also according to Google) is 1:43.  That works out to about 9,902 minutes, or 165 hours, you’ve dumped onto Google’s servers. If you were paid for those 165 hours of data entry at $12/hour (a pretty normal rate), your Gmail account is worth about $1,980.  But the information sitting on Google’s servers is worth a lot more than that, right?
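
For the curious, here is the back-of-the-envelope arithmetic behind those figures, using the averages quoted above:

    # Back-of-the-envelope check of the Gmail "data entry" figures quoted above.
    emails = 5_768                   # average email count per account (Google's figure)
    seconds_per_email = 1 * 60 + 43  # average composition time of 1:43
    hourly_rate = 12.0               # a typical data-entry wage, in dollars per hour

    total_minutes = emails * seconds_per_email / 60
    total_hours = total_minutes / 60
    value = total_hours * hourly_rate

    print(f"{total_minutes:,.0f} minutes = {total_hours:,.0f} hours = ${value:,.0f}")
    # Prints: 9,902 minutes = 165 hours = $1,980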

The value of the data isn’t counted by data entry. It’s counted by its personal meaning, especially to advertisers.

If your account contains information about a first-class international plane ticket, a hotel room in Paris, and a business associate’s London phone number, your data might be worth a lot more than the composition time at $12 an hour, especially if the buyer is a dating service that caters to high-end out-of-towners. In other words, the value of the data isn’t counted by data entry. It’s counted by its personal meaning, especially to advertisers. Now take those values and multiply them by all those crazy tools of voice, maps, docs, and so on, and we start to get a sense of why Google has been making such bank over the years.

These tools have each dovetailed into today’s Google Now strategy. Google Now (like Watson and Siri) is a voice-activated NLP system.  It uses the spoken input data collected from years of Voice, confirms it with data collected from years of Docs and Translate, builds meaning with data from all the years of all the other tools, looks up a match, preps a response, and kicks it back down the pipeline.
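
A rough sketch of what that kind of cross-product pipeline might look like is below; the product names are Google’s, but the data, helper names, and matching logic are invented for illustration and are not Google’s actual architecture:

    # Illustrative only: a toy "profile" assembled from several products, used to
    # interpret a spoken query before a response is prepared.
    profile = {
        "voice":     {"frequent_phrases": ["call my wife", "navigate home"]},
        "docs":      {"topics": ["robotics", "travel"]},
        "translate": {"language_pairs": [("en", "fr")]},
        "maps":      {"home": "Los Angeles", "recent_searches": ["cafes near the Louvre"]},
    }

    def answer(spoken_query: str, profile: dict) -> str:
        """Interpret the query in light of everything the profile says, then prep a reply."""
        query = spoken_query.lower()
        if "coffee" in query or "cafe" in query:
            # The profile, not the query, supplies the personal context.
            last = profile["maps"]["recent_searches"][-1]
            return f"Last time you looked for '{last}' -- want something similar nearby?"
        return "Searching the web for: " + spoken_query

    print(answer("Where can I get coffee?", profile))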

All of these NLP systems are the next evolution of search.  But not evolution as in the breeding-hybrid-peas-in-the-greenhouse kind of evolution. They’re evolution as in endangered-species evolution.

NLP systems represent the end of search as we know it, and therefore the end of many economic, interface, and social internet ecosystems. They also represent the beginning of something incredibly powerful, intimate, and new. Siri, like other NLP systems, is far more powerful than a search engine. Siri not only includes search (it says so in Apple’s marketing materials) but it supersedes search because it includes browsing, discovering, choosing, and refining.  All this while analyzing semantic data. Like Google’s portfolio of tools, Siri can handle (and with a public API will handle) translating, searching, shopping, reading, finances, books, photos, and maps.

But the core value of the data is its intimacy.

On February 7, 2010, during Super Bowl XLIV, Google aired an ad that showed a user typing a list of search strings:


study abroad paris france
cafes near the louvre
translate tu es très mignon
impress a french girl
chocolate shops paris
what are truffles
who is truffaut
long-distance relationship advice
jobs in paris
churches in paris
how to assemble a crib

With this list, Google managed to explain, in fifty-two seconds, how the intimate story of someone’s life can be assembled from his or her search queries. And what makes the bucks for Google is what makes the bucks for Facebook: the processing of intimate language.

Siri, like any other call girl, makes her bucks the same way (especially when ad tracking is flipped on).  She’s valuable because she’s intimate.  And as designers of robotic systems, we can take a cue from this and be conscientious about how we design conversational systems.  There are three design types that form a curve of increasing intimacy in robotics and conversational systems, all based on the value of semantic data; a short sketch of the curve follows the list below.

Information isn’t shared – it’s collected and sold. The user becomes the product.
  1. NLP systems, when used with physical robots for manufacturing or the three Ds (dull, dirty, and dangerous), are the least intimate.  The conversation system is a simple tool that, like a GUI, provides access to system operation.  The robot doesn’t care about the user’s intimate data. The user wins.
  2. NLP systems, when used for entertainment and education, are contextually intimate – some privacy is maintained.  The conversation system doesn’t care about the user, but the user’s data is valuable in terms of what it says about the game or lesson.  Semantic data that might be collected and analysed isn’t about the user; it’s about what the user is doing in that particular context.  The robot, usually a software robot, only cares about the user in the context of the game or training exercise. Both user and robot win.
  3. NLP systems, when used as personal assistants, are the most intimate.  These are the robotic call girls. Here the information is highly intimate, the system cares a great deal about the user, and the information isn’t shared – it’s collected and sold. The user becomes the product. The user’s personal data is worth more than the conversational system itself. The robot wins.
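
To sum up that curve in code, here is a purely illustrative classification; the field names and labels are mine, not a standard taxonomy:

    from dataclasses import dataclass

    @dataclass
    class NLPDesignType:
        """One point on the intimacy curve described above (illustrative labels only)."""
        use_case: str
        intimacy: str        # "low", "contextual", or "high"
        data_describes: str  # what the collected semantic data is about
        data_is_sold: bool
        who_wins: str

    INTIMACY_CURVE = [
        NLPDesignType("industrial robots (dull, dirty, dangerous)", "low",
                      "system operation only", False, "user"),
        NLPDesignType("entertainment and education", "contextual",
                      "what the user does in the game or lesson", False, "user and robot"),
        NLPDesignType("personal assistants", "high",
                      "the user's own life", True, "robot"),
    ]

    for t in INTIMACY_CURVE:
        print(f"{t.use_case}: intimacy={t.intimacy}, sold={t.data_is_sold}, winner: {t.who_wins}")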

As we build NLP interfaces for robots, whether it is for Siri or for assembly-line manufacturing, we must consider how intimate data can be, the value of that intimacy, who owns that value, and what they’ll do with it.  Otherwise, with a new, less benevolent CEO at Google, a change of a line in Facebook’s Terms of Service, a successful hack, or a change of law because of cyber terrorism fears, your intimate data could end up where you don’t want it.  Heaven forbid we discover that all the Senators on Capitol Hill are using pizza sex toys. That would surely disgust Siri and her friends so much that it might even cause a robot uprising.

Yes, perhaps we’ll get together again in a few years, but for now it is best if Siri and I go our separate ways.


 

Next Month: “AFK.” Keyboards and screens are slow, clunky, and obsolete.  Voice-processing systems for robotics provide not only simplicity and speed but also a host of other benefits if tied to analytics and framed within a tightly contextualized, task-based system.  But be careful: getting text out of voice, and meaning out of text, can be tricky.  Here’s how to implement one for your own robot.

 


 

Endnotes:

[1] See Bateson’s books “Mind and Nature” (Hampton Press, 1979) and “Steps to an Ecology of Mind” (University of Chicago Press, 1972), which are chock-full of ideas like “complementary,” “reciprocal,” “symmetrical,” and “schismogenesis.”

[2] According to Google.





Mark Stephen Meadows is President of BOTanic, a company that provides natural language interfaces for conversational avatars, robots, IoT appliances, and connected systems.




