Dear Apple, Open Up Siri

- 10 minutes -

One night last week, I came home from my routine jog around the village when I spotted a cat by the back door. There are plenty of stray cats around the village keeping the pests down, but by then I’d seen this particular cat hanging around that spot by the door two or three other times that week.

I was getting water from the fridge and saw that we had some leftover Spam. I took out my phone and asked Siri, “Is Spam safe for cats?” Maybe the average user would expect a direct answer, as in a normal conversation, and rightly so. But being accustomed to the reality of Siri, I knew what to expect: a series of web searches… from Bing. And that’s what popped up.

I instinctively tapped the first link, which brought me to an article that didn’t even mention the word “Spam”. Frustrated, I resorted to a manual search, which meant tapping on the screen, which meant getting the screen all wet with sweat from my run, which was exactly what I’d hoped to avoid by using Siri in the first place.

Eventually I found out Spam was safe in moderation and fed a few grams to the cat.

Later, I decided to dig a bit deeper and asked Siri a few more times - I thought maybe what happened earlier was a fluke. Sometimes it would misinterpret “Spam” as “sperm”. Great. But when it did recognise what I’d said, the same results came up. I scrolled down and found one result with “Spam” in the headline and preview. Further down, among the Bing suggestions, the phrase “Can Cats Eat Spam” popped up. I tapped it, and there was the answer I’d been looking for, right at the top of the list. Same question, different phrasing.

 

The writing’s been on the wall for a long time with Siri. Almost every conversation comparing Google’s voice assistant, Google Now, with Siri ends with the conclusion that Google Now is leaps and bounds better. There’s also the sheer number of other voice assistants out there or in development, even from companies not directly involved in consumer electronics - which is exactly how Siri itself came about, before Apple bought it.

When Siri was introduced alongside the iPhone 4S, it was made out to be a big deal, and justifiably so given its potential at the time. Now it’s positioned as a tentpole feature on every device besides the Mac - from the iPad to the Watch. (Even there, Dictation exists on the Mac, which is different from Siri, but still indicative that Apple’s aware of the value of using your voice to interact with devices.)

And yet Siri’s still more or less stuck in web-search, “I didn’t quite get that” purgatory. Sure, maybe it’s gotten better at actually recognising the words I’ve said. But I haven’t noticed even the slightest improvement in its actual understanding of what I’ve said.

Opening up Siri

Many people have suggested for years that Apple offer a Siri API, which would give third-party developers access to Siri. Many think that’s all it’ll take, and I think it would be a huge first step towards making Siri dramatically better.

But at the end of the day, Apple and app developers can’t possibly think of every possible thing people might say (in different languages, mind you) to interact with a device, in every possible scenario where Siri would be useful. Nerds and power users might find their workarounds, as they do, but I can imagine there still being a lot of straight-to-web-search scenarios even with a Siri API.
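
To make that concrete, here’s a purely hypothetical sketch of what third-party phrase handling could look like - none of these types or methods exist in any Apple SDK, and the crude string matching only stands in for whatever language understanding Siri would really do. It also illustrates the limitation above: somebody still has to anticipate the phrases.

// Purely hypothetical sketch - no such API exists in Apple's SDKs.
// The idea: an app registers phrases it can answer, and Siri routes a
// recognised utterance to the app instead of falling back to a web search.

import Foundation

// A hypothetical intent a third-party app could expose to Siri.
struct HypotheticalSiriIntent {
    let examplePhrases: [String]     // sample utterances the app claims to handle
    let handler: (String) -> String  // takes the utterance, returns a spoken reply
}

// A stand-in for whatever registration point Apple might provide.
enum HypotheticalSiriRegistry {
    private static var intents: [HypotheticalSiriIntent] = []

    static func register(_ intent: HypotheticalSiriIntent) {
        intents.append(intent)
    }

    // Naive substring matching stands in for Siri's own understanding;
    // anything unanticipated still falls through to a web search.
    static func handle(utterance: String) -> String? {
        for intent in intents {
            if intent.examplePhrases.contains(where: { utterance.lowercased().contains($0.lowercased()) }) {
                return intent.handler(utterance)
            }
        }
        return nil
    }
}

// Example: a pet-care app answering the question from earlier.
HypotheticalSiriRegistry.register(HypotheticalSiriIntent(
    examplePhrases: ["safe for cats", "can cats eat"],
    handler: { _ in "Spam is fine for cats in small amounts, as an occasional treat." }
))

print(HypotheticalSiriRegistry.handle(utterance: "Is Spam safe for cats?") ?? "Here's what I found on the web…")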

That’s why, contrary to my usual argument that there are good reasons for Apple to take full control, I think Apple should make a Siri suggestion box. Not fancy-schmancy algorithms that think they know how to respond to human beings… well, maybe a bit of that to automate things a little. I mean an actual, user-facing place to suggest sentences and scenarios where Siri would be useful - much like reviews do for apps on the App Store - to introduce a human element to our devices.

For example, imagine one of the legs on your dining table breaking because your kids are playing rough. You get in your car and rush over to the local hardware store for supplies, but realise you don’t know what to get. You’re on the highway, and you’re not the kind of person who feels safe using their phone on the road, much less on a highway. The perfect time for “Hey Siri, what nails should I use to repair a table”, right? But not at all the perfect time for “Here’s what I found for…”. Actually, feel free to apply that to any ideal scenario that could involve hands-free Siri.

That’s as far as my suggestion for a better Siri goes. And that’s not even addressing the problems that a Siri API could solve, which I've explained in detail here.

However, first things first: Apple would need to recognise that Siri isn’t the best it could be. It’s probably not one of Apple’s most pressing problems at the moment, but even if it were, what’s really going to make them want to improve Siri? Software and services like iTunes, now 13 years old, and Apple Music, their foremost means of music distribution, are undeniably half-baked and a point of user confusion. And for a long time (and since the inception of Apple Music last year), nothing seems to have convinced Apple that there are jarring problems with these, at least on the surface.

Hopefully WWDC this year will bring big improvements to both of those, and to Siri. But in the meantime, the world will just have to get used to the glorified speech-to-text web search app that lives on every iOS device.

 

Further reading: More on why I think streaming services, like Apple Music, could be doomed to fail in the distant future.