For some time now, Apple has faced questions about its growth and what rabbits it can pull out of its hat next, especially as rivals including Google, Facebook and Amazon appear to have gotten the jump on it with emerging technologies like artificial intelligence, virtual reality and augmented reality. The Apple iPhone remains the most profitable computing device in the world, and Apple’s immediate future looks sunny, but its long-term outlook has begun to look partly cloudy. In a world that seems to care less and less about beautiful hardware and more about services that help you from afar, over the air, without your ever having to touch a machine, Apple risks becoming an anachronism.
HomePod will be a test of how Apple responds to these difficulties. That’s because for Apple to outdo Amazon in the home assistant game, it will need to prioritize skills that have long been on its back burner — cloud services and A.I., for instance.
But here’s the surprising thing: Apple seems to be up for such reinvention. If you read between the lines at its keynote address on Monday, you would have noticed something. Again and again, like shamans calling on some new and powerful magic, Apple executives invoked the buzzwords of modern computing: “machine learning,” “deep learning” and “computer vision.”
Subtly but unmistakably, they were suggesting a shift. Apple seems to be transforming itself into a new kind of company, one that prioritizes the nerdy technical stuff that will become the foundation of tomorrow’s intelligent machines — whereas in the past, the company tended to hide this stuff, even if it recognized its importance.
This shift doesn’t mean that HomePod will succeed, or that Amazon’s device is in trouble; with a head start, much cheaper devices, and lots of irrepressible fans, the Echo has momentum that will be difficult to curb. Plus, nobody knows yet how well HomePod will work.
In broad strokes, the device seems to do much of what its rivals can. Say, “Hey Siri, play me some Carly Rae Jepsen,” and it will belt out pitch-perfect Canadian pop. It will also answer questions about the music it’s playing, and, just like the Echo, it can perform a wide variety of other functions — setting timers, telling you the weather and controlling your smart lights and other home devices.
HomePod also lacks a lot that the Echo has perfected. For now, it seems bound to Apple’s ecosystem. Apple said that it connects to its own subscription music service, but declined to specify whether it would let you play songs from Spotify or other services, or whether there was a way for third-party developers to create additional voice-activated functions, which is one of the best features of the Echo. Alexa can order an Uber for you; HomePod may ask you to try using your phone. And obviously, HomePod lacks the Echo’s deep integration with Amazon’s online store, which, to many users, is a killer feature: Running out of dish soap? Alexa will order it for you, but Siri can’t.
Yet many of these are omissions that you’d expect in a new device, and Apple will most likely add improvements in updates. What will matter most, at first, is how reliably HomePod can perform the basics.
If you’re a longtime Siri user, your skepticism is warranted. The best thing about Amazon’s speaker is its reliability: Say something from far away, even in a noisy room, and most of the time it will at least recognize what you asked it. Once you get an idea of the kinds of requests it can handle, Alexa begins to seem like a completely natural interface to computers. It responds so quickly that it starts to seem like a helpful member of the family rather than a computer in a can.
Can Siri in HomePod do that? I can’t tell you yet for certain. I got a chance to listen to HomePod — but not use any of its voice-activated features — after Apple’s keynote. As the company promised, it does sound much deeper and richer than a Sonos Play:3 or an Amazon Echo, closer to what you might hear in a high-end home stereo system. But the true test for such a device is how it works in real-world conditions, handling the multiplicity of requests that you might think up during a day — and for now, we have no idea how that might work.
Yet I am cautiously optimistic, mainly on the basis of the many other computationally difficult features that Apple showed off on Monday. Many of them are versions of features that rivals like Google have spent years perfecting.
For example, Siri will now translate languages for you. Like a true assistant, Siri is also now more predictive — it aims to spot common computing problems and offer to help. (If it notices that you’ve been chatting about a coming appointment, for instance, it will offer to add it to your calendar.)
Best of all, Siri is finally a single unified personality that sits across all of your devices. In the past, Siri on your iPad was different from Siri on your Mac — one would learn that you’re traveling to Iceland, and the other might never know. Now, similar to other assistants, Siri knows you and can anticipate your interests on whatever device you use.
There was more, too. Apple is letting developers create apps that can perform machine learning tasks on its devices. It’s also diving heavily into computer vision. Developers can now create augmented reality features, meaning they can overlay virtual objects on images from the real world.
If you follow the tech industry, you know these are all hot topics that other companies are investing in heavily. Apple is still a laggard, and I wouldn’t expect it to beat Google in an A.I. contest anytime soon.
But it doesn’t need to. All Apple has to do is stay competitive — it’s got to invest just enough in the A.I.-driven future to keep its devices compelling. There’s no mistaking, now, that it’s doing so.