Amazon’s boatload of new Echo devices moves further beyond the phone
Maybe it’s good that Amazon got smartphones out of its system back in 2014 with the ill-fated Fire Phone. It may have cleared the decks and let the company’s best minds think about other things–other devices, different embodiments of artificial intelligence, and different human/machine interfaces–like natural language. While Apple’s ecosystem still revolves around the iPhone, Amazon’s is built around an ever-growing number of specialized endpoints for the Alexa brain.
This dynamic was on full display Thursday at Amazon’s mega-product-announcement event in Seattle. The company announced a bunch of Alexa-powered devices. There was a microwave, a wall clock, a new digital video recorder, and a gizmo for bringing voice control to old-school stereo systems. We saw devices for using Alexa in the car, the living room, the kitchen, and beyond. And phones didn’t play much of a role in this voice-centric vision. In Amazon’s world, you still need a smartphone, but you don’t rely on it for every single aspect of your digital life.
Amazon appears to be taking steps to make Alexa even less dependent on smartphones. One of Alexa’s main jobs is to control connected home gear such as lights, locks, and garage doors. But when you get a new connected home device—say, a voice-controlled light for the bathroom—you have to download the manufacturer’s app and muck around in it to get your new gizmo connected to Wi-Fi and set up. Then you have to use the Amazon app to get it ready for voice control. This smartphone-based headache is a big reason people avoid automating their homes. That’s why Amazon has been training Alexa to help homeowners set up new connected home devices using natural language dialog instead of taps in a smartphone app.
Two approaches to voice
All of this contrasts with Apple’s approach to voice input. For a long time, Apple partisans argued that talking to Siri on an iPhone was just as easy as talking to an Alexa speaker. But Alexa devices have always had arrays of microphones built to detect voices speaking naturally in their vicinity. The microphones in an iPhone are fewer in number, far smaller, and built for close talk. Apple finally accepted this reality when it decided to release the HomePod speaker, which has a fancy seven-microphone array built for far-field voice control.
For most of its existence, Apple’s home automation platform HomeKit has used the iPhone as its main hub and controller. Only when the company launched the HomePod could people control HomeKit-connected devices using ambient voice commands. While Apple doesn’t release sales numbers, most analysts believe the $349 HomePod hasn’t sold well. So in a great many HomeKit households, the connected devices around the house are still controlled via taps in iPhone apps or Siri commands on the phone.
Apple’s approach to the car is also smartphone-centric. The company’s CarPlay software allows an automobile’s dashboard screen to act as a display and controller for selected features and content on the iPhone. It also uses some of the car’s built-in buttons and knobs to control content on the phone, and plays the phone’s audio through the car. The driver can also issue Siri commands via a microphone built into the car. CarPlay works only in cars with the software built in or added through an aftermarket upgrade.
Amazon’s new Echo Auto device works with any car that offers a way to connect to its audio system, such as Bluetooth or an auxiliary-in port. The content served by the device doesn’t come from apps on the phone, but rather from the Alexa brain in the cloud. The real draw of the device is the array of microphones inside, which can cut through road noise, music, and back-seat chatter to pick out the user’s voice commands, letting drivers call up directions, find places, make phone calls, and hear audiobooks without moving their hands from the steering wheel or their attention from the road. The wireless connection provided by the smartphone in the car is just an enabler, and someday soon that connection might be 5G built directly into the car, leaving the phone out of the picture entirely.
With the launch of the first Echo back in June 2015, Amazon latched onto something people liked–a new way of interacting with technology (including AI) using natural language, as if talking to another person in the room. The company has an army of people working to improve Alexa’s understanding of both words and their meanings. And it’s growing more confident in this new kind of experience. “The idea of voice automation is not going away; it’s not a fad,” says Amazon senior VP of devices and services Dave Limp, who leads the Alexa effort at Amazon. “I wouldn’t have been as adamant about that a year ago.”
The natural language technology is at a place now where Amazon feels confident enough to expand into new domains. “It’s no longer early days; clearly we’re now thinking about other places, like the car, the kitchen, and meetings–here we have an Echo Dot in every conference room,” Limp told me, hinting at future applications for Alexa in the workplace. “So we’re able to spread our wings a little bit and go out and find and test new environments where ambient user interfaces might work.”
This year Amazon has focused Alexa on in-home audio and video, the connected home, and the car. Next year at this time, we may see Amazon announce products that bring natural voice computing to a new set of environments. When you arrive at a new user-experience paradigm first, as Amazon has, you have the advantage of a head start and a lot of green field in front of you.