You can now teach Alexa how to do new things—a baby step toward ‘general’ AI

By Mark Sullivan

December 18, 2020

We are still a very long way from a “general” artificial intelligence—a machine that can learn and comprehend any task or concept a human can. But Amazon is taking a small step toward it with a new feature that lets you teach its Alexa personal assistant how to do new things.

Here’s how that might work:

You: Alexa, change bedroom lamp to night-reading mode.

Alexa: I don’t know what night-reading mode means. Can you teach me?

You: It means 50% brightness on the bedroom lamp.

Alexa: Done, and I’ll remember that for next time.

Sounds simple enough, but a lot just happened in Alexa’s brain. It figured out that it didn’t know something, it decided how to ask the user to help it fill the gap in its knowledge, and it managed the back-and-forth with the user to keep moving toward the answer it needed. (Before the user-training feature launched, Alexa would have responded with “Hmm, I don’t know that one” or some variant.)
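
Amazon hasn't published the internals, but the shape of that interaction is easy to sketch. Below is a minimal, hypothetical Python sketch of the detect-ask-learn loop; the `learned_concepts` store, the `parse_definition()` helper, and the use of `input()` to stand in for the voice exchange are all invented for illustration, not Alexa's actual code or API.

```python
import re

# Hypothetical sketch of a "teachable" assistant loop -- not Amazon's implementation.
# A learned concept maps a user-coined phrase to a concrete device setting.
learned_concepts = {}  # e.g. "night-reading mode" -> {"brightness": 50}

def parse_definition(text: str) -> dict:
    """Toy parser: pull a percentage out of 'It means 50% brightness...'."""
    match = re.search(r"(\d+)\s*%", text)
    return {"brightness": int(match.group(1))} if match else {"raw": text}

def handle_request(device: str, mode: str) -> str:
    """Apply a mode to a device, asking the user to define any unknown mode."""
    if mode not in learned_concepts:
        # Gap detected: admit it and ask one clarifying question.
        answer = input(f"I don't know what {mode} means. Can you teach me? ")
        learned_concepts[mode] = parse_definition(answer)
        return "Done, and I'll remember that for next time."
    setting = learned_concepts[mode]
    return f"Setting {device} to {setting}."

print(handle_request("bedroom lamp", "night-reading mode"))  # first time: asks, then learns
print(handle_request("bedroom lamp", "night-reading mode"))  # second time: applies the learned setting
```

The point isn't the toy parser; it's the shape of the exchange. The assistant detects its own knowledge gap, manages a single clarifying turn, and persists the answer so the next request succeeds without help.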

It can then apply its learning in other settings. If you say, “turn on night-reading mode” while in the living room, Alexa might adjust the reading lamp by the couch to 50% brightness. The assistant remembers what it learns and ties that intelligence to your account, says Prem Natarajan, who heads up Natural Language Understanding for Alexa at Amazon.

In another scenario, you might utter a declarative statement that’s really a request for Alexa. If you walk into a room and say, “Alexa, it’s too dark in here,” the assistant will understand you’re making a request, then figure out what it can do to solve the problem. It might then offer, “Would you like me to turn on the lights, or open the blinds?” If you say, “Turn on the lights,” the assistant will remember that’s what you want when you say “Alexa, it’s too dark in here” in the future.
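
Continuing the hypothetical sketch above, the remembering-and-generalizing behavior could be as simple as keying what the user taught to their account and resolving it against whichever device is in the current room. Again, every name here (`account_memory`, `resolve()`, the preference strings) is an assumption for illustration, not Amazon's implementation.

```python
# Hypothetical continuation: concepts and preferences tied to one user's account.
account_memory = {
    "concepts": {"night-reading mode": {"brightness": 50}},
    "preferences": {"it's too dark in here": "turn on the lights"},
}

def resolve(utterance: str, current_room: str) -> str:
    """Reuse what this user taught the assistant, adapted to the current room."""
    # A remembered preference maps a declarative complaint to a chosen action.
    action = account_memory["preferences"].get(utterance)
    if action:
        return f"OK, I'll {action} in the {current_room}."
    # A learned concept carries over to whatever lamp is in this room.
    for concept, setting in account_memory["concepts"].items():
        if concept in utterance:
            return f"Setting the {current_room} lamp to {setting['brightness']}% brightness."
    return "Hmm, I don't know that one."

print(resolve("turn on night-reading mode", "living room"))  # living room lamp to 50%
print(resolve("it's too dark in here", "kitchen"))           # remembered: turn on the lights
```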

Right now, this home-schooled version of Alexa is confined to the realm of controlling appliances in the smart home. Nor does it let you hold long dialogs with the assistant to set up new devices or create elaborate routines involving multiple devices. But even in this simple form, it’s a meaningful step beyond the short, transactional interactions we normally have with smart assistants.

“One of the surest signs of general intelligence is the ability to adapt,” Natarajan says. “That ability is tested in situations that are novel or unexpected.” In other words, Alexa is learning how to respond to user requests for which it may have had no explicit training.

A big part of that ability to adapt stems from recent breakthroughs in the way researchers train natural language models. Natural language AI is experiencing a renaissance in this area, one that started only a couple of years ago.

For a long time, natural language models were trained using text labeled by humans to indicate meaning. Creating that kind of training material is labor-intensive and costly, so there was never enough of it to create accurate models. In 2018, researchers at Google developed a framework for training natural language models using huge amounts of unlabeled text (think hundreds of thousands of digital books or millions of Wikipedia entries). In this framework, called BERT (Bidirectional Encoder Representations from Transformers), a language model is given enough time and compute power to process the text, and it gradually maps associations between the meanings and contexts of words. This technique gives the model a base level of knowledge about the world through language alone—something like the common sense learned by a baby after six months of life. After this kind of “pre-training,” natural language models are easier to train for more specific tasks.
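
For a concrete feel of what that pre-training buys, here is a short example using the open-source Hugging Face transformers library and a publicly released BERT checkpoint; neither appears in the article, it's just an easy way to poke at a pre-trained model. BERT guesses a masked-out word from its context alone, which is the base-level language knowledge described above.

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Load a publicly available pre-trained BERT checkpoint.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model predicts the hidden word purely from surrounding context,
# without ever having been trained on labeled examples for this task.
for prediction in fill_mask("I turned on the lamp because the room was too [MASK]."):
    print(f"{prediction['token_str']:>10}  score={prediction['score']:.3f}")
```

Plausible top predictions are words like “dark” or “dim,” associations the model picked up from unlabeled text alone.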

The BERT research served as the catalyst for a wave of new natural language development. The ethical considerations of this type of large language model are still being hashed out; prominent researcher Timnit Gebru says that Google recently fired her in part for her critique of such models.

Natarajan acknowledges BERT as a “launching pad” for a lot of natural language development at Amazon that eventually led to an Alexa capable of learning from users.

Now that Amazon has developed Alexa’s ability to be trained by a human user, it’ll be easier to move that same capability from the home automation context to other contexts, Natarajan tells me. You may be able to teach Alexa to do new things in the car, for example.

“We’re now at the point of identifying the use cases and then building the ability to detect the learning opportunities,” he says. “The rest of it is lightweight.”

As AI becomes more able to learn on the fly, teaching it might become an everyday task for humans, especially in the workplace. Hopefully it will be more like a partnership than us training our replacements.
