
Amazon is betting on agent interoperability and model mixing to make the new Alexa more effective, retooling its flagship voice assistant with agentic capabilities and the ability to complete tasks in a browser.
This new Alexa has been rebranded as Alexa+, and Amazon is emphasizing that this version "does more." For instance, it can now proactively tell users if a new book from their favorite author is available, or that their favorite artist is in town, and even offer to buy a ticket. Alexa+ reasons through instructions and taps "experts" in different knowledge bases to answer user questions and complete tasks like "Where is the nearest pizza place to the office? Will my coworkers like it? Make a reservation if you think they will."
In other words, Alexa+ blends AI agents, computer-use capabilities and knowledge drawn from the larger Amazon ecosystem to become what Amazon hopes is a more capable, smarter home voice assistant.
Alexa+ currently runs on Amazon's Nova models and models from Anthropic. However, Daniel Rausch, Amazon's VP of Alexa and Echo, told VentureBeat that the device will remain "model agnostic" and that the company could introduce other models (at least models available on Amazon Bedrock) to find the best one for accomplishing tasks.
"[It's about] choosing the right integrations to complete a task, figuring out the right sort of instructions, what it takes to actually complete the task, then orchestrating the whole thing," said Rausch. "The big thing to understand about it is that Alexa will continue to evolve with the best models available anywhere on Bedrock."
What is model mixing?
Model mixing, or model routing, lets enterprises and other users choose the appropriate AI model on a query-by-query basis. Developers increasingly turn to model mixing to cut costs: not every prompt needs an expensive reasoning model, and some models simply handle certain tasks better than others.
Amazon's cloud and AI unit, AWS, has long been a proponent of model mixing. It recently announced a Bedrock feature called Intelligent Prompt Routing, which directs prompts to the best model and model size for resolving the query.
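From a developer's seat, routing through one of these Bedrock prompt routers looks almost identical to a normal model call: you pass a prompt router ARN where a model ID would go, and Bedrock picks the underlying model. Below is a minimal sketch using boto3's Converse API; the router ARN, account and region are placeholders, and the exact shape of the response trace can vary by API version.

```python
# Minimal sketch of Bedrock Intelligent Prompt Routing via the Converse API.
# The router ARN is a placeholder; use one from your own account and region.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

# A prompt router ARN stands in for a specific model ID; Bedrock decides
# which underlying model (and size) actually serves the request.
ROUTER_ARN = (
    "arn:aws:bedrock:us-east-1:123456789012:"
    "default-prompt-router/anthropic.claude:1"
)

response = client.converse(
    modelId=ROUTER_ARN,
    messages=[{
        "role": "user",
        "content": [{"text": "Where is the nearest pizza place to the office?"}],
    }],
)

print(response["output"]["message"]["content"][0]["text"])

# When a router handles the call, the response trace typically records which
# model was actually invoked (field names may differ across API versions).
print(response.get("trace", {}).get("promptRouter", {}))
```

The appeal is the same one Rausch describes for Alexa+: the caller never has to know, or care, which model answered.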
And it seems to be working. "I can tell you that I cannot say, for any given response from Alexa on any given task, what model it's using," said Rausch.
Agentic interoperability and orchestration
Rausch said Alexa+ brings agents together in three different ways. The first is the traditional API; the second is deploying agents that can navigate websites and apps, in the style of Anthropic's Computer Use; the third is connecting agents to other agents.
"But at the center of it all, orchestrating across all those different kinds of experiences are these baseline, very capable, state-of-the-art LLMs," said Rausch.
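Amazon hasn't published Alexa+'s internals, but the architecture Rausch describes is straightforward to sketch: an orchestrating LLM decides which of the three integration types fits a task, then hands it off. The sketch below is purely illustrative; every name in it is a hypothetical stand-in, not Amazon's code.

```python
# Illustrative sketch of the three integration paths Rausch describes:
# traditional APIs, browser-driving agents, and agent-to-agent handoffs.
# All names here are hypothetical stand-ins, not Amazon's actual code.
from dataclasses import dataclass


@dataclass
class Task:
    description: str
    integration: str  # "api", "browser" or "agent", chosen by the orchestrating LLM


def call_api(task: Task) -> str:
    # Path 1: a traditional API integration (e.g., a ticketing service).
    return f"API call issued for: {task.description}"


def drive_browser(task: Task) -> str:
    # Path 2: an agent that navigates a website or app directly,
    # in the style of Anthropic's Computer Use.
    return f"Browser agent navigating a site for: {task.description}"


def delegate_to_agent(task: Task) -> str:
    # Path 3: hand off to a third party's own agent, which may be
    # built on an entirely different model.
    return f"External agent handling: {task.description}"


DISPATCH = {"api": call_api, "browser": drive_browser, "agent": delegate_to_agent}


def orchestrate(task: Task) -> str:
    # The orchestrating LLM has already classified the task; routing
    # and collecting the result is the easy part.
    return DISPATCH[task.integration](task)


print(orchestrate(Task("Reserve a table at the nearest pizza place", "browser")))
```

The point of the sketch is the shape, not the details: whatever sits behind each path, the orchestrator's contract stays the same, which is what lets Alexa+ swap models and integrations underneath.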
He added that if a third-party application already has its own agent, that agent can still talk to the agents working inside Alexa+, even if the external agent was built using a different model.
Rausch emphasized that the Alexa team used Bedrock's tools and technology, including new multi-agent orchestration tools.
Anthropic CPO Mike Krieger told VentureBeat that even slightly older versions of Claude wouldn't have been able to accomplish what Alexa+ requires.
"A really interesting 'Why now?' moment is apparent in the demo, because, of course, the models have gotten better," said Krieger. "But if you tried to do this with 3.0 Sonnet or our 3.0-level models, I think you'd struggle in a lot of ways to use a lot of different tools all at once."
Although neither Rausch nor Krieger would confirm which specific Anthropic model Amazon used to build Alexa+, it's worth pointing out that Anthropic released Claude 3.7 Sonnet on Monday, and it is available on Bedrock.
Large investments in AI
Many users' first brush with AI came through voice assistants like Alexa, Google Home or even Apple's Siri, which let people outsource simple tasks like turning on the lights. I do not own an Alexa or Google Home device, but I learned how convenient one could be while staying at a hotel recently: I could tell Alexa to stop the alarm, turn on the lights and open the curtains while still under the covers.
But while Alexa, Google Home devices and Siri became ubiquitous in people's lives, they began showing their age once generative AI took off. Suddenly, people wanted real-time answers from their assistants and demanded smarter task handling, such as adding multiple meetings to a calendar without much prompting.
Amazon admitted that the rise of gen AI, especially agents, has made it possible for Alexa to finally meet its potential.
"Until this moment, we were limited by the technology in what Alexa could be," Panos Panay, Amazon's devices and services SVP, said during a demo.
Rausch said the hope is that Alexa+ will continue to improve, add new models and make more people comfortable with what the technology can do.