Amazon and Lockheed Martin are sending Alexa to space as part of Callisto, a technology demonstration on NASA’s upcoming Artemis I mission.
Amazon and Lockheed Martin have announced plans to send Alexa to space as part of Artemis I, the first of several NASA missions intended to land the first woman and the first person of color on the Moon. Alexa will join the upcoming mission as part of Callisto, a technology demonstration payload embedded in NASA’s Orion spacecraft and built in collaboration with engineers from Amazon, Cisco, and Lockheed Martin.
“The Star Trek computer was part of our original inspiration for Alexa, so it’s exciting and humbling to see our vision for ambient intelligence come to life on board Orion,” said Aaron Rubenson, vice president of Alexa Everywhere at Amazon. “We’re proud to be working with Lockheed Martin to push the limits of voice technology and AI, and we hope Alexa’s role in the mission helps inspire future scientists, astronauts, and engineers who will define this next era of space exploration.”
Alexa integration on board Orion
Artemis I is the first integrated test of NASA’s deep space exploration systems, which include the all-new Space Launch System rocket and the Orion spacecraft.
Although the first mission is uncrewed, Artemis I is an important step that will allow NASA and others in the industry to test technology that could be used in subsequent crewed missions to the Moon and other deep space destinations.
Alexa is one of many new, innovative technologies that will be tested as part of Artemis I, and the integration will help those involved explore how ambient intelligence can assist astronauts on future missions.
We envision a future where astronauts can turn to an onboard AI for information, assistance, and companionship. To bring that vision closer, Amazon engineers have been working closely with Lockheed Martin to integrate Alexa into the Callisto payload.
Lockheed Martin designed custom, space-grade hardware with Alexa built in, ensuring the device could withstand the intense shock and vibration of launch as well as radiation exposure from passing through the Van Allen radiation belts.
Amazon provided the acoustic and audio processing software to support far-field voice interactions through Alexa, tuning algorithms to account for noise from engines and pumps and the reverberation associated with so many metallic surfaces within the cabin.
Callisto is also equipped with Amazon’s Local Voice Control technology, which allows Alexa to function in areas with limited to no connectivity. By combining Alexa’s world-class AI with local processing on board the spacecraft, we can bypass the delay (or latency) associated with sending information from the Moon to Earth and back, and allow future astronauts to access specific information and features almost instantly.
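To see why local processing matters, consider the physics alone: radio signals between the Moon and Earth travel at the speed of light, so a round trip adds more than two seconds of delay before any Earth-side processing even begins. A back-of-the-envelope calculation (using the average Earth–Moon distance; figures are illustrative, not mission telemetry):

```python
# Illustrative latency estimate for a Moon-to-Earth voice request.
# Uses the average Earth-Moon distance; actual distance varies with orbit.

MOON_DISTANCE_KM = 384_400          # average Earth-Moon distance
SPEED_OF_LIGHT_KM_S = 299_792.458   # speed of light in vacuum

one_way_s = MOON_DISTANCE_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s

print(f"One-way signal delay: {one_way_s:.2f} s")   # roughly 1.3 s
print(f"Round-trip delay:     {round_trip_s:.2f} s") # roughly 2.6 s
```

That multi-second floor is unavoidable for any request routed through Earth, which is why handling common queries on board makes responses feel almost instant.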
On Artemis I, Alexa will be able to access real-time telemetry data and respond to thousands of mission-specific questions on board Orion, including questions like “Alexa, how fast is Orion traveling?” or “Alexa, what’s the temperature in the cabin?” Alexa will even process requests to control connected devices on board the spacecraft, starting with in-cabin lighting. Alexa engineers will use what they learn from Alexa’s time in space to make existing Alexa features even better for customers on Earth, including those operating in harsh or remote environments without connectivity.
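A minimal sketch of how such mission-specific questions might be routed to live telemetry fields. The field names, phrasings, and router below are invented for illustration; this is not Alexa’s or Callisto’s actual implementation:

```python
# Hypothetical question-to-telemetry router. Telemetry values are
# placeholders, not real mission data.

telemetry = {
    "velocity_kmh": 39_400,
    "cabin_temp_c": 22.5,
}

# Map a normalized question to a telemetry field and a response template.
INTENTS = {
    "how fast is orion traveling":
        ("velocity_kmh", "Orion is traveling at {:,} kilometers per hour."),
    "what's the temperature in the cabin":
        ("cabin_temp_c", "The cabin temperature is {} degrees Celsius."),
}

def answer(utterance: str, data: dict) -> str:
    """Look up a recognized question and fill in the current telemetry value."""
    key = utterance.lower().strip().rstrip("?")
    if key in INTENTS:
        field, template = INTENTS[key]
        return template.format(data[field])
    return "Sorry, I don't have that information on board."

print(answer("How fast is Orion traveling?", telemetry))
```

A real system would use speech recognition and natural-language understanding rather than exact string matching, but the core idea is the same: resolve the question to a telemetry field, read the current value, and speak a formatted answer.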
Using NASA’s Deep Space Network, Alexa can also retrieve information from Earth for astronauts in space, from news briefings to sports scores, helping astronauts stay connected to home during long missions. Together, these voice interactions can help make life simpler and more efficient for those on board the spacecraft, especially when they are buckled in or preoccupied with other tasks during the mission.