ATLAS, the AR Toolkit for Lunar Astronauts and Scientists, is an augmented reality system built on the Microsoft HoloLens and designed as a flexible toolkit for the Artemis-generation xEMU spacesuit, supporting astronaut autonomy on lunar expeditions. The modular system provides seamless access to relevant data and information without unwanted intrusions through the use of protocols.
January – April 2020
CLAWS: Collaborative Lab Advancing Work in Space
HoloLens, Adobe XD, Figma, Design Thinking, Human-in-the-Loop, A-Frame
- UX/UI Designer
- Human-in-the-Loop Coordinator
A big challenge when designing a user experience for astronauts is simply that they are a very small and difficult-to-access population. Fortunately, NASA gave us remote access to several subject-matter experts (SMEs), whose answers yielded useful insights.
Who we met: SMEs in Planetary Science, Software Engineering, Human Systems Integration, and Flight Test Engineering, plus an Astronaut, an xEMU Spacesuit Specialist, and the PI for JARVIS (the NASA version of our system)
What they do: Work on the NASA Artemis missions to establish a colony on the Moon and build a gateway to Mars.
- Access to telemetry and biomedical data for themselves and their EVA partner.
- Ability to tag geographic points of interest for future sampling and investigation.
- High-level overview for reference with the ability to drill down into mission details.
- Abort protocols.
- The interface to stay out of their way unless needed.
- To see the communication system status.
- The first EVAs are sensory overload for the astronaut, so we want to ensure our UI doesn’t add to this.
With these findings, my team focused on geological sampling, a field-notes protocol, navigation, and waypoint marking.
The right persona
Given the limited access to astronauts, particularly any who have completed EVAs on the Moon, I developed provisional (ad hoc) personas by synthesizing the SME interviews with publicly available information about the specific people selected to become astronauts for the Artemis program.
UX Design for AR
Most of my team, including me, had little to no experience designing AR interfaces. So I took a course on designing for AR/VR at the Interaction Design Foundation.
With these valuable lessons, I set out to aggregate additional resources on the particular constraints of our environment and hardware.
I put together a design guide, outlined in the image below, to establish a baseline level of usability and a solid UX for my team.
I further developed UI assets and design guidance for my team to promote consistency in rapid prototyping using a combination of Adobe Photoshop, Illustrator, XD, and Figma, as shown in the slideshow below.
My team’s paper prototypes initially communicated the UX team’s general vision to the software team, many of whom had an additional year on the project. We sought UX feedback rather than the aesthetic feedback that often comes with higher-fidelity prototypes.
The CLAWS team in our lab received a crash course in our Server/Networking architecture and etiquette.
About one month after kickoff, our team was forced to begin working remotely due to the pandemic. As a result, we lost access to our lab and the ability to conduct testing with the Hololens. I began exploring suitable alternatives for remote human-in-the-loop tests.
Furthermore, for the end users, we recognized that the primary visual-design problem would be accommodating extreme swings in background lighting. The surface of the Moon is incredibly bright, reflecting the Sun's light, yet the sky just above the horizon is pitch black.
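One lightweight way to sanity-check a HUD palette against both lighting extremes is the standard WCAG 2.x contrast-ratio formula. The sketch below is illustrative only; the hex values (a cyan text color and a bright-regolith grey) are assumptions, not the actual ATLAS palette.

```javascript
// Sketch: WCAG 2.x contrast check for HUD colors against simulated lunar
// backgrounds. Hex values are illustrative, not the actual ATLAS palette.

function hexToRgb(hex) {
  const n = parseInt(hex.replace("#", ""), 16);
  return [(n >> 16) & 255, (n >> 8) & 255, n & 255];
}

// Relative luminance per WCAG 2.x (sRGB linearization).
function relativeLuminance(hex) {
  const [r, g, b] = hexToRgb(hex).map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(a, b) {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// A single text color rarely clears WCAG's 4.5:1 threshold against BOTH a
// black sky and bright regolith, which is why the UI needs adaptive styling.
console.log(contrastRatio("#7FDBFF", "#000000")); // passes vs. black sky
console.log(contrastRatio("#7FDBFF", "#c2c2c2")); // fails vs. bright grey
```

In practice a check like this argues for adaptive treatments (backing panels, outlines, or dynamic color), since the background can swing between both extremes within a single glance.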
“How might we validate our interface prior to demonstration?”
The now-defunct AR prototyping tool WiARframe offered the ability to anchor the UI in space to check the general layout, look, and feel. However, it provided no immersion and no way to test interactivity with the system itself (or what we proposed the system to be at that point). And we still couldn't test the visual design against an appropriate background.
I turned to A-Frame, an open-source browser-based VR library, to create the most immersive remote testing experience possible while maximizing accessibility among our participant recruitment pool, e.g., engineering students (see image below). My blog post on Creating an A-Frame Lunar Environment describes this process in more detail. I have also become an A-Frame contributor, and my Lunar Environment is now one of the default environment templates to select from.
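For context, a remote test scene of this kind fits in a single HTML page. This is a minimal sketch, assuming current CDN URLs and a "moon" environment preset; the head-locked panel and telemetry readout are placeholder approximations of a HoloLens HUD, not the actual ATLAS prototype.

```html
<!-- Minimal A-Frame sketch of a remote test scene. The script URLs, the
     "moon" preset name, and the HUD contents are illustrative assumptions. -->
<!DOCTYPE html>
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
    <script src="https://unpkg.com/aframe-environment-component/dist/aframe-environment-component.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Procedural lunar-style terrain and sky -->
      <a-entity environment="preset: moon; lighting: distant"></a-entity>
      <!-- Camera rig with a head-locked panel approximating a HoloLens HUD -->
      <a-entity camera look-controls wasd-controls position="0 1.6 0">
        <a-plane position="0 -0.25 -1" width="0.6" height="0.15"
                 color="#0a2a3a" opacity="0.75"></a-plane>
        <a-text value="O2 72% | HR 96 bpm | COMMS OK"
                position="-0.27 -0.25 -0.99" width="0.9"
                color="#7FDBFF"></a-text>
      </a-entity>
    </a-scene>
  </body>
</html>
```

Because a scene like this runs in any modern browser, participants only need a link, which is what made A-Frame attractive for remote human-in-the-loop testing.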
Now that we had a rough idea of how to proceed, we needed some workflows for the developers. I produced two happy paths for navigation and geological sampling to get us started.
From Paper to Digital
My time on the project came to a close shortly after I designed the interactive prototype and issued the report below. However, CLAWS was awarded a $10k grant to continue developing the ATLAS system, with the most recent iteration demonstrated at Johnson Space Center below. Many of the interaction patterns I devised are still present.
One of the primary lessons I learned from this project was to prioritize rapid prototyping and evaluation. Many clients, stakeholders, and interdisciplinary team members regard UX as a practice in making a UI “pretty.” This left some members of my team spinning their wheels trying to make their designs picture-perfect.
However, we will never get it 100% right when designing for humans. Even if we did by some fluke, it would only be temporary, because the right design is always a moving target. If it weren't, we would still be driving Model Ts.
Better to visualize what is in your head sooner rather than later, and to use that process to course-correct as many times as possible before committing the designs to code, whatever the timeline or budget.