Tag: NASA

  • AR Tools for Lunar Sampling

    UI design

    Overview

    The AR Toolkit for Lunar Astronauts and Scientists (ATLAS) is an information system that was conceived during my time working in a software engineering lab at the University of Michigan.

I led the UX research, rapid prototyping, and human-in-the-loop testing for an augmented reality information system designed for use in the Artemis-generation xEMU spacesuit during lunar EVAs, as well as for VEGA (Voice Entity for Guiding Astronauts), a Rasa-based conversational AI.

ATLAS won the 2020 NASA SUITS Challenge, received a $10k EPIC MegaGrant, and became the foundational system upon which the CLAWS lab continues to make leading-edge technological advances to enable long-duration human spaceflight.

    A video walkthrough of the AR Design Guide I prepared for the other UX designers and software engineers at CLAWS.

    This work was also utilized in the NASA Exploration Habitat (X-Hab) Challenge in collaboration with the Bioastronautics and Life Support Systems Lab at the University of Michigan.

The goal of ATLAS and VEGA is to assist astronauts in cognitively demanding fieldwork. Because of the nature of the work, I guided the UX toward emphasizing non-intrusiveness, adaptivity, and situational awareness.

    This project led to:

    Further involvement with BLiSS, leading human-centered design to adapt VEGA for NPAS, the NASA Platform for Automated Systems, in collaboration with NASA’s Autonomous Systems Laboratory.

An internship with NASA’s Exploration Medical Capability team, where I worked across internal systems as a Human Factors Engineer and UI Architect to advance medical systems for long-duration human spaceflight.

    My thesis research, which synthesized these experiences to better understand the perception of human-centered design among tech-centered engineers designing systems for human spaceflight and the implications for designing the Future of Work on Earth and in Space.

Finally, my journey at NASA concluded with a tour on the aeronautics side with the Convergent Aeronautics Solutions team, delivering human-centered design evangelism in support of advanced urban air mobility (eVTOL, drones, etc.) over the summer before I entered the PhD program at the University of Michigan School of Information.

During COVID, I created a test environment in WebVR. This led to my A-Frame contributor credit on GitHub!

    Publications & Outputs

  • Creating a Lunar Analog Environment in A-Frame

As the resident UX researcher and human-in-the-loop testing co-coordinator for CLAWS, it’s my responsibility to plan, facilitate, and analyze usability tests with real people to get feedback on our AR Toolkit for Lunar Astronauts and Scientists (ATLAS). Earlier this year, while CLAWS was participating in the NASA SUITS Challenge, the pandemic forced our school to close campus, including our lab. My test plan was scrapped, and although I scrambled to put together a fully interactive prototype that participants could click through on their computers, I wasn’t quite able to complete it in time.

In the coming school year, CLAWS has opted to conduct all collaboration and research activities virtually, including HITL usability testing. Knowing this in advance, I’ve begun thinking about how to get the most out of remote testing. First, unlike last year, I am pushing for a more agile and iterative design cycle.

    Instead of spending months evaluating our own work before showing it to test participants, I am seeking to test once a month, beginning with a simple paper prototype that we can test remotely with Marvel App. Based on our findings from these tests, we can improve our design. With Marvel, you simply draw your screens out by hand, take photos of them, and then you can link them together with interactive hotspots for test participants to click through.

Initially, I had proposed Adobe XD as a means of putting together an interactive prototype for remote testing and demonstration purposes. With XD, designers can create complex prototypes that complement the modularity ATLAS requires. You can create components, and instead of having to create multiple screens to represent every interaction, you can create every interactive state of that component within the component itself! On top of this, XD allows designers to connect sound files to interactions. Sound files like this one:

    PremiumBeat_0013_cursor_click_06.wav

    …which could be used to provide audio feedback letting the user know the system has accepted the user’s command.

Depending on how complex we want to get with our prototype, we could even test the implementation of our Voice Entity for Guiding Astronauts (VEGA), the Jarvis-like AI assistant.

    This will be a great way to test ease of use and overall experience before committing the design to code. However, I’ve also begun thinking about the best way to demonstrate our final deliverable to wider audiences. Even if we have a vaccine, it’s likely that a lot of conferences will still be held virtually. Furthermore, this is a big project, with a lot of students working on it, and we should have a final deliverable that showcases our work in an easily accessible format in order to feature it in our portfolio.

    One of the possibilities I’m exploring is wiarframe. This is an app that allows you to set up your AR interface using simple images of your interface components.

    The wiarframe design canvas

    Designers can also prototype a variety of look (gaze, stare) and proximity (approaches, reaches, embraces, retreat) gesture interactions where the component can change state, manipulate other components, even open a URL, call an API, or open another wiarframe interface. This ability to open another wiarframe could enable my team to prototype and link together the individual modules for the user to navigate between.

Wiarframe is really useful when it comes to AR on mobile devices, less so when the AR comes from a head-mounted display (HMD): to open a wiarframe prototype, users must download the mobile app and then anchor the interface to a surface.

This is really fun, but there is no sense of immersion. Back at our lab, the BLiSS team created a near life-sized mockup of an ISS airlock with which to immerse test participants in a kind of analog environment. This is common for testing designs for human-computer interaction in space. It is still too costly to test designs on actual users in the context of spaceflight (Holden, Ezer, & Vos, 2013).

To get the best feedback out of remote usability testing, we’re going to need an immersive environment that is cheap, relatively easy to put together, and widely accessible, so that we don’t constrain our recruiting pool to the point where we can’t find participants with the appropriate equipment to test with.

I believe these requirements can be met, and our problems solved, with A-Frame. A-Frame allows creators to build WebVR experiences with HTML and JavaScript that anybody with a web browser can access. What’s more, users can fully immerse themselves in the VR environment with a headset like the Vive, Rift, Daydream, or GearVR.
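For a sense of scale, a complete A-Frame scene fits in a single HTML file. The snippet below is a minimal sketch of a lunar-analog starting point; the version number, colors, and layout are my own illustrative choices, not the actual CLAWS environment:

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame loaded from its CDN; 1.0.x was current at the time of writing -->
    <script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Flat gray plane standing in for regolith -->
      <a-plane rotation="-90 0 0" width="100" height="100" color="#6b6b6b"></a-plane>
      <!-- Black sky approximating the airless lunar horizon -->
      <a-sky color="#000000"></a-sky>
      <!-- A distant blue sphere standing in for Earth -->
      <a-sphere position="0 25 -60" radius="4" color="#3a6ea5"></a-sphere>
      <!-- Camera with mouse-look and WASD movement controls -->
      <a-entity camera look-controls wasd-controls position="0 1.6 0"></a-entity>
    </a-scene>
  </body>
</html>
```

Opening a file like this in any WebVR-capable browser renders the scene on a flat screen; with a headset connected, the same page becomes fully immersive.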

On top of this, as I was exploring what A-Frame could do through the Showcase examples, I came across Access Mars, a WebVR experiment by NASA. Built with A-Frame, it gives users the opportunity to explore the real surface of Mars, reconstructed as a mesh from images recorded by NASA’s Curiosity rover. Users can actually move around to different areas and learn about Mars by interacting with elements.

    An image from Access Mars instructing users on how to interact with it.

New to A-Frame, I wasn’t really sure where to begin. Luckily, Kevin Ngo of Supermedium, who maintains A-Frame, has a lot of his components available on GitHub. With limited experience, I was able to find a suitable starting environment, and with a few minor changes to the code, I developed an initial lunar environment.

    Screenshot of the A-Frame lunar analog environment

    If you’d like to look around, follow this link:

    https://mtthwgrvn-aframe-lunar-analog.glitch.me/

I’ll be honest: there’s not much to see. Still, I’m excited about how easy it was to put this together. Similar to Access Mars, I’d like to develop this environment a little more so that users can do some basic movement from location to location. If we use this to test the Rock Identification for Geological Evaluation w.LIDAR(?) (RIGEL) interface, some additional environmental variables would have to be implemented to better simulate geological sampling. Physics models can also be incorporated to support controllers, which would allow users with one of the VR headsets mentioned above to manipulate objects with their hands. The downside is that this would limit who we could recruit as testing participants.
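As a sketch of what that controller support could look like: the community aframe-physics-system and super-hands libraries expose components for flagging entities as physics bodies and as grabbable targets. Pairing them as shown here is my assumption of a plausible setup, not our tested configuration:

```html
<!-- Sketch only: assumes aframe-physics-system and super-hands are
     loaded as <script> tags alongside A-Frame itself. -->
<a-scene physics>
  <!-- Static ground plane for the sample to rest on -->
  <a-plane rotation="-90 0 0" width="50" height="50" color="#6b6b6b" static-body></a-plane>
  <!-- A "rock" the participant could pick up with tracked controllers -->
  <a-sphere position="0 1 -1.5" radius="0.15" color="#555555"
            dynamic-body hoverable grabbable></a-sphere>
  <!-- Controller entities that perform the hovering and grabbing -->
  <a-entity hand-controls="hand: left" super-hands></a-entity>
  <a-entity hand-controls="hand: right" super-hands></a-entity>
</a-scene>
```

A participant without tracked controllers would still see the scene, but couldn’t grab the sample, which is exactly the recruiting trade-off described above.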

If nothing else, I want to be able to test with users through their own web browser. Ideally, they’ll be able to share their screen so I can see what they’re looking at, and their webcam so I can see their expression while they’re looking at it. While it’s not the same as actually being on the surface of the Moon, creating analog environments for simulating habitat design is relatively common at NASA (Stuster, 1996; Clancey, 2004; see also NEEMO and BASALT). A WebVR environment as a lunar analog in which to test AR concepts follows this approach.

    For usability scoring, we are using the standard NASA TLX subjective workload assessment as a Qualtrics survey to get feedback ratings on six subscales:

    • Mental demand
    • Physical demand
    • Temporal demand
    • Performance
    • Effort
    • Frustration
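As a sketch of how those ratings combine, the unweighted “Raw TLX” variant of the assessment simply averages the six subscale scores, each rated on a 0–100 scale. The function below illustrates that arithmetic only; it is not our actual Qualtrics analysis pipeline:

```javascript
// Raw TLX: the unweighted mean of the six subscale ratings.
// Each rating is expected on the standard 0-100 scale.
const TLX_SCALES = ['mental', 'physical', 'temporal', 'performance', 'effort', 'frustration'];

function rawTlx(ratings) {
  const sum = TLX_SCALES.reduce((acc, scale) => {
    if (!(scale in ratings)) throw new Error(`missing subscale: ${scale}`);
    return acc + ratings[scale];
  }, 0);
  return sum / TLX_SCALES.length;
}

// Example: a participant reporting moderate workload overall
// rawTlx({ mental: 60, physical: 20, temporal: 50,
//          performance: 30, effort: 70, frustration: 40 }) → 45
```

The full NASA TLX also defines a weighted variant based on pairwise comparisons of the subscales; the raw average shown here is the more common shortcut in usability work.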

But testing aside, I also think WebVR is the best way to showcase our project as a readily accessible and interactive portfolio piece that interviewers could play with simply by clicking a link while we describe our roles and what we did on the project. On top of this, with outreach being a core component of the work we do in CLAWS, a WebVR experience is ideal for younger students to experience ATLAS from the comfort and safety of their own homes.

    References

    Clancey, W. J. (2004). Participant Observation of a Mars Surface Habitat Mission. Moffett Field, CA: NASA-Ames Research Center.

Holden, K., Ezer, N., & Vos, G. (2013). Evidence Report: Risk of Inadequate Human-Computer Interaction. Human Research Program: Space Human Factors and Habitability, 1–46.

  • The UX of Bioastronautics

    Bioastronautics is a focus area of aerospace engineering that specializes in the study and support of life in space. This area of research spans the biological, behavioral, medical and material domains of living organisms in spaceflight. Increasingly, it’s also being applied to space habitat environments. And while the body of research spans decades, there is little information available regarding the user experience. I’d like to change that.

    Artistic rendition of Space Station Freedom with the STS Orbiter Vehicle
    Space Exploration Initiative — Wikipedia

Up until recently, the emphasis has been on pushing the bounds of what’s technologically possible and making it work, and to a large extent this will continue to be true. However, we are on the precipice of a new frontier in which bioastronautics is open to the input of user experience research and design: to optimize the design for the users rather than train the users on how to use the design.

    Below I’ve outlined several gaps in HCI research related to bioastronautics that NASA has identified as presenting a risk to astronauts.

    From NASA’s 2013 Evidence Report: Risk of Inadequate HCI, research gaps include:

    • Methods for improving human-centered design activities and processes
    • Tools to improve HCI, information presentation/acquisition/processing, and decision making for a highly autonomous environment
    • Tools, methods, and metrics which support the allocation of attention and multitasking for individuals and teams
    • Validation methods for human performance models

    Evidence collected in this report details contributing factors that are pertinent for the investigation by the HCI researcher. These include:

    • Requirements, policies, and design processes
    • Informational resources/support
    • Allocation of attention
    • Cognitive overload
    • Environmentally induced perceptual changes
    • Misperception/misinterpretation of the displayed information
    • Spatial disorientation
    • Design of displays and controls

I’m a graduate student studying Information Science at the University of Michigan and the Usability Testing Coordinator for CLAWS (Collaborative Lab for Advancing Work in Space). My role is as a UX/UI specialist involved in the research and design of ATLAS (AR Toolkit for Lunar Astronauts and Scientists), built to compete in the NASA SUITS and M2M X-Hab design challenges.

    Bioastronautics research is still primarily engaged with human factors research dedicated to hardware and industrial design. The application of HCI is lacking, which is why the CLAWS team began actively recruiting from UMSI. The bulk of the team is composed of aerospace, mechanical and industrial engineering, as well as computer science majors.

To implement the human-centered design strategy, I would start by conducting an ethnographic study through participant observation and contextual inquiry with my team to better understand the culture of bioastronautics. Placing more emphasis on HITL as simulated usability testing, I’ll seek to validate our methods both in the BLiSS lab and remotely. Due to the COVID-19 pandemic and self-isolation, we’ve had to scrap my HITL plan, and I’m currently in the process of adapting a prototype in XD for remote usability and heuristic testing. Below is a cursory view of the design.

    https://xd.adobe.com/view/482cc044-b8d9-4893-40e6-4b75514adf7f-3e1d/

    Interestingly, our self-isolation presents an opportunity to better understand the sort of issues astronauts will face in space. After all, astronauts on the Moon cannot conduct in-person meetings with ground control. This is specifically one of the target opportunities for HCI concerning the bioastronautics of space travel and exploration. Astronauts on future EVA missions will not be in constant contact with ground control as they have been up to now. Information systems, therefore, need to be designed to maximize autonomy and optimize information processing while simultaneously reducing cognitive load.

    A pertinent example is the GeoNotes protocol we are currently working on. The Artemis generation astronauts are not geologists, save one. But they still need to be able to conduct high-quality lunar sampling and take sufficient field notes for planetary scientists back on Earth, so our task has been to design a geological sampling protocol that supports the needs of the Earth-based scientists as well as the autonomous astronaut.

Astronauts are cyborgs. They are the people for whom the term was coined. “For the exogenously extended organizational complex functioning as an integrated homeostatic system unconsciously, we propose the term ‘Cyborg’.” — Manfred E. Clynes and Nathan S. Kline

I come from a background in Anthropology, specifically four-field Anthropology. This is the common format of American Anthropology, and it proposes holism: an equal understanding of people, individually and in groups, achieved by researching humans through biological, cultural, linguistic, and archaeological (or material) contexts. What initially drew me to the field of Information was, first and foremost, the interdisciplinary approach. Drawing on my background in Anthropology, I have a penchant for synthesis. Then I came across a TED Talk by Amber Case, “We are all cyborgs now.”

    Amber’s argument is that because we are storing whole swathes of our brains, creating alternate identities, and communicating with each other through digital technologies, we are all cyborgs now. I also hold this view.

    Everything humans do regarding actually leaving Earth’s atmosphere and spending increasing lengths of time in space or on extraterrestrial bodies is in the realm of bioastronautics. All of that technology, from spacesuits to the shuttle, is concerned with supporting life in space. The body of research into the topic thus far has primarily centered around hardware and industrial or mechanical design and engineering. Increasingly, an emphasis on HCI needs to be made to close research gaps identified by NASA and provide adequate UX to end-users as humans seek to spread out and begin colonizing our solar system.