Matthew Garvin

UX Research, Design, Strategy

NASA ExMC DesignOps

Context and background


From January through April 2021, I was contracted by NASA ExMC to work as an embedded UX/UI Architect. I worked alongside the Model-Based Systems Engineering (MBSE) team, a group of pioneers and innovators within NASA transitioning to a model-based systems engineering paradigm as part of a digital transformation strategy. The goal is to help diverse stakeholders understand a given medical system and to create a more agile design and development process. Before MBSE, this information was scattered across numerous documents, presentation slides, and spreadsheets.

(a) Traditional systems engineering; (b) model-based systems engineering.

My role

As the UX/UI Architect for NASA's Exploration Medical Capability (ExMC), I was the sole user researcher and designer on the project, responsible for leading all recruitment, interviews, synthesis, usability sessions, and presentation of findings across the four-month engagement.

Timeline

The project ran for four months, from January to April 2021, and deliverables were created and handed over within that window. The timeline broke down as follows:

  • Project kickoff: 1 week
  • Design ethnography: 2 months
  • Heuristic evaluation: 1 week
  • Recruiting: 2 weeks
  • Usability testing: 1 week
  • Analysis: 1 week
  • Iterative design: ongoing from week 2 until reporting began
  • Reporting: 2 weeks

Research Statement and Goals


Initially, the client was only interested in having me implement a new wireframe informed by a prior round of usability testing, and then test it. However, as the project progressed, I learned that the production of new system models was expected to ramp up significantly. The project's goals therefore grew to encompass developing DesignOps and reusable assets as a design system.

These goals included creating more efficient workflows, improving the quality and impact of design outputs, evaluating the interface to comply with usability principles, and assessing how stakeholders use the model interface to surface top pain points.

Research Methodology


For this project, I utilized four different research methods to fit the timeline and particular constraints of the project. These methods were design ethnography, RITE (Rapid Iterative Testing and Evaluation), heuristic evaluation, and usability testing.

Design ethnography

The first month of the project focused on immersing myself in the context of model-based systems engineering. As I was unfamiliar with MBSE practices, I completed a certification in MBSE from the University at Buffalo (SUNY) on Coursera as part of my desk research.

However, because obtaining permission from NASA IT to use the MBSE software, MagicDraw, proved difficult, I adopted design ethnography to better understand MBSE practices, team composition, and what the project required of me. Design ethnography combines participant observation with participatory action and design research, making it useful for generating insights into users and teams and for improving their design operations.

Engaging in such a context comes with its own challenges, such as working with a distributed team of busy, highly technical people who communicate mostly through email and various Microsoft Teams groups. I therefore observed interactions between team members during biweekly sprint planning and element meetings, as well as weekly standups. Additionally, I arranged a number of interviews during this period, selecting participants based on their relevance to the representative user profiles, including:

  • 5 systems engineers (SE/MBSE)
  • 6 clinicians (physicians, nurses, pharmacists)
  • 1 element scientist

During the interviews, I asked participants about their review process and gathered insights into their varying information needs, pain points about milestone reviews, and common interaction patterns across user groups. These insights were used to derive several specific questions of interest related to how different users access and use the model.

Analysis and synthesis

A thematic analysis was performed on transcriptions of participant responses to extract key themes. The analysis focused on three criteria related to potential weaknesses/serious issues in the user interface:

  1. What are each user group's primary interests when conducting milestone reviews of medical system foundations?
  2. What are each user group's primary pain points with regard to milestone reviews?
  3. How can we promote consistency in the UI among MBSEs?

Findings

Among our top findings were:

  • Varying information needs: Different user groups prioritize different information, making it important that all information is easily accessible.
  • Single “source of truth”: The model serves as a single source of truth for stakeholders to align on and promote consistency in presentation.
  • Overview first, zoom and filter, details on demand: Shneiderman's classic mantra proved the simplest data visualization pattern to satisfy all groups.

From these findings, I derived several specific questions of interest for fine-tuning the interface across user groups using the RITE method. These questions addressed the different needs of clinicians, engineers, and mission management.

Recommendations

As both the researcher and designer, I aimed to apply human-centered design principles through rapid prototyping and evaluation with user data. While achieving a 100% correct design is impossible, I leaned on the Pareto Principle, also known as the 80/20 rule, which suggests that 20% of the effort generates 80% of the results. In simpler terms, I wanted feedback on my initial design before investing significant time and effort in the wrong direction.

Rapid Iterative Testing and Evaluation


Although I could not access the actual model initially, I had access to a trial version of MagicDraw, which allowed me to develop a designer’s workflow and set up a sandbox to design in. I discovered that the original interface designed by the MBSE lead was built using a content diagram instead of MagicDraw’s UI tools for designing the HTML report.

Engineer-designed content diagram as HTML Report

Therefore, I had to produce step-by-step documentation on unlocking the UI design capabilities within MagicDraw. Initially, I was tasked with implementing an updated design based on a wireframe from a previous contractor.

The previous contractor’s wireframe was still riddled with design issues

However, I proposed a simpler design that utilized common interaction patterns to improve usability, such as a document-style interface.

Wikipedia is a good example of Shneiderman's mantra in use

To demonstrate the value of this approach, I backed up my design rationale with user data, which showed reduced errors and task failures over time as changes to the interface were made using the RITE method.

A record of errors and failures over time as changes to the interface were made using the RITE method

Utilizing Success Criteria Scoring (SCS), I tracked errors, failures, and fixes in the interface.

Total problems found | Problems receiving a fix | Impact ratio
28 | 28 | 100%

This allowed me to calculate the impact and re-fix ratios, which showed the effectiveness of the RITE approach in finding and fixing problems.

Total fixes (including “re-fixes”) | Changes that needed “re-fixing” | Re-fix ratio
34 | 6 | 18%
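To make the arithmetic behind these ratios concrete, here is a minimal sketch in Python (the variable names are my own; the counts are taken from the tables above):

    # Counts from the RITE sessions summarized above
    problems_found = 28
    problems_fixed = 28
    total_fixes = 34      # includes "re-fixes"
    refixes = 6           # changes that needed "re-fixing"

    # Impact ratio: share of identified problems that received a fix
    impact_ratio = problems_fixed / problems_found    # 28 / 28 = 1.00

    # Re-fix ratio: share of fixes that themselves had to be fixed again
    refix_ratio = refixes / total_fixes               # 6 / 34 ≈ 0.18

    print(f"Impact ratio: {impact_ratio:.0%}")        # 100%
    print(f"Re-fix ratio: {refix_ratio:.0%}")         # 18%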

For example, participants had difficulty navigating to specific documentation within the model, but implementing a Home button and breadcrumb UI pattern helped solve this issue. However, it became clear that the breadcrumb pattern would not be sustainable if the engineers creating the interface had to recreate it manually. Therefore, we turned to Shneiderman's mantra: overview first, zoom and filter, details on demand.

Heuristic Evaluation


The heuristic evaluation technique is a quick and effective usability method that enables design teams to identify and fix easily observable usability issues, allowing for a more efficient use of time with real users during usability testing. Jakob Nielsen’s 10 Usability Heuristics for UI Design serve as a useful set of general design principles that can be incorporated into good design practice.

Cognitive walkthroughs can complement heuristic evaluations by providing evaluators with a user perspective, particularly useful when combined with personas.

The heuristic evaluation process

When performing a heuristic evaluation, it is typical to use 3–5 evaluators and to synthesize their findings and recommendations. However, it is important to keep in mind that evaluators may misrate the severity of an issue or identify issues that do not actually exist, making it necessary to follow up with usability testing to validate findings.

0 = No usability problem
1 = Cosmetic problem
2 = Minor usability problem
3 = Major usability problem
4 = Usability catastrophe
Problem severity ratings

Findings

In this evaluation, the only remaining violations were limitations of the tool, such as the lack of support for helping users recognize, diagnose, and recover from errors, which violates Nielsen's 9th heuristic. It is important to note that being too close to the design may bias an evaluator's judgment; this can be remedied by testing whether real users can actually use the design successfully.

Usability Testing


As my time on the project came to a close, I conducted a round of usability testing on a single, final iteration of the interface using five participants from the same stakeholder groups who had not participated in an earlier iteration.

Technical plan

The testing took place over the course of a week using Microsoft Teams to host and record the sessions. Each session was scheduled for 1.5 hours, with thirty minutes as a buffer for discussion or technical issues.

Tools and instruments

To prepare for the test, I drafted a usability test checklist from the tasks and steps defined in the heuristic evaluation and created a document for participants that outlined these tasks and provided short scenarios for context. During the test, I asked participants to adopt the “think aloud” protocol and gently reminded them of this if they were quiet for a long time.

I used the Success Criteria Scoring (SCS) metric to score each step of the overarching task as either a pass (1), struggle (0), or fail (-1). After completing all the tests, I used this data to calculate each step's success rate using the formula (S + (P * 0.5)) / O, where S is the number of passes, P the number of struggles, and O the number of possible scores. To identify where participants had the most trouble with the interface, I also calculated each step's differential (the sum of its scores minus the count of its scores), where more negative values flag weaker steps.

Pass | Struggle | Fail
1 | 0 | -1
Success Criteria Scoring (SCS)
SCS differential visualizing weakest points of the interface
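To illustrate how these numbers come together, here is a minimal sketch in Python; the sample scores are hypothetical, but the formulas follow the description above:

    # SCS scores for one task step across five participants:
    # pass = 1, struggle = 0, fail = -1 (hypothetical sample data)
    scores = [1, 1, 0, 1, -1]

    S = scores.count(1)    # number of passes
    P = scores.count(0)    # number of struggles
    O = len(scores)        # number of possible scores

    # Success rate: struggles count as half a success
    success_rate = (S + (P * 0.5)) / O          # (3 + 0.5) / 5 = 0.70

    # Differential: sum of scores minus count of scores;
    # more negative values flag the weakest steps
    differential = sum(scores) - len(scores)    # 2 - 5 = -3

    print(f"Success rate: {success_rate:.0%}")  # 70%
    print(f"Differential: {differential}")      # -3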

I also used the Single Ease Question (SEQ) and the Usability Metric for User Experience (UMUX-Lite) to gather quantitative data on ease of use and overall user experience, respectively. The usability testing also collected qualitative data by recording participants' thoughts and feelings and through follow-up questions and debriefs.
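For reference, UMUX-Lite consists of two seven-point items ("This system's capabilities meet my requirements" and "This system is easy to use"). A minimal sketch of its scoring, following Lewis et al. (2013), with hypothetical responses:

    # UMUX-Lite: two 7-point items (1 = strongly disagree, 7 = strongly agree)
    capabilities = 6    # "This system's capabilities meet my requirements."
    ease_of_use = 7     # "This system is easy to use."

    # Rescale the summed items to a 0-100 score
    raw = (capabilities - 1 + ease_of_use - 1) / 12 * 100    # ≈ 91.7

    # Regression adjustment from Lewis et al. (2013) to approximate SUS
    sus_estimate = 0.65 * raw + 22.9                         # ≈ 82.5

    print(f"UMUX-Lite: {raw:.1f} (SUS estimate: {sus_estimate:.1f})")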

Findings

The usability testing revealed virtually no usability issues with the interface itself, though some minor confusion over jargon persists across user groups.

The only major issue that remained related to the permissions required to use the tool and access the interface and linked documents, which was beyond my control and stemmed from constraints of the MagicDraw and Cameo Collaborator software.

Recommendations


Although the usability test showed only minor issues, my findings informed some recommendations for further work.

First, fully explore the capabilities of UI Design in MagicDraw to ensure that all available tools and features are being used to support the project goals.

Second, focus on making the workflow to build and update the model interfaces sustainable. The end-users are only one half of the equation, and ensuring that the tool and interface are easy to maintain and update over time is crucial.

Design Principles

One of the guiding principles of the federal government's design system is to make the best thing the easiest thing. The system complies with the latest federal policies on accessibility and supports modern web development practices.

From both a technical and a team perspective, designing for flexibility and performance is another important principle. To ensure efficiency and consistency, teams are encouraged to reuse basic components and patterns instead of creating them from scratch for each new product.

Side Navigation

Side navigation is essential to the system's strategy, providing an overview and a clear hierarchy. It offers stepper navigation, which users found helpful in moving through the system.

Impact


My work with the MBSE team at NASA had a significant impact. The virtual presentation of my innovative approach to creating the model interface drew almost 400 attendees, indicating widespread interest. The interviews I conducted segmented the user groups, and RITE testing reduced usability issues by almost 100%. The design's effectiveness was further validated through heuristic evaluation and usability testing. Two MBSE team members were able to implement my design, resulting in two model interfaces up and running. Adoption of the new system has continued to grow, and my work contributed to two publications disseminating the approach to a wider audience.


References


Lewis, J. R., Utesch, B. S., & Maher, D. E. (2013). UMUX-LITE: When there's no time for the SUS. CHI 2013: Changing Perspectives, Paris, France, 2099–2102.

Medlock, M. C., Wixon, D., Terrano, M., Romero, R., & Fulton, B. (2002). Using the RITE method to improve products: A definition and a case study. Usability Professionals Association, 51.

Nielsen, J., & Mack, R. L. (1994). Usability Inspection Methods. New York, NY: John Wiley & Sons.

