Club Overview

CLAWS is a multi-disciplinary group of 60 undergraduate and graduate students that has competed in the NASA SUITS (Spacesuit User Interface Technologies for Students) Challenge since 2018. The team builds augmented reality (AR) interfaces to assist astronauts in their exploration of the lunar surface. CLAWS members have the opportunity to learn and apply industry skills in a collaborative project team, and engage with the broader space community.

The club is committed to maintaining a fun and friendly environment. The team is organized to provide members with support and guidance throughout the development process. At the end of the year, CLAWS travels to the Johnson Space Center to present a final product to NASA, a thrilling highlight of each year.

The team is structured with leadership from the Project Manager, Adhav Rajesh. The CLAWS Board, consisting of five of the most experienced members, provides guidance and assists with the high-level management of the organization. Technical Advisors are responsible for overseeing each subteam’s technical aspects and onboarding new members. Product Leads are tasked with leading Feature Teams, which each build individual features and are typically composed of several members from each subteam.

Paramount to CLAWS is a commitment to providing an open and inclusive environment for students of all backgrounds and identities. The team strives for every member to be supported and engaged in their exciting work on space exploration technology.

CLAWS frequently hosts team social events throughout the year, including Friendsgiving, movie nights, Squid Game competitions, and boba trips. The club has presented at several conferences, including the XR @ Michigan Summit, UX@UM Conference, and the U-M Space Symposium. The team also puts on several outreach events each year to teach K-12 students about science, technology, and space exploration.

NASA SUITS Challenge

As NASA launches the Artemis program for sustained human presence on the lunar surface and ultimately, Mars, engineers are considering what technology will best aid astronauts to safely and successfully complete science and exploration missions. Today, the Mission Control Center at NASA's Johnson Space Center in Houston relays all pertinent information to the crew via a voice loop. In the future, communication delays upwards of 20 minutes to the surface of Mars will require crew members to have more autonomy.

NASA’s Joint-AR project (JARVIS) investigates the potential of augmented reality (AR) displays in astronaut suits. There are numerous benefits of an astronaut AR system:
• Increased autonomy and efficiency for astronauts
• Clearer understanding of information through visualizations
• Greater access to specialized knowledge with less training

Stemming from NASA’s Joint-AR project, the NASA SUITS Challenge tasks university teams with developing AR interfaces for lunar astronauts. The helmet display is intended to assist astronauts with navigation, task management, vitals tracking, geological sample logging, and communication with mission control. In addition to technical delivery, the challenge focuses heavily on utilizing intentional design to craft a polished user experience. Designs from the challenge will serve as inspiration for NASA as the agency develops future space exploration technology for the Artemis program.

At the end of each year, nine university teams are selected as finalists to present their work at the NASA Johnson Space Center in Houston, Texas. NASA scientists, engineers, designers, and astronauts evaluate the student-built projects, providing invaluable feedback. Past presenters have included teams from Stanford, Duke, USC, UC Berkeley, Carnegie Mellon, and CLAWS from U of M.

Subteams

AR

The AR Team is responsible for developing the spatial interfaces designed by the UX Team. This involves front-end development (screens, UI functionality, anything the user sees or interacts with) as well as back-end development (software architecture, telemetry communication, MCC communication, VEGA integration).
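
As a rough illustration of the back-end side, the sketch below shows how a service might poll a telemetry endpoint and flag out-of-range vitals before they are surfaced in the headset. The URL, field names, and thresholds are assumptions for the example, not the actual SUITS telemetry schema, and the real AR client is built in a game engine rather than Python.

```python
# Hypothetical back-end sketch: poll a telemetry endpoint and flag out-of-range vitals.
# The URL, field names, and thresholds are assumptions, not the actual SUITS schema.
import time
import requests

TELEMETRY_URL = "http://localhost:8080/api/telemetry"  # assumed endpoint


def poll_vitals(interval_s: float = 1.0) -> None:
    """Repeatedly fetch vitals and print an alert when they leave a safe range."""
    while True:
        try:
            data = requests.get(TELEMETRY_URL, timeout=2).json()
            heart_rate = data.get("heart_rate", 0)
            o2_pressure = data.get("o2_pressure", 0.0)
            if heart_rate > 160 or o2_pressure < 3.0:  # illustrative thresholds
                print(f"ALERT: vitals out of range: {data}")
        except requests.RequestException as exc:
            print(f"telemetry poll failed: {exc}")
        time.sleep(interval_s)


if __name__ == "__main__":
    poll_vitals()
```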

UX Design

The UX Design Team is responsible for designing interfaces for the AR and Web Teams to develop. This involves incorporating findings from the Research Team to map out intuitive, functional experiences. The designers will test out user flows through design prototypes, utilizing the core principles of visual design (layout, hierarchy, color, typography, motion) to best address astronauts’ needs.

Web

The Web Team is responsible for developing and implementing the Mission Control Center (MCC) dashboard, which ties into the HoloLens’ dynamic AR capabilities. The MCC interface will establish a seamless web connection between astronauts and mission control, facilitating real-time communication, data sharing, and task coordination. This year, the Web Team will implement multiplayer support, a database, chat mechanisms, geo-sampling data, and logging of real-time astronaut locations, danger zones, and waypoints.
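
As a minimal sketch of the real-time piece, the example below shows one way an MCC relay could broadcast astronaut position updates to every connected dashboard client over a WebSocket, here using FastAPI. The endpoint path and message schema are illustrative assumptions rather than the team's actual implementation.

```python
# Hypothetical MCC relay: astronaut clients push position updates over a WebSocket and
# every other connected dashboard client receives the broadcast. The endpoint path and
# message schema are illustrative assumptions. Run with: uvicorn mcc_relay:app
from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
clients: set[WebSocket] = set()


@app.websocket("/ws/telemetry")
async def telemetry_relay(websocket: WebSocket) -> None:
    await websocket.accept()
    clients.add(websocket)
    try:
        while True:
            # e.g. {"type": "position", "astronaut": "EV1", "lat": ..., "lon": ...}
            message = await websocket.receive_json()
            for client in clients:
                if client is not websocket:
                    await client.send_json(message)
    except WebSocketDisconnect:
        clients.discard(websocket)
```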

Hardware

The Hardware Team is responsible for developing any hardware and electrical components that will improve the functionality of next year’s application. Hardware Team members will utilize a diverse range of technologies, including CAD, Raspberry Pi programming, and circuit design.
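
For a flavor of the Raspberry Pi side, here is a minimal gpiozero sketch that lights an indicator LED while a pushbutton is held. The pin assignments and the button/LED roles are hypothetical stand-ins for whatever suit-mounted hardware the team builds.

```python
# Minimal Raspberry Pi sketch using gpiozero; pin numbers and the LED/button roles are
# hypothetical stand-ins, not actual CLAWS hardware.
from signal import pause

from gpiozero import LED, Button

status_led = LED(17)  # assumed GPIO pin for an indicator LED
trigger = Button(2)   # assumed GPIO pin for a pushbutton

# Light the LED while the button is held down.
trigger.when_pressed = status_led.on
trigger.when_released = status_led.off

pause()  # keep the script alive, waiting for GPIO events
```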

AI

The AI Team, also known as the VEGA team, works to integrate NLP- and CV-related technologies into the final application. AI developers will work closely with the AR Team to connect various backend frameworks to the HoloLens. We will be working with the Django REST framework, OpenCV, AWS EC2, LLMs (Llama 2, ChatGPT), and Rasa intent classification, among other tools.
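
As a hedged sketch of how these pieces could fit together, the endpoint below accepts a voice-command transcript and forwards it to a locally running Rasa server for intent classification behind a Django REST Framework view. The URL, payload fields, and response shape follow the open-source Rasa HTTP API as we understand it and should be treated as assumptions.

```python
# Hypothetical Django REST Framework view that forwards a voice-command transcript to a
# locally running Rasa server for intent classification. The Rasa URL and response
# fields follow the open-source Rasa HTTP API; treat them as assumptions.
import requests
from rest_framework.decorators import api_view
from rest_framework.response import Response

RASA_URL = "http://localhost:5005/model/parse"  # assumed local Rasa server


@api_view(["POST"])
def classify_command(request):
    """Accept {"text": "<transcript>"} and return the predicted intent."""
    transcript = request.data.get("text", "")
    parsed = requests.post(RASA_URL, json={"text": transcript}, timeout=5).json()
    intent = parsed.get("intent", {})
    return Response({"intent": intent.get("name"), "confidence": intent.get("confidence")})
```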

Research

The Research Team is responsible for working with UX to assess the usability and efficacy of our application. The team is tasked with identifying focus areas for our application by studying spaceflight, human-computer interaction (HCI) best practices, cognitive science research, and user feedback. The Research Team will conduct organized and insightful user testing that will be compiled and published, helping to establish CLAWS in the scientific community.

Business

The Business Team will be divided into three distinct committees, each with unique responsibilities. The Finance Committee will be in charge of maintaining CLAWS finances and exploring grants and fundraisers to enable our NASA trip. The Internal Relations Committee will be in charge of planning social events, updating internal tools like the CLAWS website, and creating an itinerary for the Houston trip. Lastly, the External Relations Committee will be in charge of maintaining our social media presence, organizing outreach events, and communicating with other University of Michigan organizations.

Past Projects

NOVA 2022-2023

Our team pursued six key objectives with NOVA: guiding astronauts in UIA egress procedures, displaying vital signs, aiding navigation (pathfinding, waypoint placement, and avoiding danger zones), enabling geo-sampling, issuing rover commands, and facilitating messaging between mission control and astronauts. To accomplish this, CLAWS introduced four vital features. The first, the HoloLens UI, allows the user to interact with pop-ups, buttons, alerts, and other elements that guide them along a mission. The Voice Entity for Guiding Astronauts, or “VEGA”, is an AI voice assistant that listens to and performs basic user commands through speech-to-text. The Light Unit Navigation Aid, or LUNA, is an extension that allows the user to position displays outside their primary field of view (FOV). The Mission Control Center is a web application that allows users to communicate, monitor astronaut vitals, and keep track of mission progress.
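
The snippet below is an illustrative approximation of a VEGA-style speech-to-text command loop, written with the SpeechRecognition library rather than the team's actual pipeline; the command names are invented for the example.

```python
# Illustrative approximation of a VEGA-style speech-to-text command loop using the
# SpeechRecognition library; the command names are invented for the example.
import speech_recognition as sr

COMMANDS = {"open navigation", "show vitals", "start geo sample"}  # hypothetical

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    while True:
        audio = recognizer.listen(source)
        try:
            text = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue  # could not understand the audio; keep listening
        if text in COMMANDS:
            print(f"executing command: {text}")
```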

User testing affirmed NOVA's accomplishments: a sleek, user-friendly interface, versatile input methods, context-responsive voice commands, and effective lighting. Based on user interviews, we've embraced usability insights by delivering concise information and prioritizing eye-gaze and voice commands. In the upcoming phases, CLAWS strives to enhance eye-gaze usability, broaden LUNA's field of vision, deepen MCC integration, and expand VEGA's corpus.

HOSHI 2021-2022

HOSHI is an AR-assistive application that prioritizes an efficient balance of astronaut safety and autonomy. This cutting-edge application serves as a companion for navigation, geological sampling and documentation, as well as search and rescue operations. At its core, our system interacts seamlessly with users through the AI-powered voice assistant “VEGA”.

Through extensive research, we have identified communication challenges between astronauts and Mission Control, primarily the overreliance on audio instructions from the Mission Control Center and the constraints of conventional pen-and-paper methods. These hurdles impose a substantial cognitive burden on astronauts. HOSHI rises to the occasion as an intuitive and user-centric application. It emphasizes safety, autonomy, visibility, accessibility, functionality over learnability, minimized cognitive load, and actionable information. The primary mode of interaction with HOSHI centers around voice commands, executed through the AI voice assistant VEGA. Additionally, secondary interactions such as physical gestures (finger points, gentle hand motions) guide astronauts through their missions. Feedback is then given through visual and audio cues.

Building on what we learned with HOSHI, we plan to expand the scope of interactions in next year’s project. This might mean integrating a broader array of gestures, haptic feedback, eye-gaze tracking, speech-to-text parsing, adaptation to extreme visibility conditions, a more flexible voice assistant, and expanded user testing.

ATLAS 2020-2021

ATLAS is an AR system for the Microsoft HoloLens, designed as an assistive HUD for astronauts on lunar expeditions, and was our first iteration of the SUITS Challenge. This modular system offers streamlined access to mission-critical information via protocols tailored to various EVA stages: mission planning, suit prep, sample collection, repairs, emergencies, and abort procedures.


While the astronaut conducts EVAs, they are given navigation assistance and access to vital information via the voice assistant “VEGA”, as well as tools to aid in geo-sample collection and rover repairs. In the face of emergencies, readily available warnings and a pre-configured abort protocol stand ready to ensure a swift and safe response. With color-coded information levels, redundancy measures using QR codes, and simple hand and voice interactions, the astronauts can intuitively use the system without disrupting the flow of the mission. Additionally, the Mission Control Center (MCC) has the authority to update mission tasks at any point, monitor biometrics, and communicate with the astronaut. Beyond this, the MCC supports off-site mission planners and scientists, fulfilling the dual roles of mission command hub and data repository.


Despite the challenges during the ongoing COVID-19 pandemic, the team was able to fully design the system. At this point, the core software has been developed and the UI elements have been created. In the future, additional work will be needed to integrate the UI into the main AR application and expand the abilities of VEGA.

Grants

Collaborative Lab for Advancing Work in Space

claws-admin@umich.edu

© 2024 CLAWS