
About Us
CLAWS is an interdisciplinary organization at the University of Michigan. Our five core teams—Development, Hardware, UX, Research, and Business—collaborate to design and deliver engineering projects that advance human and robotic exploration in space. The Development Team works across multiple initiatives with a primary focus on software for NASA’s SUITS challenge, while the Hardware Team leads physical systems design and prototyping for NASA’s RASC-AL challenge. The Business Team supports organizational growth, partnerships, and project sustainability while also assisting the Research Team with grants, studies, and technical documentation. The UX Team collaborates across all disciplines to design intuitive interfaces and evaluate XR systems with an emphasis on human factors and astronaut usability.
OVERVIEW
The team structure begins with the Project Manager and Technical Project Manager, who drive club operations on both the business and development sides. This year, Anirudh Annavarapu and Molly Maloney take up the mantle, continuing to reshape the team with more collaborative and productive systems. Throughout the year, they work closely with subteam leads to coordinate projects for mission success.
The Executive Board and Subteam Leads act as experienced advisors, providing ongoing support throughout onboarding and development and helping bridge the gap between leadership and new members. Leadership drives operations and project workflow, with the fall focusing on onboarding and the winter focusing heavily on development. Each subteam has its own approach, but weekly cross-team sessions and clear handoffs—plus rotating highlights and shared onboarding—keep everyone aligned. Feature development begins during onboarding and is integrated into an MVP for the winter semester.
The team is committed to maintaining an inclusive and diverse environment for all students across all disciplines. With a wide range of majors across colleges, the perspectives new members bring are invaluable. CLAWS frequently hosts team social events throughout the year, including retreats, tailgates, Friendsgiving, movie nights, CLAWS Olympics, hackathons, boba trips, and more. The club has presented at several conferences—with more to come in the future—including the XR @ Michigan Summit, the UX@UM Conference, UMSI Convocation, and the U-M Space Symposium. The team also puts on several outreach events each year to teach K-12 and college students about science, technology, and space exploration.
SUITS
As NASA launches the Artemis program for a sustained human presence on the Moon and, ultimately, Mars, engineers are considering what technology will best help astronauts safely and successfully complete their missions. Today, the Mission Control Center at NASA relays all pertinent information to the crew via a voice loop. In the future, communication delays of upwards of 20 minutes to the surface of Mars will require crew members to have more autonomy.
Stemming from NASA’s foundational Joint-AR project, the NASA SUITS Challenge tasks university teams with developing AR interfaces for lunar astronauts, along with pressurized rovers to assist them. The helmet display is designed to support astronauts with navigation, task management, vitals tracking, geological sample logging, and communication with Mission Control and another university’s rover.
At the end of each year, 10 university teams are selected as finalists from their written proposals. NASA scientists, engineers, designers, and astronauts evaluate the student-built projects, providing feedback. Past presenters have included teams from Stanford, Duke, USC, UC Berkeley, Carnegie Mellon, UT-Austin, Northeastern University, Purdue University, Columbia University, Boise State University, and many more.
RASC-AL
Long-term human presence on the Moon through the Artemis program means engineers must design not only individual technologies, but entire mission systems that support sustained surface operations. From sample return logistics to autonomous mobility and lunar base infrastructure, future exploration depends on scalable, integrated architectures that enable science, safety, and continuous operations.
Each year, new project themes are released that direct teams to compete in specific areas of lunar and Martian mission design. The challenge invites university teams to develop forward-looking concepts that address these evolving exploration needs. Submissions are evaluated by NASA scientists and engineers based on technical feasibility, innovation, and mission impact, with top teams recognized for advancing future exploration concepts.
SUBTEAMS
AR Team
The AR Team focuses on developing the user interface for the SUITS challenge within Unity using MRTK3. They are responsible for building front-end AR features with Unity-specific functionality such as spatial interactions, gesture controls, and input logic tailored for augmented reality environments.
Web Team
The Web Team is responsible for building and maintaining the core technical infrastructure for CLAWS. This includes connecting our AR application with NASA’s telemetry server, our partner university’s Pressurized Rover, and our planned and established internal systems across projects.
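As a rough illustration of the kind of integration work this involves, here is a minimal sketch that polls a telemetry endpoint and extracts the fields a headset HUD might display. The URL, field names, and polling rate are placeholders, not the actual SUITS telemetry interface.

```python
# Hypothetical sketch: poll a telemetry server and keep only the fields a
# headset HUD might show. The endpoint, JSON schema, and rate are
# placeholders, not the real SUITS telemetry interface.
import time
import requests

TELEMETRY_URL = "http://localhost:8000/telemetry"  # placeholder endpoint


def fetch_vitals(session: requests.Session) -> dict:
    """Fetch one telemetry snapshot and keep only the HUD-relevant fields."""
    response = session.get(TELEMETRY_URL, timeout=2)
    response.raise_for_status()
    data = response.json()
    return {
        "heart_rate": data.get("heart_rate"),
        "oxygen": data.get("primary_oxygen"),
        "battery": data.get("battery_level"),
    }


def main() -> None:
    session = requests.Session()
    while True:
        try:
            print(fetch_vitals(session))  # in practice, forward to the AR client
        except requests.RequestException as err:
            print(f"telemetry fetch failed: {err}")
        time.sleep(1)  # 1 Hz polling, purely illustrative


if __name__ == "__main__":
    main()
```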
AI Team
The AI Team ideates, develops, and integrates AI features that interface with our projects. Team members explore and implement key AI concepts such as computer vision, natural language processing, and machine learning to create interactive and adaptive experiences.
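For a flavor of the natural language side, the sketch below routes a voice transcript to a handful of made-up intents using simple keyword scoring; a production assistant such as VEGA would rely on a trained language model rather than this toy approach.

```python
# Minimal sketch of intent routing for a voice-assistant transcript.
# Intent names and keyword lists are invented for illustration only.
from dataclasses import dataclass

INTENT_KEYWORDS = {
    "show_vitals": {"vitals", "heart", "oxygen", "battery"},
    "start_navigation": {"navigate", "route", "waypoint", "take me"},
    "log_sample": {"sample", "rock", "geology", "log"},
}


@dataclass
class IntentResult:
    intent: str
    score: int


def classify(transcript: str) -> IntentResult:
    """Pick the intent whose keyword set overlaps the transcript the most."""
    text = transcript.lower()
    words = set(text.split())
    best = IntentResult(intent="unknown", score=0)
    for intent, keywords in INTENT_KEYWORDS.items():
        score = sum(1 for kw in keywords if kw in words or kw in text)
        if score > best.score:
            best = IntentResult(intent=intent, score=score)
    return best


if __name__ == "__main__":
    print(classify("please show my vitals and oxygen level"))    # show_vitals
    print(classify("log this rock sample at the current site"))  # log_sample
```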
Hardware Team
The Hardware Team designs, builds, and implements hardware for the RASC-AL challenge. Members learn and apply skills in CAD, 3D printing, electronics, circuit design, and programming to prototype and translate their designs into working devices.
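As a small, hypothetical example of the programming side of that prototyping, the following Raspberry Pi sketch (using the gpiozero library) blinks a status LED while a pushbutton is held; the pin numbers and behavior are illustrative, not taken from an actual RASC-AL design.

```python
# Illustrative Raspberry Pi sketch (gpiozero): blink a status LED while a
# pushbutton is held. Pin numbers are arbitrary placeholders.
from gpiozero import LED, Button
from signal import pause

status_led = LED(17)    # BCM pin 17, placeholder wiring
arm_button = Button(2)  # BCM pin 2, placeholder wiring


def on_press():
    status_led.blink(on_time=0.2, off_time=0.2)  # fast blink while held


def on_release():
    status_led.off()


arm_button.when_pressed = on_press
arm_button.when_released = on_release

pause()  # keep the script alive waiting for GPIO events
```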
UX Team
The UX Team designs the visuals and plans the functionality for the AR application and peripheral devices. Members focus on determining the scope of features, conducting human factors research, designing consistent UIs in Figma, and carrying out user testing of designs.
Research Team
The Research Team will build a VR simulation of last year’s project to evaluate how effectively it assists users with navigation. We’ll deliver a functional prototype, conduct literature reviews, design and run participant studies, and analyze the results. In collaboration with our faculty advisor, we’ll prepare and submit our findings to appropriate research venues and NASA.
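As a sketch of the analysis stage, the snippet below compares task completion times between two navigation conditions with a Welch's t-test; the numbers are placeholders for illustration, not actual study data.

```python
# Sketch of a study analysis: compare completion times between two
# navigation conditions. Values below are placeholders, not real data.
from scipy import stats

# Hypothetical completion times in seconds (illustrative only)
ar_navigation = [84.2, 91.5, 78.9, 88.0, 95.3, 82.7]
map_baseline = [102.4, 110.1, 98.7, 105.6, 99.9, 108.3]

t_stat, p_value = stats.ttest_ind(ar_navigation, map_baseline, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.4f}")
```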
Finance Team
The Finance Team is responsible for procuring funds for all events, merch, hardware components, and other needs for the club throughout the year. Grant applications play a key role in obtaining these funds, with a large number of deadlines occurring in the first semester.
Outreach Team
The Outreach Team leads CLAWS’ external communications and manages the planning and execution of outreach events. As a key part of CLAWS’ AR initiatives and educational efforts, the team works to engage communities both locally and beyond. Outreach members will be coordinating a variety of events that could form lasting relationships with organizations, schools, and other teams.
Content Team
The Content Team is responsible for creating, managing, and posting content related to all aspects of CLAWS. The Content Team collaborates closely with all subteams to make sure everything is well promoted and documented.
Social Team
The Social Team is responsible for fostering connections among members and ensuring that CLAWS remains a fun and engaging community. Social Team members organize at least one event each month, giving the team plenty of opportunities to connect outside of work. Past events have included a BBQ, a trip to the apple orchard, and a MasterChef event!
PROJECTS
AURA • 2024 - 2025
AURA (Astronaut Unified Reality Assistant) is a unified astronaut interface designed to consolidate previous CLAWS iterations into a single, streamlined AR application. Built for EVA operations, AURA integrates the HoloLens 2 with a Local Mission Control Console (LMCC) and a Pressurized Rover, enabling seamless task execution across ingress/egress, site navigation, vitals monitoring, geological sampling, and rover control. The system emphasizes flexibility, autonomy, and reduced cognitive load. Interaction is achieved primarily through a hybrid gaze-and-button model, which minimizes errors encountered in prior eye-gaze-only systems. VEGA, the AI voice assistant, serves as a robust backup to ensure hands-free operability when needed, while LMCC integration strengthens astronaut–Mission Control collaboration. By consolidating features into a single ecosystem, AURA provides astronauts with a more reliable, intuitive, and mission-ready platform. It marks a pivotal step in CLAWS’ trajectory, transitioning from experimental prototypes to a fully realized operational concept.
IRIS • 2023 - 2024
IRIS (Immersive Reality Interplanetary System) is an AR application designed to support astronauts on Mars EVAs by providing intuitive, voice-driven interfaces for ingress, navigation, rover commanding, vitals monitoring, geological sampling, and equipment repair. The system synchronizes astronaut actions with a Local Mission Control Console (LMCC), creating a robust framework for multi-agent collaboration. At its core, IRIS emphasizes hands-free autonomy. Astronauts primarily interact through VEGA, an AI assistant enhanced to handle multi-step natural language commands. Task-specific “modes” ensure only relevant information is displayed, minimizing cognitive load. Supporting subsystems such as COR (Cardiac + Orientation Reporter) expanded the HoloLens’ field of view, while SCOUT, an autonomous rover with lidar and RGBD mapping, enhanced navigation and geology operations. IRIS represented a major leap in EVA simulation, demonstrating real-time astronaut–rover–Mission Control collaboration across scenarios such as rerouting to rescue a crewmate, conducting rover-assisted sampling, and performing equipment repair. Its modular design and voice-first approach established new standards for autonomy and usability in interplanetary mission support systems.
NOVA • 2022 - 2023
NOVA (Navigational Observation Visual Aid for Astronauts) is an AR application developed for the Microsoft HoloLens 2 to support astronauts during EVA operations with enhanced navigation, geology, and rover commanding. Its design aimed to reduce cognitive load while increasing operational efficiency by combining AR overlays, holograms, and audio feedback to guide astronauts through ingress, navigation, geological sampling, and return procedures. The system introduced innovations such as LUNA (Light Unit and Navigational Aid), a Raspberry Pi–based peripheral that expanded field of view and stabilized holograms, and a custom-built Mission Control Center (MCC), a full-stack web app enabling real-time communication and task synchronization between astronauts and ground operators. NOVA also featured a reimagined version of VEGA, upgraded into an intent-based assistant capable of natural, flexible interaction beyond rigid commands. During human-in-the-loop testing, NOVA validated usability across scenarios such as Trek Mode breadcrumb navigation and rover operation. However, its reliance on an eye-gaze dwell-to-select model revealed usability challenges like false selections and delays. These lessons directly informed the hybrid gaze-and-button interaction adopted in the next iteration, AURA.
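To make that trade-off concrete, here is a framework-agnostic sketch of dwell-to-select timing logic, written in Python for illustration rather than Unity C#; the threshold value is arbitrary and does not reproduce NOVA's actual tuning. A short dwell window fires false selections as gaze sweeps across the UI, while a long one adds the delays NOVA's testing surfaced.

```python
# Framework-agnostic sketch of dwell-to-select: a target activates only if
# gaze stays on it continuously for `dwell_time` seconds. The threshold is
# illustrative, not NOVA's actual value.
from typing import Optional


class DwellSelector:
    def __init__(self, dwell_time: float = 1.0):
        self.dwell_time = dwell_time
        self.current_target: Optional[str] = None
        self.gaze_start: Optional[float] = None

    def update(self, target: Optional[str], now: float) -> Optional[str]:
        """Call every frame with the gazed target (or None) and a timestamp.
        Returns the target id when a selection fires, else None."""
        if target != self.current_target:
            # Gaze moved: restart the dwell timer for the new target.
            self.current_target = target
            self.gaze_start = now if target is not None else None
            return None
        if target is not None and now - self.gaze_start >= self.dwell_time:
            self.current_target = None
            self.gaze_start = None
            return target  # selection fires
        return None


# Shorter dwell times reduce delay but invite accidental selections;
# longer ones do the opposite.
selector = DwellSelector(dwell_time=1.0)
print(selector.update("nav_button", now=0.0))  # None (timer starts)
print(selector.update("nav_button", now=1.2))  # "nav_button" (selection)
```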
HOSHI • 2021 - 2022
HOSHI is an AR-assistive application that prioritizes an efficient balance of astronaut safety and autonomy. This cutting-edge application serves as a companion for navigation, geological sampling and documentation, and search and rescue operations. At its core, the system interacts seamlessly with users through the AI-powered voice assistant VEGA. Through extensive research, we identified communication challenges between astronauts and Mission Control, primarily the overreliance on audio instructions from the Mission Control Center and the constraints of conventional pen-and-paper methods. These hurdles impose a substantial cognitive burden on astronauts. HOSHI rises to the occasion as an intuitive and user-centric application, emphasizing safety, autonomy, visibility, accessibility, functionality over learnability, minimized cognitive load, and actionable information. The primary mode of interaction centers on voice commands executed through VEGA, while secondary interactions such as physical gestures (finger points, gentle hand motions) guide astronauts through their missions. Feedback is then given through visual and audio cues.
ATLAS • 2020 - 2021
ATLAS (the AR Toolkit for Lunar Astronauts and Scientists) is our first iteration of the SUITS Challenge, built for the Microsoft HoloLens 2 in collaboration with U-M’s BLiSS project team. Its modular, compact design streamlined access to mission-critical information via protocols tailored to the various Extra-Vehicular Activity (EVA) stages: mission planning, suit prep, sample collection, repairs, emergencies, and abort procedures. Astronauts received navigation support and vitals readouts through the voice assistant VEGA. A Raspberry Pi–based Mobile Support Equipment (MSE) stack with sensors captured external data, and a Mission Control Center (MCC) developed alongside BLiSS’s X-Cap project maintained reliable system-wide data flow. QR codes surfaced location context on demand and, paired with voice input, enabled hands-free interaction. Emergency alerts and preconfigured abort protocols coordinated between the MCC and ATLAS to communicate biometrics efficiently. Despite COVID-19 disruptions, the team successfully presented ATLAS at NASA and the Exploration Science Forum (ESF) virtually in May 2020. VEGA delivered core vitals functionality, though navigation and broader system support needed further refinement. Planned next steps included expanding the MCC’s AR capabilities and tightening UI integration with the main application. Overall, ATLAS established a strong foundation for the team’s next iterations.