At NASA MSFC, I worked with a talented team to produce 5 Virtual Reality (VR)
projects in one summer.
We built a Mars Habitat, CAD visualization tools, & training simulations for
astronauts aboard the ISS.
For the training simulations, I designed, documented, & developed the
authoring tool used to create them.
Scientists view multivariate data from a multitude of space instruments
(satellites & rovers) using software like JMARS.
To complement the workflow of disseminating scientific discoveries, we built
an AR/VR framework that lets users interact with & visualize spectral pixels
captured from orbit.
In mobile Augmented Reality (AR), there was a lack of realism due to virtual
lighting conditions: the virtual objects did not blend with the physical
environment.
Our solution was a framework that uses reflective probes to estimate the
lighting conditions of the user’s physical environment.
For GLEAM, I worked on integrating our system into different camera-based
platforms, VR headsets, & native AR software libraries.
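The probe-based idea can be sketched roughly like this. This is a hypothetical Python illustration of the general technique, not GLEAM's actual implementation; `estimate_ambient`, `tint`, and the simple averaging model are all invented for the sketch:

```python
# Hypothetical sketch of probe-based light estimation: sample pixels
# reflected off a probe in the camera feed, average them to estimate
# the ambient light of the physical environment, then modulate a
# virtual object's base color so it blends with that environment.

def estimate_ambient(probe_pixels):
    """Average the (R, G, B) samples captured from a reflective probe."""
    n = len(probe_pixels)
    r = sum(p[0] for p in probe_pixels) / n
    g = sum(p[1] for p in probe_pixels) / n
    b = sum(p[2] for p in probe_pixels) / n
    return (r, g, b)

def tint(albedo, ambient):
    """Scale a virtual object's base color (0.0-1.0 per channel)
    by the estimated ambient light (0-255 per channel)."""
    return tuple(a * (c / 255.0) for a, c in zip(albedo, ambient))
```

A real system would sample many probes per frame & smooth the estimate over time; the averaging here just shows the core idea.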
Working with a cross-disciplinary team, we identified a gap between hands-on
lab experience & online chemistry programs.
To solve this problem, we created a software framework that provides tactile
sensations for programmable virtual fluids.
For SWISH, I helped design the system & developed the VR-to-microcontroller
conversion & communication layer.
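One way to picture the VR-to-microcontroller link is a compact binary frame that the simulation sends & the microcontroller parses to drive actuators. This is an invented Python sketch, not the actual SWISH protocol; the sync byte, field names, and layout are all assumptions:

```python
import struct

# Hypothetical frame format: one sync byte followed by three
# little-endian 32-bit floats describing the virtual fluid's state.
FRAME_HEADER = 0xAB  # invented sync byte

def pack_fluid_update(viscosity, fill_level, sloshing):
    """Encode one VR-side fluid update for transmission (e.g. over serial)."""
    return struct.pack('<Bfff', FRAME_HEADER, viscosity, fill_level, sloshing)

def unpack_fluid_update(frame):
    """Decode a frame on the microcontroller side & validate the sync byte."""
    header, viscosity, fill_level, sloshing = struct.unpack('<Bfff', frame)
    if header != FRAME_HEADER:
        raise ValueError("bad sync byte")
    return viscosity, fill_level, sloshing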
At Amazon re:MARS 2019, Dr. Tanya Harrison gave a talk titled “Roving Robots on the …”.
Her talk featured an AR portion in which she interacted with the Curiosity Rover
in front of the audience.
To make this possible, we built an Augmented Reality framework that allows
users to orchestrate mixed reality presentations.
Problem: My roommates & I had to manually switch the lights on & off when
going up/down the stairs.
Solution: We 3D printed custom mounts, wired up motion sensors & microphones,
& connected everything using an Arduino.
Now, our stairs are interactively lit.
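The control logic amounts to a small state machine: switch the lights on when a sensor fires, & off again after a quiet period. A hedged Python sketch of that logic (the real build runs on the Arduino; the class name & 30-second timeout are assumptions for illustration):

```python
# Sketch of motion-triggered stair lighting: any sensor trigger turns
# the lights on & refreshes a timer; with no triggers, the lights turn
# off once the timeout elapses.

class StairLights:
    def __init__(self, timeout=30.0):
        self.timeout = timeout       # seconds to stay lit after last trigger
        self.on = False
        self.last_trigger = None     # timestamp of most recent trigger

    def update(self, triggered, now):
        """Call once per loop with sensor state & the current time (seconds)."""
        if triggered:
            self.on = True
            self.last_trigger = now
        elif self.on and now - self.last_trigger >= self.timeout:
            self.on = False
        return self.on
```

On the Arduino itself the same loop would read the PIR sensor & microphone pins and drive the LED strip instead of returning a boolean.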