Alireza Bahremand

PhD Student | Research & Development

Resume && CV | TEDx Talk
Professional Art

NASA

At NASA MSFC, I worked with a talented team to produce 5 Virtual Reality (VR) projects in one summer.
We built a Mars Habitat, CAD visualization tools, & training simulations for astronauts who board the ISS.
For the training simulations, I designed, documented, & developed an authoring tool for building those simulations.

Collaborators: Taylor Waddell, Natasha Liston-Beck

ASU Commencement

Contracted by the University, my colleagues and I published a mobile AR Commencement experience for iOS and Android.
We performed volumetric captures, developed a framework, defined QA testing procedures, and followed a Scrum methodology.

Collaborators: Aashiq Shaikh, Lauren Gold, Linda Nguyen, Matt Soson

Memories

The goal of this project is to showcase how mixed reality technology will change how we capture and play back our memories.
It uses the Azure Kinect for 3D captures, Azure Spatial Anchors for virtual-physical anchoring, and Unity3D to tie it all together.
When the user selects a memory and points their device at its anchored physical location, the capture instantly plays back in place.
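A minimal sketch of the playback trigger, assuming the anchor position comes from a resolved spatial anchor and the device pose from the AR camera; the actual project implements this in Unity3D with Azure Spatial Anchors, and the vector helpers and thresholds below are hypothetical:

```cpp
#include <cmath>

// Minimal 3D vector helpers (hypothetical; the actual project uses Unity3D types).
struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float length(Vec3 v)      { return std::sqrt(dot(v, v)); }
static Vec3  normalize(Vec3 v)   { float l = length(v); return {v.x / l, v.y / l, v.z / l}; }

// Returns true when the device is close enough to the anchored memory and is
// pointing roughly toward it, i.e. when playback should start.
// `anchorPos` would come from a resolved spatial anchor; `devicePos` and
// `deviceForward` from the AR camera pose. Thresholds are illustrative.
bool shouldPlayMemory(Vec3 devicePos, Vec3 deviceForward, Vec3 anchorPos,
                      float maxDistance = 5.0f, float maxAngleDeg = 20.0f) {
    Vec3 toAnchor = sub(anchorPos, devicePos);
    if (length(toAnchor) > maxDistance) return false;

    float cosAngle = dot(normalize(toAnchor), normalize(deviceForward));
    return cosAngle >= std::cos(maxAngleDeg * 3.14159265f / 180.0f);
}
```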

Collaborators: Dylan Kerr, Eliza Zarr

BSI

The British Standards Institution (BSI) requested a VR Sea Crate Audit Training simulation.
My colleague and I designed assets, developed Unity3D software, and performed QA testing.

Collaborators: Dylan Kerr

VISOR

Scientists visualize multivariate data from a multitude of space instruments (satellites & rovers).
To complement their workflow for disseminating scientific discoveries, we built an AR/VR framework that lets
users interact with & visualize spectral pixels captured from orbit.
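As one illustration of what visualizing a spectral pixel can involve, here is a hedged sketch that false-colors a pixel's per-band reflectance values by mapping three chosen bands to R, G, B; the band layout and indices are hypothetical and not tied to any specific instrument:

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <vector>

// A spectral pixel: one reflectance value per wavelength band.
// The band layout (and which bands map to R, G, B) is hypothetical here.
using SpectralPixel = std::vector<float>;

// False-color a spectral pixel by mapping three chosen bands to R, G, B,
// clamped to the displayable [0, 1] range.
std::array<float, 3> toFalseColor(const SpectralPixel& bands,
                                  std::size_t rBand, std::size_t gBand, std::size_t bBand) {
    auto clamp01 = [](float v) { return std::clamp(v, 0.0f, 1.0f); };
    return {clamp01(bands.at(rBand)), clamp01(bands.at(gBand)), clamp01(bands.at(bBand))};
}
```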

Collaborators: Lauren Gold, Connor Richards, Robert LiKamwa, Kathryn Powell

GLEAM

In mobile AR, virtual objects lacked realism because they were rendered under default virtual lighting conditions and did not blend with the physical environment.
Our solution was a framework that uses reflective light probes to estimate the lighting conditions of the user's physical environment.
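A conceptual sketch of the geometric step a reflective-probe approach relies on: mapping a pixel on the imaged mirror ball to the world direction of the light it reflects, so those samples can be composited into an environment map that lights the virtual objects. The orthographic-camera assumption and coordinate conventions below are simplifications for illustration, not the project's actual code:

```cpp
#include <array>
#include <cmath>
#include <optional>

// Maps a pixel on an imaged mirror-ball probe to the world-space direction of
// the environment light it reflects toward the camera.
//
// Simplifying assumptions (for illustration only): an orthographic camera
// looking down -Z, and (u, v) normalized so the probe's silhouette is the unit
// disk. Returns nullopt for pixels that miss the probe.
std::optional<std::array<float, 3>> probePixelToWorldDir(float u, float v) {
    float rr = u * u + v * v;
    if (rr > 1.0f) return std::nullopt;   // outside the mirror ball

    float nz = std::sqrt(1.0f - rr);      // surface normal n = (u, v, nz)
    // Reflect the viewing ray d = (0, 0, -1) about n:  r = d - 2 (d . n) n
    return std::array<float, 3>{2.0f * nz * u, 2.0f * nz * v, 2.0f * nz * nz - 1.0f};
}
```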

Collaborators: Siddhant Prakash, Linda Nguyen, Robert LiKamwa, Paul Nathan

SWISH

Working with a cross-disciplinary team, we identified a gap between hands-on lab experience & what online chemistry programs can offer.
To close this gap, we created a software framework that provides tactile sensations for programmable virtual fluids.

Collaborators: Shahabedin Sagheb, Frank Liu, Robert LiKamwa, Linda Nguyen

HoloLucination

At Amazon re:MARS 2019, Dr. Tanya Harrison gave a talk titled "Roving Robots on the Red Planet."
Her talk featured an AR segment in which she interacted with a virtual Curiosity rover in front of the audience.
To make this possible, we built an augmented reality framework that lets users orchestrate mixed reality presentations.

Collaborators: Tanya Harrison, Robert LiKamwa

Stair Rails

Problem: My roommates & I had to manually switch lights on & off to go up and down our stairs.
Solution: We 3D printed custom mounts; wired up motion sensors, LED strips, & microphones;
and connected everything with an Arduino.
Now, our stairs are interactively lit.
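A minimal Arduino-style sketch of the motion-triggered path; the pin numbers, LED count, timing, and use of the Adafruit NeoPixel library are assumptions, and the microphone input is omitted for brevity:

```cpp
#include <Adafruit_NeoPixel.h>

// Pin assignments, LED count, and timing are assumptions; adjust to the wiring.
const int PIR_PIN = 2;                    // motion sensor output
const int LED_PIN = 6;                    // LED strip data line
const int NUM_LEDS = 60;
const unsigned long LIGHT_ON_MS = 15000;  // keep the stairs lit for 15 s after motion

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);
unsigned long lastMotionMs = 0;
bool motionSeen = false;

void setup() {
  pinMode(PIR_PIN, INPUT);
  strip.begin();
  strip.show();                           // start with every LED off
}

void loop() {
  if (digitalRead(PIR_PIN) == HIGH) {     // motion detected on the stairs
    motionSeen = true;
    lastMotionMs = millis();
  }

  bool lightsOn = motionSeen && (millis() - lastMotionMs < LIGHT_ON_MS);
  uint32_t color = lightsOn ? strip.Color(255, 255, 255) : 0;
  for (int i = 0; i < NUM_LEDS; i++) {
    strip.setPixelColor(i, color);
  }
  strip.show();
  delay(50);
}
```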

Collaborators: Dylan Kerr