Alireza Bahremand

Research Engineer

I specialize in multi-sensory digital embodiment. I've worked on frameworks for olfactory and tactile experiences in XR, multi-sensory storytelling, and AI-assisted volumetric streaming. Outside of work, I enjoy drawing, reading, rock climbing, and discussing gaming classics.


2023
Volumetric Video Editor

A spatial, volumetric video editing and streaming system for Quest and iOS.

SBS Generator

A mono-to-SBS (side-by-side stereo) video generator.

Volumetric Streaming

A volumetric streaming system (VSS) that includes a volumetric video player with playback support, schedulable data operations and AI tasks (e.g., filtering, stylization), and state-of-the-art web streaming technologies.

Chatbot NPC

A Unity project for an AI-driven chatbot NPC with voice and vision capabilities, leveraging OpenAI, Azure, Google Cloud, Oculus Lip Sync, and YOLO-NAS for a rich, interactive user experience.

The Smell Engine

A framework and system that integrates real-time, physical odor synthesis into virtual environments, enhancing user immersion with a more granular and localized olfactory experience than traditional systems.

Dreamscape Immersive

I built experimental applications (an interactive movie poster), developer tooling (runtime debugging tools using classpath-scanning libraries), and cinematic VR experiences (a VR film), collaborating with teams across sound, design, programming, and story.

XR Memories

A prototype that replays memories as they were captured, using Azure Spatial Anchors, Cosmos DB, and RGBD data to enable location-anchored, volumetric playback of recorded events.

Baltu Technologies

At Baltu, I was a lead developer on SuperDoc, an on-the-job training platform for quickly capturing and delivering an organization's specialized skills and knowledge. One of my core focuses was building the REST API and integrating it into the application.

Dreamscape Mobile

I was project lead for an ASU AR application that lets students explore content collected in ASU Dreamscape Learn experiences; it was tested by 100+ students in introductory biology courses.

Designing for Dreamscape

Using Dreamscape Immersive technology, we built a VR cinematic story about climate change. As lead developer, I coordinated with the Design, Narrative, Programming, and Audio teams to assemble the experience.

Planetary Visor

An immersive visualization tool that provides geologic and geographic context for Martian terrain by localizing it with spectroscopy data from an orbiting satellite.

BSI Training Simulation

A VR training app that teaches users how to perform tasks and then grades their performance. We generalized this task-based training sim into a lightweight Unity framework that lets developers build additional experiences.

ASU Campus Tours

Students can virtually explore ASU by teleporting into 360-degree portals, switching between campuses, and navigating freely, with support for WebXR and desktop.

ASU Commencement

An interactive AR ASU commencement experience featuring 3D recordings of speakers; 5K+ virtual attendees experienced Commencement in their personal space.

NASA Astronaut Training

Developed a system to import CAD models into VR applications and presented the work to the MSFC board of directors as a way to reduce the cost of designing and testing ISS system components.

SWISH

SWISH is a mixed-reality interface that provides realistic haptic feedback simulating the behavior of fluids in vessels, enhancing immersion in virtual environments.

HoloLucination

A framework allowing presenters to invoke animations simultaneously on HMDs and mobile devices in AR.

Staircase

A home project that makes our staircase railing responsive to music and motion.