- Create interactive VR environments
- Integrate multiple external controls
- Design and animate 3D objects
Using the Unity UNet or OptiTrack NatNet SDK, I connected VR applications to a desktop computer to streamline data sharing. For example, I created an app that streams live motion-capture data onto a Mecanim-based character model.
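The desktop-to-headset link boils down to parsing streamed marker frames. The sketch below is a hypothetical simplification, not the real NatNet wire format: it assumes a frame is a little-endian uint32 marker count followed by float32 (x, y, z) triples.

```python
import struct

def pack_marker_frame(markers):
    """Pack (x, y, z) marker triples into a binary frame (test helper)."""
    payload = struct.pack("<I", len(markers))
    for x, y, z in markers:
        payload += struct.pack("<3f", x, y, z)
    return payload

def parse_marker_frame(payload):
    """Parse the hypothetical frame back into a list of (x, y, z) tuples."""
    (count,) = struct.unpack_from("<I", payload, 0)
    markers = []
    offset = 4
    for _ in range(count):
        markers.append(struct.unpack_from("<3f", payload, offset))
        offset += 12  # three float32 values per marker
    return markers
```

In the actual app, each parsed marker set would be applied to the corresponding joints of the Mecanim rig on every frame.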
I collaborated with an international studio to implement live player tracking, using OptiTrack motion-capture sensors to pass marker coordinates to the Gear VR.
I created a mobile VR controller experience for bedridden patients, which let us study the spatial navigation memory of patients confined to bed.
Where a person looks is integral to learning and memory. Using an eye-tracking SDK for the Gear VR, I integrated gaze-driven user interfaces and exported the gaze data for further analysis.
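One common way to summarize exported gaze data is per-object dwell time. A minimal sketch, assuming the log is a list of (timestamp, fixated-object) samples where each sample holds until the next timestamp (the format and object names are illustrative, not from the actual study):

```python
def dwell_times(gaze_log):
    """Total seconds spent fixating each object.

    gaze_log: list of (t_seconds, object_name) samples, sorted by time.
    Each sample is assumed to hold until the next timestamp, so the
    final sample contributes no duration.
    """
    totals = {}
    for (t0, obj), (t1, _) in zip(gaze_log, gaze_log[1:]):
        totals[obj] = totals.get(obj, 0.0) + (t1 - t0)
    return totals
```

For example, a log of four one-second samples on two objects yields two seconds on the first and one on the second.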
I gathered behavioral data from the VR app, such as distance traveled and movement speed, and compared these logs against the corresponding participants' iEEG recordings.
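Deriving distance and speed from a VR position log is straightforward. A sketch under assumed conventions (timestamped (x, z) ground-plane positions in arbitrary units; the log format is hypothetical, not the study's actual schema):

```python
import math

def distance_and_speed(samples):
    """Compute total path length and average speed from a position log.

    samples: list of (t_seconds, x, z) tuples, sorted by time.
    Returns (total_distance, average_speed) in the log's units.
    """
    if len(samples) < 2:
        return 0.0, 0.0
    total = 0.0
    for (t0, x0, z0), (t1, x1, z1) in zip(samples, samples[1:]):
        total += math.hypot(x1 - x0, z1 - z0)  # straight-line segment length
    elapsed = samples[-1][0] - samples[0][0]
    return total, (total / elapsed if elapsed > 0 else 0.0)
```

These per-session summaries can then be time-aligned with the iEEG traces for comparison.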
With the Artec Eva 3D scanner, I created human models and manipulated them in Autodesk Maya. Regions of the model can be resized independently using conventional body measurements. I then used Unity to build a simple interface that keeps track of every change made to the model. This application is currently being used in a research study aimed at better assessing body dysmorphic disorder.
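The change-tracking part of such an interface amounts to logging each measurement edit alongside its previous value. A minimal sketch (region names and units here are illustrative placeholders, not the study's actual measurement set):

```python
class MeasurementLog:
    """Track edits to per-region body measurements.

    Keeps the current values and an ordered history of
    (region, old_value, new_value) entries so the adjustment
    sequence can be reconstructed for analysis.
    """

    def __init__(self, initial):
        self.current = dict(initial)
        self.history = []

    def adjust(self, region, new_value):
        old = self.current.get(region)
        self.history.append((region, old, new_value))
        self.current[region] = new_value
```

Exporting `history` per participant gives the study both the final configuration and the path taken to reach it.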
To study how humans interact with one another, I created a database of animated human models. Natural face and body movements were recorded with motion capture and mapped onto 3D models created from 3D scans and refined in Autodesk Maya.