This project focused on the development of a simulated environment for eye fundus examination employing VR and AR technologies.
VR typically comes in a one-size-fits-all package in which users adapt to the hardware and the software. While this works for the average person, it may affect how tasks are performed, with impacts on immersion, presence, and engagement. This project focuses on factoring upper-limb ergonomics into the customization of reach-and-grasp VR tasks to improve task completion.
This project focuses on understanding gaze tracking to customize VR experiences towards task completion. The current use case centers on cardiac auscultation, comparing gaze tracking based on the head-mounted display's orientation, implemented with both a ray-cast and a cone-cast method, against eye tracking.
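The ray-cast and cone-cast selection methods can be illustrated with a small geometric sketch. The project presumably runs inside a VR engine, so the function names, coordinate conventions, and thresholds below are illustrative assumptions, not the project's actual implementation: a ray cast tests whether the gaze ray intersects a spherical target, while a cone cast accepts any target within a fixed angular tolerance of the gaze direction.

```python
import math

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_hits_sphere(origin, direction, center, radius):
    """Ray cast: does a ray from the HMD intersect a spherical target?"""
    d = _normalize(direction)
    oc = tuple(c - o for c, o in zip(center, origin))
    t = sum(a * b for a, b in zip(oc, d))          # projection of target onto ray
    if t < 0:                                       # target is behind the viewer
        return False
    closest = tuple(o + t * a for o, a in zip(origin, d))
    dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
    return dist2 <= radius ** 2

def cone_hits_target(origin, direction, center, half_angle_deg):
    """Cone cast: is the target within a cone around the gaze direction?"""
    d = _normalize(direction)
    to_target = _normalize(tuple(c - o for c, o in zip(center, origin)))
    cos_angle = sum(a * b for a, b in zip(d, to_target))
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

A target 1 m off-axis at 5 m depth misses a thin gaze ray but falls inside a 15° half-angle cone, which is why cone casting is generally more forgiving of head-orientation noise.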
This iteration focused on employing physiological measures to understand challenges when performing an inspection task in VR.
This iteration focused on creating a virtual experience for visualizing radiation in a laboratory setting.
This iteration presented a top view of Durham Region to visualize a radioactive plume, with the goal of selecting evacuation routes.
This project focuses on the development of a VR reminiscence-therapy application by exploring different levels of immersion.
This iteration presents a VR room that displays multimedia to help the patient recall past events.
This iteration presents a non-immersive VR prototype that uses head tracking via a web camera to create the illusion of immersion on a regular monitor.
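The monitor-based illusion described above (often called head-coupled perspective or "fish-tank VR") rests on motion parallax: as the tracked head moves, virtual content shifts on screen by an amount that grows with its depth behind the screen plane. The prototype presumably does this through the rendering camera; the function below is only a minimal sketch of the underlying geometry, with parameter names chosen for illustration.

```python
def parallax_shift(head_offset, viewing_distance, object_depth):
    """On-screen lateral shift of a virtual point as the head moves sideways.

    head_offset: lateral head displacement from the screen center (e.g. meters).
    viewing_distance: distance from the eyes to the monitor (same units).
    object_depth: how far behind the screen plane the virtual point sits.

    Derived by intersecting the eye-to-point line with the screen plane:
    shift = head_offset * object_depth / (viewing_distance + object_depth).
    """
    return head_offset * object_depth / (viewing_distance + object_depth)
```

Points on the screen plane (depth 0) do not move at all, while deeper points shift more, which is exactly the depth cue a regular monitor otherwise lacks.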
This project focused on exploring the use of spatial audio in a WebXR application to help users with reduced or no sight navigate a virtual conference room.
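A WebXR application would normally obtain spatialization from the Web Audio API's panner, but the core navigation cue, sound level shifting toward the ear nearer the source, can be sketched independently. The following equal-power panning function is an illustrative assumption about how such a cue works, not the project's implementation, and it omits distance attenuation and elevation.

```python
import math

def stereo_gains(listener_pos, listener_yaw, source_pos):
    """Equal-power pan gains for a source relative to a listener on a 2D floor plan.

    listener_pos, source_pos: (x, z) positions; listener_yaw: heading in radians,
    with yaw 0 facing +z. Returns (left_gain, right_gain); the gains are equal
    when the source is straight ahead and shift toward one ear as it moves off-axis.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    azimuth = math.atan2(dx, dz) - listener_yaw   # 0 = straight ahead
    pan = max(-1.0, min(1.0, math.sin(azimuth)))  # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * math.pi / 4.0           # map to 0 .. pi/2
    return math.cos(theta), math.sin(theta)       # equal-power curve
```

Equal-power panning keeps the perceived loudness roughly constant as a source sweeps across the listener, since cos²θ + sin²θ = 1, which matters when users rely on level alone to localize a target.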