
Extended Reality
Our research is interdisciplinary, spanning virtual reality, augmented reality and mixed reality for a wide range of immersive applications that draw on elements of gaming, 3D graphics, digital media and simulation. We design and develop innovative applications to enhance learning, cognition, communication, decision-making, usability and user experience in fields such as health care, education, the military, manufacturing and science, pushing the boundaries of immersive technology. Example projects include a brain-computer augmented reality environment that supports and traces complex consumer decision-making, political decision-making and psychotherapeutic applications; virtual geriatric patients that improve caregiver training through interactive simulations; anywhere-simulation software that facilitates remote health care education; advanced visualization techniques for large datasets, immersive simulations, forensic research and digital histories; and measurement and tracking of wound progression in 3D on complex surfaces.
Frank Biocca
Research Areas: Virtual and augmented reality systems, components for brain-computer interfaces, real-time public opinion measurement

Design of Virtual Environments and Interfaces to Support Information, Perception and Cognition
Our research focuses on designing VR and AR hardware and software to enhance user cognition and performance across medical, scientific and military applications within the Media Interface and Network Design labs. In collaboration with teams in Spain and Korea, we explore how a brain-computer AR environment can aid complex decision-making and therapeutic processes, using untethered brain and psychophysiological sensors to detect how virtual features influence thinking and to adapt the environment in real time (sketched below). Other projects investigate how VR environments alter body perception and social cognition in negotiation, training and decision-making. In scientific visualization, our AR environments allow users to experience physical forces or microscopic phenomena; in one astrophysics project, magnetic sensor data is used to visualize Earth's local magnetic fields in real time.
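A minimal sketch of the closed-loop idea behind a brain-adaptive environment, assuming a generic engagement signal from an untethered sensor; the function names, thresholds and scene parameters below are illustrative placeholders, not the labs' actual pipeline:

```python
# Closed-loop sketch: sense the user's state, then adapt virtual features.
# All names and thresholds here are assumptions for illustration only.
import random
import time

def read_engagement() -> float:
    """Stand-in for an untethered EEG/psychophysiological sensor.

    A real pipeline would filter the raw signal and compute an engagement
    index (e.g., from frequency-band power); here we simulate one.
    """
    return random.uniform(0.0, 1.0)

def adapt_environment(engagement: float, scene: dict) -> dict:
    """Adjust virtual features in response to the user's inferred state."""
    if engagement < 0.3:          # user disengaged: simplify, guide attention
        scene["highlight_choices"] = True
        scene["detail_level"] = "low"
    elif engagement > 0.8:        # user highly engaged: allow richer detail
        scene["highlight_choices"] = False
        scene["detail_level"] = "high"
    return scene

scene = {"highlight_choices": False, "detail_level": "medium"}
for _ in range(5):                # real-time loop; one tick per sensor frame
    scene = adapt_environment(read_engagement(), scene)
    time.sleep(0.1)
print(scene)
```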
Jacob Chakareski
Research Areas: Immersive communication, augmented/virtual reality

Virtual Human Teleportation
Virtual reality and 360-degree video are emerging technologies that can enable virtual human teleportation to any remote corner of the globe. This requires ultra-low latency, gigabit-per-second wireless speeds and data-intensive computing. Our research investigates synergies at the intersection of 6DOF 360-degree video representation methods, edge computing, UAV-IoT, millimeter-wave and free-space optics wireless technologies, which transmit data at much higher electromagnetic wave frequencies to deliver the ultra-high data rates and ultra-low latencies required by next-generation societal VR applications.

Real-Time Structure-Aware Reinforcement Learning
Reinforcement learning (RL) provides a natural paradigm for decision-making in diverse emerging applications that operate in unknown environments with limited data of unknown stochastic characteristics. Paramount to the effective operation of ultra-low-latency applications such as IoT sensing, autonomous navigation and mobile virtual and augmented reality is the ability to learn the optimal actions online and as quickly as possible. Existing state-of-the-art RL methods either take too long to converge or are too complex to deploy. Our research examines novel structure-aware RL methods that integrate basic system knowledge to compute learning updates across multiple states, or even the entire state space of the problem, in parallel (see the sketch following this section). To address the computational complexity this introduces, our methods integrate analyses that help trade off learning acceleration against computing complexity.

Societal Applications
Our research focuses on interdisciplinary synergies to enable next-generation applications. For instance, a National Institutes of Health project at the intersection of networked virtual reality, artificial intelligence and low-vision rehabilitation aims to deliver novel, previously inaccessible and unaffordable health care services broadly and affordably. Other projects include the integration of virtual reality, real-time reinforcement learning and soft exoskeletons for future physical therapy, and the synergy of UAV-IoT and VR for next-generation forest fire monitoring.
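To illustrate the kind of whole-state-space update the structure-aware RL work describes, here is a minimal sketch using generic synchronous value iteration on a toy Markov decision process. The model (P, R) and dimensions are invented for illustration; this is not the lab's published algorithm:

```python
# Instead of updating one visited state per step (classic Q-learning), use an
# assumed system model to update value estimates for ALL states in parallel.
import numpy as np

n_states, n_actions, gamma = 4, 2, 0.9
rng = np.random.default_rng(0)

# P[a, s, s'] = probability of moving s -> s' under action a (toy model).
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.uniform(0, 1, size=(n_actions, n_states))  # R[a, s] = expected reward

V = np.zeros(n_states)
for _ in range(200):
    # One vectorized Bellman backup touches the entire state space at once --
    # the parallel, structure-aware update the text describes.
    Q = R + gamma * (P @ V)       # Q[a, s] for every action/state pair
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-6:
        break
    V = V_new

print("Converged values:", np.round(V, 3))
```

The single vectorized backup accelerates learning by updating every state per iteration, at the cost of the additional computation that the trade-off analysis in the text addresses.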
Salam Daher
Research Areas: Augmented, virtual and mixed reality; physical-virtual; 3D graphics; virtual humans; synthetic reality; modeling, simulation and training; distance simulation; health care applications and virtual patients; 3D wound visualization, measurement and tracking

Mixed Reality Simulations to Improve Training
Our research focuses on creating simulations using computer graphics, multimedia and mixed reality to improve training in different domains, including health care simulation. We are especially interested in research involving virtual humans and multisensory experiences. We developed a new class of augmented reality patient simulators, called physical-virtual patients, that allow health care educators to interact with a life-size simulated patient providing real-time physical tactile cues such as temperature and pulse; auditory cues such as speech and heart sounds; rich dynamic visual cues such as facial expressions indicating pain or emotion; and changes in appearance such as skin color and wounds.

Training Caregivers of Virtual Geriatric Patients
We are developing a new generation of Virtual Geriatric Patients (VGPs): realistic, embodied, conversational virtual humans who are aware of their surroundings. The VGPs are displayed in mixed reality as training scenarios aimed at improving caregivers' perceptions, attitudes, communication and care toward older adults. This research is supported by a grant from the National Science Foundation Future of Work at the Human-Technology Frontier program.

3D Graphics for Wound Visualization, Measurement and Tracking
Our research focuses on visualizing wounds in 3D for accurate measurements, reduced measurement variability and improved tracking of patients' progress (see the sketch at the end of this section). In the clinical setting, this translational research can reduce errors, improve healing estimates and improve patient outcomes. In the training setting, the technology can improve health care trainees' wound assessment skills, especially when combined with mixed reality. This research is partially supported by the New Jersey Health Foundation and by an NJIT Technology Innovation, Translation and Acceleration (TITA) seed grant.

Interactive Remote Simulation for Healthcare Training
During the pandemic, health care educators rushed to share pre-existing or hastily recorded videos with their students as makeshift "simulations," yet content must be interactive to satisfy the interactivity requirement for simulation. Our team developed software called Anywhere Simulation (AwSIM) that lets health care educators add interactivity to their existing content, create new health care scenarios and share that content remotely with their students. AwSIM gives educators the power to build their own simulation scenarios from their own materials (e.g., videos, images, text) and run the simulations remotely with their students; the software is content independent and easy to use. In multiple studies with nursing students, we found that adding interactivity promotes teamwork, perceived authenticity and higher levels of thinking, and that AwSIM has a high technology acceptance rate among students. We are now creating and evaluating an immersive standalone version that trainees can use on a flat screen or with a head-mounted display, with or without a facilitator.
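As one concrete illustration of measuring wounds on curved anatomy, the sketch below estimates wound surface area by summing triangle areas over a labeled region of a reconstructed 3D mesh, which follows the body's curvature in a way flat 2D photos cannot. The tiny mesh and units are made-up placeholders, not the team's clinical pipeline:

```python
# Estimate wound surface area from triangles labeled as wound tissue.
# Vertices and faces below are illustrative assumptions, not clinical data.
import numpy as np

vertices = np.array([          # 3D points on a slightly curved surface, in cm
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.2],
    [1.0, 1.0, 0.3],
    [0.0, 1.0, 0.1],
])
triangles = [(0, 1, 2), (0, 2, 3)]     # faces labeled as wound tissue

def triangle_area(a: np.ndarray, b: np.ndarray, c: np.ndarray) -> float:
    """Area of a 3D triangle via the cross-product formula."""
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a))

wound_area = sum(
    triangle_area(vertices[i], vertices[j], vertices[k])
    for i, j, k in triangles
)
print(f"Estimated wound surface area: {wound_area:.2f} cm^2")
```

Tracking progress over time would then compare such measurements across visits, avoiding the variability of manual ruler-based estimates.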
Erin J.K. Truesdell
Research Areas: Human-computer interaction, games, game design, serious games, extended reality, tangible interfaces, collaboration, play

Design for Multi-User XR Experiences
Recent technological developments have made it easier than ever to design games, tools and even educational experiences in virtual and augmented reality. Tetherless headsets, hand-tracking tools and high-quality video passthrough allow a vast new world of interactive media to be created. This project aims to better understand how these technologies can be leveraged to create engaging and informative experiences in which users share physical space as well as digital space. Specific research topics include group navigation strategies, novel input systems for games and space layouts, interaction design, and audio considerations for setups with multiple co-located users.
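A recurring technical ingredient of co-located multi-user XR is aligning each headset's coordinate frame so that all users see shared virtual content at the same physical spot. The sketch below shows the standard rigid-transform math under the assumption that both devices track a common spatial anchor; the poses are invented and this is not any specific platform's API:

```python
# Express headset B's coordinate frame in headset A's frame via a shared
# anchor, so a virtual object placed by one user appears correctly to both.
import numpy as np

def pose(yaw_deg: float, t) -> np.ndarray:
    """4x4 rigid transform: rotation about the vertical axis plus translation."""
    y = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(y), 0, np.sin(y)],
                 [0,         1, 0        ],
                 [-np.sin(y), 0, np.cos(y)]]
    T[:3, 3] = t
    return T

# Each device reports the shared anchor's pose in its own world frame
# (illustrative values only).
anchor_in_A = pose(30.0, [1.0, 0.0, 2.0])
anchor_in_B = pose(-45.0, [-0.5, 0.0, 1.0])

# Map B's frame into A's: go B -> anchor -> A.
B_to_A = anchor_in_A @ np.linalg.inv(anchor_in_B)

# A point placed by user B (in B's frame), as seen in A's frame:
point_in_B = np.array([0.0, 1.5, 0.5, 1.0])
point_in_A = B_to_A @ point_in_B
print("Shared point in A's frame:", np.round(point_in_A[:3], 3))
```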
Margarita Vinnikov
Research Areas: Immersive and collaborative extended reality (AR/VR/MR), navigation, gaze/body tracking, simulations

Visualization of Large Datasets and Collaborative Platforms (VROOM Project)
Our VROOM project, in collaboration with Professor James Geller's Structural Analysis of Biomedical Ontologies Center (SABOC), focuses on developing advanced methods for visualizing large datasets in virtual reality environments. The project is particularly concerned with the challenges of rendering complex datasets, such as ontology trees and visibility graphs, within XR systems (see the sketch after this section). These visualizations are integrated into multi-user platforms where users collaboratively interact with the data in real time. Our research applies cutting-edge, user-centered design principles to enhance the efficiency and usability of large-scale data visualization in immersive environments.

Forensic Research (CSIxR)
We explore how XR technologies can be applied in forensic science to revolutionize crime scene analysis and forensic training. Our work involves creating detailed, immersive virtual environments that simulate crime scenes, enabling forensic professionals to analyze evidence and reconstruct events with a high degree of precision. These simulations are also used for training, providing an interactive platform for practicing forensic techniques and methodologies. By integrating XR into forensic research, we aim to improve both the accuracy of investigations and the efficiency of forensic processes.

Digital Histories
In collaboration with Dean Louis Hamilton and Associate Dean Burcak Ozludil (Albert Dorman Honors College), the Digital Histories project focuses on creating immersive historical narratives through virtual reality and digital databases. This includes developing digital catalogs and visual archives that offer rich, interactive experiences for exploring historical events and sites. By leveraging multimedia, online data and VR visualizations, we give users a deeper understanding of history through engaging, immersive platforms.

Immersive Cross-Reality Applications
Our research in immersive cross-reality applications spans a wide range of projects, including augmented reality platforms and VR simulations: an augmented reality chemistry platform, industrial training tools, head-up displays (HUDs) for airplane parking, and VR simulations for driving, flying and exploring animal habitats. These projects focus on creating highly interactive, multisensory experiences that push the boundaries of what XR technologies can achieve in both educational and industrial settings.
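As a small illustration of rendering an ontology tree in an immersive scene, the sketch below assigns 3D positions using a simple cone-tree-style layout, where each node's children ring a circle below it. The toy ontology fragment and spacing constants are illustrative assumptions, not VROOM's actual method:

```python
# Assign (x, y, z) positions to ontology-tree nodes for placement in a VR
# scene. Tree contents and layout constants are placeholders for illustration.
import math

tree = {"Entity": ["Anatomy", "Disease"],
        "Anatomy": ["Organ", "Tissue"],
        "Disease": ["Infectious", "Genetic"]}

def layout(node, x=0.0, y=2.0, z=0.0, radius=1.0, positions=None):
    """Recursively place nodes; children form a ring below their parent."""
    if positions is None:
        positions = {}
    positions[node] = (x, y, z)
    children = tree.get(node, [])
    for i, child in enumerate(children):
        angle = 2 * math.pi * i / max(len(children), 1)
        layout(child,
               x + radius * math.cos(angle),   # ring around the parent
               y - 0.8,                        # one level down
               z + radius * math.sin(angle),
               radius * 0.5,                   # shrink the ring each level
               positions)
    return positions

for node, pos in layout("Entity").items():
    print(f"{node:12s} -> {tuple(round(c, 2) for c in pos)}")
```

A game engine or XR framework would then instantiate node and edge geometry at these coordinates so co-located or remote users can walk around and query the structure together.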