
RESEARCH
Mixed Reality & Computer Vision Group
CONTACT
E-mail: hanbeom.chang@stonybrook.edu
MISSION STATEMENT
“We aim to develop a real-time remote XR (Extended Reality) collaboration method for remote structural assessment by combining vision sensors (LiDAR, camera), VR (Virtual Reality), and MR (Mixed Reality).”
RESEARCH INTERESTS
- LiDAR-RGB Camera Sensor Fusion
- High-Fidelity 3D Map
- Spatial Alignment
- Remote Collaboration
DETAILED WORK
1. LiDAR-RGB Camera Sensor Fusion
With advances in computing power and vision sensors (such as LiDAR and cameras), remote visual assessment has been actively developed. We built a LiDAR-RGB camera system with extrinsic calibration and a 3D-printed sensor mount.
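Once the extrinsic calibration (rotation and translation between the LiDAR and camera frames) and the camera intrinsics are known, LiDAR points can be projected into the RGB image to fuse the two sensors. The sketch below illustrates this projection with numpy; the calibration values (R, t, K) are placeholders, not our actual calibration results.

```python
import numpy as np

# Placeholder extrinsics (LiDAR -> camera) and camera intrinsics;
# real values come from extrinsic calibration of the sensor mount.
R = np.eye(3)                      # rotation, LiDAR frame to camera frame
t = np.array([0.05, 0.0, -0.02])   # translation in meters
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])    # pinhole intrinsic matrix

def project_lidar_to_image(points_lidar):
    """Project Nx3 LiDAR points into pixel coordinates of the RGB camera."""
    pts_cam = points_lidar @ R.T + t      # transform into camera frame
    in_front = pts_cam[:, 2] > 0          # keep points in front of the camera
    pts_cam = pts_cam[in_front]
    pix = pts_cam @ K.T                   # apply intrinsics
    uv = pix[:, :2] / pix[:, 2:3]         # perspective divide -> (u, v)
    return uv, in_front

pts = np.array([[1.0, 0.0, 5.0], [0.5, -0.2, 3.0], [0.0, 0.0, -1.0]])
uv, mask = project_lidar_to_image(pts)
print(uv.shape)  # (2, 2): the point behind the camera was dropped
```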

2. High-Fidelity 3D Map
The LiDAR-RGB camera system builds a 3D colored point cloud map with SLAM (Simultaneous Localization and Mapping). To detect SLAM errors (e.g., blind spots), we developed the LiMRSF (LiDAR-MR-RGB Sensor Fusion) system, which detects and visualizes these errors to produce high-fidelity 3D maps.
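A colored point cloud is built by sampling the RGB image at each projected point. The sketch below (function name and the "uncolored = potential blind spot" rule are illustrative assumptions, not the actual LiMRSF implementation) shows one simple way to colorize points and flag those the camera never observed.

```python
import numpy as np

def colorize_points(points_cam, image, K):
    """Assign an RGB color to each 3D point given in the camera frame.

    Points that project outside the image, or lie behind the camera,
    stay uncolored (-1); such uncolored regions are one simple way to
    flag potential blind spots in the map.
    """
    h, w, _ = image.shape
    colors = np.full((len(points_cam), 3), -1, dtype=np.int32)  # -1 = uncolored
    valid = points_cam[:, 2] > 0                  # in front of the camera
    pix = points_cam @ K.T                        # apply intrinsics
    uv = np.zeros((len(points_cam), 2))
    np.divide(pix[:, :2], pix[:, 2:3], out=uv, where=valid[:, None])
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    inside = valid & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors[inside] = image[v[inside], u[inside]]  # sample pixel colors
    return colors, inside

# Tiny example: a 2x3 image, unit-focal-length camera with center (1, 1).
image = np.zeros((2, 3, 3), dtype=np.uint8)
image[1, 1] = [10, 20, 30]
K = np.array([[1.0, 0, 1], [0, 1, 1], [0, 0, 1]])
points = np.array([[0.0, 0, 1], [5.0, 0, 1], [0.0, 0, -1]])
colors, inside = colorize_points(points, image, K)
```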


3. Spatial Alignment
To track each device (e.g., VR and MR headsets) in virtual space, the virtual world must be aligned with the real world. We have applied the Umeyama algorithm followed by ICP (Iterative Closest Point) refinement, and we are exploring neural networks to automate the alignment process.


4. Remote Collaboration
Remote collaboration between field and office engineers is important for structural assessment. Our system reduces travel time and cost by connecting remote engineers through VR and MR headsets.
