MeshReduce: Split Rendering of Live 3D Scene for Virtual Teleportation

IEEE Conference on Virtual Reality and 3D User Interfaces (2024)

Abstract
The pursuit of immersive telepresence has long aimed to capture and stream 3D environments, enabling remote viewers to observe a scene from any viewing angle. Realizing this vision remains demanding, however, especially on current mobile AR/VR devices, owing to intricate scene detail, network latency, and bandwidth constraints. This demo introduces MeshReduce, an approach that integrates a novel distributed 3D scene capture technique with a split rendering framework. Our demo presents a prototype of a cross-platform, live 3D telepresence system that can be viewed in standard web browsers. The capture setup consists of multiple depth sensors that record users and the background scene in real time. MeshReduce uniquely enables real-time rendering of remotely captured 3D scenes, seamlessly merging them with content on the user's device.
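The abstract describes a capture-reduce-stream-render pipeline but does not publish an implementation interface. The sketch below is purely illustrative: all function names are assumptions, a synthetic depth frame stands in for the real depth sensors, grid subsampling stands in for the paper's actual mesh-reduction technique, and transport is simulated in-process rather than streamed to a browser client.

```python
"""Minimal sketch of a split-rendering flow in the spirit of MeshReduce.

Assumptions (not from the paper): synthetic depth data replaces the depth
sensors, mesh reduction is plain grid subsampling, and the "network" is an
in-process byte buffer.
"""
import numpy as np

GRID = 64  # synthetic depth-map resolution (assumption)


def capture_depth_frame(t: float) -> np.ndarray:
    """Stand-in for one depth sensor: a smooth synthetic height field."""
    xs = np.linspace(-1.0, 1.0, GRID)
    xx, yy = np.meshgrid(xs, xs)
    return 1.5 + 0.2 * np.sin(3 * xx + t) * np.cos(3 * yy)


def depth_to_mesh(depth: np.ndarray, stride: int = 4):
    """Back-project a subsampled depth grid into vertices and triangles.

    `stride` is the crude stand-in for mesh reduction: a larger stride
    sends fewer vertices over the (simulated) network.
    """
    d = depth[::stride, ::stride]
    h, w = d.shape
    xx, yy = np.meshgrid(np.linspace(-1, 1, w), np.linspace(-1, 1, h))
    verts = np.stack([xx * d, yy * d, d], axis=-1).reshape(-1, 3)
    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            faces.append((i, i + 1, i + w))          # upper triangle
            faces.append((i + 1, i + w + 1, i + w))  # lower triangle
    return verts.astype(np.float32), np.asarray(faces, dtype=np.uint32)


def encode(verts: np.ndarray, faces: np.ndarray) -> bytes:
    """Serialize one mesh frame for streaming to the remote viewer."""
    header = np.array([verts.shape[0], faces.shape[0]], dtype=np.uint32)
    return header.tobytes() + verts.tobytes() + faces.tobytes()


def decode(buf: bytes):
    """Client side: recover the mesh before merging it with local content."""
    n_v, _ = np.frombuffer(buf[:8], dtype=np.uint32)
    verts = np.frombuffer(buf[8:8 + n_v * 12], dtype=np.float32).reshape(-1, 3)
    faces = np.frombuffer(buf[8 + n_v * 12:], dtype=np.uint32).reshape(-1, 3)
    return verts, faces


if __name__ == "__main__":
    # One iteration of the capture -> reduce -> stream -> render loop.
    frame = capture_depth_frame(t=0.0)
    verts, faces = depth_to_mesh(frame, stride=4)
    payload = encode(verts, faces)
    rx_verts, rx_faces = decode(payload)
    print(f"streamed {len(payload)} bytes: "
          f"{rx_verts.shape[0]} vertices, {rx_faces.shape[0]} triangles")
```

In a real deployment the encoded frames would be pushed over a live transport (e.g., a WebSocket or WebRTC data channel) to the browser-based viewer, where the decoded mesh is composited with locally rendered content; the stride parameter here merely illustrates the bandwidth/detail trade-off that the actual reduction method targets.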
Keywords
Computing methodologies—Computer graphics—Graphics systems and interfaces—Mixed / augmented reality; Information systems—Information systems applications—Multimedia information systems—Multimedia content creation