I have been following HypeVR, a San Diego-based company in the VR space, since I encountered it at Intel's booth at CES 2017, and I've watched the company's technology improve and grow. During a recent visit to the company's San Diego offices, founder Tonaci Tran walked me through the progress the company has made over the last three years. That growth is due in part to the involvement of industry luminaries like Ted Schilowitz, Futurist at Paramount and a co-founder and CSO at 3nfinite (whom I know from way back in his days at RED Digital Cinema). Beyond that, the company has also benefited from the renewed interest in collaboration and streaming platforms during the COVID-19 pandemic, as people look to XR to increase engagement and immersion and help them connect in an increasingly remote and isolated world. I'm writing today because HypeVR recently unveiled some big announcements: a name change to 3nfinite, and the launch of its new Enbly app, designed to enable the viewing of volumetric video over 5G connections. Let's take a look at the company and some of its recent developments.
Pivoting from VR only
One of the first demos I ever saw from 3nfinite was a VR volumetric video scene of Ban Gioc Falls, in Vietnam, complete with a 3D bamboo barge, barrels and oxen. This was the beginning of HypeVR's evolution into something more than just a VR company: a complete volumetric video pipeline for platforms beyond VR. HypeVR's name change to 3nfinite was made to illustrate this new, broadened scope. The new app, Enbly, now in beta, gives users the ability to experience 3nfinite spatial volumetric video on iPhones and iPads. The application will launch this fall with several to-be-named high-profile content partners. I personally believe that 3nfinite's content will be consumed in VR, AR or MR in a device-agnostic way.
Proprietary end-to-end solutions
What makes 3nfinite’s end-to-end platform unique is its patented compression algorithms, which dramatically shrink spatial volumetric video files without sacrificing much quality. The 3nfinite platform uses a camera-agnostic workflow to reconstruct a scene in photorealistic volumetric video, then compresses the reconstruction so that it can be streamed through the cloud to the Enbly app on an end device. In developing this creation-to-consumption pipeline, 3nfinite has filed for 9 patents (4 of which have been granted, with the remaining 5 pending).
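The creation-to-consumption flow described above can be sketched as a sequence of stages. This is a minimal illustration of the pipeline's shape, not 3nfinite's actual code; every function, type and endpoint name here is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class VolumetricFrame:
    """One reconstructed frame: geometry (mesh/point cloud) plus texture. Hypothetical type."""
    geometry: bytes
    texture: bytes

def reconstruct(raw_captures: list) -> list:
    # Camera-agnostic photogrammetric/depth reconstruction would happen here.
    return [VolumetricFrame(geometry=c, texture=b"") for c in raw_captures]

def compress(frames: list) -> bytes:
    # Stand-in for the patented mesh/point-cloud compression (~15x per the article).
    return b"".join(f.geometry for f in frames)  # placeholder; no real compression

def stream_to_client(payload: bytes, endpoint: str) -> None:
    # Cloud delivery to the Enbly app on the end device would happen here.
    print(f"streaming {len(payload)} bytes to {endpoint}")

frames = reconstruct([b"cam0", b"cam1", b"cam2"])
stream_to_client(compress(frames), "enbly://device")
```

The point of the sketch is simply that compression sits between reconstruction and delivery, which is where the company's patents concentrate.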
According to the company, 3nfinite’s industry-leading compression algorithms for meshes and point clouds help to reduce data footprints by as much as 15x (a claim corroborated by Intel, one of 3nfinite’s partners). 3nfinite works very closely with Intel and utilizes its processors in conjunction with GPUs to accelerate processing times. Additionally, the platform allows some compute to be done at the edge to further reduce processing times. The highest-quality offline volumetric capture is the closest to photorealism that I have seen, and it now works on mobile devices (like my iPhone). This offline solution is closer to what Microsoft offers as a service, and, like that offering, it can take dozens of cameras to create a full volumetric capture. A 30-second, 48 FPS video compresses from 10 gigabytes of raw data down to a roughly 600MB file, which is why pairing the compression with a fast 5G connection is so important.
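To put those numbers in perspective, here is a quick back-of-the-envelope calculation. The 10 GB raw and 600 MB compressed figures are from the company; the derived ratio and bitrate are my own arithmetic:

```python
# Figures quoted above (decimal units assumed).
raw_bytes = 10 * 1000**3          # ~10 GB of raw capture data
compressed_bytes = 600 * 1000**2  # ~600 MB delivered file
duration_s = 30                   # 30-second clip

# Effective compression ratio, roughly in line with the "as much as 15x" claim.
ratio = raw_bytes / compressed_bytes
print(f"compression ratio: ~{ratio:.1f}x")  # ~16.7x

# Bitrate needed to stream the compressed file in real time.
bitrate_mbps = compressed_bytes * 8 / duration_s / 1e6
print(f"streaming bitrate: ~{bitrate_mbps:.0f} Mbit/s")  # ~160 Mbit/s
```

A sustained ~160 Mbit/s is comfortably beyond typical LTE throughput but within reach of 5G, which is why the compression and the 5G connection go hand in hand.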
During my visit to 3nfinite’s offices in San Diego, I got volumetrically captured twice—once using the offline, super-high-quality pipeline and a second time using the company’s real-time streaming platform. The offline render is probably the best-looking volumetric video content I have seen to date and may be some of the best in the industry. I was genuinely amazed by how good it looked, considering no retouching was done in post-production to clean it up. The real-time capture used a three-camera array but allowed me to see an AR version of myself in real time, which was cool in its own right. You can also view a video of the offline volumetric capture that the above screenshot is from here.
Enabling real-time streaming over 5G
What is interesting about 3nfinite’s real-time solution is that, unlike other solutions, it can be streamed in real time to a client device at full frame rate (30 fps), though admittedly at a lower quality than the offline render. The real-time solution captures with Intel’s new D455 RealSense depth cameras, commodity hardware; an array of three to four of them produces a relatively high-quality image in real time. This turnkey solution could be put to good use in education, telehealth and many other applications that require low-latency communication. It could also prove to be a great application for the uploading and downloading of content on 5G networks. For that matter, the offline render could also utilize 5G for downloading or streaming large, high-quality files. For these reasons, I believe operators are highly likely to be interested in adopting 3nfinite’s solution, as it will help them save on bandwidth while improving image quality. While it will take time for quality content to be generated for XR headsets and smartphones, the first step is getting the infrastructure in place.
In my opinion, 3nfinite is really onto something with its class-leading compression algorithms. I know the company is working on improving its real-time volumetric video compression to increase quality or reduce bandwidth, so there is still plenty of room for improvement. The incredibly high-quality offline spatial volumetric video will be great for content produced to be consumed on demand, though not necessarily live. While that solution faces more competition from others in the industry, it is also one of the highest-quality solutions I have seen to date. That said, I believe 3nfinite’s real-time solution built on Intel hardware will be the one many people gravitate to, especially as we remain isolated due to COVID-19.