Adshir is announcing that it can demonstrate the holy grail of computer graphics: real-time ray tracing on mobile devices. That means it can render physically accurate computer animations in real time on devices such as tablets and smartphones.
Ray tracing has long been used in high-end computer-animated movies such as Pixar films, but those films can require months of processing in high-performance data centers before the animations are fully rendered. Doing this kind of task in real time requires far more processing power, and Adshir's announcement bodes well for real-time applications such as realistic games and other interactive apps.
At the Siggraph computer graphics event in Vancouver, Canada, Tel Aviv, Israel-based Adshir is showing LocalRay, a demo of augmented reality imagery such as the dancer at the top of the screen. The animation plays in real time, and it can be placed as an animated overlay in a real-time augmented reality application viewed through a smartphone, tablet, or AR glasses, said Offir Remez, head of business development at Adshir, in an interview with VentureBeat.
“It’s trying to simulate light, and that’s very hard,” Remez said. “You have to capture how light bounces and spreads out when it hits different materials. You need to do billions of calculations.”
In the Unity-based demo, a dancer stands in a dark hall with a variety of fully ray-traced light sources coming in from the windows and bouncing around the environment, which includes a number of curved mirrors. Remez said the software runs 1,000 times faster than other ray-tracing solutions.
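To see why simulating bouncing light demands so many calculations, consider a minimal sketch of a classical recursive ray tracer. This is not Adshir's proprietary engine, just a conventional approach for illustration: each reflective surface a ray hits spawns another ray, and every ray must be tested against the scene geometry, so the work multiplies with bounce depth and scene size.

```python
# A minimal sketch (not Adshir's algorithm) of conventional recursive ray
# tracing: every mirror bounce spawns another ray, and every ray is tested
# against each object in the scene.
import math

def sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest valid hit of a ray with a sphere, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit-length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, spheres, depth, stats):
    """Follow one ray, reflecting at each hit, counting intersection tests."""
    if depth == 0:
        return
    hit_t, hit_sphere = None, None
    for center, radius in spheres:
        stats["tests"] += 1
        t = sphere_hit(origin, direction, center, radius)
        if t is not None and (hit_t is None or t < hit_t):
            hit_t, hit_sphere = t, (center, radius)
    if hit_sphere is None:
        return
    # Reflect the ray about the surface normal and continue (mirror bounce).
    center, radius = hit_sphere
    point = [o + hit_t * d for o, d in zip(origin, direction)]
    normal = [(p - c) / radius for p, c in zip(point, center)]
    dn = sum(d * n for d, n in zip(direction, normal))
    reflected = [d - 2.0 * dn * n for d, n in zip(direction, normal)]
    trace(point, reflected, spheres, depth - 1, stats)

# Two facing mirror spheres: a single camera ray ping-pongs between them,
# so intersection tests accumulate with every extra bounce -- and a real
# frame fires millions of rays, not one.
spheres = [((0.0, 0.0, -3.0), 1.0), ((0.0, 0.0, 3.0), 1.0)]
stats = {"tests": 0}
trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), spheres, depth=8, stats=stats)
print(stats["tests"])  # prints 16: 2 objects tested at each of 8 bounces
```

One ray against two objects is cheap; a full HD frame at 60 frames per second with multiple bounces, soft shadows, and glossy reflections is where the "billions of calculations" come from, which is why conventional engines lean on large precomputed acceleration structures and powerful GPUs.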
In the past, real-time ray tracing has been held back by limited compute power. Moreover, connected devices such as smartphones and the smart glasses used for AR/VR applications are price sensitive and operate under severely restricted performance and power budgets. In most cases, augmented reality and virtual reality applications are still confined to indoor, room-lit environments and depend on an expensive high-end central processing unit (CPU) and graphics processing unit (GPU).
Adshir has worked on developing proprietary software algorithms for more than five years. It was started by computer graphics expert Reuven Bakalash in 2014. The company has built a ray-tracing engine that does not require "huge preprocessed traversal trees, nor require big, fast memory and storage."
“There’s a tremendous amount happening in computer graphics,” said Jon Peddie, analyst at Jon Peddie Research, in an interview with VentureBeat. “Real-time ray tracing has been the holy grail of computer graphics, going all the way back to 1981. We have talked about it for decades. Adshir has come up with a clever way to accelerate ray-tracing – the first major algorithm development in a long time that not only speeds things up but uses less power.”
The Adshir demo can run on a battery-powered laptop with a Core i7 processor, a mobile GPU, and Unity 3D running on Windows. It runs at 120 frames per second with 10 reflective mirrors, glossy reflections, a 250,000-triangle scene, and five light sources. Remez said that Adshir will also demo the same software running on a Meta AR headset at Siggraph.
Remez said the company has 22 patents. In the past, Nvidia has shown real-time ray tracing with animations performing a very specific task. It did so using a supercomputer with eight Nvidia Tesla graphics cards.
“We want to be in a very different place from that,” Remez said. “We designed something bottom up to run real-time on mobile devices. You see a video using a tablet at 60 frames per second, fully ray traced, with augmented reality graphics. We do not believe any other real-time ray tracing solutions at this quality and speed can run on a tablet.”
Bakalash has been working on real-time ray tracing for many years. He and Remez were cofounders of Lucidlogix Technologies in 2003. Bakalash has more than 150 patents. Remez said the company will have a closed beta test for its software in the fourth quarter. The company has eight employees.