Motion capture company Vicon unveiled its Origin line of location-based virtual reality hardware at the computer graphics conference SIGGRAPH in Vancouver. The product suite includes cameras, trackers, and a wireless relay transmitter along with software that is compatible with popular game engines like Unity and Unreal Engine.
Vicon has partnered with Dreamscape Immersive, which specializes in full-roam VR experiences like Alien Zoo, as well as with Bandai Namco on the latter's four-player multiplayer game Dragon Quest VR. Epic Games also used Vicon's motion-capture technology to create its realistic Siren demo.
The new Origin line includes the Viper cameras, which have 2.2-megapixel sensors and can run at up to 240 frames per second. Vicon's VFX product manager Tim Doubleday noted that the team decided to take the strobe off the camera and use an LED instead, which enables the camera to tolerate far more ambient light in its environment.
“By taking the strobe off, it means you’re not seeing any other visible light,” said Doubleday in a phone call with GamesBeat. “Traditionally, on a mocap stage, even running Vicon cameras, high-end Vantage cameras, if you have a window that has natural sunlight coming in, the camera is going to see that sunlight and you have to mask that area of your camera. Your marker won’t be seen in front of that window. Because we’re not using a strobe, and we’re using an LED instead, all we can see, all this camera can see, is the LEDs on that wavelength, if you like.”
Vicon’s Pulsar trackers are square, and they offer over 50 different configurations of domes containing infrared LEDs. These enable the cameras to track multiple players at once, and the varied patterns combat occlusion.
“[The domes] allow us to just diffuse the LED light on them. You can’t actually see the LEDs, but basically they can be seen at different angles, rather than looking straight down at the cluster. You can come in from an angle,” said Doubleday. “And then really it’s hand-in-hand with the Viper cameras. You can scale these up, as many as you want. If you have one person in the volume you’re looking at six to eight cameras, and then if you want two [players], you’re looking at more like 16, up to the Dreamscape [Alien Zoo] scenario, which is six people at once, and they settled on a number around 36 cameras, 36 or 38.”
Tying the whole suite together are the Beacon wireless relay, which syncs the hardware, and the Evoke software.
Location-based VR is expected to hit $8 billion by 2022, and VR arcades and blockbuster experiences like Star Wars: Secrets of the Empire continue to pop up. The Origin line is designed to be plug-and-play, since Doubleday says one of the pain points is staffing these arcades with people who may not be familiar with motion capture.
“The big challenge for us is making it usable—traditionally if you’re doing mocap with EA or Ubisoft or somebody, they have trained mocap technicians running the studio, running the mocap system,” said Doubleday. “They might have two or three years’ experience. Whereas with this location-based VR, it’s about getting someone who’s had maybe a couple of weeks of training running the system and then leaving them to it. It has to be bulletproof in terms of reliability, and really easy to pick up.”
The other challenge is making it easy for players to jump in and out of the VR getup. With Origin, people need a minimum of six trackers to achieve full-body motion capture. Vicon’s solution is gloves with magnetic backs and clips for people’s shoes, to which the Pulsar trackers attach easily; that takes care of four of the tracking points. The head-mounted display and backpack are the last two.
Though Origin is designed for location-based VR experiences, motion capture is also used in traditional video games to make animation look realistic. To that end, Doubleday says that Vicon wants to cater to indie developers, too, so that even teams without much mocap experience can use its Origin kit.
“We’d like to see this product as a way of getting a small team into motion capture,” said Doubleday. “That would mean an animator comes in and they can just put on the clusters, walk into the volume, and all of a sudden they appear as a character. Essentially that’s it, that’s all they have to do. It streams directly into a game engine. We have plugins for Unreal and Unity and Motion Builder. The idea is they can very quickly come in and capture a ton of animation, whether final animation or just preview animation, and get their game up and running.”