A bug’s-eye view
Elliot Graves is no stranger to Foundry, having worked closely with our research team in 2019 to build a LiDAR and videogrammetry rig with E2 cameras. The data from this was then used to support the development of the volumetric research project VoluMagic.
His experience and expertise meant that Micro Monsters was in good hands. And just as well, because with a credit list featuring just over 100 team members, this was no small project.
“The main objective of the Micro Monsters series was to reveal the micro-world of bugs in a way never experienced before in virtual reality,” Elliot comments. “We wanted to take full advantage of the new Oculus Quest headset and its ability to play 8K video, combining it with genuinely eye-opening stories from the natural world and the latest visual effects pipelines.”
With production taking place over the course of four months, the pressure was on to turn the project around in time for the release of the new Quest 2—no mean feat, but one Elliot and his team were well-prepared to handle.
“The whole team worked tirelessly to ensure that what we produced was one of the highest-quality live-action VR projects to date, showcasing the huge potential of non-native immersive media within VR production,” he tells us.
Asked what made the project unique, one obvious answer springs to mind for Elliot, born from the uncharted territory that 2020 brought with it.
“Working through a pandemic certainly had its challenges,” he comments. “With a team spread across Thailand, Australia, and London, and with Oculus based in California, we spent a lot of time on remote calls. Luckily we had our Master Jedi Producer, Vianney Comot, on hand constantly, guiding us!”
Breaking new ground
Workforce limitations brought on by COVID-19 presented practical challenges that all businesses and studios had to adapt to. But what project-specific, technical barriers did the team come up against during the production of Micro Monsters?
Producing high-resolution, non-native hero content was crucial to the success of the project. “This wasn’t our first 8K project, but it was the first where 8K could actually be seen in the headset,” Elliot tells us.
“In the past, creators of live-action VR content have had to come to terms with a lack of detail in their deliverables. With the launch of the Quest 2 headset, 8K video fully saturates the headset. This means our 8K deliverable is visible at 100%, and in turn, our delivery had to be flawless—there was nowhere to hide!”
For Elliot and his team, then, the main technical challenges of Micro Monsters were threefold: creating an immersive experience using non-native immersive content, building a new VFX pipeline to support this, and rendering at 8K and 60fps.
“To utilise non-native immersive content, we had to think carefully about how each episode in the series would be composed,” Elliot explains. “We opted to start each episode with the same introduction, easing viewers into the world of bugs whilst using original and native 180 content that was either rendered or captured with ZCAM K2 cameras.”
“This provided viewers with a familiar environment, albeit at a much higher quality than previously possible. From here, we leveraged set-extended, high-res 3D macro rectilinear content that was adapted for VR. Creating a VFX pipeline that could produce these conversions seamlessly, and make them feel as immersive as native content, was challenging.”
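To get a feel for what that conversion involves: rectilinear footage has to be remapped onto the lat-long (equirectangular) projection that VR headsets expect. In Nuke, projection changes of this kind are the territory of nodes like SphericalTransform, but the underlying mapping can be sketched in a few lines of NumPy. This is purely illustrative; the field of view and image sizes below are placeholders, not figures from the production.

```python
import numpy as np

def rectilinear_to_latlong(img, out_w, out_h, h_fov_deg=60.0):
    """Remap a rectilinear (pinhole) frame onto a 180-degree lat-long
    canvas. Nearest-neighbour sampling keeps the sketch short; a real
    pipeline would filter properly."""
    in_h, in_w = img.shape[:2]
    # Focal length in pixels, from the horizontal field of view.
    f = (in_w / 2.0) / np.tan(np.radians(h_fov_deg) / 2.0)
    cx, cy = in_w / 2.0, in_h / 2.0

    # Longitude/latitude per output pixel: +/-90 degrees covers a 180 dome.
    lon = (np.linspace(0.0, 1.0, out_w, endpoint=False) - 0.5) * np.pi
    lat = (0.5 - np.linspace(0.0, 1.0, out_h, endpoint=False)) * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Direction vector for each ray; the camera looks down +Z.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Pinhole projection; only rays in front of the camera are valid.
    z_safe = np.maximum(z, 1e-6)
    u = f * x / z_safe + cx
    v = cy - f * y / z_safe

    out = np.zeros((out_h, out_w) + img.shape[2:], dtype=img.dtype)
    inside = (z > 0) & (u >= 0) & (u < in_w) & (v >= 0) & (v < in_h)
    out[inside] = img[v[inside].astype(int), u[inside].astype(int)]
    return out
```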
Fortunately, Nuke was on hand to help navigate challenges like these, supporting the team as they headed into the unfamiliar territory that came with this unique project. Elliot, as the project’s director, recalls Nuke’s role throughout the production process.
“This pipeline was built using Nuke 12.0, with our team creating a set of gizmos that could somewhat automate the process, sparing us a shot-by-shot approach,” he tells us.
“Built into this was FFMPEG encoding for the multiple headset review formats required for a proper QA process. Nuke was used across the set extension work, but also acted as our main comp pipeline for our stereo 180 renders.”
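The team’s actual gizmos are their own, but the shape of that automation is easy to picture: a script that walks a shot list, drops each plate through a conversion gizmo, and fires off a render. Here is a minimal sketch, assuming a hypothetical gizmo named VR_Convert and placeholder shot paths:

```python
# Runs inside Nuke (e.g. `nuke -t batch_convert.py`). Paths, frame
# ranges, and the "VR_Convert" gizmo name are placeholders.
import nuke

SHOTS = [
    ("/shots/ep01_0010/plate.%04d.exr", 1001, 1180),
    ("/shots/ep01_0020/plate.%04d.exr", 1001, 1240),
]

for path, first, last in SHOTS:
    read = nuke.nodes.Read(file=path, first=first, last=last)

    # Stand-in for one of the team's conversion gizmos.
    convert = nuke.createNode("VR_Convert", inpanel=False)
    convert.setInput(0, read)

    write = nuke.nodes.Write(file=path.replace("plate", "latlong"),
                             file_type="exr")
    write.setInput(0, convert)

    # Render the shot, then clean up before moving to the next one.
    nuke.execute(write, first, last)
    for node in (read, convert, write):
        nuke.delete(node)
```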
Yet it was not all smooth sailing, as Elliot explains: “Producing these at such a high resolution and framerate was challenging for our infrastructure and team. QAing heavy EXR renders took much longer, and getting a quick turnaround on playouts was harder still.”
“Here, FFMPEG came to the rescue, automating the process somewhat and allowing headset review of individual shots through Shotgun as they were worked on. The outcome was shots that felt totally immersive, spherical, and 3D, but with the visual qualities of the cinema cameras we’ve all come to rely on.”
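That encode-and-publish loop might look something like the following sketch, which transcodes an EXR sequence into a headset-friendly H.265 review file and attaches it to a Shotgun Version. The paths, credentials, entity IDs, and encode settings are all placeholder assumptions, not the production’s actual values:

```python
import subprocess
import shotgun_api3  # pip install shotgun-api3

def encode_review(exr_pattern, out_mp4, fps=60, start=1001):
    """Transcode an EXR sequence into an H.265 MP4 a Quest can play.
    These flags are a plausible starting point, not the production's
    exact settings."""
    subprocess.run([
        "ffmpeg", "-y",
        "-framerate", str(fps),
        "-start_number", str(start),
        "-i", exr_pattern,                 # e.g. "latlong.%04d.exr"
        "-vf", "scale=5760:2880",          # lighter than full 8K for review
        "-c:v", "libx265", "-crf", "20",
        "-pix_fmt", "yuv420p",
        "-tag:v", "hvc1",                  # helps players recognise HEVC
        out_mp4,
    ], check=True)

def publish_version(sg, shot_id, code, movie_path):
    """Create a Shotgun Version and attach the review movie to it."""
    version = sg.create("Version", {
        "project": {"type": "Project", "id": 123},   # placeholder ID
        "entity": {"type": "Shot", "id": shot_id},
        "code": code,
    })
    sg.upload("Version", version["id"], movie_path,
              field_name="sg_uploaded_movie")

if __name__ == "__main__":
    sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                              script_name="review_bot", api_key="XXXX")
    encode_review("/shots/ep01_0010/latlong.%04d.exr",
                  "/reviews/ep01_0010_v003.mp4")
    publish_version(sg, shot_id=456, code="ep01_0010_latlong_v003",
                    movie_path="/reviews/ep01_0010_v003.mp4")
```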
But what did it take to get to this point? Time, resources, and technical expertise, according to Elliot: “The technical process of bringing this evolution in live-action content into VR was the post-production challenge to which most of the resource on this project was allocated. Our main challenges were converting the footage and processing it through various pipelines at 8K 60fps.”