The project
I was the development lead on this project, involved in every aspect of the technical side and engaging deeply with the creative side. I contributed to everything from creative storyboarding, to the UX of theatrical grammar in a home setting, to low-level mesh-decompression code and per-frame optimization.
Our performance explores and democratizes a new kind of immersive experience, one in which anyone can experience world-class opera in their own home.
Rendering volumetrically captured performances in real time on a mobile phone using ARCore is a huge undertaking, involving revolutionary capture technology, machine learning, and state-of-the-art mobile hardware. More technical and UX details can be found in the SIGGRAPH Asia paper below (Kelly et al.).
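To give a flavour of the per-frame mesh decompression mentioned above: volumetric-video codecs commonly quantize vertex positions to 16-bit integers within a bounding box, and the player dequantizes them back to floats every frame before upload to the GPU. The sketch below illustrates that one step only; the function name, data layout, and codec details are hypothetical, not the project's actual pipeline.

```python
import numpy as np

def dequantize_vertices(q: np.ndarray,
                        bbox_min: np.ndarray,
                        bbox_max: np.ndarray) -> np.ndarray:
    """Map uint16 quantized coordinates back to float positions in the bbox.

    Illustrative sketch of a common volumetric-video decompression step;
    not the codec used in this project.
    """
    scale = (bbox_max - bbox_min) / 65535.0  # 65535 = max uint16 value
    return bbox_min + q.astype(np.float32) * scale

# Example: one quantized vertex near the centre of a [-1, 1]^3 bounding box.
q = np.array([[32767, 32767, 32767]], dtype=np.uint16)
verts = dequantize_vertices(q,
                            np.array([-1.0, -1.0, -1.0], dtype=np.float32),
                            np.array([1.0, 1.0, 1.0], dtype=np.float32))
```

On a phone, work like this runs for tens of thousands of vertices every frame, which is why the per-frame optimization mentioned above matters so much.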
SIGGRAPH Asia 2019 paper
As a result of the scientific breakthroughs required to make a project like this possible, our work was accepted into SIGGRAPH Asia, where we demoed the experience, presented our paper, and gave a technical talk.
The paper, “AR-ia: Volumetric Opera for Mobile Augmented Reality” (Kelly et al.) can be found here.
Thanks to our collaborators: Jonathan Richards, Paul Debevec, Shahram Izadi, Samantha Cordingley, Patrick Nolan, Christoph Rhemann, Sean Fanello, Danhang Tang, Jude Osborn, Jay Busch, Philip Davidson, Peter Denny, Graham Fyffe, Kaiwen Guo, Geoff Harvey, Peter Lincoln, Wan-Chun Alex Ma, Jonathan Taylor, Xueming Yu, Matt Whalen, Jason Dourgarian, Genevieve Blanche, Narelle French, Kirstin Sillitoe, Tea Uglow, Brenton Spiteri, Emma Pearson, Wade Kernot, and countless more who helped make this monumental project a reality.