Blair Subbaraman

Future Storytelling

Summer 2018 // Developer

In the summer of 2018, the UCLA School of Theater, Film & Television and the Center for Research in Engineering, Media and Performance (REMAP) hosted an intensive summer institute centered on ‘Future Storytelling for Social Impact’. A group of actors, animators, playwrights, and engineers worked to adapt N.K. Jemisin’s science-fiction/fantasy novel The Fifth Season into an augmented reality (AR) theater experiment. The summer culminated in test audiences being led through a 30-minute collaborative (or multiplayer) mixed reality experience.

Audience members are given staffs (monopods), each mounted with a Galaxy S9+ running the app we developed in Unity. Some are given SubPacs (wearable audio backpacks), which provide an additional layer of immersion, and others are given headphones. They are separated into three groups, each led by an actor who guides them. While all participants share the same world (the same AR elements appear in the same physical location for everybody), additional audio and visual elements are selectively rendered for each group, and each group learns a different ‘power’ in its track. Fiducial markers unlock AR elements while also providing tracking information. The three groups come together in the final act of the experience to work together and combine their powers.

As a staff developer with REMAP, I worked principally as one of four members of the AR Software and Systems group, developing the technology for the performance in Unity. In addition to general development and testing of the app, my specific contributions include: a control infrastructure for cueing audio/visual elements, custom fiducial markers that provide tracking for Google ARCore, and various custom interactive elements (i.e., ‘powers’), such as phone shakes that create simulated earthquakes by triggering low-frequency rumbles from on-stage speakers. I also worked as part of a user experience group to find HCI metaphors that bridge the story world with the realities of our technical prototype: what does it mean in the story when an audience member’s phone loses tracking? How can we intuitively introduce the capabilities and limits of AR to audience members who may have no prior exposure?
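To give a flavor of the shake-to-earthquake mechanic, here is a minimal Unity C# sketch of how accelerometer-based shake detection might look. This is an illustrative reconstruction, not the production code: the class name, threshold, cooldown, and the `OnShake` event that a cue controller would subscribe to are all hypothetical.

```csharp
using UnityEngine;

// Hypothetical sketch of shake detection on the phone. When the
// accelerometer magnitude spikes past a threshold, an event fires;
// in the real system, a cue controller could forward this over the
// network to trigger low-frequency rumbles from on-stage speakers.
public class ShakeDetector : MonoBehaviour
{
    public float shakeThreshold = 2.5f;   // in g's; illustrative value, tune per device
    public float cooldownSeconds = 1.0f;  // debounce so one shake fires one cue
    private float lastShakeTime = -10f;

    public event System.Action OnShake;   // hypothetical hook for the cueing system

    void Update()
    {
        // Input.acceleration reports device acceleration in units of g.
        // Compare squared magnitudes to avoid a square root each frame.
        if (Input.acceleration.sqrMagnitude > shakeThreshold * shakeThreshold
            && Time.time - lastShakeTime > cooldownSeconds)
        {
            lastShakeTime = Time.time;
            OnShake?.Invoke();
        }
    }
}
```

Debouncing with a cooldown matters here because a single physical shake spans many frames; without it, one gesture would fire a burst of earthquake cues.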

A video documenting the experience can be found here. Captioned pictures below provide further insight.

Audience members share a collaborative AR environment through their phones:

Additional AR elements are used to augment physical action on-stage, such as this particle system dancing around actors in a particularly dramatic moment:

Large scale projections are provided from desktops that are a part of the same AR environment as the phones, but selectively render more elements to provide further immersion:

Custom fiducial markers can be seen scattered around the stage: