Paper
26 January 2009
Becoming Dragon: a mixed reality durational performance in Second Life
Micha Cárdenas, Christopher Head, Todd Margolis, Kael Greco
Proceedings Volume 7238, The Engineering Reality of Virtual Reality 2009; 723807 (2009) https://doi.org/10.1117/12.806260
Event: IS&T/SPIE Electronic Imaging, 2009, San Jose, California, United States
Abstract
The goal for Becoming Dragon was to develop a working, immersive Mixed Reality system, using a motion capture system and a head-mounted display to control a character in Second Life, a Massively Multiplayer Online 3D environment, in order to examine a number of questions regarding identity, gender and the transformative potential of technology. The performance was accomplished through a collaboration between Micha Cárdenas, the performer and technical director, Christopher Head, Kael Greco, Benjamin Lotan, Anna Storelli and Elle Mehrmand.

The plan for this project was to model the performer's physical environment so that they could live in the virtual environment for extended periods of time, using a Mixed Reality approach in which the physical world is mapped into the virtual. I remain critical of the concept of Mixed Reality, as it presents realities as totalities, as objective essences independent of interpretation through the symbolic order. Part of my goal with this project is to explore identity as a process of social feedback, in the sense in which Donna Haraway describes "becoming with" [iii], as well as to explore the concept of a Reality Spectrum discussed at Augmentology.com, thinking about states such as AFK (Away From Keyboard) that are in between virtual and corporeal presence [iv]. Both of these ideas are ways of overcoming the dualisms of mind/body, real/virtual and self/other that have long been a problematic part of thinking about technology. Toward thinking beyond these binaries, Anna Munster offers a concept of the enfolding of body and technology [v], building on Gilles Deleuze's notion of the baroque fold. She writes: "the superfold... opens up for us a twisted topology of code folding back upon itself without determinate start or end points: we now live in a time and space in which body and information are thoroughly imbricated" [vi]. She elaborates on this notion of body and code becoming with each other: "the incorporeal vectors of digital information draw out the capacities of our bodies to become other than matter conceived as a mere vessel for consciousness or a substrate for signal... we may also conceive of these experiences as a new territory made possible by the fact that our bodies are immanently open to these kinds of technically symbiotic transformations" [vii].

A number of the technologies used in this performance were chosen in an attempt to blur the line between the actual and the digital, such as motion capture, live video streaming into Second Life, and 3D fabrication of physical copies of Second Life avatars. The performance was developed using the following components:

- An eMagin Z800 immersive head-mounted display (HMD) allowed the performer to move around in the physical environment within Calit2 while remaining "in game". Head tracking and stereoscopic imagery help provide a realistic feeling of immersion. We built on the University of Michigan 3D (UM3D) lab's stereoscopic patch for the Second Life client, updating it to work with the latest version of Second Life.

- A motion tracking system. A Vicon MX40+ motion capture system was installed in the Visiting Artist Lab at CRCA, which served as the physical performance space, to allow real-time motion tracking data to be sent to a PC running Windows. Using this data, the plan was to map physical motion in the real world back into game space, so that, for example, the performer could easily get to their food source or to the restroom. We developed a C++ bridge that includes a parser for the Vicon real-time data stream and communicates this data to the Second Life server, producing changes in avatar and object positions based on real physical movement; a sketch of such a bridge appears after this list. The goal was to get complete body gestures into Second Life in near real time.

- A Puredata patch called Lila, developed by Shahrokh Yadegari of UCSD, was used to modulate the performer's voice, providing a voice-chat system in Second Life that sounded less gendered and less human; an illustration of this kind of processing also follows the list.
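For illustration, here is a minimal sketch of such a bridge. It assumes, purely hypothetically, that the Vicon feed arrives as text lines of the form "segment x y z" (millimeters) and that positions are relayed as UDP datagrams to a listener that applies them in-world; the abstract does not specify the actual Vicon wire protocol or the mechanism used to update the Second Life server.

```cpp
// Hypothetical Vicon-to-Second Life bridge sketch (not the production code).
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <iostream>
#include <sstream>
#include <string>

struct Sample { std::string segment; double x, y, z; };

// Parse one line of the (assumed) text-based real-time stream: "segment x y z".
static bool parseSample(const std::string& line, Sample& out) {
    std::istringstream iss(line);
    return static_cast<bool>(iss >> out.segment >> out.x >> out.y >> out.z);
}

// Map capture-space millimeters to region-local meters; the offsets placing
// the physical lab inside the virtual room are placeholder values.
static void toRegionCoords(const Sample& s, double& x, double& y, double& z) {
    const double mmToM = 0.001;
    x = 128.0 + s.x * mmToM;
    y = 128.0 + s.y * mmToM;
    z = 25.0 + s.z * mmToM;
}

int main() {
    // UDP socket to a hypothetical relay that moves the avatar/objects in-world.
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in relay{};
    relay.sin_family = AF_INET;
    relay.sin_port = htons(9000);                 // placeholder port
    inet_pton(AF_INET, "127.0.0.1", &relay.sin_addr);

    std::string line;
    while (std::getline(std::cin, line)) {        // stdin stands in for the Vicon feed
        Sample s;
        if (!parseSample(line, s)) continue;      // skip malformed frames
        double x, y, z;
        toRegionCoords(s, x, y, z);
        char msg[128];
        int n = std::snprintf(msg, sizeof msg, "%s %.3f %.3f %.3f",
                              s.segment.c_str(), x, y, z);
        sendto(sock, msg, n, 0,
               reinterpret_cast<sockaddr*>(&relay), sizeof relay);
    }
    close(sock);
    return 0;
}
```

In a full skeleton-tracking setup, one such sample per tracked segment per frame would be forwarded, and the in-world side would drive avatar gestures rather than a single position.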
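The Lila patch itself is written in Puredata and is not reproduced in the abstract. As a rough illustration of one common technique for making a voice sound less gendered and less human, here is a ring modulator in C++; whether Lila actually uses ring modulation is an assumption.

```cpp
// Illustrative ring modulator: multiplies the voice signal by a sine carrier.
// Carrier frequencies of roughly 30-200 Hz yield metallic, ambiguous timbres.
#include <cmath>
#include <cstddef>

void ringModulate(float* buf, std::size_t n, double sampleRate,
                  double carrierHz, double& phase) {
    const double twoPi = 6.283185307179586;
    const double inc = twoPi * carrierHz / sampleRate;
    for (std::size_t i = 0; i < n; ++i) {
        buf[i] *= static_cast<float>(std::sin(phase));
        phase += inc;
        if (phase > twoPi) phase -= twoPi;  // keep phase bounded across buffers
    }
}
```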
© (2009) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Micha Cárdenas, Christopher Head, Todd Margolis, and Kael Greco "Becoming Dragon: a mixed reality durational performance in Second Life", Proc. SPIE 7238, The Engineering Reality of Virtual Reality 2009, 723807 (26 January 2009); https://doi.org/10.1117/12.806260
CITATIONS
Cited by 6 scholarly publications.
KEYWORDS
Head, Head-mounted displays, Virtual reality, Stereoscopic displays, Surgery, Cameras, Video
RELATED CONTENT
HoloLens in suturing training, Proceedings of SPIE (March 13, 2018)
Virtual reality for spherical images, Proceedings of SPIE (August 7, 2017)
Telepresence as a forensic visualization tool, Proceedings of SPIE (October 7, 2019)
Wide-angle orthostereo, Proceedings of SPIE (September 1, 1990)
High-bandwidth and high-resolution immersive interactivity, Proceedings of SPIE (February 16, 1996)
