‘Blended Reality’ brings diverse perspectives to emerging tech

Held on April 26 at the Center for Collaborative Arts and Media, the exhibition “Liminal Views” featured 12 “blended reality” projects by students and faculty.

Dance artist Mariel Pettee, a Ph.D. candidate in physics, donned a suit adorned with 55 sensors as part of a project that incorporates motion capture and machine learning to generate choreographic sequences.

Four turntables are arranged on a table at a zoo by the bear enclosure. Play a record and the bears stand up and boogie. Turn around and Pegasus hovers over another enclosure. Toss an apple into the pen, and you will be astride the winged steed, poised for a memorable ride. 

Yale sophomore Noah Shapiro created this fanciful virtual zoo. It is an immersive experience he developed in consultation with Yale Cancer Center’s Pediatric Hematology and Oncology Program to provide a measure of joy to children during a frightening and difficult time.

“I thought that a zoo experience would be fun for the kids to explore and, in some way, restore some normalcy to their lives,” said Shapiro, who plans to major in computer science. “I hope the fact that they can control the animals in different ways provides them with a sense of agency.”

Shapiro presented his zoo experience at “Liminal Views,” an exhibition of a dozen “blended reality” projects by Yale students and faculty held on April 26 at the Center for Collaborative Arts and Media (CCAM). The work presented was part of the Blended Reality Research Program — a partnership between Yale and HP supporting innovative, cross-disciplinary projects that blur lines between the physical and digital worlds.  

The program seeks to make emerging technologies, such as virtual and augmented reality, 3D fabrication, and digital imaging, accessible to a broader range of people, opening new creative avenues for artists and scholars, explained Justin Berry, the research program’s principal investigator and a critic at the Yale School of Art.

“We’re trying to introduce this technology to people who don’t have expertise with it — people who are not approaching it with preconceived ideas about what it is supposed to become,” Berry said. “In doing so, we’re bringing diverse and unique perspectives to an emerging technology. Very often it’s people who aren’t familiar with technology who have the most meaningful insights about it.”

This diversity of perspectives was apparent in the broad range of projects presented at the event, which transported users inside the cell structure of a leaf, took them on a tour of an active Yale archaeological excavation in Egypt’s Elkab desert, or invited them to create a 3D selfie, among other experiences. 

Dance artist Mariel Pettee, a Ph.D. candidate in physics, is collaborating with experts in particle physics and machine learning on “Beyond Imitation,” a project to generate choreography through neural network architecture, a form of artificial intelligence.

Pettee has recorded various dance sequences in CCAM’s motion-capture studio. The motion-capture data is fed into the neural networks, which produce variations on Pettee’s movements or invent completely new dance sequences — insights that can inform her choreography and spark creativity.
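
As a rough illustration of the general idea (not the “Beyond Imitation” code itself), a toy sequence autoencoder over motion-capture frames can produce movement “variations” by decoding a slightly perturbed latent vector. The framework, layer sizes, sensor layout, and sequence length below are assumptions made for this sketch:

    # Illustrative sketch only: a toy sequence autoencoder for motion-capture data.
    # The architecture and shapes are assumptions, not the "Beyond Imitation" model.
    import torch
    import torch.nn as nn

    N_SENSORS = 55      # one marker per suit sensor (assumed 3D position each)
    SEQ_LEN = 120       # frames per recorded phrase (assumption)
    LATENT = 32         # size of the compressed "movement idea" vector

    class MotionAutoencoder(nn.Module):
        def __init__(self):
            super().__init__()
            # Encode a whole movement phrase into a single latent vector.
            self.encoder = nn.LSTM(N_SENSORS * 3, LATENT, batch_first=True)
            # Decode a (repeated) latent vector back into a frame-by-frame phrase.
            self.decoder = nn.LSTM(LATENT, N_SENSORS * 3, batch_first=True)

        def forward(self, seq):
            _, (h, _) = self.encoder(seq)                 # h: (1, batch, LATENT)
            z = h.transpose(0, 1).repeat(1, SEQ_LEN, 1)   # one copy per output frame
            out, _ = self.decoder(z)
            return out

    model = MotionAutoencoder()
    mocap = torch.randn(1, SEQ_LEN, N_SENSORS * 3)        # stand-in for studio recordings

    # A "variation" comes from nudging the latent vector before decoding.
    with torch.no_grad():
        _, (h, _) = model.encoder(mocap)
        z = (h + 0.1 * torch.randn_like(h)).transpose(0, 1).repeat(1, SEQ_LEN, 1)
        variation, _ = model.decoder(z)
    print(variation.shape)  # torch.Size([1, 120, 165]): a new phrase to sample from

In a real pipeline, recordings from the motion-capture studio would replace the random stand-in data, and the decoded sequences would serve as raw material for choreography rather than finished phrases.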

“I really like the idea of having my choreographic potential recorded in a physical form that I can sample from infinitely,” said Pettee, who is working on the project with Chase Shimmin, a postdoctoral researcher in the Department of Physics, and Douglas Duhaime, a programmer at Yale’s Digital Humanities Lab.

Travis McCann, who is studying for a master of science in nursing at the Yale School of Nursing, is collaborating with Yale undergraduate Bobby Berry to create dynamic medical-training simulations through augmented-reality technology, which, unlike virtual reality, uses real-world objects as part of the experience.

Typically, medical training simulations are performed on mannequins bearing wounds rendered with makeup. McCann and Bobby Berry, a senior majoring in computing in the arts, have created an app that uses augmented reality to simulate wounds and injuries on the mannequins more realistically. 

Travis McCann, a student at the Yale School of Nursing, and Yale undergraduate Bobby Berry have created a medical-training simulation using augmented-reality technology.

“Before, it was just makeup that looks like a cut,” said McCann, a teaching assistant in the School of Nursing’s simulation department. “With the augmented-reality headset, you can render an arterial bleed that will only stop once the student applies the correct treatment.”
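
McCann’s example implies simulation logic in which a rendered injury persists until the learner performs the correct intervention. The following is a minimal, hypothetical sketch of that kind of check; the treatment names and flow rate are invented for illustration and this is not the team’s actual headset application:

    # Hypothetical sketch of "the bleed only stops once the correct treatment is applied."
    # Illustrative logic only, not the McCann/Berry augmented-reality app.
    from dataclasses import dataclass

    @dataclass
    class ArterialBleed:
        blood_loss_ml: float = 0.0
        controlled: bool = False

        CORRECT_TREATMENTS = ("tourniquet", "direct_pressure")   # assumed for the sketch

        def tick(self, seconds: float) -> None:
            """Advance the simulation; an uncontrolled bleed keeps losing volume."""
            if not self.controlled:
                self.blood_loss_ml += 30.0 * seconds              # assumed flow rate

        def apply_treatment(self, treatment: str) -> bool:
            """Only a correct intervention stops the rendered bleed."""
            if treatment in self.CORRECT_TREATMENTS:
                self.controlled = True
            return self.controlled

    sim = ArterialBleed()
    sim.tick(10)                        # student hesitates: blood loss accumulates
    sim.apply_treatment("gauze_wrap")   # wrong treatment: bleed continues
    sim.tick(5)
    sim.apply_treatment("tourniquet")   # correct treatment: bleed stops
    sim.tick(5)
    print(round(sim.blood_loss_ml), "ml lost before control")     # 450 ml lost before control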

More realistic simulations will better prepare nursing students to respond effectively in genuine emergencies, McCann said. 

“The less shocking that first medical emergency seems, the more likely you are to respond correctly,” he said.

Undergraduates Lance Chantiles-Wertz and Isaac Shelanski are working with Sara Abbaspour, a photography student at the Yale School of Art, to develop a customizable interface that will help people create immersive experiences difficult to achieve with a traditional trigger-based controller.

“Most controllers resemble guns,” said Shelanski, a junior majoring in physics. “The truth is that virtual reality and augmented reality have many uses outside of gaming, and even in gaming, outside of shooting things. Trigger-based controllers can be very limiting to the creativity that virtual reality makes possible.”

Their device, called the “Clamshell Controller,” can be customized with any number of different sensors to manipulate a virtual world, explained Chantiles-Wertz, a senior mechanical-engineering major.

“The idea is that you can take whatever sensors you choose and plug them into a device that is compact and very easy to use,” he said. “And using it won’t require a background in engineering.”

For the exhibition, the project team customized the device to manipulate an ocean simulation. They outfitted the controller — which resembles an overturned flowerpot — with three sensors. Covering one made the sky go dark. Tapping another disturbed the water. Rotating the third changed the weather from sunny to stormy.

The project’s early stages, which began in the fall, involved creating the platform to stream data from the sensors into Unity, the game engine that supports most virtual reality experiences, Shelanski said. The next phase will focus on making the controller’s physical form more compact and capable of supporting more sensors.
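
Beyond naming Unity, the team does not describe the streaming pipeline itself, but one common pattern is to package sensor readings as small JSON messages and send them over UDP to a port that an engine-side script listens on. The port number, message format, and sensor behaviors in this sketch are assumptions, not the Clamshell Controller’s actual protocol:

    # Illustrative sketch: one common way to stream sensor readings toward a game engine,
    # by sending JSON over UDP to a local port that an engine-side listener reads from.
    # Port, message format, and sensor names are assumptions, not the Clamshell Controller.
    import json
    import random
    import socket
    import time

    ENGINE_ADDR = ("127.0.0.1", 9000)   # assumed port for the engine-side listener
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def read_sensors() -> dict:
        """Stand-in for hardware reads (cover sensor, tap sensor, rotary knob)."""
        return {
            "covered": random.random() < 0.1,        # covering one sensor darkens the sky
            "tapped": random.random() < 0.05,        # tapping another disturbs the water
            "knob_degrees": random.uniform(0, 360),  # rotating the third shifts the weather
        }

    for _ in range(100):                             # stream at roughly 20 Hz
        sock.sendto(json.dumps(read_sensors()).encode("utf-8"), ENGINE_ADDR)
        time.sleep(0.05)

An engine-side script would then parse each message and drive the corresponding scene parameters, which fits the team’s stated goal of letting users plug in new sensors without an engineering background.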

Undergraduate Lance Chantiles-Wertz and Justin Berry, principal investigator for the Blended Reality Research Program, review a campfire experience created to demonstrate the Clamshell Controller, an electronic interface device that will allow users to incorporate various sensors to manipulate a virtual world.

CCAM — an interdisciplinary research center at 149 York St. where traditional arts blend with computer science and technology — serves as the hub for the Blended Reality Research Program. People from varied fields and backgrounds can collaborate there on projects and tease out ideas using a broad range of media resources, including cutting-edge digital tools.

Shapiro recounted spending long nights at CCAM over the semester while creating his virtual zoo, which features flying whales and hungry rabbits in addition to dancing bears and Pegasus.

He is a member of the Yale Students Immersive Media Group, or YSIM, a team of undergraduates that creates virtual reality experiences for artistic expression, recreational enjoyment, and problem solving. The team created multiple experiences for the young patients at the Yale Cancer Center.

CCAM’s resources were crucial to the project, Shapiro said.

“Virtual reality requires a lot of different disciplines,” he said. “You need 3D modeling experience, coding experience, and game-engine software experience. Luckily, we have the opportunity to learn a lot of those things at CCAM. I’ve learned so much doing this project.”
