Driftnet I  

Fly Like a Bird

A project by squidsoup.

Imagine flying like a bird through a musical composition that surrounds you, immerses you and reacts to your presence.

Driftnet is a confluence of two ideas – bird-like flight, and a spatialised, navigable musical environment.

At one level, it experiments with intuitive methods for freely navigating 3D virtual space.  Users are invited to ‘fly like a bird’: with no worn equipment, just by flapping their arms like wings and tilting their arms and bodies, people can intuitively (and amusingly!) steer themselves through the virtual space.  The metaphor harks back to childhood play, imitating birds and planes in the playground.
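As a rough sketch of how flap-and-tilt steering could be mapped to motion, the toy model below turns vertical hand movement into thrust and the height difference between hands into a turn. All names, gains and the update structure here are illustrative assumptions, not the project's actual implementation:

```python
import math

def fly_step(state, left_y, right_y, dt=1.0 / 30.0):
    """One update of a toy flap-and-tilt flight model.

    state: dict with heading (radians), speed, x, z, prev_left, prev_right.
    left_y / right_y: current vertical hand positions (e.g. tracked by a camera).
    """
    # Flapping: total vertical hand motion since the last frame adds thrust.
    flap = abs(left_y - state["prev_left"]) + abs(right_y - state["prev_right"])
    state["speed"] = state["speed"] * 0.98 + flap * 2.0  # drag and thrust gains (assumed)

    # Tilting: a height difference between the hands turns the flyer.
    state["heading"] += (right_y - left_y) * 1.5 * dt    # turn gain (assumed)

    # Integrate position in the horizontal plane.
    state["x"] += math.cos(state["heading"]) * state["speed"] * dt
    state["z"] += math.sin(state["heading"]) * state["speed"] * dt

    state["prev_left"], state["prev_right"] = left_y, right_y
    return state
```

Called once per camera frame, symmetric flapping accelerates the flyer forward while leaning to one side banks the flight path in that direction.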

The virtual space is regarded as the notation paper for a spatial, navigable musical composition.  We use it to create immersive and responsive virtual spaces that can be explored both visually and aurally, an area we have been exploring with “Altzero” (www.squidsoup.org/altzero) since 1999.  Sounds become reactive agents, visualised within the space, with behaviours that respond to one’s presence.  By moving through the space, participants navigate the musical composition: proximity and relative position directly affect what is heard.
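One simple way to realise "proximity affects what is heard" is distance-based attenuation: each sound agent's gain falls off with the listener's distance and cuts out entirely beyond an audible radius. The function below is a minimal sketch under assumed rolloff parameters, not the project's actual mixing code:

```python
import math

def sound_gain(listener, source, ref_dist=1.0, max_dist=20.0):
    """Gain (0.0-1.0) for one sound agent, given listener and source
    positions as (x, y, z) tuples.

    Inside ref_dist the sound plays at full volume; beyond max_dist it is
    silent; in between, gain falls off inversely with distance.
    ref_dist and max_dist are assumed values.
    """
    d = math.dist(listener, source)
    if d >= max_dist:
        return 0.0
    return min(1.0, ref_dist / max(d, ref_dist))
```

Summing each agent's attenuated output then makes the overall mix a direct function of where the participant flies, so a path through the space is also a path through the composition.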

Both the immersive experience and the slightly tongue-in-cheek flavour of the interface are enhanced by anaglyphic red/cyan specs, which create a strong illusion of 3D depth.
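A red/cyan anaglyph is typically built by rendering the scene from two eye positions and combining the channels: red from the left-eye image, green and blue from the right. The sketch below shows that channel mix on plain nested lists of RGB tuples; the data layout is an assumption for illustration:

```python
def anaglyph(left_rgb, right_rgb):
    """Combine left/right eye renders into a red/cyan anaglyph.

    Each image is a row-major nested list of (r, g, b) tuples, 0-255.
    The red channel comes from the left eye; green and blue (i.e. cyan)
    come from the right eye, so the glasses route one view to each eye.
    """
    return [
        [(l[0], r[1], r[2]) for l, r in zip(lrow, rrow)]
        for lrow, rrow in zip(left_rgb, right_rgb)
    ]
```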

 

Video documentation


Low bandwidth? Watch the videos on YouTube

The first video gives a flavour of user interactions at Shunt. The second video is a straight-to-disk screen capture made with Fraps.

 

Exhibitions

Driftnet I was exhibited at Shunt (London Bridge, UK) from 13 to 22 June 2007 as a public trial. The first prototypes were shown at Future of Sound events at SAGE (Gateshead, UK) and Goldsmiths (London, UK) in early 2007, as part of a Cybersonica artists' showcase.

 

Credits

Driftnet is a collaborative project by Gaz Bushell, Anthony Rowe and Ollie Bown.



Depthmap images

Grabbed from the interface that allows participants to fly like a bird through the virtual space. The system derives depthmap imagery by comparing a pair of camera images in real time, using a Point Grey Bumblebee stereo-vision camera. The colour of each pixel shows its distance from the camera: red is nearest, blue is furthest away. Grey marks pixels beyond the distance boundaries set.
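The described colouring scheme (red near, blue far, grey out of range) can be sketched as a simple per-pixel mapping. The specific near/far bounds and the linear red-to-blue ramp below are assumptions; only the colour convention comes from the description above:

```python
def depth_to_colour(d, near=0.5, far=4.0):
    """Map a depth value d (metres) to an (r, g, b) display colour.

    Red = nearest, blue = furthest; grey for depths outside the
    [near, far] boundaries. near/far values are assumed.
    """
    if d < near or d > far:
        return (128, 128, 128)           # beyond the set distance boundaries
    t = (d - near) / (far - near)        # 0.0 at near .. 1.0 at far
    return (int(255 * (1 - t)), 0, int(255 * t))
```

Applying this to every pixel of the stereo camera's depth estimate yields images like the ones shown, with the visitor's body standing out in red against a blue background.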

 
