18 June 2013

Our colleagues at FACT were keen to create a new commission tapping into the use of augmented reality (AR) technology and incorporating elements of our own work on physiological computing.  

During the initial period of the partnership, we’d had a lot of discussion about blending AR and physiological computing. Our role was firstly to provide some technical input to the concepts produced by Manifest.AR, but as often happens when enthusiastic people get together, our collaboration became more about the development of a shared and realistic vision for the pieces.  

When you put academics and artists together, developing a shared vision is an interesting exercise. I can only speak for ourselves, but part of our job was almost to be the party-poopers who said “no, we can’t do this because it won’t work.”  Another (much more rewarding) part of our role was to have input into new types of interactive experiences as they were being developed.

The concepts for all the pieces developed as part of the project are represented in the Invisible ARtaffects show and I’ll walk through each with respect to the physiological computing aspects.

The first I’d like to describe is John Craig Freeman’s EEG AR: Things We Have Lost. This is an elegant piece where John collected interview data on the streets of Liverpool by asking the public one simple question – “what have you lost?”  This question prompted a series of answers, some fairly trite (I lost my keys), some humorous (I lost my hair) and some very moving (I lost my wife).  John then proceeded to use these answers to create a series of augments.

John was working with a NeuroSky EEG unit and using the standard ‘meditation’ settings from the system in order to make these various augments materialise on a phone or tablet. From speaking to Craig and his colleague Will Pappenheimer, it was obvious that our artist collaborators were using the EEG in order to generate ‘chance’ or ‘random’ forms of interaction.  In other words, wearing the NeuroSky and relaxing would provoke a random augment associated with an answer to the question “what have you lost?”  Here is a 2005 conference paper that describes this approach in the context of biofeedback. For readers curious about the underlying logic, a rough sketch follows below.
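To give a flavour of how simple that ‘chance’ interaction can be, here is a minimal Python sketch. The read_meditation() helper and the threshold value are placeholders of my own (the NeuroSky ‘meditation’ meter is reported on a 0–100 scale), not the artists’ actual code.

```python
import random
import time

def read_meditation() -> int:
    """Placeholder for reading the NeuroSky 'meditation' value (0-100)."""
    return random.randint(0, 100)   # simulated reading for illustration

# Each answer to "what have you lost?" is paired with an augment.
AUGMENTS = ["my keys", "my hair", "my wife"]

RELAXATION_THRESHOLD = 60   # assumed calibration value, tuned per wearer

while True:
    if read_meditation() >= RELAXATION_THRESHOLD:
        # Relaxing past the threshold provokes a randomly chosen augment,
        # which is what gives the interaction its 'chance' quality.
        print("materialise augment:", random.choice(AUGMENTS))
        time.sleep(5)   # let the augment linger before re-arming
    time.sleep(1)
```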

Sander Veenhof developed an AR concept that was close to our hearts (quite literally) and to those of any other academic who attends a lot of conferences. Human Conference Sensors monitors the engagement of an audience member during a presentation and introduces augments designed to provoke an increase in attention if the viewer experiences boredom.  Hopefully the speaker behind the podium has no awareness of how many augments are being activated during their talk; the idea of this system being active during one of my undergraduate lectures fills me with dread.
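Purely as an illustration of the control loop (the engagement index, threshold and device call here are my own placeholders, not Sander’s implementation), the concept boils down to something like this:

```python
import random
import time

def read_engagement() -> int:
    """Placeholder for an EEG-derived engagement/attention index (0-100)."""
    return random.randint(0, 100)   # simulated reading

def show_augment():
    """Placeholder: push an AR augment to the audience member's device."""
    print("augment activated to recapture attention")

BOREDOM_THRESHOLD = 40   # assumed cut-off for "this talk has lost me"

while True:
    if read_engagement() < BOREDOM_THRESHOLD:
        # Engagement has dipped below the threshold: introduce an augment
        # intended to provoke an increase in attention.
        show_augment()
    time.sleep(2)   # re-sample every couple of seconds during the talk
```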

I Must Be Seeing Things by John Cleater is similar to the EEG AR piece in that the person must relax in order for an augment to materialise before their eyes.  What is interesting about John’s piece is that he is using AR combined with psychophysiological regulation to enhance and adapt an actual physical object.  The person views a book of sketched objects through a phone or tablet; as they relax, an augmented object fills in the blank spaces of the sketch as an overlay.  I liked the idea of materialising a virtual object in this way and it’s a neat example of mixed reality with a physiological computing component.
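The mapping from relaxation to “filling in” the sketch could be as simple as a linear ramp. The function below is my own guess at such a mapping, with made-up calibration points rather than anything taken from John’s piece.

```python
def overlay_fill(meditation: int, floor: int = 30, ceiling: int = 90) -> float:
    """Map a 0-100 relaxation reading onto the fraction of the sketched
    object that the AR overlay fills in (0.0 = blank sketch,
    1.0 = fully materialised object). floor and ceiling are assumed
    calibration points, not values from the piece itself."""
    fraction = (meditation - floor) / (ceiling - floor)
    return max(0.0, min(1.0, fraction))

# e.g. overlay_fill(30) -> 0.0, overlay_fill(60) -> 0.5, overlay_fill(90) -> 1.0
```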

All the pieces described so far are only available as concepts, screenshots, animations and movies in the FACT exhibition.

Biomer Skelters by Will Pappenheimer and Tamiko Thiel is different in the sense that it is available in an interactive form as part of the show.  The central concept of Biomer Skelters is to create virtual plants in an urban space and therefore to repopulate the lost vegetation of the city.

What Will and Tamiko have created is a mixed-reality experience where the person can walk the streets of the city, creating virtual vegetation like a trail of breadcrumbs, which can be viewed on a smartphone or tablet. Biomer Skelters incorporates a physiological computing component that Will created with his students at Pace University.  As the person walks, they see a representation of their heart rate.  Heart rate naturally tends to increase as people walk, particularly if they walk fast, so the person must walk whilst maintaining a relatively low heart rate in order to maximise the number of plants created in their wake.  Why does maximising the number of plants matter?  Because there is a competitive element to the piece: Will and Tamiko have designed the software to spawn two types of plants, native species or invasive tropical species, and participants play as two individuals or two teams whose objective is to “out plant” one another.  A rough sketch of the planting loop follows below.
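Here is that sketch, written as I understand the mechanic. The baseline heart rate, calm margin, sensor reads and spawn_plant() function are all placeholders of my own, not anything from Will and Tamiko’s software.

```python
import time

RESTING_HR = 70      # assumed resting heart rate (bpm), measured before the walk
CALM_MARGIN = 10     # staying within this margin of baseline counts as "calm"

def read_heart_rate():
    return 75                    # simulated bpm reading from the heart-rate sensor

def read_gps():
    return (53.4008, -2.9916)    # simulated fix near FACT, Liverpool

def spawn_plant(position, species):
    print(f"planted {species} at {position}")

def walk(team_species, duration_s=60):
    """Plant a geo-located trail: the calmer the walker, the denser the trail."""
    trail = []
    end = time.time() + duration_s
    while time.time() < end:
        hr, pos = read_heart_rate(), read_gps()
        # A heart rate close to baseline maximises plants spawned per step.
        count = 3 if hr <= RESTING_HR + CALM_MARGIN else 1
        for _ in range(count):
            spawn_plant(pos, team_species)
            trail.append((pos, team_species))
        time.sleep(4)   # sample every few seconds as the player walks
    return trail

# Two players or teams compete to "out plant" one another:
# walk("native") versus walk("invasive tropical")
```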

Biomer Skelters combines several elements that (to my knowledge) have not been brought together before: a mixed-reality view through the camera, a physiological computing component, and a geo-located game mechanic linked to biofeedback.  The tests we have run so far show that the code holds up and the system works well, but much depends on having a good-quality smartphone.

We really enjoyed working on these pieces and there are lots of things here to follow up, both in terms of interactive art and AR-based games.

For more information on Turning FACT Inside Out and all the artists involved in the exhibition, please visit the Turning FACT Inside Out project page. The exhibition is open 7 days a week, 12–6pm (11–6pm Saturdays) until 15 September.

This text originally appeared as a posting on the physiologicalcomputing.net research blog.  For more information about physiological computing and the research conducted by LJMU, please visit their blog.