Cybernetic Bacteria

In Cybernetic Bacteria 2.0, the chemical communication of bacteria and the live data streams of our own digital networks (the wireless / Bluetooth / RFID activity taking place in and around the gallery) are combined in real time to generate a brand new artificial life form. This installation (visited by over 45,000 people, Dublin Science Gallery, 2009) explores the layers of complexity in both digital and organic communications networks and investigates the relationship of bacteria to artificial life. A collaboration between Anna Dumitriu, Dr. Simon Park, Dr. Blay Whitby, Tom Keene and Lorenzo Grespan.

Bacteria - wires

"The scientist, unconcerned with the ethical implications of his experiment and also unaware of the artist’s intentions, never anticipated that the fusion of the Earth’s global bacterial communications network, with that of human origin would lead to the evolution of a novel and chimeric life form. A new kind of pathogen mutated by the Bluetooth, RFID and Packet Data surveilled in the gallery. Dublin became the centre of the epidemic, and the origin of a new life form able to subvert both biology and technology. What followed was inevitable. What else would a creature with access to: humanity’s entire digital knowledge; the genetic toolbox that drives evolution; the sophistication of the pathogen; and intimate awareness of our vulnerabilities do?"
bacteria - 3 screen
bacteria - orac

Technical Description

The device has acquired the nickname "Orac" (after the supercomputer from the TV series Blake's 7). It consists of a network of micro-controllers, each searching for electronic devices in the immediate environment. Each time a device is discovered, its unique ID is recorded and sent to an artificial life form based on bacterial communication systems. Each circular flash of pixels marks a newly detected device: white for the tags worn by visitors, blue for Bluetooth devices and red for RFID touch cards.

A Bluetooth device (near the hacked mobile phone) scans the area for mobile phones or laptops with Bluetooth enabled. The white tag (marked "sputnik") detects the tags worn by visitors to the exhibition, and the green pad to the right responds to the white cards attached to the plinth. Every 2 seconds, each of the three devices is 'asked' by a mini webserver whether it has new data, which is sent via the red ethernet cable to the artificial lifeform.

The artificial lifeform follows Conway's Game of Life.

There are two simple additions to the basic game, modelled on the communication structure of genetically modified bacteria.
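For reference, one generation of the unmodified Game of Life can be sketched in plain C++ (the 8x8 grid size and the toroidal wrap-around edges here are illustrative assumptions, not details of the installation):

```cpp
#include <array>

// One generation of the standard Game of Life on a small toroidal grid.
constexpr int W = 8, H = 8;
using Grid = std::array<std::array<int, W>, H>;

// Count the live neighbours of (y, x), wrapping at the edges.
int liveNeighbours(const Grid& g, int y, int x) {
    int n = 0;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            if (dy == 0 && dx == 0) continue;
            n += g[(y + dy + H) % H][(x + dx + W) % W];
        }
    return n;
}

// A live cell survives with 2 or 3 live neighbours; a dead cell is
// born with exactly 3. The installation layers its two bacterial
// signalling rules on top of this basic step.
Grid step(const Grid& g) {
    Grid next{};
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            int n = liveNeighbours(g, y, x);
            next[y][x] = (g[y][x] ? (n == 2 || n == 3) : (n == 3)) ? 1 : 0;
        }
    return next;
}
```

A horizontal three-cell 'blinker', for example, becomes a vertical one after a single step.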

Genetically modified bacteria
The bacteria depicted in the video are:
Chromobacterium violaceum = purple
Serratia marcescens = red
CV026 = white
Purple communicates to white.
White can't communicate, only listen. We know it listens because it turns purple.
Red talks, but has no receiver.
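These talker/listener roles can be restated as a minimal sketch in C++. The single shared "signal" flag is a deliberate simplification of the real quorum-sensing chemistry, and all names here are illustrative:

```cpp
#include <vector>

// Minimal sketch of the talker/listener roles described above.
enum class Strain { Purple, White, Red };   // C. violaceum, CV026, S. marcescens

struct Cell {
    Strain strain;
    bool turnedPurple = false;  // a white cell that has "heard" the signal
};

// One round of communication. Purple talks and white listens; a white
// cell that hears the signal responds by turning purple. Red also
// talks, but its signal matches no receiver here, so it is ignored.
void communicate(std::vector<Cell>& cells) {
    bool purpleSignal = false;
    for (const Cell& c : cells)
        if (c.strain == Strain::Purple)
            purpleSignal = true;
    if (!purpleSignal) return;
    for (Cell& c : cells)
        if (c.strain == Strain::White)
            c.turnedPurple = true;
}
```

With a purple neighbour present, every white cell ends the round "turned purple"; with only red talking, nothing changes.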

Bluetooth reader (mobile phone)
The Bluetooth protocol allows the exchange of data over short distances between fixed and mobile devices.

Open beacon "sputnik" tag
An active RFID device operating in the 2.4GHz band. It detects tags worn by individuals in close proximity, both broadcasting its presence and recording interactions with other similar tags.

RFID card reader
A passive device operating in the 13.56 MHz frequency range. Uses for this touch-based technology include payment in transport systems, library systems and passport control.

Exhibited Sept 2010

Unleashed Devices, Watermans: an exhibition of DIY, hacking and open source projects by artists who explore technologies critically and creatively.

Exhibited: 17th April 09 to 17th July 09

Bacteria - logo
A Dublin Science Gallery commission: Anna Dumitriu, Tom Keene, Lorenzo Grespan, Dr. Simon Park, Dr. Blay Whitby. Part of: INFECTIOUS: STAY AWAY

20th Century London
Commissioned by the Museum of London, this collaborative website facilitated the development of The Exploring 20th Century London partnership, which will create online learning resources about 20th century London, including records catalogued according to shared themes and standards. The resources will be tailored to audience needs in a series of gallery and Web outputs.

Museum - screengrab

The primary partners are the Museum of London and London's Transport Museum, who will develop learning resources and 9000 records about their 20th century designated collections for use in their new galleries and on their Web sites. The additional partners are the Jewish Museum and Croydon Clocktower. Between them the additional partners will contribute a further 2000 records about 20th century collections. An overall set of learning resources and the total set of 11,000 records will then be made accessible through an integrated Web site. The resources, standards and systems will be designed to be extensible in the future.

Aristotle's Office

9 Objects and an interface: Answer machine, Bin, Fan, Filing cabinets, Lamp, Plant, Telephone, Watercooler.

After hiding out at Lighthouse, Brighton, they are now on the move...

2007 Tom Keene and Kypros Kyprianou, a Lighthouse commission, Brighton.

If the video doesn't display, it can be downloaded here [13.4mb .mov].

Nine objects and an interface playfully makes visible the underlying software and hardware structures between networked objects. We aim to investigate potential relationships between everyday objects using simple universal rules. How will the office plant respond to the advances of the fan? Will the water-cooler shy away from the flashing office light?

Aristotles Office - Full office

Throughout an increasingly wired and wireless world, objects are being embedded with communicating technologies and drawn into networked behaviour where previously they were independent. Objects are no longer passive receivers of one-sided instruction. The machines talk amongst themselves, but who knows what they are saying, or how our relationships with them will evolve as they slowly begin to talk?

Aristotles office - Lamp

Each object can be plugged into any other via a simple patch-bay interface. When one object is attached to another, they begin their conversation; sometimes they are excited, sometimes they get bored. When talking, the objects utilise the rather elderly sounding OAP (Object Action Protocol) over the POLAN (Physical Object Local Area Network); it seems this is their preferred method of communication.

Aristotle - pins

"Clearly this will eventually lead somewhere more interesting than the Ray Bradbury inspired MIT House of the Future where your fridge starts giving you diet tips."
Bruce Sterling 2006.

Related Websites / Links


Billboard

Development of a website for a creative regeneration project mobilising young people to produce a ‘Young People’s Manifesto for Urban Development and Regeneration’.

Billboard - lab
Billboard provides the means for young people to become community reporters, investigators and champions for positive change.

The project is built around a partnership between Creative Partnerships Thames Gateway, Futurecity, Basildon District Council and Southend and Thurrock Unitary Authorities, with media partners Channel 4/Talent.


Billboard - web
Billboard is supported by Arts Council England, East and EEDA via their unique shared prospectus, which identifies three common areas for strategic joint investment that will harness the power of the arts and creativity to transform the social, economic and material conditions of local people and communities.

The Billboard Lab is a customised shipping container that transforms into a fully equipped film studio where young people will develop their creative skills, explore regeneration issues and engage with local communities. The Billboard website will provide a dedicated space for regeneration and showcase work produced through the project.

Billboard led to a national conference that enabled young people to debate, challenge and refine the young people’s manifesto before presentation to Government.

Billboard - Container
Billboard - Camera


Biosensing and Networked Performance: a workshop at ISEA 2011, Istanbul, led by Anna Dumitriu and Tom Keene with Alex May. Venue: ISEA 2011, Istanbul, 15th and 16th September 2011, 9am-1pm.

Project downloads
Build Instructions [324kb .jpg]
Arduino Code [12kb]
Paper: "The Apple Barrier: An open source interface to the iPhone"

In this two-day workshop participants built and calibrated their own iPhone compatible/connectable Galvanic Skin Response (GSR) sensors to record subtle changes in their emotional arousal. Participants also collaborated to develop a networked performance intervention that engaged with the social benefits and ethical implications of disclosing such personal information as arousal levels within the public realm. Participants learnt to solder their own GSR sensors, connect them to their iPhones and share their sensor data online. There was a discussion of the implications of this technology and the growing privacy issues as pervasive computing technology becomes increasingly able to record and reveal personal details.

Arduino Code
I added a smoothing function and startup sounds to the DDS Sinewave Generator code developed by LabIII, the Laboratory for Experimental Computer Science at the Academy of Media Arts Cologne.

const int numReadings = 10;
int readings[numReadings];      // the readings from the analog input
int index = 0;                  // the index of the current reading
int total = 0;                  // the running total
int average = 0;                // the average
int inputPin = A0;              // the pin the sensor is attached to

void initialiseSmooth(){
  for (int thisReading = 0; thisReading < numReadings; thisReading++) // initialize all the readings to 0:
    readings[thisReading] = 0;
}

int readSmooth(){
  total = total - readings[index];        // subtract the last reading
  readings[index] = analogRead(inputPin); // read from the sensor
  total = total + readings[index];        // add the reading to the total
  index = index + 1;                      // advance to the next position in the array
  if (index >= numReadings)               // if we're at the end of the array...
    index = 0;                            // ...wrap around to the beginning
  average = total / numReadings;          // calculate the average
  return average;
}
void setup()
{
   initialiseSmooth();       // Smoothing setup
   // Wavetable setup
   o1.phase = 0;
   o1.phase_increment = 0;
   o1.amplitude_increment = 0;
   o1.frequency_increment = 0;
   o1.framecounter = 0;
   o1.amplitude = 0;         // zero amplitude
   initializeTimer();        // start PWM output and the sample interrupt
   carylMannComposition();   // startup sounds
}

void loop() {
    // Read the analog input
    dfreq = readSmooth();          // read analog pin to adjust output frequency from 0..1023 Hz
    dfreq = 1023 - dfreq;          // invert the reading so the tone sounds higher the less resistance there is
    o1.phase_increment = phaseinc(dfreq);
}
// Startup sequence
// (Music was composed by Caryl Mann)
// TODO: Should really create a lookup table for this...
void carylMannComposition()
{
  // Startup sequence (convert 'midi ticks' to milliseconds: 60000 / (BPM * PPQ))
  // At 100bpm 1 tick = 2.36 milliseconds
  o1.amplitude = 255*256; // full amplitude
  o1.phase_increment = phaseinc(440.00); // A4 - ticks
  o1.phase_increment = phaseinc(174.61); // F3 - ticks
  o1.phase_increment = phaseinc(523.25); // C5 - ticks
  o1.phase_increment = phaseinc(349.23); // F4 - ticks
  o1.phase_increment = phaseinc(659.26); // E5
  o1.phase_increment = phaseinc(698.46); // F5
  o1.amplitude = 0;
  o1.amplitude = 255*256; // full amplitude
  o1.phase_increment = phaseinc(164.81); // E3
  o1.phase_increment = phaseinc(293.66); // D4
  o1.amplitude = 0;
  o1.amplitude = 255*256; // full amplitude
  o1.phase_increment = phaseinc(466.16); // A#4
  o1.phase_increment = phaseinc(739.99); // F#5
  o1.amplitude = 0;
  o1.amplitude = 255*256; // full amplitude
  o1.phase_increment = phaseinc(1108.73); // C#6
  o1.phase_increment = phaseinc(369.99); // F#4
  o1.phase_increment = phaseinc(466.16); // A#4
  o1.phase_increment = phaseinc(233.08); // A#3
  o1.amplitude = 0;
  o1.amplitude = 255*256; // full amplitude
  o1.phase_increment = phaseinc(698.46); // F5
  o1.phase_increment = phaseinc(174.61); // F3
  o1.amplitude = 0; // silence
}
// using "DDS" with 32-bit phase register to illustrate efficient
// accurate frequency.
// 20-bits is on the edge of people's pitch perception
// 24-bits has been the usual resolution employed.
// so we use 32-bits in C, i.e. long.
// smoothly interpolates frequency and amplitudes illustrating
// lock-free approach to synchronizing foreground process  control and background (interrupt) 
// sound synthesis
// On old ATmega8 boards, output is on pin 11
// For modern ATmega168 boards, output is on pin 3
// copyright 2009. Adrian Freed. All Rights Reserved.
// Use this as you will but include attribution in derivative works.
// tested on the Arduino Mega
#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/pgmspace.h>
double dfreq;
const unsigned int LUTsize = 1<<8; // Look Up Table size: has to be power of 2 so that the modulo LUTsize
                                   // can be done by picking bits from the phase avoiding arithmetic
int8_t sintable[LUTsize] PROGMEM = { /* 256 sine samples, already biased with +127 (data omitted here) */ };
int8_t triangletable[LUTsize] PROGMEM = { /* 256 triangle samples (data omitted here) */ };
const int timerPrescale=1<<9;
struct oscillator
{
    uint32_t phase;
    int32_t phase_increment;
    int32_t frequency_increment;
    int16_t amplitude;
    int16_t amplitude_increment;
    uint32_t framecounter;
} o1;
const int fractionalbits = 16; // 16 bits fractional phase
// compute a phase increment from a frequency
unsigned long phaseinc(float frequency_in_Hz)
{
   return LUTsize *(1l<<fractionalbits)* frequency_in_Hz/(F_CPU/timerPrescale);
}
// The above requires floating point and is robust for a wide range of parameters
// If we constrain the parameters and take care we can go much
// faster with integer arithmetic
// We control the calculation order to avoid overflow or resolution loss
// We chose "predivide" so that pow(2,predivide) divides F_CPU, so 4MHz (1.7v), 8MHz, 12MHz (3.3v), 16MHz and 20MHz all work,
// AND note that "frequency_in_Hz" must not be too large. We only have about 16kHz of bandwidth to play with on Arduino timers anyway.
const int predivide = 8;
unsigned long phaseinc_from_fractional_frequency(unsigned long frequency_in_Hz_times_256)
{
    return (1l<<(fractionalbits-predivide))* ((LUTsize*(timerPrescale/(1<<predivide))*frequency_in_Hz_times_256)/(F_CPU/(1<<predivide)));
}
// tabulate phase increments corresponding to equal temperament and midi note numbers (semitones)
#define MIDITOPH
#define mtoph(x) ( phaseinc(8.1757989156* pow(2.0, x /12.0) ))
unsigned long midinotetophaseinc[128] = {
  mtoph(0),   mtoph(1),   mtoph(2),   mtoph(3),   mtoph(4),   mtoph(5),   mtoph(6),   mtoph(7),
  mtoph(8),   mtoph(9),   mtoph(10),  mtoph(11),  mtoph(12),  mtoph(13),  mtoph(14),  mtoph(15),
  mtoph(16),  mtoph(17),  mtoph(18),  mtoph(19),  mtoph(20),  mtoph(21),  mtoph(22),  mtoph(23),
  mtoph(24),  mtoph(25),  mtoph(26),  mtoph(27),  mtoph(28),  mtoph(29),  mtoph(30),  mtoph(31),
  mtoph(32),  mtoph(33),  mtoph(34),  mtoph(35),  mtoph(36),  mtoph(37),  mtoph(38),  mtoph(39),
  mtoph(40),  mtoph(41),  mtoph(42),  mtoph(43),  mtoph(44),  mtoph(45),  mtoph(46),  mtoph(47),
  mtoph(48),  mtoph(49),  mtoph(50),  mtoph(51),  mtoph(52),  mtoph(53),  mtoph(54),  mtoph(55),
  mtoph(56),  mtoph(57),  mtoph(58),  mtoph(59),  mtoph(60),  mtoph(61),  mtoph(62),  mtoph(63),
  mtoph(64),  mtoph(65),  mtoph(66),  mtoph(67),  mtoph(68),  mtoph(69),  mtoph(70),  mtoph(71),
  mtoph(72),  mtoph(73),  mtoph(74),  mtoph(75),  mtoph(76),  mtoph(77),  mtoph(78),  mtoph(79),
  mtoph(80),  mtoph(81),  mtoph(82),  mtoph(83),  mtoph(84),  mtoph(85),  mtoph(86),  mtoph(87),
  mtoph(88),  mtoph(89),  mtoph(90),  mtoph(91),  mtoph(92),  mtoph(93),  mtoph(94),  mtoph(95),
  mtoph(96),  mtoph(97),  mtoph(98),  mtoph(99),  mtoph(100), mtoph(101), mtoph(102), mtoph(103),
  mtoph(104), mtoph(105), mtoph(106), mtoph(107), mtoph(108), mtoph(109), mtoph(110), mtoph(111),
  mtoph(112), mtoph(113), mtoph(114), mtoph(115), mtoph(116), mtoph(117), mtoph(118), mtoph(119),
  mtoph(120), mtoph(121), mtoph(122), mtoph(123), mtoph(124), mtoph(125), mtoph(126), mtoph(127)
};
#undef mtoph

// Timer setup constants
#if defined(__AVR_ATmega8__)

// On old ATmega8 boards, output is on pin 11
#define PWM_PIN       11
#elif defined(__AVR_ATmega1280__)

#define PWM_PIN       3
#else

// For modern ATmega168 boards, output is on pin 3
#define PWM_PIN       3
#endif

void initializeTimer() {
 // Set up PWM with Clock/256 (i.e. 31.25kHz on a 16MHz Arduino)
 // and phase accurate
#if defined(__AVR_ATmega8__)
  // ATmega8 has different registers
  TCCR2 = _BV(WGM20) | _BV(COM21) | _BV(CS20);
#elif defined(__AVR_ATmega1280__)
  TCCR3A = _BV(COM3C1) | _BV(WGM30);
  TCCR3B = _BV(CS30);
  TIMSK3 = _BV(TOIE3);
#else
  TCCR2A = _BV(COM2B1) | _BV(WGM20);
  TCCR2B = _BV(CS20);
  TIMSK2 = _BV(TOIE2);
#endif
}
// this is the heart of the wavetable synthesis: a phasor looks up a sine table
int8_t outputvalue = 0;
ISR(TIMER2_OVF_vect) // overflow interrupt; the vector name matches the timer enabled above
{
   PWM_VALUE_DESTINATION = outputvalue; // output first to minimize jitter
   outputvalue = (((uint8_t)(o1.amplitude>>8)) * pgm_read_byte(sintable+((o1.phase>>16)%LUTsize)))>>8;
   o1.phase += (uint32_t)o1.phase_increment;
   // ramp amplitude and frequency
   o1.amplitude += o1.amplitude_increment;
   o1.phase_increment += o1.frequency_increment;
}
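The DDS arithmetic in the listing can be sanity-checked on a desktop machine. This standalone C++ sketch restates phaseinc() with the constants assumed in the code above (16 MHz clock, /512 timer prescale, 256-entry wavetable, 16 fractional phase bits):

```cpp
#include <cstdint>

// Desktop restatement of the sketch's DDS maths; the constants mirror
// the Arduino listing above and are assumptions taken from it.
constexpr double F_CPU_HZ = 16000000.0;
constexpr uint32_t LUT_SIZE = 1u << 8;
constexpr uint32_t PRESCALE = 1u << 9;
constexpr uint32_t FRACTIONAL_BITS = 16;

// Phase increment per sample for a desired output frequency: each
// interrupt advances the 32-bit phase by this amount, and the top
// bits of the phase index the wavetable.
uint32_t phaseIncrement(double freqHz) {
    double sampleRate = F_CPU_HZ / PRESCALE;  // 31250 samples/s
    return (uint32_t)(LUT_SIZE * (double)(1ul << FRACTIONAL_BITS) * freqHz / sampleRate);
}

// Inverse: the frequency actually produced by a given increment,
// useful for checking the rounding error of the integer increment.
double frequencyOf(uint32_t inc) {
    double sampleRate = F_CPU_HZ / PRESCALE;
    return inc * sampleRate / (LUT_SIZE * (double)(1ul << FRACTIONAL_BITS));
}
```

For A4 the increment comes out at 236223, which plays back at roughly 439.9996 Hz: the rounding error is far below audibility, which is the point of carrying 16 fractional phase bits.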

Tom Keene is an artist whose work focuses on the intersection of technology, communication and participation. Since 1998 he has worked as a freelance programmer and designer for web, networks and physical computing platforms. His multi-disciplinary work ranges from collaborative website design and development, hi-tech sensor-driven environments, and reactive video and robotic installations to participatory arts projects in community settings.

Anna Dumitriu’s work blurs the boundaries between art and science. Her installations, interventions and performances use a range of digital, biological and traditional media including live bacteria, interactive media and textiles. Her work has a strong international exhibition profile and is held in several major public collections, including the Science Museum in London. She was a member of the e-MobiLArt project (the EU funded European Mobile Lab for Interactive Art) and Artist in Residence at The Centre for Computational Neuroscience and Robotics at Sussex University (where she is the artist partner on the project “Supporting Shy Users in Pervasive Technology”). She is known for her work as director of “The Institute of Unnecessary Research”, a group of artists and scientists whose work crosses disciplinary boundaries and critiques contemporary research practice. She is currently working on a Wellcome Trust funded art project entitled “Communicating Bacteria”, collaborating with the Adaptive Systems Research Group at The University of Hertfordshire (focusing on social robotics), and has recently commenced her role as Leverhulme Trust artist in residence on the Modernising Medical Microbiology project at The University of Oxford.

Alex May is an international artist working with digital projection, 3D video mapping, illumination, and optics to create animated trompe l’oeil effects using scientific theories of perspective and projective geometry. He is a veteran programmer specialising in, but not limited to, high performance, real-time audio/visual processing, creating his own software to facilitate his own art projects as well as releasing open-source tools that are in use by digital artists worldwide. Working with sound designer Martin A. Smith, Alex has created a series of major site-specific installations for clients such as Kensington and Chelsea Council, Universal Music, and Canon Europe.

Blushing Mona Lisa

Tom Keene and Kypros Kyprianou’s Blushing Mona Lisa is a reworking of the most looked at portrait in the world. The longer one looks directly at her, the deeper she blushes.

Alongside the admittedly cheap reproduction of the portrait hangs an updating explanatory text. Combining visitor comments with articles discussing the impact of the Mona Lisa, auto-generative software forms a new description and authority over the meaning of the work.

With its famous ‘enigmatic smile’, the Mona Lisa became more famous as an object when it was stolen. Despite the disappearance of the object itself, people flocked to gaze at the empty space. With the gaze of the art object absent, the spectators themselves became the spectacle; the museum experience shifted, reversing the gaze onto the viewer.

No wonder she’s blushing.

Tom Keene is a mixed media artist whose multi-disciplinary work ranges from sensor-driven environments, reactive video and robotic installations to participatory arts projects in community settings.

Kypros Kyprianou is an artist and filmmaker whose collaborative practice focuses on the mediation of science by governmental, public and private organisations.

Their previous collaborative work playfully engages with notions of language, communication and networks using modified objects, unclassified archives, signage and blackboards.

Commissioned as part of "Like Shadows: A Celebration of Shyness"
Supported by the Engineering and Physical Sciences Research Council, as part of Supporting Shy Users in Pervasive Computing project, undertaken by the departments of Informatics and Sociology at The University of Sussex and by Brighton and Hove City Council. Curated by Helen Sloan.

Phoenix Brighton 29th October 2011 8pm (BST) -2am (GMT)
A night of exhibition, performance, intervention, screening, music and contemplation

British Farmers

British Farmers

We created this site for an organisation whose purpose is to enable farmers to produce their own consumer-facing marketing activity.

British Farmers - Screenshot

Collide, Follow, Set

Part of a series of events (HobNobs) held in InQbate, the experimental interactive media space at Sussex University, which I worked closely with during 2009-2010. These events were used as a testing ground for developing a series of interactive digital media installations, most of which were written using OpenFrameworks. These experiments informed the development of a new control system for the space, to easily enable creative experimentation with the environment.

Collide
In collaboration with artist Fiona Geilinger, we created a reactive kaleidoscopic video work where, after a 20-minute choreographed performance, the audience were free to play with the kaleidoscopic effect.
Hobnob: Kaliedacope
Hobnob: Kaliedacope
Hobnob: Kaliedacope

Follow
Hobnob: Sparkles
This live video effect detects movement, overlaying directional lines on top of live video imagery.

Set
Hobnob: Sheep
A miniature film set positioned in the centre of the space invited manipulation by users; layers of text, props and miniature construction were continually added to throughout the night and projected at huge scale, surrounding the entire space.
Hobnob: Sheep

Gasworks to Dome

2006 Tom Keene a Stream Arts commission, London.

Gasworks - header
A participatory arts project commissioned by Stream Arts, this multimedia website (images & streamed audio) encouraged local people to explore the history of East Greenwich and the Greenwich Peninsula within living memory (1920s to the present day).

Gasworks - montage

Using Drupal as a base, we created a website and set of tools so that users with varying degrees of technical know-how are able to upload images and tag/comment on oral history recordings, allowing discrete segments of very long interviews to be instantly accessed and categorised along with images.

Created over a number of months during 2006, this website was developed in collaboration with local residents, where ideas on design, interface and functionality were all driven by a series of evening workshops.

Inqbate: Control system

A joint initiative between the University of Sussex and the University of Brighton to explore how a technology-rich environment can support the creative process. An initial engagement to explore the creative potential of the space expanded to full redevelopment of the software control system and partial re-implementation of the hardware infrastructure, including audio, video, lighting and server systems.

Full software/hardware system audit and redesign. Tools used: C++ (OpenFrameworks), Java (Processing), PHP, MySQL, jQuery, Apache; creative/technical support.

University of Sussex, University of Brighton, The Centre for Excellence in teaching and learning through Design. HEFCE, SEEDA.

InQbate is a highly configurable multi-media space designed to give you control of all aspects of the environment. You can change the physical environment by moving walls, positioning curtains and arranging different types of seating. A web-based System Controller allows simple and flexible control of the installed technology. You can change the lighting by varying the levels of white (halogen) and switching the coloured (LED) lights throughout the space. You can project visual images (PC desktops, video, CCTV and still images) to any or all the 14 overhead projectors and 10 plasma screens. You can route audio from any PC or auxiliary input to any or all of the ceiling and wall-mounted speakers.

London isn't Venice yet

A brightly clad troupe of masked Venetians are travelling along the river Thames and London canal network looking for a new home after floods have engulfed Venice. Alighting from their boat, accompanied by an accordion and singing Italian love songs: "Ladies and gentlemen, senori and senorina of London, dwellers on the banks of this great waterway: greetings! Greetings from Venezia! Our city. City of water and light, and... carnival! City of mystery, romance and… FLOODS!!!"

Producers: Tom Keene, George Butler
Script: Shaun Mc Carthy (Radio 4, Old Vic).
Director: Alison Goldie (Rainbow Theatre Group, The Weird Sisters T.C, Cirque de Soleil).
Cast: A mixture of professional and amateur actors.

[Pictures coming soon]

Norfolk Drugs Action Team

Norfolk Drugs Action Team

This website provided information about drug use, alcohol use and services specific to the Norfolk region. A requirement of this website was a bespoke content management system that allowed full control over HTML content. Continuous consultation throughout the life of the project clarified the specific needs of this very active and successful drug action team.

I designed and programmed this website in full, supporting Simplyworks, a community-based project which provides training and work opportunities for individuals with long-term social exclusion problems. These individuals were an integral part of the website development team.

Nordaat - Screenshot

Path finder

A continued investigation into the path-finding characteristics of the Viterbi algorithm.
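For reference, the Viterbi algorithm itself has nothing Roomba-specific about it: given a hidden Markov model, it recovers the most probable sequence of hidden states behind a sequence of observations. A textbook sketch in C++, using the standard two-state "weather" example (the model and its numbers are illustrative, not data from this project):

```cpp
#include <array>
#include <cstddef>
#include <vector>

// A textbook Viterbi decoder over a tiny hidden Markov model.
constexpr int NSTATES = 2;  // 0 = Rainy, 1 = Sunny
constexpr double startP[NSTATES] = {0.6, 0.4};
constexpr double transP[NSTATES][NSTATES] = {{0.7, 0.3}, {0.4, 0.6}};
// observations: 0 = walk, 1 = shop, 2 = clean
constexpr double emitP[NSTATES][3] = {{0.1, 0.4, 0.5}, {0.6, 0.3, 0.1}};

std::vector<int> viterbi(const std::vector<int>& obs) {
    std::size_t T = obs.size();
    std::vector<std::array<double, NSTATES>> prob(T); // best probability ending in each state
    std::vector<std::array<int, NSTATES>> back(T);    // which previous state achieved it
    for (int s = 0; s < NSTATES; ++s)
        prob[0][s] = startP[s] * emitP[s][obs[0]];
    for (std::size_t t = 1; t < T; ++t)
        for (int s = 0; s < NSTATES; ++s) {
            prob[t][s] = 0.0;
            back[t][s] = 0;
            for (int p = 0; p < NSTATES; ++p) {
                double v = prob[t - 1][p] * transP[p][s] * emitP[s][obs[t]];
                if (v > prob[t][s]) { prob[t][s] = v; back[t][s] = p; }
            }
        }
    // backtrack from the most probable final state
    std::vector<int> path(T);
    int best = prob[T - 1][1] > prob[T - 1][0] ? 1 : 0;
    for (std::size_t t = T; t-- > 0; ) {
        path[t] = best;
        best = back[t][best];
    }
    return path;
}
```

Observing walk, shop, clean decodes as Sunny, Rainy, Rainy: the single most probable hidden path, not just the most probable state at each step.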

I took an altered Roomba Vacuum Cleaning Robot (RVCR) to two locations within Lewisham shopping centre to test its operation within a public environment. There were no alterations to the cleaning algorithm, as I was avoiding any programming or electronics. I wanted to gain some insight into the nature of algorithmic pathfinding in relation to the Viterbi algorithm, and into people's reactions to a physical autonomous entity. Thinking in terms of black-box technologies, I had given the RVCR a security-style aesthetic: a rectangle of black metal acted as a platform on which I mounted a technical-looking black box, with an obviously consumer-grade video camera mounted at the front. The black box served no electrical function, though it did act as a counterweight to the camera, which on its own tipped the RVCR forward, preventing any movement. The effect was very crude, with elements covered and held together with black gaffer tape and any visible brand names covered with black electrical tape, so my assumption was that any close inspection would dispel any idea that this was a 'real' security device. I also presumed that because the RVCR itself was still clearly visible, it might be recognised as a vacuum cleaner.

There was an expectation amongst all of us (myself, Anila, Charlotte) that security would ask us to leave as soon as they caught sight of the device. We wanted to avoid this eventuality for as long as possible by selecting an initial location away from the central hub. We chose a position by a cafe with some outside seating, which we thought might provide some interesting obstacles for the RVCR and likely interaction with the public. We simply placed the RVCR in a clear location surrounded by shop entrances and the cafe, and set it going. Initial fears about the RVCR moving too fast were unfounded: what had seemed like a frantic pace within the closed environment of the Goldsmiths lab at Laurie Grove seemed very slow and methodical within the much larger space of the shopping centre.
It instantly grabbed people's attention, drawing a small crowd of onlookers. We attempted to blend into the background, observing people's reactions and distancing ourselves from its operation. One of the first comments I heard was "who's controlling it?", and lots of people looked around, I presume for evidence of a remote control. A small child got very interested and started to follow the device; his mother initially looked concerned but eventually let the child play. As people passed they would stop and point, mostly looking amused, though one mother with a pram was annoyed as the RVCR got in her way.
While the device was operating in open space and didn't encounter any obstacles, its path seemed random: it would travel in a straight line then turn at seemingly arbitrary intervals. This caused great delight for the child and gave the impression that there was human intelligence behind its control, that a controller was playing with the child and the general public. A number of times, after criss-crossing the open space for around 4 minutes, the RVCR would stop. This may have been due to its algorithm 'giving up' as it was receiving no 'bump' sensor input; the data space provided by the sensor inputs may have been deemed uncomputable. Each time the RVCR stopped I would walk over and reset it, which gave some members of the public the opportunity to make a connection between the device and myself.


The RVCR eventually bumped into a wall where it develop a different navigational aesthetic as it rapidly performed a sequence of bumps and turns as it hugged the line of a wall, making its way to the entrance of a clothes shop. At this point it didn't look like it was under human control as its movements were more insect like, more threatening even. As it got closer and closer to the entrance we had a brief discussion that perhaps we should stop the device entering the shop (we decided not to) though it eventually stopped at the shop entrance of its own volition, where a young man had stopped to photograph it, a shop assistant was laughing at it and an older man in his 60's had stopped to observe. When restarting the RVCR it made its way deep into the shop at which point the shop manger politely asked us to remove it.
Each person outside the shop had thought the device was controlled by the shopping centre security, though the shop assistant had seen a camera bag over my shoulder so had made a connection that it might have been me controlling it. They were all surprised when I told them it was a camera on top of an automatic vacuum cleaner. A spoke with the older man for a much longer period of time and he eventually stated that he liked the idea of robotic security devices and that he would like to see more of them as they made him feel safer, he also saw opportunity for this standard device to be covered in different form factors so as to provide alternative services, though he never specified what these services might be.
We took the RVCR to a new central location in the shopping centre, outside a sports shop and not far from an information desk. This location was more compact than the previous area, more like a corridor. The device quickly discovered walls and methodically made its way along one edge. It would periodically cross the corridor towards the open front of a shop, never entering more than a metre. It did not look as if anybody had seen us place the device on the ground, so we were able to maintain anonymous observation. A shop assistant quickly spotted the RVCR and was immediately concerned. Its movements within that context seemed more threatening, perhaps due to the smaller space and the speed and aesthetic of its movement.

The shop assistant called over three other assistants, repeatedly asking them what it was, and eventually called over a security guard, at which point I moved to pick the device up. We were told that we should have asked permission to use the device, and the shop assistant was worried that it "could have been anything", which I took to be a reference to a bomb. The security guard then left and called over a policewoman who asked us to explain what we were doing. I gave her full details of the project, at which point her eyes glazed over and she said that we needed to ask permission to do anything like that.

Performing Rights Library

A collaboration between The Anthill Social, Space Studios and Queen Mary University London.

Design and development of a website for "Performing Rights", a festival of creative dialogues between artists, academics, activists and audiences investigating relationships between human rights and performance, composed of two distinct but related programmes: a conference and a series of Manifestations. The Performing Rights library invited online contributions, networked international communities, created space for dialogue and enabled the transmission and documentation of human rights and performance.

It looked at the significance of human rights in times of war and globalisation, made links between international and local communities, and investigated performance practices that facilitate human rights work.

Performing rights


Commissioned by the Hayward Gallery in 2005 (in collaboration with Mutiny Arts) 'Phantasmagoria' explored the supernatural through an interactive audio-visual installation and live performance.


This project used electrostatic field detectors, miniature cameras and wired props to trigger ambient soundscapes and reactive visual imagery using Puredata and Processing.


An investigation into wireless communication, drawing inspiration from Alexander Graham Bell’s Photophone, which allowed the transmission and reception of sound over a modulated beam of light. After finding simple instructions online I built my own version, wirelessly transmitting the audio output of an old MP3 player to a small set of speakers via a flickering LED. I quickly discovered that different light sources would generate their own signature sounds. This encouraged me to convert the device to work as a pair of headphones, leaving my hands free to search for interesting lights.

The build

Transmitter components: LED, 9 V battery, 470 ohm resistor, 10 uF capacitor.
Receiver components: photoresistor, 230 ohm resistor, 9 V battery, speakers.

Observations: The build
I enjoyed building an analogue circuit without a microcontroller, as it felt easier to comprehend the full set of components, their individual action and construction, which got me thinking about the mineral origins of each element, something I would like to explore further.
I discovered that the LED could be replaced with a laser, enabling transmission over long distances. Also, if the laser was reflected off a window, it would be modulated enough by the sound waves occurring within the room that we could hear, from outside, the sounds within, a development that could take the project into many contexts.
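The Photophone's principle, audio amplitude modulating light intensity and a photoresistor recovering it, can be sketched numerically (a simulation of the signal chain, not the analogue circuit above; the bias and gain values are assumed):

```python
import math

def transmit(audio, bias=0.5, gain=0.4):
    """Map audio samples (-1..1) onto LED brightness (0..1) around a DC bias."""
    return [bias + gain * s for s in audio]

def receive(brightness, bias=0.5, gain=0.4):
    """Photoresistor side: subtract the bias and rescale to recover audio."""
    return [(b - bias) / gain for b in brightness]

# A 440 Hz test tone sampled at 8 kHz.
tone = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(80)]
recovered = receive(transmit(tone))
# The recovered samples match the original to floating-point precision.
```

The "signature sounds" of different light sources are exactly this demodulation applied to whatever intensity ripple a lamp happens to have: mains hum, PWM dimming, screen refresh.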
Observations: London Bridge & Borough Market
I took the device to London Bridge Station and Borough Market, where I wore the headphones and held the receiver up to different lights, a synaesthetic experience that completely changed my perception of the environment. A giant LED screen generated rhythmic pulsing sounds as images changed, each shop on the concourse emitted a slightly different buzz or hum, flashing lights created rhythms, and sunlight was silent. When I showed the device to members of the public, I discovered they were more intrigued by audio transmission from the MP3 player than by the buzzes and hums created by lights.

Observations: Finsbury Park
I took the device to a workshop in Finsbury Park and again people responded with a sense of amazement when I demonstrated audio transmitted through the air over a beam of light; this drew them into exploring the sounds generated by projector, halogen, LED and fluorescent light sources. As part of the workshop I walked around Finsbury Park (with the headphones on) and made chalk marks whenever I found an interesting sound. Annotating the environment felt like taking control of the space, making it my own, though I felt self-conscious and had an urge to ‘look official’ the whole time I was wearing the headphones.


A cross-generational oral history archive that reflects how people think and feel about Kettle’s Yard. This website offers a selection of extracts from interviews with people of all ages, some of whom have known Kettle's Yard for decades, some of whom were visiting for the first time.

Kettle's Yard was the private home of Jim and Helen Ede, who lived there from 1957 to 1973. Kettle's Yard maintains Jim’s vision of the house as a ‘way of life’. His collection of paintings, drawings, prints and sculptures is arranged amongst ceramics, glassware and natural objects, creating a retreat from the hustle and bustle of everyday life. The house and its contents were gifted to the University of Cambridge in 1966 and have since been expanded several times to include a temporary exhibition gallery.

Sam The Wheels

Publication: Download Publication [.pdf 2.54mb]

This unique project combines historic and contemporary documentary film, visual arts, collaborative web-based technologies and grassroots community action. The purpose is not just to conserve and publish rare film footage of 1960s, 70s and 80s Brixton but to bring it alive in communities now, creating a space for people to interpret and reinterpret it in ways that are relevant to their challenges and experiences today.

In Brixton from the late 50s to the present day, Clovis Salmon, aka ‘Sam the Wheels’, has captured accounts of everyday life, protest and people, offering a lens through which the struggles, sufferings and joys of those times can be seen with an authenticity uncontaminated by a media agenda. In sharing his historic footage, Sam has served as a living lens on those times, offering his own experience of arriving in London during the 1950s as the catalyst for a community arts and heritage collaboration that resonates with present-day Brixtonians and beyond.

Sam’s film, ‘The Great Conflict of Somerleyton Road’, follows the story of Jesus Saves, a Pentecostal church demolished to make way for the ‘Barrier Block’ on Coldharbour Lane, concluding with the aftermath of the 1981 Brixton Riots. It reveals the important role churches played for new Caribbean communities, who built their own places of worship and social spaces after being excluded from English churches, pubs and clubs.

The process
The collaboration, entitled ‘Sam the Wheels’, has engaged artists, activists, documentary film makers, curators, volunteers, youth groups, poets, writers, technologists, community figures and others in a rich creative explorative process over the last year. Funded primarily by the Heritage Lottery Fund, the project has drawn on film, photography, arts, web and community development techniques to directly engage people in a co-production that is at once historic and contemporary; factual and artistic; educational and entertaining; personal and communal; above all, interactive and engaging.

The co-producers of the ‘Sam the Wheels’ project are Mutiny Arts (aka George Butler and Tom Keene) and 198 Contemporary Arts and Learning. Together they invited and enabled a multitude of responses through a collaborative website, exhibition, publication and series of events. The resulting multi-layered project brings a fresh reinterpretation of recent history by partnering artistic and critically engaged approaches with a diverse array of social and cultural perspectives.

Sam’s 8mm celluloid films were first digitised, then presented in a series of events and workshops that ran alongside four artists’ commissions. George Amponsah and George Butler, both with backgrounds in documentary film, worked with volunteers and young participants to tell stories and trace ideas of identity, belief and belonging, while Nada Prlja learnt about the varying importance of religious ideology within Brixton’s culturally diverse population. In Tim Blake’s own words: “The more I travel the more I realise what an amazingly diverse place London and a lot of England is. OK, this may be something to do with our ‘old Empire’, but as Gandhi said, ‘No culture can live if it attempts to be exclusive’”.

A core group of local residents aged between 10 and 80, including Sam himself, volunteered their time and brought the project to life through discussion, researching Sam's own narrative, his films, and local places and people. They shared their thoughts, findings, films and photographs on a website created with me, which acted as a sketchbook and sounding board for their discoveries. Rather than attempting to create a definitive archive, its focus was on collecting snippets of information, recording the collaboration, and then retrospectively making sense of what had been shared.

Archive seems to be a word that unifies and helps make sense of the project as a whole. It has been a process that gathered information on the interactions between an individual, a place, institutions and groups of people. Our approach to the project’s ethnographic material has not been one divorced from our relative social, political and cultural positions, a methodology that a museum or historian might employ, but one intimately connected to them through research driven by personal and group experience.


Film screenings

9th Feb 2009 6-8pm. Tate Modern
A screening of the edited highlights of Sam's films, with an introduction by the man himself.

An edit of Sam's films shown at the Urban Green Fair on 20th September 2009 at 3pm in the Solar Cinema.

EXHIBITION - December 2008 to 21st February 2009

People Signs & Resistance: On the front line


14th Jan 2009, 7-9pm. Publication launch
To launch the publication that documents the process of this project
we will be having a Brixton themed poetry slam hosted by Michael Ryan.
To register your interest in participating email

28th Jan 2009, 7-9pm. An evening with Sam
Clovis Salmon, AKA Sam the Wheels, has seen many wonderful things
through his camera. Come and join writer Michael McMillan for an
evening of relaxed conversation and music with Sam and friends.

11th Feb 2009, 7-9pm. Discussion evening
With more government control over our lives and our means to express
ourselves politically, where are the new spaces of resistance? A round
table discussion where artists, writers, young people and political
activists come together to discuss this issue.

Sign X Here

2011 Tom Keene and Kypros Kyprianou, a Stream Arts commission, East Greenwich, London.

Sign X Here continues conversations started in past Stream projects (A-X and Now Hear This). We have been chatting with local residents, businesses and organisations around East Greenwich, who have shared their ideas to help devise a trail of signs reflecting the observations and experiences of living in the area.

Using quotes and inspiration from past projects, conversations with previous participants, local residents, local businesses, youth radio station Fresh FM and the local police. Activities include local walks with Community Police officers, chats over a cup of tea or a pint, group meetings, and photographing, making and placing signs. A final publication will bring together images of the installed signs and their locations, expanded by the thoughts and imaginations of participants gathered during the life of the project.

Kyp & Paula's Walk (marked in pink)
Josh is up for the walk - he's getting out of bed, Dog shit alley, No one chills there (here) anymore, Getting motivated, Flavell Mews, Watch your step, Paula walking in the middle of the road, All the drivers stop, Its a dead end, The legendary lemon cage, GMVs too far (Josh peeled off), March frog day, Opposing sides (somewhere near the river), The two benches along river near west greenwich. 2 of them near each other, "that's where you hang out.", "we're not stupid enough to hang out there. we hang out at the other one. look up trees. pidgeons. look down. we hang out at the other one", Emilirs shop, Eltons, Payless (posso sign), Blind independence, Hair & the City (he's done a mural in there before), You do a lot of walking in this job, Kids get out of the water then we tell them about the rats, You really know your own beat, Being able to help people in a way that wouldn't be possible otherwise, its about managing expectations.

Tom & Donna's Walk (marked in green)
Over railway lines, report stolen motorbike, brick arch, dog shit is horrible, kids from rival estates, I worry about the vulnerable, its lovely getting to know people, community events are really good, we try and talk to as many people as possible, I hope this is helpful, theres a place I want to show you, get the elderly out and about, how do we make sure everybody can be involved, massive wasps nest in cornette sq, there's been talk of street parties, youth sitting on walls, getting moved on, lots of hanging about, where else are they going to go, you can never please people, Evon at the hatclif, the amount o people you see running for a bus, benches are a focus - thats why they sometimes get removed, Buster the boxer dog is over there, if you look at that it makes your eyes go funny, bubbling on flamstead estate, I love it, sold the riverside dream, its all about playing statistics.

Map: Comments on markers below

The Needle Factory

The Needle Factory was a collaborative research project performed at The Foundry Gallery, Lewes, East Sussex, UK: an exploration of morphing the body into the digital realm, utilising the computer-vision techniques of blob detection and vector path analysis, with kaleidoscopic imagery reacting in real time to performance.
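Blob detection of the kind used here can be illustrated with a minimal connected-components pass over a binary image (a plain-Python sketch for clarity; the installation itself worked on live video with real-time vision libraries):

```python
def find_blobs(image):
    """Label 4-connected blobs of 1s in a binary grid; return a list of
    blobs, each a list of (row, col) pixel coordinates."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for r in range(h):
        for c in range(w):
            if image[r][c] and not seen[r][c]:
                # Flood-fill from this unvisited foreground pixel.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and image[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

# A thresholded camera frame stands in for the performer's silhouette.
frame = [[0, 1, 1, 0],
         [0, 1, 0, 0],
         [0, 0, 0, 1]]
blobs = find_blobs(frame)  # two blobs: one of 3 pixels, one of 1
```

Each blob's pixel list can then feed the vector path analysis, e.g. by tracing its outline or tracking its centroid frame to frame.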

A collaboration between: Mike Bisho (sound), Fiona Geilinger (performance), Tom Keene (reactive video imagery), Jeremy Radvan (video imagery).

Uncertain Substance

Uncertain substance: The Viterbi Algorithm
A speech recognition algorithm searches radio waves for conversations about money. An ongoing investigation of the Viterbi algorithm, this project seeks to understand the agency of a mathematical entity that operates as a structural thread within the fabric of contemporary society.

Conceived in 1966, the Viterbi algorithm was originally used for digital signal processing, where it detects and corrects errors in digital codes. Its use has subsequently extended to speech recognition, DNA analysis, video encryption, and deep-space and wireless communications systems. Physical manifestations of this algorithm exist as microchips installed in billions of mobile devices worldwide, enabling communications networks to permeate every conceivable space and blurring the distinction between home, work and social environments.

Used to identify patterns and trends in human behaviour, the Viterbi plays a role in automated systems that interpret, record and report on human activity. These systems increasingly make economic decisions, govern responses to crime, disaster and health, and manage the everyday flow of cities. The Viterbi operates at a deep social level, constructing new sets of social relations and radically shaping the development of our cities.
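As background, the algorithm itself is a short piece of dynamic programming that recovers the most probable sequence of hidden states behind a series of observations. A minimal sketch, using a standard toy weather model for illustration rather than anything embedded in a modem or speech engine:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state path for an observation sequence."""
    # Each table entry maps a state to (probability of best path, that path).
    V = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        # For each state, keep only the best predecessor path.
        V = {s: max((V[prev][0] * trans_p[prev][s] * emit_p[s][o],
                     V[prev][1] + [s]) for prev in states)
             for s in states}
    return max(V.values())[1]

# Toy model: infer the weather from observed activities.
states = ('Rainy', 'Sunny')
start_p = {'Rainy': 0.6, 'Sunny': 0.4}
trans_p = {'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
           'Sunny': {'Rainy': 0.4, 'Sunny': 0.6}}
emit_p = {'Rainy': {'walk': 0.1, 'shop': 0.4, 'clean': 0.5},
          'Sunny': {'walk': 0.6, 'shop': 0.3, 'clean': 0.1}}
path = viterbi(['walk', 'shop', 'clean'], states, start_p, trans_p, emit_p)
# path == ['Sunny', 'Rainy', 'Rainy']
```

In speech recognition the hidden states are phonemes and the observations are acoustic features, but the search is the same: the single most likely explanation of what was heard.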

Installation Description
I tested two versions of the system: one as an installation in an old porter's office at Goldsmiths University, the other as a mobile version built into a shopping trolley, which I tested at Moving Forest at Chelsea College of Art. The porter's office version displayed two very dull-looking computers, one a speech recognition server (SRS) built around the open source project CMUsphinx, the other a software-defined radio server (SDRS) built around a hacked £10 USB TV tuner. The SRS listened to the audio output of the SDRS; if it detected speech, it would stay on that radio station in the hope of finding a keyword from a list (Money, Credit, Debt, Thousand, Billion, Trillion etc.). If it found no keywords within 20 seconds, it would trigger the SDRS to find another station, where the process would begin again.

The porter's office added its own narrative, which I discovered while cleaning it out and removing years of grime and dumped objects. It recorded a pretty depressing history: old letters of redundancy, a broken pair of spectacles, betting slips, a small screen marked "payroll". I incorporated these elements into the space as a subtle way of illustrating the entanglement of algorithms with everyday lives and other media systems, where algorithmic reporting and profiling informs and influences our decision-making processes. Even though these outputs haven't necessarily been planned or programmed, the technology is exerting its own power, and it's that mechanism that I want to understand.
Further description of the project can be found in an interview I did with Regine Debatty of We Make Money Not Art.

The Build
Radio server, speech recognition server, shopping trolley, CCTV observation screen, receipt printer, speaker, antenna, notes, betting slips, spectacles. For the speech recognition server I used the FLOSS project CMUSphinx; for radio tuning I created a simple software-defined radio from a cheap £10 USB TV tuner.


Uses GQRX (C++), CMU Sphinx (C with a Python wrapper) and Python servers to communicate between components situated on multiple machines. The install process is not for the faint-hearted! Follow the instructions for each of the software packages, then use the scripts below to connect them all together.

Startup script

cd /script/root/dir
nohup gqrx-build-desktop-Desktop_Qt_4_8_1_for_GCC__Qt_SDK__Release/gqrx &
sleep 5
nohup python &
nohup python &

Python voice recognition code using CMU Sphinx

#!/usr/bin/env python
# Tom Keene
# Script evolved from: Carnegie Mellon University.
# You may modify and redistribute this file under the same terms as
# the CMU Sphinx system.  See
# for more information.
# =======TODO=========
# - Check / set current audio model.
# - Create audio model.
# - Auto soundcard swap.
# - Keep limited text
# ====================
import threading
import re
import time
import socket
import sys
import errno
import pygtk
import gtk
import gobject
import pygst
import gst
class DemoApp(object):
    """GStreamer/PocketSphinx Demo Application"""
    def __init__(self):
        """Initialize a DemoApp object (reconstructed: call the init methods)"""
        self.init_client()
        self.init_keywords()
        self.init_gui()
        self.init_gst()
        self.starttimer()
        self.init_timer()
    def init_client(self):
        self.TCP_IP = '' 
        self.TCP_PORT = 50000
        """Found Money Server"""
        self.TCP_IP2 = ''
        self.TCP_PORT2 = 50001
        """Shared Vars"""
        self.BUFFER_SIZE = 1024
        self.MESSAGE="Change Frequency"
    def client_connection(self):
        """Ask the radio server to change frequency."""
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect((self.TCP_IP, self.TCP_PORT))
        s.send(self.MESSAGE)  # "Change Frequency" (reconstructed: MESSAGE was defined but unsent)
        data = s.recv(self.BUFFER_SIZE)
        # print "Received:", data
        s.close()

    def init_timer(self):
        self.myshed = threading.Timer(5.0, self.checkbordem)
        self.myshed.start()
    def checkbordem(self):
        # No keywords heard for 20 seconds (threshold from the project text):
        # ask the radio server to retune, then restart the give-up timer.
        mytimer = self.checktimer()
        if mytimer > 20:
            print "Don't understand || Bad reception"
            print "Changing Frequency"
            self.client_connection()
            self.starttimer()
        self.myshed = threading.Timer(2.0, self.checkbordem)
        self.myshed.start()  # Timer(...).start() returns None, so keep the object first
    def checktimer(self):
        return time.time()-self.timer
    def starttimer(self):
        self.timer = time.time()
    def init_keywords(self):
        """Load External File With Keyword List"""
        keywords = open("keywords.txt").read()
        keywords = keywords.replace("\n", '')
        keywords = keywords.replace(' ', '')
        keywords = keywords.upper()
        keywords = keywords.split(',')
        print "KEYWORDS:"
        print keywords
        self.keywords = keywords
    def init_gui(self):
        """Initialize the GUI components"""
        # Setup the window
        self.window = gtk.Window()
        self.screen = self.window.get_screen()
        w = self.screen.get_width()
        h = self.screen.get_height()/3
        self.window.connect("delete-event", gtk.main_quit)
        self.window.set_default_size(w, h)
        self.window.set_usize(w, h) # make window fixed size
        self.window.set_title("<!-----Searching Conversation-----!>")
        vbox = gtk.VBox()  
        # Manage the textarea
        self.textbuf = gtk.TextBuffer()
        self.text = gtk.TextView(self.textbuf)
        # Setup the button
        #self.button = gtk.ToggleButton("Report")
        #self.button.connect('clicked', self.button_clicked)
        #vbox.pack_start(self.button, False, False, 2) # refernce expand, fill, padding

    def init_gst(self):
        """Initialize the speech components"""
        # Set audio source to gconfaudiosrc OR  alsasrc OR pulseaudiosrc OR jacksrc
        self.pipeline = gst.parse_launch('alsasrc ! audioconvert ! audioresample '
                                         + '! vader name=vad auto-threshold=true '
                                         + '! pocketsphinx name=asr ! fakesink')
        asr = self.pipeline.get_by_name('asr')
        asr.connect('partial_result', self.asr_partial_result)
        asr.connect('result', self.asr_result)
        asr.set_property('configured', True)
        bus = self.pipeline.get_bus()
        bus.connect('message::application', self.application_message)
    def asr_partial_result(self, asr, text, uttid):
        """Forward partial result signals on the bus to the main thread."""
        struct = gst.Structure('partial_result')
        struct.set_value('hyp', text)
        struct.set_value('uttid', uttid)
        asr.post_message(gst.message_new_application(asr, struct))
    def asr_result(self, asr, text, uttid):
        """Forward result signals on the bus to the main thread."""
        struct = gst.Structure('result')
        struct.set_value('hyp', text)
        struct.set_value('uttid', uttid)
        asr.post_message(gst.message_new_application(asr, struct))
    def application_message(self, bus, msg):
        """Receive application messages from the bus."""
        msgtype = msg.structure.get_name()
        self.partial = 0
        if msgtype == 'partial_result':
            self.partial_result(msg.structure['hyp'], msg.structure['uttid'])
            #print "Viterbi: Defining most probable sequence"
            self.partial = 1
        elif msgtype == 'result':
            # Print complete message to text box
            hyp = msg.structure['hyp']
            self.final_result(hyp, msg.structure['uttid'])
            self.partial = 0
            searchtext = hyp
            nums = len(hyp.split(" "))
            if nums > 3:  # word-count threshold assumed: longer utterances suggest conversation
                print "Interesting conversation: "+str(nums)+" words"
                print "Continue search on this frequency"
                self.starttimer()  # reset the give-up timer and stay on this frequency
            # Perform keyword search
            for item in self.keywords:
                if searchtext.find(item) > -1:
                    print "!!!!Matched Keyword:"+item
                    """Found Money Server"""
                    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                    s.connect((self.TCP_IP2, self.TCP_PORT2))
                    s.send("Found - "+item)
                    data = s.recv(self.BUFFER_SIZE)
            # Create a new paragraph / divider
            self.textbuf.insert_at_cursor(" | ")
            if self.textbuf.get_char_count() > 1000:  # assumed limit: keep the buffer short (see TODO)
                self.textbuf.set_text("TEXT BUFFER: ")
    def partial_result(self, hyp, uttid):
        """Delete any previous selection, insert text and select it."""
        # All this stuff appears as one single action
        self.textbuf.delete_selection(True, self.text.get_editable())
        ins = self.textbuf.get_insert()
        iter = self.textbuf.get_iter_at_mark(ins)
        self.textbuf.move_mark(ins, iter)
        nums = len(hyp.split(" "))
    def final_result(self, hyp, uttid):
        """Insert the final result."""
        # All this stuff appears as one single action
        self.textbuf.delete_selection(True, self.text.get_editable())
        print " "
        print "Viterbi matched most likely text:"
        print hyp
        print " "
    #def button_clicked(self, button):
    #    """Handle button presses."""
    #    if button.get_active():
    #        button.set_label("Report")
    #        #self.pipeline.set_state(gst.STATE_PLAYING)
    #    else:
    #        button.set_label("Report:2")
    #       # self.pipeline.set_state(gst.STATE_PAUSED)
    #        #vader = self.pipeline.get_by_name('vad')
    #        #vader.set_property('silent', True)

app = DemoApp()
gtk.main()  # start the GTK main loop (reconstructed)

List of keywords to search for in Keywords.txt

account,add,asset,bank, balance,billion,borrow,broke,buy,cash,cheque,check,cheap,cleared,coin,

Quick hack to get GQRX to change radio channels by using key commands

#!/usr/bin/env python
from socket import *     
import os
# Grab name of GQRX window
p = os.popen("xwininfo -root -all | grep ezcap |  awk '{print $1}'")
WINDOWREF = p.readline()
WINDOWREF = WINDOWREF.replace("\n", '')
if WINDOWREF:
   wincommand = 'xvkbd -window '+str(WINDOWREF)+' -text "f"'
else:
   print "No window reference"
   wincommand = 'xvkbd -text "No available window"'
##let's set up some constants
HOST = ''    #we are the host
PORT = 50000    #arbitrary port not currently in use
ADDR = (HOST,PORT)    #we need a tuple for the address
BUFSIZE = 4096    #reasonably sized buffer for data
# If the port is already open then kill the process using it
command = 'kill -9 $( lsof -i:'+str(PORT)+' -t )'
os.system(command)  # reconstructed: the command was built but never run
## now we create a new socket object (serv)
## see the python docs for more information on the socket types/flags
serv = socket( AF_INET,SOCK_STREAM)    
serv.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)
##bind our socket to the address
serv.bind((ADDR))    #the double parens are to create a tuple with one element
serv.listen(5)    #5 is the maximum number of queued connections we'll allow

print 'listening...'
while True:
   conn,addr = serv.accept() #accept the connection
   print '...connected!'
   print "COMMAND:"+wincommand
   os.system(wincommand)  # send the keypress that steps GQRX to the next frequency
   conn.close()

Python server to receive "Found Money" Notifications

from socket import *
from datetime import datetime
import os
##let's set up some constants
HOST = ''    #we are the host
PORT = 50001    #arbitrary port not currently in use
ADDR = (HOST,PORT)    #we need a tuple for the address
BUFSIZE = 4096    #reasonably sized buffer for data

# If the port is already open then kill the process using it
command = 'kill -9 $( lsof -i:'+str(PORT)+' -t )'
os.system(command)  # reconstructed: the command was built but never run
## now we create a new socket object (serv)
## see the python docs for more information on the socket types/flags
serv = socket( AF_INET,SOCK_STREAM)
##bind our socket to the address
serv.bind((ADDR))    #the double parens are to create a tuple with one element
serv.listen(5)    #5 is the maximum number of queued connections we'll allow

print 'listening...'
while 1:
   conn,addr = serv.accept() #accept the connection
   data = conn.recv(1024) # receive up to 1K bytes
   mytime = str(
   print "          Found Money: "+mytime
   os.system('espeak "'+data+'" ')
   conn.send('Got message')

Exhibited / Performed
6th-8th July 2012: MA Interactive Media Exposition, installed at Goldsmiths University in the janitor's office.
The mobile version was performed at Moving Forest 2012 (see last image in the series below).

Under Their Skin

A collaboration with Mutiny Arts, this artist's commission from Independent Photography brought together two distinct groups within the Greenwich community: a group of elderly residents and a group of young people aged 12-16. With these two groups we produced a short film over a 10-week period in Summer/Autumn 2005.

Under Their Skin - Banner

The group
The young people's group consisted entirely of girls from a local youth club, while the elderly group was gathered from a residential care home.


Viewpoints
A commission to collate images generated throughout the project.
"On October 1st, 2004, 141 children and young people across London all did the same thing: They went out to take photographs of one day in their lives. They had been invited by 16 Children's Fund Partnerships to capture their ViewPoints on digital camera."


Wireless Woodpecker

As a continued investigation into wireless signals, I wrote a wireless sniffer application and created a wireless trigger (moving towards the form of a woodpecker) that responds to wireless signals found in the landscape. The woodpecker taps out the number of wireless signals in the area, serving as a warning or communicating wireless presence.

Names of routers left on their default settings serve as adverts for corporate Internet Service Providers, signals promising freedom: "BTopen", "SKY+". Signals are encrypted, hundreds are marked <hidden>, and the odd home router broadcasts a personal name such as "paul" or "Steves Router". In an exploration of invisible boundaries, I have been testing wireless communication borders, searching for data and network structures, unfolding the wireless landscape. Performing a series of experiments and research tasks that help me understand what wirelessness is, I aim to make the unseen physical, reading software and data as an everyday language that we can unpack to reveal real-world physical power structures.

The Build
The device was built using an old doorbell I found in a skip. The solenoid from the doorbell was controlled by an Arduino board attached to a WiShield (no longer manufactured) purchased from eBay. The intention was for a Perl script running on a laptop or iPhone to search for wireless routers that had been given personal names, then wirelessly trigger the Woodpecker to tap out that a signal had been found. I only managed to partially finish the scripts.
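The sniffer's counting logic can be sketched by parsing scan output of the kind `iwlist` produces and deriving a tap count (a sketch against a canned sample whose format is an assumption; the real build polled live scans and pulsed the solenoid via the Arduino):

```python
import re

def count_networks(scan_output):
    """Count ESSIDs in iwlist-style scan output, splitting named routers
    from hidden ones (broadcast as an empty or \\x00-filled ESSID)."""
    essids = re.findall(r'ESSID:"([^"]*)"', scan_output)
    named = [e for e in essids if e and '\\x00' not in e]
    hidden = len(essids) - len(named)
    return named, hidden

# Canned sample standing in for a live scan.
sample = '''
Cell 01 - ESSID:"BTopen"
Cell 02 - ESSID:"SKY+"
Cell 03 - ESSID:""
Cell 04 - ESSID:"Steves Router"
'''
named, hidden = count_networks(sample)
taps = len(named) + hidden  # the woodpecker taps once per signal found
```

The `named` list is also where the unfinished personal-name search would hook in: filtering for routers like "paul" before triggering a tap.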

London Community Resource Network (LCRN) is a social enterprise charity supporting organisations and communities working to manage resources sustainably, especially through waste prevention, reuse and recycling.

Their membership consists of a diverse array of charities, social enterprises, community organisations, private companies, local authorities, citizens and experts working in different ways to make better use of resources in London. Members are engaged in waste prevention, composting, recycling and reuse of household and municipal waste, and much more.

Role: Full redesign and development of LCRN's website using Drupal, enabling LCRN full control over the content of their website.

Stream Arts