Digital Self Final / MFADT Thesis


The Qualified Life is an embodied dystopian experience that re-imagines the relationship between ourselves and our work, between the Quantified Self and the Quantified Other, and between quantification and qualification.

Imagine arriving at a job interview and having to prove your qualifications to a machine. Would you trust this machine to understand your personality and how you are qualified for the job? How would you feel during this experience, having no knowledge of what data sets the interviewer is using to judge your aptitude for the job?

Taking a critical and satirical approach to the current trends of the Quantified Self and the Internet of Things in product design, this year-long research project explores the ethical quandaries of employee monitoring and risk-management policies in the workplace, as they exist in the form of gamification, pervasive computing technology, and corporate wellness.

FINAL: Feather Skirt – Betty Quinn / Birce Ozkan / Jagrat Desai


This is the first iteration I came up with for Major Studio:

The feather skirt is a smart, fashionable piece that incorporates the technology of a navigational instrument.

This augmented skirt gives the wearer an additional layer of sensing that allows her to find direction by imitating nature. The garment appropriates the migration method of birds, which are known to be excellent navigators: they have built-in biological compasses that tell them which way to fly by detecting variations in the earth’s magnetic field.


The skirt has embedded electronics that include a microcontroller, servo motors, and an electronic compass that measures the earth’s magnetic field; this last component is calibrated to detect north. If the wearer walks toward the correct cardinal direction, an array of feathers connected to the two servo motors rises; otherwise the feathers lower and stop moving. The piece serves as a metaphor, highlighting a method of appropriation in which human nature is enhanced through bio-mimicry.
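To make the behavior concrete, here is a minimal Arduino sketch of the compass-to-servo logic. It is a sketch under assumptions, not the garment's actual code: the post does not name the compass module, so readHeadingDegrees() is a hypothetical placeholder, and the pin, tolerance, and servo angles are illustrative.

#include <Servo.h>

Servo feathers;               // one of the two feather servos
const int SERVO_PIN = 9;      // illustrative pin
const float TOLERANCE = 15.0; // degrees of slack around north

float readHeadingDegrees() {
  // Hypothetical placeholder: return the electronic compass heading
  // (0-360, with 0 = north). The post does not specify the sensor.
  return 0.0;
}

void setup() {
  feathers.attach(SERVO_PIN);
}

void loop() {
  float heading = readHeadingDegrees();
  // Angular distance from north, wrapping around 360 degrees.
  float offNorth = min(heading, 360.0 - heading);
  if (offNorth < TOLERANCE) {
    feathers.write(90);  // walking toward north: raise the feather array
  } else {
    feathers.write(0);   // otherwise: lower the feathers and hold
  }
  delay(50);
}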


For the Digital Self final, Betty, Jagrat and I decided to control the movement of the feathers using the NeuroSky MindWave. The speed and height of the top layer of feathers are determined by the wearer’s attention level.
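A minimal sketch of that attention mapping, assuming the Arduino Brain library (the same one used in the BioLite code later on this page) is decoding the headset stream; the specific pin, angles, and speeds are illustrative values of ours, not the project's:

#include <Brain.h>
#include <Servo.h>

Brain brain(Serial);   // Brain library parses the NeuroSky packets
Servo topFeathers;
int angle = 0;
int dir = 1;

void setup() {
  Serial.begin(9600);
  topFeathers.attach(10);  // illustrative pin
}

void loop() {
  if (brain.update()) {
    int attention = brain.readAttention();          // 0-100
    int maxLift   = map(attention, 0, 100, 10, 90); // feather height
    int stepSize  = map(attention, 0, 100, 1, 5);   // sweep speed
    angle += dir * stepSize;                        // sweep up and down
    if (angle >= maxLift || angle <= 0) dir = -dir;
    topFeathers.write(constrain(angle, 0, maxLift));
  }
  delay(20);
}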

Treading Waves Final – Julie & Ayo

____________________________________________________________________________

CONCEPT:

The goal of this project is to create a setup for constructing emotional signifiers from visual stimuli. The first model uses the emotion of disgust as an example.

Our conceptual foundation for this project is based on several psychological and neurological experiments, from brain-wave monitoring to early experiments in artificial intelligence. For example, the work of Alan Turing, and subsequently the Turing Normalizing Machine (http://we-make-money-not-art.com/archives/2013/09/the-turing-normalizing-machine.php#.U3kbuFhdVxs), greatly influenced us to create emotional signifiers that could one day be mapped to artificial intelligence.

We were also inspired by the Mindflex homework assignment, in which a video clip from the film A Clockwork Orange was distorted by the viewer’s attention and meditation values as they watched.

____________________________________________________________________________

PRECEDENT

The Turing Test

Turing Normalizing Machine

Use of Visual Stimuli in A Clockwork Orange

____________________________________________________________________________

INTRODUCTION

We plan to use the NeuroSky device to read raw data from the subject’s brain. By presenting various visual stimuli, we expect to be able to measure the subject’s rest, excitement, and disgust levels. Each subject will be shown a series of visual cues; by monitoring several subjects, we expect to reach a general (normalized) understanding of the average rest, excitement, and disgust levels associated with the cues presented.

STIMULUS

We would run the experiment in this order (a timing sketch follows the list):

1. Blank image to pick up the signal

2. Image of positive food

3. Blank image

4. Image of negative (moldy) food
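Here is a minimal sketch of how the sequence could be timed, in plain C++; the 10-second windows and the marker format are our assumptions, since the post does not specify durations. Printing a timestamped marker at each transition lets the brainwave recording be aligned with the stimuli afterwards.

#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <vector>

struct Stimulus {
    std::string label; // which image is on screen
    int seconds;       // assumed window length
};

int main() {
    std::vector<Stimulus> sequence = {
        {"blank (signal pickup)", 10},
        {"positive food", 10},
        {"blank", 10},
        {"negative (moldy) food", 10},
    };

    auto start = std::chrono::steady_clock::now();
    for (const auto& stim : sequence) {
        auto elapsed = std::chrono::duration_cast<std::chrono::seconds>(
            std::chrono::steady_clock::now() - start).count();
        // Marker line, merged later with the brainwave log by timestamp.
        std::cout << elapsed << "s STIMULUS: " << stim.label << "\n";
        std::this_thread::sleep_for(std::chrono::seconds(stim.seconds));
    }
}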

____________________________________________________________________________

TESTER


Mock-up of Testing Structure


Testing In Progress

____________________________________________________________________________

BRAINWAVE GRAPH SCREENSHOTS

Using the ThinkGear library created by Akira Hayasaka, we edited the part of the library that parses the data into different channels so that every available channel is parsed out. We then kept the TweenLite add-on, which smooths out the jumpiness of the incoming data, and translated the data into a polyline graph for viewers. The lines represent raw, attention, meditation, delta, theta, high alpha, low alpha, high beta, low beta, high gamma, and low gamma.

Attention and meditation are the yellow and blue lines, and raw is the lime green line running along the top.

The delta, theta, and high and low alpha, beta, and gamma waves are displayed below those, with much larger, arbitrary-unit values.
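As a rough sketch of the per-channel parsing and graphing, assuming the app receives each update as one comma-separated line (the function and layout here are ours, not Akira Hayasaka's library code):

#include "ofMain.h"

// Split one comma-separated update into floats and append one vertex
// per channel to its polyline. channelLines holds one ofPolyline per
// channel (raw, attention, meditation, and the eight band powers).
void addSample(const string& csvLine,
               vector<ofPolyline>& channelLines, float x) {
    vector<string> fields = ofSplitString(csvLine, ",");
    for (size_t i = 0; i < fields.size() && i < channelLines.size(); i++) {
        float value = ofToFloat(fields[i]);
        // Illustrative scaling: attention/meditation run 0-100, but the
        // band powers are far larger, so each channel would need its own
        // input range in practice.
        float y = ofMap(value, 0, 100, (i + 1) * 60.0f, i * 60.0f, true);
        channelLines[i].addVertex(x, y);
    }
}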


We plan to capture the data at each stimulus point to analyze the differences from control state to stimulated state.

____________________________________________________________________________

GITHUB

Link to our openFrameworks sketch for displaying and recording data: Treading Waves Graph

____________________________________________________________________________

CHALLENGES

The following list covers the challenges we encountered or expect to encounter:

  1. Syncing video cues to brain wave data recording

  2. Parsing the quantity of data

  3. Analyzing the RAW data values

  4. Attributing RAW data values to significant emotional states

  5. Desensitization of subjects

  6. Eliminating errant data resulting from muscular activity

____________________________________________________________________________

FUTURE TESTS

In our future tests, we will run the experiment using real food as one variable, and smell alone as another.

____________________________________________________________________________

POWERPOINT PRESENTATION

Treading Waves Power Point

take control :: radiohead installation :: roula + enrica + fabiola :: final project


This project is an interactive installation that meditates on the links between real and digital space in relation to the real and digital self. By creating a link between brainwave biometrics and data, the project redefines what it means to be human from a state submersed in genetic memory to being reconfigured in the electromagnetic field of the circuit, in the realm of media in its visual form.


A clip from the official video for “No Surprises” by Radiohead is modified according to the user’s attention levels. The video shows Thom Yorke drowning. The metaphor is created by linking the attention levels to Thom’s fate through the user’s control over the water level; it is a life-and-death situation. The higher your attention level, the lower the water level, and Thom survives. The lower your attention level, the higher the water rises, and Thom drowns and suffocates, leaving you with interesting facts about media and awareness.
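The core mapping can be sketched in a few lines of C++; the names and the smoothing factor here are illustrative, not the installation's actual code:

// Map attention (0-100) to a water level: high attention drains the
// water, low attention floods the frame. Easing keeps the level from
// jittering with the raw signal.
float waterLevel = 1.0f;  // 1.0 = fully flooded, 0.0 = drained

void updateWaterLevel(int attention) {
    float target = 1.0f - attention / 100.0f;    // inverse relationship
    waterLevel += 0.05f * (target - waterLevel); // ease toward the target
}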


Here is a link to a Dropbox folder that includes the openFrameworks code as well as video documentation.


UV_TASTE / fito_segrera

UV_TASTE is a wearable mouthpiece that takes UV light and translates it into taste on the tongue of the user. Ultraviolet radiation is a phenomenon that occurs outside the natural range of human perception; it manifests as a wave whose frequency is above that of violet, the highest frequency perceived by the human eye. This object augments the human body and overlays a new sensory input on the user: it allows him or her to perceive the unperceived, and it raises awareness of an invisible natural phenomenon that affects our lives without being noticed.
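A rough sketch of that translation as an Arduino loop, assuming an analog UV sensor and a PWM-driven stimulation circuit for the tongue electrodes; the pins, ranges, and cap are hypothetical, since the post does not document the actual hardware:

const int UV_SENSOR_PIN = A0;  // hypothetical analog UV sensor
const int ELECTRODE_PIN = 5;   // hypothetical PWM output to the electrodes

void setup() {
  pinMode(ELECTRODE_PIN, OUTPUT);
}

void loop() {
  int uv = analogRead(UV_SENSOR_PIN);  // 0-1023
  // Stronger UV -> stronger taste stimulus, capped for comfort.
  int intensity = map(uv, 0, 1023, 0, 180);
  analogWrite(ELECTRODE_PIN, constrain(intensity, 0, 180));
  delay(50);
}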


BioLite – Maxine and Stephanie


BioLite is a visualization of neurofeedback. Laser-cut from acrylic, the lamp is modeled after the traditional wave formation of EEG data, in this case meditation levels. The LED inside the lamp changes color and fade speed based on varying levels of attention.

Watch our final concept video here: https://vimeo.com/95454375

This is the code we used:

// This is a mash-up of code from the BrainSerialTest that we used in class, code from Kyle Li to break up the CSV, and code from Julie Huynh to assign the serial output to a PWM and thus connect it to the LED.

#include <Brain.h>

const int ledPin = 6; // the pin that the LED is attached to
int incomingByte;     // incoming serial byte; becomes the LED's PWM level

Brain brain(Serial);  // the Brain library parses the MindFlex packets on Serial

void setup()
{
  Serial.begin(9600); // begin serial communication
  pinMode(ledPin, OUTPUT);
  analogWrite(ledPin, 255);
}

void loop()
{
  if (brain.update()) {
    // Split the CSV packet by hand. The Brain CSV is
    // "signal,attention,meditation,delta,...", so after skipping the
    // field before the first comma: arrays[0] = attention,
    // arrays[1] = meditation, arrays[2] = delta.
    String temps = brain.readCSV();
    String arrays[3];
    int index1 = -1;
    int index2;
    for (int i = 0; i < 3; i++) {
      index1 = temps.indexOf(",", index1 + 1);
      index2 = temps.indexOf(",", index1 + 1);
      arrays[i] = temps.substring(index1 + 1, index2);
    }
    // Print a value for the host program, which echoes a PWM byte back
    // (use arrays[0] if the attention value is the one wanted here).
    Serial.print("data:");
    Serial.println(arrays[2]);
  }

  delay(10); // short delay for a faster response

  // See if there's incoming serial data. (The original `>= 0` was always
  // true, so Serial.read() could return -1 on an empty buffer.)
  if (Serial.available() > 0) {
    // Read the oldest byte in the serial buffer and use it as brightness.
    incomingByte = Serial.read();
    analogWrite(ledPin, incomingByte);
    delay(20);
  }
}

 


The most challenging part of this project was getting the code to work. The hardest step was parsing the serial data in order to create a variable for the ‘attention’ data. The Mind Flex was also not easy to work with. We would love to implement this concept with OpenBCI in the future. We like the idea of embodying the qualified self in an object through neurofeedback visualizations.

There are so many implications of what this kind of data can do for people, both good and bad. Does seeing your level of attention improve it? Or are you really unable to control it? There is a lot of talk of using neurofeedback as behavior therapy. It would be interesting to see a study on how effective the idea really is, or one comparing the attention levels of someone diagnosed with ADHD with those of someone without it.

Final Assignment: Barb, Ashley, & Soo

Our concept for the final assignment was similar to our original plan for the first BCI assignment, which we could not get to work. The new plan also added a physical component to the project. Determined to make it work, we continued to push our skills.

For our first prototyping attempt, we created an openFrameworks program that played a sound file whenever the meditation level reached a certain point. This prototype did not work well and seemed glitchy.

Next we tried the ToneAC library in Arduino. We successfully got the code to run, but it too was spotty and sounded very electronic. Arduino Prototype


Moving forward, we began looking back into Processing libraries, as Processing was simpler to code in. Eventually we agreed it would be easier, and we would have more potential for growth, if we used our own code in openFrameworks. We began by building a simple line that adjusted with the tone and frequency of the wave. A sample video can be found here. The code can be found here. We could not use this for our project because we could not make multiple lines that connected tonally and linearly to the other neural oscillations.


The next step, taken to incorporate all the brainwaves with their individual sound frequencies, was to map their motions to a bouncing ball that plays the equivalent note at each peak. This gave us more of an effect, and we were able to have multiple headset inputs so people could play together. Unfortunately, most of the time it just sounds like dissonant melodies. The code can be found here, and the video here.
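As a rough sketch of the peak-to-note idea (our names and values, not the project's code linked above): each band gets a fixed note, and the note fires whenever the band's value turns from rising to falling, i.e. at a local peak.

#include <cstdio>

// One voice per EEG band: fire the band's note at each local peak.
struct BandVoice {
    float freqHz;     // the note assigned to this band
    float prev = 0.0f;
    bool rising = false;

    void addSample(float value) {
        if (value > prev) {
            rising = true;
        } else if (rising) {                  // just passed a peak
            rising = false;
            printf("play %.1f Hz\n", freqHz); // stand-in for the synth call
        }
        prev = value;
    }
};

int main() {
    BandVoice alpha{440.0f};  // e.g., the alpha band mapped to A4
    float samples[] = {1, 3, 7, 5, 2, 4, 9, 6};
    for (float s : samples) alpha.addSample(s);
}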


I took this project further in my Major Studio collaboration with Gabrielle Patacsil. Gabrielle extended the code while I created hardware hacks to emphasize the mindscape. Here is our documentation and video.

NEURO GAMING… AWESOME!

Last week I was lucky enough to attend the NeuroGaming conference in San Francisco. Not only was I able to listen to the panel discussions, but I also spent a large portion of my time hooked up to OpenBCI.

First, I attended my very first hackathon. I helped make the training software for a classifier, as well as hardware-hack a hex robot for the team to use. The goal was to have the robot move as the user thought of moving right, left, or forward, but the classifier was not completed in time. The end result was the robot moving on EMG readings through the OpenBCI. The team was given an honorable mention. Here is some video documentation taken during the second hackathon at Berkeley: http://youtu.be/Adpo_ZuyIBI

The conference itself was not limited to neurogaming. Representatives from Oculus and many famous game designers attended. I was particularly interested in the therapy-based aspect of the conference, as much of my work revolves around this topic. I took notes and brought back pamphlets of all the cool stuff they had there; it will be accessible on my Google Drive here.

Below are some pictures and a video of some of my own brainwaves from the conference demos. Note how I have a hard time producing an alpha wave.

my brainwaves with OpenBCI

Stephanie + Maxine Final Project Update

We decided to change our original concept from a wearable piece to a lamp. The idea is that the LED fades based on the user’s level of attention. We like the image of someone reading a book or doing work: if their attention wavered, their light source would too. Acting as a manifestation of the user’s internal state, the light could provide critical feedback (the loss of attention, and thus of progress) and perhaps motivate them to behave differently.

While getting an LED to change brightness based on the output of the Mindflex seems relatively simple, it was actually quite challenging to figure out the code to make this work. We tried a multitude of strategies (such as switching from an RGB strip to a single red LED) and enlisted the help of anyone who could point us in the right direction. Special shout-out to Kyle Li and Julie Huynh! They both provided critical pieces of code that made our project work.

Progress documentation:


The LED responds to the different levels that the Mindflex outputs.


The brightness changes with the level of attention.

Here is a video of it working: https://www.youtube.com/watch?v=tau6435PHvg&feature=youtu.be

The final code we used is the same mash-up listed in the BioLite final post above.

Final Project Proposal – Barb, Ashley, Soo

Music has a profound effect on one’s body and psyche. As with any physiological rhythm, a strong beat or counter-rhythm can stimulate a change in tempo. Brainwave Music Therapy (BMT) uses tones that prompt the brain to synchronize its electrical pulses with the pitch of the tones. These changes in electrical pulses can shift the user into meditative, calm, alert, or more sharply concentrated states.

The therapeutic experience relies on repetition and awareness of self. By creating a physical artifact of a person’s brainwave composition, we create an opportunity to revisit the mind and a moment for reflection. The object magnifies the alpha-theta waves in auditory and physical form to encourage a state of relaxation and awareness.
