All posts by Stephanie

BioLite – Maxine and Stephanie

biolite

BioLite is a visualization of neurofeedback. Laser cut from acrylic, the lamp is modeled after the traditional wave formation of EEG data – in this case, meditation levels. The LED inside the lamp changes color and fade speed based on varying levels of attention.

Watch our final concept video here: https://vimeo.com/95454375

This is the code we used:

// This is a mash-up of code from the BrainSerialTest example we used in class, code from Kyle Li to break up the CSV, and code from Julie Huynh to assign the serial output to a PWM pin and thus connect it to the LED

#include <Brain.h>

int lightPin = 0;      // define a pin for the photoresistor
const int ledPin = 6;  // the pin that the LED is attached to
int incomingByte;      // a variable to read incoming serial data into
int brightness = 0;    // how bright the LED is
int fadeAmount = 10;   // how many points to fade the LED by
Brain brain(Serial);

void setup()
{
    Serial.begin(9600);  // begin serial communication
    pinMode(ledPin, OUTPUT);
    analogWrite(ledPin, 255);
}

void loop()
{
    if (brain.update()) {
        //Serial.println(brain.readErrors());
        String temps = brain.readCSV();
        String arrays[3];
        //String[] arrayS = temps.split(",");
        //Serial.print(brain.readCSV());
        int index1 = -1;
        int index2;
        // step through the CSV comma by comma and copy out three fields
        for (int i = 0; i < 3; i++) {
            index1 = temps.indexOf(",", index1 + 1);
            index2 = temps.indexOf(",", index1 + 1);
            arrays[i] = temps.substring(index1 + 1, index2);
        }
        Serial.print("data:");
        Serial.println(arrays[2]);
        //String[] subtext = splitTokens(temps, ",");
    }
    //Serial.println(analogRead(lightPin));
    delay(10); // short delay for faster response

    // see if there's incoming serial data:
    if (Serial.available() > 0) {
        // read the oldest byte in the serial buffer:
        incomingByte = Serial.read();
        //Serial.print("sensor = ");
        //Serial.print(sensorValue);
        //Serial.print("\t output = ");
        //Serial.print(outputValue);
        analogWrite(ledPin, incomingByte);
        delay(20);
    }
}

 

IMG_1928

 

IMG_1930

 

visualizer-screenshot

The most challenging part of this project was getting the code to work; the hardest step was parsing the serial data in order to create a variable for the 'attention' value. The Mind Flex was also not easy to work with. We would love to implement this concept with OpenBCI in the future. We like the idea of embodying the quantified self in an object through neurofeedback visualization.
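If we were doing that parsing again, a small helper like the sketch below might make the step clearer. It is only a sketch: getCSVField is a hypothetical helper of our own (not part of the Brain library), and it assumes the CSV begins with signal strength, attention, meditation, so the attention value would be field 1.

// Hypothetical helper (not part of the Brain library): pull one comma-separated
// field out of the CSV string. Assumes the CSV begins with
// signal strength, attention, meditation, ... so field 1 would be attention.
String getCSVField(const String &csv, int fieldIndex) {
  int start = 0;
  for (int i = 0; i < fieldIndex; i++) {
    start = csv.indexOf(',', start) + 1;  // jump past the next comma
    if (start == 0) return "";            // ran out of fields
  }
  int end = csv.indexOf(',', start);
  if (end == -1) end = csv.length();
  return csv.substring(start, end);
}

// Usage inside loop(), with the same Brain object as above:
//   if (brain.update()) {
//     int attention = getCSVField(brain.readCSV(), 1).toInt();  // 0-100
//   }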

There are so many implications of what this kind of data can do for people, both good and bad. Does seeing your level of attention improve it? Or are you really unable to control it? There is a lot of talk of using neurofeedback as behavior therapy. It would be interesting to see a study on how effective this idea really is, or one comparing the attention levels of someone diagnosed with ADHD to someone without the diagnosis.

 

 

Stephanie + Maxine Final Project Update

We decided to change our original concept of a wearable piece to a lamp. The idea is that the LED fades based on the wearer's level of attention. We like the image of someone reading a book or doing work: if their attention wavers, their light source does too. Acting as a manifestation of the wearer's internal state, the light could provide critical feedback (the loss of attention and thus of progress) and perhaps motivate them to behave differently.

While getting an LED to change brightness based on the output of the Mind Flex seems relatively simple, it was actually quite challenging to figure out the code to make this work. We tried a multitude of different strategies (such as switching from an RGB strip to a single red LED) and enlisted the help of anyone who could point us in the right direction – special shout out to Kyle Li and Julie Huynh!!! They both provided us with critical pieces of code that made our project work.
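For what it's worth, the mapping we were after boils down to something like the sketch below. It is a simplified stand-in, not the code we shipped: it assumes the attention value arrives as a number from 0 to 100, and the fade step and fake test sweep are our own choices for illustration.

// Simplified sketch of the attention-to-brightness idea. 'attention' is
// assumed to be a 0-100 value; here a fake sweep stands in for real data
// so the sketch runs on its own.
const int ledPin = 6;
int currentBrightness = 0;

void setAttentionBrightness(int attention) {
  int target = map(attention, 0, 100, 0, 255);  // scale to the PWM range
  // ease toward the target so the lamp fades instead of snapping
  if (currentBrightness < target) currentBrightness += 5;
  else if (currentBrightness > target) currentBrightness -= 5;
  currentBrightness = constrain(currentBrightness, 0, 255);
  analogWrite(ledPin, currentBrightness);
}

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int fakeAttention = (millis() / 100) % 101;   // stand-in for headset data
  setAttentionBrightness(fakeAttention);
  delay(20);
}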

Progress documentation:

photo 1 copy 2

The LED responds to the different levels that the Mind Flex outputs.

photo 2 copy 2

The brightness increases as the level of attention rises.

Here is a video of it working: https://www.youtube.com/watch?v=tau6435PHvg&feature=youtu.be

The final code we used:

// This is a mash-up of code from the BrainSerialTest example we used in class, code from Kyle Li to break up the CSV, and code from Julie Huynh to assign the serial output to a PWM pin and thus connect it to the LED

#include <Brain.h>

int lightPin = 0;      // define a pin for the photoresistor
const int ledPin = 6;  // the pin that the LED is attached to
int incomingByte;      // a variable to read incoming serial data into
int brightness = 0;    // how bright the LED is
int fadeAmount = 10;   // how many points to fade the LED by
Brain brain(Serial);

void setup()
{
    Serial.begin(9600);  // begin serial communication
    pinMode(ledPin, OUTPUT);
    analogWrite(ledPin, 255);
}

void loop()
{
    if (brain.update()) {
        //Serial.println(brain.readErrors());
        String temps = brain.readCSV();
        String arrays[3];
        //String[] arrayS = temps.split(",");
        //Serial.print(brain.readCSV());
        int index1 = -1;
        int index2;
        // step through the CSV comma by comma and copy out three fields
        for (int i = 0; i < 3; i++) {
            index1 = temps.indexOf(",", index1 + 1);
            index2 = temps.indexOf(",", index1 + 1);
            arrays[i] = temps.substring(index1 + 1, index2);
        }
        Serial.print("data:");
        Serial.println(arrays[2]);
        //String[] subtext = splitTokens(temps, ",");
    }
    //Serial.println(analogRead(lightPin));
    delay(10); // short delay for faster response

    // see if there's incoming serial data:
    if (Serial.available() > 0) {
        // read the oldest byte in the serial buffer:
        incomingByte = Serial.read();
        //Serial.print("sensor = ");
        //Serial.print(sensorValue);
        //Serial.print("\t output = ");
        //Serial.print(outputValue);
        analogWrite(ledPin, incomingByte);
        delay(20);
    }
}

 

 

Final Proposal – Maxine & Stephanie

We are interested in producing a physical output/interpretation of brain waves through a wearable piece, something along the same lines as the GER Mood Sweater – http://sensoree.com/artifacts/ger-mood-sweater/. This garment categorizes different GSR levels and assigns a color to correspond with each. Another similar project is the Intimacy 2.0 dress – http://www.studioroosegaarde.net/project/intimacy-2-0/. This dress becomes transparent based on the heartbeat of the wearer. Is it possible to communicate with someone solely through a 'brain language', much like 'body language'?

5634286849_dd3491dfc4_z

What if you could display different levels of meditation through a similar kind of color indicator system? Would such a system be useful? We are also interested in how this type of wearable could affect your brain waves: would your brain behavior change based solely on the fact that you are wearing the device? We want to explore how these factors feed into each other and what the implications of that are.
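To make the color-indicator idea concrete, a first pass might look like the sketch below. Everything specific in it (the pin numbers, the red-to-blue ramp, the common-cathode RGB LED) is an assumption for illustration, not something we have built.

// Rough sketch of the color-indicator idea: map a 0-100 meditation value
// onto a common-cathode RGB LED, fading from red (low) to blue (high).
// Pin numbers and the red-to-blue ramp are assumptions for illustration.
const int redPin = 9;
const int greenPin = 10;
const int bluePin = 11;

void showMeditation(int meditation) {
  meditation = constrain(meditation, 0, 100);
  int blue = map(meditation, 0, 100, 0, 255);
  analogWrite(redPin, 255 - blue);  // more red when meditation is low
  analogWrite(greenPin, 0);         // unused in this simple ramp
  analogWrite(bluePin, blue);       // more blue when meditation is high
}

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  showMeditation(60);  // stand-in value; real data would come from the headset
  delay(50);
}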

Thoughts on creating a ‘mind’

According to Kurzweil's theory, the brain learns and recognizes patterns, then arranges this information in hierarchies that make it possible to recall 'memories' and make associations in order to understand how language works. But memories have a certain fuzziness to them. What is truth? What are lies? Do these things make up your identity? Somewhere between what you experience in the world and how you act in it is a space for processing information and building knowledge. Memories are our interpretations of our experiences rather than direct representations of our actions. They seem to be represented in the brain as networks of related concepts. One hypothesis is that a healthy memory system copes with massive amounts of information by forming connections between concepts based on associations we've developed through experience (http://musicandmemory.org/). Then, when we're trying to remember a piece of information, our brain pulls up an assortment of associated concepts which 'feel' right, and thus our cognitive experience is embedded in our understanding of the world as related to the stories we create for it. When trying to understand the origin of memories, scientists are led to the topic of synapses and their electrochemical relationships. Thus memory, and therefore language, is regarded as a means of expressing one's identity. There are many languages besides those that are written or spoken. By learning a new language, a person acquires a new way of knowing reality and of passing that knowledge on to others.

It's curious to me that we automatically associate artificial intelligence with replicating the human brain. Isn't there validity in modeling a programming system after a different kind of intelligence? Maybe one that is a hybrid of complex systems of thought that include other species. Thinking in this post-human way, we can start to speculate about memory-enhancement capabilities that go beyond genetic modification and engineering. Only then can the future begin to enter the transhumanist world of cybernetics. Many people (e.g. Ray Kurzweil) believe the absorption of technological devices into our bodies, to improve our day-to-day lives, is an inevitable next step. How would altering ourselves in this way change our identity? And furthermore, how would it affect our connection to other humans?

Building the circuit

We prototyped a GSR sensor using an Arduino and a breadboard. We then tested our code by making an LED light up in response to the sweat on our fingers. Our next step is soldering the aroma USB to the Arduino and having scent be the output.

-Betty and Stephanie

photo

 

Screen Shot 2014-03-13 at 7.52.14 PM Screen Shot 2014-03-13 at 7.51.54 PM

Code:

/*
  Based on the Arduino Graph example, which is in the public domain.
  We modified it to fit our needs.
  http://www.arduino.cc/en/Tutorial/Graph
*/

int led = 3;

void setup() {
  // initialize the serial communication:
  Serial.begin(9600);
  pinMode(led, OUTPUT);
}

void loop() {
  // send the value of analog input 0:
  Serial.println(analogRead(A0));

  if (analogRead(A0) > 30) {
    digitalWrite(led, HIGH);
  } else {
    digitalWrite(led, LOW);
  }

  // wait a bit for the analog-to-digital converter
  // to stabilize after the last reading:
  delay(2);
}
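For the next step, swapping the LED for the scent output, the change might be as small as the sketch below. The dispenser pin and the idea of simply driving it HIGH are assumptions; we have not wired the aroma USB yet.

// Variant of the sketch above for the planned scent output: when the GSR
// reading crosses the threshold, drive a pin that would trigger the aroma USB.
// The pin number and trigger behavior are assumptions; nothing is wired yet.
const int scentPin = 3;
const int gsrThreshold = 30;  // same threshold we used for the LED test

void setup() {
  Serial.begin(9600);
  pinMode(scentPin, OUTPUT);
}

void loop() {
  int gsr = analogRead(A0);
  Serial.println(gsr);        // keep logging for graphing on the computer
  if (gsr > gsrThreshold) {
    digitalWrite(scentPin, HIGH);  // trigger the dispenser
  } else {
    digitalWrite(scentPin, LOW);
  }
  delay(2);
}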

 

 

Inspirational Work

When I attended the 3D Printshow in February in NYC, I was lucky enough to sit in on a lecture given by Manu Mannoor, one of the scientists at Princeton University who created a 3D printed ear that can "hear" radio frequencies beyond the normal human range. Using 3D printing of cells and nanoparticles followed by a cell culture, Mannoor combined a small coil antenna with cartilage, creating what they termed a 'bionic ear'. This type of work really opens the door for future biotechnological projects and reminded me a lot of what we have been discussing in the digital self. The integration of biosensors into the body presents a lot of ethical questions as well.

http://www.medgadget.com/2013/07/princeton-bionic-ear.html

3d-printed-bionic-ear

 

Also at the 3D Printshow, I listened to prosthetic designer Tom Most talk about how 3D printing is revolutionizing traditional orthopedic and prosthetic practices. His methods combine traditional hands-on techniques with digital/3D printing processes, creating a better-fitting socket that is lighter, stronger, and less expensive than one made with traditional methods of fabrication. I am particularly interested in how biosensors can be used with prosthetics.

Betty and Stephanie’s project proposal

digital self midterm

 

Animals emit distinct smells for two reasons:

1) To attract a mate

2) As a form of aggression

We were inspired by these instincts to create a series of fashionable, wearable gloves that emit scents triggered by galvanic skin response. A pungent scent would be released when stress levels are high, as indicated by a GSR sensor.

- The GSR triggers different odors, ranging from bad smells to good smells, depending on your mood when interacting with someone (see the sketch after this list)

- The GSR would be connected to two fingertips while the scent dispenser would be located on the inside of the wrist

- Aesthetically, the gloves would look like the animals they represent
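As a sketch of how the glove logic might work (the pins, thresholds, and the two-scent split are all assumptions at this proposal stage):

// Sketch of the glove behavior described above: two dispenser pins, one for
// a pleasant scent and one for a pungent one, chosen by the GSR reading.
// Pins, thresholds, and the two-scent split are assumptions for illustration.
const int gsrPin = A0;
const int pleasantPin = 5;
const int pungentPin = 6;
const int stressThreshold = 600;  // hypothetical "stressed" reading
const int calmThreshold = 300;    // hypothetical "calm" reading

void setup() {
  pinMode(pleasantPin, OUTPUT);
  pinMode(pungentPin, OUTPUT);
}

void loop() {
  int gsr = analogRead(gsrPin);
  digitalWrite(pungentPin, gsr > stressThreshold ? HIGH : LOW);
  digitalWrite(pleasantPin, gsr < calmThreshold ? HIGH : LOW);
  delay(50);
}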

6a0120a698251e970b01348668c1f9970c-800wi

DIY GSR sensor-

http://makezine.com/projects/the-truthmeter-2/

Hello!

Hello – my background is in costume design for film, television, and theater. Even though a large portion of this field resides in the tactile production of the costumes themselves, there is an equally important role for the research that goes into the design process. A costume designer must understand both the objective and the subjective factors that determine why people clothe themselves as they do, in order to fully develop the possibilities presented by the characters in any given situation. This understanding comes from observation and research, and shares a lot of similarities with the human-centered design process.

My intention since the beginning of the program has always been to focus on wearable technology. Having spent quite some time bouncing around various ideas, I realized that through wearable tech I would be able to bring together all my skills and interests in one field. I hope to concentrate my research in this class on the future of prosthetics, and in doing so reframe the very definition of the term. I am particularly interested in the intersection between the body and the artificial, and the implications of design that merges the two.

- stephanie

image