The Dream Pattern Generator – Roula, Enrica, Fabiola

:: Concept

The idea is to map your dreams in visual form by generating patterns driven by Delta waves. Delta waves are usually associated with the deep stage 3 of NREM sleep, also known as slow-wave sleep (SWS); they help characterize the depth of sleep and mark an unconscious state. At the end of a sleep cycle, we aim to print out the created artwork on a thermal printer.

:: Inspiration for visual

Although there are countless ways to represent brain activity and thus “dreams”, we chose to draw inspiration from typewriter art because it is a metaphor that balances abstraction and concreteness: it reuses letters and language codes to create images, a process very similar to what our minds do when they combine logic and abstraction, or when a culture uses codes and symbols to represent and describe the world.


:: Code

The drawing tool is programmed as a Random Walker whose movement is modified according to the brainwave values.

Here is a sample screenshot of the outcome.

Screen Shot 2014-04-24 at 8.40.08 PM

 

And here’s a link to the code on Dropbox.

This code currently uses Attention values to generate the drawings. The intention is to apply the same method to the Delta-wave values during sleep.
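
The full sketch is in the Dropbox link above. As a rough illustration of the approach (not the team’s actual code), here is a minimal Processing sketch of a random walker whose step size is scaled by an EEG value; the variable brainValue is a stand-in for the Attention reading (or, during sleep, the Delta value) that would normally be parsed from the headset’s serial stream.

// Illustrative sketch only: a random walker whose step length is scaled by
// an EEG value in the 0-100 range. brainValue is a placeholder that would be
// replaced by the Attention (or Delta) value read from the headset.
float walkerX, walkerY;
float brainValue = 50; // placeholder EEG value, 0-100

void setup() {
  size(400, 400);
  background(255);
  walkerX = width / 2;
  walkerY = height / 2;
  stroke(0, 40);
}

void draw() {
  // simulate a slowly drifting EEG value; a real sketch would update this
  // from serialEvent() instead
  brainValue = constrain(brainValue + random(-2, 2), 0, 100);

  // higher values produce longer, more energetic steps
  float stepSize = map(brainValue, 0, 100, 0.5, 6);
  float newX = constrain(walkerX + random(-stepSize, stepSize), 0, width);
  float newY = constrain(walkerY + random(-stepSize, stepSize), 0, height);

  line(walkerX, walkerY, newX, newY);
  walkerX = newX;
  walkerY = newY;
}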

MindFlex Assignment – Julie & Ayo

Title: Mind Bender

Concept: 

Our concept was to use a clip from the movie A Clockwork Orange in which the protagonist is being brainwashed. We thought it would be a comical parody to reference brainwashing in a 1971 science fiction movie as a comment on brainwave technology. Our goal was to tie the attention variable to a red overlay on the playing movie, so that the more attention you give the movie, the more it evokes red, a color that connotes anger and violence. The more attention you give the movie, the more the images are distorted by the red overlay, censoring the playing movie based on your attention level. We then wanted to control the movie’s frame rate with the meditation level, so the movie plays faster or slower based on how meditative the user is while watching. Attention and meditation thus create a cyclical interaction between the movie and the viewer: the two variables control the movie, but they are also affected by the viewer’s reaction to the playing clip.

However, we experienced technical difficulties receiving a serial read for attention and meditation, so we substituted those values with High Beta and High Alpha. We used the ofMap() function to translate those numbers into usable ranges for the color tint and the frame rate, passing in the lowest and highest values observed for each variable.
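
The project itself runs in openFrameworks and uses ofMap() as described above. Purely as an illustration of the same mapping idea, here is a small sketch in Processing syntax; the raw band range of 0–1,000,000, the placeholder values, and the output ranges are assumptions, not measured limits.

// Illustrative mapping only (the actual project uses openFrameworks' ofMap()).
// High Beta drives the strength of the red overlay; High Alpha drives the
// playback speed. The raw EEG band range used here (0-1000000) is an assumption.
float highBeta = 600000;  // placeholder raw EEG band value
float highAlpha = 250000; // placeholder raw EEG band value

void setup() {
  size(320, 240);
  noStroke();
}

void draw() {
  float redAlpha = map(highBeta, 0, 1000000, 0, 255);      // overlay opacity
  float playSpeed = map(highAlpha, 0, 1000000, 0.25, 2.0); // movie speed factor

  background(128);            // stands in for the current movie frame
  fill(255, 0, 0, redAlpha);  // red overlay, stronger with more High Beta
  rect(0, 0, width, height);

  // a Movie object from the video library would be told to play at playSpeed
  // here, e.g. movie.speed(playSpeed)
  println("red alpha: " + redAlpha + "  speed: " + playSpeed);
}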

Screenshots: 

Effects of the red overlay on movie:

screenshot, screenshot2, screenshot3, screenshot4

Video:

Code:

https://github.com/huynj316/digitalSelf_S14

Meditation Moon demo

We used the data coming in from the meditation channel to manipulate the visual of the moon. More of the moon is revealed as the person’s meditation level increases. The main part of the Processing code is below; the channels, connection light, and graphs are left untouched:

import processing.serial.*;
import controlP5.*;

ControlP5 controlP5;

Serial serial;

float[] intData = new float[11]; // this is where we are going to store the values each frame

Channel[] channels = new Channel[11];
Monitor[] monitors = new Monitor[10];
Graph graph;
ConnectionLight connectionLight;
String serialPort = "/dev/tty.usbmodem1411";

int packetCount = 0;
int globalMax = 0;

int myValue = 0;

String scaleMode;

PImage moon;

void drawStar(float dataValue) {
  // dataValue is the meditation level (0-100); it caps the random ray angle,
  // so higher meditation spreads the rays further around the circle.
  for (int i = 15; i < 360; i += 30) {
    float angle = radians(random(0, dataValue));
    float x = 200/2 + cos(angle) * (200/2) * 0.72;
    float y = 200/2 + sin(angle) * (200/2) * 0.72;

    for (int z = 0; z < 360; z += 30) { // 30 or 120
      float angle2 = radians(z);
      float x2 = 200/2 + cos(angle2) * (200/2) * 0.15;
      float y2 = 200/2 + sin(angle2) * (200/2) * 0.15;
      // this line creates the big star and middle star
      line(x2, y2, x, y);
      float x3 = 200/2 + cos(angle) * (200/2) * 0.85;
      float y3 = 200/2 + sin(angle) * (200/2) * 0.85;
      // this line extends the big star
      line(x, y, x3, y3);
      float x4 = 200/2 + cos(angle2) * (200/2) * 0.88;
      float y4 = 200/2 + sin(angle2) * (200/2) * 0.88;
      // this line closes the extensions
      line(x3, y3, x4, y4);
    }
  }
}

 

void setup() {
// Set up window
size(1024, 768);
frameRate(60);
smooth();
frame.setTitle("Processing Brain Grapher");

moon = loadImage(“moon.jpg”);

// Set up serial connection
println("Find your Arduino in the list below, note its [index]:\n");

for (int i = 0; i < Serial.list().length; i++) {
println("[" + i + "] " + Serial.list()[i]);
}

// Put the index found above here:
serial = new Serial(this, serialPort, 9600);
serial.bufferUntil(10);

// Set up the ControlP5 knobs and dials
controlP5 = new ControlP5(this);
controlP5.setColorLabel(color(0));
controlP5.setColorBackground(color(0));
controlP5.disableShortcuts();
controlP5.disableMouseWheel();
controlP5.setMoveable(false);

// Create the channel objects
channels[0] = new Channel("Signal Quality", color(0), "");
channels[1] = new Channel("Attention", color(100), "");
channels[2] = new Channel("Meditation", color(50), "");
channels[3] = new Channel("Delta", color(219, 211, 42), "Dreamless Sleep");
channels[4] = new Channel("Theta", color(245, 80, 71), "Drowsy");
channels[5] = new Channel("Low Alpha", color(237, 0, 119), "Relaxed");
channels[6] = new Channel("High Alpha", color(212, 0, 149), "Relaxed");
channels[7] = new Channel("Low Beta", color(158, 18, 188), "Alert");
channels[8] = new Channel("High Beta", color(116, 23, 190), "Alert");
channels[9] = new Channel("Low Gamma", color(39, 25, 159), "Multi-sensory processing");
channels[10] = new Channel("High Gamma", color(23, 26, 153), "???");

// Manual override for a couple of limits.
channels[0].minValue = 0;
channels[0].maxValue = 200;
channels[1].minValue = 0;
channels[1].maxValue = 100;
channels[2].minValue = 0;
channels[2].maxValue = 100;
channels[0].allowGlobal = false;
channels[1].allowGlobal = false;
channels[2].allowGlobal = false;

// Set up the monitors, skip the signal quality
for (int i = 0; i < monitors.length; i++) {
monitors[i] = new Monitor(channels[i + 1], i * (width / 10), height / 2, width / 10, height / 2);
}

monitors[monitors.length - 1].w += width % monitors.length;

// Set up the graph
graph = new Graph(0, 0, width, height / 2);

// Set up the connection light
connectionLight = new ConnectionLight(width - 140, 10, 20);
}
void draw() {

//println(currentValue);

//HERE’s the good stuff ========================================//
/*NOTE: In order to see these values being printed, you must have a signal quality better than 200…
this can be changed at the bottom of this page of code here:
if ((Integer.parseInt(incomingValues[0]) == 200) && (i > 2)) {
newValue = 0;
}
*/

for (int i = 0; i < channels.length; i++) {    //loop through all 11 channels
Point tempPoint = channels[i].getLatestPoint(); // grab the latest point from this channel (getLatestPoint is a method of the Channel class)
intData[i] = tempPoint.value; // store the point's value (set on instantiation) in the corresponding intData cell, declared at the top for this purpose
// println("intData[" + i + "]: " + intData[i]); // print out intData[i] to be sure
// println("real data " + intData[2]);
// println("multiplied data " + intData[2]*3.6);
}

//HERE’s the good stuff ========================================//
// Keep track of global maxima
if (scaleMode == "Global" && (channels.length > 3)) {
for (int i = 3; i < channels.length; i++) {
if (channels[i].maxValue > globalMax) globalMax = channels[i].maxValue;
}
}

// Clear the background
background(255);
// Update and draw the main graph
graph.update();
graph.draw();

// Update and draw the connection light
connectionLight.update();
connectionLight.draw();

// Update and draw the monitors
for (int i = 0; i < monitors.length; i++) {
monitors[i].update();
monitors[i].draw();
}

pushMatrix();
scale(0.5, 0.5);
image(moon, 0, 0);
popMatrix();
noFill();
stroke(255);

pushMatrix();
scale(2.5, 2.6);
translate(10, 0);
drawStar(intData[2]);
// println(incomingValues[2]);
popMatrix();

println(intData[2]);
// delay(3000 – intData[2]*30);
}

void serialEvent(Serial p) {
// Split incoming packet on commas
// See https://github.com/kitschpatrol/Arduino-Brain-Library/blob/master/README for information on the CSV packet format

String incomingString = p.readString().trim();
// print("Received string over serial: ");
// println(incomingString);

String[] incomingValues = split(incomingString, ',');

// Verify that the packet looks legit
if (incomingValues.length > 1) {
packetCount++;

// Wait till the third packet or so to start recording to avoid initialization garbage.
if (packetCount > 3) {

for (int i = 0; i < incomingValues.length; i++) {
String stringValue = incomingValues[i].trim();

int newValue = Integer.parseInt(stringValue);

// if(i==2){
// myValue = newValue;
// println(myValue);
// }

// Zero the EEG power values if we don’t have a signal.
// Can be useful to leave them in for development.
if ((Integer.parseInt(incomingValues[0]) == 200) && (i > 2)) {
newValue = 0;
}

channels[i].addDataPoint(newValue);
}
}
}
}
// Utilities

// Extend Processing's built-in map() function to support the Long datatype
long mapLong(long x, long in_min, long in_max, long out_min, long out_max) {
return (x - in_min) * (out_max - out_min) / (in_max - in_min) + out_min;
}

// Extend Processing's built-in constrain() function to support the Long datatype
long constrainLong(long value, long min_value, long max_value) {
if (value > max_value) return max_value;
if (value < min_value) return min_value;
return value;
}

 

 

– Project by Betty, Jagrat, Stephanie, Fei and Maxine

Mindflex 1st Assignment Update – Ashley, Barb, Soo

Using ofSplitString and ofToInt, we were able to parse the Mindflex data into integers. Unfortunately, we were at first unable to get the numbers to update. After reworking the code and reordering the calls, we got a successful data input stream.
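
The team’s parser uses openFrameworks’ ofSplitString() and ofToInt(). For illustration only, the same step written in Processing syntax could look like the sketch below; the 11-value comma-separated packet layout follows the Brain Library CSV format referenced elsewhere on this page, and the example numbers are made up.

// Illustration of the parsing step in Processing syntax (the actual project
// does the equivalent in openFrameworks with ofSplitString() and ofToInt()).
// The packet is assumed to be one comma-separated line of 11 integers:
// signal quality, attention, meditation, then the eight EEG power bands.
int[] channelValues = new int[11];

void parsePacket(String packet) {
  String[] pieces = split(packet.trim(), ',');
  if (pieces.length == channelValues.length) {   // ignore malformed packets
    for (int i = 0; i < pieces.length; i++) {
      channelValues[i] = int(pieces[i].trim());  // string to int
    }
  }
}

void setup() {
  // example packet: quality 0 (good), attention 54, meditation 61, 8 band values
  parsePacket("0,54,61,123456,98765,45678,34567,23456,12345,6789,4321");
  println(channelValues);
}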

Screen Shot 2014-04-24 at 2.16.28 PM

 

Using an if/else-if chain, we broke the data down into the individual channels. Using the meditation channel, we created a program that encourages meditation levels to rise: as the level increases, calming sounds and imagery are added. (See some screen grabs below.)
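
The exact thresholds, imagery, and sounds live in the linked code; the sketch below (again in Processing syntax, with made-up threshold values of 40 and 70) only illustrates the if/else-if idea of adding calming elements as the meditation level rises.

// Threshold illustration only; the real program adds its own imagery and sounds.
// "meditation" stands in for the current value of the Meditation channel (0-100).
float meditation = 0;

void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  meditation = constrain(meditation + random(-1, 2), 0, 100); // stand-in data

  if (meditation < 40) {
    background(40);              // low meditation: dark, bare scene
  } else if (meditation < 70) {
    background(70, 90, 140);     // mid level: a calmer color appears
  } else {
    background(120, 170, 220);   // high level: full calming imagery
    fill(255, 255, 255, 180);
    ellipse(width/2, height/2, 150, 150); // placeholder for the added visuals
    // a calming sound could also be started at this level
  }
}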

Screen Shot 2014-04-24 at 2.07.31 PM

Screen Shot 2014-04-24 at 2.09.42 PM

Screen Shot 2014-04-24 at 2.09.36 PM

Code can be found here.

Mindflex 1st Assignment – Ashley, Barb, Soo

Our first attempt to work with the Mindflex did not turn out well: there was no power feeding into the headset. Our second attempt did not relay any data. After spending some time struggling with things, it became clear that there was a loose wire connection on the Mindflex board. While inspecting the cables, we found that the white data wire inside was hanging on by a thread. Re-soldering this wire fixed the data input issue.

mindflex

IMG_0057

With the brain wave data feeding into Processing (as seen in this test video), we were then able to break the code down. The sound libraries were added to the Processing file, along with a simple wav file; the sound file plays when there is no data input. This file is a work in progress and still needs refinement. The code can be viewed here.
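
The post doesn’t say which sound library was used, so the sketch below is only an assumption-laden illustration using the Minim library and a hypothetical file name, ambient.wav: it plays the sound while no data has arrived recently and pauses it once packets resume.

// Illustration only: play a sound while no serial data is arriving.
// Minim and the file name "ambient.wav" are assumptions, not the team's code.
import ddf.minim.*;

Minim minim;
AudioPlayer player;
int lastPacketTime = 0; // millis() timestamp of the most recent packet

void setup() {
  size(200, 200);
  minim = new Minim(this);
  player = minim.loadFile("ambient.wav");
}

void draw() {
  // if nothing has arrived for 2 seconds, treat it as "no data input"
  boolean noData = millis() - lastPacketTime > 2000;

  if (noData && !player.isPlaying()) {
    player.rewind();
    player.play();
  } else if (!noData && player.isPlaying()) {
    player.pause();
  }
}

// a serialEvent() handler would set lastPacketTime = millis() whenever a
// packet arrives from the Mindflex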

Currently, the OF version is connecting and parsing, but when printed to the console only two digits of each value come through from the string. This code is also a work in progress and can be found here.

 

Digital Taste- Midterm Presentation – Birce/Enrica/Fito

digitaltaste                                                                             

Midterm Presentation of “Digital Taste”

digital self midterm pres2

digital self midterm pres3 

digital self midterm pres4

digital self midterm pres5 

digital self midterm pres6

digital self midterm pres7

digital self midterm pres8

digital self midterm pres9

digital self midterm pres10

digital self midterm pres11

 

Conclusion:

From beginning to end, we were enthusiastic about trying to create digital taste. We ended up building our first prototype with a satisfactory result: we could produce different tastes by changing the voltage (0-5V) applied to the user’s tongue through silver electrodes and a heating pad.

Throughout the whole process, we received varied feedback from the users who experienced the first prototype of “Digital Taste”. Some of them tasted just sour, some tasted sour and spicy, and some experienced only spicy. Not only did we manage to create a digital taste, we also learned that taste is not objective; it is subjective.

In the end, we generated a synthetic salty-sour-spicy taste.

digitaltaste

Final Presentation

Final Peculiar Patterns Presentation Link

Final Prototype:

3-D Sketches of New Prototype:

tray4

tray5

tray1

 

tray3

We decided to fabricate our test receptacle out of acrylic sheets, creating clear tubes to hold the copper sulfate and a black tray to hold the clear test tubes so the chemical reaction can be observed. We chose black acrylic for the tray to frame the view of the test tubes through contrast.

IMG_2871

Custom Laser-Cut Receptacle

IMG_2867

 

Fabricated Acrylic Test Tubes

IMG_2873

Test Tubes in Black Acrylic Tray

IMG_2875

IMG_2874

Circuitry:

IMG_2888

 

Etching Solution for Custom Circuit

IMG_2887

Custom Circuit

We created a circuit with switches and transistors; the switch allows the device to toggle between a learning state and a memory state. We set the voltage to 3V so the chemical reaction does not happen too quickly, which lets us observe the reaction over time. In the future, we will add a “demo” state that raises the voltage to display the reaction immediately.

IMG_2891

The learning state involves electrolysis and the depositing of copper on the contacts, forming a copper connection over time.

The memory state passes a current through the anode and the test electrode. If a connection has formed between the anode and the test electrode, an LED lights up.

IMG_2892

Prototype on LED Platform

IMG_2894

 

Prototype with Light Sensor Trigger on LED Platform

Final Conclusion:

We concluded that our tests were more effective when we created specific controls with our custom receptacle. For the next version, we would like to improve the receptacle design for better viewing of the test tubes. We will also add a “demo” state that increases the voltage to demonstrate an immediate chemical reaction, and we will create a new design with a better light sensor to redirect the light that affects the learning state, making a unique memory deposit based on the analog lighting changes.

BBC Discovery Podcast

This is a wonderfully diverse podcast from the BBC that features some very intriguing questions about the world. You can download episodes and listen to them on your commutes and travels while getting inspired by new, interesting facts about our bodies and our minds. http://www.bbc.co.uk/podcasts/series/discovery/all

Here are a few that I recommend you listen to:

 

Build Me a Brain 01 Jul 13

 

Roland Pease meets scientists in the UK and the US who are hoping to connect cultures of living human neurons to robots to understand how the mind works.

Quorum Sensing 08 Jul 13

Bacteria need to talk to each other to cause disease. The bugs coordinate their assaults on us with a communication system known as quorum sensing. Geoff Watts explores whether we can stop their virulent chatter to prevent the looming antibiotic resistance crisis – the global phenomenon which threatens to leave us defenceless against bacterial infections.

Gut Microbiota 25 Nov 13

Adam Hart discovers the role gut microbes play in our health and development, and asks whether we should consider ourselves an ecosystem rather than an individual.

Self-Healing Materials 09 Dec 13

Quentin Cooper takes a look at the new materials that can mend themselves, from bacteria in concrete which excrete lime to fill cracks, to car paint that heals its own scratches.

Hack My Hearing 10 Mar 14

Aged 32, science writer Frank Swain is losing his hearing. But could he hack his hearing aid to give him supersenses?

Fructose: the Bittersweet Sugar 17 Mar 14

Dr Mark Porter asks is fructose a “toxic additive” or a healthy fruit sugar?