Treading Waves Final – Julie & Ayo
____________________________________________________________________________
CONCEPT:
The goal of this project is to create a set-up for constructing emotional signifiers from visual stimuli. The first model uses the emotion disgust as its example.
Our conceptual foundation for this project draws on several psychological and neurological experiments, from brain-wave monitoring to early work in artificial intelligence. For example, the work of Alan Turing and, subsequently, the Turing Normalizing Machine (http://we-make-money-not-art.com/archives/2013/09/the-turing-normalizing-machine.php#.U3kbuFhdVxs) greatly influenced us to create emotional signifiers that could one day be mapped to artificial intelligence.
We were also inspired by the Mindflex homework assignment, in which a clip from the film A Clockwork Orange was distorted according to the viewer's attention and meditation values while they watched the illicit clip.
____________________________________________________________________________
PRECEDENT

The Turing Test

Turing Normalizing Machine

Use of Visual Stimuli in A Clockwork Orange
____________________________________________________________________________
INTRODUCTION
We plan to use the NeuroSky device to read raw data from the subject's brain. By exposing the subject to various visual stimuli, we expect to be able to measure their rest, excitement, and disgust levels. Each subject will be shown a series of visual cues; by monitoring several subjects, we expect to arrive at a general (normalized) understanding of the average rest, excitement, and disgust levels associated with the cues presented.
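As a rough illustration of that normalization step, the sketch below averages each cue's recorded value across subjects. The nested-vector layout, the numbers, and the cue labels are placeholders for illustration, not our actual recording format.

// Sketch: average each cue's readings across subjects to get a
// "normalized" per-cue baseline. Data values here are made up.
#include <iostream>
#include <vector>

int main() {
    // readings[subject][cue] = mean value recorded while that cue was shown
    std::vector<std::vector<float>> readings = {
        {52.0f, 61.5f, 48.0f, 74.0f},   // subject 1
        {49.5f, 66.0f, 51.0f, 70.5f},   // subject 2
        {55.0f, 59.0f, 47.5f, 78.0f}    // subject 3
    };
    const char* cues[] = {"signal check", "positive food", "blank", "negative food"};

    for (size_t cue = 0; cue < 4; ++cue) {
        float sum = 0;
        for (const auto& subject : readings) sum += subject[cue];
        std::cout << cues[cue] << ": " << sum / readings.size() << std::endl;
    }
    return 0;
}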

STIMULUS
We will run the experiment in this order (a minimal timing sketch follows the list):
1. Image to pick up signal

2. Image of positive food

3. Blank Image

4. Image of negative food
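Below is a minimal openFrameworks sketch of how the cue sequence could be timed so that each image is shown for a fixed interval. The image file names, the 10-second duration, and the member variables are assumptions for illustration rather than our actual implementation.

// ofApp.cpp (sketch): cycle through the four stimulus images on a fixed timer.
// Assumes ofApp.h declares: std::vector<ofImage> cues; float cueStart; size_t cueIndex;
#include "ofApp.h"

void ofApp::setup() {
    std::vector<std::string> paths = {
        "signal.jpg", "positive_food.jpg", "blank.jpg", "negative_food.jpg" // placeholder paths
    };
    for (auto& p : paths) {
        ofImage img;
        img.load(p);
        cues.push_back(img);
    }
    cueIndex = 0;
    cueStart = ofGetElapsedTimef();
}

void ofApp::update() {
    const float cueDuration = 10.0f; // seconds per cue (assumed)
    if (ofGetElapsedTimef() - cueStart > cueDuration && cueIndex < cues.size() - 1) {
        cueIndex++;                        // advance to the next cue
        cueStart = ofGetElapsedTimef();    // restart the timer
    }
}

void ofApp::draw() {
    cues[cueIndex].draw(0, 0, ofGetWidth(), ofGetHeight());
}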

____________________________________________________________________________
TESTER

Mock-up of Testing Structure



Testing In Progress
____________________________________________________________________________
BRAINWAVE GRAPH SCREENSHOTS
Using the ThinkGear library created by Akira Hayasaka, we edited the part of the library that parses the data into different channels so that every available channel is parsed out. We kept the TweenLite add-on, which smooths the jumpiness of the incoming data, and translated the data into a polyline graph for viewers. Each line below represents raw, attention, meditation, delta, theta, high alpha, low alpha, high beta, low beta, high gamma, and low gamma.
Attention and meditation are the yellow and blue lines, and raw is the lime green line running along the top.
The delta, theta, and high and low alpha, beta, and gamma waves are displayed below; their values are much larger and harder to interpret directly.
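A stripped-down sketch of the plotting idea, using only core openFrameworks calls: each channel keeps a rolling buffer of values that is redrawn as an ofPolyline every frame. The buffer length, scaling range, and color handling are placeholders, and the ThinkGear parsing itself is omitted here.

// Sketch of the per-channel polyline graph. Values are pushed into a
// rolling buffer as they arrive from the headset and redrawn each frame.
#include "ofMain.h"
#include <deque>

class ChannelGraph {
public:
    std::deque<float> values;
    size_t maxSamples = 512;   // rolling window length (assumed)
    ofColor color;

    void add(float v) {
        values.push_back(v);
        if (values.size() > maxSamples) values.pop_front();
    }

    // Draw the buffer as a polyline within a horizontal band of the window.
    void draw(float yTop, float yBottom, float minVal, float maxVal) {
        ofPolyline line;
        for (size_t i = 0; i < values.size(); ++i) {
            float x = ofMap(i, 0, maxSamples - 1, 0, ofGetWidth());
            float y = ofMap(values[i], minVal, maxVal, yBottom, yTop, true);
            line.addVertex(x, y);
        }
        ofSetColor(color);
        line.draw();
    }
};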


We plan to capture the data at each stimulus point to analyze the differences from control state to stimulated state.
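One way to do that capture is to log every sample with a timestamp and the label of the cue currently on screen, so the recording can later be split into control and stimulated segments. The sketch below assumes a simple CSV file; the file name and field list are placeholders, not our final format.

// Sketch: append each incoming sample to a CSV together with a timestamp
// and the label of the cue currently displayed.
#include "ofMain.h"
#include <fstream>

class DataLogger {
public:
    std::ofstream out;

    void open(const std::string& path) {
        out.open(ofToDataPath(path), std::ios::app);
        out << "time,cue,raw,attention,meditation\n";  // header row
    }

    void log(const std::string& cue, int raw, int attention, int meditation) {
        out << ofGetElapsedTimef() << "," << cue << ","
            << raw << "," << attention << "," << meditation << "\n";
    }
};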
____________________________________________________________________________
GITHUB
Link to our openFrameworks sketch for displaying and recording data: Treading Waves Graph
____________________________________________________________________________
CHALLENGES
The following list outlines the challenges we encountered or expect to encounter:
- Syncing video cues to brain-wave data recording
- Parsing the quantity of data
- Analyzing the raw data values
- Attributing raw data values to significant emotional states
- Desensitization of subjects
- Eliminating errant data resulting from muscular activity
____________________________________________________________________________
FUTURE TESTS
In future tests, we will run the experiment using real food as one variable and smell alone as another.
____________________________________________________________________________
POWERPOINT PRESENTATION
Treading Waves Power Point