DIGITAL TASTE – Heating Pad Testing, Stage 1

In this post we are going to explain our heating pad testing. According to the research paper on the “digital taste synthesizer,” heat is very important to the perception and creation of tastes.

foto 3

img above: Arduino UNO and a thermistor

We are programming the microcontroller (Arduino UNO) to control the heating pad.

foto 1

img above: Heating pad
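As a rough sketch of the kind of control code we are starting from (the PWM pin, the MOSFET driver, and the duty cycle below are our placeholder assumptions, not final values):

// Minimal Arduino sketch: drive the heating pad through a MOSFET with PWM.
// Pin 9 and the 50% duty cycle are placeholder assumptions for this test.
const int HEAT_PIN = 9;        // PWM pin driving the MOSFET gate

void setup() {
  pinMode(HEAT_PIN, OUTPUT);
}

void loop() {
  analogWrite(HEAT_PIN, 128);  // roughly 50% duty cycle; the pad still gets hot, hence the thermistor below
}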

After a first test we realized that the heating pad became very hot, so we decided to add a temperature sensor (a thermistor) to measure the variation of the temperature so that we can then work on controlling the heat.

foto 4

img above: setting the Thermistor

The temperature sensor gives us raw numbers that we have to convert into degrees. (We are taking this information from the Wikipedia page on thermistors: http://en.wikipedia.org/wiki/Thermistor)
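A minimal sketch of the conversion, assuming a 10 kΩ NTC thermistor wired as a voltage divider into A0 and the Beta-parameter equation described on that Wikipedia page (the component values and Beta coefficient below are placeholders; the real ones come from the thermistor's datasheet):

// Convert the raw analogRead() value from the thermistor into degrees Celsius.
// Assumed wiring: 10k fixed resistor from 5V to A0, thermistor from A0 to GND.
// The resistor value, nominal resistance, and Beta coefficient are placeholders.
const int   THERM_PIN = A0;
const float SERIES_R  = 10000.0;   // fixed resistor in the divider (ohms)
const float NOMINAL_R = 10000.0;   // thermistor resistance at 25 °C (ohms)
const float NOMINAL_T = 298.15;    // 25 °C in kelvin
const float BETA      = 3950.0;    // Beta coefficient from the datasheet

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(THERM_PIN);                     // 0..1023
  float resistance = SERIES_R * raw / (1023.0 - raw);  // thermistor resistance
  // Beta-parameter equation: 1/T = 1/T0 + (1/B) * ln(R/R0)
  float kelvin = 1.0 / (1.0 / NOMINAL_T + log(resistance / NOMINAL_R) / BETA);
  Serial.println(kelvin - 273.15);                     // degrees Celsius
  delay(500);
}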

 

Raspbian Screen Update

Production update:

The Raspberry Pi arrived with a 7″ LCD and an LCD controller, so at this point the next steps were taken to create the “realistic” screen effect.

Step 5: Load NOOBS onto the Raspberry Pi. This task proved challenging for a beginner. First the software had to be downloaded from the Raspberry Pi website's downloads page. After downloading this disk image, an 8 GB SD card was formatted using SDFormatter. Formatting the card with this tool allowed the Pi to boot off the system copied onto it.

 

Step 6: Configure NOOBS. After placing the SD card in the Pi, the Pi was connected to a laptop's shared network connection through Ethernet and to a TV set through HDMI. Upon boot (see image below), the screen loads a minimal menu. From this screen, the timezone and SD card configurations are set.
photo 2

 

Step 7: Install XBMC. Once configured, XBMC was installed onto the SD card through “sudo” commands in the terminal. Once installed, the SD card was placed back in the Pi. After booting to make sure the media center loaded, a screen recording was taken from a working desktop and loaded onto the Pi through a jump drive. The test video playback can be seen in the video below.
http://youtu.be/FTWUWZ_TwIg

Step 8: Installing the Raspberry Pi into the laptop LCD. The monitor was then taken apart, stripping the shell away from the LCD. The inverter and LCD cables were pulled out and found to be incompatible with the controller purchased. An extensive search for an adapter was made with no luck.
http://youtu.be/-pqjWFVj0h0

Step 9: External monitor fix. Since no fix could be made to get a functioning screen working inside the laptop, an HDMI to DVI converter was used to create this effect on an external monitor. The Pi was then placed inside the laptop. Wires were placed into the laptop in their usual positions to ensure all of the needed connections were made and to give the appearance that the computer was actually running. (See image below.)
2014-03-27 17.12.42

Step 10: Close casing. The casing was then closed and the machine was tested again.

 

Breathing Fans Production Update

Production update:

Since the iMac was unable to stay powered on for more than a few minutes, the workaround was to “fake it,” using the shell and a Raspberry Pi to replicate a screen. The iMac was also heavy and clunky to carry around. After a trip to the Parsons Electronic Green Space, a G4, model A1106 (see pictures below), was acquired. This made for an easier and more compact shell for transportation. It is also more personal to the creative: it is rare today to find someone who does not have an intimate relationship with their laptop.

laptop1  laptop3
laptop2

 

Step 1: Gut the computer's insides. The first step taken was dismantling the laptop and removing all of the components that were not needed.

Before & After
laptop4  laptop5

 

 

Step 2: Mount the Arduino and breadboard. The next step was to mount the Arduino and breadboard into the casing using double-sided foam cushion tape (see image below). To do this, the framing also needed to be removed.

laptop6

 

Step 3: Adding simple components and code tests. The first component added was the small LED that lights the latch (with a 330 Ω resistor). This was programmed to fade in and out using sample code from the LED brightness tutorial (see video 1). The next component added was the fan. The internal fans had four wires and were hard to adapt to what we needed, so an additional fan was added in a temporary placement. The sample code from the blink tutorial was modified and incorporated into the LED breathing code (see video 2, and the sketch after the video links below). LED Test Video 1
http://youtu.be/eNmoN0S8Z9o

Fan Test Video 2
http://youtu.be/tzXMQSFwLqM
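For reference, here is a minimal combined sketch along the lines of what we adapted from the fade and blink examples (the pin numbers and timings below are placeholder assumptions rather than our exact wiring):

// LED “breathing” fade (adapted from the fade example) plus a fan toggled on a
// simple timer (adapted from the blink example). Pins and timings are assumed.
const int LED_PIN = 9;   // latch LED through a 330 ohm resistor (PWM pin)
const int FAN_PIN = 8;   // transistor driving the temporary fan

int brightness = 0;
int fadeStep = 5;
unsigned long lastFanToggle = 0;
bool fanOn = false;

void setup() {
  pinMode(LED_PIN, OUTPUT);
  pinMode(FAN_PIN, OUTPUT);
}

void loop() {
  // breathe the LED in and out
  analogWrite(LED_PIN, brightness);
  brightness += fadeStep;
  if (brightness <= 0 || brightness >= 255) fadeStep = -fadeStep;

  // toggle the fan every two seconds without blocking the fade
  if (millis() - lastFanToggle > 2000) {
    fanOn = !fanOn;
    digitalWrite(FAN_PIN, fanOn ? HIGH : LOW);
    lastFanToggle = millis();
  }
  delay(30);
}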

Step 4: Adding interactivity to the breath. To add the interactivity of the chest expanding on inhale, a stretch sensor (conductive rubber) was used. One end was plugged into power, and the other was plugged into analog pin 1, then through a 330 Ω resistor to ground. The code was then modified again to include the analog read from the stretch sensor and a stretch limit (the exhale start value); a minimal sketch of that logic follows the video link below. The fan was moved to a permanent placement (so the casing could close), and the wires for the stretch sensor were fed through the USB slot.

Stretch Fan Test Video 1
http://youtu.be/Hc9jib5vJbg
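Here is that sketch of the stretch-to-fan logic (the pins and the stretch limit are placeholders; the limit has to be calibrated per wearer, and depending on the wiring the comparison may need to be flipped):

// Breath-driven fan: read the conductive-rubber stretch sensor and run the fan
// while the chest is expanded past a calibrated limit. Pins and limit assumed.
const int STRETCH_PIN = A1;     // stretch-sensor voltage divider (330 ohm to ground)
const int FAN_PIN = 8;          // transistor driving the fan
const int STRETCH_LIMIT = 600;  // exhale start value; calibrate for each wearer

void setup() {
  pinMode(FAN_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int stretch = analogRead(STRETCH_PIN);
  Serial.println(stretch);  // watch the values to pick a sensible limit
  // depending on how the divider is wired, inhaling may raise or lower the
  // reading, so this comparison may need to be reversed
  digitalWrite(FAN_PIN, stretch > STRETCH_LIMIT ? HIGH : LOW);
  delay(50);
}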

Step 5: Adding the fake screen with the Raspberry Pi. The Raspberry Pi has not arrived yet; this will be updated after it arrives.

All code can be found at https://github.com/compagnb/DigitalSelfProject1

Precedents / Fabiola & Ashley

Here are some additional precedents we wanted to share for our project. Just to recap, it's called the “Intimacy Experiment”: we hook two head-mounted cameras up to a pulse sensor. Two strangers approach each other, and the more their heart rates rise, the more pictures are taken.
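As a very rough sketch of the triggering logic (the pulse sensor input, the camera trigger pin, the beat threshold, and the heart-rate-to-interval mapping below are all placeholder assumptions we still have to tune against the real hardware):

// Rough sketch: detect beats from the pulse sensor and take pictures more often
// as the heart rate rises. Pins, threshold, and the mapping are all assumed.
const int PULSE_PIN = A0;        // pulse sensor analog output
const int TRIGGER_PIN = 7;       // pin pulsed to trigger the camera
const int BEAT_THRESHOLD = 550;  // beat-detection crossing point; needs calibration

unsigned long lastBeat = 0;
unsigned long lastShot = 0;
int bpm = 60;
bool aboveThreshold = false;

void setup() {
  pinMode(TRIGGER_PIN, OUTPUT);
}

void loop() {
  int signal = analogRead(PULSE_PIN);

  // crude beat detection: a rising crossing of the threshold counts as a beat
  if (signal > BEAT_THRESHOLD && !aboveThreshold) {
    aboveThreshold = true;
    unsigned long now = millis();
    if (now - lastBeat > 300) {  // ignore crossings faster than 200 BPM
      bpm = 60000 / (now - lastBeat);
      lastBeat = now;
    }
  } else if (signal < BEAT_THRESHOLD) {
    aboveThreshold = false;
  }

  // higher heart rate -> shorter interval between pictures (10 s down to 1 s)
  unsigned long interval = map(constrain(bpm, 60, 160), 60, 160, 10000, 1000);
  if (millis() - lastShot > interval) {
    digitalWrite(TRIGGER_PIN, HIGH);  // short pulse to the camera trigger
    delay(50);
    digitalWrite(TRIGGER_PIN, LOW);
    lastShot = millis();
  }
  delay(10);
}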

The “First Kiss” video went viral these past few weeks, exemplifying the interest in experiments about human emotional connection and proximity.

The “Gender Swap” experiment was also shared quite a bit, showing how augmented reality can demonstrate out-of-body experiences. We are aiming for the same type of setting for our project.

“The Strangers Project” is a project where photographer Richard Renaldi asks strangers to hold hands in pictures, posing as if they were a loving family. The project exemplifies the many ways in which human intimacy can turn from awkward to beautiful quite quickly.

Duracell recently demonstrated how their batteries work in a very smart way, urging people to hold hands in order to close a circuit and warm up their bodies. It shows that people are more willing to touch for warmth than one might expect.

Imponderabilia is a 1977 performance by Marina Abramović and Ulay, pushing the boundaries of touch in a simple and artistic way.

Here is an article describing a simple experiment that could be done to measure the precision of touch. The reason we bring this up is that it gives us another aspect to strongly consider for the project: namely which body part to choose for the experiment, and the difference in sensibility between different body parts.

 

Midterm Presentation | Progress and Revised Proposal | HAL 2.0

Concept

The body is finding it increasingly difficult to match the expectations of technology. HAL 2.0 is an attempt to transgress the boundaries between human and machine by putting them in osmosis.

The idea is to create a connection between human biometrics, by mapping breath, and the machine's “lungs,” hacking into the machine to control its fans.

The metaphor is a direct link between human and machine components that emphasizes this idea of transgression.

Inspiration

Here are some cool precedents that inspired us!

HAL, the supercomputer with human logic, and his reaction to being shut down.

Watch: HAL 9000 vs. Dave – the ontological scene in 2001: A Space Odyssey

Her explores the possible development of our complex, perplexing relationships with our computers, phones, etc.

Artist Björk explores cybernetic emotions in her song “All Is Full of Love”.

Watch the music video

Artistic Precedents

Anaisa Franco

“Anaisa Franco has been creating suspended robotic sculptures that interconnect the physical with the digital in form of animations and intensities, searching for a chemical between materials, using concepts of psychology and dreams she provides an imagination and feelings for the sculptures.” – artist's statement

Watch the interview: The Creators Project

Technological Precedents

Instructables

Create an Apple/Arduino Alert Flag

Run AppleScript by the wave of a hand

Prototype

Part one

By using conductive rubber, we will be able to capture the user's breath intervals as analog data that can then be imported.

Here is a video of a successful test.

Part two

Option 1: Our initial plan was to connect the breath sensor to an Arduino and control the fans of a Mac through AppleScript. After several attempts at CPR on this machine (iMac model A1058), it quickly went into sleep mode.

Option 2: After facing the technical issues with the iMac we opted for another solution.

Simulation of screen: Using a Raspberry Pi we are now working on simulating the functioning of the iMac through its screen.

Here’s a link to some tech research on that

How to Choose an LCD Screen For Your Raspberry Pi Media Panel

Connecting the breath sensor to the computer fans: embedded in the back of the iMac.

Here is our google doc presentation

Project proposals | Explorations into a Digital Self | Brainstorm

Group :  Barbara Compagnoni | Roula Gholmieh | Niamh Parsley

ENVIRONMENTAL GENESIS

Theoretical Precedents | Scientific & Play Studies

Brainball is a game where you compete in relaxation. The players' brainwaves control a ball on a table, and the more relaxed player scores a goal against their opponent.

Beats Down: Using Heart Rate for Game Interaction. Increasing and decreasing heart rate is used for game interaction; the mobile scenario allows the environment to be involved in influencing the heart rate.

Proposal 1 : Heart and Breath

Pulse Sensor SingleScreen Shot 2014-03-21 at 2.18.15 PM

– Heart monitor and breath sensor to capture both natural and accelerated rhythms.

– Visual and audio stimuli are affected by the speed of the user's body, enabling the user's creativity and self-consciousness to generate an “environment”.

Proposal 2 : Brain

– Using the EEG signal from a NeuroSky device to map ‘attention’ and ‘meditation’ levels to create or modify an audiovisual experience.

Proposal 3 : Emotional Stimulus triggering physiological responses

Screen Shot 2014-03-21 at 2.16.47 PM

Here is a more detailed presentation

 

 

 

 

roula gholmieh | intro

roulag

Born on 16.02.1989 in Beirut, Lebanon, I graduated in 2011 from the American University of Beirut with a BA in Architecture & Design and Studio Arts. I have worked in product design, interior design, and architecture since then. My interests now lie in using new media and technology in my creative work to build theoretical and aesthetic discourses around topics and questions such as the real and digital self in relation to real and digital time and space.

website: http://roulagholmieh.squarespace.com

 

DIGITAL TASTE – precedents and inspirations

1- DIGITAL TASTE SYNTHESIZER

Our inspiration is the DIGITAL TASTE SYNTHESIZER, a work by Nimesha Ranasinghe: an object that creates electronic taste sensations in order to explore the possibility of stimulating different taste sensations.

Screen Shot 2014-03-20 at 8.59.36 PM

Goals of the research

  • To develop a ubiquitous electronic taste stimulating system using non-chemical methodologies
  • To determine the parameters for stimulation (change of temperature, magnitude of current, and frequency)
  • To determine the controllability and repeatability of generated taste sensations

Click on the link below to read the full research.

Digital Taste Synthesizer for Ubiquitous Taste Interactions

 

2- DIGITAL LOLLIPOP by Nimesha Ranasinghe (PhD Thesis)

Advisory Committee: Prof. Hideaki Nii, Prof. Ponnampalam Gopalakrishnakone, Prof. Adrian David Cheok, Prof. Ryohei Nakatsu

This object digitally stimulates the sensation of taste through electrical and thermal stimulation.

dtl2

Digital Lollipop (Electric lollipop) utilizes electrical stimulation on the human tongue to simulate different taste sensations. At present, the system is capable of manipulating the properties of electric currents (magnitude, frequency, and polarity: inverse current) to formulate different stimuli. Currently, we are conducting experiments to analyze regional differences of the human tongue for electrical stimulation.

Click on the link to see the research.

digital lollipop research

 

3 – Eye Candy from the Sensory Plasticity project, USB Lolly 

CRI_238062

by Eyal Burstein (Israeli, born 1977), Michele Gauler (German, born 1973), Beta Tank (UK, est. 2007)

“Electrodes on the surface of Eye Candy transmit visual information (uploaded to the device via USB) through the tongue to the brain at the same frequency as the eyes send visual information. The mind decodes the taste of the sweet candy as vivid pictures.”

Eye Candy project

Midterm Precedents

1. BreathingPaints (2010) by Wil Lindsay.
Screen Shot 2014-03-20 at 7.46.01 PM
The work lets users draw and paint on a mirror by blowing on it.

2. Breathalising Games (2011)
breathgames1-300x207
http://www.cs.nott.ac.uk/~jqm/?p=548
A collection of games based on breath interfaces. A new way of looking at old games like ping pong.

3. Inside-out (2011) by Andre Borges
INSIDE-OUTpic-1-1024x573
http://www.saxcretino.com/solo/inside-out-a-sound-art-performance/

The artist, Andre Borges, performs to bring the internal sounds of the human body to our ears. He performs different breathing patterns and captures sounds from his throat that cannot otherwise be heard out loud.

Work in progress / Fabiola & Ashley

We faced some technical problems (surprise, surprise) getting the Pi running on our card and the image set up, but we are going to work on it more during the break. With the great resources available, even though we've never worked with Python or the Pi, I think we should be good and that it will be an exciting experience.

photo (2)

Here are the resources we’ve been using:

https://github.com/sightmachine/SimpleCV/blob/develop/doc/HOWTO-Install%20on%20RaspberryPi.rst

Here is the code we’ve been using to get it running:

$ sudo apt-get install ipython python-opencv python-scipy python-numpy python-pygame python-setuptools python-pip

$ sudo pip install https://github.com/ingenuitas/SimpleCV/zipball/master

$ sudo add-apt-repository ppa:gijzelaar/opencv2.3

$ sudo apt-get update

We need to get a web camera for the Mac, a WiFi dongle, a micro USB cable, and a USB hub.

Update on form factor:

The change of hardware will not affect the form factor of the piece much, luckily. Here is our plan for the user tests:

3 Phase Structure

1) Prepare each subject by having them fill out a short questionnaire that puts them in a headspace conducive to the project. The questionnaires will be filled out together. Strangers will occupy the space together but focus on the prompts given by Fabiola and me rather than on each other.

2) When finished, they will begin the touch experiment. Once the first person starts touching, the 5 minutes begin. They have 5 minutes, perhaps longer, to touch each other and explore the sensations caused by the experiment. This will be documented by video and also through the camera attached to each subject. We are deciding whether we want to put them in a quiet space alone or in a space with a small crowd.

3) We will be introducing strangers of all races, genders, sexualities, etc. Since this is still in a testing phase, we will be looking at how reactions vary when changing these components.

Here is a great article that discusses many of the topics we are concerned with.