Our concept for the final assignment was similar to our original plan for the first BCI assignment, which we could not get to work. This new plan also added a physical component to the project. Determined to make it work, we continued to push our skills.
For our first prototyping attempt, we created an openFrameworks program that played a sound file whenever the meditation level reached a certain threshold. This prototype did not work well and seemed glitchy.
Next we tried the toneAC library in Arduino. We successfully got the code to run, but it too was spotty and sounded very electronic. Arduino Prototype
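For reference, a hypothetical reconstruction of what that Arduino prototype looked like, assuming a meditation value arriving one byte at a time over serial (the pin wiring is fixed by the library: toneAC uses Timer 1, so it drives pins 9 and 10 on an Uno). The raw square wave toneAC produces is part of why it sounded so harsh and electronic:

```cpp
#include <toneAC.h>

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    int meditation = Serial.read();          // one byte per reading, assumed 0-100
    // Map the meditation level onto an audible pitch range (A3-A5).
    unsigned long freq = map(meditation, 0, 100, 220, 880);
    toneAC(freq, 10, 50, true);              // volume 10, 50 ms, non-blocking
  }
}
```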
Moving forward, we began looking back into Processing libraries, since Processing was simpler to code in. Eventually we agreed that it would be easier, and that we would have more potential for growth, if we wrote our own code in openFrameworks. We began by building a simple line that adjusted with the tone and frequency of the wave. A sample video can be found here. The code can be found here. We could not use this for our project because we could not make multiple lines that connected, tonally and linearly, to the other neural oscillations.
The next step toward incorporating all the brainwaves, each with its own sound frequency, was to map their motions to a bouncing ball that plays the equivalent note at each peak. This gave us more of an effect, and we were able to take multiple headset inputs so people could play together. Unfortunately, most of the time it just sounded like dissonant melodies. The code can be found here, and the video here.
I took this project further in my Major Studio collaboration with Gabrielle Patacsil: she extended the code while I created hardware hacks to emphasize the mindscape. Here is our documentation and video.