I’ve been obsessed with the human brain.
As a software engineer looking to learn more about this incredible organ, I’ve found myself using the tools I know (code) to try to understand more of what’s going on inside our minds.
My early experiments involved visualizing brainwaves (EEG) in the browser. My thinking was, if I could visualize it, I could better understand it. From there, I could start exploring behavioral experiments. And that is how my journey of connecting the brain to the browser began.
Of all the crazy ideas about the potential uses of brainwaves, I kept going back to the thought of “mind-controlling” stuff. By this, I mean attempting to steer brain frequencies and use these changes to detect intent. For example, it is known that during meditation, the alpha waves produced by your brain increase. In the same way, beta waves are associated with active thinking and concentration.
In my first experiment, I was able to sharpen a blurred image based on concentration levels: the more you focus, the clearer the image gets.
But what if we could use meditation levels to control a sequence of images? Given a video of a flower blooming (starting with a bud), could we make the flower bloom only when we reach a deep state of mindfulness while meditating? That got me thinking: if we could map certain mind states to UI controls on the web, like a video player, that would be a fun experiment.
Let’s go through how we can capture brainwaves, get meditation and attention levels, send the data to the browser, and map it to the playback of a video element. In other words, let’s build a mind-controlled HTML5 video!
- Brain-Computer Interface: NeuroSky MindWave Mobile
- Data Acquisition & Transmission: Bluetooth / Node / WebSockets
- User Interface: Angular / RxJS / HTML5 Video
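Before wiring any of that hardware together, the core mapping is worth sketching on its own. NeuroSky headsets report attention and meditation as "eSense" meters, integers from 0 to 100, and we can project that range onto a video's timeline. Here is a minimal TypeScript sketch of that idea; the helper name and the WebSocket wiring in the comment are illustrative assumptions, not part of any library:

```typescript
/**
 * Map an eSense reading (0–100) onto a playback position in seconds.
 * Hypothetical helper: readings are clamped, then scaled linearly
 * across the video's duration.
 */
function eSenseToCurrentTime(level: number, videoDuration: number): number {
  const clamped = Math.min(100, Math.max(0, level));
  return (clamped / 100) * videoDuration;
}

// In the browser, a WebSocket message carrying the meditation level
// could then drive a <video> element (names here are illustrative):
//
// const video = document.querySelector('video')!;
// socket.onmessage = (msg) => {
//   const { meditation } = JSON.parse(msg.data);
//   video.currentTime = eSenseToCurrentTime(meditation, video.duration);
// };

console.log(eSenseToCurrentTime(50, 120)); // 60 seconds into a 2-minute clip
```

A deeper meditative state thus scrubs further into the blooming footage, which is exactly the flower experiment described above.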
The brain-computer interface