A hackathon project
A couple of months ago I bought an EEG headset from Neurosky. It’s a wearable for your head, and no, it doesn’t look fashionable. But it’s a lot of fun. EEG stands for (take a breath) electroencephalography: a method to measure the electrical activity that your brain generates. Scientists and doctors have been using the technology for almost a century (a pivot from the inventor’s quest for a tool to enable telepathy) to better understand our brains, and recently the technology has hit the consumer market. The most obvious consumer use case for EEG is a headset that helps you improve your meditation skills. But that’s not all. There is a world of possibilities, the most promising one being the ability to convert brainwaves into commands for machines. The fun part is figuring out how to take this a step further and create interesting user experiences.
Linking EEG to audio-visuals
The first idea that popped into my mind after the headset arrived was translating brainwaves into audio-visuals: a sort of head candy where you are immersed in an audio-visual experience that adjusts itself to your state of mind. There was just one problem: I can’t really code. So as an easy intro, I read books and articles about EEG and used the headset on myself to get a sense of the data. Translating that data into action with my newbie coding skills, however, proved to be a Herculean task. I gave up.
The thing with fun ideas is that they stick around in the back of your head waiting for an opportunity to reemerge. The Fusehack hackathon in Amsterdam was that opportunity. Around 200 gifted developers would gather to build cool stuff, and this was my only chance of making even the slightest progress in exploring this idea. So there I stood in a big hall, holding the headset high in the air and pitching the possible use cases as loud as I could. About fifteen people heard me yelling, and at the end four decided to stay: three full-stack developers and a physicist/machine learning enthusiast. Team mindKraft (Daniel Jansen, Edwin van Manen, James Rampersad, Joseph Siu and myself) was born.
This is what we built:
Playing Minecraft with your mind
We started brainstorming about the connection with audio-visuals but quickly realized that we had a content problem: there was no way to go beyond some simple sounds and color variations responding to changes in brainwaves. An Oculus Rift was available, so we played around with it for inspiration, and soon we had our content: Minecraft for Oculus. Wouldn’t it be cool if you could move around the game using your thoughts? You get left, right and backward simply by turning your head; the only command you need to move in all 360 degrees is forward. The EEG headset’s API could measure focus, so we linked that to the keystroke w (commonly used in games to move forward). Blinking was another measurement, and we used it as a command to stop moving in the game by ‘releasing’ the w keystroke. On top of this, we used machine learning (James pulled an all-nighter for this) to distinguish between left and right arm movements. Unfortunately, we did not have enough time to implement this in the prototype.
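The focus-to-keystroke mapping can be sketched in a few lines. This is a minimal illustration, not our hackathon code: the threshold values and the exact field names (`eSense`, `attention`, `blinkStrength`, loosely modeled on Neurosky’s JSON output) are assumptions for the sake of the example.

```python
# Illustrative sketch: decide whether the 'w' key should be held,
# based on a single Neurosky-style reading. Thresholds are assumptions.
FOCUS_THRESHOLD = 60   # attention level (0-100) above which we "press" w
BLINK_THRESHOLD = 50   # blink strength above which we "release" w

def update_forward_key(reading, key_pressed):
    """Return the new pressed/released state of the 'w' key.

    `reading` is a dict shaped like the headset's JSON output, e.g.
    {"eSense": {"attention": 72}} or {"blinkStrength": 80}.
    `key_pressed` is whether 'w' is currently held.
    """
    # A strong blink is our "stop" command: release the key.
    if reading.get("blinkStrength", 0) >= BLINK_THRESHOLD:
        return False
    # Sustained focus is our "go" command: press (or keep holding) the key.
    attention = reading.get("eSense", {}).get("attention")
    if attention is not None and attention >= FOCUS_THRESHOLD:
        return True
    # No clear signal: keep the current state.
    return key_pressed
```

In the actual prototype the returned state would drive a simulated key press and release, so the avatar walks forward while you focus and stops when you blink hard.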
A promising combination
Using your mind to navigate virtual environments makes a lot of sense. Try to walk around on your own feet with the headset on and you’ll bump into walls. A joystick is an intermediate tool, but it undermines the immersive character of VR, which is its biggest promise. The current technology is not refined enough to fully replace a joystick, but one could imagine other virtual environments (like museums) where simple navigation is enough. That is already within reach.
But it’s not comfortable, yet
Aside from a bunch of technical problems, we also had a comfort issue: both the Oculus Rift and the Neurosky headset need the same real estate, the forehead. Our team member Daniel was the guinea pig because we could just about cram both headsets onto his head and still get a working Bluetooth signal from the EEG headset. The University of Michigan has a more advanced prototype, but without a VR headset: their stronger and thus bulkier EEG headset is not that ‘crammable’. This is a serious limitation, but it can be overcome.
We had a lot of fun and we see great potential in the combination of these two technologies. It’s a logical mix because it solves one of the biggest problems of virtual reality: navigating the environment. What excites us most is that it could be a powerful new tool for disabled people, allowing them to explore virtual environments more easily. Are you a hardware hacker, or do you know one, and think this is cool? Get in touch and help us prototype an EEG headset that can be combined with a VR headset. Leave a comment and we will get in touch 🙂
Think this is cool? Please like this post so we can share it with more people.
This article was written by Amiran Yaghout and published on Medium.com. He has allowed us to publish it here as well.