Neurotech SF VR Hacknight: Brainduino+WebXR+OculusGo #4
The hacknight will be held on Friday, August 3rd, 2018, at Noisebridge in San Francisco, from 5pm to 9pm. Feel free to arrive or depart at any time during that window, and come work for one, two, three, or four hours.
You can optionally bring your own brain-computer interface (BCI), VR headset (such as an Oculus Go, Mirage Solo, or Vive Focus), and laptop to develop your own WebXR application, and if you like you can test your WebXR application with our Brainduino and our Oculus Go.
The global context: NeurotechX is a global organization of neurohackers with 17 chapters worldwide.
This document concerns the San Francisco chapter of NeurotechX and a second meetup, San Francisco Virtual Reality, founded by Matt Sonic. As an organizer of both meetups, and of meetups at Noisebridge, I have been combining them at Noisebridge because BCI devices are on track to converge with AR/VR devices.
The local objective of the NeurotechSF and San Francisco Virtual Reality meetups is currently to merge brain-computer interfaces with XR (a single API that covers both augmented reality and virtual reality) and AR/VR headsets, and to combine these with machine learning (deep learning artificial intelligence). We may also create 3D point clouds from medical imaging and from the environment around a participant, and correlate BCI data with environment data, including virtual environment data. We also want to create next-generation biofeedback visualizations and next-generation user interfaces that predict a user's intention and emotion at the intersection of AR/VR, biofeedback, and machine learning.
The objective of this new series (this document is about the 4th event in the series, which started in summer 2018) is to connect the Brainduino to the Oculus Go via WebXR.
The goals for this specific hacknight on Friday are:
1. Broadcast our HTML page, with the WebVR scene and the Brainduino EEG live stream, to the open web so we can access it from inside the Oculus Go (see the streaming sketch after this list).
2. Migrate from A-Frame WebVR to the new WebXR API.
3. Build or integrate an FFT into the pipeline, either on the Python side or on the WebXR side, so we can extract the component waves (alpha, beta, theta, delta, and gamma) from the Brainduino signal and display them in VR, or at least create visualizations from the component waveforms (see the band-power sketch after this list).
4. If we have time, explore how to send data into TensorFlow (deep learning) via Keras and how to receive the output back in WebXR (see the Keras sketch after this list).
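For goal 1, here is a minimal sketch, not the actual meetup code, of one way to stream Brainduino samples to any browser that connects, including the Oculus Go browser loading our page. It assumes the third-party `websockets` package (version 10+), and the sample source here is a placeholder rather than real Brainduino serial reads.

```python
# Sketch: relay EEG samples over a WebSocket so a WebVR/WebXR page can read them.
# The sample source below is fake; a real version would read from the Brainduino.
import asyncio
import json
import random

import websockets  # third-party package, assumed version 10+

async def read_sample():
    """Stand-in for reading one EEG voltage from the Brainduino."""
    await asyncio.sleep(0.004)            # ~250 Hz, an assumed sample rate
    return random.uniform(-100.0, 100.0)  # microvolts, placeholder data

async def stream_eeg(websocket):
    """Push a continuous JSON stream of samples to one connected client."""
    while True:
        sample = await read_sample()
        await websocket.send(json.dumps({"uV": sample}))

async def main():
    # Bind to 0.0.0.0 so the page (served separately over HTTP/HTTPS)
    # can reach this stream from the headset on the same network.
    async with websockets.serve(stream_eeg, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```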
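For goal 3, a minimal band-power sketch, assuming we do the FFT on the Python side with NumPy. The band edges are conventional EEG ranges, and the 250 Hz sample rate and the fake input buffer are assumptions for illustration, not Brainduino specifics.

```python
# Sketch: compute average power in the delta/theta/alpha/beta/gamma bands.
import numpy as np

BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta":  (13.0, 30.0),
    "gamma": (30.0, 45.0),
}

def band_powers(samples, sample_rate=250.0):
    """Return average spectral power per EEG band for a 1-D sample buffer."""
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()             # remove DC offset
    spectrum = np.abs(np.fft.rfft(samples)) ** 2   # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(spectrum[mask].mean()) if mask.any() else 0.0
    return powers

# Example: two seconds of fake data with a strong 10 Hz (alpha) component.
t = np.arange(0, 2.0, 1.0 / 250.0)
fake = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
print(band_powers(fake))  # "alpha" should dominate
```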
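For goal 4, a minimal sketch using tf.keras: a tiny model that maps the five band powers to a single score the WebXR page could use to drive a visualization. The architecture, the "relaxation" interpretation, and the training data are placeholders for illustration, not an agreed design.

```python
# Sketch: a toy Keras model from band powers to one output value.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(5,)),   # delta, theta, alpha, beta, gamma
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Placeholder training data: 100 fake feature vectors and fake labels.
x_fake = np.random.rand(100, 5).astype("float32")
y_fake = (x_fake[:, 2] > 0.5).astype("float32")  # pretend high alpha = relaxed
model.fit(x_fake, y_fake, epochs=5, verbose=0)

# The prediction could be sent back over the same WebSocket the EEG stream
# uses, so the WebXR scene can react to it.
score = model.predict(np.random.rand(1, 5).astype("float32"), verbose=0)
print(float(score[0, 0]))
```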
Previous progress: We made significant progress at the last meetup; we were able to make voltages from the skin move objects in WebVR. Watch the video explaining the progress we made at our Sunday, July 29th meetup.
Progress! At NeurotechSF + SF VR Hacknight #3 we made major progress toward the goal of integrating EEG with WebXR and VR via an Oculus Go. As you can see in the video, the actual voltages from the sensors are driving the shapes in our A-Frame application! This took a lot of work and time over many meetups to figure out. Thank you to everyone who has contributed. I feel this was a significant milestone!
https://www.facebook.com/worksalt/videos/2332211350138832/
Announcing the Neurotech SF VR Hack #3: Brainduino WebXR Oculus Go.
It’s free but it’s not a party or a presentation. You come to hack, to work on technology, to do something new, if hacking sounds like fun…