Link to bio, ways to connect, and featured stories
Micah Blumberg, programmer, journalist, founder, author.
Founder
I am the founder of The Self Aware Networks Institute, located at github.com/v5ma (check the wiki for the table of contents). Eventually the Institute will be accessible with VR & AR headsets, and eventually it will be accessible via brain implants that we will build.
v5ma - Overview
Author
I am the author of the upcoming book Self Aware Networks: Neurophysics, Artificial Neurology, and Brain Machine Interfaces.
Self Aware Networks: Neurophysics, Artificial Neurology, Brain Machine Interfaces
Buy Self Aware Networks: Neurophysics, Artificial Neurology, Brain Machine Interfaces on Amazon.com www.amazon.com
Link to bio (About my past work)
Micah Blumberg is a programmer and a journalist.
I’ve been writing code with JavaScript since 2010. I create VR and AR applications with WebXR, A-Frame, Three.js, and…medium.com
Ways to connect
Email: micah@vrma.io vrma.medium.com
Featured stories
WebAR, Wearable Digital Fashion NFT, DNN Shape Completion, DNN Animations, Gan Synthesis, Sensors…
We interviewed Emma-Jane MacKinnon-Lee, the CEO of Digitalax, for Part 2 of our series, and this time we were joined by…medium.com
Digital Fashion, Art, NFT, AR, VR, WebXR, 3D Deep Learning, BCI, and the future native digital…
We interview Emma, the CEO of Digitalax, to learn why NFTs underpin Digital Fashion and Art, and how this ties into WebXR…medium.com
Synaptic unreliability, a foundational concept, found in deep learning, and in computational…
This new research may impact companies like Numenta, Google, Deepmind, Tesla, OpenAI, and the way neural networks are…medium.com
Coronavirus, Covid-19, Sars-CoV-2, Research on the causes and potential therapeutics.
April 2020medium.com
Functional NIRS imaging
This conversation, “Optical Imaging with Kyle E Mathewson”, dived into how FNIRS (optical imaging) technology might detect the firing of a neuron, because the body of the neuron swells when it fires. It also points to how we might detect a disease like Covid-19: the virus SARS-CoV-2 causes vasoconstriction (by degrading ACE2 receptors, largely in the endothelial lining) and thrombosis (blood clots), because the damaged endothelial lining releases blood clotting factors like von Willebrand factor (VWF). Both effects alter the flow of blood, which is what FNIRS and similar technologies detect.
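As a rough illustration of the principle behind that last point (my own sketch, not something from the podcast), FNIRS analysis typically converts changes in detected light intensity into changes in oxy- and deoxy-hemoglobin concentration using the modified Beer-Lambert law. The Python snippet below uses NumPy and illustrative placeholder values for the extinction coefficients, source-detector distance, and differential pathlength factor; a real pipeline would use calibrated values and filtering.

```python
# Minimal sketch of the modified Beer-Lambert law used in FNIRS analysis.
# The extinction coefficients, distance, and DPF below are illustrative
# placeholders, not calibrated values.
import numpy as np

# Two measurement wavelengths (nm) bracketing the isosbestic point of hemoglobin.
wavelengths = [760, 850]

# Illustrative extinction coefficients [HbO, HbR] per wavelength (1/(mM*cm)).
ext = np.array([
    [0.6, 1.5],   # 760 nm: HbR absorbs more strongly
    [1.1, 0.8],   # 850 nm: HbO absorbs more strongly
])

source_detector_distance_cm = 3.0  # typical scalp source-detector separation
dpf = 6.0                          # differential pathlength factor (assumed)

def hemoglobin_changes(delta_od):
    """Convert changes in optical density at the two wavelengths into
    changes in oxy- and deoxy-hemoglobin concentration (mM)."""
    path = source_detector_distance_cm * dpf
    # Solve delta_od = (ext * path) @ [dHbO, dHbR] for the concentration changes.
    return np.linalg.solve(ext * path, np.asarray(delta_od))

# Example: a small rise in optical density at both wavelengths.
d_hbo, d_hbr = hemoglobin_changes([0.012, 0.018])
print(f"dHbO = {d_hbo:.4f} mM, dHbR = {d_hbr:.4f} mM")
```

In a real pipeline these concentration changes would then be filtered and related to the hemodynamic response, which is where the vasoconstriction and clotting effects described above would show up.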
FNIRS Functional Near Infrared Spectroscopy: The Neural Lace Podcast Season 2 Episode 4
This podcast was recorded just over a week before Jonathan Toomim will give a talk about FNIRS at NeurotechX in San…medium.com
Varjo, with the XR-3, has made the first fusion algorithm on top of lidar with video passthrough.
Varjo’s fusion algorithm is combining volumetric capture (lidar) with video, with Nvidia’s deep learning optical flow…medium.com
Here is everything you need to know about the Varjo VR headset, including how its uniquely…
You may have seen in recent news an announcement for the new Varjo VR-1 headset. Well, I’ve gotten my hands on the Varjo…medium.com
Brandon Jones on WebXR Graphics, Oculus Quest, Location Based Social Networking, Pokemon Go, and…
At Oculus Connect 5 I spoke to Brandon Jones from Google about the WebXR spec and standalone VR devices. WebXR is a…medium.com
#2nd GDC Goal Achieved: Oculus Go hands on!
Micah Blumberg writes for Silicon Valley Global News and is reporting from GDC 2018medium.com
John Carmack interviewed at Oculus Connect 4
by Micah Blumberg, Silicon Valley Global Newsmedium.com
Two more episodes of the Neural Lace Podcast:
Business Card
Micah Blumberg, Journalist Researcher, Neurohaxor, at Silicon Valley Global News http://svgn.io medium.com
The research for Neural Lace, also known as Nerve Gear, leads directly into creating artificial cortex and artificial brains. It means Augmented Reality and Virtual Reality without glasses, and it means downloading what you see, taste, feel, and hear so it can be shared with others on a computer, and the ability to upload experiences that other people have shared into your mind via a computer port.
Jules Urbach on RTX & VR, Capture & XR, AI & Rendering, and Self Aware AI.
Urbach on how RTX may vastly improve VR and AR, and on how scene capture with AI may enable turning the world into a CG…medium.com
Addressing criticism for my “Humans are metal robots in a valid sense” story:
I made no claim that an electronic transistor experiences sensations. Going back to Peter Tse, neurons are coincidence…medium.com
Humans are metal robots in a valid sense.
Mankind via neuroscience, computational biology, computational neuroscience and deep learning neural networks has been…medium.com
3D Cross-Hair Convolutional Neural Networks
+ Holographic Medical Imaging Devices + Volumetric Video Rendering + Brain Machine Interfaces (NerveGear) + Deep…medium.com
Cafe X. Creator.
We are the automation of all jobs, the artificial intelligence that will assimilate even humanity, replicating…medium.com
Neural Lace and Deep Learning
Meet Polina Anikeeva, Associate Professor of Materials Science and Engineering at Massachusetts Institute of…medium.com
After a mind-exploding keynote at GTC 2018 I got the chance to ask Jensen Huang, the CEO of Nvidia, a…
I wanted to know about Blockchain, whether it will be necessary for Self Driving Cars to become Self-Aware Networks with a Self-Concept…medium.com
My interview with Jules Urbach the CEO of OTOY at GTC 2018
We talked about some of the big ideas behind the big news about RTX Real Time Ray Tracing, we talked specifically about AI Lighting, AI…medium.com
3 New Medical Imaging Technologies that have Neuroscientists salivating like Pavlov’s dogs.
These three exciting new medical scanning technologies have neuroscientists dreaming about the prospects of next…medium.com
Hack Days in San Francisco
Hack Days is a show that live-streams from San Francisco about machine learning, bots (chatbots), cryptocurrency…medium.com
Death Star Robot: Anonymous Global Warfare. Resistance is futile.
The inevitable & unstoppable future of killer autonomous sentient micro drones, and how to build them, by Micah…medium.com
The NerveGear Show: Neuroscience, Artificial Intelligence, Virtual Reality, Brain computer…
Article by Micah Blumberg, host of the Neural Lace Podcast and the NerveGear Show. Reference link: http://vrma.io medium.com
GTC 2017 GPU Technology Conference, The Neural Lace Podcast #5 Guest Jules Urbach, CEO at OTOY
The Neural Lace Talks is a podcast about Science and Technology.
Main website http://vrma.io. Contact via micah@vrma.io medium.com
The brain as a special kind of hard drive.
June 7th, 2017 Written by Micah Blumberg, Journalist, Neuroscientist by hobby since 2005, Founder of the Neural Lace…medium.com
A guest article:
What EEG Can Bring to Your VR Experience
Article written by Fifer Garbesi for VRMA.io Virtual Reality Media
Regarding the 8th Augmented World Expo AWE2017…medium.com
Another article I wrote on this topic:
Mind Code // Brain Code: Go (Ancient Game) and Alpha Go.
The Neural Lace Journal — Article by Micah Blumberg on June 30th 2017medium.com
The following was written partly in 2017 and partly in the intervening years between 2012 and 2017, so it does not represent my current, up-to-date analysis.
I believe we have never been closer to hacking into the VR system of the brain, so that we can create our own reality at the push of a button.
Have you ever wondered how Neural lace might work? I have some amazing guests talking about it on my podcast, give it a listen if you have time today.
In the 4th episode of the Neural Lace Podcast, I talk to Andre Watson, the CEO of Ligandal, a genetic nano-medicine company developing personalized gene therapies. goo.gl/cgCNwX Watson and I take a deeper dive into the synapse physiology and molecular biological basis of consciousness.
How much do we really need to understand and observe to effectively create neural lace? Andre presents his argument for the biological basis of consciousness.
My new podcast is being recommended by high-level science folks to other high-level science folks, people with letters like Dr., PhD, or MD before or after their names! It is being listened to by executives of major tech companies. I am getting great feedback on the new podcast, and it’s getting global attention: people with five-star professional backgrounds from all over the world, for example from countries like India, Germany, and Japan, are writing to me asking for more things they can read about Neural Lace related to the contents of my podcast. It’s truly a podcast for the Global Silicon Valley community! The frequency at which new people are reaching out to me to talk, and to listen to the podcast, feels very special to me, like a countdown sequence to liftoff: 10, 9, 8, …
The Neural Lace Podcast: Four Episodes have now been published.
The Neural Lace Podcast Playlist Summarymedium.com
The Neural Lace Podcast Playlist Summary
The Neural Lace Talks is a podcast about Science and Technology. I am your host, a journalist who has been a student of…medium.com
The Neural Lace Podcast #4 Guest: Andre Watson
The Neural Lace Talks Host: Micah Blumberg Editor: Adam Alonzimedium.com
The Neural Lace Podcast #3 Guest: Eric Matzner
Listen to The Neural Lace Podcast: Episode 3 Realizing Neural Lace here https://youtu.be/_yKjtTVoVlU medium.com
Neural Lace: AR / VR no glasses
Neural Lace is Augmented Reality and Virtual Reality without glasses. The Neural Lace Podcast is about cutting edge…medium.com
I have recently had to think hard about how much GPU power it might take to read the human mind. The fact that I have a meeting today with a major corporation in which the question will come up is part of why I have been thinking about it. In all honesty, I am not at all certain what the answer should be.
What do you think the hard number is? How much AI am I going to need to learn to read brainwaves as easily as reading a newspaper?
Someone’s answer to this was about the raw complexity of the human brain, with all its synapses and dendrites and connections.
Another person said that we will need quantum computing to save the day, because it’s just too complex otherwise.
Yet neither person really attempted to answer my question: how much GPU power will be necessary to just barely crack the code of the human mind? We don’t need to brute-force our way into every secret (yet); we just need to stick a crowbar in the doors of perception long enough to take a peek. Then the computer can help model the rest in time.
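For a sense of scale, here is one way to structure a back-of-envelope estimate. Every number in it is an assumption chosen for illustration (neuron and synapse counts, average firing rates, FLOPs per synaptic event, and GPU throughput all vary widely depending on who you ask), so treat it as a way to make the question concrete rather than as an answer.

```python
# Back-of-envelope estimate of compute needed to simulate synaptic events.
# Every constant here is an assumption chosen for illustration only.
neurons = 86e9                 # rough human neuron count
synapses_per_neuron = 1e4      # rough average
mean_firing_rate_hz = 1.0      # assumed average firing rate
flops_per_synaptic_event = 10  # assumed cost of one synaptic update

synaptic_events_per_sec = neurons * synapses_per_neuron * mean_firing_rate_hz
required_flops = synaptic_events_per_sec * flops_per_synaptic_event

gpu_flops = 1e14               # assumed throughput of a single modern GPU

print(f"Synaptic events/sec: {synaptic_events_per_sec:.2e}")
print(f"Required FLOPS (whole brain): {required_flops:.2e}")
print(f"GPUs at {gpu_flops:.0e} FLOPS each: {required_flops / gpu_flops:.0f}")

# "Cracking the code" of a small region rather than the whole brain:
fraction_of_brain = 1e-4       # e.g., a few cubic millimeters of cortex, assumed
print(f"GPUs for a {fraction_of_brain:.0e} fraction of the brain: "
      f"{required_flops * fraction_of_brain / gpu_flops:.2f}")
```

Decoding, or "reading", activity is not the same problem as simulating the tissue, so the real requirement could be orders of magnitude smaller or larger; the point of writing it out is that the assumptions become easy to argue about.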
My anticipation is that the information passing through the nervous system will be incredibly sophisticated and complex, but we should be able to build a working model of how our minds work with the research I have planned.
I’m very interested in studying how a tiny portion of brain activity corresponds to activity elsewhere, such as in the environment of the individual being studied.
So in a sense I need AI to study both the person, with a new brain-computer interface I am designing, and the person’s environment and their reactions to that environment: their heartbeat, their eye movement.
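Here is a minimal sketch of what correlating a brain signal against those other measurements could look like, assuming everything has already been recorded and resampled to a shared rate; the sampling rate, signal names, and data are all hypothetical placeholders.

```python
# Hypothetical example: cross-correlate one channel of a brain recording with
# a heart-rate trace, to see at what lag the brain signal best tracks it.
# The same function applies to environmental event markers or eye-movement traces.
import numpy as np

fs = 10.0                      # shared sampling rate in Hz (assumed)
t = np.arange(0, 600, 1 / fs)  # ten minutes of data

rng = np.random.default_rng(0)
brain = rng.standard_normal(t.size)        # placeholder brain channel
heart_rate = rng.standard_normal(t.size)   # placeholder heart-rate trace

def lagged_correlation(x, y, max_lag_s=10.0):
    """Pearson correlation of x against y shifted by each lag, in seconds."""
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    corrs = []
    for lag in lags:
        if lag >= 0:
            a, b = x[lag:], y[:y.size - lag]
        else:
            a, b = x[:x.size + lag], y[-lag:]
        corrs.append(np.corrcoef(a, b)[0, 1])
    return lags / fs, np.array(corrs)

lags_s, corrs = lagged_correlation(brain, heart_rate)
best = np.argmax(np.abs(corrs))
print(f"Strongest brain/heart coupling at lag {lags_s[best]:+.1f} s (r={corrs[best]:+.2f})")
```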
There are two directions for the new brain computer interface.
There are a number of new chips coming to the market. We may or may not gain approval to implant some of these chips into the nostrils of human beings for basic research into the properties of self-awareness, research that could not be conducted on an animal, because in this study we will need the self-reflection of the person involved to give us feedback on a few small parts of the experiment.
Soon after basic research is over, however, we will look at the options for the wireless reading of signals from the thalamus region, and the wireless transmission of signals back into that area.
Regardless of how that particular direction of the research goes, there will be numerous other insights from this extremely detailed study of the human nervous system with its environment.
What I can say is that we will be able to make major advances in medical research.
We will begin to map the information channels like never before, using a computer to model the network of information throughout the nervous system: from the fingers and toes to the brain, the eyes, the voice, and the ears; from how we listen and see to understanding how the metaphors of smell are encoded digitally in networks of neurons.
So we are going to begin to listen in on the complex information patterns of the human nervous system. It is like taking a stethoscope to your brain, to see what patterns are inside it that a computer can understand and translate for us. That is the plan.
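As a toy illustration of that "translate the patterns" step (my own hypothetical sketch, not a description of any existing system), the snippet below extracts simple band-power features from epochs of a recorded signal and trains an off-the-shelf classifier on them. The data is random noise standing in for a real recording, and the sampling rate, band edges, and labels are all assumptions, so the reported accuracy should hover around chance.

```python
# Hypothetical "stethoscope" step: extract band-power features from epochs of
# a recorded brain signal and train a classifier to translate them into labels
# (e.g., which stimulus the person was attending to). Random noise stands in
# for real recordings here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

fs = 250                                           # sampling rate in Hz (assumed)
n_epochs, n_channels, n_samples = 200, 8, fs * 2   # 2-second epochs

rng = np.random.default_rng(1)
epochs = rng.standard_normal((n_epochs, n_channels, n_samples))
labels = rng.integers(0, 2, n_epochs)              # placeholder labels

def band_power(epochs, low_hz, high_hz):
    """Mean spectral power in a frequency band for each epoch and channel."""
    freqs = np.fft.rfftfreq(n_samples, d=1 / fs)
    spectra = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    band = (freqs >= low_hz) & (freqs < high_hz)
    return spectra[..., band].mean(axis=-1)

# Stack alpha (8-12 Hz) and beta (13-30 Hz) power into a feature matrix.
features = np.hstack([band_power(epochs, 8, 12), band_power(epochs, 13, 30)])

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Decoding accuracy on noise (should be ~chance): {scores.mean():.2f}")
```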
If you can help me answer some of these hard questions, if you are up late nights reading the latest papers in computational biology and also totally alert to all the advances in AI coming out of Google via DeepMind Technologies, then I want to talk to you.
I think it takes people who are students of both biology and computer science to understand where the world is going to go next.
If you are interested in discussing the science and technology that might go into building next-generation brain-computer interfaces, please connect with me in these groups:
Nerve Gear Technology Convergence * r/nervegear
Science and Technology Convergence, Web AR, Mobile 6dof VR, Neural Networks, Eye Tracking, Blockchain, Drones, BCI…www.reddit.com
Thank you for reading all this!