Analog biological backpropagation: a new conjecture, "Self Aware Networks," explains how derivatives and loss functions are represented in the brain.
Also discussed is a comparison between analog and digital computing in the context of computational biology. In this video I read through a recent discussion of my Self Aware Networks notes.
Have you ever wondered how gradient descent could work in the real human brain? Here is a new conjecture about backpropagation, derivatives, and loss functions in the brain, with the aim of connecting them to ideas in Deep Neural Networks.
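For readers coming from the deep-learning side, here is a minimal sketch of the machine-learning concepts named above (a loss function, a derivative, and gradient descent). This is only an illustration of the standard technique, not an implementation of the Self Aware Networks conjecture; the quadratic loss and learning rate are arbitrary choices for the example.

```python
# Minimal gradient descent sketch: minimize a simple quadratic loss
# L(w) = (w - 3)^2 using a finite-difference estimate of dL/dw.

def loss(w):
    return (w - 3.0) ** 2

def numerical_derivative(f, w, eps=1e-5):
    # Central-difference approximation of f'(w).
    return (f(w + eps) - f(w - eps)) / (2 * eps)

w = 0.0    # initial parameter
lr = 0.1   # learning rate
for step in range(100):
    grad = numerical_derivative(loss, w)
    w -= lr * grad  # step downhill along the gradient

print(round(w, 3))  # converges toward the minimum at w = 3
```

In a deep network the same loop runs over millions of parameters, with backpropagation supplying the derivatives analytically instead of by finite differences.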
Watch the video here:
Read along with the video above via the text here:
https://github.com/v5ma/selfawarenetworks/blob/main/02san.md
If you just want the link mentioned inside the video, here it is: