Too Long; Didn't Read
<a href="https://hackernoon.com/tagged/deep-learning" target="_blank">Deep Learning</a> may not deliver the AI revolution you have been led to expect. Is this season of hype simply a repeat of the late-’80s one, just at a much bigger scale? Perhaps Winter is coming again for this Beatles-era technology, based as it is on the Neuroscience of WWII and the Statistical Mechanics of Victorian times. Fortunately, the renewed energy, enthusiasm, and investment in Deep Learning need not go to waste if a new approach can fuse recent knowledge from Computational Neuroscience and Applied Mathematics with the power of today’s GPUs. The <a href="https://arxiv.org/abs/1609.03971" target="_blank">Feynman Machine</a> is both an accurate description of how the brain really works and a blueprint for Machine Intelligence. Combining recent discoveries in the Applied Maths of coupled, communicating, chaotic Dynamical Systems with those in Neuroscience, we formed <a href="https://ogma.ai" target="_blank">Ogma</a> a year ago to turn <a href="https://arxiv.org/abs/1512.05245" target="_blank">theory</a> into <a href="https://github.com/ogmacorp/OgmaNeo" target="_blank">working software</a> and build a foundation for a new AI technology.