Why Neuromorphic Matters: Deep Learning Applications

Andrew Vo (@radna)

DeepTech Advisor

Proceedings of the IEEE, Volume 78, Issue 10, October 1990.

It was in this seemingly unremarkable publication that Carver Mead laid the rudimentary foundations of "Neuromorphic Electronic Systems."

Okay, what in the world are "Neuromorphic Electronic Systems"?

When I first read this seminal paper more than a decade ago, I was a naïve engineering undergraduate at U.C. Berkeley.

What struck me most were these final words by Dr. Mead, a decorated Professor of Engineering and Computer Science at the renowned California Institute of Technology:

"I expect large-scale adaptive analog technology to permit the full utilization of the enormous, heretofore unrealized, potential of wafer-scale silicon fabrication."

Before we dive in, I encourage you to first read the two other articles in this 3-Part Series: The Venture Capital Guide to Machine Learning and How Machine Learning Can Scale Allemansrätten's Effect on the Music Industry.

As you explore this subject, it can be easy to get lost in the forest, so keep this in mind as you're learning about Neuromorphic Computing: it is simply the physical extension of Artificial Intelligence's elusive goal of finding the optimal Representation of Reality.

Okay, let's see what Neuromorphic Computing really means.

An Impending Crisis in Machine Learning

Before we get to Neuromorphic Computing, let's first understand the current state of technology. Almost every computer today is based on the "von Neumann architecture."

Named after one of the greatest scientists of the 20th century, John von Neumann, the von Neumann architecture separates a computer's computational cells (the CPUs) from its memory cells through a communication bus.

Source: Radna Intellectual Ventures; Architecture of von Neumann Computing.
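The bottleneck this architecture creates can be seen even in a toy model: every instruction fetch and every data access crosses the same bus between CPU and memory. Below is a minimal sketch with an invented three-instruction machine (the instruction set and memory layout are illustrative, not any real ISA):

```python
# Toy von Neumann machine: one memory, one bus, one CPU.
# Every instruction fetch AND every operand access crosses the same
# bus -- the origin of the "von Neumann bottleneck."

memory = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
          10: 4, 11: 38, 12: 0}  # program at addresses 0-3, data at 10-12

def run(memory):
    acc, pc, bus_transfers = 0, 0, 0
    while True:
        op, addr = memory[pc]  # instruction fetch: one bus transfer
        bus_transfers += 1
        pc += 1
        if op == "LOAD":
            acc = memory[addr]; bus_transfers += 1   # data read
        elif op == "ADD":
            acc += memory[addr]; bus_transfers += 1  # data read
        elif op == "STORE":
            memory[addr] = acc; bus_transfers += 1   # data write
        elif op == "HALT":
            return acc, bus_transfers

result, transfers = run(memory)
print(result, transfers)  # 42 7 -- seven bus crossings for three useful operations
```

Even this tiny program spends most of its bus traffic just moving instructions and operands back and forth, which is the cost that grows unsustainably as data volumes scale.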

This hardware architecture was ultimately adopted because the silicon (CMOS) fabrication processes used for CPUs and for memory chips are largely incompatible, making it impractical to manufacture both on a single die.

Indeed, the von Neumann architecture has driven the exponential growth of computing known as Moore's Law for the past 50 years. However, this growth is unsustainable.

First, under the von Neumann architecture, power consumption and computing capability trade off against each other: one cannot be improved without sacrificing the other.

Thus, even as transistor sizes have shrunk below 10 nanometers, the power/performance tradeoff imposes a limiting lower bound.

Source: Radna Intellectual Ventures; Moore's Law Correlation with Transistor Downsizing.

Second, there is a physical limit on how small transistors can be manufactured: below roughly 1-3 nanometers, an effect known as quantum tunneling makes them unreliable, and we are fast approaching that limit.

Although Intel and AMD, the two largest CPU manufacturers, have tried to work around this using parallel processing (simply connecting many existing CPUs together), the fundamental von Neumann bottleneck remains.

Finally, the nail in the coffin for Artificial Intelligence is that Machine Learning algorithms frequently move large amounts of data, a power- and memory-intensive task that is not scalable for CPUs and GPUs operating under the von Neumann architecture.

This monumental problem has led to projects like SpiNNaker (Spiking Neural Network Architecture) and Graphcore's graph processors, planting the seeds for Neuromorphic Computing.
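The basic unit that spiking platforms such as SpiNNaker simulate at massive scale is the spiking neuron. A minimal sketch of the classic leaky integrate-and-fire model is below; the parameter values are illustrative, not taken from any particular chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: membrane potential
# integrates input with a leak, and a spike is emitted (and the
# potential reset) whenever it crosses threshold.

def lif_run(input_current, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    v, spikes = 0.0, []
    for t, i in enumerate(input_current):
        v = leak * v + i      # leaky integration of the input current
        if v >= threshold:    # threshold crossing -> spike
            spikes.append(t)
            v = 0.0           # reset membrane potential after a spike
    return spikes

# A constant drive of 0.3 per step makes the neuron fire periodically.
print(lif_run([0.3] * 12))  # [3, 7, 11]
```

Because neurons like this only communicate when they spike, most of the network is silent most of the time, which is what gives spiking hardware its low power draw.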

Neuromorphic Computing for Deep Learning

There are three principal neuromorphic computing architectures.

Type A is the Distributed Neuromorphic Computing Architecture (DNCA), which intersperses computing units ("neurons") and memory units ("synapses") in a distributed, brain-like network.

Type B is the Cluster Neuromorphic Computing Architecture (CNCA), which is motivated by the brain's sensory signals (somatic, tactile, auditory, visual, olfactory, and gustatory), clustering different processing functions according to the Type A architecture.

Type C is perhaps the most promising, as it addresses a fundamental problem of most machine learning agents (including the remarkable GPT-3 NLP agent developed by OpenAI): the capacity of AI for associative learning.

Source: Radna Intellectual Ventures; Architecture of Neuromorphic Computing.

In this Associative Neuromorphic Computing Architecture (ANCA), signals are first processed in specialized regions akin to the Type B architecture. The processed streams are then coupled to one another to construct an associative neural network.

Okay, this all sounds promising.

But how do we actually build a neuromorphic chip?

Albert Einstein was once asked what the secret to his genius was, to which he answered:

"Look deep into nature and you will understand everything better."

Indeed, it was perhaps by looking into nature that Leon Chua, Professor of Electrical Engineering at U.C. Berkeley, whose course I had the privilege of taking, postulated the memristor, an electronic device that mirrors the function of a human neuron's synapse.

Source: Radna Intellectual Ventures; Memristor as a Silicon Synapse, (Top) Physical Structure and Behavior of a Human Neuron Synapse, (Bottom) Physical Structure and Behavior of an Artificial Memristor Synapse with Electron Microscopy.
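What makes a memristor synapse-like is that its resistance depends on the charge that has flowed through it, so its "weight" is set by past activity. A minimal sketch in the spirit of the linear ion-drift model used to describe HP Labs' 2008 device follows; all constants are illustrative, not measured device values:

```python
# Linear ion-drift memristor sketch: the internal state x (0 = fully
# off, 1 = fully on) drifts with the current through the device, and
# the resistance interpolates between R_ON and R_OFF accordingly.

R_ON, R_OFF, K = 100.0, 16000.0, 1e5  # on/off resistance (ohms), drift rate

def step(x, v, dt=1e-3):
    """Advance the internal state x under an applied voltage v."""
    r = R_ON * x + R_OFF * (1.0 - x)        # state-dependent resistance
    i = v / r                               # Ohm's law
    x = min(1.0, max(0.0, x + K * i * dt))  # ion drift shifts the state
    return x, r

x, history = 0.1, []
for _ in range(1000):       # a sustained positive voltage "potentiates"
    x, r = step(x, v=1.0)   # the device: its resistance steadily drops
    history.append(r)

print(history[0] > history[-1])  # True: the synaptic "weight" strengthened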

Long a topic of academic research, memristors have only recently gained attention with the remarkable accomplishments of Deep Learning (see AlphaGo, AlexNet, and Transformers).

As the Director of Tesla's AI Research highlighted, the computational challenges of Deep Learning cannot be solved in the long term by traditional von Neumann computers.

Especially for life-critical applications like Autonomous Driving, Deep Learning in the "cloud" or even through edge computing is not acceptable when reliability is paramount.

Without getting into the technical details of "3D Monolithic Integration," below is an illustrative mapping of an Artificial Neural Network (ANN) to a 3D Neuromorphic Chip using memristors for the "synaptic layer."

Source: Radna Intellectual Ventures; Physical Implementation of a Neuromorphic Chip, (Top Left) Artificial Neural Network at the Software Layer, (Top Right) Neuromorphic Memristor Chip Layout using 3D Monolithic Integration, (Bottom) Illustrative Diagram of Information Flow from Neurons through the Synapses of a Neuromorphic Chip.
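The key operation such a synaptic layer performs is an analog matrix-vector multiply: weights live in the crossbar as conductances, input voltages drive the rows, and by Ohm's and Kirchhoff's laws the column currents are the weighted sums of an ANN layer, with no weights shuttled over a memory bus. A small idealized sketch (the values are illustrative):

```python
# Idealized memristor crossbar: each synaptic weight is one device
# conductance G[i][j]; applying row voltages V yields output currents
# I[i] = sum_j G[i][j] * V[j], i.e. the layer's matrix-vector product.

G = [[0.2, 0.5, 0.1],   # conductances in siemens, one memristor per entry
     [0.4, 0.1, 0.3]]
V = [1.0, 0.5, 2.0]     # input voltages applied to the rows

def crossbar_mvm(G, V):
    """Each output current is the sum of conductance * voltage."""
    return [sum(g * v for g, v in zip(row, V)) for row in G]

I = crossbar_mvm(G, V)
print(I)  # approximately [0.65, 1.05]
```

In hardware this whole product happens in one analog step per output, which is why in-memory computing sidesteps the von Neumann bottleneck for Deep Learning workloads.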

Okay, so why are we doing all this?

Let's take a step back.

The human brain consumes just 20 Watts of electricity, less than a typical lightbulb, yet can outcompete OpenAI's GPT-3 on any simple associative learning task.

It is estimated that it costs about $5 million to train the 175 billion parameters of GPT-3.

Thus, it becomes clear that current (von Neumann-based) computers won't be able to scale efficiently and keep up with the exponential advances in Deep Learning during this decade of the Roaring 2020s.

As an investor in frontier technologies at Radna Intellectual Ventures, I would like to turn our attention now to Investable Themes in Neuromorphic Computing.

Investable Themes in Neuromorphic Computing

In conversations with VCs about Neuromorphic Computing, I often hear two complaints.

First, the technology is "too early."

As Thomas Kuhn noted in his concept of paradigm shifts, technological innovation occurs in jumps. Thus, I believe that extrapolating from the prior historical growth of, say, Digital Computing severely underestimates the adoption rate of disruptive technologies like Neuromorphic Computing.

Second, "hardware is not VC-investable."

There is a misconception that the only opportunities in Neuromorphic Computing are in hardware. Neuromorphic Computing is simply a technology stack like Digital Computing, and as such, there is a Hardware layer, a Software layer, and an Abstraction Layer connecting the two.

Source: Radna Intellectual Ventures; Investable Themes in Neuromorphic Computing.

At the Hardware layer, investment opportunities are capital intensive as expected, with the problem compounded by competition from well-funded incumbents like Intel, Qualcomm, and IBM developing their own Neuromorphic chips.

However, opportunities do exist for niche and ultra-low-power applications like IoT, with startups like BrainChip competing in this space.

The Abstraction layer remains very much an unresolved problem, with limited opportunities until there is consensus on the abstraction protocol for Neuromorphic Computing, similar to the adoption of the x86 instruction set in traditional computing.

That said, Applied Brain Research's development of the Nengo Neural Engineering Framework is gaining some traction, along with the development of a Neuromorphic Operating System by the EU-funded BrainScaleS.

Finally, we arrive at the Software layer, where I anticipate most of the investable opportunities for VCs, family offices, and other early-stage direct investors will lie.

Vicarious and iniVation are two startups working in computer vision, and Numenta is licensing its proprietary neuromorphic technology for developers to build upon.

I expect most of these opportunities to be in domain-specific industries where localized Deep Learning is critical, such as Autonomous Driving, Industrial, and Mobile applications.

Source: Yole DĆ©veloppement; Neuromorphic Sensing / Computing Market Size Projections.

Let's now shift to Quantum Computing, which may be the final frontier of computing.

On the Road to Quantum Computing

I'll be direct: Quantum Computing in its current state is more hype than reality.

Although IBM, Intel, Google, Alibaba, and Microsoft have all developed "working quantum computers," I believe these are all pet projects, as the fundamental problem of quantum computing, the decoherence of qubits at scale, remains unsolved.

Furthermore, I believe that quantum computing may be the first major technological shift where we need to move away from Silicon as a substrate and towards organic compounds.

Source: Radna Intellectual Ventures; Illustration of a Quantum Computer like IBM Q.

Why is that necessary?

Well, recent research has provided some remarkable evidence that our brain operates as a quantum computer at its most fundamental level.

However, there is limited research and development here, so although I am excited about Quantum Computing, I remain skeptical about its feasibility within our lifetime.

Final Thoughts

If you're still reading this, I hope this 3-Part Series has been informative and can provide guidance, whether you're an Entrepreneur or an Investor in Artificial Intelligence.

Returning to our first quote from Part One:

"Artificial Intelligence began with an ancient wish to forge the Gods."

Indeed, I find it remarkable that we're on the path to creating a God of some sort: a human-level intelligence devoid of all the fallacies and emotions that, ironically, make us human.

It is both a humbling experience and an interestingly spiritual one as well.

By Andrew Vo, CFA

Related Links: The Venture Capital Guide to Machine Learning, How Machine Learning Can Scale Allemansrätten's Effect on the Music Industry

About Radna Intellectual Ventures

Radna Intellectual Ventures is a venture startup studio with a mission to enable the next generation of Deep Technology (Deep Tech) companies, with a focus on Machine Learning, Natural Language Processing, and Blockchain technologies.

Our General Partner, Andrew Vo, spent a decade in finance, working at some of the largest investment managers, including J.P. Morgan, before pursuing the entrepreneurial path. Andrew serves as an Advisor for startup companies focusing on the FinTech and DeepTech sectors. Andrew is a CFA Charterholder and holds a Master of Science in Computer Engineering from Cornell University and a Bachelor of Science in Electrical Engineering and Computer Science from the University of California, Berkeley.

Learn more about Radna Intellectual Ventures here.
