By: Craig Badrick on June 5th, 2019

What Is Neuromorphic Computing and How Could It Impact Enterprise IT?

Artificial Intelligence

Neuromorphic computing uses analog circuitry to mimic the way the human brain processes information. Recent progress in the field could have a major impact on enterprise IT in the years to come.

We are often given the impression that digital computation bests human cognition in every way imaginable. Computers are powerful, accurate, and consistent. They can make calculations faster than entire armies of people. If a computer is wrong, it’s because a human programmed it that way.

But in some critical respects, the human brain is still the best computer in the world. Humans are especially good at detecting patterns and improvising in undefined circumstances, far more effectively than even the most powerful artificial intelligences on the market. And because it runs on roughly 22 watts of electricity (about half what a typical laptop draws), the brain's energy efficiency on these kinds of tasks is unmatched.

Neuromorphic computing represents a bridge between the relative strengths and weaknesses of the human brain and traditional computer processors. An interdisciplinary field at the crossroads of computer science, electrical engineering, and cognitive neuroscience, it attempts to create processors that operate more like the human brain by mimicking the structure of the human nervous system. In doing so, scientists hope to build processors that are both more powerful and more energy-efficient than anything available today.

 

How Neuromorphic Computing Works

The concept of neuromorphic computing was pioneered by Caltech professor Carver Mead in the 1980s. But neuromorphic computing (sometimes called neuromorphic engineering) is still considered an emerging field, and only in the last few years has it become feasible for commercial use cases.

To mimic the human brain and nervous system, researchers are building artificial neural networks in which nodes and their connections stand in for neurons and synapses. One obstacle to these networks is the binary nature of digital processing: CPUs send messages through circuits that are either fully on or fully off, with no room for degrees in between.

Ironically, engineers solved this problem by going back to analog circuits. The resulting processors can modulate the amount of current flowing between nodes, much as the brain's neurons fire with varying strengths and strengthen or weaken the connections between them.
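To make this concrete, here is a minimal software sketch of a leaky integrate-and-fire neuron, the basic unit that neuromorphic hardware typically emulates in analog circuitry. The class, parameter names, and values below are illustrative assumptions, not any chip vendor's API; real neuromorphic processors implement this behavior in physical circuits rather than code.

```python
# A minimal, illustrative leaky integrate-and-fire neuron. Unlike a digital
# gate, its input is a graded current, and it "fires" only when enough
# charge accumulates, roughly what analog neuromorphic circuits do in hardware.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # accumulated charge (membrane potential)
        self.threshold = threshold  # firing threshold
        self.leak = leak            # fraction of charge retained each step

    def step(self, input_current):
        """Integrate a graded input current; emit a spike when the threshold is crossed."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # spike
        return 0                    # no spike

# Inputs of varying strength, not just 0 or 1, drive the neuron:
neuron = LIFNeuron()
print([neuron.step(current) for current in [0.2, 0.5, 0.9, 0.1]])  # [0, 0, 1, 0]
```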

Currently, the most powerful neuromorphic processor available simulates an impressive 16 billion synapses, though that remains a far cry from the brain's estimated 800 trillion (a gap of roughly 50,000x). This type of processor occupies a vanishingly small piece of the market and is mostly intended for research and defense purposes. But the technology has proven feasible, and if you believe some experts, practical applications are only a matter of time.

 

What Neuromorphic Computing Seeks to Solve

Dealing With Uncertainty: Standard computational processes are very good at specific tasks, like analyzing situations under clearly defined circumstances. Neuromorphic computing seeks to make computers better at operating amid uncertainty. That means dealing with probabilities rather than deterministic models, which in turn requires analog transistors that can express more than just binary options, as the sketch below illustrates.
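This sketch is a hedged illustration rather than any real platform's code: it contrasts a deterministic digital gate with a hypothetical probabilistic unit whose input strength sets only its probability of firing. The sigmoid and its parameters are assumptions chosen for clarity.

```python
import math
import random

def deterministic_gate(x, threshold=0.5):
    # Conventional digital logic: the same input always yields the same output.
    return 1 if x >= threshold else 0

def stochastic_neuron(x):
    # Hypothetical probabilistic unit: input strength sets only the
    # *probability* of firing, so ambiguous inputs produce graded behavior.
    p_fire = 1.0 / (1.0 + math.exp(-10 * (x - 0.5)))  # sigmoid centered at 0.5
    return 1 if random.random() < p_fire else 0

x = 0.55  # an ambiguous input, just past the decision boundary
print(deterministic_gate(x))                      # always 1
print([stochastic_neuron(x) for _ in range(10)])  # a mix of 0s and 1s
```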

Improved Machine Learning: Neuromorphic computing aims to deliver an evolution in machine learning by allowing computers to “learn” through observation rather than explicit programming. Today, a program that identifies types of birds must be trained on millions of bird photos, each clearly labeled by humans. A neuromorphic computer would be able to “learn” to recognize the same patterns much faster, from far fewer inputs; one candidate mechanism is sketched below.
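One such mechanism is Hebbian plasticity, often summarized as “cells that fire together wire together.” The toy sketch below illustrates that rule with hypothetical names and learning rates; it is not the training method of any particular neuromorphic platform. Note that no labels are involved: the connection strengthens simply because correlated activity is observed.

```python
import random

def hebbian_update(weight, pre_spike, post_spike, lr=0.05, decay=0.001):
    """Strengthen a connection when both neurons fire together; otherwise decay slowly."""
    if pre_spike and post_spike:
        return weight + lr * (1.0 - weight)  # potentiate toward 1.0
    return weight * (1.0 - decay)            # gentle forgetting

weight = 0.1  # initial synaptic strength
for _ in range(200):
    pre = random.random() < 0.6            # presynaptic neuron fires
    post = pre and random.random() < 0.8   # postsynaptic activity correlated with it
    weight = hebbian_update(weight, pre, post)

print(f"learned weight: {weight:.2f}")  # drifts upward as the correlation is observed
```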

Improved Energy Efficiency: Another hoped-for outcome of this technology is massively improved energy efficiency. By some calculations, a neuromorphic chip could perform certain functions using one-thousandth the energy of a traditional CPU.
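A back-of-the-envelope calculation hints at where such savings could come from: spiking hardware is event-driven, doing work only when a neuron actually fires. The layer size and activity rate below are illustrative assumptions, not measured chip data; sparser activity pushes the factor higher, toward the 1,000x figure cited above.

```python
# Rough comparison of work done per cycle by a conventional dense layer
# versus an event-driven (spiking) one. All figures are assumptions.

neurons = 1_000
connections = neurons * neurons    # a fully connected layer: 1,000,000 synapses
activity = 0.01                    # assume ~1% of neurons spike in a given cycle

dense_ops = connections                    # every synapse is computed every cycle
event_ops = int(connections * activity)    # only synapses of spiking neurons do work

print(f"dense ops per cycle:        {dense_ops:,}")
print(f"event-driven ops per cycle: {event_ops:,}")
print(f"savings factor:             {dense_ops / event_ops:.0f}x")
```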

Improved Portability: With all that additional computing power and energy efficiency, neuromorphic chips could pack the capability of an entire rack of servers onto a single chip.

Neuromorphic Computing and Enterprise IT: The potential applications for this technology are profound: driverless cars that can sense and respond to erratic behavior in nearby vehicles, or smart home devices that adapt seamlessly to new behaviors and surroundings. Neuromorphic computing would allow computers to learn quickly enough to eliminate much of the explicit programming that such systems would otherwise require.

For enterprises, this technology could mean massive improvements in a host of areas, from predictive data analytics to automation and process optimization. These improvements aren't likely to arrive soon, however; our reliance on conventional digital computation is here to stay for at least the next decade. In the meantime, the challenges that neuromorphic computing seeks to solve will need to be addressed with the tools enterprises have at their fingertips today: powerful networks and robust IT services.

Both can be achieved with the help of networking experts like Turn-key Technologies (TTI). With nearly three decades of experience helping enterprises meet the business needs of the present and the future, TTI brings the tools and expertise enterprises need to implement the right technologies, starting today.