Neuromorphic computing: a powerful way to solve the crisis facing the technology industry

The most powerful computer in the world is your brain. After decades of rapid advances in technology, brains remain far ahead of their silicon competitors. They are small and lightweight, consume little energy, and are remarkably adaptable. They are also set to serve as the model for the next wave of advanced computing.
These brain-inspired designs are collectively known as “neuromorphic computing.” Even the most advanced computers cannot come close to the human brain, or even the brain of most mammals, in terms of capability. Still, our gray matter can give engineers and developers useful hints on how to make computing infrastructure more efficient by mimicking synapses and neurons.
Let us first look at the relevant brain biology. Neurons are nerve cells that act like cables, carrying messages from one part of the body to another. A message passes from neuron to neuron until it reaches the part of the body where action is needed; this is how we perceive pain, for example, or move a muscle.
The junction through which one neuron passes a message to the next is called a “synapse.” When a neuron receives enough input to stimulate it, it fires a chemical or electrical impulse across the synapse to the next neuron, or to another cell such as a muscle or gland.
Now let’s turn to technology and see what can be done by modeling the brain. Neuromorphic software seeks to recreate these action potentials through spiking neural networks (SNNs). SNNs are made up of artificial neurons that fire spikes to signal other neurons, transmitting information as the spikes propagate. The strength and timing of the messages allow neurons to rewire their connections to one another, which is what lets an SNN learn as its inputs change. This is, in effect, how the brain itself works and learns.
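To make the idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, the building block most SNN simulators use. The parameters and the helper name `simulate_lif` are illustrative, not taken from any particular chip or library.

```python
# A minimal sketch of a leaky integrate-and-fire (LIF) neuron.
# All parameters here are illustrative.

def simulate_lif(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Integrate input over time; emit a spike (1) whenever the
    membrane potential crosses the threshold, then reset."""
    v = 0.0
    spikes = []
    for i in input_current:
        # Leak: the potential decays toward rest between inputs.
        v += dt / tau * (-v + i)
        if v >= v_thresh:
            spikes.append(1)   # action potential
            v = v_reset        # reset after firing
        else:
            spikes.append(0)
    return spikes

# A strong constant drive produces a regular spike train;
# a weaker one never reaches threshold, so the neuron stays silent.
print(simulate_lif([1.5] * 30))
print(simulate_lif([0.8] * 30))
```

This mirrors the biological description above: only when the accumulated input is strong enough does the neuron fire, and the timing of those spikes is what carries information through the network.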
In terms of hardware, neuromorphic chips are also a fundamental departure from the CPUs and GPUs used in most computing devices today. Traditional architectures have been approaching their limits for some time, and manufacturers find it ever harder to pack more transistors onto a single chip as they run into physical constraints, power consumption, and heat generation.
Meanwhile, we keep producing more data and needing more computing power to process it. This makes the highly adaptable, powerful, and low-power computer inside our skulls an increasingly attractive model for building computing devices. Suhas Kumar, a researcher at Hewlett Packard Enterprise, says:
We are now racing to find new models that can carry computer science forward. People are exploring different technologies, and neuromorphic computing is probably the most promising of the options.
Instead of separating memory and compute, as most chips in use today do, neuromorphic hardware keeps the two together. With processors that carry their own local memory, this brain-like arrangement saves energy and speeds up processing. Neuromorphic computing could also help create a new wave of artificial intelligence applications. Current AI is typically narrow: it is developed by learning from stored data and adjusting algorithms until the desired result is achieved.
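The following toy sketch illustrates the co-located memory/compute idea under stated assumptions: each core stores its own synaptic weights next to its neurons and only computes when a spike arrives, rather than fetching weights from a distant shared memory. The class name `NeuromorphicCore` and all parameters are invented for illustration; real chips organize this very differently.

```python
class NeuromorphicCore:
    """One core: a neuron plus the weights it needs, stored locally
    so no round trip to a shared central memory is required."""
    def __init__(self, weights, threshold=1.0):
        self.weights = weights        # local synaptic memory
        self.potential = 0.0
        self.threshold = threshold

    def receive(self, source, strength):
        # Event-driven: the core only computes when a spike arrives.
        self.potential += self.weights.get(source, 0.0) * strength
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True               # this core fires in turn
        return False

# Two cores wired together; core B listens to core A.
core_a = NeuromorphicCore({"input": 0.6})
core_b = NeuromorphicCore({"core_a": 1.2})

for t in range(4):
    if core_a.receive("input", 1.0):      # external stimulus each step
        fired = core_b.receive("core_a", 1.0)
        print(f"t={t}: core A spiked, core B fired: {fired}")
```

The design point is that nothing happens, and no energy is spent, on a core until an event reaches it; in a conventional architecture, weights would be shuttled back and forth between separate memory and processor on every cycle.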
However, borrowing strategies from the brain, as neuromorphic technology does, could allow artificial intelligence to take on new tasks. Because neuromorphic systems can function more like the human brain, they may make AI more capable than ever: better able to cope with messy, ambiguous data and markedly more adaptable. Mike Davies, director of Intel’s Neuromorphic Computing Lab, says of neuromorphics:
There are various workloads for which conventional computing is simply inefficient, so we went looking for new architectures that could deliver further improvements.
Neuromorphic computing is rooted in computational systems developed in the late 1980s to model the functioning of animal nervous systems. Since then the field has accelerated to the point where some of the biggest names in hardware have built neuromorphic systems: IBM’s TrueNorth chip, Intel’s Loihi chip, and Intel’s Pohoiki Beach neuromorphic system are among the examples produced so far.
Today, most uses of neuromorphic systems are confined to research laboratories. Intel’s hardware, for example, is being used to develop an experimental robotic arm for wheelchair users with spinal cord injuries, and artificial skin that gives robots a sense of touch. The technology is unlikely to stay in the lab for long, however.
The first commercial systems that rely heavily on neuromorphic computing are expected to arrive within the next five years, and the advances being made in the field come from many different directions. Abhronil Sengupta, assistant professor in the School of Electrical Engineering and Computer Science at Pennsylvania State University, says:
There are still problems, but I also feel that significant progress is being made and that we can overcome them soon.
The first places this computing paradigm seems likely to make a serious mark are robotics and autonomous vehicles. In both areas, fast computation and the ability to weigh many scenarios at once matter enormously, and neuromorphic hardware could be a strong fit. Imagine, for instance, a system that detects the risk of an imminent collision with a pedestrian who suddenly steps into the street; done right, such a system would be both fast and reliable. A sketch of this event-driven style of detection follows below.
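Here is a hedged sketch of why event-driven processing suits the pedestrian scenario: instead of scanning full camera frames on a fixed schedule, the system reacts only to bursts of change events, keeping latency and energy low. The `HazardMonitor` class, the event format, and the thresholds are all invented for illustration.

```python
from collections import deque

class HazardMonitor:
    """Flags a hazard when the rate of motion events in the region
    ahead of the vehicle exceeds a threshold within a short window."""
    def __init__(self, window_ms=50, event_threshold=20):
        self.window_ms = window_ms
        self.event_threshold = event_threshold
        self.events = deque()  # timestamps (ms) of recent events

    def on_event(self, timestamp_ms):
        self.events.append(timestamp_ms)
        # Drop events older than the window.
        while self.events and timestamp_ms - self.events[0] > self.window_ms:
            self.events.popleft()
        return len(self.events) >= self.event_threshold  # hazard?

monitor = HazardMonitor()
# A pedestrian stepping into view produces a dense burst of events.
burst = [1000 + i for i in range(25)]
print(any(monitor.on_event(t) for t in burst))  # True: time to brake
```

Between bursts the monitor does essentially nothing, which is the property that makes this style of computation attractive for battery-powered and safety-critical hardware.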
Neuromorphic hardware could do all of this on devices such as smartphones, tablets, drones, and wearables without consuming much energy, instead of offloading AI tasks to cloud systems that demand large amounts of power and cooling.
Until now, companies have pursued more powerful computing by squeezing more components into a chip’s small footprint; in the future, the focus will shift toward packing in more intelligence, in other words more capability. Such a shift will require innovation across the board, from materials to chip architecture to software.
For neuromorphics to have a significant impact, much of the surrounding technology will have to change as well. Sensor technologies, for example, are not designed to work well with neuromorphic systems and will need to be redesigned so that the data they produce can be consumed by neuromorphic chips, as the sketch below illustrates.
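As a minimal sketch of the kind of redesign meant here, the function below turns a conventional frame-based sensor stream into sparse change events of the sort an event (DVS-style) camera emits natively and a neuromorphic chip could consume. The function name, event format, and threshold are assumptions made for illustration.

```python
import numpy as np

def frames_to_events(frames, threshold=0.2):
    """Emit (time, x, y, polarity) events wherever a pixel changes
    by more than `threshold` between consecutive frames."""
    events = []
    prev = frames[0]
    for t, frame in enumerate(frames[1:], start=1):
        diff = frame - prev
        ys, xs = np.where(np.abs(diff) > threshold)  # changed pixels only
        for x, y in zip(xs, ys):
            polarity = 1 if diff[y, x] > 0 else -1   # brighter or darker
            events.append((t, int(x), int(y), polarity))
        prev = frame
    return events

# Three tiny frames: one pixel brightens once, producing a single event.
f0 = np.zeros((2, 2)); f1 = f0.copy(); f1[0, 1] = 0.5
print(frames_to_events([f0, f1, f1]))  # [(1, 1, 0, 1)]
```

A static scene yields no events at all, which is exactly the sparsity a spiking chip is built to exploit; native event sensors achieve the same effect directly in the pixel circuitry rather than in software.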
In addition, it is not only hardware that must change; people must change too. That may sound a little strange, but it is true. One of the challenges facing the field is the immaturity of its basic software programming models and algorithms; for this reason, a genuine partnership with neuroscientists is needed so that a new kind of machine learning can take shape.
Neuromorphic computing could thus make the technology industry more collaborative. As the technology advances, collaboration with neuroscientists seems set to grow, since they hold the deepest knowledge of how the brain works.
For example, Sengupta at Penn State is working to recreate, for neuromorphic computing, the way glial cells, the brain’s support cells, influence the phase synchronization of neurons. He believes brain-derived designs could serve many purposes in the world of technology. He explains:
Examining other aspects of the brain, from its individual components to its underlying architecture, could open a path that is very promising for the future.
Finally, the world of technology is evolving very fast, and newer approaches will be needed to achieve the results we want. This article covers only a small corner of the vast world of neuromorphic computing; if the topic interests you, we suggest reading further on it.