How will neuromorphic computing change our lives?


Many of us get excited about how computing will develop, especially with Artificial Intelligence (AI), where machines will be able to solve problems far quicker than the human mind. However, traditional computing approaches are very bad at modelling the human brain.

Most computers today follow the simple von Neumann architecture that was proposed way back in the mid-1940s. The architecture von Neumann described had a CPU (containing a control unit and an arithmetic/logic unit), input and output devices, and a memory unit that could store both instructions and data. Most modern computers have evolved around this architecture.
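To make that concrete, here is a minimal sketch of the fetch-decode-execute cycle at the heart of that design. The four-instruction machine is invented purely for illustration, but it shows the defining von Neumann property: program and data live in the same memory, and a single control unit steps through them one instruction at a time.

```python
# A toy von Neumann machine. The instruction set (LOAD, ADD, STORE, HALT)
# is made up for illustration; real CPUs are far more complex, but the
# shape is the same: one memory holds both the program and the data, and
# the control unit executes it strictly in sequence.

def run(memory):
    acc = 0          # accumulator register inside the ALU
    pc = 0           # program counter inside the control unit
    while True:
        op, arg = memory[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share the same memory, exactly as von Neumann proposed.
memory = {
    0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", None),
    10: 2, 11: 3, 12: 0,              # data: 2 + 3 -> slot 12
}
print(run(memory)[12])                # prints 5
```

Everything happens one step at a time, in strict clocked order; that sequential bottleneck is exactly what the rest of this post is about.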

However, as we progressed and machine learning became more sophisticated, we realised how different this approach is to that of the brain. Machine learning is rule-based and develops its learning by storing loads of similar patterns. The problem is that the more it stores, the more it needs to sift through to make decisions. Not only that, the computers got bigger and bigger, to the point where supercomputers fill huge data centres, demand huge amounts of power and generate a lot of heat.
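As a toy illustration of that sifting problem, imagine the simplest possible pattern memory: a flat list that has to be searched exhaustively on every query. The data below is random and purely illustrative, but the lookup cost grows in step with how much has been stored:

```python
# Every pattern stored is one more comparison per decision: a plain
# nearest-neighbour lookup over n patterns does O(n) work per query.
import random
import time

def nearest(stored, query):
    # One distance computation per stored pattern.
    return min(stored, key=lambda p: abs(p - query))

random.seed(0)
for n in (10_000, 100_000, 1_000_000):
    stored = [random.random() for _ in range(n)]
    start = time.perf_counter()
    nearest(stored, 0.5)
    print(f"{n:>9} patterns: {time.perf_counter() - start:.4f}s per lookup")
```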

Compare all of this to our simple brain, which is so compact that we manage to carry it around with us every day. It uses very little energy compared to a supercomputer, and our body is super efficient at keeping it cool.

Neuromorphic computing tries to overcome the limitations of the traditional von Neumann architecture by finding ways of mimicking the human brain.

One key difference: in a traditional computer architecture, everything is controlled by a clock and happens perfectly in time. As explained in this Intel video, neuromorphic computing works asynchronously, making new connections in parallel in a very similar way to the brain. To understand this parallel capability of the brain we need to understand synapses.

Synapses are structures in the brain that allow nerve cells (neurons) to pass an electrical or chemical signal on to another neuron. A neuron can be triggered by multiple inputs, or by inputs building up over a period of time, and a single neuron can be connected to as many as 10,000 others through its synapses. This means information can be delivered quickly and different patterns matched in parallel, which is what allows the brain to recognise things quickly and solve problems.
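A common simplified model of this behaviour, and the one behind the spiking neurons in many neuromorphic chips, is the leaky integrate-and-fire neuron. The sketch below is purely illustrative (the constants are made up, not biological), but it shows the "inputs building over a period of time" idea: the neuron only fires once enough input has accumulated to cross a threshold.

```python
# A minimal leaky integrate-and-fire neuron. Incoming spikes nudge the
# membrane potential upward, it leaks back toward rest between spikes,
# and the neuron fires (emits its own spike) only when the accumulated
# input crosses a threshold. All constants are illustrative.

LEAK = 0.9        # fraction of potential kept each time step
THRESHOLD = 1.0   # firing threshold
WEIGHT = 0.3      # contribution of one incoming spike

def simulate(input_spikes):
    potential = 0.0
    fired_at = []
    for t, spike in enumerate(input_spikes):
        potential = potential * LEAK + (WEIGHT if spike else 0.0)
        if potential >= THRESHOLD:    # enough input has built up: fire
            fired_at.append(t)
            potential = 0.0           # reset after firing
    return fired_at

# Sparse input never fires; a burst of closely spaced spikes does.
print(simulate([1, 0, 0, 1, 0, 0, 1, 0]))   # [] - input leaks away
print(simulate([1, 1, 1, 1, 1, 0, 0, 0]))   # [3] - fires once input accumulates
```

Because each neuron only does work when a spike arrives, a network of these can sit idle at almost no cost and respond in parallel when input appears, which is where the energy savings over a clocked design come from.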

There are already some great developments in this area. Intel have a research neuromorphic chip called Loihi. You can find out more about their progress with Loihi in this video.

So what are the potential uses of neuromorphic computing? Well, first, we might one day see a neuromorphic chip in our mobile phones that will be able to provide advanced processing at a much lower power consumption. Mobile phones currently offload much of the heavy lifting to cloud processing because of the drain on battery life. For the time being, though, these types of chips are going to be way too expensive.

The biggest and most exciting use will be around AI. By building neuromorphic computing systems that mimic some of the workings of our brain, we are not only finding out more about how our brains work, but also finding new ways of optimising this technology. If we can reproduce the brain's ability to sort huge amounts of data and learn patterns effectively, all at high speed and with low energy, it could revolutionise the world of AI.

This technology really could change our lives, one day being the backbone behind robots and droids with sophisticated AI processing capabilities.

Image source: Pexels



4 comments

I hope it happens soon.


So do I; there are just so many positive applications.


Hello @awah
Reading your publication again, I realize that I know very little about the history of computers.
I only know that they started out very big, but nothing more.
I will do some reading on that subject.


I remember the first computer I ever saw (and used); it was massive. It had its own room, yet was incredibly limited. I was very young, but it still shows my age.

We have got so much better at getting more processing power into a smaller space, but underneath it's the same architecture.
