Analogue Artificial Intelligence

ARTIFICIAL INTELLIGENCE

Artificial intelligence is taking the world by storm. More and more organizations are using AI to obtain insights into their businesses.

Artificial intelligence is being implemented in business enterprises, research facilities, military applications, industrial processes, healthcare, security surveillance, and much more.

Artificial intelligence applications require huge input data sets to function correctly. Processing these data sets demands enormous computing power, and that computing power in turn consumes large amounts of electricity. Running artificial intelligence algorithms therefore leaves a considerable carbon footprint on the environment.

AI algorithms, particularly deep learning algorithms, need huge amounts of input data from which to learn patterns through mathematical manipulation, and so they require huge amounts of processing power. Today's deep learning algorithms are executed on GPUs (graphics processing units, originally designed for computer graphics), which process this data far faster than a standard CPU (central processing unit). The problem with GPUs, however, is the large amount of electrical energy they consume: some deep learning workloads running on GPUs are estimated to use the equivalent of three households' electricity over a year.

GPUs are built on silicon-based digital microchips, and we are now reaching the limits of Moore's law: transistors are approaching atomic scale and cannot shrink much further, which limits how many can be placed on a silicon wafer. A promising development in the microchip industry is the advent of analogue AI processing chips. These chips are faster, smaller, more energy efficient, and cheaper, making them ideal for AI applications.

Analogue AI processors do come with inherent drawbacks, though. They are application specific, meaning a general-purpose analogue computer cannot yet be designed, and they are less accurate than their digital counterparts. Even so, their reduced accuracy is still sufficient for the requirements of AI applications, as the sketch below suggests.
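
Why is lower accuracy acceptable? A rough intuition is that neural-network outputs are usually decided by which value is largest, not by its exact magnitude. The following toy simulation (a minimal sketch using NumPy; the layer sizes and the few-percent noise level are illustrative assumptions, not measurements of any real analogue chip) perturbs a matrix-vector product with noise and checks whether the "winning" output changes.

```python
# Toy illustration: a digital (exact) layer computation versus the same
# computation with a few percent of added noise, standing in for the
# imprecision of an analogue multiply-accumulate.
import numpy as np

rng = np.random.default_rng(0)

weights = rng.normal(size=(10, 784))   # hypothetical layer weights (10 outputs)
x = rng.normal(size=784)               # hypothetical input activations

exact = weights @ x                    # precise digital computation

# Model analogue computation as the exact result plus ~2% relative noise.
noise = rng.normal(scale=0.02 * np.abs(exact).mean(), size=exact.shape)
analogue = exact + noise

print("digital  argmax:", exact.argmax())
print("analogue argmax:", analogue.argmax())   # typically the same output wins
```

In most runs the noisy result picks the same output as the exact one, which is the sense in which an imprecise analogue processor can still "fulfil the requirements" of an AI workload.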

Interestingly enough, the first neural network, the perceptron, was built using analogue technology back in the 1950s. Digital technology took over because analogue technology at that stage was still very primitive. After the first AI winter in the 1970s, AI became much more feasible thanks to the advent of fast digital hardware. With the new developments in analogue chip technology, this could now change: analogue chips could pave the way for cheaper, more energy-efficient AI and bring computers a step closer to thinking the way we do.
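
For readers unfamiliar with the perceptron mentioned above, here is a minimal software sketch of its learning rule (in NumPy, purely for illustration; Rosenblatt's original 1950s machine implemented this rule in analogue hardware, not code).

```python
# Minimal perceptron learning rule: nudge the weights toward any
# misclassified sample until the data are separated.
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """X: (n_samples, n_features); y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:      # sample is misclassified
                w += lr * yi * xi           # move the boundary toward it
                b += lr * yi
    return w, b

# Toy linearly separable data: an AND-like function of two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))   # -> [-1. -1. -1.  1.]
```

The appeal of analogue hardware is that the multiply-and-accumulate at the heart of this rule can be performed directly by physical quantities such as currents and voltages rather than by digital arithmetic.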
