The Powerful New AI Hardware of the Future

Having covered artificial intelligence at DSAITrends over the last few years, I am struck by the contrast between the sheer amount of research and development in AI and its glacial real-world impact.

No doubt, there have been plenty of jaw-dropping developments: AI-synthesized faces that are indistinguishable from real ones, AI models that can explain jokes, and systems that create original, realistic images and art from text descriptions.

But this has not translated into business benefits beyond a handful of top tech firms. For the most part, businesses are still wrestling with their boards over whether to implement AI, or struggling to operationalize it.

In the meantime, ethical quandaries remain unresolved, bias is rampant, and at least one regulator has warned banks about the use of AI.

One popular business adage, often attributed to futurist Roy Amara, comes to mind: we tend to overestimate the effect of a technology in the short run and underestimate its effect in the long run.

So yes, while immediate AI gains seem lacking, the impact of AI in the long term might yet exceed our wildest expectations. And new, powerful AI hardware could well accelerate AI developments.

More powerful AI hardware

But why the fascination with more powerful hardware? In the groundbreaking "Scaling Laws for Neural Language Models" paper published in 2020, researchers from OpenAI concluded that larger AI models will continue to perform better and be much more sample-efficient than previously appreciated.

While the researchers cautioned that more work is needed to test whether the scaling holds, the current hypothesis is that more powerful AI hardware could train much larger models that yield capabilities far beyond today's AI models.
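To make the idea concrete, here is a minimal Python sketch of the kind of power-law relationship the paper describes, in which loss falls smoothly as the parameter count grows. The constants are illustrative placeholders rather than the paper's exact fitted values.

```python
# A minimal sketch of the power-law scaling described in the paper: loss keeps
# falling as model size grows. The constants below are illustrative
# placeholders, not the paper's exact fitted values.

def estimated_loss(n_params: float, n_c: float = 8.8e13, alpha: float = 0.076) -> float:
    """Cross-entropy loss as a power law of parameter count, L(N) = (Nc / N) ** alpha."""
    return (n_c / n_params) ** alpha

for n in (1e8, 1e9, 1e10, 1e11, 1e12):
    print(f"{n:.0e} parameters -> estimated loss {estimated_loss(n):.3f}")
```

The takeaway is simply the shape of the curve: every order-of-magnitude jump in model size buys another, smaller reduction in loss, which is why ever-larger models demand ever-more-powerful hardware.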

Leading the charge on the hardware front are data center-class GPUs from NVIDIA and AMD, as well as specialized AI processors from technology giants such as Google.

Stepping outside the box

Developments in other research fields could impact AI, too. One example is Intel's Loihi 2, a second-generation experimental neuromorphic chip announced last year. A neuromorphic processor mimics the human brain, using programmable components to simulate neurons.
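To give a rough sense of what simulating a neuron means in this context, below is a minimal sketch of a textbook leaky integrate-and-fire neuron in Python. It is a deliberate simplification, not the Loihi 2's actual programmable neuron model.

```python
# A minimal leaky integrate-and-fire neuron: membrane potential accumulates
# incoming current, decays ("leaks") over time, and emits a spike when it
# crosses a threshold. A textbook simplification, not Intel's actual model.

def simulate_lif(inputs, leak=0.9, threshold=1.0):
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, then leak
        if potential >= threshold:
            spikes.append(1)   # fire a spike...
            potential = 0.0    # ...and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.3, 0.4, 0.5, 0.2, 1.1, 0.2]))  # -> [0, 0, 1, 0, 1, 0]
```

A neuromorphic chip wires up huge numbers of such spiking units in silicon, rather than computing dense matrix multiplications the way a GPU does.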

According to its technical brief (PDF), the Loihi 2 has 128 cores and, thanks to its asynchronous design, can potentially support more than a million digital neurons. The human brain has roughly 90 billion interconnected neurons, so there is still some way to go.

Chips like the Loihi 2 have another advantage, though. As noted in a report by The Register, high-end AI systems such as DeepMind's AlphaGo require thousands of processing units running in parallel, each consuming around 200 watts. That is a lot of power, and we have not even factored in the ancillary systems or cooling equipment yet.

For its part, neuromorphic hardware promises between four and 16 times better energy efficiency than AI models running on conventional hardware.
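A back-of-the-envelope calculation shows why that matters. The unit count below is an assumption (the report only says "thousands"), and ancillary systems and cooling are left out.

```python
# Back-of-the-envelope power estimate using illustrative figures: the unit
# count is assumed, and only the chips themselves are counted (no cooling
# or other ancillary systems).
units = 5_000           # assumed number of processing units running in parallel
watts_per_unit = 200    # per-unit draw cited in the report
total_kw = units * watts_per_unit / 1_000
print(f"Conventional hardware: roughly {total_kw:,.0f} kW")

for factor in (4, 16):
    print(f"At {factor}x better efficiency: roughly {total_kw / factor:,.0f} kW")
```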

Warp speed ahead with quantum computing

While the Loihi 2 is built from traditional transistors (2.3 billion of them), another race is underway to build a completely different type of machine: the quantum computer.

According to a report on AIMultiple, quantum computing could be used to train machine learning models rapidly and to create optimized algorithms. Of course, quantum computers are far more complex to build, due to the special materials and operating environments required to access the requisite quantum states.

Indeed, experts estimate that it could take another two decades to produce a general-purpose quantum computer, though working machines with up to 127 qubits already exist.
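Part of what makes quantum computers both appealing and hard to build is the sheer size of the state they manipulate: an n-qubit register is described by 2^n complex amplitudes. The sketch below simply counts those amplitudes to show how quickly classical simulation becomes impractical; it is ordinary arithmetic, not a quantum program.

```python
# Illustration of quantum state-space growth: an n-qubit register is described
# by 2**n complex amplitudes, which is what a classical simulator must store.
# This is plain arithmetic, not an actual quantum computation.
COMPLEX128_BYTES = 16  # 8 bytes each for the real and imaginary parts

for n_qubits in (10, 30, 50, 127):
    amplitudes = 2 ** n_qubits
    gib = amplitudes * COMPLEX128_BYTES / 2**30
    print(f"{n_qubits:>3} qubits -> {amplitudes:.3e} amplitudes (~{gib:.3e} GiB to simulate)")
```

At 127 qubits, the size of today's largest working machines, the state space is already far beyond anything classical memory can hold, which hints at why researchers expect quantum hardware to tackle problems conventional AI hardware cannot.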

In Southeast Asia, Singapore is stepping up its investments in quantum computing with new initiatives to boost talent development and provide access to the technology. This includes a foundry to develop the components and materials needed to build quantum computers.

Whatever the future brings for AI in the decades ahead, it will not be for lack of computing prowess.

Paul Mah is the editor of DSAITrends. A former system administrator, programmer, and IT lecturer, he enjoys writing both code and prose. You can reach him at [emailprotected].

Image credit: iStockphoto/jiefeng jiang
