
AI will prove a key driver of Industry 4.0


Hardware has a critical role in improving artificial intelligence and machine learning, says Colm Lysaght.

Last week’s World Economic Forum in Davos, Switzerland, yielded considerable discussion of artificial intelligence (AI), focusing on its potential and opportunities while raising questions about ethics and the societal use of AI in the future.

“We are living in a time of multiple technological innovations, where AI is one of the key technologies that is driving the Fourth Industrial Revolution,” commented World Economic Forum founder and executive chairman Professor Klaus Schwab.

“Even as AI takes centre stage in this revolution, we have a long way to go before we reach its true potential. We know that memory and storage innovations deliver dramatic improvements in moving, accessing and analysing data. Combining the new techniques of AI with ever faster computing power and vast volumes of data results in computers that can learn new skills and transfer them to new applications quickly and with high quality,” Schwab added.

A study by Forrester Consulting highlights how hardware architecture affects the return on investment for artificial intelligence and machine learning implementations. Commissioned by Micron, the research identifies the most critical factors necessary for optimal performance of advanced AI and machine learning analytics.

Although advanced analytics offer a great deal of promise for business transformation, most companies are only beginning to explore the execution challenges that complex AI and machine learning models bring. As use cases like image recognition, speech recognition and self-automation become more advanced, the hardware used to train and run those models will become increasingly important. To better understand the gaps and opportunities, Forrester surveyed IT and business professionals who manage architecture, systems and strategy for complex data.

The study identified several key trends and challenges. The location of compute and memory is crucial to performance and success when architecting hardware for AI and machine learning. Eighty-nine percent of respondents said it was important or critical that compute and memory are architecturally close together.

Although 72 percent of firms run advanced analytics on premises today, that percentage is expected to shrink to 44 percent within the next three years. Meanwhile, more firms will be running analytics in public clouds and at the edge. For example, 51 percent of respondents said they are running analytics in public clouds, a figure set to rise to 61 percent over the next three years, while the 44 percent running analytics at the edge today will grow to 53 percent by 2021.

The study also found that of the possible hardware constraints limiting AI and machine learning today — including compute constraints, programmability and thermal management issues — memory and storage are the most commonly cited concerns. More than 75 percent of respondents recognise a need to upgrade or rearchitect their memory and storage to limit architectural constraints.

While those at Davos focused on the higher-level issues surrounding AI, this study shows that before we get there, we need to take a detailed look at compute, memory and storage configurations to enable the next generation of AI.

The bottom line is that system architecture matters. Whether it’s at the edge, in the cloud or on premises, advanced hardware is necessary to deliver the performance that companies need to drive faster, better results with AI and machine learning analytics.

To learn more and receive the full study, register for the upcoming webinar, Hardware Matters – Why Memory and Storage Are Critical to Better AI and ML, which will air on Tuesday, February 5th.

Colm Lysaght is Micron’s VP Corporate Strategy

 
