Why 2017 is setting up to be the year of GPU chips in deep learning


It’s been a while since anyone in the tech world really cared about hardware, but that could change in 2017, thanks to the ascendance of deep learning.

According to Forrester analyst Mike Gualtieri, we may be entering a golden age of hardware driven largely by graphics processing units, or GPUs. This alternative to traditional CPU technology is optimized to process tasks in parallel, rather than sequentially. This makes GPU chips a good fit for training deep learning models, which involves crunching enormous volumes of data again and again until the models can recognize images, parse natural speech or recommend products to online shoppers.

“There will be a hardware renaissance because of the compute needs to train these models,” Gualtieri said.
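To make the parallel-training point concrete, here is a minimal sketch, assuming the open source PyTorch library and a CUDA-capable GPU (neither is named in the article): a single training step for a toy network, where placing the model and data on the GPU lets the framework spread the underlying matrix math across thousands of cores.

```python
# Minimal sketch: one training step, on GPU if one is available.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy model and random mini-batch stand in for a real deep learning workload.
model = nn.Sequential(nn.Linear(1024, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(256, 1024, device=device)        # one batch of features
labels = torch.randint(0, 10, (256,), device=device)  # one batch of class labels

# Forward pass, loss, backward pass, weight update -- repeated millions of
# times during training, which is why parallel hardware pays off.
optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()
optimizer.step()
```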

Need grows for GPU tools

GPU technology has been around for decades, but only recently has it gained traction among enterprises. It was traditionally used to enhance computer graphics, as the name suggests. But as deep learning and artificial intelligence have grown in prominence, the need for fast, parallel computation to train models has increased.

“A couple years ago, we wouldn’t be looking at special hardware for this,” said Adrian Bowles, founder of analyst firm STORM Insights Inc., based in Boston. “But with [deep learning], you have a lot of parallel activities going on, and GPU-based tools are going to give you more cores.”

He said he expects the market for GPU chips to heat up in the year ahead. NVIDIA, one of the first companies to start marketing GPU chips for analytics, teamed up with IBM and Microsoft this year to put its GPU chips in server technology aimed at cognitive applications. The biggest player in the processing market, Intel, may also look to get deeper into GPU technology.

GPUs mean changes for developers

Bowles said the growing popularity of this hardware may, in some cases, mean developers have to learn new ways of working. Because a GPU processes data so differently from a CPU, applications need to be built differently to take full advantage of the hardware, so developers will need additional training to keep their skills sharp.
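As a rough illustration of that shift in thinking, consider this sketch, which assumes NumPy with the optional CuPy library standing in for a GPU array stack (neither is named in the article): the same computation can be written element by element, CPU-style, or as one whole-array operation that GPU-oriented tools can spread across many cores.

```python
import numpy as np

def scale_sequential(values, factor):
    # CPU-style thinking: touch each element one at a time.
    result = np.empty_like(values)
    for i in range(len(values)):
        result[i] = values[i] * factor
    return result

def scale_parallel(xp, values, factor):
    # GPU-style thinking: a single whole-array operation the hardware can
    # parallelize. `xp` is either the numpy or the cupy module.
    return xp.multiply(values, factor)

data = np.random.rand(1_000_000)
out_cpu = scale_parallel(np, data, 2.0)

try:
    import cupy as cp                          # optional GPU array library
    out_gpu = scale_parallel(cp, cp.asarray(data), 2.0)
except ImportError:
    pass  # no GPU stack installed; the CPU path above still works
```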

Similarly, more enterprises will start to build their data architectures around GPU technology, Tom Davenport, founder of the Portland, Ore.-based International Institute for Analytics, said in a webcast. He said more businesses are looking at ways to implement image-recognition technology, which is built on deep learning models. Larger companies, like Google, Uber and Tesla, are also pushing forward with autonomous vehicles, which lean heavily on deep learning. The growing role of these types of tasks will increase the demand for GPUs.

“I think we’re going to start seeing more computing architectures that focus specifically on GPUs,” Davenport said. “That’s been a part of the success of these deep learning models for image recognition and other applications.”

GPU technology could also reinforce the trend of the citizen data scientist, said Paul Pilotte, technical marketing manager at MathWorks Inc., based in Natick, Mass. He pointed out that Amazon this year added GPU-based instances to its Elastic Compute Cloud platform. Because the hardware is hosted, users don't need to be hardware experts, which lowers the barrier to entry and makes the technology available to a wider user base.

“The workflows for deep learning, prescriptive analytics and big data will become more accessible,” Pilotte said.
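For a sense of how low that barrier is, here is a hedged sketch assuming the boto3 AWS SDK, valid credentials and a placeholder AMI ID (none of which appear in the article): renting a GPU-backed EC2 instance, such as the P2 family Amazon introduced in 2016, is an API call rather than a hardware purchase.

```python
import boto3

# Request a single GPU-backed EC2 instance in us-east-1.
ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",    # placeholder: a deep learning AMI of your choice
    InstanceType="p2.xlarge",  # GPU instance type with one NVIDIA K80
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```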
