Microsoft launches Project Brainwave for deep learning acceleration in preview

Microsoft today announced that Project Brainwave, its system for running AI models on specialized chips, is now available in preview on Azure. Brainwave lets developers deploy machine learning models onto programmable silicon and achieve performance beyond what a CPU or GPU can deliver. Microsoft claims Project Brainwave makes Azure the fastest cloud for running real-time AI today.

Brainwave uses field-programmable gate array (FPGA) chips to achieve its speed and low latency. On Azure, Brainwave will run on Intel Stratix 10 FPGAs and support ResNet-50-based neural networks.
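For context, ResNet-50 is a widely used 50-layer image-classification network. The short sketch below (using the torchvision library, which is not part of Microsoft's announcement and is chosen here purely for illustration) shows the kind of model a Brainwave-backed instance is designed to serve.

```python
# A minimal sketch, not from the article: building the kind of ResNet-50
# model that Microsoft says Brainwave-backed Azure instances can serve.
# torchvision and the dummy-input check are illustrative assumptions only.
import torch
import torchvision.models as models

# Build a standard ResNet-50 image classifier (no pretrained weights).
model = models.resnet50()
model.eval()

# Run one dummy image through the network to confirm the input/output
# shapes a real-time serving system would need to handle.
dummy_input = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(dummy_input)
print(logits.shape)  # torch.Size([1, 1000])
```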

Microsoft first revealed its use of Brainwave for serving AI models last summer, and in March said the system is being used to make the AI that powers Bing search results 10 times faster.

The news was announced onstage at Build, Microsoft's annual developer conference, held May 7-9 at the Washington State Convention Center in Seattle.

Also announced today were updates to Azure Bot Service and IoT Edge, a Speech Devices SDK, and a $25 million initiative to support the creation of AI for people with disabilities. Amazon also joined Microsoft onstage to demonstrate how Cortana and Alexa will work together.

