Microsoft’s drive into all things cloud-related continues with the news it has developed a deep learning acceleration platform, codenamed Project Brainwave.
The development is noteworthy as it is rare for Microsoft to make a hardware-focused move, but the company believes its own acceleration platform can deliver real-time artificial intelligence (AI) in the cloud for customers.
Microsoft has steadily been ramping up its AI capabilities of late. Earlier this year, for example, it acquired two AI startups (Agolo and Bonsai), and last year Microsoft Ventures invested in Element AI, an AI research lab and incubator based in Montreal, using a new fund earmarked for AI investments.
Project Brainwave is touted as a “major leap forward in both performance and flexibility for cloud-based serving of deep learning models” and was demonstrated using Intel’s new 14 nm Stratix 10 FPGA.
“We designed the system for real-time AI, which means the system processes requests as fast as it receives them, with ultra-low latency,” blogged Doug Burger, an engineer at Microsoft. “Real-time AI is becoming increasingly important as cloud infrastructures process live data streams, whether they be search queries, videos, sensor streams, or interactions with users.”
Project Brainwave is built from three main layers: a high-performance, distributed system architecture; a hardware DNN engine synthesised onto FPGAs (field-programmable gate arrays); and a compiler and runtime for easy deployment of trained models.
The platform builds on the FPGA infrastructure that Microsoft has been deploying over the past few years. By attaching FPGAs directly to its data centre network, Microsoft can offer deep neural networks (DNNs) as hardware microservices.
This system architecture reduces latency, since the CPU does not need to process incoming requests, and allows for very high throughput, with the FPGA processing requests as fast as the network can stream them.
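The same principle underpins what Burger calls batch-free execution: requests are served the moment they arrive rather than being held back to fill a batch. A rough illustration (hypothetical, not Microsoft's code) of why that cuts latency:

```python
# Hypothetical sketch, not Microsoft's code: why serving requests as fast as
# they arrive ("batch-free") cuts latency versus waiting to fill a batch.

def batched_latency(arrival_times, batch_size, inference_time):
    """Latency per request when requests wait for a full batch.

    A batch starts only once its last member has arrived; every request
    in the batch then finishes inference_time later.
    """
    latencies = []
    for i, t in enumerate(arrival_times):
        last = min(((i // batch_size) + 1) * batch_size - 1,
                   len(arrival_times) - 1)
        batch_start = arrival_times[last]
        latencies.append(batch_start + inference_time - t)
    return latencies

def batch_free_latency(arrival_times, inference_time):
    """Latency per request when each request is processed on arrival."""
    return [inference_time] * len(arrival_times)

arrivals = [0.0, 1.0, 2.0, 3.0]           # one request per time unit
print(batched_latency(arrivals, 4, 1.0))  # -> [4.0, 3.0, 2.0, 1.0]
print(batch_free_latency(arrivals, 1.0))  # -> [1.0, 1.0, 1.0, 1.0]
```

Under these toy assumptions, the first request in a batch of four waits four time units while the batch-free path serves every request in one; larger batches would widen the gap further.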
The second element of Project Brainwave is a powerful “soft” DNN processing unit (DPU), synthesised onto commercially available FPGAs. Where rivals have built hard-coded DPU chips, Microsoft says its design is scalable across a range of data types, allowing it to offer “performance comparable to – or greater than – many of these hard-coded DPU chips.”
The third component is a software stack supporting a wide range of popular deep learning frameworks. It already supports Microsoft Cognitive Toolkit and Google’s TensorFlow, and Microsoft plans to support many others.
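The article does not detail how that compiler works, but the general shape of such a deploy step, taking a trained model's operator graph and lowering it to a flat, dependency-ordered instruction stream for an accelerator runtime, can be sketched as follows (all names here are illustrative):

```python
# Hypothetical sketch, not the Brainwave toolchain: lowering a trained
# model's operator graph into a flat, dependency-ordered instruction
# stream that an accelerator runtime could consume.

def lower_graph(graph, output):
    """Return ops in dependency order (post-order DFS) ending at `output`."""
    program, visited = [], set()

    def visit(node):
        if node in visited:
            return
        visited.add(node)
        for dep in graph.get(node, []):  # an op's inputs are emitted first
            visit(dep)
        program.append(node)

    visit(output)
    return program

# Toy two-layer DNN, expressed as op -> list of inputs.
toy_model = {
    "matmul1": ["input", "w1"],
    "relu1":   ["matmul1"],
    "matmul2": ["relu1", "w2"],
    "softmax": ["matmul2"],
}
print(lower_graph(toy_model, "softmax"))
# -> ['input', 'w1', 'matmul1', 'relu1', 'w2', 'matmul2', 'softmax']
```

A real toolchain would additionally map each op to hardware instructions and quantise weights for the FPGA's data types; the ordering pass above is just the framework-agnostic skeleton that lets one runtime serve models trained in CNTK, TensorFlow, or other frameworks.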
“We architected this system to show high actual performance across a wide range of complex models, with batch-free execution,” wrote Burger. “We are working to bring this powerful, real-time AI system to users in Azure, so that our customers can benefit from Project Brainwave directly, complementing the indirect access through our services such as Bing.”
Microsoft has stated for a while now that AI is the future and has doubled down with a fund to invest in AI companies focused on “inclusive growth and positive impact on society.”
CEO Satya Nadella’s relentless drive to the cloud has also seen him champion machine learning and AI as core components of the future of digital transformation in businesses.
Microsoft was also one of the founding members of a non-profit AI partnership with other tech giants including Google, Amazon and Facebook.