Elon Musk's OpenAI ties up with Microsoft for cloud computing
OpenAI, a non-profit artificial intelligence (AI) research organisation backed by Tesla Motors chief executive Elon Musk, has tied up with Microsoft Corp's flagship cloud service, Azure.
"This will make Azure the primary cloud platform that OpenAI is using for deep learning and AI, and will let us conduct more research and share the results with the world," OpenAI's co-founders Greg Brockman, Ilya Sutskever and Sam Altman, wrote in a blog post on Tuesday.
OpenAI has been an early adopter of Microsoft's Azure N-Series virtual machines. These virtual machines are powered by Nvidia Corp's graphics processing units (GPUs) and give customers and developers access to accelerated computing and visualisation. "Azure has impressed us by building hardware configurations optimized for deep learning — they offer K80 GPUs with InfiniBand interconnects at scale," the co-founders said.
The Azure N-Series will be made available in South Central US, East US, West Europe and South East Asia on 1 December, Corey Sanders, director of compute at Azure, said in a blog post on Tuesday.
The N-Series is designed for computationally intensive AI workloads such as deep learning, simulation and neural network training.
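To illustrate the kind of workload this refers to, the following is a minimal, hypothetical sketch of a neural-network training loop written in PyTorch. It is not from the article or from OpenAI; it simply assumes a CUDA-capable GPU, such as the K80s mentioned above, is attached to the virtual machine.

```python
# Hypothetical sketch of a GPU-backed training step (not from the article).
# Assumes PyTorch is installed and a CUDA-capable GPU (e.g. a Tesla K80) is visible.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
print("Training on:", torch.cuda.get_device_name(0) if device == "cuda" else "CPU")

# A tiny two-layer network and random data, just to exercise the GPU.
model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(512, 1024, device=device)
targets = torch.randint(0, 10, (512,), device=device)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()

print("final loss:", loss.item())
```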
OpenAI was established in December 2015 by Musk, Brockman, Sutskever and Altman to develop safe AI applications that can be widely distributed.
Microsoft has partnered with Nvidia to provide enhanced GPU capabilities via Azure to its customers. Esri, which provides geographic information system (GIS) software, and Jellyfish Pictures, an animation and VFX studio, are among the other adopters of Azure.
GPU-powered computing is used to accelerate applications in deep learning, analytics and engineering. The single-chip processor was introduced by Nvidia, which designs graphics processors for the gaming market and develops chipsets for mobile computing and the automobile market.
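As a hedged illustration of that acceleration, and not something drawn from the article, the sketch below times the same large matrix multiplication on the CPU and on a CUDA GPU using PyTorch; the exact speed-up depends on the hardware.

```python
# Illustrative sketch: compare CPU and GPU time for one large matrix multiply.
# Assumes PyTorch is installed; the GPU path runs only if a CUDA device is present.
import time
import torch

n = 4096
a_cpu = torch.randn(n, n)
b_cpu = torch.randn(n, n)

start = time.time()
a_cpu @ b_cpu
print(f"CPU matmul: {time.time() - start:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a_cpu.cuda(), b_cpu.cuda()
    torch.cuda.synchronize()   # make sure the copies have finished
    start = time.time()
    a_gpu @ b_gpu
    torch.cuda.synchronize()   # wait for the kernel before stopping the clock
    print(f"GPU matmul: {time.time() - start:.3f} s")
```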
Microsoft has also launched the Azure Bot Service, a serverless environment in which developers can build, deploy and manage bots. "Run on Azure Functions, these bots can scale on demand and you only pay for the resources your bots consume," Microsoft said in a statement.
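For a sense of what such a bot looks like in code, here is a minimal, hypothetical echo-bot sketch using the open-source Bot Framework SDK for Python (botbuilder-core). The Azure Bot Service supplies the serverless hosting and channel connectors around a handler like this; that glue, and the class name EchoBot, are assumptions for illustration only.

```python
# Minimal echo-bot sketch (illustrative, not the service's own template).
# Requires: pip install botbuilder-core
from botbuilder.core import ActivityHandler, TurnContext


class EchoBot(ActivityHandler):
    # Called by the Bot Framework whenever a user sends a message.
    async def on_message_activity(self, turn_context: TurnContext):
        # Echo the incoming text back to the user.
        await turn_context.send_activity(f"You said: {turn_context.activity.text}")
```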