Azure Batch AI

Batch AI At Scale with Azure

Azure Batch AI helps you experiment in parallel with your AI models using any framework, then train them at scale across clustered GPUs. Simply describe your job requirements and the configuration to run, and we’ll handle the rest.

A.I. On Azure

Easy Deployment and Flexibility
Get started in no time by leaving the plumbing and resource management to Azure Batch AI. Package your code in Docker containers for a consistent runtime. We’ll deploy virtual machines, containers, and wire up any shared storage and SSH connections. Batch AI provides a flexible programming model and SDK so you can easily integrate your own pipeline and workflow.
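As a sketch of what “describe your job requirements” looks like, here is a hedged example of a Batch AI job description in JSON. The field names follow the service’s job-configuration schema as we understand it, but treat them as assumptions; the cluster ID, image, and script path are hypothetical placeholders you would replace with your own.

```json
{
  "properties": {
    "cluster": { "id": "<your-cluster-resource-id>" },
    "nodeCount": 2,
    "containerSettings": {
      "imageSourceRegistry": { "image": "tensorflow/tensorflow:latest-gpu" }
    },
    "tensorFlowSettings": {
      "pythonScriptFilePath": "$AZ_BATCHAI_MOUNT_ROOT/scripts/train.py",
      "masterCommandLineArgs": "--epochs 10"
    },
    "stdOutErrPathPrefix": "$AZ_BATCHAI_MOUNT_ROOT/external"
  }
}
```

You hand a description like this to the service, and Batch AI provisions the VMs, pulls the container, mounts the shared storage, and runs the script.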
High Performance Training
Microsoft Azure provides you with leading-edge NVIDIA GPUs connected with InfiniBand to scale to the largest models and training data. The same powerful infrastructure Microsoft uses for its AI development is now available to you on demand.

Supports Any Framework
Use any AI framework on Linux or Windows. We’ve got great support for CNTK, TensorFlow, Chainer, and more. Or bring us your code in a Docker container and we’ll handle the rest.

Work With Your Tools
Get started with notebooks, script the service using our Python library, and integrate workflows with the REST API and SDKs for C#, Java, and other languages. From GUI to script to code, we’ve got you covered.
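To give a feel for the REST API, here is a minimal Python sketch that builds the Azure Resource Manager URL for a Batch AI job and fetches its state. The URL follows the standard ARM provider pattern; the `Microsoft.BatchAI` namespace, the `api-version` value, and all names passed in are assumptions or placeholders, and you would supply a real Azure AD bearer token.

```python
def batchai_job_url(subscription, resource_group, job_name,
                    api_version="2018-05-01"):
    """Build the ARM REST URL for a Batch AI job resource.

    Follows the generic ARM scheme:
    /subscriptions/{sub}/resourceGroups/{rg}/providers/{ns}/jobs/{name}
    The provider namespace and api-version are assumptions.
    """
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.BatchAI"
        f"/jobs/{job_name}"
        f"?api-version={api_version}"
    )


def get_job(token, subscription, resource_group, job_name):
    """Fetch a job's JSON description via the REST API."""
    # Imported here so the URL helper above stays dependency-free;
    # requires the third-party `requests` package.
    import requests

    url = batchai_job_url(subscription, resource_group, job_name)
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()
```

The same URL-building logic applies whether you call the API from Python, C#, or Java; the SDKs wrap these endpoints for you.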

Open For Experimentation
Developing AI is all about experimentation. Easily scale to hundreds or thousands of GPUs and pay only for what you use. Run more experiments, complete training runs faster, and process more data. Clone models to save time in training. Monitor logs and real-time output with your favorite tools.

Anywhere & Anytime
Extend your global reach with the cloud service that offers more countries and regions than any other provider. Azure runs on a worldwide network of Microsoft-managed datacenters across 38 announced regions.