Using Azure Machine Learning service, you can train a model on a Spark-based distributed platform (Azure Databricks) and serve the trained model (pipeline) on Azure Container Instances (ACI) or Azure Kubernetes Service (AKS).
In this post, I walk you through these steps and the background using the Azure Machine Learning (AML) Python SDK.
In this post, I proceed to more advanced topics, showing you how to set up (customize) Azure Machine Learning Compute (AmlCompute) for practical training. In the last part of this post, I'll show you an Apache MXNet distributed training example with Azure Machine Learning service.
In this post, I show you the top 7 key benefits of Azure Machine Learning service with programming code, following the development lifecycle.
In this post I introduce NVIDIA TensorRT for Azure engineers (data scientists).
Azure Machine Learning Hardware Accelerated Models (Project Brainwave) provides hardware accelerated machine learning with FPGA.
The GitHub tutorial provides several useful helper classes and functions (in Python) that encapsulate the boilerplate code for the provisioning steps. In this post, I show you the same steps without these helpers, which I hope will help you understand the new FPGA-enabled services and how they work.
Here I show you TensorFlowOnSpark on Azure Databricks. With this tutorial, you can learn how to use Azure Databricks across its lifecycle: cluster management, notebook-based analytics, working with external libraries, integrating with surrounding Azure services, submitting a job for production, and so on.
In this post, we take a quick look at how to run neural network workloads (deep learning workloads) in SQL Server.