
I have a fairly simple Unix shell script packaged up in an Alpine Linux Docker container hosted on an Azure container registry. A VM runs this script with cron:

docker login <snip>
docker pull example.com/bar:latest
docker run example.com/bar:latest

Can I do without the VM and use Azure services instead, perhaps with some sort of scheduler running this on an Azure Container Instance?

My motivation is not wanting to maintain and pay for the VM.

Sijmen Mulder

4 Answers


Azure Container Instances (ACI) may be a good option as you suggest. These let you run a container directly on Azure, without having to manage a VM, with per-second billing for the time the container is used.

Although one of the demos on that blog mentions Kubernetes, the idea of ACI is that you can create a container through the Azure CLI with az container create, just like on your local workstation with docker create.

To create the container, you can use Azure CLI (az command, see quick start docs) or Azure Cloud Shell.
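As a rough sketch of what that CLI call might look like (the resource group name, location, and credential variables here are illustrative, not from the question):

```shell
# Hypothetical one-shot run of the container on ACI.
# my-rg, bar-job, westeurope, and the $ACR_* variables are assumptions.
az container create \
  --resource-group my-rg \
  --name bar-job \
  --image example.com/bar:latest \
  --registry-username "$ACR_USER" \
  --registry-password "$ACR_PASS" \
  --restart-policy Never \
  --location westeurope
```

With --restart-policy Never the container runs to completion once and is not restarted, which matches a cron-style batch job.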

You would need to create/run the container on a schedule from somewhere else - Azure Functions might be a good place to run the "container create" command from a scheduled function. This supports bash, PowerShell, and other languages - all running on Windows.
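For illustration, a timer-triggered function's schedule lives in its function.json binding (the binding name here is illustrative). Note that Azure Functions uses six-field NCRONTAB expressions, with a seconds field first, so "run daily at 21:00" could look like:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 21 * * *"
    }
  ]
}
```

The function body would then run the "container create" command shown above.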

If you want to keep using Docker containers without running VMs or learning Kubernetes, this might be a good option.

Alternatively, you could move all your code into Azure Functions, but that's a bigger decision.

Update, Jan 2019: Azure Logic Apps can also be used to run scheduled tasks. Logic Apps replaced Azure Scheduler, which was retired in January 2022.

RichVel
  • I wasn't aware I could use PowerShell or Bash for Azure Functions! Thanks – Sijmen Mulder Jul 30 '17 at 12:12
  • Yes, you can run any executable from Azure Functions, and bash is explicitly mentioned in this overview doc. To call PowerShell modules, see this blog. – RichVel Jul 31 '17 at 05:51
  • It’s common to use cloud functions such as Azure Functions, AWS Lambda etc to run installation scripts that configure cloud services (e.g., set up a new cloud environment). So you can expect all serverless services and frameworks to support running bash or similar. Traditionally the first VM you would set up was a “control host” server to use as the place to run all setup of all environments. Going serverless to run all such scripts means no host to pay for. A control server, if hacked, leaks a map of your environments, old scripts and possibly cached passwords. Serverless bash is more secure. – simbo1905 Jan 04 '19 at 06:55
  • It's not the case that all FaaS (serverless) services support bash. In fact, AWS Lambda only supports bash via custom Layers, a feature added in late 2018 - you can use this open source layer to simplify running bash. Generally, serverless/FaaS services support specific languages, with some providers enabling ways to extend this - for example, AWS has Layers, and Google has a FaaS service that allows any Docker container to run. – RichVel Feb 01 '20 at 08:26
  • For simplicity if you don't want to use Layers - you can write a Node/Python Lambda that runs bash, without using Layers - see lambdash for one example. – RichVel Feb 01 '20 at 08:33

For an alternative approach, I would investigate Azure Functions:

  • No VM continually running.


A scheduled Azure DevOps pipeline is an easy and free way to run Azure CLI tasks on a recurring basis.

https://docs.microsoft.com/en-us/azure/devops/pipelines/process/scheduled-triggers?view=azure-devops&tabs=yaml
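A minimal sketch of such a pipeline, assuming the registry credentials are stored as pipeline variables (the variable names and branch are illustrative):

```yaml
# Illustrative azure-pipelines.yml: run the container daily at 21:00 UTC.
schedules:
- cron: "0 21 * * *"
  displayName: Nightly container run
  branches:
    include:
    - main
  always: true   # run even when there are no new commits

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    docker login example.com -u $(acrUser) -p $(acrPassword)
    docker pull example.com/bar:latest
    docker run example.com/bar:latest
  displayName: Run container
```

The `always: true` setting matters for this use case, since by default scheduled pipelines only run when the source has changed.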

Nathan
  • ADO pipelines are for deploying source code -- like when you create a new ADO pipeline it wants to know what branch to deploy and where... can you really unmarry it from all that and just use it to execute arbitrary bash commands on a schedule for free? -- I didn't see anything in that article that indicated this was the case. – BrainSlugs83 Feb 18 '21 at 00:23
  • Yes, just add a script task to your pipeline. (You can use an inline script with an empty repo, or you can add a script file to your repo which the script task points to.) – Nathan Feb 19 '21 at 20:42
  • (You can even reference your container as part of the pipeline definition and run it that way if needed) - See here – Nathan Feb 19 '21 at 20:47

I haven't done this myself, but I wonder if an Azure Container Registry (ACR) task could be used to trigger a container on a schedule?

This ACR tutorial seems to suggest that you can schedule a container run as follows:

az acr task create --name timertask --registry $ACR_NAME --cmd mcr.microsoft.com/hello-world --schedule "0 21 * * *" --context /dev/null
daviewales