I am using the AzCopy tool to copy one storage account to another. The requirement is that the customer wants the copy done via some other Azure service, such as an Azure Automation runbook, but not a Hybrid Runbook Worker. I built a Docker image with azcopy and am running it on Azure App Service with the Docker container option. The issue is that the command executes repeatedly, once every 5 minutes. How can I avoid this? Any suggestions?
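For reference, a minimal sketch of the kind of account-to-account copy involved (account names and SAS tokens are placeholders):

    azcopy copy "https://srcaccount.blob.core.windows.net/?<src-SAS>" "https://dstaccount.blob.core.windows.net/?<dst-SAS>" --recursive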
Related
I am using Azure Container Apps to host our Containers. I want one instance of an Azure Container App to host two different containers (these are services that share a similar lifecycle). This is possible to do manually via the Azure portal, but I cannot see a way to automate it via the Azure CLI.
I have been using the az containerapp up CLI command to create an App with a single container.
https://learn.microsoft.com/en-us/cli/azure/containerapp?view=azure-cli-latest#az-containerapp-up
But is there a way with the CLI to create an app that manages 2 containers?
Thanks
The az containerapp compose create command lets you specify a Compose file.
https://learn.microsoft.com/en-us/cli/azure/containerapp/compose?view=azure-cli-latest
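A minimal sketch of the usage, assuming a Compose file that defines the two services (names and images are placeholders):

    # docker-compose.yml
    services:
      api:
        image: myregistry.azurecr.io/api:latest
      worker:
        image: myregistry.azurecr.io/worker:latest

    # point the CLI at the Compose file (resource group and environment are assumed to exist)
    az containerapp compose create --resource-group myRG --environment myEnv --compose-file-path docker-compose.yml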
As a part of my testing harness, I'd like to deploy to an Azure Storage Emulator container.
For production releases, I'll use an Azure CLI release task with this command:
call az storage blob sync -s %ReleaseDirectory% -c %ReleaseName% --account-name %AccountName%
This works fine.
The trouble starts when I attempt to create a service connection to the emulator for my testing environment. When creating such a connection, we get this dialog:
See the problem? Subscription ID... Tenant ID... SPN info... none of these exist for an emulator instance. Apparently we can only create a connection to a full-blown Azure Storage account. There doesn't seem to be provision for connecting to an emulator.
Is there another way? How can I create a service connection to an Azure Storage Emulator so that I can use it in Azure-related pipeline tasks?
There is no service connection type for the Azure Storage Emulator in Azure DevOps. The screenshot above shows the service connection for Azure resources in the cloud.
If the Azure Storage Emulator is installed on a local machine, the cloud-hosted agents will not be able to access it. You will have to create a self-hosted agent (on the same local machine) and run your pipeline on that agent, so that your pipeline can access the Azure Storage Emulator.
Follow the steps here to create a self-hosted agent.
If the Azure Storage Emulator and your self-hosted agent are installed on different machines, you can add an SSH service connection to the machine that hosts the Azure Storage Emulator. Then you can use the SSH task or the PowerShell on Target Machines task to run scripts on the remote machine, as in the sketch below.
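A hedged sketch of such a step, assuming an SSH service connection named 'emulator-host' and that the Azure CLI is installed on the remote machine:

    # pipeline YAML: run the sync on the machine that hosts the emulator
    - task: SSH@0
      inputs:
        sshEndpoint: 'emulator-host'
        runOptions: 'inline'
        inline: |
          az storage blob sync -s ~/drop -c release --connection-string "UseDevelopmentStorage=true"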
If the Azure Storage Emulator container image is hosted in Azure Container Registry, you can add a Docker Registry service connection so that you can use the Docker task to run the Azure Storage Emulator container on the agent machine. See below:
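A sketch, assuming the emulator image is Azurite (mcr.microsoft.com/azure-storage/azurite) and the agent reaches it on localhost:

    # start the emulator's blob endpoint on the agent machine
    docker run -d -p 10000:10000 mcr.microsoft.com/azure-storage/azurite
    # create the target container, then sync against the well-known development-storage endpoint
    az storage container create -n release --connection-string "UseDevelopmentStorage=true"
    az storage blob sync -s ./drop -c release --connection-string "UseDevelopmentStorage=true"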
I have created one Node.js and one Java Azure Function using VS Code.
When I deployed just these two functions to Azure using VS Code, I ended up with this many Azure resources (see the picture below).
Is there a way to re-use the same resource type (i.e. App Service, Storage Account, etc..) to host multiple Azure Functions?
Absolutely, we do this all the time. We usually create resources in the portal by hand the first time, and later use deployment scripts (MSDeploy or PowerShell) to update the resources.
When you create a new Function App in the portal, you can tell Azure to put the new Function App in an existing App Service Plan and also set it to use existing storage:
You can also do this using the Azure CLI if you are a CLI person; see the sketch below.
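For example, a sketch that adds a second Function App to an existing plan and an existing storage account (all names are placeholders):

    az functionapp create --name my-second-func --resource-group myRG --plan myExistingPlan --storage-account myexistingstorage --runtime node --functions-version 4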
I am trying to create a PowerShell runbook to automate uploading files from my local machine and network drives to an Azure storage account. I have tried doing this with Set-AzureStorageFileContent; however, I get an error that the directory does not exist.
Any help would be greatly received.
Thank you.
Runbooks run on the Azure cloud platform, which does not know about (and cannot communicate with) your local storage.
I suggest using the Hybrid Runbook Worker feature of Azure Automation to run runbooks directly on your computer, where they can access those local resources; see the sketch below.
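A minimal sketch of such a runbook step, assuming it executes on a Hybrid Runbook Worker that can see C:\Data (account name, key, and share name are placeholders):

    # runs on the Hybrid Runbook Worker, so local paths are visible
    $ctx = New-AzureStorageContext -StorageAccountName "mystorage" -StorageAccountKey $key
    # upload a local file into an Azure file share
    Set-AzureStorageFileContent -ShareName "myshare" -Source "C:\Data\report.csv" -Path "report.csv" -Context $ctx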
Is this even possible? I have a couple of web apps and a couple of Azure Functions running under the same App Service plan. I'd like to (ideally) have them use a specific storage plan, so I can keep everything in one place. I envision them in different containers under the same plan.
If that's not possible...then where are the files? Are they on the storage that's built into the App Service Plan itself? If so, can I connect to this somehow, so I can manage the files through something like Storage Explorer?
Today, while playing with the Azure Az PowerShell module, I found I was able to provision a Function App without an Azure Storage back-end. This cannot be done via the UI; an easy way to provision a Function App with a storage account back-end is to use the Azure portal.
When a Function App is provisioned via the command line, the bits seem to be stored within the Function App itself. There is an FTP URL in the publish profile you can download, and the files can be read and written using an FTP tool like WinSCP (as an alternative to Kudu).
I'd like to (ideally) have them use a specific Storage plan, so I can keep everything in one place. I envision them in different containers under the same plan. If that's not possible...then where are the files?
Every Azure Web App has a home directory stored/backed by Azure Storage. For more detail, please refer to the Azure Web App sandbox documentation. That storage is owned by the App Service; we are currently not able to choose our own Azure Storage account to back a Web App, but we can configure a storage account for Web App diagnostic logs.
Are they on the storage that's built into the App Service Plan itself? If so, can I connect to this somehow, so I can manage the files through something like Storage Explorer?
Different App Service plans come with different amounts of storage. We can use the Kudu tool (https://yoursite.scm.azurewebsites.net) to manage the files. For more detail about Kudu, please refer to its documentation.
Update:
We can access the home directory with the Kudu tool. For more detail, please refer to the screenshot.
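Kudu also exposes the home directory over a REST API; a sketch using its VFS endpoint with basic-auth deployment credentials (site name and password are placeholders):

    # list the files under the web root of the app's home directory
    curl -u '$yoursite:<deployment-password>' https://yoursite.scm.azurewebsites.net/api/vfs/site/wwwroot/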