I have this task in Azure DevOps: I want to copy a text file from blob storage to a VM.
- task: AzureFileCopy@4
  inputs:
    sourcePath: 'https://storagename.blob.core.windows.net/container/file.txt'
    azureSubscription: 'subscription connection'
    storage: 'a_storage_in_subscription'
    resourceGroup: $(rgName_of_VM)
    destination: 'azureVMs'
    MachineNames: $(ipofVM)
    vmsAdminUserName: $(adminUsername)
    vmsAdminPassword: $(adminPassword)
    targetPath: 'c:\files'
But it fails with Upload to container: '8e107770-69d8-xxx' in storage account: 'a_storage_in_subscription' with blob prefix: '' failed with error: 'AzCopy.exe exited with non-zero exit code while uploading files to blob storage.' For more info please refer to https://aka.ms/azurefilecopyreadme
My understanding is that the task first copies the file into a container in the storage account named in the 'storage' field (a GUID is used as the container name). That part succeeded, but then the error happened. What am I doing wrong?
If you re-run the failed pipeline with system.debug set to true, you will see another message that explains in more detail why it failed: failed to perform copy command due to error: cannot start job due to error: cannot scan the path \\?\D:\a\1\s\https://storagename.blob.core.windows.net/container/file.txt.
Now you can see why you got that error: the Azure File Copy task does not support an HTTPS URL in sourcePath. The value must be a local file or folder path on the build agent.
Since HTTPS URLs are not supported here, as a workaround you can first download the file into the build working directory with an Azure CLI command, and then upload it to the Azure VM:
- task: AzureCLI@1
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: {subscription}
    scriptLocation: inlineScript
    inlineScript: |
      mkdir $(Build.SourcesDirectory)/File
      az storage blob download --container-name {container name} --file $(Build.SourcesDirectory)/File/{file name} --name {file name} --account-key $(accountkey) --account-name {storage account name}
- task: AzureFileCopy@4
  displayName: 'AzureVMs File Copy'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/File'
    azureSubscription: {subscription}
    Destination: AzureVMs
    storage: {storage}
    resourceGroup: '{resource group}'
    vmsAdminUserName: {login name}
    vmsAdminPassword: {login password}
    TargetPath: 'xxx'
Note: you can get the account key from the storage account's Access keys blade in the Azure portal.
With this in place, the file upload to the Azure VM succeeds.
Related
I am trying to copy test.xml from Azure blob storage to c:\data\test.xml on a local computer in an Azure DevOps pipeline. I want to do this as a pipeline task, so I am assuming a PowerShell script with a PAT token passed in. Per various articles, I could use azcopy, but I want to avoid that installation and simply copy the file.
I was able to solve it like this using an Azure CLI task.
- task: AzureCLI@2
  displayName: 'Download Keys From Blob'
  inputs:
    azureSubscription: 'xxxxxx'
    scriptType: 'ps'
    scriptLocation: 'inlineScript'
    inlineScript: 'az storage blob download --file "test.xml" --name "test.xml" --container-name keys --connection-string "$(ConnectionStringInetpubKeys)"'
    workingDirectory: 'C:\inetpub\keys'
I have more than 100 web app services in Azure. I want to deploy packages to all 100 web apps with one pipeline YAML file, but I couldn't find any documentation for this. The one Microsoft document I found suggests adding more pipeline steps: with 100 web app services, I would have to add 100 deployment steps. That is not efficient and is time-consuming. I want a single step, just like this:
- task: AzureWebApp@1
  displayName: 'Azure Web App Deploy'
  inputs:
    azureSubscription: '$(Parameters.connectedServiceName)'
    appType: webApp
    ResourceGroupName: $(group)
    appName: 'JustGoTestAgain, justgotesttwo, webapp123, webapp555, webapp777 and so on ........'
    package: '$(build.artifactstagingdirectory)/**/*.zip'
This YAML shows an error, and I couldn't find any extension or Azure PowerShell deployment command to fix it. How can I solve this?
You will not be able to do it like this. However, you can use the Azure CLI task:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: '$(Parameters.connectedServiceName)'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      $apps = @('JustGoTestAgain', 'justgotesttwo', 'webapp123', 'webapp555', 'webapp777') # and so on
      foreach ($app in $apps) {
        az webapp deployment source config-zip -g $(group) -n $app --src '$(build.artifactstagingdirectory)/SOME_FOLDER/Artifact.zip'
      }
And here you have more details about the deployment itself.
Another approach, with multiple tasks but continuing if one fails, is:
parameters:
- name: apps
  type: object
  default:
  - JustGoTestAgain
  - justgotesttwo
  - and so on

steps:
- ${{ each app in parameters.apps }}:
  - task: AzureWebApp@1
    displayName: 'Azure Web App Deploy ${{ app }}'
    continueOnError: true
    inputs:
      azureSubscription: '$(Parameters.connectedServiceName)'
      appType: webApp
      ResourceGroupName: $(group)
      appName: ${{ app }}
      package: '$(build.artifactstagingdirectory)/**/*.zip'
There was an issue with a space; now it's fine. Apart from that, there is only one issue, with connectedServiceName:
Job Job: Step input azureSubscription references service connection $(Parameters.connectedServiceName) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
Which I skipped here, as you already have it working in your solution.
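If that error does appear, a common cause is that a macro variable like $(Parameters.connectedServiceName) cannot be resolved when the pipeline is authorized. One sketch of a fix (the parameter name and connection name here are assumptions, not from the original pipeline) is to pass the service connection as a compile-time template parameter instead:

```yaml
parameters:
- name: serviceConnection
  type: string
  default: 'my-azure-service-connection'  # assumed name; replace with your actual service connection

steps:
- task: AzureWebApp@1
  displayName: 'Azure Web App Deploy'
  inputs:
    # ${{ }} is expanded at compile time, so the authorization check can resolve the connection
    azureSubscription: ${{ parameters.serviceConnection }}
    appType: webApp
    ResourceGroupName: $(group)
    appName: JustGoTestAgain
    package: '$(build.artifactstagingdirectory)/**/*.zip'
```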
Using AzureFileCopy@4 to copy files from the source to the $web container throws the error below. The documentation says, "If you are deploying to Azure Static Websites as a container in blob storage, you must use Version 2 or higher of the task in order to preserve the $web container name," but it still does not seem to work.
##[error]Container name '$web' is invalid. Valid names start and end with a lower case letter or a number and has in between a lower case letter, number or dash with no consecutive dashes and is 3 through 63 characters long.
Here is the code for the task:
- task: AzureFileCopy@4
  displayName: "AzureBlob File Copy"
  inputs:
    SourcePath: "$(System.DefaultWorkingDirectory)/src"
    azureSubscription: ${{ parameters.AzureConnection }}
    Destination: AzureBlob
    storage: $(TF_VAR_CDN_STORAGE_ACCOUNT_NAME)
    ContainerName: $(TF_VAR_BLOB_STORAGE_CONTAINER_NAME)
I would really appreciate it if anyone could help me with this issue.
Thanks!
I was able to get it working by assigning the $web container name to a variable inside the script. This prevents it from being read as a token.
- task: AzureCLI@2
  displayName: Azure CLI File Copy
  inputs:
    azureSubscription: {subscription}
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      $Container = '$web'
      az storage copy -s $(System.ArtifactsDirectory)/{your files}/*.* --destination-account-name {accountname} --destination-container $Container --recursive
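The single quotes are what make this work: in PowerShell, a double-quoted string expands $web as a variable, while a single-quoted string keeps it literal. POSIX shells behave the same way, which a quick sketch can show:

```shell
# Double quotes expand $web as a (here unset) variable; single quotes keep it literal.
echo "container: $web"   # the variable expands to an empty string
echo 'container: $web'   # prints: container: $web
```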
I'm trying to update Content Type property of a file located in my Azure $web blob container using Azure CLI. I'm trying to do this because it's wrongly set and my service-worker.js (for my PWA) needs to have "application/javascript" instead of "text/plain; charset=utf-8" in order to be registered.
When using the Azure CLI, you should use az storage blob update.
The sample code:
az storage blob update --container-name xxx --name xxx --account-key xxx --account-name xxx --content-type "application/javascript"
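To confirm the change afterwards (same xxx placeholders as above), you can read the property back with az storage blob show; the content type lives under properties.contentSettings in the blob's JSON:

```
az storage blob show --container-name xxx --name xxx --account-key xxx --account-name xxx --query "properties.contentSettings.contentType" -o tsv
```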
I found a way of doing this using Azure Pipelines:
- task: AzureFileCopy@4
  displayName: "Azure Storage - copy new files"
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/dist/*'
    azureSubscription: 'johnykes-PAYG(44df419f-b455-4c4d-a8f8-c2a5fd479f10)'
    Destination: 'AzureBlob'
    storage: 'johnykeschatfrontend'
    ContainerName: '$web'

- task: AzureFileCopy@4
  displayName: "Azure Storage - overwrite .js files with correct Content Type"
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/dist/*.js'
    azureSubscription: 'johnykes-PAYG(44df419f-b455-4c4d-a8f8-c2a5fd479f10)'
    Destination: 'AzureBlob'
    storage: 'johnykeschatfrontend'
    ContainerName: '$web'
    AdditionalArgumentsForBlobCopy: '--content-type "application/javascript"'
Please let me know if you find a way to do this with the Azure CLI, or my way but recursively (for all .js files), or any other way.
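For the recursive case, one possible sketch (the {accountname} and $(accountkey) placeholders are assumptions, following the style of the earlier answers) is a single AzureCLI@2 step that lists all .js blobs with a JMESPath filter and updates each one's content type:

```yaml
- task: AzureCLI@2
  displayName: 'Fix Content-Type on all .js blobs'
  inputs:
    azureSubscription: {subscription}
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      $Container = '$web'
      # List the names of all .js blobs, then update each one's content type.
      $jsBlobs = az storage blob list --container-name $Container --account-name {accountname} --account-key $(accountkey) --query "[?ends_with(name, '.js')].name" -o tsv
      foreach ($blob in $jsBlobs) {
        az storage blob update --container-name $Container --name $blob --account-name {accountname} --account-key $(accountkey) --content-type "application/javascript"
      }
```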
I'm trying to do something I thought was super simple... I want to grab generated files (not built) and copy them to an Azure Blob storage
In my Build the last step of my azure-pipeline.yml look like this:
- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**\*'
    targetFolder: $(System.DefaultWorkingDirectory)/$(Release.PrimaryArtifactSourceAlias)/drop
    cleanTargetFolder: true
Then, in the Release, I have an Azure CLI step with the following inline code:
az storage blob upload-batch -s "$(System.DefaultWorkingDirectory)/$(Release.PrimaryArtifactSourceAlias)" -d '$web' --account-name frankdemo --account-key '_MYKEY_'
I've tried different combinations of paths, but nothing works...
Q: What should I put as targetFolder in my build and "-s" in my release?
You will need to add a step so the build publishes its artifacts:
steps:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Server'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: Server
Then, in your release, you can use the "Azure File Copy" task to copy from your release to blob storage.
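Putting it together, one sketch (the drop folder name and the assumption that the build stages into the artifact staging directory are mine, not from the original pipeline): have the build copy the generated files into the staging directory and publish them, then point the release's upload at the matching artifact path:

```yaml
# Build side: stage the generated files, then publish them as the "Server" artifact.
- task: CopyFiles@2
  displayName: 'Copy generated content'
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'
    contents: '**'
    targetFolder: '$(Build.ArtifactStagingDirectory)/drop'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: Server

# Release side: the artifact is downloaded under
# $(System.DefaultWorkingDirectory)/$(Release.PrimaryArtifactSourceAlias)/Server,
# so the inline Azure CLI script becomes:
# az storage blob upload-batch -s "$(System.DefaultWorkingDirectory)/$(Release.PrimaryArtifactSourceAlias)/Server/drop" -d '$web' --account-name frankdemo --account-key '_MYKEY_'
```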