Task to deploy artifact to a storage container outside of my account - Azure

I am currently creating a CI for the front end of one of our clients.
We need to copy the files coming from our repo to the storage container of the company that manages the operational part (we only provide the code).
So, the company that will manage the infrastructure has given us the storage account name (testdeploy), the container name (artifact-deply) and the key (securekey).
I have managed to connect to the storage via Azure Storage Explorer, but now I need to deploy the artifact to this container via the CI.
The problem is, I don't know how, and I can't find documentation on how to proceed; every doc talks about deploying to a container in the same subscription.
But I do not have access to this container; I only have its name and key.
Here is the YAML I have already set up; I do not know if it can help:
steps:
- task: AzureFileCopy@2
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_listes-Azure/buildtest'
    azureSubscription: 'Paiement à l''utilisation(my_subscription)'
    Destination: AzureBlob
    storage: testdeploy
    ContainerName: 'artifact-deploy/front'
    AdditionalArgumentsForBlobCopy: 'securekey'
    outputStorageUri: 'https://testdeply.blob.core.windows.net/'
    outputStorageContainerSasToken: 'securekey'
Of course, when I do this I get this error message:
2019-10-25T10:45:51.1809999Z ##[error]Storage account: fprplistesdeploy not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
Since it's not in my subscription scope, it can't access it.
What am I doing wrong?
I am using the AzureFileCopy task; is it the right one?
How can I set up the AzureFileCopy task to target a storage container that is not in my subscription scope, knowing that the only things I have are an account name and a key?
Thanks in advance!

What you basically have to do is create and use a Shared Access Signature (SAS) to deploy resources into this blob container. Since you have the storage account key, you can create a SAS token with Azure Storage Explorer.
Then use Azure Cloud Shell or the Azure CLI on a local machine for testing purposes. Try to copy a file into the blob container using a SAS token for authorization. If you have problems with authorization using a SAS token, you can also test access using Azure Storage Explorer. Such basic tasks are widely known and well documented.
Finally, find a way to run the file copy command you used while testing in an Azure Pipelines task. If the Azure File Copy task does not fit your use case, use a more generic task like the Azure CLI task; from reading over the docs, it may not support your use case even though the task name suggests it should. I see your point. Find out how to access the artifact produced by the build pipeline and copy the files into the storage account. If that basically works, find out how to improve it. Voilà.
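For illustration, a minimal sketch of what such an Azure CLI task could look like when authorizing with a SAS token; the service connection, variable names, and blob name below are placeholders rather than values from the question:

- task: AzureCLI@2
  displayName: 'Upload artifact with SAS token'
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder; only used to run the CLI, the SAS token authorizes the storage access
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # $(sasToken) would be a secret pipeline variable holding the SAS token for the container
      az storage blob upload \
        --account-name testdeploy \
        --container-name artifact-deploy \
        --name front.zip \
        --file "$(System.DefaultWorkingDirectory)/_listes-Azure/buildtest/front.zip" \
        --sas-token "$(sasToken)"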

So I managed to do it.
Turns out, you can't do it via Azure File Copy; this task can't upload to a container outside your subscription.
You must use an Azure CLI task; here is the script I used:
#!/bin/bash
az storage blob upload --container-name artifact --file $(System.DefaultWorkingDirectory)/artifact_deply/buildtest/front.zip --name front --account-key securekey
I changed all the variable names, but the idea is there (I declared the account name in the variable panel of Azure DevOps).
I used the account key because I had errors with the SAS URL, but I think you can easily use an Azure DevOps variable to pass the SAS token instead.
And I created a task before this one to zip the whole folder, so it's easier to manage.
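For reference, a sketch of how that script might sit inside an Azure CLI pipeline task; the service connection and variable names are placeholders, and note that az storage blob upload also needs the account name (for example via --account-name or the AZURE_STORAGE_ACCOUNT environment variable):

- task: AzureCLI@2
  displayName: 'Upload front.zip to the external storage account'
  inputs:
    azureSubscription: 'my-service-connection'   # placeholder; not the subscription that owns the target storage account
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      az storage blob upload \
        --account-name "$(storageAccountName)" \
        --container-name artifact \
        --name front \
        --file "$(System.DefaultWorkingDirectory)/artifact_deply/buildtest/front.zip" \
        --account-key "$(storageAccountKey)"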

Related

How to manipulate remote Terraform state files on Azure Blob storage

I'm working with a subscription that has a few different deployed environments (dev, test, staging, etc.). Each environment has its own storage account, containing an associated Terraform state file. These environments get deployed via Azure DevOps Pipelines.
It's easy enough to get at the .tfstate files that have been created this way, through the portal, CLI, etc.
But is it possible to access these state files using the 'terraform state' commands, for example using Azure Cloud Shell? If so, how do you point them at the right location?
I've tried using the terraform state commands in a Cloud Shell, but it's not clear how to point them to the right location or if this is indeed possible.
For this requirement, you can use the AzurePowerShell task.
1. First, if you can achieve your requirement via PowerShell in the Azure portal, then you can achieve the same thing with the AzurePowerShell task (AzurePowerShell runs on the agent using the service connection/service principal you provide):
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD' # This service connection relates to a service principal on the Azure side.
    ScriptType: 'InlineScript'
    Inline: |
      # Put your logic here.
      # Put your logic here.
    azurePowerShellVersion: 'LatestVersion'
2. Second, you can use AzCopy to download the state file and then operate on it; the Microsoft-hosted DevOps agents have this tool preinstalled (see the sketch below).
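A minimal sketch of that approach, assuming a secret pipeline variable that holds a SAS URL for the state blob (the variable name and local path are placeholders):

- bash: |
    # $(tfstateSasUrl) is a placeholder secret variable: the blob URL of the state file including a SAS token
    azcopy copy "$(tfstateSasUrl)" "$(System.DefaultWorkingDirectory)/dev.tfstate"
  displayName: 'Download Terraform state with AzCopy'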
Run the command terraform state pull > state.tfstate in the Azure Cloud Shell (you can name the file as you like, e.g. dev.tfstate; the .tfstate extension is what matters).
All you need to do is move to the directory containing your Terraform files and run terraform state pull > dev.tfstate.
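To point the terraform state commands at the right backend in the first place, a partial backend configuration can be passed to terraform init. A sketch under the assumption that the configuration declares an empty backend "azurerm" {} block; the resource names are placeholders, and $(storageAccountKey) is a pipeline variable macro, so from Cloud Shell you would substitute the key directly or rely on your az login:

- bash: |
    cd terraform/dev   # placeholder path to the environment's Terraform configuration
    terraform init \
      -backend-config="resource_group_name=rg-dev" \
      -backend-config="storage_account_name=stdevtfstate" \
      -backend-config="container_name=tfstate" \
      -backend-config="key=dev.tfstate" \
      -backend-config="access_key=$(storageAccountKey)"
    terraform state pull > dev.tfstate
  displayName: 'Pull Terraform state for dev'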

How to use SAS Token in Azure File Copy Pipeline Task?

I am trying to copy ADF ARM templates to a storage account using the Azure File Copy task in my Azure release pipeline. Since the storage account has firewall and networking restrictions set up, I want to use a SAS token to allow the pipeline agent to copy the files to the storage account.
However, I am not able to find any documentation on how to pass the SAS token as an optional argument (or anywhere else).
The task version is 2.*.
How do I use the SAS token for copying the files?
I changed the file copy task version to 4.*, and then I was able to add --sas-token=$(sasToken) into the Optional Arguments and it worked for me.
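In YAML form that might look roughly like the following; the source path, service connection, storage account, and container names are placeholders:

- task: AzureFileCopy@4
  displayName: 'Copy ADF ARM templates to blob storage'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/arm-templates'   # placeholder
    azureSubscription: 'my-arm-service-connection'                  # placeholder
    Destination: AzureBlob
    storage: 'mystorageaccount'                                     # placeholder
    ContainerName: 'adf-arm-templates'
    AdditionalArgumentsForBlobCopy: '--sas-token=$(sasToken)'       # the Optional Arguments field from the classic editor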

ADF Pipeline Errors - RequestContentTooLarge and InvalidContentLink

The ADF pipeline release to the test Data Factory instance is failing with a RequestContentTooLarge error.
So, to overcome the above issue, I modified the pipeline by adding an additional Azure Blob File Copy step that stores the linked templates in a storage account and references them from the pipeline for the deployment. However, with that change I am getting another error: InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?***Sanitized Azure Storage Account Shared Access Signature***'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
I have tried using the SAS token both at the container level and at the storage account level. I have also ensured that the agent and the storage account are under the same VNet, and I have tried removing the firewall restrictions, but it still gives me the same InvalidContentLink error.
The modified pipeline includes the Azure Storage Account copy step.
How do I resolve this issue?
InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?Sanitized Azure Storage Account Shared Access Signature'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
This error can occur because you are trying to link to a template that might not be present in the storage account.
Make sure you provide the correct URL for the nested template and that it is accessible.
Also, if your storage account has firewall rules, you can't link a nested template from it.
Make sure your storage account, container and blob are publicly available. To achieve this:
Provide a blob-level Shared Access Signature URL: select the file, click on "...", and then click Generate SAS (see the CLI sketch below for an equivalent).
Refer to the documentation on nested templates for more understanding.
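A hedged CLI equivalent of the portal's Generate SAS step, producing a blob-level SAS URL that the deployment can use for the linked template; the account, container, key variable, and expiry values are placeholders:

- task: AzureCLI@2
  displayName: 'Generate a blob-level SAS URL for the linked template'
  inputs:
    azureSubscription: 'my-arm-service-connection'   # placeholder
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # Prints the full https URL of the blob, including a read-only SAS token
      az storage blob generate-sas \
        --account-name mystorageaccount \
        --container-name adf-arm-templates \
        --name ArmTemplate_0.json \
        --permissions r \
        --expiry 2030-01-01T00:00Z \
        --account-key "$(storageAccountKey)" \
        --full-uri \
        --output tsv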

az cli command to grant access to a source blob uri

What CLI command can be used to grant access to an Azure image's source blob uri to a specific App Registration's clientId?
Background
An image is created using the CLI. That image includes a source blob uri whose address is given in the portal as:
https://myblobid.blob.core.windows.net/vhds/letters-and-numbers-guid.vhd
Other tools such as ARM templates need to be able to access that same source blob URI, but the problem is that the source blob URI is not accessible either to the calling ARM templates or when pasted in raw form into the Azure portal.
There seems to be a permission issue.
The ARM templates will be run using a specific clientId associated with an App Registration that can be assigned any Role that you tell us it needs to have.
So what CLI command must be typed in order to give the clientId we specify the ability to run ARM template commands that can successfully access and use the given source blob uri?

Azure DevOps Pipeline Azure Blob Storage upload file 403 Forbidden Exception

Summary
I'm creating a CI/CD provisioning pipeline for a new Azure storage account within an Azure DevOps pipeline, and attempting to upload some files into blob storage using AzCopy, run from an Azure PowerShell task in the pipeline.
The Error
The script runs successfully from my local machine but when running in the Azure DevOps pipeline I get the following error (ErrorDateTime is just an obfuscated ISO 8601 formatted datetime):
System.Management.Automation.RemoteException: [ErrorDateTime][ERROR] Error parsing destination location
"https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate
destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[debug]Processed: ##vso[task.logissue type=error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
Error record:
This request is not authorized to perform this operation.
Assumptions
The storage account has been set up to only allow access from specific VNets and IP addresses.
It looks like the firewall or credentials are somehow configured wrongly, but the ServicePrincipal running the script has been used successfully in other sibling pipeline tasks. To understand the problem, I've temporarily given the ServicePrincipal Subscription Owner permissions, and the storage account's Firewall Rules tab has "Allow trusted Microsoft Services to access this storage account" enabled.
What I've tried...
I've successfully run the script from my local machine, with my IP address being in the allowed list.
If I enable "Allow access from All networks" in the storage account firewall rules, then the script runs and the file is uploaded successfully.
It appears as if the Azure Pipelines agents, running in their own VNet, don't have access to my storage account, but I would have thought that requirement would be satisfied by setting "Allow trusted Microsoft Services to access this storage account" in the firewall settings.
I'm using the following line within the Azure PowerShell task. I'm happy with the values, because everything works when "All networks" or my IP address is enabled and I run it locally.
.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
Any thoughts or guidance would be appreciated.
Thanks,
SJB
People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after adding the "Storage Blob Data Contributor" role for the ARM connection's service principal on the storage account itself. The below is the only step needed in a pipeline that deploys a repo as a static website in a blob container:
- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: '$web'
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work
(Of course, if you're using Azure CDN like we are, the next step is to clear the CDN endpoint's cache, but that has nothing to do with the blob storage error)
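For reference, a hedged sketch of the role assignment described above, granting the ARM connection's service principal Storage Blob Data Contributor on the storage account; the connection name, assignee, and scope are placeholders, and the connection used for this step needs permission to create role assignments (e.g. Owner or User Access Administrator on that scope):

- task: AzureCLI@2
  displayName: 'Grant Storage Blob Data Contributor to the deployment principal'
  inputs:
    azureSubscription: 'connection-with-rights-to-assign-roles'   # placeholder
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # One-off setup; this could equally be run once from a local shell or Cloud Shell
      az role assignment create \
        --assignee "<appId-of-the-ARM-connection-service-principal>" \
        --role "Storage Blob Data Contributor" \
        --scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"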
After doing further research I noticed the following raised issue: Azure DevOps isn't considered a trusted Microsoft service from a storage account perspective.
https://github.com/MicrosoftDocs/azure-docs/issues/19456
My temporary workaround is to:
Set the DefaultAction to Allow before the copy, thereby allowing "All networks" access.
Set the DefaultAction back to Deny after the copy action, which ensures my VNet rules are enforced again.
Try
{
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    # Handle errors...
}
Finally
{
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}
Thanks,
SJB
Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script?
see: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops
