Azure DevOps: Restore SQL Database as part of my release - azure

In my release pipeline to my Integration Environment I want to restore the production database prior to running the migration script.
Both my databases are hosted in Azure, so I thought I could use the Azure SQL Database Deployment task that is already integrated into Azure DevOps. I created two separate tasks: first an export to a .bacpac file, and then an import of that .bacpac file. Currently I am running into the following issue:
...
2018-11-22T09:02:28.9173416Z Processing Table '[dbo].[EmergencyContacts]'.
2018-11-22T09:02:31.6364073Z Successfully exported database and saved it to file 'D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac'.
2018-11-22T09:02:31.6726957Z Generated file D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac. Uploading file to the logs.
2018-11-22T09:02:31.6798180Z ##[error]Unable to process command '##vso[task.uploadfile] D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac' successfully. Please reference documentation (http://go.microsoft.com/fwlink/?LinkId=817296)
2018-11-22T09:02:31.6812314Z ##[error]Cannot upload task attachment file, attachment file location is not specified or attachment file not exist on disk
2018-11-22T09:02:31.7016975Z Setting output variable 'SqlDeploymentOutputFile' to 'D:\a\r1\a\GeneratedOutputFiles\DatabaseName.bacpac'
2018-11-22T09:02:31.7479327Z ##[section]Finishing: Azure SQL Export
Any ideas how I could solve this?
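For reference, the same export/import round trip can also be scripted server-side with the Az.Sql cmdlets from an Azure PowerShell task, which avoids the log-upload step entirely. A rough sketch, where the resource group, server, storage, and credential names are all placeholders and the secrets are assumed to come from pipeline variables:

# Export the production database to a .bacpac in blob storage, then import it
# into the integration server. All names below are placeholders.
$storageUri = "https://mystorage.blob.core.windows.net/backups/DatabaseName.bacpac"
$storageKey = $env:STORAGE_KEY
$adminUser  = $env:SQL_ADMIN_USER
$adminPass  = ConvertTo-SecureString $env:SQL_ADMIN_PASSWORD -AsPlainText -Force

$export = New-AzSqlDatabaseExport -ResourceGroupName "prod-rg" -ServerName "prod-sql" `
    -DatabaseName "DatabaseName" -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri $storageUri -AdministratorLogin $adminUser -AdministratorLoginPassword $adminPass

# Wait for the server-side export to finish before starting the import.
while ((Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink).Status -eq "InProgress") {
    Start-Sleep -Seconds 10
}

New-AzSqlDatabaseImport -ResourceGroupName "int-rg" -ServerName "int-sql" `
    -DatabaseName "DatabaseName" -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri $storageUri -AdministratorLogin $adminUser -AdministratorLoginPassword $adminPass `
    -Edition "Standard" -ServiceObjectiveName "S2" -DatabaseMaxSizeBytes 268435456000

Because the .bacpac stays in blob storage, nothing needs to be written to or uploaded from the agent's working directory.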

Related

Trying to copy a folder from a release pipeline to Azure Blob ended with no content copied, but the task completed successfully

Trying to copy a JMeter report to an Azure Blob container.
Using an Ubuntu agent in the release pipeline; as the Azure blob file copy task is only supported on Windows agents, I've tried to find another way to do the copy on a Linux agent.
Running the following command in an Azure CLI task:
azcopy copy '$(System.DefaultWorkingDirectory)/Report' 'https://account.blob.core.windows.net/container?sp=racwdl&st=2021-04-22T12:12:49Z&se=2022-06-01T20:12:49Z&spr=https&sv=2020-02-10&sr=c&sig=xxxx' --recursive=true
I want to copy all the content of the 'Report' folder to the blob container recursively.
The task finishes successfully, but no files at all are copied.
I am using a SAS for the container.
The task logs from the run are attached.
Any ideas?
Make sure the files were not put into the wrong path for the Report folder. You can check the value of the $(System.DefaultWorkingDirectory) variable under the pipeline, or print it out from a task.
Across all the steps, you should make sure the files end up in the directory you expect.
Besides, you can set the variable system.debug to true to get a detailed log for the task.
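A quick way to do both checks is a short PowerShell Core step before the Azure CLI task; a minimal sketch using the paths from the question:

# Print the working directory the agent actually uses and list what sits
# under Report before azcopy runs.
Write-Host "Default working directory: $env:SYSTEM_DEFAULTWORKINGDIRECTORY"
Get-ChildItem -Path "$env:SYSTEM_DEFAULTWORKINGDIRECTORY/Report" -Recurse |
    Select-Object FullName, Length

If the listing comes back empty, the Report folder was produced in a different directory (or a different job) than the one azcopy is pointed at.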

Unable to copy files from an Azure file share to a local disk in an Azure VM

I have an Azure VM where I have mounted an Azure file share as a drive (V:). I am trying to create an Azure pipeline with a copy task that copies files from the mounted Azure file share (V:) to a local disk (D:).
In the release pipeline I chose the Copy task and gave the source as the Azure file share path (\\filesharepath\foldername) and the destination as (D:\Foldername), but when I run the pipeline I get the error:
Unhandled: Not found sourcefolder: (\\filesharepath\foldername)
I also tried another way: I created a PowerShell script task with a PowerShell script (inside the Azure VM) that uses the Copy-Item command to do the copy, but while running the pipeline I got the error:
Copy-Item: cannot find drive. The drive does not exists.
When I run the PowerShell script directly in the Azure VM, the file copy works; the issue occurs only when I run the scripts through the pipeline. How can I overcome these issues?
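For what it's worth, drive letters mapped in an interactive session are not visible to the service session the pipeline agent runs under, which is consistent with both errors above. A minimal sketch of a script step that maps the share itself before copying, assuming the agent runs inside that VM and the storage account name, share name, and key are supplied as pipeline variables (all names below are placeholders):

# Map the Azure file share inside this step, since the interactive V: mapping
# is not available to the agent's session, then copy to the local disk.
$account   = "mystorageaccount"
$shareUnc  = "\\$account.file.core.windows.net\myshare"
$secureKey = ConvertTo-SecureString $env:STORAGE_KEY -AsPlainText -Force
$cred      = New-Object System.Management.Automation.PSCredential -ArgumentList "AZURE\$account", $secureKey

New-PSDrive -Name V -PSProvider FileSystem -Root $shareUnc -Credential $cred | Out-Null
Copy-Item -Path "V:\foldername\*" -Destination "D:\Foldername" -Recurse -Force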

Read a resource file in an application that has been uploaded to Azure Batch

I am using the Azure Batch service and I am able to add applications, pools, and tasks to it. I have attached a resource file to each task. I am confused about how to access the resource file, or the inputs provided to the task, from the application that I have uploaded to Azure Batch. Please help me with this.
If you have attached the ResourceFile(s) to the individual tasks, then the file(s) are downloaded to the filePath you specified (if any), appended to the task's AZ_BATCH_TASK_WORKING_DIR environment variable (https://learn.microsoft.com/en-us/azure/batch/batch-compute-node-environment-variables).
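For example, a task command line or start-up script can resolve an attached file relative to that variable; a minimal sketch, where input.txt is a placeholder for whatever filePath you set on the ResourceFile:

# Resource files are downloaded into the task working directory before the
# task command line runs, so resolve the path from the environment variable.
$workingDir = $env:AZ_BATCH_TASK_WORKING_DIR
$inputPath  = Join-Path $workingDir "input.txt"   # placeholder file name
Get-Content -Path $inputPath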
For an ad-hoc experience you can download https://azure.github.io/BatchExplorer/ and navigate to an existing node and/or task to see a file explorer showing how the directories on the node are laid out.

How to save file into Azure Storage Account in the release pipeline?

I am trying to publish two files, 1.ext and 2.ext, into a general-purpose v2 storage account that I've just created; I've created a file share inside of it.
Question: How do I save/publish a file into the storage account from an Azure DevOps pipeline? Which task should I use? Azure copy seems to have only two types of storage available:
Yes, you can use the Azure file copy task.
Run the pipeline, and the file will be uploaded to the target storage account:
Alternatively, you can also use an Azure PowerShell or Azure CLI task to upload the file to the storage account. Here are the tutorials for PowerShell and CLI.
Update
The Source could be a file or a folder path, so you can:
Filter the target files with a PowerShell task in a previous step and copy them to a temporary folder.
Upload the whole folder.
For example, I just uploaded the whole project's source files by setting the path to $(Build.SourcesDirectory).
All the files were then uploaded to the storage account.
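As a sketch of the PowerShell route mentioned above, uploading the two files from the question to the file share with the Az.Storage cmdlets (the account name, key, and share name are placeholders):

# Upload 1.ext and 2.ext to the file share in the storage account.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" `
                            -StorageAccountKey $env:STORAGE_KEY

foreach ($file in "1.ext", "2.ext") {
    Set-AzStorageFileContent -ShareName "myfileshare" `
                             -Source "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\$file" `
                             -Path $file `
                             -Context $ctx -Force
}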

Copy On-Prem Flat file to Azure Blob using Azure Data Factory V2

I am trying to load a flat file to blob storage using ADF V2. I have installed the self-hosted integration runtime for this. The integration runtime on the local machine shows that it is successfully connected to the cloud service, as in the screenshot below. However, while creating the linked service to the on-prem file, some credentials are required. I am not sure what username or password should be fed in. I have tried both on-prem and Azure passwords (wanted to try). Please see the screenshot.
Could you please guide me on how the connection can be made to a local flat file in my case?
Thanks
- Akshay
Note: You can choose a file while creating the File System as a source in ADF.
You may follow these steps to select the text file while creating a file system as the source:
First create a linked service as follows:
Host: C:\AzureLearn\
Create a copy activity and select Source as follows:
Click on Source => New
Select New DataSet => Select File => File System and continue
Select Format => choose DelimitedText and continue.
Select the previously created file system linked service and click on Browse.
Choose a file or folder.
Here you can find the file located under the folder that was selected previously while creating the file system linked service.
Hope this helps.
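If you prefer to script that linked service instead of clicking through the UI, it can be defined as JSON and pushed with the Az.DataFactory cmdlets. A rough sketch, where the factory, resource group, integration runtime name, and credentials are placeholders and the password should really come from Key Vault; the userId/password are Windows credentials of an account that can read the host folder on the machine running the self-hosted integration runtime:

# Placeholder definition of a File System (FileServer) linked service that
# goes through the self-hosted integration runtime.
$definition = @'
{
    "name": "OnPremFileLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "C:\\AzureLearn",
            "userId": "MYDOMAIN\\svc-adf-reader",
            "password": { "type": "SecureString", "value": "<password>" }
        },
        "connectVia": {
            "referenceName": "SelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
'@

Set-Content -Path ".\OnPremFileLinkedService.json" -Value $definition
Set-AzDataFactoryV2LinkedService -ResourceGroupName "my-rg" `
                                 -DataFactoryName "my-adf" `
                                 -Name "OnPremFileLinkedService" `
                                 -DefinitionFile ".\OnPremFileLinkedService.json"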
The linked service connection to blob storage or Azure SQL Server is being blocked by my organisation's firewall. It won't let the self-hosted integration runtime on my system connect my resources to the public cloud.
I followed the same steps on my personal machine and everything went smoothly. I will get the firewall restrictions sorted and update here with more information.
