Upload to blob storage from TFS 2017 - Azure

We are trying to upload build artifacts to blob storage from a TFS build server. The AzCopy task needs Azure subscription details, which are not available to us. We need to upload the artifacts to Azure Blob Storage using only a storage connection string. Is there a way to upload files to blob storage using the connection string alone?

Anything you can do from PowerShell you can do from Build and Release. There is a task named "PowerShell" and one named "Azure PowerShell". If you don't have the Azure subscription details, I doubt you will be able to use the "Azure PowerShell" task. However, if you have a PowerShell script that works when you run it locally, you might be able to simply run it as part of your build with the "PowerShell" task.
Option two is to have someone who knows the details create an Azure service endpoint for you. They never have to share the details with you to make the connection. Once the connection is created, you can use it without ever having to know the details.
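For illustration, here is a minimal sketch of such a script, assuming the classic Azure.Storage PowerShell module is available on the build agent; the container name and the secret variable holding the connection string are placeholders:

    # Build a storage context from the connection string alone - no subscription needed.
    # STORAGE_CONNECTION_STRING is assumed to be a secret build variable.
    $ctx = New-AzureStorageContext -ConnectionString $env:STORAGE_CONNECTION_STRING

    # Upload every file from the artifact staging directory to an "artifacts" container.
    Get-ChildItem "$env:BUILD_ARTIFACTSTAGINGDIRECTORY" -File -Recurse | ForEach-Object {
        Set-AzureStorageBlobContent -File $_.FullName `
                                    -Container "artifacts" `
                                    -Blob $_.Name `
                                    -Context $ctx `
                                    -Force
    }

Running this in a plain "PowerShell" task sidesteps the service connection entirely, since the context is derived from the connection string rather than from subscription credentials.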

Related

Can't upload files to Azure Storage using SSIS

I want to upload image files from one of my local drives to my Azure storage container. I'm using the Azure Upload Task in SSIS.
The task connects to the Azure storage account just fine, and I'm targeting a specific container and placing the images into a directory. However, when I execute the task, it gives me the following error:
Error: 'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Unable to create Azure Blob container.
Error: Upload task has stopped with exception: Unable to create Azure Blob container.
Can anyone give me some help with this kind of problem? Thank you.
I had the same problem.
If the error message contains anything like this:
The TLS version of the connection is not permitted on this storage account
the solution is to downgrade the minimum TLS version on the Azure Storage account in the cloud, or to raise the TLS version used by your on-premises machine. You can find this setting under Configuration on the Azure Storage account.
The default minimum TLS version on Azure Storage is 1.2.
If you downgrade (not recommended), you are accepting a degree of security risk.
Microsoft TLS documentation
After these changes I can upload files using the Access Key method or a SAS (shared access signature).
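If you need to inspect or change that setting from a script rather than the portal, here is a small sketch using the newer Az.Storage PowerShell module; the resource group and account names are placeholders:

    # Check the current minimum TLS version on the storage account.
    (Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mystorageacct").MinimumTlsVersion

    # Downgrade only if you accept the security risk (allowed values: TLS1_0, TLS1_1, TLS1_2).
    Set-AzStorageAccount -ResourceGroupName "my-rg" `
                         -Name "mystorageacct" `
                         -MinimumTlsVersion TLS1_0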

Is it possible to restore a .bak backup to a SQL Server resource in Microsoft Azure by means of an Azure DevOps pipeline?

The .bak backup is hosted on a storage account. I need to perform an automatic restore to a Microsoft Azure SQL Server resource through an Azure DevOps pipeline.
This is where I need to add the task for the automatic restore.
What task should I be using to do the automatic restore?
Or, if you know of another way to do the restoration, that works for me too.
To import a BACPAC file you may use the SqlAzureDacpacDeployment@1 task, and if you want to restore a database hosted on a VM you may use SqlPackage, which is installed on Windows-based Microsoft-hosted agents. Please check this link too. To find SqlPackage you may use this script. You may also consider using the FAKE tool and its SqlPackage wrapper. Here is my blog post about this.
If you have your DACPAC file on a storage account, you have to download it first; for this you may use the Azure CLI task and az storage file copy.
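For illustration, a sketch of an inline PowerShell step that downloads the BACPAC and imports it with SqlPackage; this uses the classic storage cmdlets instead of the Azure CLI mentioned above, and the SqlPackage path, server, database, and variable names are all placeholders:

    $bacpac = "$env:AGENT_TEMPDIRECTORY\db.bacpac"

    # Download the .bacpac from the storage account (connection string in a secret variable).
    $ctx = New-AzureStorageContext -ConnectionString $env:STORAGE_CONNECTION_STRING
    Get-AzureStorageBlobContent -Container "backups" -Blob "db.bacpac" `
                                -Destination $bacpac -Context $ctx

    # Import it into Azure SQL; the DAC folder version varies by agent image.
    & "C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" `
        /Action:Import /SourceFile:$bacpac `
        /TargetServerName:"myserver.database.windows.net" `
        /TargetDatabaseName:"mydb" `
        /TargetUser:$env:SQL_USER /TargetPassword:$env:SQL_PASSWORD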

Deploy Azure Data Factory v2 app

I'm trying to find a way to publish my console app (.NET) written for Azure Data Factory v2, but I could not find any solution.
More details would be really appreciated, but if you mean that you are using the .NET SDK to create ADF V2 objects, my understanding is that there is no such thing as "publish", in contrast to the new user interface in the portal where you create/edit the objects first and then click Publish.
If you use the SDK, the objects are uploaded to ADF V2 automatically, and you can easily verify that now with the new UI.
It would be useful to have a bit more info on your context. Are you talking about running a custom activity from an Azure Batch account? What have you tried already?
When running a custom activity, you'll have to upload your executable plus its dependencies to an Azure storage account. Create a blob container and copy the files there. Then you'll have to configure the activity to use this storage account and point it to the right container.
If you're asking for something like a right-click -> Deploy option, it doesn't exist. I've automated my deployments using PowerShell, with Set-AzureStorageBlobContent writing the files to the storage account.
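As an illustration of that PowerShell approach, here is a minimal sketch using the classic Azure.Storage cmdlets; the storage account, container name, and build output path are placeholders:

    # Context from the account name and key (key assumed to live in a secret variable).
    $ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" `
                                   -StorageAccountKey $env:STORAGE_KEY

    # Make sure the container for the custom activity exists.
    New-AzureStorageContainer -Name "customactivity" -Context $ctx -ErrorAction SilentlyContinue

    # Push the executable and all of its dependencies from the build output folder.
    Get-ChildItem ".\bin\Release" -File | ForEach-Object {
        Set-AzureStorageBlobContent -File $_.FullName `
                                    -Container "customactivity" `
                                    -Blob $_.Name `
                                    -Context $ctx -Force
    }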

Copy Azure blob to local machine as soon as blob is created

I'm trying to create a Windows service that will detect when new blobs are uploaded to a certain container on Azure and download them to the local machine immediately. I know I can have a blob trigger running locally, but there doesn't seem to be any way to put this into a service. Does anyone have any ideas?
You should be able to do this by using the standard WebJobs SDK with a blob trigger, but running it as a service instead of a console app.
You can find more information about using the blob trigger with the SDK directly here: https://github.com/Azure/azure-webjobs-sdk/wiki/Blobs
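For the "run as a service instead of a console app" part, one hedged sketch: assuming the compiled WebJobs host implements the Windows service contract (for example via a hosting library such as Topshelf, since a bare console exe cannot respond to the service control manager), you could register and start it with PowerShell; the names and paths below are placeholders:

    # Register the WebJobs host exe as a Windows service and start it.
    # The exe must implement the service contract (e.g. via a Topshelf-style host).
    New-Service -Name "BlobDownloader" `
                -BinaryPathName "C:\Services\BlobDownloader\BlobDownloader.exe" `
                -DisplayName "Blob Downloader" `
                -StartupType Automatic
    Start-Service -Name "BlobDownloader"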

TeamCity and Windows Azure Blob Storage

I have set up a TeamCity server on my machine and the build is running quite well; for your information, I am building a Unity application with Plastic SCM as the VCS.
I would like to know if it's possible to easily send the result of the build to Windows Azure Blob Storage?
The solution I can think of, mentioned in the comments above: use the TeamCity plugin, then use PowerShell to back up the VM to Blob Storage.
This blog post by the Scripting Guy explains the PowerShell process: http://blogs.technet.com/b/heyscriptingguy/archive/2014/01/24/create-backups-of-virtual-machines-in-windows-azure-by-using-powershell.aspx. In a nutshell:
Use the Get-AzureOSDisk and Get-AzureDataDisk cmdlets to get your disks, then create a container using New-AzureStorageContainer.
Start-AzureStorageBlobCopy then backs up the VHD to blob storage in your new container.
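A condensed sketch of that process using the classic ASM cmdlets; the cloud service, VM, and storage account names are placeholders, and note the blog shuts the VM down before copying:

    # Find the VM and its OS disk (Get-AzureDataDisk works the same way).
    $vm = Get-AzureVM -ServiceName "my-cloud-service" -Name "my-vm"
    $osDisk = $vm | Get-AzureOSDisk

    # Create the destination container.
    $ctx = New-AzureStorageContext -StorageAccountName "mystorageacct" `
                                   -StorageAccountKey $env:STORAGE_KEY
    New-AzureStorageContainer -Name "backups" -Context $ctx

    # Copy the VHD into the new container as the backup.
    Start-AzureStorageBlobCopy -AbsoluteUri $osDisk.MediaLink.AbsoluteUri `
                               -DestContainer "backups" `
                               -DestBlob "my-vm-osdisk.vhd" `
                               -DestContext $ctx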
