How to save file into Azure Storage Account in the release pipeline? - azure

I am trying to publish two files, 1.ext and 2.ext, into a General Purpose v2 storage account that I've just created; I've created a file share inside of it.
Question: How do I save/publish a file into a storage account from an Azure DevOps pipeline? Which task should I use? The Azure copy task seems to have only two types of storage available.

Yes, you can use the Azure file copy task.
Run the pipeline, and the file will be uploaded to the target storage account.
Alternatively, you can also use an Azure PowerShell or Azure CLI task to upload files to the storage account. Here are the tutorials for PowerShell and CLI.
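For example, a minimal sketch of what such a CLI task could run to push the two files into the file share (the account name, share name, and key variable here are placeholders, not values from the question):

# upload the two files into the target file share
az storage file upload --account-name mystorageaccount --account-key "$STORAGE_KEY" --share-name myfileshare --source 1.ext
az storage file upload --account-name mystorageaccount --account-key "$STORAGE_KEY" --share-name myfileshare --source 2.ext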
Update
The Source could be a file or a folder path, so you can:
Filter the target files with a PowerShell task in a previous step and copy them to a temporary folder.
Upload the whole folder.
For example, I uploaded all the project source files by setting the path to $(Build.SourcesDirectory).
All the files were then uploaded to the storage account.

Related

How to download the latest file in Azure storage to a local system using azcopy

I am new to Azure Storage. In my storage account I have a container, and inside the container I have multiple directories and subdirectories.
A subdirectory contains multiple files, and I need to download the latest file from it.
As of now I use the command
azcopy cp "https://storageforecast.blob.core.windows.net/test/pollo/pollo1/pollo2/?si=plus&sv=2019-12-12&srMAVZhkpCwrXs1" "E:\111" --recursive
test - container
pollo - directory
pollo1 - subdirectory 1
pollo2 - subdirectory 2
I have multiple files inside pollo2 and need to download only the latest one. How can I do that? Can someone please help me?
If you aren't explicitly looking for a command-line solution, you can download and install Azure Storage Explorer and connect to your storage accounts. The explorer gives you the option to order blobs by Last Modified date, after which you can simply right-click and download them.
Link to download Azure Storage Explorer: https://azure.microsoft.com/en-us/features/storage-explorer/
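If you do want to script it instead, a rough Azure CLI sketch (assuming a valid SAS token in $SAS and the container/prefix layout described above) could look like this:

# pick the most recently modified blob under the prefix, then download it
latest=$(az storage blob list --account-name storageforecast --container-name test --prefix "pollo/pollo1/pollo2/" --sas-token "$SAS" --query "sort_by(@, &properties.lastModified)[-1].name" -o tsv)
az storage blob download --account-name storageforecast --container-name test --name "$latest" --file "./$(basename "$latest")" --sas-token "$SAS"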

Fetch Azure storage dacpac file in ARM Template

Is there a way to get data from Azure Storage (like a dacpac, zip, etc.) and put it in the drop folder in a CI/CD pipeline?
Hm, for saving files to Azure Storage there is an Azure File Copy task, so for fetching them you probably have to either use PowerShell (the Set-AzStorageBlobContent cmdlet uploads blobs; its counterpart Get-AzStorageBlobContent downloads them) or use the azcopy CLI (you might have to find an image that contains the binary).
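As a rough sketch of the CLI route inside a pipeline script (the account, container, and blob names are placeholders), this would drop the file into the artifact staging folder:

# download the dacpac from blob storage into the pipeline's staging directory
az storage blob download --account-name mystorageaccount --account-key "$STORAGE_KEY" --container-name artifacts --name mydb.dacpac --file "$BUILD_ARTIFACTSTAGINGDIRECTORY/mydb.dacpac"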

Is there a way to create a new blob as a folder using logic apps?

I've set up a logic app to move my new files on my FTP server to my azure storage container, which has blobs for my files. I found a way to create new folders using the storage explorer, but is there a way I can automate this using logic apps? For example, if a new folder is created in my FTP and files are added to it, I want to create a blob folder and move those files into that blob.
First of all, Azure blob storage doesn't support folders. There are only your storage account and a list of containers containing blobs.
What you can do is simulate a directory by using a blob name that contains a slash; e.g., uploading the following file:
/myVirtualFolder/test.txt
will upload the file to your desired container, and tools like Storage Explorer will parse the slashes and display them as a folder.
But if you check the metadata for test.txt, you will see that the actual file name is /myVirtualFolder/test.txt.
So all you have to do is upload all your files from your target directory to the container, adding the virtual directory to each name. You can't, and don't have to, create a folder first.
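For example, a minimal sketch with the Azure CLI (account and container names are placeholders) that produces such a virtual folder:

# the slash in --name is all it takes; Storage Explorer will show myVirtualFolder as a folder
az storage blob upload --account-name mystorageaccount --container-name mycontainer --file test.txt --name myVirtualFolder/test.txt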

AzCopy uploading local files to Azure Storage as files, not Blobs

I'm attempting to upload 550K files from my local hard drive to Azure Blob Storage using the following command (AzCopy 5.1.1) -
AzCopy /Source:d:\processed /Dest:https://ContainerX.file.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
It starts churning right away.
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
Also, is there any way to keep the date created (or similar) values of the old files?
Thanks,
But it's actually creating a new Azure File Storage folder called fec-data/reports rather than creating new blobs in the Azure Blob folder fec-data/reports I've already created.
What am I missing?
The reason you're seeing this behavior is because you're uploading to File storage instead of Blob storage. To upload the files to Blob storage, you need to specify the blob service endpoint (blob.core.windows.net). So your command would be:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Also, is there any way to keep the date created (or similar) values of the old files?
Assuming you want to keep the blob's creation date the same as that of the local file, that is not possible. A blob's Last Modified date/time is a system property that gets assigned when the blob is created and is updated every time the blob is changed. You could, however, make use of the blob's metadata and store the file's creation date/time there.
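As a rough sketch (the account, container, and key are taken from the question; the blob name and timestamp are made up), the metadata could be set with the Azure CLI after the copy:

# record the original creation time as custom metadata on an already-uploaded blob
az storage blob metadata update --account-name ContainerX --account-key SomethingSomething== --container-name fec-data --name "Reports/example.csv" --metadata originalCreated="2017-01-15T10:30:00Z"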
I think you have to get the instance of the blob where you want to deploy the file, like:
AzCopy /Source:d:\processed /Dest:https://ContainerX.blob.core.windows.net/fec-data/Reports/ /DestKey:SomethingSomething== /S
Blob: Upload
Upload single file
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:key /Pattern:"abc.txt"
If the specified destination container does not exist, AzCopy will create it and upload the file into it.
Upload single file to virtual directory
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer/vd /DestKey:key /Pattern:abc.txt
If the specified virtual directory does not exist, AzCopy will upload the file with the virtual directory included in its name (e.g., vd/abc.txt in the example above).
Please refer to the link: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy

Upload multiple files in Azure Blob Storage from Linux

Is there a way to upload multiple files to Azure Blob Storage from a Linux machine, either using the terminal or an application (web based or not)?
Thank you for your interest. There are two options to upload files to Azure Blob Storage from Linux:
Set up and use the XPlat CLI by following the steps below:
Install the OS X Installer from http://azure.microsoft.com/en-us/documentation/articles/xplat-cli/
Open a Terminal window and connect to your Azure subscription by either downloading and using a publish settings file or by logging in to Azure using an organizational account (find instructions here)
Create an environment variable AZURE_STORAGE_CONNECTION_STRING and set its value (you will need your account name and account key): "DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
Upload a file into Azure blob storage by using the following command: azure storage blob upload [file] [container] [blob] (a complete sketch follows below)
Use one of the third-party web-based Azure storage explorers, like CloudPortam: http://www.cloudportam.com/.
You can find the full list of Azure storage explorers here: http://blogs.msdn.com/b/windowsazurestorage/archive/2014/03/11/windows-azure-storage-explorers-2014.aspx.
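Putting option 1 together, a minimal end-to-end sketch (the file, container, and blob names are placeholders):

# point the CLI at your storage account, then upload a single file
export AZURE_STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=enter_your_account;AccountKey=enter_your_key"
azure storage blob upload ./myfile.txt mycontainer myfile.txt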
You can use the find command with the -exec option to run the upload command for each file, as described here:
find *.csv -exec az storage blob upload --file {} --container-name \
CONTAINER_NAME --name {} --connection-string='CONNECTION_STRING' \;
where CONNECTION_STRING is the connection string for your Azure storage account, available from portal.azure.com. This will upload all CSV files in the current directory to the specified container in that account.
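As an alternative to looping with find, newer Azure CLI versions also offer a batch upload command; a sketch with the same placeholder names:

# upload every CSV in the current directory in one call
az storage blob upload-batch --destination CONTAINER_NAME --source . --pattern "*.csv" --connection-string 'CONNECTION_STRING'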
If you prefer the command line and have a recent Python interpreter, the Azure Batch and HPC team has released a code sample with some AzCopy-like functionality in Python called blobxfer. It allows full recursive directory ingress into Azure Storage as well as full container copy back out to local storage. [Full disclosure: I'm a contributor to this code.]
