Copy file to Azure File Share (Azure Storage)

In addition to my question Code sync from Azure Scale Set VM To Azure Storage: is there any way to copy files from one particular Scale Set VM to an Azure File share (Azure Storage) through Azure DevOps pipelines? Since it is a Scale Set, I can't push from one VM every time. For example, if there are 100 VMSS servers in the pool, when I push code through the pipeline it should pick one server from the pool and push the code from that server. Is this possible?

No, there is no built-in task of this kind. There is an Azure File Copy task, which copies application files and other artifacts to Microsoft Azure storage blobs or virtual machines (VMs):
When the target is Azure VMs, the files are first copied to an automatically generated Azure blob container and then downloaded into the VMs. The container is deleted after the files have been successfully copied to the VMs.
As a workaround, use a custom script to copy files to Azure File Storage. In other words, if you can do it locally, you can also do it in an Azure DevOps pipeline.
You may have to build your own extension or use scripts in the pipeline. Take a look at this similar third-party task: AzureDevOpsReleaseToFileStorage. A sketch of such a script follows.
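As an illustration of the script workaround, here is a minimal PowerShell sketch that could run as an inline pipeline step on whichever VMSS agent the pool hands the job to. It assumes azcopy (v10) is on the agent's PATH; the storage account, share name, paths, and the FILESHARE_SAS variable are all placeholders:

```powershell
# Minimal sketch: push build output from the current agent VM to an Azure
# file share. Account, share, paths, and the SAS variable are placeholders.
$sourceDir = "C:\agent\_work\1\a"                                       # build output on this VM
$shareUrl  = "https://mystorageacct.file.core.windows.net/myshare/app"  # hypothetical file share
$sas       = $env:FILESHARE_SAS                                         # SAS token from a secret pipeline variable

# azcopy v10 can write directly to Azure Files when the destination is a
# file-share URL with a SAS token appended.
azcopy copy "$sourceDir\*" ($shareUrl + "?" + $sas) --recursive=true
if ($LASTEXITCODE -ne 0) { throw "azcopy exited with code $LASTEXITCODE" }
```

Because the pipeline schedules the job onto a single agent from the pool, the copy runs from exactly one VMSS instance, which matches the scenario in the question.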

Related

Copy files to Azure VMSS using GitHub Actions

Is there any way I can copy a set of files from my GitHub repo/Azure Blob Storage to an Azure VMSS (Windows Server) using GitHub Actions and then restart all the instances? Should I use a custom script extension to copy the files? Are there any other methods? Please note: autoscaling is enabled on my VMSS.
It seems you want to copy files from your GitHub repo/Azure Blob Storage to an Azure VMSS, and the VMSS is configured with autoscaling, which means you want the copied files on all instances, including those created by autoscaling.
If so, manually copying files is not the right approach. All instances of a VMSS are created from the configuration you set at creation time, so if you only copy files to the existing instances, new instances created by autoscaling will not have them. There are two options.
One is to create a custom VM image that already contains the files, and then use that image to create your VMSS.
The other is to use an Azure file share: mount the file share on the VMSS, and all files in the share will be available on every instance. Here is an example of the mount.
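As a sketch of the mount on a Windows instance (the account name, share name, and the STORAGE_KEY variable are placeholders), something like the following could run on each instance, e.g. via a Custom Script Extension so that new autoscaled instances pick it up too:

```powershell
# Minimal sketch: mount an Azure file share on a Windows instance.
# Account, share, and the key variable are placeholders.
$acct  = "mystorageacct"
$share = "myshare"
$key   = $env:STORAGE_KEY          # storage account key, supplied securely

# Persist the credential so the mount survives reboots, then map the share.
cmdkey /add:"$acct.file.core.windows.net" /user:"AZURE\$acct" /pass:$key
New-PSDrive -Name Z -PSProvider FileSystem `
    -Root "\\$acct.file.core.windows.net\$share" -Persist
```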

Azure Function, Blob Trigger : Copy data file from blob to a server

My requirement is to create a blob-triggered Azure Function that fires when a specific file (say, trigger.txt) is copied into a container. Once it is triggered, the PowerShell function should copy this trigger.txt onto an Azure Windows VM in the same subscription and resource group.
I can see that the function is triggered if an example trigger.txt file is present.
What do I need to do to copy this blob in the container to the azure VM? I see that azcopy does not work.
You could consider one of the following approaches.
Using Azure File Storage:
Mount Azure File Storage onto the Windows VM. The Azure Function would then create a new file in File Storage using the content of the input blob. Since there is no binding support for File Storage, you will have to use the File Storage SDK directly (a sketch follows).
You could also consider using Logic Apps, which has connectors for both Blob Storage and File Storage, but do note that there are file-size restrictions you may run into depending on your use case.
And finally, you might want to consider using blob events to reduce polling costs for both approaches.
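A minimal sketch of this approach for a PowerShell function, assuming the Az.Storage module is available to the Function App; the account, key, and share names are placeholders:

```powershell
# Minimal sketch of a blob-triggered PowerShell function body (run.ps1).
# Account, key, and share names are placeholders.
param([byte[]] $InputBlob, $TriggerMetadata)

# Stage the triggering blob's bytes in a temp file, then upload that file
# to the file share that the target VM has mounted.
$tmp = Join-Path $env:TEMP $TriggerMetadata.Name
[System.IO.File]::WriteAllBytes($tmp, $InputBlob)

$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" `
                            -StorageAccountKey $env:STORAGE_KEY
Set-AzStorageFileContent -Context $ctx -ShareName "myshare" `
                         -Source $tmp -Path $TriggerMetadata.Name -Force
```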
Alternatively, use a PowerShell function to remote into the VM, as suggested by @silent in the comments, and run azcopy there to download the file (sketched below).
There is an official doc showing how to run commands over remoting that you can refer to. The doc covers remoting to an on-premises machine via a hybrid connection, which you can ignore for your use case.
Also, if your VM doesn't have a public endpoint, you will have to use Premium Functions, which support VNet integration.
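A sketch of the remoting approach, assuming PowerShell Remoting is enabled on the VM and azcopy is installed there; the host name, credentials, SAS token, and paths are placeholders:

```powershell
# Minimal sketch: remote into the VM and pull the blob down with azcopy.
# Host name, credentials, SAS token, and paths are placeholders.
$cred = Get-Credential              # in a real function, pull this from Key Vault
Invoke-Command -ComputerName "myvm.contoso.com" -Credential $cred -ScriptBlock {
    # Runs on the VM: download the blob via a SAS URL.
    azcopy copy "https://mystorageacct.blob.core.windows.net/incoming/trigger.txt?<SAS>" `
                "C:\incoming\trigger.txt"
}
```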

Using AzCopy in an Azure virtual machine

I have an Azure virtual machine with some application-specific CSV files (retrieved via FTP from on-premises) that need to be stored in a blob (and will eventually be read and pushed into an Azure SQL DB via a worker role). The question is about pushing the files from the VM to the blob. Is it possible to get AzCopy without installing the SDK so the files can be copied to the blob? Is there a better solution? Please read the points below for further information.
Points to note:
1) Though the files could be uploaded directly to a blob rather than being pulled into the VM first and copied from there, for security reasons the files have to be pulled into the VM, and this cannot be changed.
2) I also thought about a worker role talking to a VM folder share (via a common virtual network) to pull the files and upload them to the blob, but this does not appear to be the right solution after reading some blogs, as it requires changes to both VMs (the worker role VM and the IaaS VM).
3) Azure File Service is still in preview (?) and hence cannot be used.
Is it possible to get AzCopy without installing the SDK to have the files copied to the blob?
Absolutely yes. You can download the AzCopy binaries directly, without installing the SDK, using the following links:
Version 3.1.0: http://aka.ms/downloadazcopy
Version 4.1.0: http://aka.ms/downloadazcopypr
Source: http://blogs.msdn.com/b/windowsazurestorage/archive/2015/01/13/azcopy-introducing-synchronous-copy-and-customized-content-type.aspx
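For reference, here is a minimal sketch of the classic (pre-v10) command syntax those AzCopy versions use; the install path, account, key, and paths are placeholders:

```powershell
# Minimal sketch: push the VM's CSV files to a blob container using the
# classic AzCopy syntax. Install path, account, key, and paths are placeholders.
& "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" `
    /Source:"C:\data\csv" `
    /Dest:"https://mystorageacct.blob.core.windows.net/csvfiles" `
    /DestKey:$env:STORAGE_KEY `
    /Pattern:"*.csv" /S
```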

Copy file from remote server using SFTP straight to Azure blob storage

I've got a remote server (outside of Azure) with a bunch of static files. I've created a worker role in Azure and want to use it to transfer these files via SFTP from the remote server straight to my blob storage account (without copying them locally to the worker role). Is there an established workflow or best practice for doing this?
The closest thing I was able to find was this question:
Copy file from URL to Azure BLOB
However, to use StartCopyFromBlob I would need a publicly accessible URL, which is not the case.
Also, some of these files may be >100 MB or >500 MB; should that raise any problems?
You might want to tackle this the other way around, by setting up an FTP server on Azure that saves the files directly to Azure storage.
A good explanation of how to do this using the preview Azure Files feature can be found at http://fabriccontroller.net/blog/posts/deploying-a-load-balanced-high-available-ftp-server-with-azure-files/
The simplest approach would be to install the Azure CLI directly on the remote server. You can then use the Azure CLI to transfer these files directly to Azure Blob Storage. No need for SFTP (Azure Blob Storage doesn't expose an SFTP interface) or worker roles (your remote server isn't serving files over a web interface).
If this is not an option, the other approach would be to do what @Mark Volders suggested and provision an SFTP server on Azure. You can then push files from the remote server to the SFTP server, and the SFTP server would push each file to Azure Blob Storage using the Azure CLI, deleting the local copy on success.
For the SFTP server, one hurdle is making sure files are copied to Azure Blob Storage as soon as the SFTP client is done transferring a file. A common approach is to use Incron, which is a service that listens for file events (in this case, the IN_CLOSE_WRITE event). There's a product SFTP Gateway on the Azure Marketplace that does all this (disclosure: I'm one of the developers for this product) so you don't need to spend time implementing this from scratch.
Also, file sizes of >500 MB should not be a problem for either the Azure CLI or SFTP. A sketch of the CLI upload is below.
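As a sketch of the Azure CLI approach from the remote server (the account, container, file names, and the SAS variable are placeholders):

```powershell
# Minimal sketch: upload one file straight to Blob Storage with the Azure CLI.
# Account, container, names, and the SAS variable are placeholders.
az storage blob upload `
    --account-name mystorageacct `
    --container-name staticfiles `
    --name "site/index.html" `
    --file "./static/index.html" `
    --sas-token $env:BLOB_SAS
```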
You can also run an SFTP server in a Docker image hosted on ACI, which pushes the files to the Azure Blob Storage container and deletes them from its local storage once each upload succeeds.
It's also possible to mount Azure Blob Storage using blobfuse, but that requires extra privileges for the Docker image, which ACI doesn't provide.
I've written an article that tackles the same problem here: https://aws.plainenglish.io/sftp-as-caas-on-cloud-1171080aa5df

How to achieve incremental deployment of blob storage files to different environments of Windows Azure storage?

We are new to Windows Azure and are developing a web application. At the beginning of the project, we deployed the complete code to different environments, which published the complete code and uploaded blob objects to Azure storage, as we linked Sitefinity to hold blob objects in Azure storage. Now that we are in the middle of development, we only need to upload newly created blob files, which can be quite few in number (1, 2, or maybe a handful). I would like to know the best process to sync these blob files to the different Azure storage environments, one per cloud service. Ideally we would like to update the staging cloud service and staging storage first, test there, and once no bugs are found, update the UAT and production storages with the changed or new blob objects.
Please help.
You can use Azure Storage Explorer to manually upload/download blobs from storage accounts very easily. For one or two blobs this is an easy solution; otherwise you will need to write a tool that connects to blob storage via an API and does the copying for you, as sketched below.
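For the tool route, here is a minimal sketch using the Az.Storage PowerShell module to do a server-side copy of recently changed blobs from a staging account to a UAT account; the account names, keys, container name, and the time window are all placeholders:

```powershell
# Minimal sketch: incremental, server-side blob copy between environments.
# Account names, keys, container, and the time window are placeholders.
$src = New-AzStorageContext -StorageAccountName "stagingacct" -StorageAccountKey $env:SRC_KEY
$dst = New-AzStorageContext -StorageAccountName "uatacct"     -StorageAccountKey $env:DST_KEY

$since = (Get-Date).AddDays(-1)    # only blobs changed since the last sync
Get-AzStorageBlob -Context $src -Container "sitefinity" |
    Where-Object { $_.LastModified -gt $since } |
    ForEach-Object {
        Start-AzStorageBlobCopy -Context $src -SrcContainer "sitefinity" `
            -SrcBlob $_.Name -DestContext $dst -DestContainer "sitefinity" `
            -DestBlob $_.Name -Force
    }
```

Start-AzStorageBlobCopy performs the copy server-side, so the blobs never transit the machine running the script; the same pattern can then be repeated from UAT to production.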
