How to zip my file in Azure File Share using FileShareClient?

I need to zip a file in my Azure file share. I have gone through a few approaches, but they all suggest methods for blobs. Any link or advice that would help me proceed?
I can't use Azure Data Factory because of cost, and I have already gone through these links: link1 and link2. In those links they use the BlockBlobClient.DownloadTo method, which does not exist on FileShareClient.

Azure Data Factory would be the better fit here, and it supports Azure Blob Storage (see the links below), but since you have ruled it out on cost, there is a tutorial that can help you with zipping files and storing them in Azure Storage.
Tutorial: https://josef.codes/azure-storage-zip-multiple-files-using-azure-functions/
Supported formats: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview#supported-data-stores-and-formats
Using Data Factory: https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal
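The tutorial above works against blobs, but the same download-zip-upload pattern can be applied to a file share. Here is a minimal sketch, assuming the Azure.Storage.Files.Shares v12 SDK; the connection string, share name and file names are placeholders. ShareFileClient has no DownloadTo, but Download() exposes the content as a stream, which is all the zip step needs.

```csharp
// Minimal sketch, assuming Azure.Storage.Files.Shares v12.
// "<connection-string>", "myshare" and "input.txt" are placeholders.
using System.IO;
using System.IO.Compression;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;

class ZipShareFile
{
    static void Main()
    {
        var share = new ShareClient("<connection-string>", "myshare");
        ShareDirectoryClient root = share.GetRootDirectoryClient();

        // No DownloadTo here, but Download() returns the content as a stream.
        ShareFileClient source = root.GetFileClient("input.txt");
        using ShareFileDownloadInfo download = source.Download();

        // Build the zip in memory (fine for small files; spill to a temp file for large ones).
        using var zipStream = new MemoryStream();
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
        {
            using Stream entry = archive.CreateEntry("input.txt").Open();
            download.Content.CopyTo(entry);
        }
        zipStream.Position = 0;

        // Upload the archive back to the same share.
        ShareFileClient target = root.GetFileClient("input.zip");
        target.Create(zipStream.Length);
        target.Upload(zipStream);
    }
}
```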

Related

Azure Storage: How to upload a .pdf or .docx file using the REST API

Recently I have been working on adding documents to Azure Storage using both Blob Storage and File Share. I realized that with the File Share REST API the upload takes two steps:
Creating the file
Adding the content
I am able to do that, but my requirement is to upload a .pdf or .docx document in one step, and there should also be a way to download the documents.
Could someone please help?
Thanks
Unfortunately, there's no batch download capability available in Azure Blob Storage. You will need to download each blob individually. What you could do is download blobs in parallel to speed things up.
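If you go the parallel route, here is a minimal sketch assuming the Azure.Storage.Blobs v12 SDK; the connection string, container name and download folder are placeholders, and blob names are assumed to contain no virtual folders.

```csharp
// Minimal sketch, assuming Azure.Storage.Blobs v12. Placeholders throughout;
// blob names are assumed to be flat (no "/" virtual folders).
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class ParallelDownload
{
    static async Task Main()
    {
        var container = new BlobContainerClient("<connection-string>", "documents");
        Directory.CreateDirectory(@"C:\downloads");

        // Each blob is still downloaded individually; they just run concurrently.
        var tasks = new List<Task>();
        await foreach (var item in container.GetBlobsAsync())
            tasks.Add(container.GetBlobClient(item.Name)
                               .DownloadToAsync(Path.Combine(@"C:\downloads", item.Name)));

        await Task.WhenAll(tasks);
    }
}
```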
There is an alternative way you can approach this using C# or PowerShell.
I would recommend going through this MS document:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-scalable-app-download-files?tabs=dotnet
And this one as well:
https://azurelessons.com/upload-and-download-file-in-azure-blob-storage/
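For the upload half of your question: with Blob Storage, the two-step create-then-add-content flow of the File Share REST API collapses into a single call. A minimal sketch, again assuming the Azure.Storage.Blobs v12 SDK with placeholder names:

```csharp
// Minimal sketch, assuming Azure.Storage.Blobs v12; names and paths are placeholders.
using Azure.Storage.Blobs;

class UploadDocument
{
    static void Main()
    {
        var container = new BlobContainerClient("<connection-string>", "documents");
        container.CreateIfNotExists();

        BlobClient blob = container.GetBlobClient("report.pdf");
        blob.Upload(@"C:\files\report.pdf", overwrite: true); // single call, no create-then-write
        blob.DownloadTo(@"C:\downloads\report.pdf");          // download is a single call too
    }
}
```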
Reference: How to download multiple files in a single request from Azure Blob Storage using c#?

Moving locally stored documents to Azure

I want to spike whether Azure and the cloud are a good fit for us.
We have a currently self-hosted website where users upload documents.
Every document has an equivalent record in a database.
I am using Terraform to create the Azure infrastructure.
What is the best way of migrating the documents from the local file path on the server to Azure?
Should I be using File Storage or Blob Storage? I am confused about the difference.
Is there anything in Terraform that can help with this?
Based on your comments, I would recommend storing them in Blob Storage. This service is suited for storing and serving unstructured data like files and images. There are many other features like redundancy, archiving etc. that you may find useful in your scenario.
File Storage is more suitable in Lift-and-Shift kind of scenarios where you're moving an on-prem application to the cloud and the application writes data to either local or network attached disk.
You may also find this article useful: https://learn.microsoft.com/en-us/azure/storage/common/storage-decide-blobs-files-disks
UPDATE
Regarding uploading files from your local computer to Azure Storage, there are actually many options available:
Use a GUI tool like Microsoft's Azure Storage Explorer.
Use the AzCopy command-line tool.
Use Azure PowerShell cmdlets.
Use the Azure CLI.
Write your own code using any of the available storage client libraries, or consume the REST API directly; a minimal sketch of this option follows below.
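As an illustration of the last option, here is a minimal sketch that migrates a local upload folder into a blob container, assuming the Azure.Storage.Blobs v12 SDK; the connection string, container name and folder path are placeholders.

```csharp
// Minimal sketch, assuming Azure.Storage.Blobs v12; paths and names are placeholders.
using System.IO;
using Azure.Storage.Blobs;

class MigrateDocuments
{
    static void Main()
    {
        var container = new BlobContainerClient("<connection-string>", "documents");
        container.CreateIfNotExists();

        const string sourceRoot = @"C:\site\uploads";
        foreach (string path in Directory.EnumerateFiles(sourceRoot, "*", SearchOption.AllDirectories))
        {
            // Use the relative path as the blob name so the folder layout is preserved.
            string blobName = Path.GetRelativePath(sourceRoot, path).Replace('\\', '/');
            container.GetBlobClient(blobName).Upload(path, overwrite: true);
        }
    }
}
```

As for Terraform: it can provision the storage account and container (azurerm_storage_account, azurerm_storage_container), but it is not designed for bulk-copying file contents; for a one-off migration, AzCopy is usually the better tool.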

Azure Data Factory from Blob Storage to SFTP server

After hunting through the net, I can find lots of examples of retrieving data from SFTP, but none of sending data from Blob Storage to SFTP.
Basically, I attempted to do this using a Logic App, but Azure only supports files smaller than 50 MB there (which is really dumb).
All the Azure docs I have read reference pulling but not pushing.
https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-sftp-connector
etc., etc.
Maybe someone with better googling skills can help me find the docs to help me out.
I'm using Data Factory v1, not v2. Cheers.
Always check this table to see whether a data store is supported as a source or a sink in a data movement activity.
In this case, SFTP is supported as a source but not as a sink, which means it's possible to extract data from it but not to store data on it.
Hope this helped!

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right-click the blob you want to examine and select Properties; you will see the metadata listed, if any exists.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, try the Get Blob Metadata operation.
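For the in-code check, here is a minimal sketch assuming the Azure.Storage.Blobs v12 SDK; names are placeholders. One common cause of "disappearing" metadata is that SetMetadata (and re-uploading the blob) replaces the entire metadata set, so a later call without your keys wipes them.

```csharp
// Minimal sketch, assuming Azure.Storage.Blobs v12; names are placeholders.
using System;
using System.Collections.Generic;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class CheckMetadata
{
    static void Main()
    {
        var blob = new BlobClient("<connection-string>", "mycontainer", "myblob.txt");

        // Note: SetMetadata replaces ALL metadata on the blob, not just the keys given.
        blob.SetMetadata(new Dictionary<string, string> { ["category"] = "invoice" });

        // Read it back to confirm it was persisted.
        BlobProperties props = blob.GetProperties();
        foreach (KeyValuePair<string, string> pair in props.Metadata)
            Console.WriteLine($"{pair.Key} = {pair.Value}");
    }
}
```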

Azure Blob storage and HDF file storage

I am in the middle of developing a cloud server and I need to store HDF files (http://www.hdfgroup.org/HDF5/) using Blob Storage.
Functions related to creating, reading, writing and modifying data elements within the file come from the HDF APIs.
I need to get the file path to create the file or read or write it.
Can anyone please tell me how to create a custom file on Azure Blob?
I need to be able to use the API like shown below, but passing the Azure Storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
The files I am trying to create can get really huge (~10-20 GB), so downloading them locally and modifying them is not an option for me.
Thanks
Shashi
One possible approach, admittedly fraught with challenges, would be to create the file in a temporary location using the code you included, and then use the Azure API to upload the file to Azure as a file input stream. I am still researching how size restrictions are handled in Azure Storage, so I can't say whether an entire 10-20 GB file can be moved in a single upload operation. However, since the Azure API reads from an input stream, you should be able to combine operations so that the information you need ends up in Azure Storage.
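To make the upload half concrete, here is a minimal sketch assuming the current Azure.Storage.Blobs v12 SDK (this question predates it); the connection string, container, and paths are placeholders. The SDK splits a large stream into blocks automatically, so a 10-20 GB file stays well within block blob limits.

```csharp
// Minimal sketch, assuming Azure.Storage.Blobs v12; names and paths are placeholders.
// The HDF file itself would first be written locally via the HDF5 APIs.
using System.IO;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class UploadHdf
{
    static void Main()
    {
        var blob = new BlobClient("<connection-string>", "datasets", "results.h5");

        var options = new BlobUploadOptions
        {
            TransferOptions = new StorageTransferOptions
            {
                MaximumConcurrency = 4,               // upload blocks in parallel
                MaximumTransferSize = 8 * 1024 * 1024 // 8 MB per block
            }
        };

        // Streams the file up in blocks; no need to hold it all in memory.
        using FileStream stream = File.OpenRead(@"C:\temp\results.h5");
        blob.Upload(stream, options);
    }
}
```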
Can anyone please tell me how to create a custom file on Azure Blob? I need to be able to use the API like shown below, but passing the Azure Storage path to the file.
http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c
Windows Azure Blob Storage is a service for storing large amounts of unstructured data that can be accessed via HTTP or HTTPS, so from an application's point of view Azure Blob Storage does not work like a regular disk.
Microsoft provides quite good APIs (C#, Java) for working with Blob Storage. They also provide the Blob Service REST API for accessing blobs from any other language where a specific Blob Storage API is not provided, such as C++.
A single block blob can be up to 200 GB, so it can easily hold files of ~10-20 GB.
I am afraid the provided example will not work with Windows Azure Blob as-is. However, I do not know the HDF file storage internals; maybe they provide some Azure Blob Storage support.
