I have a directory with about 100,000+ subdirectories. Every subdirectory contains from one to ten files. All files are images with content type = image/jpeg.
Together these files are over 54 GB in size. Is there any way to upload these files with a structure like
/orders/1000000003/12345468878.jpeg.
I know that blob storage is not hierarchical. I don't have Windows, I don't have PowerShell, I don't have Visual Studio.
Any suggestions?
Use the full path of your files as blob names. To upload from Linux or Mac, you can use the Azure CLI (available as an NPM package).
Even though the structure is not hierarchical, you can "emulate" directories by adding /'s to the path name.
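For example, here is a minimal sketch using the current az CLI (the command differs from the older Node-based CLI mentioned above); the storage account name, key, and container are placeholders:

    # Upload everything under ./orders, keeping each file's relative path as its blob name,
    # e.g. ./orders/1000000003/12345468878.jpeg becomes blob 1000000003/12345468878.jpeg
    # in the "orders" container.
    az storage blob upload-batch --source ./orders --destination orders --account-name mystorageaccount --account-key <storage-key> --content-type image/jpeg

The relative directory path becomes part of the blob name, so the /orders/... structure is preserved.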
There are multiple clients available for Mac that support Azure Storage; my favorite is Cyberduck: https://cyberduck.io/ (free)
I read a bunch of websites (including Stack Overflow posts and the Microsoft website) about the 2 GB file size limitation.
Microsoft Docs: Compress-Archive
"The Compress-Archive cmdlet uses the Microsoft .NET API
System.IO.Compression.ZipArchive to compress files. The maximum file
size is 2 GB because there's a limitation of the underlying API."
I have a PowerShell script to back up my Azure DevOps projects using the Azure DevOps REST API 5.0.
I download all my projects directly in zip format with the REST API, then I use Compress-Archive to consolidate all the zipped projects into one "big" zip file.
My zipped project files total 5.19 GB altogether.
After compressing them into one big zip file with Compress-Archive, I get a zip file of 5.14 GB.
I have no issue uncompressing it and I don't get any error, despite the 2 GB limitation stated in the documentation.
I wonder if it's because I'm using Windows Server 2016 (so 64-bit) that I don't hit the 2 GB file size limitation?
Can anyone confirm that? The Microsoft documentation doesn't say anything about it.
It only states that the issue is due to a limitation of the underlying System.IO.Compression API:
Microsoft Docs: system.io.compression.ziparchive
As my zip will continue to grow, I would like to be sure that it won't get corrupted due to a file size limitation.
I could indeed use a third-party tool like Zip64, 7-Zip, ..., but I would like to use only the built-in compression from PowerShell/.NET.
The 2 GB limitation applies to individual files inside the zip.
For example, if you try to expand a .zip that contains a single 2.1 GB file, it will throw an error.
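A quick way to check the behaviour described above (a hedged sketch; the paths are placeholders): consolidate several archives that are each under 2 GB but total more than 2 GB, then expand the result.

    # Consolidate already-zipped projects (each < 2 GB) into one archive.
    # The total may exceed 2 GB; only a single entry over 2 GB hits the limit.
    Compress-Archive -Path .\projects\*.zip -DestinationPath .\backup.zip -CompressionLevel Optimal

    # Expanding works as long as no single entry inside backup.zip is larger than 2 GB.
    Expand-Archive -Path .\backup.zip -DestinationPath .\restore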
Some large disks containing hundreds of 30 GB tar files have been prepared and are ready to ship.
The disks have been prepared as BLOBs using the WAImportExport tool.
The Azure file share, however, is expecting files.
Ideally we don't want to redo the disks as FILE instead of BLOB. Can we upload them as BLOBs to one storage area and then extract the millions of files from the tarballs into a FILE storage area without writing code?
Thanks
Kevin
azcopy will definitely do it and has been tested. We were able to move files from blobs to files using the CLI in Azure with the azcopy command.
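As a rough illustration, here is a hedged sketch using the classic AzCopy syntax of that era (newer versions use a different "azcopy copy" syntax); the account names, container/share names, and keys are placeholders:

    # Recursively copy everything from a blob container to an Azure Files share.
    AzCopy /Source:https://myaccount.blob.core.windows.net/tarballs /Dest:https://myaccount.file.core.windows.net/myshare /SourceKey:<storage-key> /DestKey:<storage-key> /S

Note that this copies the blobs as-is; it does not extract the contents of the tar archives.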
The information provided below was proven not to be true.
A Microsoft Partner told me yesterday that there is no realistic way to convert blobs to files in the above-mentioned scenario.
Essentially, it is important to select either WAImportExport.exe version 1 for BLOBs or WAImportExport.exe version 2 for files. Information on this can be found at this location.
The mistake was easily made, and a number of people here made it: the link to the tool that was sent pointed to the version 1 binary. Search results tended to direct users to version 1, and version 2 only appears after a deeper dig. Version 2 seems to be an afterthought by Microsoft, added when they introduced the Files option in Azure. It's a pity they didn't use different binary names, or build a switch into version 2 to handle both and retire the version 1 offering.
I currently delete obsolete folders manually from a published Azure website. I know there is an option in Visual Studio to "Remove additional files at destination". My problem is that I have a (quite large) Images folder that users upload to, which would be deleted when I publish with this option checked. My question is: is there a way to use this option with exclusions? Meaning, delete all files that are not in the local project except the "\Images" folder?
You can most likely customize the web deploy usage from VS to do what you want, but I wouldn't recommend it, since things like that tend to get fragile.
I would suggest changing your architecture to store the images in a blob container, then possibly mapping your blobs to a custom domain (https://azure.microsoft.com/en-us/documentation/articles/storage-custom-domain-name/).
Having your images in blob storage will also prevent any accidental deletion of the Images folder by someone else who doesn't know it shouldn't be touched (or you simply forgetting about it one day).
Using blob storage will also allow you to configure CDN usage if you ever find that you need it.
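If you go that route, uploading an image into a container from PowerShell could look roughly like this (a hedged sketch using the Az.Storage module; the account name, key, container, and paths are placeholders):

    # Placeholders: substitute your own storage account, key, and container.
    $ctx = New-AzStorageContext -StorageAccountName "myaccount" -StorageAccountKey $env:STORAGE_KEY
    # The blob name can contain "/" to emulate a folder structure.
    Set-AzStorageBlobContent -File "C:\uploads\user1\photo.jpg" -Container "images" -Blob "user1/photo.jpg" -Context $ctx

In the web app itself you would typically do the same thing through the storage SDK rather than PowerShell.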
Another option would be to create a virtual directory on your WebApp configuration and put the Images there - that way your VS deploy/publish wouldn't be modifying that subdirectory. This link may help with that: https://blogs.msdn.microsoft.com/tomholl/2014/09/21/deploying-multiple-virtual-directories-to-a-single-azure-website/
I have a trivial and simple task: I want to store some of my data (documents, photos, etc.) on Azure as a backup. Which type of service should I select? Store it as blobs? But I want to preserve the structure of the data (folders, subfolders, etc.). Azure Backup? It only stores archived data, and I don't want to archive everything into one file. DocumentDB? I don't need features like returning JSON, etc. What is the best way to store many (thousands of) files (big and small) while preserving the folder structure, without archiving them into a single file (so that I have a simple way to retrieve just one file quickly)?
I use Windows 7.
In addition to what knightpfhor wrote above, you might also want to take a look at Getting Started with AzCopy - it is a Windows tool that helps you copy content between your local storage and Blob Storage. PowerShell and the XPlat CLI are other options as well.
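For instance, a hedged sketch using the classic AzCopy syntax (the account, container, and key are placeholders):

    # Recursively upload a local folder; /S keeps the relative folder paths
    # as part of the blob names, so the structure is preserved.
    AzCopy /Source:C:\RootDrive /Dest:https://myaccount.blob.core.windows.net/backup /DestKey:<storage-key> /S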
Based on what you've described, blob storage can achieve what you're after. While blob storage doesn't technically support folders, it does let you include \ in the blob names, which some blob storage clients interpret as folders (for example the free Cerebrata Azure Explorer). So if you have:
RootDrive
|
--My Pictures
      |
      Picture1.jpg
      Picture2.jpg
--My Other Pictures
      |
      Picture1.jpg
You could create blobs with the names:
RootDrive\My Pictures\Picture1.jpg
RootDrive\My Pictures\Picture2.jpg
RootDrive\My Other Pictures\Picture1.jpg
And they can be interpreted as folders.
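A hedged PowerShell sketch of how you might create blob names like those programmatically (using the Az.Storage module; the account, key, and container names are placeholders, and the container is assumed to already exist):

    # Placeholders: substitute your own storage account, key, and container.
    $ctx  = New-AzStorageContext -StorageAccountName "myaccount" -StorageAccountKey $env:STORAGE_KEY
    $root = "C:\RootDrive"
    $prefix = Split-Path $root -Leaf   # "RootDrive"
    Get-ChildItem -Path $root -Recurse -File | ForEach-Object {
        # Keep the relative path, e.g. "RootDrive\My Pictures\Picture1.jpg", as the blob name.
        $blobName = "$prefix\" + $_.FullName.Substring($root.Length + 1)
        Set-AzStorageBlobContent -File $_.FullName -Container "backup" -Blob $blobName -Context $ctx
    }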
I examined Recovery Services in depth, and it now works (since December 2014) for Windows client versions too (including Windows 7, Windows 8, and Windows 8.1). It allows you to back up selected files and folders.
Here is a guide on how to use it:
http://azure.microsoft.com/blog/2014/11/04/back-up-your-data-to-the-cloud-with-3-simple-steps/
I found a folder named BlockBlobRoot in my local folder C:\Users\<user name>\AppData\Local\WAStorageEmulator\, and its size is increasing quickly.
My questions are:
What data goes into this folder, and where does it come from?
Can I keep its size from growing so much, or keep it at a certain size?
Can this folder be moved somewhere else?
What data goes into this folder, and where does it come from?
This is where blobs uploaded to the storage emulator are stored. When you upload a file to the storage emulator, a file is created in this folder and a reference to it goes into the storage emulator's database.
Can I keep its size from growing so much, or keep it at a certain size?
I don't think so. One option is to periodically delete blobs from the emulator's blob storage; that should automatically delete the corresponding files from this folder.
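For example, a hedged sketch that clears out every blob in the emulator using the Az.Storage module (the development-storage connection string below is the emulator's well-known default, and this assumes the module can talk to your emulator version):

    # Connect to the local storage emulator (well-known development storage account).
    $ctx = New-AzStorageContext -ConnectionString "UseDevelopmentStorage=true"
    # Delete every blob in every container; BlockBlobRoot should shrink accordingly.
    Get-AzStorageContainer -Context $ctx | ForEach-Object {
        Get-AzStorageBlob -Container $_.Name -Context $ctx | Remove-AzStorageBlob
    }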
Can this folder be moved somewhere else?
Yes, it can be moved. Please see this thread for more details: Azure Storage Emulator store data on specific path