Download file from Azure using AzCopy (v8) but only download the one with the newest modified time

I am trying to automate a download from Azure using AzCopy (v8). In a blob container we have a few .zip files with the prefix "BuildTools" followed by the modified time. Currently we hard-code the /Pattern flag in the AzCopy command to match the name of the file with the newest modified time, so we can download the latest version.
But I would like the script to automatically sort all files with the prefix "BuildTools" and then only download the file with the newest "Last Modified" time.
I have read the AzCopy (v8) documentation and know there are flags (/MT, /XN, /XO) related to "Modified Time", but they are not what I want.
Just wondering if anyone has any ideas on this. Thanks in advance.

If you are open to trying preview capabilities, I would suggest enabling Blob Versioning and downloading the latest version. That way the logic or lifecycle is managed on the server side, with less code on your end.
Preview documentation on Blob Versioning:
https://learn.microsoft.com/en-us/azure/storage/blobs/versioning-overview?tabs=powershell
(At the moment it is available in only a few regions, but I guess it is worth waiting for.)
Otherwise you can fetch the file or blob properties via the API, which include the last-modified date. Link below:
https://learn.microsoft.com/en-us/rest/api/storageservices/get-file-properties
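If you want to keep the client-side AzCopy approach, one way to script it is a minimal sketch like the one below, assuming the Az.Storage PowerShell module and AzCopy v8 at its default install path; the account name, key, container, and download folder are placeholders you would replace with your own values. It lists the "BuildTools" blobs, sorts by LastModified, and feeds the newest name into /Pattern instead of hard-coding it:

# Sketch: resolve the newest "BuildTools*" blob, then download it with AzCopy v8.
# $accountName, $accountKey, $container and the paths below are placeholders.
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

# List blobs with the prefix and keep the one with the newest Last Modified time
$newest = Get-AzStorageBlob -Container $container -Context $ctx -Prefix "BuildTools" |
    Sort-Object -Property LastModified -Descending |
    Select-Object -First 1

# Pass the resolved blob name to AzCopy v8 instead of a hard-coded /Pattern value
& "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" `
    /Source:"https://$accountName.blob.core.windows.net/$container" `
    /Dest:"C:\Downloads" `
    /SourceKey:$accountKey `
    /Pattern:"$($newest.Name)"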

Related

Uploading large .bak file to Azure Blob through Powershell

So I am trying to create a PowerShell script which will upload a large (> 4 GB) .bak file to Azure Blob Storage, but currently it hangs. The script works with the small files I have been using to test.
Originally the issue I was having was the requirement to have a Content-Length specified (I imagine due to its size), so I now calculate the file size of the .bak file (as it varies slightly each week) and pass it through as a request header.
I am a total PowerShell newbie, as well as being very new to Azure Blob Storage. (NOTE: I am trying to do this purely in PowerShell, without relying on other tools such as AzCopy.)
Below is my PowerShell script.
Any help would be greatly appreciated.
There are a few things to check. Since the file is big, are you sure it isn't uploading? Have you checked network activity in the Performance tab of Task Manager? AzCopy seems like a good option too that you can use from within PowerShell, but if it's not an option in your case, then why not use the native Az module for PowerShell?
I suggest using the Set-AzStorageBlobContent cmdlet to see if it helps. You can find examples in the Microsoft docs.
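For example, a minimal sketch assuming the Az.Storage module, where the account name, key, container, and file path are placeholders for your own values. Set-AzStorageBlobContent splits the upload into blocks for you, so you don't have to compute Content-Length yourself:

# Sketch: upload a large .bak with the Az module instead of raw REST calls.
# $accountName and $accountKey are placeholders for your storage account credentials.
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

# Set-AzStorageBlobContent chunks the file into blocks for you, so no manual
# Content-Length header is required even for multi-GB backups.
Set-AzStorageBlobContent -File "D:\Backups\weekly.bak" `
                         -Container "backups" `
                         -Blob "weekly.bak" `
                         -Context $ctx `
                         -Force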

Azure convert blob to file

Some large disks containing hundreds of 30 GB tar files have been prepared and are ready to ship.
The disks have been prepared as BLOB using the WAImportExport tool.
The Azure share is expecting files.
Ideally we don't want to redo the disks as FILE instead of BLOB. Are we able to upload as BLOBs to one storage area and extract the millions of files from the tarballs to a FILE storage area without writing code?
Thanks
Kevin
AzCopy will definitely do it and has been tested. We were able to move files from Blobs to Files using the CLI in Azure with the azcopy command.
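A sketch of that kind of copy is below, assuming AzCopy v10 (as found in Azure Cloud Shell) with SAS tokens on both ends; the account, container, and share names are placeholders, and the <SAS> tokens stay as-is for you to fill in:

# Sketch: server-side copy from a blob container to an Azure file share with AzCopy v10.
# Replace the account/container/share names and the <SAS> tokens with your own.
azcopy copy `
    "https://mystorageacct.blob.core.windows.net/tarballs?<SAS>" `
    "https://mystorageacct.file.core.windows.net/myshare?<SAS>" `
    --recursive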
The information provided below was proven not to be true.
A Microsoft Partner told me yesterday that there is no realistic way to convert Blobs to Files in the above-mentioned scenario.
Essentially, it is important to select either WAImportExport.exe version 1 for Blobs or WAImportExport.exe version 2 for Files. Information on this can be found at this location.
The mistake was easily made, and made by a number of people here: the link to the tool that was sent pointed to the version 1 binary. Search results tended to direct users to version 1, and version 2 only appears after a deeper dig. Version 2 seems to be an afterthought by Microsoft when they added the Files option to Azure. It's a pity they didn't use different binary names, or build a switch into version 2 to do both and retire the version 1 offering.

Upload of large company snapshot results in error "the file exceeds the maximal allowed size (1048576 KB)"

Trying to upload a large Acumatica company snapshot file (1.3 GB), I am getting an error as soon as I hit the upload button.
What setting (if any) can I change in my local Acumatica site or web.config to allow the large file import?
As a workaround I am requesting a snapshot file without file attachments, as the file attachment data is about 95% of the snapshot file size.
My file upload preferences are currently set to 25000 KB, if that helps any. (I assume this setting is not used for snapshot imports.)
The error occurs after I select the file and click OK (before being able to click the upload button). I am using 2017 R2 Update 4.
Modifying your web.config might work, but I think Sergey Marenich's alternative is better. He did an excellent post on his blog on how to do this.
http://asiablog.acumatica.com/2017/12/restore-large-snapshot.html
The idea is:
Get a snapshot of your site in XML.
Extract it and put the folder in C:\Program Files (x86)\Acumatica ERP\Database\Data.
Use the Configuration Wizard to deploy a site and select your snapshot data, just like you would when choosing demo data.
If you're on SaaS, you may request a copy of the database and restore it to an offsite instance.
If you're on PCS/PCP, you have a couple of options. You could modify the web.config to allow bigger files to be processed (a sketch of that change follows below), as detailed in this blog: https://acumaticaclouderp.blogspot.com/2017/12/acumatica-snapshots-uploading-and.html
If you have larger files, you can't do it because of the IIS constraint. You can certainly use Sergey's method, but that only works when creating a new instance; a simpler approach is to take a SQL .bak file and restore it to a new database.
I think Acumatica should provide a mechanism to split these large files and process them as multiple uploads, but then again very few customers are likely to face this issue.
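For the web.config route mentioned above, the limits that usually need raising are httpRuntime maxRequestLength (in KB) and requestFiltering maxAllowedContentLength (in bytes). Below is a hedged PowerShell sketch, not a verified Acumatica procedure: the site path and the 2 GB values are placeholders, it assumes those elements already exist in the file, and the settings that actually govern snapshot import may vary by version.

# Hedged sketch: raise upload limits in a local Acumatica site's web.config.
# Back up the file first; the path and 2 GB values below are placeholders.
$configPath = "C:\inetpub\wwwroot\AcumaticaERP\web.config"
[xml]$cfg = Get-Content -Raw $configPath

# httpRuntime maxRequestLength is measured in KB (2097152 KB = 2 GB)
$cfg.configuration.'system.web'.httpRuntime.SetAttribute("maxRequestLength", "2097152")

# requestFiltering maxAllowedContentLength is measured in bytes (2147483648 = 2 GB);
# this assumes the security/requestFiltering/requestLimits element is present.
$cfg.configuration.'system.webServer'.security.requestFiltering.requestLimits.SetAttribute(
    "maxAllowedContentLength", "2147483648")

$cfg.Save($configPath)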
I had this same problem. I tried to modify the web.config, but that gave me an error saying the file didn't exist or I didn't have permissions when I tried to import the snapshot file into Acumatica again.
Turns out I had a table with image blobs stored inside of it, so it wouldn't compress. Watch out for that one.

I want to take a backup and restore of a SQLite database file using the Xamarin Azure SDK for my Xamarin app

I am using the Xamarin Azure SDK to download and manage the local database for my Xamarin.Forms app.
We are facing long download times because we have a lot of data,
so I am thinking of taking a backup of the SQLite file once from one device and using it to restore the same SQLite file on different devices.
The plan is to store the backup of the SQLite file in Azure Blob Storage, and on each other device download that blob of the SQLite file and restore it.
Any Help will be appreciated.
Thanks :)
An approach I have used in the past is to create a controller method on the Azure end which the client app can call. It generates a pre-filled SQLite database or 'snapshot' on the server (making sure you include all the extra Azure tables and columns) and then returns a download URL for the file to the client. We also zip up the snapshot database to reduce the download time. You could store this 'snapshot' in Azure Blob storage if you desired.
Please refer to the given link. The only thing SQLite does not support is relationships such as foreign keys.
Memory Stream as DB
You can upload the backup file to Blob storage with the respective user details, and when there is a call with the same user details you can download it from the blob.
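A rough sketch of the server side of that idea, assuming the Az.Storage PowerShell module; the account name, key, container, and per-user blob names are placeholders. It uploads the zipped SQLite snapshot for a user and then hands the device a time-limited SAS URL rather than the account key:

# Sketch: store a per-user SQLite backup in Blob storage and issue a read-only SAS URL.
# $accountName, $accountKey, and the container/blob names are placeholders.
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

# Upload the (zipped) SQLite snapshot under a per-user path
Set-AzStorageBlobContent -File "C:\snapshots\app-seed.zip" `
                         -Container "sqlite-backups" `
                         -Blob "user123/app-seed.zip" `
                         -Context $ctx -Force

# Return a read-only download URL that expires in 24 hours
New-AzStorageBlobSASToken -Container "sqlite-backups" `
                          -Blob "user123/app-seed.zip" `
                          -Permission r `
                          -ExpiryTime (Get-Date).AddHours(24) `
                          -FullUri `
                          -Context $ctx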
These are the links that provide you with the code and knowledge required to use Azure Blob Storage from Xamarin:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-xamarin-blob-storage
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-dotnet-how-to-use-blobs
As this question is very general, I can only provide these general links. There are many details on how to implement this in your case; if you face a specific problem, I recommend asking another question with an exact description of that specific problem.
EDIT: According to your comment you have some problems replacing the local file. The only catch is that you must replace it before you initialize SQLite; otherwise it is a simple file operation.

CloudBerry PowerShell Multipart

I've written a PowerShell script to upload from a Windows system to an Amazon S3 bucket. The script successfully uploads all files except those over 5 GB. I have a CloudBerry Explorer Pro license, which allows multipart upload for files up to 5 TB. However, there is no flag for multipart in the PowerShell snap-in documentation. CloudBerry support directed me here, as they only support the GUI, not the PowerShell snap-in. When running my script I get the error
"WARNING: Your proposed upload exceeds the maximum allowed object size (5 Gb)".
So the question is: does anyone know if there is a command-line option, or another way, to enable multipart upload to Amazon S3 using CloudBerry Explorer Pro's PowerShell snap-in?
Set-CloudOption UseChunks=true
I'm looking for the same in PowerShell.
I believe that in the GUI the original chunking mechanism has been deprecated. I have not tested it myself, but I assume the PowerShell option UseChunks=true is still using the old mechanism? If so, files may be split into multiple parts and not automatically recombined when they arrive on S3. The new GUI multipart upload facility sorts this all out for you.
I'm annoyed that CloudBerry still advertises PowerShell as a component of Explorer (Free & Pro) but doesn't support it, even for fully paid-up Pro support customers.
We purchased the CloudBerry Explorer Pro license for the native multipart upload capability, but we wanted to automate it. Based on their documentation, I believe the old chunk method is deprecated in favor of their new multipart functionality. We wound up testing the options listed in the PowerShell documentation. Those options are as follows:
Set-CloudOption -UseChunks -ChunkSizeKB
"Defines a size of chunk in KB; files larger than a chunk will be divided into chunks."
We verified that this successfully uploaded files beyond the 5 GB restriction to our S3 bucket. I attempted to get a response from CloudBerry as to whether this was the old chunking method or the new multipart method, but I was unable to get a straight answer. They confirmed that because we were using Pro this PowerShell option was supported, but they did not confirm which mechanism the PowerShell command was using.
From what I can tell, CloudBerry's legacy chunking mechanism would simply break the file into individual files, which would then appear in S3 as multiple files. The chunk transparency mechanism in CloudBerry Explorer makes the multiple chunks appear as a single file in the CloudBerry Explorer GUI only. Since I can see the file as a single file on the S3 side, I'm assuming that the PowerShell option uses the new multipart functionality and not the legacy chunking functionality. Again, I was not able to confirm this through CloudBerry, so it's speculation on my part.
However, I can confirm that using the PowerShell options above will get you around the 5 GB upload limit when using PowerShell.
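For anyone automating the same thing, a minimal sketch is below. The snap-in name and the 100 MB chunk size are assumptions you should verify against your Explorer Pro install; only the Set-CloudOption flags come from the documentation quoted above.

# Sketch: enable chunked/multipart transfers in the CloudBerry Explorer PowerShell snap-in.
# The snap-in name and the 100 MB chunk size (value in KB) may differ in your version.
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn

# Option combination quoted in the snap-in documentation above; this is what
# got our uploads past the 5 GB single-object limit.
Set-CloudOption -UseChunks -ChunkSizeKB 102400

# ...the rest of your existing upload commands go here...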
