Rename Azure Storage Table? - azure

Is it not possible to rename an Azure Storage Table?
I cannot seem to find anything online (not even cmdlets). There are no options for this in Visual Studio Server Explorer, Cloud Storage Studio or TableXplorer.

You're correct. It is not possible to rename an Azure Storage Table (or Blob Container or Queue for that matter).
A possible solution would be to download all entities from the table and upload them again into another table. Once all entities are uploaded, you can delete the old table. When downloading entities, please keep continuation tokens in mind, as a table query returns at most 1,000 entities per request.
You can download all entities using either Cloud Storage Studio (or Azure Management Studio) from Cerebrata or TableXplorer. If you want, you can use Azure Management Cmdlets from Cerebrata as well. It has cmdlets to export a table (Export-Table) and restore a table (Restore-Table).
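If you'd rather script the copy yourself, here is a minimal sketch using the azure-data-tables Python package (the connection string and table names are placeholders, not from the original answer); the SDK's paging follows continuation tokens for you:

```python
# pip install azure-data-tables
from azure.data.tables import TableServiceClient

# Placeholder connection string and table names -- replace with your own.
CONNECTION_STRING = "<storage-account-connection-string>"
SOURCE_TABLE = "OldTableName"
TARGET_TABLE = "NewTableName"

service = TableServiceClient.from_connection_string(CONNECTION_STRING)
source = service.get_table_client(SOURCE_TABLE)
target = service.create_table_if_not_exists(TARGET_TABLE)

# list_entities() pages through results, following continuation tokens internally.
copied = 0
for entity in source.list_entities():
    target.create_entity(entity)  # consider batching per PartitionKey for speed
    copied += 1

print(f"Copied {copied} entities; verify the target before deleting the source.")
# service.delete_table(SOURCE_TABLE)  # only after verifying the copy
```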

Now, you can rename Azure Tables with Microsoft's "Microsoft Azure Storage Explorer" (after version 0.8.3). You can also rename containers and file shares with this tool. See the release notes here.
Note that this feature has the following disclaimer during usage.
Renaming works by copying to the new name, then deleting the source item. Renaming a table currently loses the table's properties and metadata, and may take a while if there are lots of entities.
Therefore this is not an actual rename behind the scenes, and it incurs read/write/transaction costs.

You can also use AzCopy, which is a Microsoft command line tool for downloading/moving table data.

Related

Copying data from azure table storage for the past 30 days

We are in the process of migrating our manually managed production environment to Terraform, and as part of that we will be creating all the resources required for the environment anew. One such resource is the storage account.
We have a storage account that has close to 1500+ tables and each table consisting of millions of records with a timestamp attached to each of these records. During the migration we are mostly interested in copying the records for the past 30 days.
I was wondering if there's a tool that could help us perform this copy operation effectively and with the least time spent.
We looked into AzCopy, but it only allows a one-to-one copy; copying billions of records might take us days, and from what I learned online AzCopy doesn't support queries to copy only records newer than a certain timestamp.
It would be helpful to get some insights on different tools and techniques we could adopt to accomplish this.
As far as I know, there is no tool that can copy table storage starting from a specified timestamp. You could write your own logic to filter on the Timestamp property, but since Timestamp is not indexed (only PartitionKey and RowKey are), such a query forces a full table scan and performs poorly.
Here, I suggest a tool named EastFive.Azure.Storage.Backup. It supports copying Azure blobs / Azure table storage to a new storage account. For Azure table storage it supports copying an array of specified partition keys, but it does not support a specified timestamp.
If you're interested in it, you can follow the simple steps below:
1. Create a folder named "backup" on the D drive, then download all 4 projects mentioned in Prerequisites into D:\backup.
2. Unzip all 4 projects and open them one by one in Visual Studio -> in the NuGet package manager, update all the old packages -> build them one by one and make sure each builds successfully.
3. Open backup.json in the EastFive.Azure.Storage.Backup project and fill in your sourceConnectionString and targetConnectionString.
If you don't want to copy blobs, just remove the blobs section.
The timeLocal field at the end specifies when to run the copy activity, according to your local time.
4. You can install it as a service and start the service to run the copy activity.
I tested it on my side, and all of my Azure table storage was copied to the new storage account.
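If you do end up writing your own copy logic as mentioned above, here is a minimal sketch using the azure-data-tables Python package that filters on the Timestamp property. The connection strings and table name are placeholders; note that the filter still scans each table server-side, so for 1500+ tables you would want to run it in parallel per table.

```python
# pip install azure-data-tables
from datetime import datetime, timedelta, timezone
from azure.data.tables import TableServiceClient

# Placeholder connection strings and table name -- replace with your own.
SOURCE_CONN = "<source-account-connection-string>"
TARGET_CONN = "<target-account-connection-string>"
TABLE_NAME = "MyTable"

cutoff = datetime.now(timezone.utc) - timedelta(days=30)

source = TableServiceClient.from_connection_string(SOURCE_CONN).get_table_client(TABLE_NAME)
target = TableServiceClient.from_connection_string(TARGET_CONN).create_table_if_not_exists(TABLE_NAME)

# Timestamp is not indexed, so the filter is evaluated as a scan server-side,
# but only the matching entities are returned and written to the target account.
for entity in source.query_entities("Timestamp ge @cutoff", parameters={"cutoff": cutoff}):
    target.upsert_entity(entity)
```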

Azure Unzip automation

I am looking to do the following in Azure; however, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is if this is possible to do using Azure alone or am I going to need admin rights / Visual Studio? If it is possible any directions that you could point me in would be greatly received!
Thanks
Dave
Based on your description, one approach would be to create a Blob Triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file and save the desired data into Azure SQL. Moreover, considering that there is only one new file per day, prefer the Consumption Plan to optimize cost.
Find more details about Azure Function Blob Binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
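As a rough illustration of that approach, below is a minimal sketch of such a Blob Triggered Function using the Python programming model. The container name, the AzureWebJobsStorage connection setting, the SQL_CONNECTION_STRING app setting, the CSV layout inside the zip and the dbo.MyTable target are all assumptions for the sketch, not part of the original answer.

```python
# function_app.py -- Azure Functions, Python v2 programming model
# requirements.txt: azure-functions, pyodbc
import io
import os
import zipfile

import azure.functions as func
import pyodbc

app = func.FunctionApp()

# "incoming" container and the "AzureWebJobsStorage" connection setting are placeholders.
@app.blob_trigger(arg_name="blob", path="incoming/{name}", connection="AzureWebJobsStorage")
def unzip_to_sql(blob: func.InputStream):
    data = blob.read()

    # Assumed: the zip contains CSV files whose rows look like "col1,col2".
    rows = []
    with zipfile.ZipFile(io.BytesIO(data)) as archive:
        for member in archive.namelist():
            for line in archive.read(member).decode("utf-8").splitlines():
                rows.append(line.split(","))

    # SQL_CONNECTION_STRING is an app setting you define; dbo.MyTable is a placeholder table.
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cursor = conn.cursor()
        cursor.executemany("INSERT INTO dbo.MyTable (Col1, Col2) VALUES (?, ?)", rows)
        conn.commit()
```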
Alternatively, spin up Azure Data Factory; an unzip (decompression) capability is available in ADF.

Auto backup Azure Tables used by a WebApp in Azure Portal

My Azure WebApp stores data in Azure Storage Tables and Blob storage.
There is backup functionality, but as I understand it, it simply does not support Azure tables/blobs... However, I would like to automatically back up my tables to protect against accidental data corruption by users or by a software issue...
I would like to back up MyProdTables into the MyBackupBlob container. Is there a way to do it?
I read something about AzCopy, but it seems to work with virtual machines' hard drives, and we have the web application as a service, so I am not sure that it will work in our case...
Edit: There is partial (negative) MS feedback on the question, as mentioned in this answer, but it is focused on migration and entire account snapshots. I am focused on table storage, and maybe even the possibility of backing up individual tables... It is strange that nothing is possible in this area, because MS Azure Storage Explorer can easily back up the tables as CSV files.
There is no built-in backup feature for blobs and tables, as you've surmised. However: blobs do offer snapshots (a point-in-time snapshot may be taken at any time).
There are also Shared Access Signatures (and Policies) to limit exposure to your storage. And you can even protect the storage account itself from deletion.
As for AzCopy: that has nothing to do with VM disks. That's specifically for moving content in and out of blobs and tables.
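For a scheduled, scripted alternative (e.g. from a WebJob or a timer-triggered Function), here is a minimal sketch that exports a table to a CSV blob using the azure-data-tables and azure-storage-blob Python packages. The connection string, table and container names are placeholders, and it assumes all entities share roughly the same set of properties.

```python
# pip install azure-data-tables azure-storage-blob
import csv
import io
from datetime import datetime, timezone

from azure.data.tables import TableServiceClient
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
TABLE_NAME = "MyProdTables"
BACKUP_CONTAINER = "mybackupblob"  # container names must be lowercase

table = TableServiceClient.from_connection_string(CONNECTION_STRING).get_table_client(TABLE_NAME)
blobs = BlobServiceClient.from_connection_string(CONNECTION_STRING)

# Dump every entity to an in-memory CSV; columns come from the first entity,
# so this assumes a reasonably uniform schema (extra properties are ignored).
buffer = io.StringIO()
writer = None
for entity in table.list_entities():
    if writer is None:
        writer = csv.DictWriter(buffer, fieldnames=list(entity.keys()), extrasaction="ignore")
        writer.writeheader()
    writer.writerow(dict(entity))

# Upload the CSV as a dated blob; schedule this regularly for ongoing backups.
blob_name = f"{TABLE_NAME}-{datetime.now(timezone.utc):%Y%m%d}.csv"
blobs.get_blob_client(BACKUP_CONTAINER, blob_name).upload_blob(buffer.getvalue(), overwrite=True)
```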

View Azure Blob Metadata Online

Is there a way to examine an Azure blob's metadata through a web interface or the Azure portal?
I'm running into a problem where I set metadata on a blob programmatically, without any problems, but when I go back to read the metadata in another section of the program there isn't any. So I'd like to confirm that the metadata was, in fact, written to the cloud.
One of the simplest ways to set/get an Azure Storage Blob's metadata is by using the cross-platform Microsoft Azure Storage Explorer, which is a standalone app from Microsoft that allows you to easily work with Azure Storage data on Windows, macOS and Linux.
Just right click on the blob you want to examine and select Properties, you will see the metadata list if they exist.
Note: Version tested - 0.8.7
There is no way to check this in the portal; however, you can try the Storage Explorer tool.
If you want to check the metadata in your code, please try this: Get Blob Metadata.
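If you'd rather verify it from code, a minimal Python sketch using the azure-storage-blob package looks like this (the connection string, container and blob names are placeholders):

```python
# pip install azure-storage-blob
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
blob = BlobServiceClient.from_connection_string(CONNECTION_STRING).get_blob_client(
    container="mycontainer", blob="myblob.txt"  # placeholders
)

# Metadata names are case-insensitive; values must be strings.
blob.set_blob_metadata({"department": "finance", "reviewed": "true"})
metadata = blob.get_blob_properties().metadata
print(metadata)  # confirms what was actually persisted on the blob
```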

Create Azure Table Storage

Can I check if my understanding is correct here.
To create an Azure Storage table I have to use C# or JavaScript, PHP, etc.
There is no GUI for simply creating a table? And if there is a GUI, is it a popular/recommended approach or a niche thing?
What you're looking for is a Storage Explorer. There are many storage explorers available in the market today - There are both open source and commercial (both paid and free) storage explorers available. Please see this blog post from Windows Azure Storage Team about the list of storage explorers: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx.
Apart from these, Visual Studio also has a storage explorer built into it. You can find that in the Server Explorer. I haven't used Eclipse but I have heard that there's a storage explorer there as well.
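And if you do end up creating the table from code rather than through a storage explorer, a minimal Python sketch with the azure-data-tables package (the connection string and table name are placeholders) looks like this:

```python
# pip install azure-data-tables
from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string("<storage-account-connection-string>")
table = service.create_table_if_not_exists("MyNewTable")  # no-op if the table already exists

# Insert a first entity; PartitionKey and RowKey are the only required properties.
table.create_entity({"PartitionKey": "demo", "RowKey": "1", "Message": "hello"})
```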

Resources