How can I backup an Azure Cosmos DB [closed] - azure

I have an Azure Cosmos DB account and I need to delete all the resources in this subscription. Is there any way to take an offline backup from the portal?

UPDATE: Cosmos DB now supports backups natively; see Online backup and on-demand data restore in Azure Cosmos DB.
You can use the Data Migration Tool, suggested in the Automatic online backup and restore with Azure Cosmos DB article, to do the same.
Other than that, there is no built-in way to take a backup yourself and import it back into Azure Cosmos DB.
The recommendation is to open a support ticket (e.g. via the Azure Portal) or call Azure Support to work out a backup/restore strategy, and to request that Azure restore the latest backup in case of a disaster event. In addition, you can contact the Azure Cosmos DB team by emailing askcosmosdb@microsoft.com.

Sajeetharan's answer is wrong. If your account uses the MongoDB API, you can take a backup yourself:
mongodump --uri="PRIMARY_CONNECTION_STRING"
Use this command. It creates a dump in your current working directory.
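
For completeness, restoring that dump into another MongoDB API account is the mirror operation; a minimal sketch, assuming the target account's connection string and the dump directory created above (both placeholders):

# Restore the dump produced by mongodump into the target account
mongorestore --uri="TARGET_CONNECTION_STRING" ./dump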

Microsoft has finally introduced a backup policy feature for Cosmos DB! See https://learn.microsoft.com/en-us/azure/cosmos-db/online-backup-and-restore#modify-the-backup-interval-and-retention-period
I guess this removes the need for third-party and/or custom tools for such a basic ops routine.
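
If you prefer scripting to the portal, the same policy can be changed with the Azure CLI; a minimal sketch, assuming a periodic-backup account named mycosmos in resource group mygroup (both placeholders):

# Back up every 4 hours (interval is in minutes), keep each backup for 16 hours
az cosmosdb update \
    --name mycosmos \
    --resource-group mygroup \
    --backup-interval 240 \
    --backup-retention 16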

Automatic backup is not free... it requires a Standard support plan ($100/mo).
I'm using the free Azure Cosmos DB Data Migration Tool instead.
EDIT: thanks @e2eDev for the link: Azure Cosmos DB Data Migration Tool 1.8.3 on GitHub.
There is both a GUI tool (dtui.exe) and a CLI tool (dt.exe).
It supports many sources and targets, including plain JSON files (for both import and export).
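
For example, exporting a collection to a JSON file with the CLI tool looks roughly like this; a sketch, assuming a SQL API account (connection string, database, and collection names are placeholders):

rem Export a Cosmos DB collection to a local JSON file
dt.exe /s:DocumentDB ^
    /s.ConnectionString:"AccountEndpoint=https://myaccount.documents.azure.com:443/;AccountKey=<key>;Database=mydb" ^
    /s.Collection:mycollection ^
    /t:JsonFile ^
    /t.File:backup.json

Importing is the same command with source and target swapped (/s:JsonFile, /t:DocumentDB).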

Related

Azure Dedicated SQL Pool (Formerly SQL DW) is going to be removed from Azure? [closed]

I am using a Dedicated SQL Pool (formerly SQL DW), and I am not sure whether this is the new Synapse or whether it will be removed from Azure in the future.
I need to know the difference between Dedicated SQL Pool (formerly SQL DW) and Synapse.
Azure Synapse brings together data integration, enterprise data warehousing, and big data analytics, and provides a unified experience in a single workspace. Dedicated SQL Pools are part of this workspace. However, Dedicated SQL Pool (formerly SQL DW) will remain a stand-alone service in Azure for those who do not want all the other features of Synapse Analytics.

How to query Azure Data Lake? [closed]

Coming from the database world, whenever we work with data we use a UI tool to query it, be it big or small.
Is there anything like SSMS, SQL Workbench (for Redshift), or Athena (for querying big data on S3) for Azure Data Lake?
I see that Data Lake Analytics just queries the data and stores the result in a file. Is there any way to query the data on Azure Data Lake via a UI or web-based tool?
No, there is not (yet). Sure, you can run a query using the portal, Visual Studio (docs), or Visual Studio Code (docs), but all those tools only provide access to the generated output file (which can be easily obtained or previewed).
The main reason is that U-SQL / Data Lake Analytics is geared toward long-running jobs (which can take from a few minutes to hours) that process vast amounts of data. Keeping that in mind, you can hopefully better understand why this kind of direct query tooling is not (yet?) available.
EDIT: try upvoting this on the feedback site. What you are asking for is a highly requested feature.
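
If a terminal is close enough to a UI for you, the Azure CLI can at least submit an ad-hoc U-SQL job; a minimal sketch, assuming a Data Lake Analytics account named mydla and a local script file (both placeholders):

# Submit an ad-hoc U-SQL job, then block until it finishes
az dla job submit --account mydla --job-name adhoc-query --script @query.usql
az dla job wait --account mydla --job-identity <job-id-from-submit-output>

The result still lands in an output file, as described above; these are submitted jobs, not interactive queries.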
You can download Azure Storage Explorer from here: https://azure.microsoft.com/en-us/features/storage-explorer/
Upload, download, and manage Azure blobs, files, queues, and tables, as well as Azure Cosmos DB and Azure Data Lake Storage entities. Easily access virtual machine disks, and work with either Azure Resource Manager or classic storage accounts. Manage and configure cross-origin resource sharing rules.
You can create an external table in SQL Server pointing to Data Lake files. The only catch is that you have to take care of schema changes manually.
You can also use Spark SQL through Azure Databricks to query Azure Data Lake files.

Move Azure VHD from Premium to Standard Storage [closed]

I have a VM that currently has the OS disk in Premium Storage -- I'd prefer that it use Standard Storage and my data disks use Premium Storage. That said, is there an easy method to move the existing VHD from Premium to Standard?
You will need to:
1. delete the VM while preserving the disks
2. use AzCopy to copy the OS disk to Standard Storage
3. create a Premium Storage capable VM using the copied disk
This may be more trouble than it is worth. You can likely script it by downloading the configuration prior to deletion, doing the copy, then modifying the configuration and creating the new VM.
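
The copy step would look roughly like this with the current azcopy (the original answer predates it); a sketch, assuming SAS tokens on both sides and placeholder account, container, and blob names:

# Copy the OS disk VHD from the premium account to a standard account
azcopy copy \
    "https://premiumaccount.blob.core.windows.net/vhds/osdisk.vhd?<src-sas>" \
    "https://standardaccount.blob.core.windows.net/vhds/osdisk.vhd?<dst-sas>"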
Jdixon04,
We published an article which outlines a step-by-step guide for migrating to Premium Storage here (https://azure.microsoft.com/en-us/documentation/articles/storage-migration-to-premium-storage/). There is also a sample script at the end of the article if you wish to automate the flow. If you have multiple VMs to migrate, automation through PowerShell scripts will be helpful. Let us know if you need additional information.
Aung

Azure Website content or blob storage - which is faster? [closed]

I have a website in Azure which advertises my software. Users can download my software package, which is about 100 MB. Right now I'm putting this file in a folder in the Visual Studio project as a content file and publishing it along with the other project files.
Would moving the download files to Azure Blob Storage give faster download speeds than my current approach?
Geo-redundant blobs are replicated to a secondary data centre, but unless you use "Read-Access Geo-Redundant Storage" the blob will only be served from the primary data centre. With RA-GRS you can optionally access the secondary via a different domain, but this will not improve latency for users outside your region.
For best download performance and scalability, store your file in blob storage and cache it close to your users with Azure CDN.
With the file in blob storage you can also get it geo-replicated, so that it's closer to your users. You can also set up a CDN endpoint on that blob storage.
And it will at least make your deployment a lot faster, since you won't need to upload the 100 MB file ;) (yes, I know, Web Deploy does not upload unmodified files, so that's not always the case)
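
Setting up that CDN endpoint is a couple of calls with today's Azure CLI; a minimal sketch, assuming a storage account mystorage and resource group mygroup (all names are placeholders):

# Create a CDN profile, then an endpoint whose origin is the blob service
az cdn profile create --name myprofile --resource-group mygroup --sku Standard_Microsoft
az cdn endpoint create --name mydownloads --profile-name myprofile --resource-group mygroup \
    --origin mystorage.blob.core.windows.net

Point the download link at the endpoint hostname instead of the raw blob URL.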

What tools can provide scheduled backups of Azure blob storage? [closed]

I'm looking for the best way to prevent accidental deletion by IT - perhaps copying to disk, to a separate Azure Storage account, or to Amazon. What tools can do this? Redgate Cloud Services seems like the closest fit for what I want, but it seems to require configuration per container. I know some other tools like Cloud Storage Studio and Azure Sync Tool exist, but I don't think they support scheduled backups of blob storage.
Windows Azure Storage is backed by geo-replication, which means there are a total of 6 copies of your data at any given time. There is no built-in service in Windows Azure to back up Azure Storage data to a location outside Azure Storage or to a user-defined location.
Windows Azure Storage is managed through a RESTful interface, so third-party vendors have created applications for such purposes. Besides the above, I have had the chance to use the Gladinet Cloud Backup solution, which could be useful in your case. Based on my experience, there are a few backup tools available, but not a single one is perfect enough to match everybody's expectations.
A cheap way to prevent accidental deletion by IT is to snapshot the blobs into a backup container. IT would have to be very persistent and delete all of the snapshots taken of the original blob in order to accidentally delete it.
"A blob that has snapshots cannot be deleted unless the snapshots are also deleted. You can delete a snapshot individually, or tell the storage service to delete all snapshots when deleting the source blob. If you attempt to delete a blob that still has snapshots, your call will return an error."
http://msdn.microsoft.com/en-us/library/windowsazure/hh488361
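
Taking those snapshots on a schedule is a one-liner per blob with today's Azure CLI; a minimal sketch, assuming placeholder account, container, and blob names:

# Take a point-in-time, read-only snapshot of a blob
az storage blob snapshot \
    --account-name mystorage \
    --container-name backups \
    --name installer.zip

Run it from cron, a scheduled task, or Azure Automation to get the scheduled behaviour the question asks about.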
CloudBerry Backup: it supports Amazon S3, Azure, Google, and many more cloud storage providers.
http://www.cloudberrylab.com/amazon-s3-microsoft-azure-google-storage-online-backup.aspx
