Azure Resource Manager SQL DB templates

I am trying to create an ARM template for an Azure SQL DB deployment. I started by exporting the template for an existing Azure SQL database that was created and configured from the portal. However, I see quite a few fields whose usage looks unfamiliar to me; examples below:
a. kind
b. serviceLevelObjective
c. currentServiceObjectiveId
d. requestedServiceObjectiveId
e. containmentState
f. readScale
Is there any place where I can find information on what each of these properties/keys means and what the valid values are, so that I know how to use them?

Your best bet is the Azure REST API reference; it has examples and definitions for the parameters.
Also, you don't need all of those; you can just use the "regular" way of creating Azure SQL resources defined here.
Also, MSDN reference: https://msdn.microsoft.com/en-us/library/azure/mt163685.aspx
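For comparison, here is a minimal sketch of the kind of template the quickstart samples use; the server/database names and service objective are placeholder values. Properties such as kind, serviceLevelObjective, currentServiceObjectiveId and containmentState are essentially read-only outputs that the export includes but that you can simply omit:
```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "serverName": { "type": "string" },
    "databaseName": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Sql/servers/databases",
      "apiVersion": "2014-04-01",
      "name": "[concat(parameters('serverName'), '/', parameters('databaseName'))]",
      "location": "[resourceGroup().location]",
      "properties": {
        "edition": "Standard",
        "requestedServiceObjectiveName": "S0",
        "collation": "SQL_Latin1_General_CP1_CI_AS",
        "maxSizeBytes": "268435456000"
      }
    }
  ]
}
```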

Related

Automatically adding data to cosmos DB through ARM template

I made an ARM template which runs through an Azure DevOps pipeline to create a new Cosmos instance and put two collections inside it. I'd like to put some data inside the collections (fixed values, same every time). Everything is created in the standard way; e.g. the collections use
"type": "Microsoft.DocumentDb/databaseAccounts/apis/databases/containers"
I think these are the relevant docs.
I haven't found much mention of automatically adding data, but it's such an obviously useful thing that I'm sure it will have been added. If I need to add another step to my pipeline to add data, that's an option too.
ARM templates are not able to insert data into Cosmos DB or any service with a data plane for many of the reasons listed in the comments and more.
If you need to both provision a Cosmos DB resource and then insert data into it, you may want to consider creating another ARM template to deploy an Azure Data Factory resource and then invoking its pipeline using PowerShell to copy the data from Blob Storage into the Cosmos DB collection. Based upon the ARM doc you referenced above, it sounds as though you are creating a MongoDB collection resource. ADF supports MongoDB, so this should work very well.
You can find the ADF ARM template docs here, and the ADF PowerShell docs can be found here. If you're new to using ARM to create ADF resources, I recommend first creating the factory in the Azure portal, then exporting it and examining the properties you will need to drive with parameters or variables during deployment.
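The invoke-and-wait step could look something like this minimal PowerShell sketch; the resource group, factory and pipeline names are hypothetical, and it assumes the Az.DataFactory module plus an existing copy pipeline:
```powershell
# Kick off the ADF pipeline that copies the seed data from Blob Storage into Cosmos DB.
$runId = Invoke-AzDataFactoryV2Pipeline `
    -ResourceGroupName 'my-rg' `
    -DataFactoryName 'my-adf' `
    -PipelineName 'CopyBlobToCosmos'

# Poll until the pipeline run reaches a terminal state.
while ($true) {
    $run = Get-AzDataFactoryV2PipelineRun `
        -ResourceGroupName 'my-rg' `
        -DataFactoryName 'my-adf' `
        -PipelineRunId $runId
    if ($run.Status -in 'Succeeded', 'Failed', 'Cancelled') { break }
    Start-Sleep -Seconds 15
}
$run.Status
```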
PS: I'm not sure why, but the container resource path (below) you pointed to in your question should not be used, as it breaks a few things in ARM; namely, you cannot put a resource lock on it or use Azure Policy. Please use the latest api-version, which as of this writing is 2021-04-15.
"type": "Microsoft.DocumentDb/databaseAccounts/apis/databases/containers"

Is it possible to access data within a table in a dedicated SQL pool in Azure Synapse using REST API endpoints?

I am trying to see whether it is possible to access some data stored within a table in a dedicated SQL pool in Azure Synapse using the REST API, but I have not been able to figure much out. I checked the official Microsoft docs, and at most I have been able to query for a specific column within a table, not much beyond that. I am wondering whether it is even possible to get data through the Azure Synapse REST API. I would appreciate any help.
Docs for reference: https://learn.microsoft.com/en-us/rest/api/synapse/
It is not possible to access the data in Synapse dedicated SQL pools using REST APIs; today the REST API can only be used to manage compute.

Is there any Azure service that can simulate the concept of a global Azure 'variable' to hold a single value?

I am looking for some Azure service that can store a value so that I can fetch it from any other Azure service. It's storage, basically, but extremely lightweight storage: it should allow one to define a variable for a given subscription whose value can then be updated from any other Azure service. Azure Data Factory recently introduced global parameters at the data factory level; even this could serve the purpose to some limited extent if it were mutable, but it's a parameter, not a variable, so its value can't be updated. Even a solution that works only within Data Factory would be fine. One could always store such a value in SQL or blob storage, but that sounds like overkill. Having a global Azure variable is a genuine requirement, so I am wondering if there is anything like that.
Please consider Azure Key Vault. You can define a secret there to hold this value. However, I'm not sure what integration with other Azure services you need.
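A minimal sketch with the Az.KeyVault PowerShell cmdlets, assuming the vault already exists and the caller has get/set permissions on secrets (vault and secret names are hypothetical; the -AsPlainText switch needs a recent Az.KeyVault version):
```powershell
# Write (or update) the "global variable" as a Key Vault secret.
$value = ConvertTo-SecureString 'my-current-value' -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName 'my-shared-vault' -Name 'GlobalVariable' -SecretValue $value

# Read it back from any other service that can authenticate to the vault.
Get-AzKeyVaultSecret -VaultName 'my-shared-vault' -Name 'GlobalVariable' -AsPlainText
```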
You have several options:
Cosmos DB Table API
Redis
Table Storage
ref: https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview#keyvalue-stores

VSTS Database Deployment to Azure - Loop a Task Group (or something like that)

I am trying to set up continuous deployment for a group of Azure databases that all share the same schema. In my situation, there are a number of dynamic databases that get created by copying and renaming a standard template. The software will make a copy of the CompanyTemplate database and rename it to Company_XXXX.
I would like to create a Task Group and/or a script in VSTS (hosted) that can query the master database, get a list of the company database names and then loop said Task Group in order to deploy the same schema and scripts to each of the Company databases that get created.
I have been Googling and testing odds and ends for days, but I cannot find anything pertaining to how this can be done. Any thoughts? Is this possible?
There is no loop concept in the VSTS Build/Release environment.
There are a few workarounds that spring to mind:
Run a PowerShell script and implement the logic there, using the loop constructs in PowerShell.
Run a PowerShell script to trigger as many builds as you want using the REST API (see the sketch below).
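As a hedged sketch of the second workaround: the account, project and definition id are hypothetical, it assumes a PAT with build (read and execute) scope, and the TargetDatabase variable would have to be marked settable at queue time on the build definition:
```powershell
# Build the Basic auth header from a personal access token.
$pat = $env:VSTS_PAT
$headers = @{ Authorization = 'Basic ' + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Queue one build per company database, passing the DB name as a build variable.
foreach ($db in @('Company_0001', 'Company_0002')) {
    $body = @{
        definition = @{ id = 42 }   # hypothetical build definition id
        parameters = (@{ TargetDatabase = $db } | ConvertTo-Json)
    } | ConvertTo-Json -Depth 5
    Invoke-RestMethod `
        -Uri 'https://myaccount.visualstudio.com/DefaultCollection/MyProject/_apis/build/builds?api-version=2.0' `
        -Method Post -Headers $headers -ContentType 'application/json' -Body $body
}
```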
To begin with, I want to acknowledge that reading the answer from @jessehouwing triggered a few thoughts on my end.
As he mentions in his answer, there isn't anything that would directly do what you're asking. However, some techniques do come to mind, depending on how you want to deploy the databases.
ARM Templates -
Set up an ARM template that uses resource iteration to deploy multiple Azure SQL databases (see the MS docs on how to do that). Configure the template to copy the schema of an existing DB to the new ones; you'll need that template DB deployed to Azure to act as the schema source. To configure the ARM template to create the new databases as copies of the template, look at the createMode property of the SQL Database ARM template (SQL ARM template documentation).
Run a PowerShell script that queries the master DB to get the list of companies (Query DB from PowerShell).
Output the results of the DB query to a VSTS variable and pass that variable into the ARM template to produce the databases.
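A hedged sketch of that iteration; the parameter names are hypothetical, and it assumes the template DB already exists so its resource id can be passed in:
```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "serverName": { "type": "string" },
    "companyDatabases": { "type": "array" },
    "templateDatabaseId": { "type": "string" }
  },
  "resources": [
    {
      "type": "Microsoft.Sql/servers/databases",
      "apiVersion": "2014-04-01",
      "name": "[concat(parameters('serverName'), '/', parameters('companyDatabases')[copyIndex()])]",
      "location": "[resourceGroup().location]",
      "copy": {
        "name": "databaseCopy",
        "count": "[length(parameters('companyDatabases'))]"
      },
      "properties": {
        "createMode": "Copy",
        "sourceDatabaseId": "[parameters('templateDatabaseId')]"
      }
    }
  ]
}
```
The companyDatabases array could then be produced by the PowerShell query step and handed to the deployment via a logging command such as Write-Host "##vso[task.setvariable variable=companyDatabases]...", passed in as a template parameter.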
DACPAC -
Create a DACPAC from a SQL DB Project in Visual Studio.
You can either create a DACPAC that defines just the DB schema and use the ARM template technique above to run the DACPAC for each database you need in something of a hybrid technique - or
You can create a DACPAC that queries your main DB for the list of companies and creates a database for each one based on the defined schema. This option encapsulates the process of creating the schema and querying the main DB for the ones to create into a single deployment artifact.
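For the hybrid technique, the per-database publish step could look something like this sketch; the server, credentials and paths are hypothetical, and it assumes sqlpackage.exe and the SqlServer PowerShell module are available on the build agent:
```powershell
# Read the list of company databases from the server's catalog.
$companies = Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' `
    -Database 'master' -Username $env:SQL_USER -Password $env:SQL_PASSWORD `
    -Query "SELECT name FROM sys.databases WHERE name LIKE 'Company[_]%'"

# Publish the same DACPAC (schema plus scripts) to every company database.
foreach ($db in $companies.name) {
    & sqlpackage.exe "/Action:Publish" `
        "/SourceFile:artifacts\CompanySchema.dacpac" `
        "/TargetServerName:myserver.database.windows.net" `
        "/TargetDatabaseName:$db" `
        "/TargetUser:$($env:SQL_USER)" "/TargetPassword:$($env:SQL_PASSWORD)"
}
```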
Each option has its Pros and Cons. The ARM Template option is going to give you the most flexibility, but requires that you have a template DB in place to copy from.
The DACPAC option requires familiarity with using that technique for deploying databases and may still require an ARM template to make the process as flexible as possible. It does offer the potential to encapsulate all the DB deployment parts into a single step.
There are a fair number of variables here, but I think this should give you some options to consider that will take you in a workable direction.

Azure Data Factory - moving data from On-Premise SQL to Azure SQL

A simple question: can this be achieved directly? I mean, without Azure Blob Storage in between (as shown in all the examples)? Could someone provide a code example, please?
Yes, you can do this directly. In fact, you can do direct copies between any of our supported sources/sinks; you don't have to pass through blob storage. To go from on-premises SQL Server to Azure SQL, you will need to set up a Data Management Gateway connector on your on-premises server. Then, you use a linked service of type AzureSqlDatabase and an output dataset of type AzureSqlTable, instead of AzureBlob as shown in the example. The exact steps to set up the DMG and the JSON for the linked services, datasets, and pipelines can be found in our documentation. We are also improving our UI in the near future to make these kinds of copy setups an easy, code-free experience.
https://azure.microsoft.com/en-us/documentation/articles/data-factory-sqlserver-connector/
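A hedged sketch of what the pipeline piece might look like in ADF v1 JSON (dataset names, dates and the query are hypothetical; the input dataset would be of type SqlServerTable over an OnPremisesSqlServer linked service that names the gateway, and the output dataset of type AzureSqlTable over an AzureSqlDatabase linked service):
```json
{
  "name": "CopyOnPremSqlToAzureSql",
  "properties": {
    "activities": [
      {
        "name": "SqlToAzureSql",
        "type": "Copy",
        "inputs": [ { "name": "OnPremSqlServerDataset" } ],
        "outputs": [ { "name": "AzureSqlTableDataset" } ],
        "typeProperties": {
          "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "SELECT * FROM dbo.SourceTable"
          },
          "sink": {
            "type": "SqlSink",
            "writeBatchSize": 10000,
            "writeBatchTimeout": "00:05:00"
          }
        },
        "scheduler": { "frequency": "Hour", "interval": 1 }
      }
    ],
    "start": "2016-01-01T00:00:00Z",
    "end": "2016-01-02T00:00:00Z"
  }
}
```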
