Why isn't the SQL Endpoints option enabled in Azure Databricks?

I want to create an SQL endpoint as per the documentation:
https://learn.microsoft.com/en-us/azure/databricks/sql/admin/sql-endpoints
However, this option does not appear in the Databricks assets drop-down.
Why not, and how can I enable SQL endpoints in my Azure Databricks instance?

Requirements
Your Azure Databricks account must be on the Premium plan.
Launch a workspace. You can use an existing workspace or create a new one.
You must be an Azure Databricks admin.

There could be several reasons for that:
(most probable) your administrators didn't give you access to Databricks SQL; they need to follow the documentation to enable it for specific users or groups.
your workspace is not on the Premium plan (see the requirements in the docs).

Related

Data Migration from Snowflake (on GCP Instance) to Snowflake (Azure Instance)

I am looking for some input on how to do a GCP-to-Azure cloud data migration.
Scenario:
I have a Snowflake instance configured on GCP (multiple databases holding legacy data) and another Snowflake instance configured on Azure (the DWH is created on this instance).
I want to move/copy the data of all the databases (including all child objects: schemas, tables, views, etc.) from the GCP Snowflake instance to the Snowflake instance configured on Azure.
Can you please guide me on the best solution for such a data migration? Any steps or documentation links would be really helpful.
Many thanks - Minti
Please check the database replication mechanism, which can be used as a migration tool for moving a Snowflake account from one cloud platform to another. https://docs.snowflake.com/en/user-guide/database-replication-intro.html
Not something I've done before, to be honest, but if you didn't want to use external tools, one possible method would be to secure-share your GCP databases with your Azure Snowflake account.
You then might be able to create a new database that is a clone of this share (not sure if this is possible).
Most objects get cloned apart from stages and pipes, but tables, views, etc. should carry over.
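I haven't verified that a direct share works across clouds (cross-region/cross-cloud sharing may need replication behind the scenes), but a minimal sketch of what the share setup would look like, with hypothetical database, share, and account names:

```sql
-- On the GCP (provider) account: create a share and grant it access to
-- the database ("legacy_db", "legacy_share", and the account locators
-- below are all hypothetical names).
CREATE SHARE legacy_share;
GRANT USAGE ON DATABASE legacy_db TO SHARE legacy_share;
GRANT USAGE ON SCHEMA legacy_db.public TO SHARE legacy_share;
GRANT SELECT ON ALL TABLES IN SCHEMA legacy_db.public TO SHARE legacy_share;
ALTER SHARE legacy_share ADD ACCOUNTS = az98765;  -- Azure account locator

-- On the Azure (consumer) account: create a read-only database from the
-- share (xy12345 is the hypothetical GCP provider account locator).
CREATE DATABASE legacy_db_shared FROM SHARE xy12345.legacy_share;
```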
This is a pretty easy process with a couple of prerequisites.
Make sure you have Organizations enabled on your GCP account.
This feature allows you to self-provision Snowflake accounts on any cloud provider/region. Open a support case to enable it.
Introduction to Organizations
Create a new account on Azure if you haven't already.
Enable replication on both accounts.
This can be done when logged into the account with the ORGADMIN role.
Replicate your databases (see the SQL sketch after these steps).
Note: this will give you a replica of the GCP Snowflake account's databases in your Azure Snowflake account. If you want to permanently migrate your databases, you need to set up Failover/Failback. This is a Business Critical feature, but Snowflake support will enable it for lower editions until you can complete your migration, at which point they will disable it.
Replicating a Database to Another Account
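A minimal sketch of the replication commands, assuming a hypothetical organization "myorg" with source account "gcp_acct", target account "azure_acct", and a database "legacy_db":

```sql
-- On the source (GCP) account: allow this database to be replicated to
-- the Azure account (all names are hypothetical).
ALTER DATABASE legacy_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_acct;

-- On the target (Azure) account: create a secondary database linked to
-- the primary...
CREATE DATABASE legacy_db AS REPLICA OF myorg.gcp_acct.legacy_db;

-- ...and refresh it to copy the data across. Re-run the refresh to pick
-- up changes until you cut over.
ALTER DATABASE legacy_db REFRESH;
```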
There are two options.
You could make use of the replication feature.
The high-level steps include the below:
a. Create the target account. You can use the Organizations feature available in Snowflake (enabled by Snowflake Support upon request).
b. Account-level objects should be created manually in the target account.
Note: the Failover feature is supported for accounts on the Business Critical edition and above. However, for account migration scenarios, this feature will be enabled for a temporary period by Snowflake Support.
c. Replication: the links below can be referenced for a complete understanding of the process. A sketch of the failover commands follows after the links.
https://docs.snowflake.com/en/user-guide/database-replication-intro.html#introduction-to-database-replication-across-multiple-accounts
https://docs.snowflake.com/en/user-guide/database-replication-config.html#replicating-a-database-to-another-account
https://docs.snowflake.com/en/user-guide/database-failover-config.html#failing-over-databases-across-multiple-accounts
Please find the link below for an overview of the associated costs:
https://docs.snowflake.com/en/user-guide/database-replication-billing.html#understanding-billing-for-database-replication
Limitations
https://docs.snowflake.com/en/user-guide/database-replication-intro.html#current-limitations-of-replication
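For the permanent-migration case, the failover promotion comes down to a short pair of commands, reusing the hypothetical names from the sketch above:

```sql
-- On the source (GCP) account: permit failover to the Azure account
-- (a Business Critical feature; Support can enable it temporarily).
ALTER DATABASE legacy_db ENABLE FAILOVER TO ACCOUNTS myorg.azure_acct;

-- On the target (Azure) account: promote the secondary database to
-- primary; the old primary becomes a read-only secondary.
ALTER DATABASE legacy_db PRIMARY;
```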
The other option is to create the target account and use the data unloading and loading features (a sketch follows after the links):
https://docs.snowflake.com/en/user-guide-data-unload.html
https://docs.snowflake.com/en/user-guide-data-load.html
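A minimal sketch of the unload/load route, assuming hypothetical table and stage names, with both stages pointing at a cloud storage location that the two accounts can reach:

```sql
-- On the GCP account: unload a table to an external stage as
-- compressed CSV files.
COPY INTO @my_stage/legacy_db/customers/
  FROM legacy_db.public.customers
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP);

-- On the Azure account: recreate the table, point a stage at the same
-- storage location, then load the files.
COPY INTO dwh_db.public.customers
  FROM @my_stage/legacy_db/customers/
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP);
```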

Azure Synapse Workspace failed to load resources

We have different resources (storage account, logic app, SQL database, SQL server, Synapse workspace) under a directory and a subscription (let's call them directory_1 and subscription_1).
The resources are used to perform simple ETL pipelines.
We want to move all these resources to a new directory and subscription (directory_2, subscription_2). Everything moves correctly except the Synapse workspace.
When we try to access it, it shows this error:
Failed to load one or more resources due to no access, error code 403.
Pipeline
Linked service
Trigger
Data flow
Dataset
Credentials
SQL script
Spark job definition
Synapse KQL Scripts
Notebook
Lake databases
Both accounts (from directory_1 and directory_2) have the [Owner] and [Contributor] roles on the Azure Synapse workspace and the resource group as well.
Any idea how to fix this?
Unfortunately, you cannot transfer an entire Azure Synapse Analytics workspace to another subscription.
Before moving Azure resources to another subscription, check whether the resource type supports the move operation by checking this Microsoft doc.
According to it, Microsoft does not support moving an Azure Synapse workspace to another resource group, subscription, or region.
This may be the reason behind the error.
I recommend upvoting the request submitted by another Azure customer in the forum below.
Transfer an entire Azure Synapse Analytics workspace to another subscription · Community
Reference:
Transfer an entire Azure Synapse Analytics workspace to another subscription - Microsoft Q&A

How to connect Azure Data Factory with SQL Endpoints instead of an interactive cluster?

Is it possible to connect Azure Data Factory to Azure Databricks SQL endpoints (Delta tables and views) instead of an interactive cluster? I tried the Azure Databricks Delta Lake connector, but it has options for clusters, not endpoints.
Unfortunately, you cannot connect to Azure Databricks SQL endpoints using ADF.
Note: with the compute option, you can connect an Azure Databricks workspace using the cluster options below:
New Job cluster
Existing interactive cluster
Existing instance pool
Note: with the datastore option (Azure Databricks Delta Lake), you can connect only to existing interactive clusters.
I'd appreciate it if you could share this feedback on our feedback channel, which is open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against our existing feature backlog and gives insight into the potential impact of implementing the suggested feature.

How to create a Synapse pool/DW in Terraform without an entire Synapse workspace

I am attempting to spin up an Azure Synapse pool in Terraform. From the documentation at https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/synapse_sql_pool, it appears you have to use a Synapse workspace, which also includes Data Factory integration, Power BI, etc.
Right now we just want the data warehouse, not all the other bells and whistles. Within the Azure portal, you are free to spin up a Synapse Analytics DW with or without a workspace (the option labeled "formerly SQL DW").
When you spin that up, you simply have a standalone DW.
Any insight on getting just the data warehouse, as you can in the portal, without the workspace and related pieces?
I am not a Terraform guy. As for Synapse, you are referring to the new one that is in preview. The new one has the workspace, which supports SQL pools, Spark clusters, and pipelines. Although they are supported, they are not created when you deploy a Synapse workspace.
So you can go ahead and create the workspace and one SQL pool, and you will get what you're looking for: the data warehouse engine, named SQL Pool.
Some extra notes: there are two types of SQL data warehouse in Synapse Analytics: SQL Pools and SQL on demand. The first is provisioned compute and is the traditional one with all the features. SQL on demand is still in preview, doesn't have all the features, and is charged by the terabyte processed by your queries.
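To illustrate the difference, SQL on demand queries files in your data lake directly rather than loading them into provisioned storage first. A minimal sketch, with a hypothetical storage account and path:

```sql
-- Query Parquet files in the data lake directly with SQL on demand
-- (serverless); the storage URL and path are hypothetical.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mystorageacct.dfs.core.windows.net/sales/*.parquet',
    FORMAT = 'PARQUET'
) AS sales_rows;
```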
Happy data crunching!

Azure SQL Deleted Database cannot restore because of region restriction

Is there any way to change the region of an Azure SQL server/database from one geographical region to another?
I have a deleted database which I cannot restore as I get "MSDN subscriptions are restricted from provisioning in this region. Please choose a different region. For exceptions to this rule please contact Microsoft Support."
The Server was originally setup in US-West region with a VS MSDN Subscription.
To answer the question of how to change the region of an Azure SQL database, there are multiple options:
Configure active geo-replication for Azure SQL Database in the Azure portal and initiate failover
Copy an Azure SQL database (see the sketch after this list)
Export an Azure SQL database to a BACPAC file
Set up SQL Data Sync (Preview)
However, based on the error, if there is a limitation on the specific region and you have a strict requirement to keep the database in the same region, you may need to work with Azure support on it.
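For the copy option, a cross-server copy is a single T-SQL statement run against the master database of a target server created in an allowed region; a minimal sketch with hypothetical server and database names:

```sql
-- Run against the master database of the target logical server;
-- "source-server" and "mydb" are hypothetical names.
CREATE DATABASE mydb_copy
    AS COPY OF [source-server].[mydb];
```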
