Use BigQuery Omni with Azure

We followed this documentation: https://cloud.google.com/bigquery/docs/omni-azure-introduction.
However, we ran into trouble with this error message:
Invalid table-valued function EXTERNAL_QUERY External database credentials not found for connection acelerai1.azure-eastus2.Acelerai_Azure at [1:15]
We have already tried this in two environments and got the same error both times.
We want to use BigQuery Omni with an Azure and Google Cloud integration.
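For context, the failing statement presumably looked something like the one below; this is a reconstruction based on the error message (the project and connection name come from the error, the inner query is hypothetical), issued through the google-cloud-bigquery Python client:

from google.cloud import bigquery

client = bigquery.Client(project="acelerai1")  # project taken from the error message

# EXTERNAL_QUERY takes the connection resource name and the SQL to run on the
# external database; the connection below is the one named in the error.
sql = """
SELECT *
FROM EXTERNAL_QUERY(
  'acelerai1.azure-eastus2.Acelerai_Azure',
  'SELECT 1'  -- hypothetical inner query
)
"""

for row in client.query(sql).result():
    print(dict(row))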

Related

data_source_id in Azure Databricks

What is the data_source_id in the parameters of SQL scripts run on Azure Databricks?
The Databricks AWS documentation mentions that it is the ID of the data source where the query will run, but if I change the notebook and run the query again, the data_source_id stays the same.
data_source_id is another identifier of the SQL Endpoint. It's not the same as the SQL endpoint ID that is used for operations on the SQL Endpoint. Unfortunately, that information isn't returned by the SQL Endpoint Get API; it has to be looked up via another API that isn't officially documented yet (documentation is coming very soon), but you can look at the Databricks Terraform provider source code to see how to use it.
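For illustration, here is a minimal sketch of looking up that mapping over the REST API. The /api/2.0/preview/sql/data_sources path is the one the Terraform provider uses and should be treated as an assumption, since it isn't officially documented:

import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-1234567890123456.7.azuredatabricks.net
token = os.environ["DATABRICKS_TOKEN"]

# Assumption: this preview endpoint returns one entry per SQL endpoint,
# pairing its endpoint_id with the data_source_id used by query objects.
resp = requests.get(
    f"{host}/api/2.0/preview/sql/data_sources",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()

for ds in resp.json():
    print(ds.get("endpoint_id"), "->", ds.get("id"))  # "id" is the data_source_id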

Azure Function trigger if any update is made to tables in a Postgres schema

I am working on a design where the contents of a Postgres schema will be static. But in case anything in this static content is updated, I want to be able to trigger an Azure Function app to capture the update and send it to the device (Function App -> IoT Hub -> Device).
It looks like Postgres is not supported by Azure Functions (input/output bindings).
Azure Functions supports a limited set of bindings.
As per this MS Doc reference, the particular database you are using may not be supported by the function's bindings, but you can still install the necessary packages and write the connection, input, and output code within the function logic.
That is essentially equivalent to using a binding.
Here are a few workaround references where Postgres is connected through code in Azure Functions (a minimal sketch follows the list):
Azure Functions integration with Postgres in Node.js
An article on Azure Functions using PostgreSQL in the .NET stack
An article on connecting from a Function App with managed identity to Azure Database for PostgreSQL
Connect to PostgreSQL using Azure Java Functions
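As mentioned above, a minimal sketch of what such a function might look like in Python, assuming psycopg2 for the connection and hypothetical app settings and table names (the IoT Hub forwarding is left out):

# __init__.py of a Python Azure Function (HTTP trigger used as an example).
# There is no built-in Postgres binding, so psycopg2-binary is listed in
# requirements.txt and the connection is opened in the function body.
import os
import json
import psycopg2
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    conn = psycopg2.connect(
        host=os.environ["PG_HOST"],
        dbname=os.environ["PG_DB"],
        user=os.environ["PG_USER"],
        password=os.environ["PG_PASSWORD"],
        sslmode="require",
    )
    try:
        with conn.cursor() as cur:
            # "Input" side: read whatever static content may have changed.
            cur.execute("SELECT id, payload FROM static_content")  # hypothetical table
            rows = [{"id": r[0], "payload": r[1]} for r in cur.fetchall()]
    finally:
        conn.close()

    # "Output" side would go here, e.g. forwarding rows to IoT Hub via its SDK.
    return func.HttpResponse(json.dumps(rows), mimetype="application/json")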

Make REST call to API and save the result to Azure SQL every hour

I'm using this very useful SQLCLR script to make a REST call to an API and save the data to SQL Server on the fly.
I have created a stored procedure that pulls new data every hour so my data is always up to date.
I would like to have all of this on Azure so I can then build a Power BI data visualization.
THE PROBLEM:
As soon as I try to transfer the database to Azure I receive this error:
TITLE: Microsoft SQL Server Management Studio
------------------------------
Could not import package.
Warning SQL0: A project which specifies SQL Server 2019 or Azure SQL Database Managed Instance as the target platform may experience compatibility issues with Microsoft Azure SQL Database v12.
Error SQL72014: .Net SqlClient Data Provider: Msg 40517, Level 16, State 1, Line 4 Keyword or statement option 'unsafe' is not supported in this version of SQL Server.
Error SQL72045: Script execution error. The executed script:
CREATE ASSEMBLY [ClrHttpRequest]
AUTHORIZATION [dbo]
FROM 0x4D5A90000300000004000000FFFF0000B800000000000000400000000000000000000000000000000000000000000000000000000000000000000000800000000E1FBA0E00B409CD21B8014CCD21546869732070726F6772616D2063616E6E6F742062652072756E20696E20444F53206D6F64652E0D0D0A2400000000000000504500004C0103006D85475F0000000000000000E00022200B0130000026000000060000000000007E45000000200000006000000000001000200000000200000400000000000000060000000000000000A00000000200004C1E01000300608500001000001000000000100000100000000000001000000000000000000000002C4500004F00000000600000FC03000000000000000000000000000000000000008000000C000000F44300001C0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000200000080000000000000000000000082000004800000000000000000000002E7465787400000084250000002000000026000000020000000000000000000000000000200000602E72737263000000FC030000006000000004000000280000000000000000000000000000400000402E72656C6F6300000C000000008000000002000000
(Microsoft.SqlServer.Dac)
------------------------------
BUTTONS:
OK
------------------------------
This happens because Azure SQL has some features stripped out, like SQLCLR and SQL Server Agent (for obvious security reasons).
Is there any alternative to SQLCLR on Azure?
Is there any alternative to SQL Server Agent on Azure?
Basically: how to automate a REST call to an API every hour and save the result to SQL Server on Azure?
I do not think there is a straightforward replacement for SQLCLR. However, there are some Azure offerings that might be of interest.
One alternative is a scheduled Azure Function that calls the API and stores the result in the Azure SQL Database.
Do mind that if the process takes longer than 10 minutes you cannot use a Consumption plan for the Azure Function, which is probably the most cost-effective option.
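A minimal sketch of such a timer-triggered function in Python, assuming a placeholder API URL, a placeholder table, and a connection string stored in an app setting:

# Hypothetical timer-triggered Python Azure Function; the function.json binding
# would use a schedule such as "0 0 * * * *" to run it hourly.
# requests and pyodbc go in requirements.txt.
import os
import requests
import pyodbc
import azure.functions as func


def main(mytimer: func.TimerRequest) -> None:
    # Placeholder API; the real call would mirror what the SQLCLR script does today.
    data = requests.get("https://api.example.com/items", timeout=30).json()

    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cursor = conn.cursor()
        for item in data:
            # Placeholder table and columns; adapt to the real schema.
            cursor.execute(
                "INSERT INTO dbo.ApiResults (Id, Payload) VALUES (?, ?)",
                item["id"], str(item),
            )
        conn.commit()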
Depending on the scenario, Azure Data Factory can also provide a solution. You can create a pipeline that calls the API and copies the data to SQL Server as outlined here, based on a schedule trigger.
Even though Azure Functions is great, you could also solve this with very little code using Azure Logic Apps: a scheduled trigger, the HTTP request action, and the MSSQL connector.
https://azure.microsoft.com/de-de/services/logic-apps/

How to connect Azure Web App to Azure SQL Database?

I'm having what I hope is a simple problem.
I've published an API to an Azure Web App, which should fetch data from an Azure SQL database, but I'm getting a 500 error, which of course isn't helpful. Checking the logs in Azure doesn't give anything more useful to me.
I've added the connection string to the connection strings in the Web App. I have also created a method that returns the connection string from the repository class that uses it, so I know it's definitely seeing the correct connection string; this means the issue is with connecting to the database using that connection string.
I have ensured that 'Allow access to Azure Services' is switched on, and when I use the query editor I can successfully pull data from the database.
I've also connected to the database using SQL management studio so I know the database can be reached.
What am I doing wrong?
As usual it turns out to be my own stupidity, but I'll answer here in case it helps someone else...
I updated my local app to use the Azure DB connection string so I could get a more detailed error, and it said the keyword 'initial catalog' was not supported.
That's when I realised I had developed this locally using PostgreSQL but couldn't justify the cost of that on Azure, so I switched to SQL Server and never changed my connections to SqlConnection types!

Getting an error when trying to connect to on-premises IBM DB2 from Azure using the Microsoft Integration Runtime

We are trying to get data from an on-premises IBM DB2 database into Azure Data Factory using the Microsoft Integration Runtime. We are able to connect to the database and get the list of tables in the ADF Dataset, but when we try to execute the query we get the error below. We have not been able to identify the issue; any help is appreciated.
ROUTINE SQLSTATISTICS (SPECIFIC NAME SQLSTATISTICS) HAS RETURNED AN
ERROR SQLSTATE WITH DIAGNOSTIC TEXT -805
DSN.DSNASPCC.DSNASTAU.0E5F1F1D09F1404 SQLSTATE=38112 SQLCODE=-443
Suggest you follow this technote to bind db2schema.bnd to the target Db2 database, after ensuring your Db2 client is at the same version/fix pack as the Db2 server.
