API or Service to fetch metadata of table in Azure SQL Database

I am trying to find an API or service to fetch the metadata of tables in an Azure SQL database, but I can't find anything. I have only found an API that returns metadata for the database as a whole.

There are no Azure ARM APIs for reading from or writing to a database.
To read the metadata you must connect to the database with a SQL Server client and issue metadata queries, such as
select *
from sys.tables
You can easily do this with PowerShell, SQLCMD, or mssql-cli.
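For example, a query along these lines lists every user table together with its columns and data types (a minimal sketch; extend it with whatever catalog views you need):
-- Enumerate user tables with their columns and data types
select s.name  as schema_name,
       t.name  as table_name,
       c.name  as column_name,
       ty.name as data_type,
       c.max_length,
       c.is_nullable
from sys.tables t
join sys.schemas s on s.schema_id = t.schema_id
join sys.columns c on c.object_id = t.object_id
join sys.types ty on ty.user_type_id = c.user_type_id
order by s.name, t.name, c.column_id;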

Related

SQL Server Tables not showing up in Logic Apps Designer using a successful on-premise connection

I am using the Logic Apps Designer on Azure to move source data to a storage file, and I need to connect to a database on my SQL Server. To do this, I am using an on-premise connection through a gateway. I was able to connect to my SQL Server database successfully; however, when I use a Logic Apps action, not all tables are available in Logic Apps. Is there a security/permissions/connection issue that is preventing some tables in the SQL Server database from being returned to Logic Apps?
I tried refreshing/restarting Azure and also tried connecting with other authentication methods, hoping more tables would be returned, but no luck.
I would suggest creating a new API connection to SQL and trying to access the tables again; there may be an issue with the connection to the database.
An alternative approach to getting rows from the SQL database is the Execute a SQL query action in the Logic App.
The Execute a SQL query (V2) action can be used to query a table and retrieve records based on a condition.
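For example, a query of this shape could be pasted into that action (the table and column names here are hypothetical):
-- Retrieve rows matching a condition via the Execute a SQL query (V2) action
select *
from dbo.Orders
where OrderDate >= '2024-01-01';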

In Azure Synapse, how do I setup a SQL Server that can access Datalake Storage?

I have set up a Synapse environment and filled my storage account with some sample Parquet files. I have then created a serverless SQL database and created some external tables over the Parquet files. All this works fine, and I can query these tables from the Synapse UI and SSMS using AD authentication.
The problem is that I want to connect an app to the serverless SQL database, and the app doesn't support AD authentication. Therefore I want to connect using a standard SQL account. I have set up a SQL account (username and password) and I'm able to connect through SSMS, but not query any tables, due to this error...
External table 'TableName' is not accessible because content of directory cannot be listed.
I assume this is a double-hop authentication problem because the SQL user doesn't have access to the storage account? I can't seem to find any guides on how to do this. Does anyone know?
I've written a blog post that tackles this issue, as I encountered this problem myself a few days ago. You can read it here.
Basically, it comes down to the fact that you have to (see the T-SQL sketch after this list):
create a SQL login for your user
create a credential in SQL whose name is the URL of the container in your data lake that holds the files you want to query
grant REFERENCES rights on that credential to your SQL login
create a user on your database for that login
Next to that, you also need to create some specific role assignments.
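A hedged sketch of those steps in T-SQL (the login name, password, and storage URL are placeholders; the database needs a master key before a database-scoped credential can be created, and the workspace Managed Identity needs Storage Blob Data Reader on the account):
-- In master: create the SQL login
CREATE LOGIN app_reader WITH PASSWORD = '<StrongPasswordHere>';

-- In the serverless SQL database: create a user for the login
CREATE USER app_reader FOR LOGIN app_reader;

-- Credential whose name matches the container URL, using the
-- workspace Managed Identity to list and read the files
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<AnotherStrongPassword>';
CREATE DATABASE SCOPED CREDENTIAL [https://<YourDataLake>.dfs.core.windows.net/<container>]
WITH IDENTITY = 'Managed Identity';

-- Let the user reference that credential
GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[https://<YourDataLake>.dfs.core.windows.net/<container>] TO app_reader;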

Azure SQL database - GraphQL

Does anybody know if there is a way to import data into an Azure SQL Database using a GraphQL API? Or whether you could create a connection within an iPaaS system to send data to the Azure SQL database?
Since you want to exchange data between an Azure SQL database and GraphQL, Directus can be used. It is an open-source data platform that enables anyone to access and manage database content, and it exposes a GraphQL API over that data.
Refer to the following article to set up the Azure SQL database and Directus.

ADF Unable to connect to Synapse Link SQL Pool External Tables

I am trying to create an ADF linked service connection to a Synapse Link serverless SQL pool connected to ADLS storage. I can successfully get a connection, but when I try to use a dataset to access the data I get a permission issue.
I can successfully access the data via Synapse Studio.
This is the error I get when I use the dataset in ADF.
I can also look at the schemas in SSMS, where they appear as external tables, but I get a similar credential error at the same point.
Has anyone come across this issue?
There are a few pieces of information you haven't supplied in your question, but I believe I know what happened. The external table worked in Synapse Studio because you were connected to the serverless SQL pool with your AAD account, and it passed your AAD credentials through to the data lake and succeeded.
However, when you set up the linked service to the serverless SQL pool, I'm guessing you used a SQL auth account for the credentials. With SQL auth it doesn't know how to authenticate with the data lake, so it looked for a server-scoped credential but couldn't find one.
The same happened when you connected from SSMS with a SQL auth account, I'm guessing.
You have several options. If it's important to be able to access the external table with SQL auth, you can execute the following to tell it how to access the data lake. This assumes the Synapse workspace Managed Service Identity has the Storage Blob Data Reader or Storage Blob Data Contributor role on the data lake.
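-- Server-scoped credential: its name must match the storage endpoint URL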
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net]
WITH IDENTITY = 'Managed Identity';
Or you could change the authentication on the linked service to use the Managed Service Identity.

Moving data from MsSQL to Cosmos DB of FHIR server(open source version)

I am using Microsoft Azure Storage Explorer to move patient resource data from MS SQL to the Azure Cosmos DB of a FHIR server. I have installed the FHIR server using the GitHub link below.
https://github.com/Microsoft/fhir-server/blob/master/docs/DefaultDeployment.md
I am able to move the MS SQL Server data into the FHIR Cosmos DB, but the data format does not match what the FHIR server apps expect.
Example: I have patient data on the SQL Server side, and we want to move all of it into the FHIR Cosmos DB ("resourceType": "Patient") and query it. The FHIR server apps/services are not able to map the MS SQL Server data.
Are there any Azure Functions that can be run to bulk-ingest data into the FHIR server? (Posting data with Postman is one way, but it is not feasible for bulk data.)
Thanks in advance.
@Vinayaka, you are on the right track with an Azure Function.
In a nutshell, it comes down to simple POST/PUT requests of FHIR resources from MS SQL to the FHIR server endpoints.
One approach could be a simple Azure Function or a console app that loops through the FHIR JSON resources and posts them asynchronously.
My humble advice: bump up the capacity/throughput of the FHIR server before running your ingestion process/load, and downgrade it as needed once the ingestion of FHIR resources is completed.
You could also reuse the Microsoft FhirImporter function for your case.
You could build a Data Factory pipeline to load the data into Cosmos DB in the correct format. You may need to do some transformation to get the data into the format that FHIR expects.
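For the transformation step, here is a hedged sketch of how SQL Server itself can shape rows into FHIR-style Patient JSON using FOR JSON (the table and column names are hypothetical):
-- Shape patient rows as FHIR Patient resources (an array of JSON objects)
select
    'Patient' as [resourceType],
    cast(p.PatientId as varchar(36)) as [id],
    p.Gender as [gender],
    convert(varchar(10), p.BirthDate, 23) as [birthDate],
    (select p.FamilyName as [family],
            json_query('["' + string_escape(p.GivenName, 'json') + '"]') as [given]
     for json path) as [name]
from dbo.Patients as p
for json path;
Each element of the resulting array can then be POSTed to the FHIR endpoint, whether from an Azure Function loop or a Data Factory pipeline.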
