I am facing this issue while trying to query Azure Synapse tables from an Azure App Service. It gives:
Invalid object name 'stg.Member'
Locally it works fine and I am able to query the same set of tables.
I have specified the database name in the connection string.
I also tried to list the tables from the App Service, but it returns zero records even though the tables exist.
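As a quick sanity check (a minimal T-SQL sketch, not tied to your schema), you can run the following over the App Service connection; landing in master instead of your target database would explain both the missing object and the empty table list:

-- Confirm which database the connection actually resolved to
SELECT DB_NAME() AS current_database;

-- List the user tables visible to the connected login
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE';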
Related
I have set up a Synapse environment and filled my storage account with some sample Parquet files. I then created a serverless SQL database and created some external tables over the Parquet files. All of this works fine, and I can query these tables from the Synapse UI and from SSMS using AD authentication.
The problem is that I want to connect an app, which doesn't support AD authentication, to the serverless SQL database. Therefore I want to connect using a standard SQL account. I have set up a SQL account (username and password) and I'm able to connect through SSMS, but I cannot query any tables due to this error:
External table 'TableName' is not accessible because content of directory cannot be listed.
I assume this is a double-hop authentication problem, because the SQL user doesn't have access to the storage account? I can't seem to find any guides on how to resolve this. Does anyone know?
I've written a blog post tackling this issue, as I encountered the same problem a few days ago. You can read it here.
Basically, it comes down to the fact that you have to (a T-SQL sketch follows the list):
create a SQL login for your user
create a credential in SQL whose name matches the URL of the container in your data lake that holds the files you want to query
grant REFERENCES rights on that credential to your SQL login
create a user on your database for that login
Next to that, you also need to create some specific role assignments (typically Storage Blob Data Reader on the storage account).
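Put together, the steps look roughly like this (a minimal sketch; the login name, password, and storage URL are placeholders, and the credential here uses the workspace Managed Identity, though a SAS token is another common choice):

-- 1. Server-level login for the app (run in the master database)
CREATE LOGIN app_login WITH PASSWORD = '<StrongPassword>';

-- 2. Server-scoped credential whose name matches the container URL
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net/<YourContainer>]
WITH IDENTITY = 'Managed Identity';

-- 3. Allow the login to use that credential
GRANT REFERENCES ON CREDENTIAL::[https://<YourDataLakeName>.dfs.core.windows.net/<YourContainer>] TO app_login;

-- 4. Database user for the login (run in your serverless database)
CREATE USER app_user FOR LOGIN app_login;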
1. I'm unable to connect to an Azure SQL DB from Azure App Services.
2. SonarQube 8.9.7 is running in Azure App Services. I created some projects and passed the connection string with the JDBC URL, user name, and password, but the default tables are not being created.
3. Both the DB name and password are correct; I also tried to log in to the DB manually, and it shows zero tables.
4. I'm getting one more exception: "Embedded database should be used for evaluation purposes only. The embedded database will not scale, it will not support upgrading to newer versions of SonarQube, and there is no support for migrating your data out of it into a different database engine."
Below are the screenshots for reference.
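That last warning means SonarQube fell back to its embedded H2 database, i.e. it never picked up your JDBC settings. For comparison, a typical sonar.properties configuration for Azure SQL looks roughly like this (server, database, and credentials are placeholders):

# Placeholder values; replace with your own server, database, and credentials
sonar.jdbc.url=jdbc:sqlserver://<yourserver>.database.windows.net:1433;databaseName=<sonar_db>;encrypt=true;trustServerCertificate=false;loginTimeout=30
sonar.jdbc.username=<sonar_user>
sonar.jdbc.password=<password>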
I am trying to create an ADF linked service connection to a Synapse Link serverless SQL pool connected to ADLS storage. I can successfully get a connection, but when I try to use a dataset to access the data I get a permission issue.
I can successfully access the data via Synapse Studio.
This is the error I get when I use the dataset in ADF.
I can also look at the schemas in SSMS, where they appear as external tables, but I get a similar credential error at the same point.
Has anyone come across this issue?
There are a few pieces of information you haven't supplied in your question, but I believe I know what happened. The external table worked in Synapse Studio because you were connected to the serverless SQL pool with your AAD account, and it passed your AAD credentials through to the data lake and succeeded.
However, when you set up the linked service to the serverless SQL pool, I'm guessing you used a SQL auth account for the credentials. With SQL auth it doesn't know how to authenticate with the data lake, so it looked for a server-scoped credential but couldn't find one.
The same happened when you connected from SSMS with a SQL auth account, I'm guessing.
You have several options. If it's important to be able to access the external table with SQL auth, you can execute the following to tell it how to access the data lake. This assumes the Synapse workspace Managed Service Identity has the Storage Blob Data Reader or Storage Blob Data Contributor role on the data lake.
-- Server-scoped credential named after the data lake URL; the
-- Managed Identity is used to authenticate to the storage account
CREATE CREDENTIAL [https://<YourDataLakeName>.dfs.core.windows.net]
WITH IDENTITY = 'Managed Identity';
Or you could change the authentication on the linked service to use the Managed Service Identity.
I am trying to copy file data from Azure Blob Storage to an Azure SQL DB, just for my learning. I am not able to create the linked service for the Azure SQL DB destination, as it gives an error. I can connect fine from my local SSMS to the Azure SQL server, but not from Azure Data Factory. I turned on "Allow access to Azure services". I am using the default integration runtime (AutoResolveIntegrationRuntime). I also used "Add client IP" to add my current IP address to the firewall rule list.
Try using an Azure Integration Runtime in the same region as the SQL server; sometimes the auto-resolve runtime cannot reach it.
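For example, a runtime pinned to the server's region can be defined with ADF JSON roughly like this (the name and region below are placeholders):

{
    "name": "AzureIRSameRegionAsSql",
    "properties": {
        "type": "Managed",
        "typeProperties": {
            "computeProperties": {
                "location": "West Europe"
            }
        }
    }
}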
Hope this helped!
I'm attempting to add a new data source from a SQL Server on an Azure VM for a search service and indexer I'm creating through the Azure web portal. It's my understanding that I can create an index, import this data, then create an indexer to regularly push data to the index. I'm adding the connection string for our SQL Server and getting a success message when clicking "Test Connection". The tables show up in a drop-down list, and I select one.
When I click "OK" to create the new data source, a popup comes up that says "Sampling Data Source..." and then "Error detecting index schema from data source: 'Data source payload should specify at least one of datasource name and type'".
I've tried Googling this error, but I can't find anything on it, and I'm not sure how to fix it so I can proceed.
This looks like a bug in Azure Search portal support for SQL Server data sources.
We'll investigate. In the meantime, you can create your datasource programmatically, as shown in Connecting Azure SQL Database to Azure Search.
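As a rough sketch of the programmatic route (the service name, API version, admin key, connection string, and table name are all placeholders; see the linked article for the exact payload):

POST https://<your-service>.search.windows.net/datasources?api-version=<api-version>
api-key: <admin-key>
Content-Type: application/json

{
    "name": "sql-datasource",
    "type": "azuresql",
    "credentials": { "connectionString": "<your-connection-string>" },
    "container": { "name": "<your-table>" }
}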