Cannot connect to Azure SQL Data Warehouse database - error "Incorrect syntax near 'ANSI_NULLS'"

I successfully deployed a new Azure SQL Data Warehouse database.
When I try to connect to the SQL Data Warehouse database, I receive the following error message:
"Parse error at line: 1, column: 5: Incorrect syntax near 'ANSI_NULLS'".
This happens in both VS 2013 and VS 2015! The data load with BCP into the SQL Data Warehouse database was successful!
Thanks, Herbert

Azure SQL Data Warehouse does not currently support setting ANSI_NULLS on (SET ANSI_NULLS ON). You can simply remove that statement from your query and you should have success.
Additionally, make sure that you are running the June 2015 Preview of SSDT (http://blogs.msdn.com/b/ssdt/archive/2015/06/24/ssdt-june-2015-preview.aspx). This has the supported SSDT capabilities for SQL Data Warehouse.
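For instance, a table script generated by SSMS typically begins with SET options that SQL DW rejects; a minimal fix is to delete those lines before running the rest of the script (dbo.FactSales is a hypothetical table name used here for illustration):

```sql
-- Generated script (the first lines make SQL Data Warehouse fail):
-- SET ANSI_NULLS ON
-- GO
-- SET QUOTED_IDENTIFIER ON
-- GO

-- Keep only the statement itself:
SELECT TOP 10 *
FROM dbo.FactSales;
```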

I had the same error when I tried to use Visual Studio to query Azure SQL Data Warehouse and selected my database.
The workaround was to select the master database, connect to it, and then switch the query window's database drop-down at the top to my database.

I think your connection isn't actually being recognised as a SQL DW connection. I bet your query window is a .sql file, not the .dsql it needs to be. If you connect as a .sql query, it will try to set various settings that aren't supported.
Go back into the Azure portal and use the link there to connect using SSDT. You should get a connection in the SQL Server Object Explorer pane which looks different, and when you start a New Query based on it, you should get a .dsql window, not a .sql one.

Related

Manual Azure Backup Cosmos DB

I tried to export data from Cosmos DB, but it was not successful. According to https://learn.microsoft.com/en-us/azure/cosmos-db/storage-explorer, I should be able to export the data inside Cosmos DB using this tool, but there is no export option. I tried to follow the instructions at https://azure.microsoft.com/en-us/updates/documentdb-data-migration-tool/ and https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#JSON, but an error occurs.
Can you help me do this in Data Factory, or with any steps just to manually back up Cosmos DB?
I tried doing the backup through Azure Data Factory, but Data Factory can't seem to connect to Cosmos DB, which is odd because the primary/secondary connection string I used comes from the Cosmos DB account details.
Thank you.
Can you help me how to do this in Data Factory
According to your description, it seems you have trouble with exporting data, not importing data. You could use the Copy activity in ADF, which supports a Cosmos DB connector. For your needs, Cosmos DB is the source dataset; please add one more sink dataset (the destination), such as JSON files in Blob storage. Just make sure you configure the right authentication information for your Cosmos DB account.
ADF is more suitable for batch backups or daily backups.
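As a rough sketch only (the dataset names are placeholders, and the exact JSON depends on your ADF version and linked-service setup), a Copy activity from Cosmos DB to Blob storage looks roughly like this:

```json
{
  "name": "BackupCosmosToBlob",
  "type": "Copy",
  "inputs": [ { "referenceName": "CosmosDbSourceDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "BlobJsonSinkDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DocumentDbCollectionSource" },
    "sink": { "type": "BlobSink" }
  }
}
```

The authentication details live in the linked services that the two datasets reference, which is where the connection-string problem described above would surface.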
or any steps just to manual backup cosmos DB
Yes. Storage Explorer is not for exporting data from Cosmos DB; the Data Migration Tool is the suitable option. Please install the tool and refer to the details in this link: https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
The DMT is more suitable for a single backup. It also supports batch execution if you run it from the command line.
Cosmos DB Data Migration tool can be used to export data from Cosmos DB.
Refer https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
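From memory, a command-line run of the migration tool (dt.exe) exporting one collection to a JSON file looks something like the following; the account, database, and collection values are placeholders, and you should verify the exact switch names against the tool's built-in help:

```
dt.exe /s:DocumentDB /s.ConnectionString:"AccountEndpoint=https://<account>.documents.azure.com:443/;AccountKey=<key>;Database=<database>" /s.Collection:<collection> /t:JsonFile /t.File:C:\backup\<collection>.json
```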
This one worked for me. Since SSL on my MacBook did not work, I did these steps from the Azure VM that I created.
Steps:
Download the MongoDB Community Server client tools matching your OS version and a MongoDB-compatible version.
(Or you can download [v3.2.22 for Windows X64] directly here; please don't download any version beyond 4.2, as it's incompatible.)
After installing the MongoDB client tools, go to the installation directory, then to the subfolder "bin" containing mongoexport.exe, and issue the command below to export your data:
mongoexport --host=<host>:<port> -u=<username> -p=<password> --db=<database> --collection=<collection> --ssl --sslAllowInvalidCertificates --out=<output file>
Note 1: You can find the <host>, <username>, and <password> in the Cosmos DB portal under "Connection String".
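Putting the pieces together, a filled-in invocation might look like this (the account, database, and collection names are made up; for the Mongo API the user name is typically the account name and the default port is 10255):

```
mongoexport --host=mycosmosacct.documents.azure.com:10255 -u=mycosmosacct -p=<primary key from the portal> --db=mydb --collection=orders --ssl --sslAllowInvalidCertificates --out=orders.json
```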

Cannot view object schema Azure Datawarehouse

Attempting to view a view or procedure from SQL Server Management Studio 17.4 in Azure SQL Data Warehouse produces an error.
I can, however, delete and create any object that I want.
How can I ensure that I can view an object's definition?
UPDATED
Concerning setting the scripting options in SSMS to SQL Data Warehouse, that option is not present:
Please change this setting under Tools > Options. That should resolve the error. I wish we didn't have to change this, but at least we have a workaround.
In SSMS 17.5 there are a few more options. You can have it automatically detect what type of database you're connected to and script accordingly. Or you can force it to script for a particular database type like the following screenshot.
It appears there is a bug in certain versions of SSMS (such as 17.5) where if the DW user isn't also a user in the master database then scripting fails. The easy fix for this is having the server admin connect to the master database and run:
CREATE USER MyUserNameHere FROM LOGIN MyUserNameHere;
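Before running the CREATE USER, the server admin can check whether the user is already present in master (MyUserNameHere is the placeholder from above):

```sql
-- Run while connected to the master database
SELECT name, type_desc
FROM sys.database_principals
WHERE name = 'MyUserNameHere';
-- No row returned means the user is missing, and scripting
-- will fail in the affected SSMS versions until it is created.
```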
Sorry you're seeing this. I've been able to repro this on SSMS 17.4 and 17.5. We are looking at this now.
This seems to be a defect in the upgrade path for 17.5 as this should be supported without having to change the script settings. We are investigating this and will try to get an update out as soon as possible.

SQL Azure Export and Import into local SQL Server 2012 - Unknown block type. Stream might be corrupted

I have a SQL Azure database which is approximately 10GB in total size.
I wanted a local copy of the database for development, so I saved an export of the database to my storage account and downloaded it. I was a little suspicious when the export was only 500 MB, but I exported the database twice and the file size was the same both times.
I am using SSMS 2014 against a SQL Server 2012 instance and selecting 'Import Data-tier Application'. The import appears to work, BUT I get an error on the largest table. The error is:
TITLE: Microsoft SQL Server Management Studio
Data plan execution failed with message One or more errors occurred.
(Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
One or more errors occurred. (mscorlib)
One or more errors occurred.
One or more errors occurred.
Unknown block type. Stream might be corrupted. (System)
I cannot find any examples of others with this problem, but it can't be only me that has it?
FYI When I try to use SSMS 2012 to import that database I get the following error:
TITLE: Microsoft SQL Server Management Studio
Could not load schema model from package. (Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
Internal Error. The database platform service with type
Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider is
not valid. You must make sure the service is loaded, or you must
provide the full type name of a valid database platform service.
(Microsoft.Data.Tools.Schema.Sql)
Which is why I installed 2014.
UPDATE, After installing SSMS 2016 I got the same error:
TITLE: Microsoft SQL Server Management Studio
Data plan execution failed with message One or more errors occurred. (Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
One or more errors occurred. (mscorlib)
One or more errors occurred.
One or more errors occurred.
Unknown block type. Stream might be corrupted. (System)
Please try the latest version of SSMS, which you can download and install from the SQL Server Management Studio download page.
Azure SQL Database has numerous new features being added at a fast pace. SSMS from this unified download page provides up-to-date support for the latest features in Azure SQL Database.
That error indicates that the compressed table data stored in the bacpac file couldn't be decompressed. That can happen if the file gets corrupted at some point during a file copy operation. If it happens with multiple bacpac files, though, that would be concerning. Have you tried exporting the bacpac a second time?
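One way to test that theory locally: a .bacpac is a zip-based package, so a standard zip CRC check will catch most transit corruption before you retry the import. A minimal sketch in Python (the file path in the usage comment is hypothetical):

```python
import zipfile

def bacpac_looks_intact(path):
    """Return True if the .bacpac passes a zip CRC check.

    A .bacpac is a zip-based package, so a download that was
    truncated or corrupted in transit usually fails here, which
    matches the 'Stream might be corrupted' symptom.
    """
    try:
        with zipfile.ZipFile(path) as z:
            # testzip() returns the first corrupt member name, or None
            return z.testzip() is None
    except zipfile.BadZipFile:
        return False

# Hypothetical usage:
# print(bacpac_looks_intact(r"C:\backups\mydb.bacpac"))
```

If the check fails, re-export and re-download the file before troubleshooting SSMS any further.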

How to use Excel Power Query to access a postgresql database

We have a PostgreSQL 9.0 database which we want to connect to via Excel 2010 running the Power Query plugin. I've set up the machine running Excel to use Npgsql for the connection, according to these instructions: http://office.microsoft.com/en-us/excel-help/connect-to-a-postgresql-database-HA104028095.aspx?CTT=5&origin=HA104003952. The connection fails in Excel with this error message:
DataSource.Error: PostgreSQL: ERROR: 42883: function
concat(information_schema.character_data, unknown) does not exist
Details: Message=ERROR: 42883:..., ErrorCode=-2147467259
Has anyone successfully connected to a PostgreSQL database from Excel using the Power Query plugin? There's a menu item on the Power Query ribbon in Excel specifically for PostgreSQL, so I figured it would work. The concat function doesn't exist in our PostgreSQL version (9.0), so do I have to upgrade our PostgreSQL database to make this work?
You can either upgrade or define your own function like this:
CREATE OR REPLACE FUNCTION concat(information_schema.character_data, varchar)
RETURNS varchar LANGUAGE SQL AS
$$ SELECT $1::varchar || $2; $$;
But upgrading to 9.1 or later is likely the best approach.
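If you go the function route, you can sanity-check the shim with a cast that mimics the argument type in the error message (the literal values are arbitrary):

```sql
SELECT concat('abc'::information_schema.character_data, 'def');
-- should return 'abcdef'
```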

How to import data via post deployment script from SQL Server CE into a SQL Server DB freshly created using VS database project

I'm trying to import data from a SQL Server Compact (.sdf) database into a SQL Server database created through a Visual Studio database project.
The .sdf database was created by a third-party tool from an existing SQL Server database, and it is embedded in the Visual Studio project. The purpose is to deploy the database and insert initial values on a remote machine. I would like to run the import script as a post-deployment script, something like the following:
INSERT INTO Employer(
EmployerId,
EmployeeName
) SELECT
EmployerId,
EmployeeName
FROM sdfDB.dbo.Employer
Is it possible to reference a SQL Server Compact database (sdfDB)
in T-SQL script?
If not, what is the best practice to import data
into a freshly created DB from an embedded datasource which can be
deployed by build script in a remote machine?
1) Is it possible to reference a SQL Server Compact database (sdfDB) in
T-SQL script?
If you are thinking of something like this,
INSERT INTO [SQL SERVER Db].Table (Values)
SELECT (Values)
FROM [Sql Server Compact Db].Table
unfortunately, no, this is not possible.
2) If not, what is the best practice to import data into a freshly
created DB from an embedded datasource which can be deployed by build
script in a remote machine.
You can use the SQL Server Compact Toolbox, which contains features to generate scripts from a SQL Server Compact file that you can use to populate the SQL Server database.
Features:
Script!
Migrate a SQL Server Compact database directly to SQL Server (LocalDB/Express)
Migrate from SQL Server Compact to SQL Server, SQL Azure and SQLite via script (...)
(...)
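If I recall the toolbox's companion command-line tool correctly, the script can also be generated outside Visual Studio; the executable name (which varies with your SQL CE version) and the paths below are assumptions to verify against the project's documentation:

```
ExportSqlCe40.exe "Data Source=C:\project\SeedData.sdf;" C:\project\Scripts\SeedData.sql
```

The generated .sql file can then be pulled into the database project's post-deployment script with the SQLCMD `:r` include directive.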
EDIT
Also available in API form at exportsqlce.codeplex.com. Thanks to ErikEJ for the comment, and great job!
