SQL Azure Export and Import into local SQL Server 2012 - Unknown block type. Stream might be corrupted

I have a SQL Azure database which is approximately 10GB in total size.
I wanted to have a local copy of the database for development, so I saved an export of the database to my storage account and downloaded it. I was a little suspicious when the export was only 500MB, but I exported the database twice and the file size was the same both times.
I am using SSMS 2014 against a SQL Server 2012 database and selecting 'Import Data-tier Application'. The import appears to be working, BUT I get an error with the largest table. The error is:
TITLE: Microsoft SQL Server Management Studio
Data plan execution failed with message One or more errors occurred.
(Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
One or more errors occurred. (mscorlib)
One or more errors occurred.
One or more errors occurred.
Unknown block type. Stream might be corrupted. (System)
I cannot find any examples of others with this problem, but surely I can't be the only one who has it?
FYI: when I try to use SSMS 2012 to import the database, I get the following error:
TITLE: Microsoft SQL Server Management Studio
Could not load schema model from package. (Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
Internal Error. The database platform service with type
Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider is
not valid. You must make sure the service is loaded, or you must
provide the full type name of a valid database platform service.
(Microsoft.Data.Tools.Schema.Sql)
Which is why I installed 2014.
UPDATE: after installing SSMS 2016 I got the same error:
TITLE: Microsoft SQL Server Management Studio
Data plan execution failed with message One or more errors occurred. (Microsoft.SqlServer.Dac)
ADDITIONAL INFORMATION:
One or more errors occurred. (mscorlib)
One or more errors occurred.
One or more errors occurred.
Unknown block type. Stream might be corrupted. (System)

Please try the latest version of SSMS, which you can download and install from the SQL Server Management Studio download page.
Azure SQL Database has numerous new features being added at a fast pace. SSMS from this unified download page provides up-to-date support for the latest features in Azure SQL Database.

That error indicates that the compressed table data stored in the bacpac file couldn't be decompressed. That can happen if the file gets corrupted at some point during a file copy operation. If it happens with multiple bacpac files, though, that would be concerning. Have you tried exporting the bacpac a second time?
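If a fresh export fails the same way, it can also be worth running the import from the command line with SqlPackage.exe, which sometimes surfaces a more specific error than the SSMS wizard's nested "One or more errors occurred." wrappers. A minimal sketch (the file path, server, and database names below are placeholders):

SqlPackage.exe /Action:Import /SourceFile:"C:\Backups\MyDatabase.bacpac" /TargetServerName:"localhost" /TargetDatabaseName:"MyDatabase" /DiagnosticsFile:"C:\Backups\import-diag.log"

The /DiagnosticsFile option writes a detailed log, which should contain the full exception chain for the failing table.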

Related

SSIS Package: Export on-premises SharePoint List to SQL Server

I have created an SSIS package on my local development machine, which is running VS2019 Professional and SQL Server 2019.
Everything is working as expected, but I have one SharePoint list with 10,000 items which keeps failing with the message:
Error: 0xC02090F5 at Load Data From SharePoint Absence List, Absence [2]: The Absence was unable to process the data. An error occurred when reading the OData feed.
Error: 0xC0047038 at Load Data From SharePoint Absence List, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Absence returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
It always fails at the same record, which is just under 4,000 records transferred. This is the only error that I have found.
I have exported the data from SharePoint into Excel, which returns the full dataset and cannot see any issues with the data.
The SSIS package is using an OLE Db connection to access the data, other SharePoint lists are read and completed by the same SSIS package using the same connection.
My thoughts:
Is this an OLE Db issue?
Do I need to set some parameter to the OLE Db connection?
Does this need to be separated into its own SSIS package?
Can I find out more information about the specific error?
Does anyone have any helpful advice on how to proceed?
I have tried:
Adding SharePoint Libraries
Started looking at Power Automate, learning curve
Power BI, but again a learning curve. So far I have created a data query which can see all of the items in the list, but I have no idea how to push the dataset into SQL Server. I've created a transformation and tried Python, but that does not work.
UPDATE
I have checked the data in the record and everything is OK. No hidden formatting or odd characters, dates all in UTC format, etc.
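One way to narrow down the failing item, assuming the list is being read through the SSIS OData Source: that source component exposes a Query Options property, so the read could be restricted to a window of items with something like $filter=Id ge 3900 and Id le 4100 (a hypothetical Id range based on where the load stops), then the window shrunk until the single offending item is isolated and inspected directly in SharePoint.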

SQL Server deploy database to Azure throws "You must have a user with the same password..."

I have a live SQL Server database which was originally on an Express edition and has compatibility level 100. I am inspecting it using SSMS 2016 RC, which seems to support a maximum compatibility level of 120.
On performing the task Deploy Database to Microsoft Azure SQL Database... it goes through all the verification steps and then bails out with the error:
Could not import package. Unable to connect to master or target server X. You must have a user with the same password in master or target server X
Is the only solution to upgrade the compatibility level of the database, and is that even possible from such an old level? Do I have to install a newer SSMS to do this?
I've seen a similar question below, but it refers to SSDT, not SSMS:
SSDT failing to publish: "Unable to connect to master or target server"
I installed the latest SSMS (v18.3) and performed the deploy operation again, and it went through without a hitch, without even requiring me to change the compatibility level.

Excel Data Load using SSIS - Memory used up error

I am trying to load data into an Excel file using an SSIS package. Please find the details below:
Source: SQL Server table
Destination: Excel file
No. of rows: 646K
No. of columns: 132
I have deployed the package in the SQL Server Integration Services Catalog and trying to execute it from there.
But the following errors are being thrown:
Not enough storage is available to complete this operation.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on SRC_MDM_ENTITYDUPLICATE returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
My data flow looks like the following (screenshot omitted):
I am using a Data Conversion transform since I am facing a datatype mismatch between Unicode and non-Unicode characters.
The package is working fine on my local machine with 95-99% resource utilization.
Since I have deployed the package in the production environment, I can't make any modifications to the server settings. I also suspect the high resource utilization is causing the issue when executing the package on the production server.
I tried reducing DefaultBufferMaxRows and increasing DefaultBufferSize, which didn't help at all.
Can somebody help me optimize my package and fix this issue?
Thanks much in advance.
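For what it's worth, when a package runs from the SSIS Catalog, data flow buffer properties can be overridden per execution without redeploying, so different values can be tested directly on the server. A hedged T-SQL sketch against SSISDB (the folder, project, package, and task names are placeholders):

DECLARE @execution_id BIGINT;
EXEC SSISDB.catalog.create_execution
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'LoadExcel.dtsx',
    @use32bitruntime = 1,  -- Excel providers are often 32-bit only
    @execution_id = @execution_id OUTPUT;
-- Override the buffer row limit for this execution only
EXEC SSISDB.catalog.set_execution_property_override_value
    @execution_id = @execution_id,
    @property_path = N'\Package\Data Flow Task.Properties[DefaultBufferMaxRows]',
    @property_value = N'5000',
    @sensitive = 0;
EXEC SSISDB.catalog.start_execution @execution_id;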
I realized that the cause of the error is a column that exists in your package but not in the Excel file. As a solution, either delete that column from the package or add the missing (empty) columns to the Excel file.

Automated SQL Export Failed

I have an automated export running each night through the Portal, which backs up my Azure database to blob storage as a .bacpac file, and up until Friday this had been working successfully.
Each night I get an email error saying:
Automated SQL Export failed for myServer:myDatabase at 5/30/2016 11:35:39 PM. The temporary database copy was made, but this copy could not be exported to the .bacpac file.
Some tutorials suggest logging into the Portal and doing it manually. When I do this it works successfully and I am able to see the file without error. But on the following night the process fails again (performing a manual backup doesn't fix the automated one). Is there a way to get more information on why it is failing?
In the new Portal you can find more information via the audit log; database-level operations, including import/export, are logged there.
OK, so after further analysis I was able to pinpoint the root cause of my issue: a stored procedure.
I had a stored procedure which explicitly referenced my database by name. Whenever Azure takes the database backup, it copies the database under a temporary name, which at that point "breaks" the stored procedure because it is self-referencing.
Fixing the stored procedure has resumed the automatic backups.
An example of a statement the proc was calling:
SELECT Name FROM MyDatabase.dbo.MyTable
This should be rewritten as follows to make the database exportable:
SELECT Name FROM dbo.MyTable
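To track down any other objects with the same problem, the object definitions can be searched for three-part names. A quick sketch, assuming the live database is named MyDatabase (a placeholder):

-- Find procedures, views, functions, and triggers whose definition
-- references the database by name
SELECT OBJECT_SCHEMA_NAME(object_id) AS schema_name,
       OBJECT_NAME(object_id) AS object_name
FROM sys.sql_modules
WHERE definition LIKE '%MyDatabase.%';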
Note that while I was able to obtain a more meaningful error using a local copy of SQL Server Management Studio, no error was present in the Azure Portal.
Hopefully this will help someone else.

Cannot connect to Azure SQL Data Warehouse database - error "Incorrect syntax near 'ANSI_NULLS'"

I was able to successfully deploy the new Azure SQL Data Warehouse database.
If I try to connect to the SQL Data Warehouse database, I receive the following error message:
"Parse error at line: 1, column: 5: Incorrect syntax near 'ANSI_NULLS'".
This happens in VS 2013 and VS 2015! The data load process with BCP to the SQL Data Warehouse database was successful!
Thanks, Herbert
Azure SQL Data Warehouse does not currently support setting ANSI_NULLS on (SET ANSI_NULLS ON). You can simply remove that statement from your query and you should have success.
Additionally, make sure that you are running the June 2015 Preview of SSDT (http://blogs.msdn.com/b/ssdt/archive/2015/06/24/ssdt-june-2015-preview.aspx). This has the supported SSDT capabilities for SQL Data Warehouse.
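For example, a table script generated by SSMS or SSDT typically begins with a header like the one below, and the first SET statement is what the parser rejects (dbo.MyTable is a placeholder):

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE dbo.MyTable (Id INT NOT NULL, Name NVARCHAR(100));

Removing the two SET ... ON statements (and their GO separators) leaves a script that SQL Data Warehouse will accept.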
I had the same error when I tried to use Visual Studio to query Azure SQL Data Warehouse and selected my database.
The workaround was to select the master database, connect to it, and then switch to my database in the drop-down at the top of the query window.
I think your connection isn't actually recognised as a SQL DW connection. I bet your query window is a .sql file, not a .dsql as it needs to be. If you connect as a .sql query, it will try to set various settings that aren't supported.
Go back into the Azure portal and use the link to connect using SSDT from there. You should get a connection in the SQL Server Explorer pane which looks different, and when you start a New Query based on it, you should get a .dsql window, not a .sql one.
