UserErrorSqlBulkCopyInvalidColumnLength - Azure SQL Database

I am configuring a Salesforce to Azure SQL Database data copy using Azure Data Factory. There appears to be an issue with a column length, but I am unable to identify which column is actually causing it.
How can I gain more insight into exactly what is causing the problem, or which column is really invalid?
{
  "dataRead": 18560714,
  "dataWritten": 0,
  "rowsRead": 15514,
  "rowsCopied": 0,
  "copyDuration": 34,
  "throughput": 533.109,
  "errors": [
    {
      "Code": 9123,
      "Message": "ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL Bulk Copy failed due to received an invalid column length from the bcp client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The service has encountered an error processing your request. Please try again. Error code 4815.\r\nA severe error occurred on the current command. The results, if any, should be discarded.,Source=.Net SqlClient Data Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe error occurred on the current command. The results, if any, should be discarded.,},],'",
      "EventType": 0,
      "Category": 5,
      "Data": {},
      "MsgId": null,
      "ExceptionType": null,
      "Source": null,
      "StackTrace": null,
      "InnerEventInfos": []
    }
  ],
  "effectiveIntegrationRuntime": "DefaultIntegrationRuntime (East US 2)",
  "usedCloudDataMovementUnits": 4,
  "usedParallelCopies": 1,
  "executionDetails": [
    {
      "source": {
        "type": "Salesforce"
      },
      "sink": {
        "type": "AzureSqlDatabase"
      },
      "status": "Failed",
      "start": "2018-03-01T18:07:37.5732769Z",
      "duration": 34,
      "usedCloudDataMovementUnits": 4,
      "usedParallelCopies": 1,
      "detailedDurations": {
        "queuingDuration": 5,
        "timeToFirstByte": 24,
        "transferDuration": 4
      }
    }
  ]
}
"Message":"ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL
Bulk Copy failed due to received an invalid column length from the bcp
client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The
service has encountered an error processing your request. Please try
again. Error code 4815.\r\nA severe error occurred on the current
command. The results, if any, should be
discarded.,Source=.Net SqlClient Data
Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try
again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe
error occurred on the current command. The results, if any,
should be discarded.

I have seen that error when trying to copy a string from the source into a destination column that is shorter than the source data.
My suggestion is to make sure the data types you selected on the destination are big enough to receive the data from the source. Also make sure the sink has the columns shown in the preview.
To identify which column may be involved, exclude the big columns one by one from the source until the copy succeeds.
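If excluding columns one by one is too slow, another way to narrow it down is to list the declared lengths of the sink columns and compare them with the longest values coming from Salesforce. Below is a minimal sketch; the table name dbo.Account and the connection string are placeholders, not taken from the pipeline above.

using System;
using System.Data.SqlClient;

class SinkColumnLengths
{
    static void Main()
    {
        // Placeholder connection string for the Azure SQL Database sink.
        var connectionString = "Server=tcp:<yourserver>.database.windows.net,1433;Database=<yourdb>;User ID=<user>;Password=<password>;Encrypt=True;";

        // Declared maximum length of each column; -1 means (n)varchar(max).
        var sql = @"SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
                    FROM INFORMATION_SCHEMA.COLUMNS
                    WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'Account'
                    ORDER BY ORDINAL_POSITION";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}({2})",
                        reader["COLUMN_NAME"], reader["DATA_TYPE"], reader["CHARACTER_MAXIMUM_LENGTH"]);
                }
            }
        }
    }
}

Any character column whose declared length is smaller than the longest Salesforce value for that field is a candidate; widening it (or switching it to nvarchar(max)) usually clears this bulk copy error.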

Related

ORA-01555: snapshot too old: rollback segment number with name “” too small - SonarQube

I am getting an error while publishing results on SonarQube.
Error querying database. Cause: org.apache.ibatis.executor.result.ResultMapException: Error attempting to get column 'RAWLINEHASHES' from result set. Cause: java.sql.SQLException: ORA-01555: snapshot too old: rollback segment number 2 with name "_SYSSMU2_111974964$" too small
Cause: org.apache.ibatis.executor.result.ResultMapException: Error attempting to get column 'RAWLINEHASHES' from result set. Cause: java.sql.SQLException: ORA-01555: snapshot too old: rollback segment number 2 with name "_SYSSMU2_111974964$" too small
The pipeline executed for 2 hours 30 minutes.
Can you please help?
The error that you are getting, ORA-01555, is an Oracle error message. Your pipeline is executing something against an Oracle database, and after it has run for a long time, that statement hits this error.
For ways to avoid this error, see: https://blog.enmotech.com/2018/09/10/ora-01555-snapshot-old-error-ways-to-avoid-ora-01555-snapshot-too-old-error/

Why can't Azure Search import JSON blobs?

When importing data using the configuration found below, Azure Cognitive Search returns the following error:
Error detecting index schema from data source: ""
Is this configured incorrectly? The files are stored in the container "example1" and in the blob folder "json". When creating the same index with the same data in the past there were no errors, so I am not sure why it is different now.
Import data:
Data Source: Azure Blob Storage
Name: test-example
Data to extract: Content and metadata
Parsing mode: JSON
Connection string:
DefaultEndpointsProtocol=https;AccountName=EXAMPLESTORAGEACCOUNT;AccountKey=EXAMPLEACCOUNTKEY;
Container name: example1
Blob folder: json
.json file structure:
{
  "string1": "vaule1",
  "string2": "vaule2",
  "string3": "vaule3",
  "string4": "vaule4",
  "string5": "vaule5",
  "string6": "vaule6",
  "string7": "vaule7",
  "string8": "vaule8",
  "list1": [
    {
      "nested1": "value1",
      "nested2": "value2",
      "nested3": "value3",
      "nested4": "value4"
    }
  ],
  "FileLocation": null
}
The error appears when clicking the "Next: Add cognitive skills (Optional)" button.
To clarify, there are two problems:
1) There is a bug in the portal where the actual error message is not showing up for errors, hence we are observing the unhelpful empty string "" as the error message. A fix is on the way and should be rolled out early next week.
2) There is an error when the portal attempts to detect the index schema from your data source. It's hard to say what the problem is when the error message is just "". I've tried your sample data and the import works fine.
I'll update the post once the fix for displaying the error message is out. In the meantime (again, we're flying blind here without the specific error string), here are a few things to check:
1) Make sure your firewall rules allow the portal to read from your blob storage.
2) Make sure there are no extra characters inside your JSON files, and that the whitespace characters really are plain whitespace (you should be able to open a file in VS Code and check); a quick local validation sketch follows below.
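If you want to rule out malformed files before re-running the import, a quick local check is easy to script. This is only a sketch: it assumes the blobs have been downloaded to a local folder and that Json.NET (Newtonsoft.Json) is referenced.

using System;
using System.IO;
using Newtonsoft.Json.Linq;

class ValidateJsonBlobs
{
    static void Main()
    {
        // Placeholder path to a local copy of the blobs from the "json" folder.
        foreach (var path in Directory.GetFiles(@"C:\data\json", "*.json"))
        {
            try
            {
                // Throws if the file contains malformed JSON or stray characters.
                JToken.Parse(File.ReadAllText(path));
                Console.WriteLine("OK: " + path);
            }
            catch (Exception ex)
            {
                Console.WriteLine("Invalid: " + path + " -> " + ex.Message);
            }
        }
    }
}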
Update: The portal fix for the missing error messages has been deployed. You should be able to see a more specific error message should an error occur during import.
It seems to me that this is a problem related to the list1 data type. Make sure you're selecting "Collection(Edm.String)" for it during the index creation.
For more info, please check step 5 of the following link: https://learn.microsoft.com/en-us/azure/search/search-howto-index-json-blobs
I have been in contact with Microsoft, and this is a bug in the Azure portal. The issue is that the connection string wizard does not append the endpoint suffix correctly. They have recommended manually pasting the connection string, but this still does not work for me. So this is a suggested answer from Microsoft, but I don't believe it is completely correct, because the portal outputs the same error message:
Error detecting index schema from data source: ""
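For reference, a manually assembled connection string with the endpoint suffix spelled out would look like the following (account name and key are the same placeholders as above; core.windows.net is the suffix for the public Azure cloud):
DefaultEndpointsProtocol=https;AccountName=EXAMPLESTORAGEACCOUNT;AccountKey=EXAMPLEACCOUNTKEY;EndpointSuffix=core.windows.net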

Web API Returning JSON - [System.NotSupportedException] Specified method is not supported. (Sybase ASE)

I'm using Web API with Entity Framework 4.2 and the Sybase ASE connector.
This was working without issues returning JSON, until I tried to add a new table.
return db.car
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.colors")
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13);
The above works without issues if the following line is removed:
.Include("tires.hub_caps.colors")
However, when that line is included, I am given the error:
""An error occurred while preparing the command definition. See the inner exception for details."
The inner exception reads:
"InnerException = {"Specified method is not supported."}"
"source = Sybase.AdoNet4.AseClient"
The following also results in an error:
List<car> cars = db.car.AsNoTracking()
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.colors")
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13).ToList();
The error is as follows:
An exception of type 'System.Data.EntityCommandCompilationException' occurred in System.Data.Entity.dll but was not handled in user code
Additional information: An error occurred while preparing the command definition. See the inner exception for details.
Inner exception: "Specified method is not supported."
This points to a fault with the Sybase ASE data connector.
I am using data annotations on all tables to control which fields are returned. On the colors table, I have tried the following annotations to limit the properties returned to just the key:
[JsonIgnore]
[IgnoreDataMember]
Any ideas what might be causing this issue?
Alternatively, if I keep colors in and remove,
.Include("tires.hub_caps.sizes")
.Include("tires.hub_caps.sizes.units")
then this also works. It seems that the Sybase ASE connector does not support cases where an Include statement forks from one object in two directions. Is there a way around this? The same issue occurs with Sybase ASE and the Progress data connector.
The issue does not occur in a standard ASP.NET MVC controller class; the problem is with serializing two one-to-many relationships on a single table to JSON.
This issue still occurs if lazy loading is turned on.
It seems to me that this is a bug with Sybase ASE that none of the connectors are able to solve.
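One workaround that sometimes helps with providers that cannot translate a forked Include is to drop one branch from the Include path and load it in a second query, relying on EF's relationship fix-up (so tracking must stay on, i.e. no AsNoTracking). The sketch below is only illustrative: the id key on hub_caps, the colors set, the hub_cap_id foreign key, and the filter are assumptions about the model, not taken from the question.

// First query: everything except the colors branch.
var cars = db.car
    .Include("tires")
    .Include("tires.hub_caps")
    .Include("tires.hub_caps.sizes")
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.id == 13)               // substitute the original Where clause here
    .ToList();

// Second query: load the colors for the hub caps that were just materialized.
var hubCapIds = cars
    .SelectMany(c => c.tires)
    .SelectMany(t => t.hub_caps)
    .Select(h => h.id)
    .Distinct()
    .ToList();

// Materializing this query is enough: with change tracking on, EF's relationship
// fix-up attaches the loaded colors to the hub_caps entities already in memory.
db.colors
    .Where(col => hubCapIds.Contains(col.hub_cap_id))
    .ToList();

The idea is simply to keep each SQL statement free of the forked Include; if the provider also struggles with Contains, the colors can be loaded per hub cap or via an explicit join instead.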

Error while using TopCount

I'm using the following MDX query:
SELECT [Measures].[Unita (libri)] ON COLUMNS,
TopCount([Anno].[Anno].Members, 10.0, [Measures].[Unita (libri)]) ON ROWS
FROM [Pubblicazioni]
But I always get back an error:
Error Occurred While getting Resultset
MDX Query Editor
An error occurred while rendering Pivot.jsp. Please see the log for details.
I can't access the log (for some reason the server is not writing logs), but I think it's not normal to get this error with such a simple query. Any ideas?

DataServiceRequestException was unhandled by user code. An error occurred while processing this request

DataServiceRequestException was unhandled by user code. An error occurred while processing this request.
This is in relation to the previous post I added.
public void AddEmailAddress(EmailAddressEntity emailAddressTobeAdded)
{
    AddObject("EmailAddress", emailAddressTobeAdded);
    SaveChanges();
}
Again the code breaks, and this time gives a new exception. Last time it was a StorageClientException; now it is a DataServiceRequestException. The code breaks at SaveChanges.
Surprisingly, the solution is the same: camel casing is not supported, so the code works if "EmailAddress" is changed to "Emailaddress" or "emailaddress".
More details
http://weblogs.asp.net/chanderdhall/archive/2010/07/10/dataservicerequestexception-was-unhandled-by-user-code-an-error-occurred-while-processing-this-request.aspx
Because you've tagged this with "azure" and "table," I'm going to assume this code results in a call to Windows Azure tables. If so, I can assure you that camel casing table names is indeed supported. However, the string you pass to AddObject has to match a table that you've created, so make sure the code where you create the table is using the same casing.
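In other words, the casing used when the table is created has to match the casing later passed to AddObject. A minimal sketch with the 2010-era storage client library (Microsoft.WindowsAzure / Microsoft.WindowsAzure.StorageClient namespaces; the connection string is a placeholder):

// Create the table with exactly the same casing that AddObject will use later.
var account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
var tableClient = account.CreateCloudTableClient();
tableClient.CreateTableIfNotExist("EmailAddress");

// Later, in the DataServiceContext-based code from the question:
// AddObject("EmailAddress", emailAddressTobeAdded);   // must use the same casing
// SaveChanges();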
