I am getting an error while publishing results to SonarQube:
Error querying database. Cause: org.apache.ibatis.executor.result.ResultMapException: Error attempting to get column 'RAWLINEHASHES' from result set. Cause: java.sql.SQLException: ORA-01555: snapshot too old: rollback segment number 2 with name "_SYSSMU2_111974964$" too small
The pipeline had been executing for 2 hours 30 minutes when it failed.
Can you please help?
The error you are getting is ORA-01555, which is an Oracle error. Your pipeline is running a long query against an Oracle database; after the query has run for a long time, the undo data needed to reconstruct a read-consistent snapshot gets overwritten, and Oracle raises this error.
For ways to avoid this error see: https://blog.enmotech.com/2018/09/10/ora-01555-snapshot-old-error-ways-to-avoid-ora-01555-snapshot-too-old-error/
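If you have DBA access to that database, a common mitigation is to raise the undo retention so Oracle keeps the read-consistent snapshot for the whole 2.5-hour run. A minimal sketch (the UNDOTBS1 tablespace name and the 10800-second value are assumptions; adjust for your environment):

-- Check the current undo retention target, in seconds (SQL*Plus).
SHOW PARAMETER undo_retention;

-- Check how big the undo tablespace is; 'UNDOTBS1' is an assumed name.
SELECT tablespace_name, ROUND(SUM(bytes) / 1024 / 1024) AS size_mb
FROM dba_data_files
WHERE tablespace_name = 'UNDOTBS1'
GROUP BY tablespace_name;

-- Raise the retention target above the pipeline's 2.5-hour runtime.
ALTER SYSTEM SET UNDO_RETENTION = 10800;

Note that raising UNDO_RETENTION only helps if the undo tablespace is large enough (or can autoextend) to actually hold that much undo.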
I am configuring a Salesforce to Azure SQL Database data copy using Azure Data Factory. There appears to be an issue with the column length, but I am unable to identify which column is actually causing an issue.
How can I gain more insight into exactly what is causing the problem, or which column is actually invalid?
{
  "dataRead": 18560714,
  "dataWritten": 0,
  "rowsRead": 15514,
  "rowsCopied": 0,
  "copyDuration": 34,
  "throughput": 533.109,
  "errors": [
    {
      "Code": 9123,
      "Message": "ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL Bulk Copy failed due to received an invalid column length from the bcp client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The service has encountered an error processing your request. Please try again. Error code 4815.\r\nA severe error occurred on the current command. The results, if any, should be discarded.,Source=.Net SqlClient Data Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe error occurred on the current command. The results, if any, should be discarded.,},],'",
      "EventType": 0,
      "Category": 5,
      "Data": {},
      "MsgId": null,
      "ExceptionType": null,
      "Source": null,
      "StackTrace": null,
      "InnerEventInfos": []
    }
  ],
  "effectiveIntegrationRuntime": "DefaultIntegrationRuntime (East US 2)",
  "usedCloudDataMovementUnits": 4,
  "usedParallelCopies": 1,
  "executionDetails": [
    {
      "source": {
        "type": "Salesforce"
      },
      "sink": {
        "type": "AzureSqlDatabase"
      },
      "status": "Failed",
      "start": "2018-03-01T18:07:37.5732769Z",
      "duration": 34,
      "usedCloudDataMovementUnits": 4,
      "usedParallelCopies": 1,
      "detailedDurations": {
        "queuingDuration": 5,
        "timeToFirstByte": 24,
        "transferDuration": 4
      }
    }
  ]
}
"Message":"ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL
Bulk Copy failed due to received an invalid column length from the bcp
client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The
service has encountered an error processing your request. Please try
again. Error code 4815.\r\nA severe error occurred on the current
command. The results, if any, should be
discarded.,Source=.Net SqlClient Data
Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try
again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe
error occurred on the current command. The results, if any,
should be discarded.
I have seen that error when copying a string from the source into a destination column that is too short to hold it.
My suggestion is to make sure the data types on the destination are large enough to receive the data from the source, and that the sink has all the columns shown in the preview.
To identify which column is involved, exclude the large columns from the source one by one until the copy succeeds.
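Another way to narrow it down is to compare the declared lengths on the Azure SQL sink against the source data. A minimal sketch, assuming the sink table is dbo.Account (a placeholder name):

-- List each sink column with its declared length; note that max_length
-- is in bytes, so an nvarchar(50) column reports 100 here, and -1 means MAX.
SELECT c.name AS column_name,
       t.name AS data_type,
       c.max_length AS declared_length_bytes
FROM sys.columns AS c
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('dbo.Account')
ORDER BY c.max_length;

Any Salesforce field whose longest value exceeds the declared length of its sink column is a likely culprit.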
I am trying to launch Presto to query Hive on a RHEL machine.
But while launching the Presto server via "./launcher run", I am getting the following error:
com.google.inject.CreationException: Unable to create injector, see the following errors:
1) Configuration property 'http-server.http.port=8080 ' was not used
at io.airlift.bootstrap.Bootstrap.lambda$initialize$2(Bootstrap.java:235)
2) Error: Could not coerce value '8080 ' to int (property 'http-server.http.port') in order to call [public io.airlift.http.server.HttpServerConfig io.airlift.http.server.HttpServerConfig.setHttpPort(int)]
at io.airlift.http.server.HttpServerModule.configure(HttpServerModule.java:74)
3) Error: Invalid configuration property node.id: is malformed (for class io.airlift.node.NodeConfig.nodeId)
at io.airlift.node.NodeModule.configure(NodeModule.java:34)
3 errors
at com.google.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:466)
at com.google.inject.internal.InternalInjectorCreator.initializeStatically(InternalInjectorCreator.java:155)
at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:107)
at com.google.inject.Guice.createInjector(Guice.java:96)
at io.airlift.bootstrap.Bootstrap.initialize(Bootstrap.java:242)
at com.facebook.presto.server.PrestoServer.run(PrestoServer.java:116)
at com.facebook.presto.server.PrestoServer.main(PrestoServer.java:67)
A few things. You need to remove the trailing space after "http-server.http.port=8080" in config.properties.
Also, your node.id property is invalid: it must match the regex [A-Za-z0-9][_A-Za-z0-9-]* (see https://github.com/airlift/airlift/blob/1e5694fb13ac6ca9cbdae1a1e60909c62fc7a64e/node/src/main/java/io/airlift/node/NodeConfig.java#L30).
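For reference, here is a minimal sketch of the two files with those problems fixed. The node.id value and the paths are illustrative; the remaining properties follow the standard single-node Presto setup:

# etc/config.properties (no trailing whitespace after any value)
coordinator=true
node-scheduler.include-coordinator=true
http-server.http.port=8080
discovery-server.enabled=true
discovery.uri=http://localhost:8080

# etc/node.properties
node.environment=production
node.id=presto-node-1
node.data-dir=/var/presto/data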
Same question here: I get "No factory for connector jmx".
Remove the stray spaces from all of the /etc/* config files.
I'm using ASP.NET Web API with Entity Framework 4.2 and the Sybase ASE connector.
This was working without issues, returning JSON, until I tried to add a new table.
return db.car
.Include("tires")
.Include("tires.hub_caps")
.Include("tires.hub_caps.colors")
.Include("tires.hub_caps.sizes")
.Include("tires.hub_caps.sizes.units")
.Where(c => c.tires == 13);
The above works without issues if the following line is removed:
.Include("tires.hub_caps.colors")
However, when that line is included, I get the error:
"An error occurred while preparing the command definition. See the inner exception for details."
The inner exception reads:
InnerException = {"Specified method is not supported."}
Source = Sybase.AdoNet4.AseClient
The following also results in an error:
List<car> cars = db.car.AsNoTracking()
.Include("tires")
.Include("tires.hub_caps")
.Include("tires.hub_caps.colors")
.Include("tires.hub_caps.sizes")
.Include("tires.hub_caps.sizes.units")
.Where(c => c.tires == 13).ToList();
The error is as follows:
An exception of type 'System.Data.EntityCommandCompilationException' occurred in System.Data.Entity.dll but was not handled in user code
Additional information: An error occurred while preparing the command definition. See the inner exception for details.
Inner exception: "Specified method is not supported."
This points to a fault with the Sybase ASE data connector.
I am using data annotations on all tables to control which fields are returned. On the colors table, I have tried the following annotations to limit the properties returned to just the key:
[JsonIgnore]
[IgnoreDataMember]
Any ideas what might be causing this issue?
Alternatively, if I keep colors in and instead remove
.Include("tires.hub_caps.sizes")
.Include("tires.hub_caps.sizes.units")
then it also works. It seems that the Sybase ASE connector does not support cases where the Include path forks from one object in two directions. Is there a way around this? The same issue occurs with Sybase ASE and the Progress data connector.
The issue does not occur in a standard ASP.NET MVC controller class; the problem is with serializing two one-to-many relationships on a single table to JSON.
The issue still occurs if lazy loading is turned on.
It seems to me that this is a bug with Sybase ASE that none of the connectors are able to work around.
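One possible workaround, until the provider is fixed, is to split the forked Include path into two queries in the same tracked context so each SQL statement follows a single chain. This is an untested sketch; the entity names and the Where filter are copied from the question, and relying on EF relationship fix-up here is an assumption (it requires tracked entities, so it will not work with AsNoTracking):

using System.Data.Entity; // for the Include() and Load() extension methods

// First query: load the cars together with the colors branch.
var cars = db.car
    .Include("tires.hub_caps.colors")
    .Where(c => c.tires == 13)
    .ToList();

// Second query: load the sizes branch. Relationship fix-up attaches the
// results to the hub_caps entities already tracked by the context.
db.car
    .Include("tires.hub_caps.sizes.units")
    .Where(c => c.tires == 13)
    .Load();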
I'm using the following MDX query:
SELECT [Measures].[Unita (libri)] ON COLUMNS,
TopCount([Anno].[Anno].Members, 10.0, [Measures].[Unita (libri)]) ON ROWS
FROM [Pubblicazioni]
But I always get back an error:
Error Occurred While getting Resultset
MDX Query Editor
An error occurred while rendering Pivot.jsp. Please see the log for details.
I can't access the log (for some reason the server is not writing logs), but I don't think it's normal to get this error with such a simple query. Any ideas?