DataDog Log - Search a sequence with wildcards - search

In DataDog's log search, I want to match messages like the following, but I have not been able to do so.
• Request failed with status code 500
• Request failed with status code 525
• Request failed with status code 512
The status code can be any value from 500 to 599.
I have tried the following searches. None of them work.
• "Request failed with status code 5*"
• "Request failed with status code 5**"
• "*Request failed with status code 5*"
• "Request failed with status code 5?*"
• "Request failed with status code 5??*"
• "Request failed with status code 5??"
• "*Request?failed?with?status?code?5*"
• "Request?failed?with?status?code?5??"
• "*Request?failed?with?status?code?5??*"
These queries run, but the results are wrong:
• "Request failed with status code" 5??
• "Request failed with status code" 5*
Any help would be appreciated, thanks!

Update:
It looks like ? doesn't match spaces. For a query such as HTTP responded 5??, the spaces need to be escaped: HTTP\ responded?5??
So I would try Request\ failed\ with\ status\ code\ 5?? as the query.
However, I would recommend parsing the log with a pipeline so you can make a simpler query like `#request.status:failed #status.code:[500 TO 599]`, and avoid dealing with all the escaping and wildcarding.
Wildcards don't work inside quotes; they are treated as literals.
Just search for Request?failed?with?status?code?5??
You also don't need to surround the query with *; wildcards are implied by default when searching the message.

Related

ORA-01555: snapshot too old: rollback segment number with name “” too small - SonarQube

I am getting an error while publishing results to SonarQube:
Error querying database. Cause: org.apache.ibatis.executor.result.ResultMapException: Error attempting to get column 'RAWLINEHASHES' from result set. Cause: java.sql.SQLException: ORA-01555: snapshot too old: rollback segment number 2 with name "_SYSSMU2_111974964$" too small
The pipeline executed for 2 hr 30 min. Can you please help?
The error you are getting, ORA-01555, is an Oracle error message.
Your pipeline is executing something against an Oracle database which, after running for a long time, raises this error.
For ways to avoid this error see: https://blog.enmotech.com/2018/09/10/ora-01555-snapshot-old-error-ways-to-avoid-ora-01555-snapshot-too-old-error/

Determine Mongoose findByIdAndUpdate() error type

Regarding Mongoose's findByIdAndUpdate(): how do I determine the proper error status code, i.e. whether the id was not found (404) or the passed data was invalid (422)?
Alternatively, we could use Model.findById() first to determine the 404 case, and then Model.update() to determine the 422 case.
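The two-step approach above can be sketched as follows. This is a minimal sketch in plain JavaScript, not tied to a real database; `updateStatus` and the commented handler flow are hypothetical names, and in a real handler `doc` would be the result of `Model.findById()` and the validation error would come from Mongoose:

```javascript
// Sketch: decide the HTTP status for an update-by-id operation.
// `doc` is the result of a findById lookup (null when the id is not found);
// `validationError` is a validation failure on the payload (or null).
function updateStatus(doc, validationError) {
  if (doc === null) return 404;    // id does not exist
  if (validationError) return 422; // payload failed schema validation
  return 200;                      // update can proceed
}

// Hypothetical handler flow:
//   const doc = await Model.findById(req.params.id);
//   const status = updateStatus(doc, null);
console.log(updateStatus(null, null));           // → 404
console.log(updateStatus({}, new Error('bad'))); // → 422
console.log(updateStatus({}, null));             // → 200
```

Separating the lookup from the update costs one extra query, but it makes the 404/422 distinction explicit instead of inferring it from a combined error.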

Error running ASR health status report Kusto query

I am trying to query replication health of all protected Azure VMs and break them down into three states: normal, warning and critical. But I got an error running below code:
AzureDiagnostics 
| where replicationProviderName_s == "A2A"  
| where isnotempty(name_s) and isnotnull(name_s) 
| summarize hint.strategy=partitioned arg_max(TimeGenerated, *) by name_s 
| project name_s , replicationHealth_s 
| summarize count() by replicationHealth_s 
| render piechart 
The error is: 'where' operator: Failed to resolve column or scalar expression named 'replicationProviderName_S'
Please help me resolve the error.
 
The error means there is no column named replicationProviderName_s in the schema of the table/function named AzureDiagnostics.
What does the following return when you run it?
AzureDiagnostics | getschema

UserErrorSqlBulkCopyInvalidColumnLength - Azure SQL Database

I am configuring a Salesforce to Azure SQL Database data copy using Azure Data Factory. There appears to be an issue with the column length, but I am unable to identify which column is actually causing an issue.
How can I gain more insight into exactly what is causing the problem, or which column is really invalid?
{
  "dataRead": 18560714,
  "dataWritten": 0,
  "rowsRead": 15514,
  "rowsCopied": 0,
  "copyDuration": 34,
  "throughput": 533.109,
  "errors": [
    {
      "Code": 9123,
      "Message": "ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL Bulk Copy failed due to received an invalid column length from the bcp client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The service has encountered an error processing your request. Please try again. Error code 4815.\r\nA severe error occurred on the current command. The results, if any, should be discarded.,Source=.Net SqlClient Data Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe error occurred on the current command. The results, if any, should be discarded.,},],'",
      "EventType": 0,
      "Category": 5,
      "Data": {},
      "MsgId": null,
      "ExceptionType": null,
      "Source": null,
      "StackTrace": null,
      "InnerEventInfos": []
    }
  ],
  "effectiveIntegrationRuntime": "DefaultIntegrationRuntime (East US 2)",
  "usedCloudDataMovementUnits": 4,
  "usedParallelCopies": 1,
  "executionDetails": [
    {
      "source": { "type": "Salesforce" },
      "sink": { "type": "AzureSqlDatabase" },
      "status": "Failed",
      "start": "2018-03-01T18:07:37.5732769Z",
      "duration": 34,
      "usedCloudDataMovementUnits": 4,
      "usedParallelCopies": 1,
      "detailedDurations": {
        "queuingDuration": 5,
        "timeToFirstByte": 24,
        "transferDuration": 4
      }
    }
  ]
}
"Message":"ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL
Bulk Copy failed due to received an invalid column length from the bcp
client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The
service has encountered an error processing your request. Please try
again. Error code 4815.\r\nA severe error occurred on the current
command. The results, if any, should be
discarded.,Source=.Net SqlClient Data
Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try
again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe
error occurred on the current command. The results, if any,
should be discarded.
I have seen that error when copying a string from the source into a destination column that is shorter than the source value.
My suggestion is to make sure the data types you selected on the destination are big enough to receive the data from the source. Also make sure the sink has the columns shown in the preview.
To identify which column may be involved, exclude big columns one by one from the source.
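As an alternative to excluding columns one by one, you can check a sample of the source rows against the destination's column lengths offline. Here is a minimal sketch in JavaScript; the helper name `findOversizedValues` and the sample data are hypothetical, and in practice the rows would come from a Salesforce export and the max lengths from the sink table's varchar definitions:

```javascript
// Given source rows and the destination's max string lengths per column,
// report every value that would not fit its target column.
function findOversizedValues(rows, columnMaxLengths) {
  const offenders = [];
  for (const [i, row] of rows.entries()) {
    for (const [col, max] of Object.entries(columnMaxLengths)) {
      const value = row[col];
      if (typeof value === 'string' && value.length > max) {
        offenders.push({ row: i, column: col, length: value.length, max });
      }
    }
  }
  return offenders;
}

// Hypothetical sample: Description exceeds a varchar(255) sink column.
const rows = [
  { Name: 'Acme', Description: 'short' },
  { Name: 'Globex', Description: 'x'.repeat(300) },
];
console.log(findOversizedValues(rows, { Name: 50, Description: 255 }));
// → [ { row: 1, column: 'Description', length: 300, max: 255 } ]
```

This pinpoints the offending column and row directly, instead of bisecting the column list through repeated copy runs.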

Getting "required (...)+ loop did not match anything at input 'Scenario:'" error when using Background section in cucumber

I am writing a Karate DSL test to test a web service endpoint. I have already defined my base url in the karate-config.js file, but when I try to use it in the Background section, I get the error below. My feature file is provided below. Please help.
Error: "required (...)+ loop did not match anything at input 'Scenario:'"
Feature: Test Data Management service endpoints that perform different operations with EPR
Background:
url dataManagementUrlBase
Scenario: Validate that the contractor's facility requirements are returned from EPR
Given path 'facilities'
And def inputpayload = read('classpath:dataManagementPayLoad.json')
And request inputpayload
When method post
Then status 200
And match $ == read('classpath:dataManagementExpectedJson.json')
You are missing a * before url:
Background:
* url dataManagementUrlBase

Resources