Kibana: searching for a specific phrase returns no results, while another search returns the phrase - logstash

Looks like a simple use case, but for some reason I just can't figure out how to do this, or google a clear example.
Let's say I have a message stored in Logstash:
message:
"info: 2015-11-28 22:02:19,232:common:INFO:ENV: Production
User:None:Username:None:LOG: publishing to bus "
If I search in Kibana (version 4) for the phrase "publishing to bus", I get a set of results.
But if I search for "None:LOG: publishing to bus", I get "No results found",
even though this phrase obviously exists and is returned by the previous search.
So my question is basically: what is going on? What is the correct way to search for a possibly long phrase, and why does the second example fail?
EDIT:
The stored JSON.
{
"_index": "logz-ngdxrkmolklnvngumaitximbohqwbocg-151206_v1",
"_type": "django_logger",
"_id": "AVF2DPxZZst_8_8_m-se",
"_score": null,
"_source": {
"log": " publishing to bus {'user_id': 8866, 'event_id': 'aibRBPcLxcAzsEVRtFZVU5', 'timestamp': 1449384441, 'quotes': {}, 'rates': {u'EURUSD': Decimal('1.061025'), u'GBPUSD': Decimal('1.494125'), u'EURGBP': Decimal('0.710150')}, 'event': 'AccountInstrumentsUpdated', 'minute': 1449384420}",
"logger": "common",
"log_level": "INFO",
"message": "2015-12-06 06:47:21,298:common:INFO:ENV: Production User:None:Username:None:LOG: publishing to bus {'user_id': 8866, 'event_id': 'aibRBPcLxcAzsEVRtFZVU5', 'timestamp': 1449384441, 'quotes': {}, 'rates': {u'EURUSD': Decimal('1.061025'), u'GBPUSD': Decimal('1.494125'), u'EURGBP': Decimal('0.710150')}, 'event': 'AccountInstrumentsUpdated', 'minute': 1449384420}",
"type": "django_logger",
"tags": [
"celery"
],
"path": "//path/to/logs/out.log",
"environment": "Staging",
"#timestamp": "2015-12-06T06:47:21.298+00:00",
"user_id": "None",
"host": "path.to.host",
"timestamp": "2015-12-06 06:47:21,298",
"username": "None"
},
"fields": {
"#timestamp": [
1449384441298
]
},
"highlight": {
"message": [
"2015-12-06 06:47:21,298:common:INFO:ENV: Staging User:None:Username:None:LOG: #kibana-highlighted-field#publishing#/kibana-highlighted-field# #kibana-highlighted-field#to#/kibana-highlighted-field# #kibana-highlighted-field#bus#/kibana-highlighted-field# {'user_id': **, 'event_id': 'aibRBPcLxcAzsEVRtFZVU5', 'timestamp': 1449384441, 'quotes': {}, 'rates': {u'EURUSD': Decimal('1.061025'), u'GBPUSD': Decimal('1.494125'), u'EURGBP': Decimal('0.710150')}, 'event': 'AccountInstrumentsUpdated', 'minute': 1449384420}"
]
},
"sort": [
1449384441298
]
}

According to the Elasticsearch documentation, the standard analyzer is used by default. The standard analyzer tokenizes the message field as follows:
"2015-12-06 06:47:21,298:common:INFO:ENV: Production
User:None:Username:None:LOG: publishing to bus {'user_id': 8866,
'event_id': 'aibRBPcLxcAzsEVRtFZVU5', 'timestamp': 1449384441,
'quotes': {}, 'rates': {u'EURUSD': Decimal('1.061025'), u'GBPUSD':
Decimal('1.494125'), u'EURGBP': Decimal('0.710150')}, 'event':
'AccountInstrumentsUpdated', 'minute': 1449384420}"
{
"tokens": [
{
"token": "2015",
"start_offset": 0,
"end_offset": 4,
"type": "<NUM>",
"position": 0
},
{
"token": "12",
"start_offset": 5,
"end_offset": 7,
"type": "<NUM>",
"position": 1
},
{
"token": "06",
"start_offset": 8,
"end_offset": 10,
"type": "<NUM>",
"position": 2
},
{
"token": "06",
"start_offset": 11,
"end_offset": 13,
"type": "<NUM>",
"position": 3
},
{
"token": "47",
"start_offset": 14,
"end_offset": 16,
"type": "<NUM>",
"position": 4
},
{
"token": "21,298",
"start_offset": 17,
"end_offset": 23,
"type": "<NUM>",
"position": 5
},
{
"token": "common:info:env",
"start_offset": 24,
"end_offset": 39,
"type": "<ALPHANUM>",
"position": 6
},
{
"token": "production",
"start_offset": 41,
"end_offset": 51,
"type": "<ALPHANUM>",
"position": 7
},
{
"token": "user:none:username:none:log",
"start_offset": 52,
"end_offset": 79,
"type": "<ALPHANUM>",
"position": 8
},
{
"token": "publishing",
"start_offset": 81,
"end_offset": 91,
"type": "<ALPHANUM>",
"position": 9
},
{
"token": "to",
"start_offset": 92,
"end_offset": 94,
"type": "<ALPHANUM>",
"position": 10
},
{
"token": "bus",
"start_offset": 95,
"end_offset": 98,
"type": "<ALPHANUM>",
"position": 11
},
{
"token": "user_id",
"start_offset": 100,
"end_offset": 107,
"type": "<ALPHANUM>",
"position": 12
},
{
"token": "8866",
"start_offset": 109,
"end_offset": 113,
"type": "<NUM>",
"position": 13
},
{
"token": "event_id",
"start_offset": 115,
"end_offset": 123,
"type": "<ALPHANUM>",
"position": 14
},
{
"token": "aibrbpclxcazsevrtfzvu5",
"start_offset": 125,
"end_offset": 147,
"type": "<ALPHANUM>",
"position": 15
},
{
"token": "timestamp",
"start_offset": 149,
"end_offset": 158,
"type": "<ALPHANUM>",
"position": 16
},
{
"token": "1449384441",
"start_offset": 160,
"end_offset": 170,
"type": "<NUM>",
"position": 17
},
{
"token": "quotes",
"start_offset": 172,
"end_offset": 178,
"type": "<ALPHANUM>",
"position": 18
},
{
"token": "rates",
"start_offset": 184,
"end_offset": 189,
"type": "<ALPHANUM>",
"position": 19
},
{
"token": "ueurusd",
"start_offset": 192,
"end_offset": 199,
"type": "<ALPHANUM>",
"position": 20
},
{
"token": "decimal",
"start_offset": 201,
"end_offset": 208,
"type": "<ALPHANUM>",
"position": 21
},
{
"token": "1.061025",
"start_offset": 209,
"end_offset": 217,
"type": "<NUM>",
"position": 22
},
{
"token": "ugbpusd",
"start_offset": 220,
"end_offset": 227,
"type": "<ALPHANUM>",
"position": 23
},
{
"token": "decimal",
"start_offset": 229,
"end_offset": 236,
"type": "<ALPHANUM>",
"position": 24
},
{
"token": "1.494125",
"start_offset": 237,
"end_offset": 245,
"type": "<NUM>",
"position": 25
},
{
"token": "ueurgbp",
"start_offset": 248,
"end_offset": 255,
"type": "<ALPHANUM>",
"position": 26
},
{
"token": "decimal",
"start_offset": 257,
"end_offset": 264,
"type": "<ALPHANUM>",
"position": 27
},
{
"token": "0.710150",
"start_offset": 265,
"end_offset": 273,
"type": "<NUM>",
"position": 28
},
{
"token": "event",
"start_offset": 277,
"end_offset": 282,
"type": "<ALPHANUM>",
"position": 29
},
{
"token": "accountinstrumentsupdated",
"start_offset": 284,
"end_offset": 309,
"type": "<ALPHANUM>",
"position": 30
},
{
"token": "minute",
"start_offset": 311,
"end_offset": 317,
"type": "<ALPHANUM>",
"position": 31
},
{
"token": "1449384420",
"start_offset": 319,
"end_offset": 329,
"type": "<NUM>",
"position": 32
}
]
}
The phrase "Production User:None:Username:None:LOG: publishing to bus " produces these tokens:
{
"token": "production",
"start_offset": 41,
"end_offset": 51,
"type": "<ALPHANUM>",
"position": 7
},
{
"token": "user:none:username:none:log",
"start_offset": 52,
"end_offset": 79,
"type": "<ALPHANUM>",
"position": 8
},
{
"token": "publishing",
"start_offset": 81,
"end_offset": 91,
"type": "<ALPHANUM>",
"position": 9
},
{
"token": "to",
"start_offset": 92,
"end_offset": 94,
"type": "<ALPHANUM>",
"position": 10
},
{
"token": "bus",
"start_offset": 95,
"end_offset": 98,
"type": "<ALPHANUM>",
"position": 11
}
So if you search for "publishing to bus", Elasticsearch matches the three tokens above (publishing, to, bus) and returns the document.
If you search for "None:LOG: publishing to bus", the term "None:LOG:" doesn't fully match the token user:none:username:none:log, so the document is not returned.
You can try "User:None:Username:None:LOG: publishing to bus" to get the result.
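The mismatch can be illustrated with a toy simulation in Python. This only models the lowercasing and whitespace splitting relevant to this particular phrase; the real standard analyzer has many more rules (note the simulated token keeps a trailing colon that the real analyzer strips, which doesn't change the outcome here):

```python
# Simplified simulation of the analyzer: lowercase and split on whitespace.
# As the _analyze output above shows, colon-joined runs like
# user:none:username:none:log stay together as a single token.
def tokenize(text):
    return text.lower().split()

doc_tokens = tokenize(
    "Production User:None:Username:None:LOG: publishing to bus"
)

def phrase_matches(query, tokens):
    """A phrase query matches only if its tokens appear as a
    consecutive run of whole tokens in the document."""
    q = tokenize(query)
    return any(tokens[i:i + len(q)] == q
               for i in range(len(tokens) - len(q) + 1))

print(phrase_matches("publishing to bus", doc_tokens))            # True
print(phrase_matches("None:LOG: publishing to bus", doc_tokens))  # False
```

"None:LOG:" tokenizes to a single term that is not equal to the whole indexed token, so the phrase can never line up.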

Elasticsearch's standard analyzer (which Kibana relies on) has surprising behavior with special characters such as :, | and -: when it encounters them, the text around them ends up in separate tokens rather than one searchable unit. That is why it is easy to find publishing to bus, None, or log individually, but a phrase that cuts across token boundaries fails. The solution is to tell Elasticsearch that the field should not be analyzed.

Related

I have a JSON formatted output as below; how can I access and validate the output of each node with an Assert statement?

{
"Type": "Page",
"X": 0,
"Y": 0,
"Width": 696,
"Height": 888,
"Children": [
{
"Type": "Column",
"X": 0,
"Y": 0,
"Width": 696,
"Height": 888,
"Children": [
{
"Type": "Paragraph",
"X": 209,
"Y": 290,
"Width": 248,
"Height": 24,
"Children": [
{
"Type": "Line",
"X": 209,
"Y": 290,
"Width": 248,
"Height": 24,
"Children": [
{
"Type": "Word",
"X": 209,
"Y": 290,
"Width": 49,
"Height": 24,
"Children": [
],
"Content": "Core"
},
{
"Type": "Word",
"X": 263,
"Y": 290,
"Width": 106,
"Height": 24,
"Children": [
],
"Content": "Enterprise"
},
{
"Type": "Word",
"X": 375,
"Y": 290,
"Width": 82,
"Height": 24,
"Children": [
],
"Content": "Installer"
}
],
"Content": null
}
],
"Content": null
},
{
"Type": "Paragraph",
"X": 580,
"Y": 803,
"Width": 79,
"Height": 13,
"Children": [
{
"Type": "Line",
"X": 580,
"Y": 803,
"Width": 79,
"Height": 13,
"Children": [
{
"Type": "Word",
"X": 580,
"Y": 803,
"Width": 46,
"Height": 13,
"Children": [
],
"Content": "Version"
},
{
"Type": "Word",
"X": 629,
"Y": 803,
"Width": 12,
"Height": 13,
"Children": [
],
"Content": "8."
},
{
"Type": "Word",
"X": 640,
"Y": 803,
"Width": 12,
"Height": 13,
"Children": [
],
"Content": "0."
},
{
"Type": "Word",
"X": 651,
"Y": 803,
"Width": 8,
"Height": 13,
"Children": [
],
"Content": "0"
}
],
"Content": null
}
],
"Content": null
}
],
"Content": null
}
],
"Content": null
}
Looking for solutions

Gitlab API access to package registry file

I'm trying to download my .war file with the GitLab API; how can I do that?
I got this route: /api/v4/projects/:id/packages/:package_id/package_files
to get the list of files, but how do I get the file content from this response?
Thanks
Let's assume you have these responses:
https://gitlab.com/api/v4/projects/8377576/packages/25/package_files:
[{
"id": 101,
"package_id": 25,
"created_at": "2018-09-14T07:41:10.409Z",
"file_name": "my-app-1.4-20180914.074110-1.jar",
"size": 2497,
"file_md5": "2f94a9760bcd7c2be781b938ec825205",
"file_sha1": "63d4153372057e12ca8e539c5fcae82b7b110e45"
}, {
"id": 102,
"package_id": 25,
"created_at": "2018-09-14T07:41:10.843Z",
"file_name": "my-app-1.4-20180914.074110-1.pom",
"size": 1429,
"file_md5": "380bbe1891b4d568f823f5562875b12b",
"file_sha1": "2406e3f80700ff1579a255858b8dcab35ef9ee4e"
}, {
"id": 103,
"package_id": 25,
"created_at": "2018-09-14T07:41:11.250Z",
"file_name": "maven-metadata.xml",
"size": 767,
"file_md5": "994e0bf8f19bc1c6fdfaf821e9e65037",
"file_sha1": "1462cf5d9ba09e67848202d67cafa3c7e1034a9d"
}, {
"id": 106,
"package_id": 25,
"created_at": "2018-09-14T07:41:24.324Z",
"file_name": "my-app-1.4-20180914.074123-1.jar",
"size": 2505,
"file_md5": "a75078cae821223e7ac6d9055cca24b6",
"file_sha1": "7eca3cba2d25225382e079381cb4b0616528552f"
}, {
"id": 107,
"package_id": 25,
"created_at": "2018-09-14T07:41:25.521Z",
"file_name": "my-app-1.4-20180914.074123-1.pom",
"size": 1429,
"file_md5": "380bbe1891b4d568f823f5562875b12b",
"file_sha1": "2406e3f80700ff1579a255858b8dcab35ef9ee4e"
}, {
"id": 108,
"package_id": 25,
"created_at": "2018-09-14T07:41:27.257Z",
"file_name": "maven-metadata.xml",
"size": 767,
"file_md5": "938e53442dbe0e513bf99ac35a721a30",
"file_sha1": "3d71c2f7064fdf016a70ccddf27d879af7a08d47"
}
]
Assuming you want the file my-app-1.4-20180914.074110-1.pom, you can download it using the URL:
https://gitlab.com/gitlab-org/examples/mvn-example/-/package_files/102/download
(you need to know your group name, project name, etc.)
Note: GitLab does not have a proper way to do this; this is a workaround.
Reference:
https://gitlab.com/gitlab-org/gitlab/-/issues/271534
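Putting the workaround together as a sketch (Python, standard library only): the project ID, project path, and file IDs come from the example above, and the download route is the unofficial one from the linked issue, not a documented API endpoint.

```python
import json
import urllib.request

GITLAB = "https://gitlab.com"
PROJECT_ID = 8377576                              # numeric project id (API route)
PROJECT_PATH = "gitlab-org/examples/mvn-example"  # group/project (download URL)

def list_package_files(package_id, token=None):
    """List a package's files via the documented API route."""
    url = f"{GITLAB}/api/v4/projects/{PROJECT_ID}/packages/{package_id}/package_files"
    req = urllib.request.Request(url)
    if token:
        req.add_header("PRIVATE-TOKEN", token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def download_url(file_id):
    # Workaround URL from the answer above -- not an official API route.
    return f"{GITLAB}/{PROJECT_PATH}/-/package_files/{file_id}/download"

# e.g. download_url(102) ->
# "https://gitlab.com/gitlab-org/examples/mvn-example/-/package_files/102/download"
```

You would pick the file ID (e.g. 102 for the .pom above) out of the listing response and feed it to the download URL.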

How to create an order in Shopware6 via API?

I am trying to create an order in Shopware 6 via the Admin API.
My JSON object:
// POST https://myserver.example.com/api/v3/order
{
"currencyId": "b7d2554b0ce847cd82f3ac9bd1c0dfca",
"salesChannelId": "ed47859020cc46349ed5f024c65d09e5",
"billingAddressId": "629002cceec64dd888b4c52b782d6427",
"stateId": "9c22101c81d14eb9be55a93a3ba39aa2",
"languageId": "2fbb5fe2e29a4d70aa5854ce7ce3e20b",
"currencyFactor": 1,
"orderCustomer": {
"email": "abc#xyz.com",
"salutationId": "1354f03b5d7c414e8b5aaddd0a32e8ca",
"firstName": "Donald",
"lastName": "Duck",
"customerNumber": "10045",
"customerId": "9432a62c65ae4bcbb335c3bc853d79e8",
"remoteAddress": "10.147.19.0"
},
"price": {
"netPrice": 33.57,
"totalPrice": 39.95,
"calculatedTaxes": [{
"tax": 6.38,
"taxRate": 19,
"price": 39.95
}],
"taxRules": [{
"taxRate": 19,
"percentage": 100
}],
"positionPrice": 39.95,
"taxStatus": "gross"
},
"orderDateTime": "2021-04-21T14:27:57.006+0000",
"createdAt": "2021-04-21T14:27:57.056+0000",
"lineItems": [{
"identifier": "8c380f63883449adec1ee4ccb456c489",
"quantity": 1,
"label": "Handtasche - Designer Bag",
"type": "product",
"good": true,
"removable": true,
"stackable": true,
"position": 1,
"price": {
"unitPrice": 39.95,
"quantity": 1,
"totalPrice": 39.95,
"calculatedTaxes": [{
"tax": 6.38,
"taxRate": 19,
"price": 39.95
}],
"taxRules": [{
"taxRate": 19,
"percentage": 100
}]
},
"createdAt": "2021-04-21T14:27:57.054+0000"
}]
}
But this gives me the Response:
{
"errors": [{
"code": "0",
"status": "500",
"title": "Internal Server Error",
"detail": "Notice: Trying to access array offset on value of type null",
"meta": {
"trace": [{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Dbal\/EntityHydrator.php",
"line": 213,
"function": "decode",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\FieldSerializer\\CalculatedPriceFieldSerializer",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Dbal\/EntityHydrator.php",
"line": 49,
"function": "hydrateEntity",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Dbal\\EntityHydrator",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Dbal\/EntityReader.php",
"line": 126,
"function": "hydrate",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Dbal\\EntityHydrator",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Dbal\/EntityReader.php",
"line": 94,
"function": "_read",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Dbal\\EntityReader",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Cache\/CachedEntityReader.php",
"line": 130,
"function": "read",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Dbal\\EntityReader",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/Cache\/CachedEntityReader.php",
"line": 55,
"function": "loadResultByIds",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Cache\\CachedEntityReader",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Profiling\/Entity\/EntityReaderProfiler.php",
"line": 36,
"function": "read",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\Cache\\CachedEntityReader",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/EntityRepository.php",
"line": 239,
"function": "read",
"class": "Shopware\\Core\\Profiling\\Entity\\EntityReaderProfiler",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/EntityRepository.php",
"line": 91,
"function": "read",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\EntityRepository",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Checkout\/Customer\/Subscriber\/CustomerMetaFieldSubscriber.php",
"line": 62,
"function": "search",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\EntityRepository",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/event-dispatcher\/EventDispatcher.php",
"line": 304,
"function": "fillCustomerMetaDataFields",
"class": "Shopware\\Core\\Checkout\\Customer\\Subscriber\\CustomerMetaFieldSubscriber",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/event-dispatcher\/EventDispatcher.php",
"line": 251,
"function": "Symfony\\Component\\EventDispatcher\\{closure}",
"class": "Symfony\\Component\\EventDispatcher\\EventDispatcher",
"type": "::"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/event-dispatcher\/EventDispatcher.php",
"line": 73,
"function": "callListeners",
"class": "Symfony\\Component\\EventDispatcher\\EventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Webhook\/WebhookDispatcher.php",
"line": 88,
"function": "dispatch",
"class": "Symfony\\Component\\EventDispatcher\\EventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Event\/BusinessEventDispatcher.php",
"line": 46,
"function": "dispatch",
"class": "Shopware\\Core\\Framework\\Webhook\\WebhookDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Event\/NestedEventDispatcher.php",
"line": 32,
"function": "dispatch",
"class": "Shopware\\Core\\Framework\\Event\\BusinessEventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Event\/NestedEventDispatcher.php",
"line": 28,
"function": "dispatch",
"class": "Shopware\\Core\\Framework\\Event\\NestedEventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/event-dispatcher\/Debug\/TraceableEventDispatcher.php",
"line": 168,
"function": "dispatch",
"class": "Shopware\\Core\\Framework\\Event\\NestedEventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/EntityRepository.php",
"line": 178,
"function": "dispatch",
"class": "Symfony\\Component\\EventDispatcher\\Debug\\TraceableEventDispatcher",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Api\/Controller\/ApiController.php",
"line": 916,
"function": "create",
"class": "Shopware\\Core\\Framework\\DataAbstractionLayer\\EntityRepository",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Context.php",
"line": 183,
"function": "Shopware\\Core\\Framework\\Api\\Controller\\{closure}",
"class": "Shopware\\Core\\Framework\\Api\\Controller\\ApiController",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Api\/Controller\/ApiController.php",
"line": 934,
"function": "scope",
"class": "Shopware\\Core\\Framework\\Context",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Api\/Controller\/ApiController.php",
"line": 780,
"function": "executeWriteOperation",
"class": "Shopware\\Core\\Framework\\Api\\Controller\\ApiController",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/Api\/Controller\/ApiController.php",
"line": 452,
"function": "write",
"class": "Shopware\\Core\\Framework\\Api\\Controller\\ApiController",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpKernel.php",
"line": 158,
"function": "create",
"class": "Shopware\\Core\\Framework\\Api\\Controller\\ApiController",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpKernel.php",
"line": 80,
"function": "handleRaw",
"class": "Symfony\\Component\\HttpKernel\\HttpKernel",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/Kernel.php",
"line": 201,
"function": "handle",
"class": "Symfony\\Component\\HttpKernel\\HttpKernel",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpCache\/SubRequestHandler.php",
"line": 85,
"function": "handle",
"class": "Symfony\\Component\\HttpKernel\\Kernel",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpCache\/HttpCache.php",
"line": 477,
"function": "handle",
"class": "Symfony\\Component\\HttpKernel\\HttpCache\\SubRequestHandler",
"type": "::"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpCache\/HttpCache.php",
"line": 267,
"function": "forward",
"class": "Symfony\\Component\\HttpKernel\\HttpCache\\HttpCache",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpCache\/HttpCache.php",
"line": 283,
"function": "pass",
"class": "Symfony\\Component\\HttpKernel\\HttpCache\\HttpCache",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/symfony\/http-kernel\/HttpCache\/HttpCache.php",
"line": 211,
"function": "invalidate",
"class": "Symfony\\Component\\HttpKernel\\HttpCache\\HttpCache",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/HttpKernel.php",
"line": 163,
"function": "handle",
"class": "Symfony\\Component\\HttpKernel\\HttpCache\\HttpCache",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/HttpKernel.php",
"line": 80,
"function": "doHandle",
"class": "Shopware\\Core\\HttpKernel",
"type": "-\u003E"
},
{
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/public\/index.php",
"line": 83,
"function": "handle",
"class": "Shopware\\Core\\HttpKernel",
"type": "-\u003E"
}],
"file": "\/data\/srv\/myserver.example.com\/www\/releases\/0.0\/vendor\/shopware\/core\/Framework\/DataAbstractionLayer\/FieldSerializer\/CalculatedPriceFieldSerializer.php",
"line": 51
}
}]
}
I tried all kinds of variations of the "price" object (with calculatedTaxes, without, and anything else that came to mind), with no success.
What would a correct JSON object look like that successfully creates an order?
When I work with the API, sometimes I just create an entity manually (e.g. an order) and then try to GET it via the API. Then you can just copy the formatting and feed it back into your POST call.
You might be missing "rawTotal": 468.55 in your main price object.
That said, the error suggests that the value should be an array, but it is not set. They don't use version numbers in their API anymore, so maybe this is no longer relevant :)
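As a sketch, here is the price object with the suggested rawTotal added. The values mirror the payload above and are illustrative only; whether rawTotal is actually required may depend on your Shopware version.

```python
# Price object for the order payload, with the (possibly missing) rawTotal
# field added alongside netPrice/totalPrice. All values are illustrative.
price = {
    "netPrice": 33.57,
    "totalPrice": 39.95,
    "rawTotal": 39.95,          # field suggested above; assumed equal to totalPrice here
    "positionPrice": 39.95,
    "taxStatus": "gross",
    "calculatedTaxes": [
        {"tax": 6.38, "taxRate": 19, "price": 39.95}
    ],
    "taxRules": [
        {"taxRate": 19, "percentage": 100}
    ],
}

# Gross-price arithmetic: netPrice + tax should equal totalPrice.
check = round(price["netPrice"] + price["calculatedTaxes"][0]["tax"], 2)
```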

How to prevent Elasticsearch from matching on only one non-English character with multi_match

I'm using elasticsearch-dsl and the SmartCN analyzer:
from elasticsearch_dsl import analyzer, tokenizer

analyzer_cn = analyzer(
    'smartcn',
    tokenizer=tokenizer('smartcn_tokenizer'),
    filter=['lowercase']
)
And I'm using multi_match to match on several fields:
from elasticsearch_dsl import Q

q_new = Q("multi_match", query="SOME_QUERY", fields=["FIELD_NAME"])
The desired behavior is that ES only returns documents with at least two characters matched. I've looked through the documentation but couldn't find a way to prevent Elasticsearch from matching on a single character.
Any direction / suggestion is highly appreciated.
Thanks.
The desired behavior is that ES only returns documents with at least
two characters matched.
I am not familiar with the SmartCN analyzer, but if you want to match on a minimum of 2 characters, you can use the N-gram tokenizer: it first breaks text down into words whenever it encounters one of a list of specified characters, then emits N-grams of each word within the specified length range.
Index Mapping:
{
"settings": {
"analysis": {
"analyzer": {
"my_analyzer": {
"tokenizer": "my_tokenizer"
}
},
"tokenizer": {
"my_tokenizer": {
"type": "ngram",
"min_gram": 2, <-- note this
"max_gram": 20,
"token_chars": [
"letter",
"digit"
]
}
}
},
"max_ngram_diff": 50
},
"mappings": {
"properties": {
"title": {
"type": "text",
"analyzer": "my_analyzer",
"search_analyzer": "standard"
}
}
}
}
Index Data:
{
"title": "world"
}
Analyze API
A search for "title": "w" will not match, since the generated tokens have a minimum length of 2 (min_gram is set to 2 in the index mapping above).
The tokens generated are:
POST /_analyze
{
"tokens": [
{
"token": "wo",
"start_offset": 0,
"end_offset": 2,
"type": "word",
"position": 0
},
{
"token": "wor",
"start_offset": 0,
"end_offset": 3,
"type": "word",
"position": 1
},
{
"token": "worl",
"start_offset": 0,
"end_offset": 4,
"type": "word",
"position": 2
},
{
"token": "world",
"start_offset": 0,
"end_offset": 5,
"type": "word",
"position": 3
},
{
"token": "or",
"start_offset": 1,
"end_offset": 3,
"type": "word",
"position": 4
},
{
"token": "orl",
"start_offset": 1,
"end_offset": 4,
"type": "word",
"position": 5
},
{
"token": "orld",
"start_offset": 1,
"end_offset": 5,
"type": "word",
"position": 6
},
{
"token": "rl",
"start_offset": 2,
"end_offset": 4,
"type": "word",
"position": 7
},
{
"token": "rld",
"start_offset": 2,
"end_offset": 5,
"type": "word",
"position": 8
},
{
"token": "ld",
"start_offset": 3,
"end_offset": 5,
"type": "word",
"position": 9
}
]
}
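The token list above can be reproduced with a few lines of Python. This is a sketch of the N-gram logic only, not the actual Lucene tokenizer:

```python
# All N-grams of "world" with min_gram=2 and max_gram=20
# (max_gram is effectively capped by the word length).
def ngrams(word, min_gram=2, max_gram=20):
    return [word[i:i + n]
            for n in range(min_gram, min(max_gram, len(word)) + 1)
            for i in range(len(word) - n + 1)]

tokens = ngrams("world")
print(sorted(tokens))
# ['ld', 'or', 'orl', 'orld', 'rl', 'rld', 'wo', 'wor', 'worl', 'world']
```

Because no 1-character grams are produced, a single-character query term has nothing to match against.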
Search Query:
{
"query": {
"match": {
"title": "wo"
}
}
}
Search Result:
"hits": [
{
"_index": "stof_64003025",
"_type": "_doc",
"_id": "2",
"_score": 0.56802315,
"_source": {
"title": "world"
}
}
]

timeUnit does not work after a flatten and fold transformation

Is it possible to use timeUnit after a flatten and fold transformation?
In the example below it doesn't work.
If I remove the timeUnit from the x axis it plots, but without the good things that come with timeUnit.
Thanks
Here is example code that can be executed in the link below:
https://vega.github.io/editor/#/edited
{
"$schema": "https://vega.github.io/schema/vega-lite/v4.json",
"description": "Sales in a Year.",
"width": 500,
"height": 200,
"data": {
"values": [
{"timestamp": ["2019-01-01","2019-02-01","2019-03-01","2019-04-01","2019-05-01","2019-06-01",
"2019-07-01","2019-08-01","2019-09-01","2019-10-01","2019-11-01","2019-12-01"],
"cars" : [55, 43, 91, 81, 53, 19, 87, 52, 52, 44, 52, 52],
"bikes" : [12, 6, 2, 0, 0, 0, 0, 0, 0, 3, 9, 15]}
]
},
"transform": [
{"flatten": ["timestamp", "cars", "bikes"]},
{"fold": ["cars", "bikes"]}
],
"mark": {"type":"bar", "tooltip": true, "cornerRadiusEnd": 4},
"encoding": {
"x": {"field": "timestamp",
"timeUnit": "month",
"type": "ordinal",
"title": "",
"axis": {"labelAngle": 0}},
"y": {"field": "value",
"type": "quantitative",
"title": "Soiling Loss"},
"color":{"field": "key",
"type": "nominal"}
}
}
For convenience, strings in input data with a simple temporal encoding are automatically parsed as dates, but such parsing is not applied to data that is the result of a transformation.
In this case, you can do the parsing manually with a calculate transform (view in editor):
{
"$schema": "https://vega.github.io/schema/vega-lite/v4.json",
"description": "Sales in a Year.",
"width": 500,
"height": 200,
"data": {
"values": [
{
"timestamp": [
"2019-01-01",
"2019-02-01",
"2019-03-01",
"2019-04-01",
"2019-05-01",
"2019-06-01",
"2019-07-01",
"2019-08-01",
"2019-09-01",
"2019-10-01",
"2019-11-01",
"2019-12-01"
],
"cars": [55, 43, 91, 81, 53, 19, 87, 52, 52, 44, 52, 52],
"bikes": [12, 6, 2, 0, 0, 0, 0, 0, 0, 3, 9, 15]
}
]
},
"transform": [
{"flatten": ["timestamp", "cars", "bikes"]},
{"fold": ["cars", "bikes"]},
{"calculate": "toDate(datum.timestamp)", "as": "timestamp"}
],
"mark": {"type": "bar", "tooltip": true, "cornerRadiusEnd": 4},
"encoding": {
"x": {
"field": "timestamp",
"timeUnit": "month",
"type": "ordinal",
"title": "",
"axis": {"labelAngle": 0}
},
"y": {"field": "value", "type": "quantitative", "title": "Soiling Loss"},
"color": {"field": "key", "type": "nominal"}
}
}
