I am trying to set the cover of a product programmatically, for example by doing this:
[
    "name" => "Example product",
    "price" => [
        0 => [
            "currencyId" => "b7d2554b0ce847cd82f3ac9bd1c0dfca",
            "gross" => 15,
            "net" => 10,
            "linked" => false,
        ],
    ],
    "manufacturer" => [
        "name" => "Example manufacturer",
    ],
    "tax" => [
        "name" => "21%",
        "taxRate" => 21,
    ],
    "stock" => 6235,
    "productNumber" => "PE-123123",
    "coverId" => "4efd6bc156014cc2945b6351d3e9ff03",
]
I checked, and I am sure the media is uploaded. If I do it via the media association as shown below, the media/image gets linked correctly.
"media" => [
"Id" => 'Example",
"mediaId" => "4efd6bc156014cc2945b6351d3e9ff03"
]
I don't understand why it is going wrong, as the documentation (https://docs.shopware.com/en/shopware-platform-dev-en/admin-api-guide/writing-entities#setting-the-cover) prescribes this approach. An example from the documentation is the following:
{
"name": "test",
"productNumber": "random",
"stock": 10,
"taxId": "5f78f2d4b19f49648eb1b38881463da0",
"price": [
{ "currencyId" : "b7d2554b0ce847cd82f3ac9bd1c0dfca", "gross": 15, "net": 10, "linked" : false }
],
"coverId": "00a9742db2e643ccb9d969f5a30c2758"
}
You should pass the cover media ID in the following way:
[
    // other product data
    "cover" => [
        "mediaId" => "00a9742db2e643ccb9d969f5a30c2758",
    ],
]
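This is likely because coverId references a product_media entity rather than a media entity, so passing the raw media UUID as coverId does not resolve; writing the cover through the association lets Shopware create that product_media entry itself. Applied to the payload from the question, a minimal sketch (reusing the question's placeholder UUID) could look like this:
[
    "name" => "Example product",
    "productNumber" => "PE-123123",
    "stock" => 6235,
    // ... price, tax and manufacturer as in the question
    "cover" => [
        "mediaId" => "4efd6bc156014cc2945b6351d3e9ff03", // the uploaded media UUID
    ],
]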
For me, the following works:
"media": [
{
"id": 'YourProductUuid',
"media": {
"id": 'YourProductMediaId'
}
}
],
"cover" : {
"mediaId" : 'YourProductMediaId'
},
I have data in MongoDB structured like this
[
{
"users": [
"5dd13dac47b4c85e382c6e27",
"5dce9f6d95f4ee0017be3c2c"
],
"created_at": "2019-11-20T11:22:19.167Z",
"_id": "5dd5224d76cf581424e1bb83",
"name": "Fast Weight Loss",
"program": [
{
"breakfast": [
"3x Eggs",
"2x Bread",
"Cup of Milk"
],
"lunch": [
"1/4 Chicken breast"
],
"dinner": [
"1x Apple"
],
"snack": [],
"_id": "5dd5224d76cf581424e1bb84"
}
],
"__v": 0
},
{
"users": [
"5dd168eea514847564f04a74",
"5dd010a1dfa846001742e913"
],
"created_at": "2019-11-20T11:30:22.316Z",
"_id": "5dd5259bcdb7af35f09e6f9e",
"name": "30 Days Weight Loss",
"program": [
{
"breakfast": [
"3x Eggs"
],
"lunch": [],
"dinner": [],
"snack": [],
"_id": "5dd5259bcdb7af35f09e6f9f"
}
],
"__v": 0
}
]
I want to send a POST request to my Node.js server with a user ID. If the user ID exists in a users array like this:
"users": [
"5dd13dac47b4c85e382c6e27",
"5dce9f6d95f4ee0017be3c2c"
],
or this:
"users": [
"5dd168eea514847564f04a74",
"5dd010a1dfa846001742e913"
],
then it should send the "program" array back as the response. Here is what I have been trying:
users.post("/dietData", (req, res) => {
var id = req.body.userID;
DietProgram.find()
.then(user => {
if (user.users.contains(id)) {
res.json(user.program);
} else {
res.send("User Weight Data Does not Exsist");
}
})
.catch(err => {
res.send("error: " + err);
});
});
but I got an error saying that apparently I can't use contains.
Since the users property is an array, I think you mean includes(). You can use it like this: users.includes(id).
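Note that DietProgram.find() with no arguments resolves to an array of documents rather than a single document, so the matching document still has to be located. A minimal sketch of the route, assuming a Mongoose model called DietProgram and a userID field in the request body as in the question, lets MongoDB do the matching:
users.post("/dietData", (req, res) => {
  const id = req.body.userID;

  // findOne({ users: id }) matches a document whose "users" array contains id.
  DietProgram.findOne({ users: id })
    .then(doc => {
      if (doc) {
        res.json(doc.program);
      } else {
        res.send("User Weight Data Does Not Exist");
      }
    })
    .catch(err => {
      res.send("error: " + err);
    });
});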
Logstash is receiving a docs JSON object which contains various types of docs.
{
"docs": [
{
"_id": "project:A",
"_rev": "project:1",
"name": "secret",
"children": ["item:A"]
},
{
"_id": "item:A",
"_rev": "item:1",
"name": "secret"
}
]
}
I want each doc with an _id starting with project to include matching children. The end result should be:
{
"docs": [
{
"_id": "project:A",
"_rev": "project:1",
"name": "secret",
"children": [{
"_id": "item:A",
"_rev": "item:1",
"name": "secret"
}]
}
]
}
How can I achieve this?
Here is my conf file. I haven't been able to figure out how to solve this:
input {
file {
path => ["/home/logstash/logstash-testdata.json"]
sincedb_path => "/dev/null"
start_position => "beginning"
}
}
filter {
json {
source => "message"
}
# ... ???
}
output {
elasticsearch {
hosts => ["localhost:9200"]
}
stdout {
codec => rubydebug
}
}
I generate an envelope of 10-15 documents. There are two end users: the user who will sign should not see document 1, which contains the data of the second user. The second user should see all the documents.
I tried to use "excludedDocuments", but I ran into the error "ACCOUNT_LACKS_PERMISSIONS".
The (incomplete) example I am generating:
{"compositeTemplates": [
{
"inlineTemplates": [
{
"sequence": "1",
"recipients": {
"signers": [
{
"email": "albert#princeton.edu",
"name": "Albert Einstein",
"recipientId": "1",
"clientUserId": "albert#princeton.edu",
"routingOrder": "1",
"tabs": {
"textTabs": [],
"radioGroupTabs": [],
"checkboxTabs": []
}
}
]
}
}
],
"document": {
"documentId": 1,
"name": "FirstFile",
"transformPdfFields": "true"
}
},
{
"inlineTemplates": [
{
"sequence": "1",
"recipients": {
"signers": [
{
"email": "albert#princeton.edu",
"name": "Albert Einstein",
"recipientId": "1",
"clientUserId": "albert#princeton.edu",
"routingOrder": "1",
"tabs": {
"textTabs": [],
"radioGroupTabs": [],
"checkboxTabs": []
}
}
]
}
}
],
"document": {
"documentId": 2,
"name": "SecondFile",
"transformPdfFields": "true"
}
}
]}
Please tell me how to solve this problem. Thank you in advance.
Update: PHP 7 code:
$compositeTemplates[] = [
'inlineTemplates' => [
[
'sequence' => '1',
'recipients' => [
'signers' => [
[
'email' => $userData['email'],
'name' => $userData['name'],
'recipientId' => '1',
'clientUserId' => $userData['email'],
'routingOrder' => '1',
"excludedDocuments" => ['1'],
'tabs' => [
'textTabs' => Template::fileTextTabs($sendData[$withoutExtension]['text'] ?? false), // simple tab construction based on the available data
'radioGroupTabs' => Template::fileRadioGroupTabs($sendData[$withoutExtension]['radio'] ?? false),
'checkboxTabs' => Template::fileCheckboxTabs($sendData[$withoutExtension]['checkbox'] ?? false),
],
],
],
"carbonCopies" => [
[
"email" => 'mylyrium#gmail.com',
"name" => 'copies',
"recipientId" => "2",
"routingOrder" => '1',
],
],
],
],
],
'document' => [
'documentId' => $id,
'name' => $filename,
'transformPdfFields' => 'true',
],
];
$id++;
This error means that the admin account is not properly configured to have Document Visibility enabled.
To enable it, go to your DocuSign admin account and scroll down to Sending Settings.
Make sure that one of the Document Visibility options is selected instead of Off.
For more information on the Document Visibility drop-down options, see the official documentation.
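In addition to the account-level setting, the eSignature API generally also requires enforceSignerVisibility to be set to true on the envelope for excludedDocuments to take effect (worth verifying against the current API reference). A minimal sketch of the envelope-level payload in the question's PHP style, with placeholder subject and status values:
// Sketch only: envelope-level settings wrapping the compositeTemplates built above.
$envelopeDefinition = [
    'emailSubject' => 'Please sign these documents', // placeholder
    'status' => 'sent',                              // placeholder
    'enforceSignerVisibility' => 'true',             // needed for excludedDocuments to apply
    'compositeTemplates' => $compositeTemplates,
];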
I'm new to this ELK stuff. I've been trying to create visualizations using this stack, but I'm not able to use fields such as verb, response, request, etc.; I'm only able to select a few of the available fields.
However, in the Discover section I'm perfectly able to work with those fields. Here is a sample of one of my query results:
(I'm using Kibana 4.4.2, filebeat forwarding to logstash 2.2.3)
{
"_index": "filebeat-2016.04.12",
"_type": "apache_log",
"_id": "AVQMoRFwO5HM5nz1lmXf",
"_score": null,
"_source": {
"message": "187.142.15.173 - - [12/Apr/2016:16:39:23 -0600] \"GET /v1.0/person/297312123/client/1132347/profile HTTP/1.1\" 200 2051 \"-\" \"Android CEX 2.2.0\"",
"#version": "1",
"#timestamp": "2016-04-12T22:39:27.064Z",
"beat": {
"hostname": "myhost",
"name": "myhost"
},
"count": 1,
"fields": null,
"input_type": "log",
"offset": 30034512,
"source": "/var/log/httpd/access_log",
"type": "apache_log",
"host": "myhost",
"tags": [
"beats_input_codec_plain_applied"
],
"clientip": "187.142.15.173",
"ident": "-",
"auth": "-",
"timestamp": "12/Apr/2016:16:39:23 -0600",
"verb": "GET",
"request": "/v1.0/person/297312123/client/1132347/profile",
"httpversion": "1.1",
"response": "200",
"bytes": "2051",
"referrer": "\"-\"",
"agent": "\"Android CEX 2.2.0\"",
},
"fields": {
"#timestamp": [
1460500767064
]
},
"sort": [
1460500767064
]
}
What could possibly be wrong with this?
Here is my config file:
filter {
if [type] == "syslog" {
grok {
match => { "message" =>
"%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
}
add_field => [ "received_at", "%{#timestamp}" ]
add_field => [ "received_from", "%{host}" ]
}
syslog_pri { }
date {
match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
}
if [type] == "apache_log" {
grok {
# match => [ "message", "%{COMBINEDAPACHELOG}" ]
# match => { "message" => "%{COMBINEDAPACHELOG}" }
# add_field => [ "received_at", "%{#timestamp}" ]
# add_field => [ "received_from", "%{host}" ]
match => [ "message", "%{COMBINEDAPACHELOG}" ]
}
#syslog_pri { }
#date {
# match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
#}
}
}
Thanks in advance!
My first thought would be the Kibana field cache. Go to Settings -> Indexes, select your index, and click the orange Reload button.
Logstash filter by tags for different websites
Issue: I have multiple websites inside a single IIS server. I want to add a "tag" for each of the log files I am sending to Logstash.
This is my logstash-forwarder config (below).
Each log file represents a different website, so I want to add tags for each of these logs and be able to filter by this particular tag.
"logs\svr05\ex*",
{
"network": {
"servers": [ "logsvr1.logs.local:5000", "logsvr2.logs.local:5000" ],
"timeout": 15,
"ssl ca": "logstash-forwarder-new.crt"
},
"files": [
{
"paths": [
"logs\\svr08\\ex*",
"logs\\svr05\\ex*",
"logs\\svr04\\ex*",
"logs\\svr03\\ex*"
],
"fields": { "type": "iis" },
"dead time": "24h"
}
]
}
This is my IIS filter config for Logstash:
filter {
if [type] == "iis" {
if [message] =~ "^#" {
drop {}
}
grok {
break_on_match => false
match => [
"message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:s-sitename} %{IPORHOST:s-ip} %{URIPROTO:cs-method} %{URIPATH:cs-uri-stem} (?:%{NOTSPACE:cs_query}|-) %{NUMBER:src_port} %{NOTSPACE:cs_username} %{IP:clientip} %{NOTSPACE:useragent} %{NUMBER:sc-substatus} %{NUMBER:sc_win32_status} %{NUMBER:sc-bytes} %{NUMBER:cs-bytes} %{NUMBER:timetaken}"
]
}
date {
locale => "en"
match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
target => "#timestamp"
timezone => "Indian/Maldives"
}
useragent {
source=> "useragent"
prefix=> "browser"
}
geoip {
source => "clientip"
target => "geoip"
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
add_field => [ "src_ip", "%{clientip}" ]
convert => [ "[geoip][coordinates]", "float" ]
replace => [ "#source_host", "%{clientip}" ]
replace => [ "#message", "%{message}" ]
rename => [ "cs_method", "method" ]
rename => [ "cs_stem", "request" ]
rename => [ "useragent", "agent" ]
rename => [ "cs_username", "username" ]
rename => [ "sc_status", "response" ]
rename => [ "timetaken", "time_request" ]
}
}
}
filter
{
if [type] == "iis" {
mutate {
remove_field => [ "clientip", "host", "hostname", "logtime" ]
}
}
}
Suppose I want to send logs for different apps:
app1.egov.mv
app2.egov.mv
How can I add tags for these different IIS applications, and filter by them in the Discover module to make graphs for specific websites using the tag?
Regards,
Ismail
You already know how to add the type field, so just use the same method to add another field containing the name of the host:
{
...,
"files": [
{
"paths": [
"logs\\svr08\\ex*",
"logs\\svr05\\ex*",
"logs\\svr04\\ex*",
"logs\\svr03\\ex*"
],
"fields": {
"type": "iis",
"virtualhost": "app1.egov.mv"
},
"dead time": "24h"
}
]
}
Obviously, if your different logfile patterns are for different servers, you'll have to split your configuration:
{
...,
"files": [
{
"paths": [
"logs\\svr08\\ex*"
],
"fields": {
"type": "iis",
"virtualhost": "app1.egov.mv"
},
"dead time": "24h"
},
{
"paths": [
"logs\\svr05\\ex*"
],
"fields": {
"type": "iis",
"virtualhost": "app2.egov.mv"
},
"dead time": "24h"
},
...
]
}
Another option (that I prefer) is to have the web server itself include the hostname in each log entry.
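Once the virtualhost field arrives in Logstash it can be used directly in Kibana, or turned into a tag with a conditional. A minimal sketch (the virtualhost field name follows the forwarder config above; the tag names are just examples):
filter {
  if [virtualhost] == "app1.egov.mv" {
    mutate { add_tag => [ "app1" ] }
  } else if [virtualhost] == "app2.egov.mv" {
    mutate { add_tag => [ "app2" ] }
  }
}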