Well, I know this seems to be possible, I just don't know how. To begin with, I am using traditional operators (without the @task decorator), but I am interested in the XComArgs output format these operators return, which can be used in downstream tasks. Below is a sample example:
task_1 = DummyOperator(
    task_id='task_1'
)  # returns {"data": {"foo": [{"cmd": "ls"}]}}

task_2 = BashOperator(
    task_id='task_2',
    bash_command=task_1.output['return_value']['data']['foo'][0]['cmd']  # does not give what I need and returns null
    # bash_command="{{ ti.xcom_pull(task_ids='task_1', key='return_value')['data']['foo'][0]['cmd'] }}"  # gives what I need
)
In this example, the pure Jinja templating works for me, but the new XComArgs syntax does not. I have tried setting render_template_as_native_obj=True in the DAG configuration, but it does not change anything. I want to use the .output format, which returns an XComArg object; it gives me the complete dict, but I have not been able to access nested keys like the above. I have also tried converting the string to JSON and all those combinations, but nothing seems to work.
Unfortunately, retrieving nested values from XComArgs is a limitation of the TaskFlow API.
The TaskFlow API uses __getitem__ to override the XCom key to use. In your example, the key ends up being "cmd" rather than the value of what cmd represents in that nested object. You'll have to use the original ti.xcom_pull() method until that limitation is addressed.
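For illustration, a minimal sketch of what __getitem__ actually does here, using the nested dict from the question:

# Each __getitem__ on an XComArg overrides the XCom *key* to pull;
# it does not index into the already-pulled value:
task_1.output['data']          # pulls the XCom stored under key "data"
task_1.output['data']['foo']   # pulls the XCom stored under key "foo"
# ...so the chained lookup in the question ends up trying to pull key "cmd",
# which was never pushed, hence the null.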
Trying to add values from the JSON parser into a dynamic Oracle query, but the values always show up as blank/empty.
Same result when trying to use formal parameters and declaring them within the Oracle query as well.
Is this possible?
Service Bus trigger. A message comes in and goes into the "For each message" loop. It then gets parsed in the "Parse JSON Message" step, and when I try to use the dynamic content from that in an Oracle query (e.g. @body('Parse_JSON_Message')?['contentData']?['dynamicValue']), it always shows up as nothing/blank.
I can use this same reference later (again @body('Parse_JSON_Message')?['contentData']?['dynamicValue']) with no issues, but it always ends up blank in the query.
Even using formal parameterized key values shows @body('Parse_JSON_Message')?['contentData']?['dynamicValue'] to be NULL.
So my queries are coded as:
SELECT thisValue
WHERE columnName = @body('Parse_JSON_Message')?['contentData']?['dynamicValue'] (have also tried @{body('Parse_JSON_Message')?['contentData']?['dynamicValue']})
But are coming out as:
SELECT thisValue
WHERE columnName =
...with no result obviously.
This query does work if I hard code the value.
If I use dynamic values in my output (where I'm mapping data), it works fine. So how do I correctly use dynamic values in this Oracle query?
I am using the fulfillment section on Dialogflow in a fairly basic program I have started, to show myself I can do a bigger project on Dialogflow.
I have an object set up that is a dictionary.
I can make the keys a constant through
const KEYS = Object.keys(overflow);
I am going through the values using
if (KEYS.length > 0) {
    var dictionary = overflow[KEYS[i]]
If I stringify dictionary using
JSON.stringify(dictionary);
I get:
{"stack":"overflow","stack2":"overflowtoo", "stack3":3}
This leads me to believe I am actually facing a dictionary, hence the name of the variable.
I am accessing a string value such as stack, as opposed to the number in stack3.
Everything I have read online tells me
dictionary.stack
Should work since
JSON.stringify(dictionary);
Shows me:
{"stack":"overflow","stack2":"overflowtoo","stack3":3}
Whenever I:
Try to add the variable to the response string.
Append it to a string using output += `${dictionary.tipo}`;
I get an error that the function crashed. I can replace it with the stringify line and it works, and it gives me the JSON provided, so the issue isn't there.
Dictionary values are created as follows before being accessed in another function:
dictionary[request.body.responseId]={
"stack":"overflow",
"stack2":"overflowtoo",
"stack3":3 };
Based on the suggestion here, I saw that the properties were being accessed properly but their type was undefined. Going over things repeatedly, I saw that the properties were defined as lists rather than single values.
Dot notation works once they stop being lists.
Thanks for guiding me towards the problem.
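For anyone hitting the same thing, a minimal sketch of the problem with hypothetical values:

// The property was stored as a list, so dot notation returns an array
var dictionary = { "stack": ["overflow"] };
console.log(dictionary.stack);        // [ 'overflow' ] -- an array, not a string
console.log(typeof dictionary.stack); // 'object'

// Once the property is a single value, dot notation behaves as expected
var fixed = { "stack": "overflow" };
console.log(fixed.stack);             // 'overflow'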
I'm currently writing an app that accesses Google BigQuery via their "@google-cloud/bigquery": "^2.0.6" library. In one of my queries I have a WHERE clause where I need to pass a list of ids. If I use UNNEST like in their example and pass an array of strings, it works fine.
https://cloud.google.com/bigquery/docs/parameterized-queries
However, I have found that UNNEST can be really slow, and I just want to use IN on its own and pass in a string list of ids. No matter what format of string list I send, the query returns null results. I think this is because of the way they convert parameters in order to avoid SQL injection. I have to use a parameter because I myself want to avoid SQL injection attacks on my app. If I pass just one id it works fine, but if I pass a list it blows up, so I figure it has something to do with formatting. But I know my format is correct in terms of what IN would normally expect, i.e. IN ('', '').
Has anyone been able to just pass a param to IN and have it work, i.e. IN (@idParam)?
We declare params like this at the beginning of the script:
DECLARE var_country_ids ARRAY<INT64> DEFAULT [1,2,3];
and use them like this:
WHERE if(var_country_ids is not null,p.country_id IN UNNEST(var_country_ids),true) AND ...
As you can see, we allow a NULL as well as the array notation. We don't see issues with speed.
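If you are calling from Node.js with @google-cloud/bigquery, here is a minimal sketch of passing the list as a named array parameter (the dataset and table names are placeholders; note the array parameter is still expanded through UNNEST):

const { BigQuery } = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function queryByIds(ids) {
  const [rows] = await bigquery.query({
    query: 'SELECT id FROM `my_dataset.my_table` WHERE id IN UNNEST(@idParam)',
    params: { idParam: ids }, // an array value is sent as an ARRAY parameter
  });
  return rows;
}

queryByIds(['id1', 'id2']).then(console.log).catch(console.error);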
I'm trying to extract a security_token from this response:
{}&&{"containers":{"userID":"p8admin","connected":true,"desktop":"icm"},
"userid":"p8admin",
"user_displayname": "p8admin",
"security_token":"-1829880900612241155",
"messages":[{"adminResponse":null,
"moreInformation":null,
"explanation":null,
"number":"0",
"userResponse":null,
"text":"p8admin connect\u00e9."
}]
}
I've tried combining transform and jsonPath:
.check(bodyString.transform(_.split("&&")(1)).jsonPath("$.security_token").saveAs("security_token"))
but I get this error:
value jsonPath is not a member of com.excilys.ebi.gatling.core.check.MatcherCheckBuilder
Let me know if there is a simple way to achieve this.
Thanks
From the documentation on checks:
This API provides a dedicated DSL for chaining the following steps:
defining the check
extracting
transforming
verifying
saving
Since the response isn't valid JSON, you'll need to use bodyString as the type. You can then transform and then save, but you can't go back to step 1. You can parse the value you need out of the JSON during the transform step.
As Stéphane pointed out, the easiest way to get the value is to use a regex check and extract the security_token value directly, as long as you don't need the rest of your JSON object for any logic.
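For example, a rough sketch of that approach, assuming the body always carries the {}&& prefix (shown with a recent Gatling DSL; the regex does the extraction inside the transform):

.check(
  bodyString
    .transform { body =>
      val json = body.split("&&")(1)
      """"security_token":"(.*?)"""".r.findFirstMatchIn(json).map(_.group(1)).getOrElse("")
    }
    .saveAs("security_token")
)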
I had the same problem; I used the regex() function like this:
.check(regex(""""security_token":"(.*?)",""").saveAs("security_token"))
I want to create a "prepared statement" in postgres using the node-postgres module. I want to create it without binding it to parameters because the binding will take place in a loop.
In the documentation I read:
query(object config, optional function callback) : Query
If text and name are provided within the config, the query will result in the creation of a prepared statement.
I tried
client.query({"name":"mystatement", "text":"select id from mytable where id=$1"});
but when I try passing only the text and name keys in the config object, I get an exception:
(translated) the bind message supplies 0 parameters, but the prepared statement expects 1
Is there something I am missing? How do you create/prepare a statement without binding it to specific values, in order to avoid re-preparing the statement in every step of a loop?
I just found an answer to this issue from the author of node-postgres.
With node-postgres the first time you issue a named query it is
parsed, bound, and executed all at once. Every subsequent query issued
on the same connection with the same name will automatically skip the
"parse" step and only rebind and execute the already planned query.
Currently node-postgres does not support a way to create a named,
prepared query and not execute the query. This feature is supported
within libpq and the client/server protocol (used by the pure
javascript bindings), but I've not directly exposed it in the API. I
thought it would add complexity to the API without any real benefit.
Since named statements are bound to the client in which they are
created, if the client is disconnected and reconnected or a different
client is returned from the client pool, the named statement will no
longer work (it requires a re-parsing).
You can use pg-prepared for that:
var prep = require('pg-prepared')

// First prepare the statement without binding parameters
var item = prep('select id from mytable where id=${id}')

// Then bind parameters and execute the query in a loop
for (var id of [1, 2, 3]) {
  client.query(item({id: id}), function (err, result) { /* ... */ })
}
Update: Reading your question again, here's what I believe you need to do. You need to pass a "values" array as well.
Just to clarify: where you would normally "prepare" your query, just prepare the config object you pass to it, without the values array. Then, where you would normally "execute" your query, set the values array on the object and pass it to query(). If it's the first time, the driver will do the actual prepare for you, and it will simply do the binding and execution for the rest of the iterations.
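A minimal sketch of that pattern with node-postgres, assuming a connected client:

var config = { name: 'mystatement', text: 'select id from mytable where id=$1' };

for (var i = 1; i <= 3; i++) {
  config.values = [i]; // set the values where you would normally "execute"
  client.query(config, function (err, result) {
    if (err) { return console.error(err); }
    console.log(result.rows);
  });
}

The first call parses, binds, and executes; the later iterations reuse the already-parsed statement and only rebind and execute.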