Just wondering if anyone could enlighten me as to what sth might be. (Seen in Tranalyzer flow files.) Basically it's a web analysis category (IP address, port, sth, etc.), but I'm not sure what is meant by it and there is no mention of it in the documentation.
(Also, for bonus points, what would a value of dir mean for sth?)
I'd appreciate any help.
sth means: STatement Handle.
It is the handle for a prepared SQL statement on a database connection.
See http://perlmeme.org/tutorials/connect_to_db.html and https://stackoverflow.com/a/13208866/465183
Edit:
In Perl, if I display the contents of the object using Data::Dumper in a DBI script:
$VAR1 = bless( {}, 'DBI::st' );
but that's not very helpful; it only means that it is a DBI::st object.
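For context, here is a minimal DBI sketch showing the naming convention (the connection string and table are made up):

use strict;
use warnings;
use DBI;

# $dbh is the database handle (the connection); $sth is the statement handle, hence "sth"
my $dbh = DBI->connect("dbi:SQLite:dbname=example.db", "", "", { RaiseError => 1 });
my $sth = $dbh->prepare("SELECT name FROM users WHERE id = ?");
$sth->execute(42);
while (my ($name) = $sth->fetchrow_array) {
    print "$name\n";
}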
Please forgive me, I don't even know if what I am asking is the correct terminology.
So...here goes.
I built a custom NodeMCU DALI firmware (from Hackerspace Stuttgart). It includes a DALI lighting control "flavour", as they call it. I had to modify it to work with the most recent Lua version. Anyway, that works, and the module is built into the firmware.
From the Lua command line / interpreter (ESPlorer interface), I can call the module and it all works fine.
To use the module you enter:
dali.arc(address_Mode,0,parameter)
or
dali.send(Address_Mode,Command,Address,parameter)
Address_Mode can be dali.SLAVE or dali.GROUP.
Command can be dali.UP_200MS, dali.IMMEDIATE_OFF, dali.GO_TO_SCENE... about 50 commands.
An example command to send light level 128 to all drivers would be as follows:
dali.arc(dali.BROADCAST,0,128) -- direct arc mode ( all lights,*ignored*,50% dimmed)
I want to use MQTT to control this thing.
I could use MQTT topics:
dali_topic/arc_broadcast -- for dali.arc(dali.BROADCAST,var1,var2)
dali_topic/group -- for dali.arc(dali.GROUP,var1,var2)
dali_topic/slave -- for dali.arc(dali.SLAVE,var1,var2)
and my payload string would only have to contain 2 variables, comma-separated, e.g. 0,128.
This I can do all day long, but now I want to make it "better"...
I want to instead send the message "dali.BROADCAST,0,128", which the code should then split into a table with elements:
table[1] = dali.BROADCAST
table[2] = 0
table[3] = 128
and call dali.arc(table[1],table[2],table[3])
The table creation works, but I cannot get dali.BROADCAST passed into the module/function call: first because it is a string, and second because it cannot be converted to a number or whatever substitute is required.
If this can be done, then the Command field could also be sent with the MQTT payload rather than needing 50 MQTT topics.
I suppose I could also just try a lot of if statements or search a lookup table, but perhaps there is a simple way to just insert the command field into the function/module call?
Any assistance greatly appreciated.
Edit: here is some Lua output:
table ={"dali.BROADCAST",0,128}
dali.arc(table[1],table[2],table[3])
result:
Lua error: stdin:1: bad argument #1 to 'arc' (number expected, got boolean)
since if you print("dali.BROADCAST") you get nil
However
table[4] = dali.BROADCAST
dali.arc(table[4],table[2],table[3])
result works fine.
print( type(dali.BROADCAST ))
gives number
So how do I take my MQTT string, which is received as "dali.BROADCAST", and convert it to just dali.BROADCAST?
Note that I am not sending the quote marks; the message is, however, sent by MQTT as a CSV string.
From the firmware source for the dali module, in the modules folder (dali.c):
LROT_BEGIN(dali, NULL, 0)
LROT_FUNCENTRY( setup, dali_setup )
LROT_FUNCENTRY( arc, dali_arc )
LROT_FUNCENTRY( send, dali_send )
LROT_NUMENTRY( BROADCAST, BROADCAST )
LROT_NUMENTRY( SLAVE, SLAVE )
LROT_NUMENTRY( GROUP, GROUP )
LROT_NUMENTRY( IMMEDIATE_OFF, DALI_IMMEDIATE_OFF)
LROT_NUMENTRY( GO_TO_SCENE, DALI_GO_TO_SCENE)
The shackspace GitHub link is the correct one; it is simply based on Lua 1.45 or something low like that. I only had to modify dali.c in the modules folder to work with the latest Lua.
The relevant dali files in the firmware are located in the
app/modules
app/include
app/dali
folders
EDIT: Thinking about it, you probably always end up indexing dali, in which case you can do so directly by just structuring your table like this:
table[1] = "arc"
table[2] = "BROADCAST"
table[3] = 0
table[4] = 128
This way you can get to dali.BROADCAST by doing dali[table[2]] and to dali.arc by doing dali[table[1]].
HINT: You should probably still keep a whitelist of what is allowed where because someone could send any string and your program shouldn't just blindly index the dali table with that and return it.
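For example, a rough sketch of what the MQTT handler could look like (assuming m is your mqtt client, and that the payload drops the dali. prefix as in the table layout above, arriving as something like "BROADCAST,0,128"):

-- whitelist of address modes we are willing to resolve on the dali table
local modes = { BROADCAST = dali.BROADCAST, SLAVE = dali.SLAVE, GROUP = dali.GROUP }

m:on("message", function(client, topic, payload)
    -- payload like "BROADCAST,0,128"
    local mode, a, b = payload:match("^(%w+),(%d+),(%d+)$")
    if mode and modes[mode] then
        dali.arc(modes[mode], tonumber(a), tonumber(b))
    else
        print("rejected payload: " .. tostring(payload))
    end
end)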
You probably want something like this
Here's the relevant code:
function deepindex(tab, path)
    if type(path) ~= "string" then
        return nil, "path is not a string"
    end
    local index, rest = path:match("^%.?([%a%d]+)(.*)")
    if not index then
        index, rest = path:match("^%[(%d+)%](.*)")
        index = tonumber(index)
    end
    if index then
        if #rest > 0 then
            if tab[index] then
                return deepindex(tab[index], rest)
            else
                return nil, "full path not present in table"
            end
        else
            return tab[index]
        end
    else
        return nil, "malformed index-path string"
    end
end
Homework: this function also works with [] indexing for numbers, which you don't need. It should be easy to simplify the function to only do string indexing with the dot.
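For reference, a dot-only variant could look roughly like this (untested sketch):

-- resolves "dali.BROADCAST"-style paths; no [n] support
local function dotindex(tab, path)
    if type(path) ~= "string" then
        return nil, "path is not a string"
    end
    for key in path:gmatch("[^%.]+") do
        if type(tab) ~= "table" then
            return nil, "full path not present in table"
        end
        tab = tab[key]
    end
    return tab
end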
You would use that on the global environment to index it with a single string:
deepindex(_G, "dali.BROADCAST")
-- Which is the same as
_G.dali.BROADCAST
-- And, unless dali is a local, also
dali.BROADCAST
Keep in mind, though, that this lets you remote-index _G with anything, which is a huge security nightmare. Better to do this:
local whitelist = {}
whitelist.dali = dali
deepindex(whitelist, "dali.BROADCAST") -- this works
deepindex(whitelist, "some.evil.submodule") -- This does nothing
I was looking for a wiki entry with more details, but I found none. If you happen to know where there is documentation that specifies more about the available Lua commands, please include a link in your question.
It appears that there may be other functions to approach the same outcome, but I'm not certain how they are worded in your particular build.
https://github.com/shackspace/nodemcu-firmware-dali/blob/master/app/dali/dali_encode.c
What's returned when you print( type( dali.BROADCAST )) ?
I was guessing it might be raw C userdata, the specific case-switch for that arc command; however, I just found a similar Lua project that lists it as hexadecimal 255:
https://github.com/a-lurker/Vera-Plugin-DALI-Planet/blob/master/Luup_device/L_DaliPlanet1.lua
Yea, it's likely just sending hexadecimal numbers.
https://en.wikipedia.org/wiki/Digital_Addressable_Lighting_Interface
try sending dali.arc( 0xFF, 0x00, 0x80 )
or dali.arc( 0xFE, 0x80 )
They make it sound like '1111 1110' (0xFE) is directly followed by the brightness value, so that second command might light them up.
I'm not sure why it doesn't appear to be sending the correct codes when you place them in a table. What you've written appears to be correct, but it's likely a one-way broadcast, so you don't receive back any error messages...
If you can't get the arc command to work with tables, possibly you'll have better luck with the dali.send() command. Might just be a flaw in that app. If you can't get it resolved, submit a bug report to their GitHub page.
I have an array of usernames (e.g. ['abc','def','ghi']) to be added under the 'user' label in the graph.
Now I first want to check whether a username already exists (g.V().hasLabel('user').has('username','def')) and then add only those usernames that don't already exist under the 'user' label.
Also, can this be done in a single gremlin query or groovy script?
I am using the Titan graph database, TinkerPop 3, and the Gremlin REST server.
With "scripts" you can always pass a multi-line/command script to the server for processing to get what you want done. This question is then answered with normal programming techniques using variables, if/then statements, etc:
t = g.V().has('person','name','bill')
t.hasNext() ? t.next() : g.addV('person').property('name','bill').next()
or perhaps:
g.V().has('person','name','bill').tryNext().orElseGet{
g.addV('person').property('name','bill').next()}
But these are groovy scripts and ultimately TinkerPop recommends avoiding scripts and closures in favor of a pure traversal. The general way to handle a "get or create" in a single traversal is to do something like this:
gremlin> g.V().has('person','name','bill').fold().
......1> coalesce(unfold(),
......2> addV('person').property('name','bill'))
==>v[18]
Also see this StackOverflow question for more information on upsert/"get or create" patterns.
You can do it directly using:
g.V().has('user','username','def').fold().coalesce(unfold(),addV('user').property('username','def'))
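If you need to cover the whole array of usernames in one submission, a simple script-based sketch (assuming the list is passed in as a binding called names) would be:

// names is assumed to be bound to something like ['abc','def','ghi']
names.each { n ->
    g.V().has('user','username', n).fold().
      coalesce(unfold(),
               addV('user').property('username', n)).next()
}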
Just adding to this answer: it is better to use the following idempotent query. coalesce() works like an if-else statement. Refer to https://spin.atomicobject.com/2021/08/10/idempotent-queries-in-gremlin/ for more information. Also, if you notice that the entry is not being saved, make sure you actually execute the traversal with .next() so the changes are committed.
g.V().hasLabel('user').has('username','def')
  .fold()
  .coalesce(
    __.unfold(),
    __.addV('user').property('username','def')
  )
  .next()
I am using the following code for a partial update:
POST /website/blog/1/_update
{
"script" : "ctx._source.views+=1"
}
Is there any alternative way I can achieve the same thing? I don't want to change anything related to Groovy scripting, because the last time I changed those settings my server was compromised.
So please help me with a solution, or with some security measures if there is no workaround.
No, you cannot dynamically change a field value without using a script.
You can use file-based scripts though, which means that you can disable dynamic scripting (default in ES 1.4.3+) while still using scripting in a safe, trusted way.
config/
  elasticsearch.yml
  logging.yml
  scripts/
    your_custom_script.groovy
You could have the script contain:
ctx._source.views += your_param
Once stored, you can then access the script by name, which bypasses dynamic scripting.
POST /website/blog/1/_update
{
"script": "your_custom_script",
"params" : {
"your_param" : 1
}
}
Depending on the version of Elasticsearch, the script parameter is better named (e.g., ES 2.0 uses "inline" for dynamic scripts), but this should get you off the ground.
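For example, on ES 2.x the update body nests the script fields, so the same file-based call would look roughly like this (untested sketch; same hypothetical script name as above):

POST /website/blog/1/_update
{
  "script": {
    "file": "your_custom_script",
    "params": {
      "your_param": 1
    }
  }
}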
I am using WebSockets, Node.js v0.10.12, and PostgreSQL 9.1 with PostGIS 2.0.
On the server side, in order to gather textual data and send it to the client over the WebSocket, I perform a query using Node's pg module.
I have something like
var query = client.query('SELECT p_name,p_date FROM pins WHERE p_id =' + ja);
// send them and render in client as html
query.on("row", function (row, result) { result.addRow(row); });
query.on("end", function (result) {
  for (var i = 0; i < result.rows.length; i++) {
    connection.send(
      'Name</br>'
      + result.rows[i].p_name +
      '</br>Date</br>'
      + result.rows[i].p_date +
      '</br>'
    );
  }
  client.end();
});
Now, here is the tricky part. I want to render the date like 25/02/2012.
With the above code, I get Sat Feb 02 2002 02:00:00 GMT+0200 (Χειμερινή ώρα GTB)
To get DD/MM/YYYY I have to put a line of code like
SET datestyle = "SQL, DMY";
This is apparently PHP and I am using Javascript because I work with websockets.
The only thing I could think of is editing the above query like so
var query = client.query('SET datestyle = "SQL, DMY"; SELECT p_name,p_date FROM pins WHERE p_id ='+ja)
I don't get any errors, but on the client the date renders as null.
How can I fix this?
Thanks
OK. Where to start?
This:
var query = client.query('SELECT p_name,p_date FROM pins WHERE p_id ='+ja)
is not the correct way to build a query. Use a parameterised query and protect yourself from SQL injection.
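With the pg module that looks something like this (sketch; $1 is the placeholder and the values array supplies ja):

// the driver handles quoting/escaping of ja for you
var query = client.query(
  'SELECT p_name, p_date FROM pins WHERE p_id = $1',
  [ja]
);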
SET datestyle = "SQL, DMY";
This is apparently PHP and I am using Javascript because I work with websockets.
What? I'm trying to think of something constructive about this sentence, but the best I can think of is "What?". It is far from apparent that the above is PHP, because it isn't. The fact that you are sending it to the database ought to give you a hint that it's SQL. Also, you're not using javascript because you work with websockets. You're using javascript because you're using javascript - websockets are nothing to do with anything here.
The only thing I could think of...
Doesn't include looking in the manuals.
Go to the PostgreSQL website, click through to the documentation and manuals, and on the contents page click "Functions and Operators" and then "Data type formatting functions". Here is the link for you:
http://www.postgresql.org/docs/current/static/functions-formatting.html
You'll notice that the PostgreSQL developers not only produce extensive and detailed manuals, but they keep multiple versions online and make it simple to switch back and fore to see what's changed.
There is a whole section on this page on how to format date-times in different ways, with clear descriptions of each effect. I didn't find this using the documentation search or anything clever like that - just the obvious links on each page.
If you did a search you would find plenty on the datestyle parameter, and a little further digging would show that you can set it per-session or as a default for a given user or database.
Finally though, don't do it that way at all. Return ISO-standard date formats like #mu said (YYYY-MM-DD etc.) and format them in your JavaScript client code.
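For example, something along these lines (sketch, assuming p_date comes back from the pg driver as a JavaScript Date):

// format a Date as DD/MM/YYYY
function formatDate(d) {
  var dd = ('0' + d.getDate()).slice(-2);
  var mm = ('0' + (d.getMonth() + 1)).slice(-2);
  return dd + '/' + mm + '/' + d.getFullYear();
}
// e.g. formatDate(result.rows[i].p_date) -> "25/02/2012"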
Oh - while I'm no expert, I'm not sure that </br> is valid HTML, XHTML or XML either. Did you perhaps mean <br/>?
A VERY nice-to-have would be if I could edit object literals in this editor's text field instead of strict JSON expressions.
If I could replace the JSON parse with a simple eval - it will make editing sooooo much easier! (and help me design document structures for my projects soooo much more easily)
I mean, gosh!! it's not a protocol school, it's an editor's tool.
The goal of the tool is not to teach me the protocol and comment me on every petty mistake, but to help me design documents for the software.
Why must it insist on strict JSON? Can't it live with object literals, and do for us the
JSON.stringify( eval("(" + editor_textarea.value + ")") )
Wouldn't that be cool? LOL :D
(yea yea, catching errors and feeding back to the user)
(And for whoever missed the difference: it is mainly in the quote marks around attribute names.
The dry, strict JSON protocol requires quote marks ALWAYS, no questions asked, whereas a JS object literal requires quote marks only for attribute names that are not legal JS identifiers, and also accepts numbers without quotation marks.)
Strict dry JSON:
{ "attribute" : "value"
, "mapmap" :
{ "map" :
{ "attr" : "sdss"
, "123" : "ss32332"
, "val" : 23323
, "456" : "ss32332"
}
}
}
Object Literal
{ attribute: "value"
, mapmap :
{ map :
{ attr : "sdss"
, 123 : "ss32332"
, val : 23323
, 456 : "ss32332"
}
}
}
Well, it won't save me from missing commas or mismatched brackets, but it does make life easier, since quote marks are a big part of the scaffolding.
If you can point me to where I can change this, even as a patch on Futon, I'll be soooOOO grateful :)
Maybe later we can integrate an editor helper there, such as the cool one in the GitHub source editor or the one in JSFiddle, that helps you indent and colour things nicely.
But let's start with a simple eval.
It will make life easier... :)
It can also let me generate complicated documents using JS code without any additional test software...
Happy coding :)
P.S.
If you know the answer here, you might know the answer to this question:
couchdb futon document editor - can I customize the indentation rules?
I had a quick browse, and I believe this is where you will want to add your eval:
https://github.com/apache/couchdb/blob/master/share/www/script/futon.browse.js#L911
and here:
https://github.com/apache/couchdb/blob/master/share/www/script/futon.browse.js#L902
You can edit share/www/script/futon.browse.js in your local CouchDB instance if you want to see live changes.
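Untested, but the kind of change you would be making at those spots is roughly this (the function name here is made up, not Futon's actual code):

// accept a JS object literal, evaluate it, and hand strict JSON back to Futon
function looseParse(text) {
  try {
    var value = eval("(" + text + ")"); // the "simple eval" from the question
    return JSON.stringify(value, null, 2);
  } catch (err) {
    alert("Could not parse document: " + err.message);
    return null;
  }
}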