Sending a list as POST requests - python-3.x

With Python I am trying to debug this PHP code (which sits inside a lot of other code that would clutter the question):
$sql = rtrim("INSERT INTO someTable (foo, bar, id) VALUES " . str_repeat("(?, ?, $id),", count($contents)), ',');
$stmt = $dbh->prepare($sql);
foreach ($contents as $i => $line) {
    $stmt->bindParam(2*$i+1, $line['foo'], PDO::PARAM_STR);
    $stmt->bindParam(2*$i+2, $line['bar'], PDO::PARAM_STR);
    echo $line['foo'] . '->' . $line['bar'] . "\n";
    if (!$stmt->execute()) {
        $dbh->rollBack();
        return false;
    }
}
$dbh->commit();
return true;
For this debugging I made this piece of Python code:
import requests

s = requests.Session()
content = [
    {'foo': 'f', 'bar': 'b'},
    {'foo': 'o', 'bar': 'a'},
    {'foo': 'o', 'bar': 'r'}
]
s.post("http://localhost:8000", data={'content': content})
This failed to add the data to my database, so I tried a little debugging with
print_r($_POST)
which returned
Array
(
    [someOtherVariables] => otherValues
    [content] => foo
)
I had expected content to contain an array of objects, yet it somehow contains only the name of the first key.
Does anyone know why this Python code is not sending the full list, and how I could modify it so that it does?
I have also tried sending the data with the json argument instead of data, s.post("http://localhost:8000", json={'content': content}), but that does not seem to solve my problem either.

I think it could be that you need to convert your Python object to a JSON string before sending it via requests.
import json

# a Python object (dict):
x = {
    "name": "John",
    "age": 30,
    "city": "New York"
}

# convert into JSON:
y = json.dumps(x)
In your case:
contentAsJsonString = json.dumps(content)
s.post("http://localhost:8000", data={'content': contentAsJsonString})

Related

firestore python do not overwrite data

I have this in a collection:
resul = {
    'name': 'bob',
    'position': 'ceo',
}
When I run the Python code again with a different position value and some new data, it overwrites the position, and I don't want it to.
Result I want:
resul = {
    'name': 'bob',
    'position': 'ceo',
    'age': 25,
}
Result I am getting:
resul = {
    'name': 'bob',
    'position': 'co-founder',
    'age': 25,
}
Code I am using:
info_dic = {}
info_dic.update(resul)
infoREferenc = db.collection(u'test').document(u'bob')
infoREferenc.set({"general": info_dic}, merge=True)
If the field (e.g. position, age, name) already exists in your map (general), the write will just update that field's value. If not, it will add the field with its value. I suggest adding a new field in your map for the additional position, for example:
info_dic = {
    u'name': u'bob',
    u'position': u'ceo',
    u'secondary-position': u'co-founder',
    u'age': 26
}
infoREferenc = db.collection(u'test').document(u'bob')
infoREferenc.set({'general': info_dic}, merge=True)
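If the goal is only to add the new age field without touching the existing position, it should also be enough to send just the new field in a merge write. A minimal sketch, assuming the google-cloud-firestore client and the same collection/document names as above:
from google.cloud import firestore

db = firestore.Client()
doc_ref = db.collection(u'test').document(u'bob')

# With merge=True only the fields present in the payload are written; the
# existing 'name' and 'position' entries in the 'general' map stay as they are.
doc_ref.set({u'general': {u'age': 25}}, merge=True)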

How do I get a value from a json dict that's not constant?

I'm trying to write an automation script that needs to get the values from the output below. The problem is that CValue does not have a constant length; it can hold anywhere from 1 to x values. Is there a way I can store each value properly?
{
    'Output': {
        'Name': 'Sample',
        'Version': {
            'Errors': [],
            'VersionNumber': 2,
            'AValue': 'Hello',
            'BValue': ['val:val:BVal'],
            'CValue': [{
                'DValue': 'aaaaa-bbbbb-cccc',
                'Name': 'Sample_Name_1'
            }, {
                'DValue': 'aaaaa-bbbbb-ddddd',
                'Name': 'Sample_Name_2'
            }]
        }
    },
    'RequestId': 'eeeee-fffff-gggg'
}
Right now, I'm doing it in the most inefficient way, storing each value separately. My code looks something like this:
def get_sample_values():
    test_get = command.sdk(xxxx)
    dset_1 = test_get['Output']['Version']['CValue'][0]['DValue']
    dset_2 = test_get['Output']['Version']['CValue'][1]['DValue']
    return dset_1, dset_2
It works, but it's limited to only two dsets. Can you please provide input on how I can do this more efficiently?
The use case is this: I need the DValues for another function that requires them. The format for that request is going to be something like:
Source = {
    'SourceReference': {
        'DataReference': [
            {
                'EValue': 'string, string, string',
                'FValue': DValue1
            },
            {
                'EValue': 'string, string, string',
                'FValue': DValue2
            }
        ]
    }
}
Use a list comprehension to create a list constructed from the desired element of the CValue dicts, then return the list.
return [x['DValue'] for x in test_get['Output']['Version']['CValue']]
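If you then need to plug those DValues into the request structure from the question, here is a short follow-up sketch (build_source is just an illustrative helper; the field names mirror the example above):
def build_source(d_values):
    # One DataReference entry per DValue returned above.
    return {
        'SourceReference': {
            'DataReference': [
                {'EValue': 'string, string, string', 'FValue': d}
                for d in d_values
            ]
        }
    }

d_values = get_sample_values()  # e.g. ['aaaaa-bbbbb-cccc', 'aaaaa-bbbbb-ddddd']
Source = build_source(d_values)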
Does this work for you?
# either return the list directly
def get_sample_values():
    test_get = command.sdk(xxxx)
    return test_get['Output']['Version']['CValue']

# or a generator - maybe not that useful here, but possible
def get_sample_values():
    test_get = command.sdk(xxxx)
    yield from test_get['Output']['Version']['CValue']

# Then you can use it
for value in get_sample_values():
    print(value)

# or index into the returned list
values = get_sample_values()
print(values[3])

# for the generator
values = list(get_sample_values())
print(values[3])
For more information, see https://realpython.com/introduction-to-python-generators/

Remove hyphens from keys in deeply nested map

I posted this question on the Groovy mailing list, but I've not yet gotten an answer. I was wondering if someone can help here. I am re-posting relevant text from my original question.
I have a nested input JSON that is read via a JsonSlurper, and some of the keys have hyphens in them. I need to replace those hyphenated keys with underscored versions and convert the result back to JSON for downstream processing. I looked at the JsonGenerator.Options documentation and could not find anything for this specific requirement.
I also looked through options for iterating over the Map that JsonSlurper produces, but unfortunately I'm not able to find an effective solution that iterates through a nested Map, changes the keys, and produces another Map that could be converted to a JSON string.
Example Code
import groovy.json.*

// This json can be nested many levels deep
def inputJson = """{
    "database-servers": {
        "dc-1": [
            "server1",
            "server2"
        ]
    },
    "discovery-servers": {
        "dc-3": [
            "discovery-server1",
            "discovery-server2"
        ]
    }
}
"""
I need to convert the above to JSON that looks like the example below. I can iterate through and convert using the collectEntries method, but that only works on the first level; I need to do it recursively, since the input JSON can be nested many levels deep.
{
    "database_servers": {
        "dc_1": [
            "server1",
            "server2"
        ]
    },
    "discovery_servers": {
        "dc_3": [
            "discovery-server1",
            "discovery-server2"
        ]
    }
}
Seems like you just need a recursive method to process the slurped Map and its sub-Maps.
import groovy.json.JsonSlurper

JsonSlurper slurper = new JsonSlurper()
def jsonmap = slurper.parseText( inputJson )

Map recurseMap( def inputMap ) {
    return inputMap.collectEntries { key, val ->
        String newkey = key.replace( "-", "_" )
        if ( val instanceof Map ) {
            return [ newkey, recurseMap( val ) ]
        }
        return [ newkey, val ]
    }
}

def retmap = recurseMap( jsonmap )
println retmap // at this point you can output this however you like
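For comparison, if you ever need the same transformation outside Groovy, the recursion is just as short in Python; this sketch also descends into lists, in case maps appear inside arrays:
import json

def rename_keys(obj):
    # Replace '-' with '_' in every dict key, recursing into nested dicts and lists.
    if isinstance(obj, dict):
        return {k.replace('-', '_'): rename_keys(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [rename_keys(v) for v in obj]
    return obj

input_json = '{"database-servers": {"dc-1": ["server1", "server2"]}}'
print(json.dumps(rename_keys(json.loads(input_json)), indent=2))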

node.js - Is there any proper way to parse JSON with large numbers? (long, bigint, int64)

When I parse this little piece of JSON:
{ "value" : 9223372036854775807 }
This is what I get:
{ value: 9223372036854776000 }
Is there any way to parse it properly?
Not with the built-in JSON.parse. You'll need to parse it manually and treat the values as strings (if you want to do arithmetic with them, there is bignumber.js). You can use Douglas Crockford's JSON.js library as a base for your parser.
EDIT 2 (7 years after the original answer): it might soon be possible to solve this using the standard JSON API. Have a look at this TC39 proposal to add access to the source string from a reviver function: https://github.com/tc39/proposal-json-parse-with-source
EDIT 1: I created a package for you :)
var JSONbig = require('json-bigint');

var json = '{ "value" : 9223372036854775807, "v2": 123 }';
console.log('Input:', json);
console.log('');

console.log('node.js built-in JSON:');
var r = JSON.parse(json);
console.log('JSON.parse(input).value : ', r.value.toString());
console.log('JSON.stringify(JSON.parse(input)):', JSON.stringify(r));

console.log('\n\nbig number JSON:');
var r1 = JSONbig.parse(json);
console.log('JSON.parse(input).value : ', r1.value.toString());
console.log('JSON.stringify(JSON.parse(input)):', JSONbig.stringify(r1));
Output:
Input: { "value" : 9223372036854775807, "v2": 123 }

node.js built-in JSON:
JSON.parse(input).value : 9223372036854776000
JSON.stringify(JSON.parse(input)): {"value":9223372036854776000,"v2":123}

big number JSON:
JSON.parse(input).value : 9223372036854775807
JSON.stringify(JSON.parse(input)): {"value":9223372036854775807,"v2":123}
After searching for something cleaner and finding only libs like json-bigint, I just wrote my own solution. It's not the best, but it solves my problem. For those using Axios, you can use it in the transformResponse callback (this was my original problem: Axios parses the JSON and all the BigInts come out wrong).
const jsonStr = `{"myBigInt":6028792033986383748, "someStr":"hello guys", "someNumber":123}`

const result = JSON.parse(jsonStr, (key, value) => {
    if (typeof value === 'number' && !Number.isSafeInteger(value)) {
        // get the original value from the raw string using a regular expression
        let strBig = jsonStr.match(new RegExp(`(?:"${key}":)(.*?)(?:,)`))[1]
        return strBig // should be BigInt(strBig) - the BigInt function is not working in this snippet
    }
    return value
})

console.log({
    "original": JSON.parse(jsonStr),
    "handled": result
})
A regular expression is difficult to get right for all cases.
Here is my attempt, but all I'm giving you is some extra test cases, not a solution. Likely you will want to target a very specific attribute, or use a more generic JSON parser (one that separates out the properties but leaves the numeric properties as strings), so that you can wrap that specific long number in quotes before continuing to parse into a JavaScript object.
let str = '{ "value" : -9223372036854775807, "value1" : "100", "strWNum": "Hi world: 42 is the answer", "arrayOfStrWNum": [":42, again.", "SOIs#1"], "arrayOfNum": [100,100,-9223372036854775807, 100, 42, 0, -1, 0.003] }'
let data = JSON.parse(str.replace(/([:][\s]*)(-?\d{1,90})([\s]*[\r\n,\}])/g, '$1"$2"$3'));
console.log(BigInt(data.value).toString());
console.log(data);
You can use this code to change big numbers to strings, and later use BigInt(data.value):
let str = '{ "value" : -9223372036854775807, "value1" : "100" }'
let data = JSON.parse(str.replace(/([^"^\d])(-?\d{1,90})([^"^\d])/g, '$1"$2"$3'));
console.log(BigInt(data.value).toString());
console.log(data);

Multiple key search in CouchDB

Given the following object structure:
{
    key1: "...",
    key2: "...",
    data: "..."
}
Is there any way to get this object from CouchDB by querying both key1 and key2, without setting up two different views (one for each key), like:
select * from ... where key1=123 or key2=123
Kind regards,
Artjom
Edit:
Here is a better description of the problem:
The object described above is a serialized game state. A game has exactly one creator user (key1) and his opponent (key2). For a given user I would like to get all games where he is involved (both as creator and opponent).
Emit both keys (or only one if equal):
function(doc) {
    if (doc.hasOwnProperty('key1')) {
        emit(doc.key1, 1);
    }
    if (doc.hasOwnProperty('key2') && doc.key1 !== doc.key2) {
        emit(doc.key2, 1);
    }
}
Query with (properly url-encoded):
?include_docs=true&key=123
or with multiple values:
?include_docs=true&keys=[123,567,...]
UPDATE: updated to query multiple values with a single query.
You could create a CouchDB view which produces output such as:
["key1", 111],
["key1", 123],
["key2", 111],
["key2", 123],
etc.
It is very simple to write a map view in javascript:
function(doc) {
    emit(["key1", doc["key1"]], null);
    emit(["key2", doc["key2"]], null);
}
When querying, you can query using multiple keys:
{"keys": [["key1", 123], ["key2", 123]]}
You can send that JSON as the data in a POST to the view. Or preferably use an API for your programming language. The results of this query will be each row in the view that matches either key. So, every document which matches on both key1 and key2 will return two rows in the view results.
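For example, from Python you could POST that keys document straight to the view with requests; the database and design-document names below are placeholders:
import requests

view_url = 'http://localhost:5984/games/_design/games/_view/by_key'
payload = {'keys': [['key1', 123], ['key2', 123]]}

# POSTing the keys body is equivalent to a GET with ?keys=..., but avoids URL-length limits.
resp = requests.post(view_url, json=payload, params={'include_docs': 'true'})
resp.raise_for_status()

# Each matching emit produces one row; include_docs attaches the full document.
docs = [row['doc'] for row in resp.json()['rows']]
print(docs)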
I was also struggling with a similar question: how to use
"select * from ... where key1=123 or key2=123".
The following view would allow you to look up customer documents by the LastName or FirstName fields:
function(doc) {
    if (doc.Type == "customer") {
        emit(doc.LastName, {FirstName: doc.FirstName, Address: doc.Address});
        emit(doc.FirstName, {LastName: doc.LastName, Address: doc.Address});
    }
}
I am using this for a web service that queries all my docs and returns every doc that matches both the existence of a node and the query. In this example I am using the node 'detail' for the search. If you would like to search a different node, you need to specify it.
This is my first Stack Overflow post, so I hope I can help someone out :)
***Python Code
import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web
import httplib, json

from tornado.options import define, options

define("port", default=8000, help="run on the given port", type=int)

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        db_host = 'YOUR_COUCHDB_SERVER'
        db_port = 5984
        db_name = 'YOUR_COUCHDB_DATABASE'

        node = self.get_argument('node', None)
        query = self.get_argument('query', None)

        cleared = None
        cleared = 1 if node else self.write('You have not supplied an object node.<br>')
        cleared = 2 if query else self.write('You have not supplied a query string.<br>')

        if cleared == 2:
            uri = ''.join(['/', db_name, '/', '_design/keysearch/_view/' + node + '/?startkey="' + query + '"&endkey="' + query + '\u9999"'])
            connection = httplib.HTTPConnection(db_host, db_port)
            headers = {"Accept": "application/json"}
            connection.request("GET", uri, None, headers)
            response = connection.getresponse()
            self.write(json.dumps(json.loads(response.read()), sort_keys=True, indent=4))

class Application(tornado.web.Application):
    def __init__(self):
        handlers = [
            (r"/", MainHandler)
        ]
        settings = dict(
            debug = True
        )
        tornado.web.Application.__init__(self, handlers, **settings)

def main():
    tornado.options.parse_command_line()
    http_server = tornado.httpserver.HTTPServer(Application())
    http_server.listen(options.port)
    tornado.ioloop.IOLoop.instance().start()

if __name__ == '__main__':
    main()
***CouchDB Design View
{
    "_id": "_design/keysearch",
    "language": "javascript",
    "views": {
        "detail": {
            "map": "function(doc) { var docs = doc['detail'].match(/[A-Za-z0-9]+/g); if(docs) { for(var each in docs) { emit(docs[each], doc); } } }"
        }
    }
}
