Get value in Nested Dictionary Python Odoo - python-3.x

I have a problem again. It is related to my previous question about cron. I've got a JSON value and I want to insert it into the database. I need help getting the values out of this nested dict. Please help!
JSON
{'folders': [{'id': 94, 'name': 'Retargeting January 2021', 'totalBlacklisted': 606, 'uniqueSubscribers': 19988, 'totalSubscribers': 19382},
{'id': 90, 'name': 'Leads', 'totalBlacklisted': 0, 'uniqueSubscribers': 0, 'totalSubscribers': 0},
{'id': 84, 'name': 'Retargeting Year End', 'totalBlacklisted': 1367, 'uniqueSubscribers': 18847, 'totalSubscribers': 17480},
{'id': 79, 'name': 'CRM Folder', 'totalBlacklisted': 0, 'uniqueSubscribers': 3, 'totalSubscribers': 3},
{'id': 56, 'name': 'Curioo P', 'totalBlacklisted': 282, 'uniqueSubscribers': 3279, 'totalSubscribers': 2997}]}
Python
res = simplejson.loads(response.text)
self.env['get.folders'].create({
    'id': self.id,
    'name': res['name'],
    'email_blacklist': res['totalBlacklisted'],
    'email_subscribers': res['totalSubscribers'],
    'unique_subscribers': res['uniqueSubscribers'],
    'foldersId': res['id'],
})
EDIT
At last it works. I spelled the values out inside the loop; it works this way because the folders are a list under the 'folders' key, so each field has to be read from the individual folder dict. Thanks @Jack Dane for your help.
for folder in folders.get("folders"):
    names = folder['name']
    ids = folder['id']
    blacklist = folder['totalBlacklisted']
    subscribe = folder['totalSubscribers']
    unique = folder['uniqueSubscribers']
    self.env['sendinblue.get_folders'].create({
        # 'id': folder['id'],
        'name_folder': names,
        'email_blacklist': blacklist,
        'email_subscribers': subscribe,
        'unique_subscribers': unique,
        'foldersId': ids,
    })

You can loop through the folders with a for loop and call the create function for each one:
folders = {'folders': [{'id': 94, 'name': 'Retargeting January 2021', 'totalBlacklisted': 606, 'uniqueSubscribers': 19988, 'totalSubscribers': 19382},
                       {'id': 90, 'name': 'Leads', 'totalBlacklisted': 0, 'uniqueSubscribers': 0, 'totalSubscribers': 0},
                       {'id': 84, 'name': 'Retargeting Year End', 'totalBlacklisted': 1367, 'uniqueSubscribers': 18847, 'totalSubscribers': 17480},
                       {'id': 79, 'name': 'CRM Folder', 'totalBlacklisted': 0, 'uniqueSubscribers': 3, 'totalSubscribers': 3},
                       {'id': 56, 'name': 'Curioo P', 'totalBlacklisted': 282, 'uniqueSubscribers': 3279, 'totalSubscribers': 2997}]}

for folder in folders.get("folders"):
    self.env['get.folders'].create({
        'id': self.id,
        'name': folder['name'],
        'email_blacklist': folder['totalBlacklisted'],
        'email_subscribers': folder['totalSubscribers'],
        'unique_subscribers': folder['uniqueSubscribers'],
        'foldersId': folder['id'],
    })
In my case, I have used folders as the variable holding the decoded JSON response.
If you need any clarification, let me know.
Thanks.
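As a side note, recent Odoo versions (Odoo 12 and later) allow create() to take a list of value dicts, so the loop can be collapsed into one batched call. A minimal sketch, assuming the model and field names from the question's EDIT:
folders = simplejson.loads(response.text)

# one vals dict per folder; create() inserts them all in a single call
vals_list = [{
    'name_folder': folder['name'],
    'email_blacklist': folder['totalBlacklisted'],
    'email_subscribers': folder['totalSubscribers'],
    'unique_subscribers': folder['uniqueSubscribers'],
    'foldersId': folder['id'],
} for folder in folders.get('folders', [])]

self.env['sendinblue.get_folders'].create(vals_list)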

Related

Python Altair, query for current axis limits

I know how to set axis limits and whatnot, but how do I query for currently used axis limits?
I don't think this is possible unless you specifically set an axis limit first:
import altair as alt
from vega_datasets import data
source = data.cars.url
chart = alt.Chart(source).mark_circle().encode(
    x='Horsepower:Q',
    y='Miles_per_Gallon:Q',
)
chart.to_dict()
{'config': {'view': {'continuousWidth': 400, 'continuousHeight': 300}},
'data': {'url': 'https://cdn.jsdelivr.net/npm/vega-datasets#v1.29.0/data/cars.json'},
'mark': 'circle',
'encoding': {'x': {'field': 'Horsepower', 'type': 'quantitative'},
'y': {'field': 'Miles_per_Gallon', 'type': 'quantitative'}},
'$schema': 'https://vega.github.io/schema/vega-lite/v5.2.0.json'}
If you set the domain, you can see it in the spec:
chart = alt.Chart(source).mark_circle().encode(
    x=alt.X('Horsepower:Q', scale=alt.Scale(domain=[0, 250])),
    y='Miles_per_Gallon:Q',
)
chart.to_dict()
{'config': {'view': {'continuousWidth': 400, 'continuousHeight': 300}},
'data': {'url': 'https://cdn.jsdelivr.net/npm/vega-datasets#v1.29.0/data/cars.json'},
'mark': 'circle',
'encoding': {'x': {'field': 'Horsepower',
'scale': {'domain': [0, 250]},
'type': 'quantitative'},
'y': {'field': 'Miles_per_Gallon', 'type': 'quantitative'}},
'$schema': 'https://vega.github.io/schema/vega-lite/v5.2.0.json'}
and get it via chart.to_dict()['encoding']['x']['scale']['domain'].
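If you want to pull that out programmatically without assuming the scale key is present, a small helper along these lines should work (a minimal sketch; get_x_domain is just an illustrative name):
def get_x_domain(chart):
    """Return the explicit x-axis domain if one was set, otherwise None."""
    spec = chart.to_dict()
    return spec.get('encoding', {}).get('x', {}).get('scale', {}).get('domain')

print(get_x_domain(chart))  # [0, 250] for the chart above, None if no domain was set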

Forming a list of dicts in a loop according to if conditions

I need your help.
I have this code:
import ipaddress
from ipaddress import IPv4Network
from pprint import pprint

prefixes = []
ip_addresses_all = [{'address': '10.0.0.1/24', 'vrf': {'id': 31, 'name': 'god_inet'}},
                    {'address': '10.0.0.10/24', 'vrf': {'id': 33, 'name': 'for_test'}},
                    {'address': '10.1.1.1/30', 'vrf': {'id': 8, 'name': 'ott_private_net'}},
                    {'address': '10.1.1.2/30', 'vrf': {'id': 11, 'name': 'ott_public_net'}},
                    {'address': '10.10.0.129/30', 'vrf': None},
                    {'address': '10.10.0.130/30', 'vrf': None},
                    {'address': '10.10.0.137/30', 'vrf': None},
                    {'address': '10.10.0.138/30', 'vrf': None}]
for ip in ip_addresses_all:
    prefix = str(ipaddress.ip_network(ip.address, False))
    mask_length = int(IPv4Network(prefix).prefixlen)
    description_interface = ip.address
    if ip.vrf:
        ip_vrf_name = ip.vrf.name
        ip_vrf_id = ip.vrf.id
        ip_vrf = ip.vrf
    else:
        ip_vrf_name = 'null'
        ip_vrf_id = 'null'
        ip_vrf = 'null'
    prefix_dict = {'prefix': prefix,
                   'vrf': {'name': ip_vrf_name,
                           'id': ip_vrf_id},
                   'prefix_description': [description_interface]}
    if prefixes:
        for i in prefixes:
            if prefix != i['prefix']:
                prefixes.append(prefix_dict)
            elif i['prefix'] == prefix and i['vrf']['name'] == ip_vrf_name and description_interface not in i['prefix_description']:
                i['prefix_description'].append(description_interface)
    else:
        prefixes.append(prefix_dict)
pprint(prefixes)
So, I want to append dicts to the prefixes list according to this logic: if the prefix is not yet in prefixes, create a new dict; if a prefix with the same vrf already exists, append the description string to that dict's description list and update the existing dict.
I have struggled with this for 8 hours and it doesn't work; I keep getting infinite loops.
Intended output, something like this:
{'10.0.0.0/24': {'prefix_description': ['10.0.0.10/24'],'vrf': {'id': 31, 'name': 'god_inet'}},
'10.0.0.0/24': {'prefix_description': ['10.0.0.10/24'],'vrf': {'id': 33, 'name': 'for_test'}},
'10.1.1.0/30': {'prefix_description': ['10.1.1.2/30'],'vrf': {'id': 8, 'name': 'ott_private_net'}},
'10.1.1.0/30': {'prefix_description': ['10.1.1.2/30'],'vrf': {'id': 11, 'name': 'ott_public_net'}},
'10.10.0.128/30': {'prefix_description': ['10.10.0.129/30',
'10.10.0.130/30'],'vrf': {'id': 'null', 'name': 'null'}},
'10.10.0.136/30': {'prefix_description': ['10.10.0.137/30',
'10.10.0.138/30'],'vrf': {'id': 'null', 'name': 'null'}}}
Or better, in a list like this:
[{'prefix': '10.0.0.0/24',
'prefix_description': ['77-GOD-VPN-2 ---- Vlan40'],
'vrf': {'id': 31, 'name': 'god_inet'}},
{'prefix': '10.0.0.0/24',
'prefix_description': ['78-ELS-CORE ---- Vlan142'],
'vrf': {'id': 33, 'name': 'for_test'}}]
Make prefixes a dict and handle that logic inside the for loop:
import ipaddress
from ipaddress import IPv4Network
from pprint import pprint

prefixes = {}
ip_addresses_all = [{'address': '10.0.0.1/24', 'vrf': {'id': 31, 'name': 'god_inet'}},
                    {'address': '10.0.0.10/24', 'vrf': {'id': 33, 'name': 'for_test'}},
                    {'address': '10.1.1.1/30', 'vrf': {'id': 8, 'name': 'ott_private_net'}},
                    {'address': '10.1.1.2/30', 'vrf': {'id': 11, 'name': 'ott_public_net'}},
                    {'address': '10.10.0.129/30', 'vrf': None},
                    {'address': '10.10.0.130/30', 'vrf': None},
                    {'address': '10.10.0.137/30', 'vrf': None},
                    {'address': '10.10.0.138/30', 'vrf': None}]

for ip in ip_addresses_all:
    prefix = str(ipaddress.ip_network(ip['address'], False))
    mask_length = int(IPv4Network(prefix).prefixlen)
    description_interface = ip['address']
    if ip['vrf']:
        ip_vrf_name = ip['vrf']['name']
        ip_vrf_id = ip['vrf']['id']
        ip_vrf = ip['vrf']
    else:
        ip_vrf_name = 'null'
        ip_vrf_id = 'null'
        ip_vrf = 'null'
    prefix_dict = {
        'vrf': {'name': ip_vrf_name,
                'id': ip_vrf_id},
        'prefix_description': [description_interface]}
    if prefix in prefixes and \
            prefixes[prefix]['vrf']['name'] == ip_vrf_name and \
            description_interface not in prefixes[prefix]['prefix_description']:
        prefixes[prefix]['prefix_description'].append(description_interface)
    else:
        prefixes[prefix] = prefix_dict

pprint(prefixes)
Output:
{'10.0.0.0/24': {'prefix_description': ['10.0.0.10/24'],
'vrf': {'id': 33, 'name': 'for_test'}},
'10.1.1.0/30': {'prefix_description': ['10.1.1.2/30'],
'vrf': {'id': 11, 'name': 'ott_public_net'}},
'10.10.0.128/30': {'prefix_description': ['10.10.0.129/30', '10.10.0.130/30'],
'vrf': {'id': 'null', 'name': 'null'}},
'10.10.0.136/30': {'prefix_description': ['10.10.0.137/30', '10.10.0.138/30'],
'vrf': {'id': 'null', 'name': 'null'}}}
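If you prefer the list form shown in the question, the dict can be flattened afterwards. A minimal sketch:
prefix_list = [{'prefix': prefix, **info} for prefix, info in prefixes.items()]
pprint(prefix_list)
Note that keying by prefix means a later VRF for the same prefix overwrites the earlier one, which is why the output above shows only 'for_test' for 10.0.0.0/24.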

how to convert data to standard json

messages=%5B%7B%22values%22%3A+%7B%22momentum%22%3A+%220.00%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22momentum%22%2C+%22indicator_number%22%3A+0%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22signal%22%3A+%5B%22momentum%22%5D%2C+%22hot%22%3A+0%2C+%22cold%22%3A+0%2C+%22candle_period%22%3A+%224h%22%2C+%22period_count%22%3A+10%7D%2C+%22status%22%3A+%22hot%22%7D%2C+%22status%22%3A+%22hot%22%2C+%22last_status%22%3A+%22hot%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22%22%2C+%22indicator_label%22%3A+%22%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%2C+%7B%22values%22%3A+%7B%22leading_span_a%22%3A+%220.00%22%2C+%22leading_span_b%22%3A+%220.00%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22ichimoku%22%2C+%22indicator_number%22%3A+1%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22signal%22%3A+%5B%22leading_span_a%22%2C+%22leading_span_b%22%5D%2C+%22hot%22%3A+true%2C+%22cold%22%3A+true%2C+%22candle_period%22%3A+%224h%22%2C+%22hot_label%22%3A+%22Bullish+Alert%22%2C+%22cold_label%22%3A+%22Bearish+Alert%22%2C+%22indicator_label%22%3A+%22ICHIMOKU+4+hr%22%2C+%22mute_cold%22%3A+false%7D%2C+%22status%22%3A+%22cold%22%7D%2C+%22status%22%3A+%22cold%22%2C+%22last_status%22%3A+%22cold%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22Bearish+Alert%22%2C+%22indicator_label%22%3A+%22ICHIMOKU+4+hr%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%2C+%7B%22values%22%3A+%7B%22bbp%22%3A+%220.96%22%2C+%22mfi%22%3A+%2298.05%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22bbp%22%2C+%22indicator_number%22%3A+1%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22candle_period%22%3A+%224h%22%2C+%22period_count%22%3A+20%2C+%22hot%22%3A+0.09%2C+%22cold%22%3A+0.8%2C+%22std_dev%22%3A+2%2C+%22signal%22%3A+%5B%22bbp%22%2C+%22mfi%22%5D%2C+%22hot_label%22%3A+%22Lower+Band%22%2C+%22cold_label%22%3A+%22Upper+Band+BB%22%2C+%22indicator_label%22%3A+%22Bollinger+4+hr%22%2C+%22mute_cold%22%3A+false%7D%2C+%22status%22%3A+%22cold%22%7D%2C+%22status%22%3A+%22cold%22%2C+%22last_status%22%3A+%22cold%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22Upper+Band+BB%22%2C+%22indicator_label%22%3A+%22Bollinger+4+hr%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%
3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%5D
I need to convert this data in Python 3 to standard JSON so I can POST it to a JSON API.
Any solution?
Thanks.
That looks like it's been URL form encoded.
Try
import urllib.parse
import json
# note **without** the message= part
stuff = "%5B%7B%22values%22%3A+%7B%22momentum%22%3A+%220.00%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22momentum%22%2C+%22indicator_number%22%3A+0%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22signal%22%3A+%5B%22momentum%22%5D%2C+%22hot%22%3A+0%2C+%22cold%22%3A+0%2C+%22candle_period%22%3A+%224h%22%2C+%22period_count%22%3A+10%7D%2C+%22status%22%3A+%22hot%22%7D%2C+%22status%22%3A+%22hot%22%2C+%22last_status%22%3A+%22hot%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22%22%2C+%22indicator_label%22%3A+%22%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%2C+%7B%22values%22%3A+%7B%22leading_span_a%22%3A+%220.00%22%2C+%22leading_span_b%22%3A+%220.00%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22ichimoku%22%2C+%22indicator_number%22%3A+1%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22signal%22%3A+%5B%22leading_span_a%22%2C+%22leading_span_b%22%5D%2C+%22hot%22%3A+true%2C+%22cold%22%3A+true%2C+%22candle_period%22%3A+%224h%22%2C+%22hot_label%22%3A+%22Bullish+Alert%22%2C+%22cold_label%22%3A+%22Bearish+Alert%22%2C+%22indicator_label%22%3A+%22ICHIMOKU+4+hr%22%2C+%22mute_cold%22%3A+false%7D%2C+%22status%22%3A+%22cold%22%7D%2C+%22status%22%3A+%22cold%22%2C+%22last_status%22%3A+%22cold%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22Bearish+Alert%22%2C+%22indicator_label%22%3A+%22ICHIMOKU+4+hr%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%2C+%7B%22values%22%3A+%7B%22bbp%22%3A+%220.96%22%2C+%22mfi%22%3A+%2298.05%22%7D%2C+%22exchange%22%3A+%22binance%22%2C+%22market%22%3A+%22BNT%2FETH%22%2C+%22base_currency%22%3A+%22BNT%22%2C+%22quote_currency%22%3A+%22ETH%22%2C+%22indicator%22%3A+%22bbp%22%2C+%22indicator_number%22%3A+1%2C+%22analysis%22%3A+%7B%22config%22%3A+%7B%22enabled%22%3A+true%2C+%22alert_enabled%22%3A+true%2C+%22alert_frequency%22%3A+%22once%22%2C+%22candle_period%22%3A+%224h%22%2C+%22period_count%22%3A+20%2C+%22hot%22%3A+0.09%2C+%22cold%22%3A+0.8%2C+%22std_dev%22%3A+2%2C+%22signal%22%3A+%5B%22bbp%22%2C+%22mfi%22%5D%2C+%22hot_label%22%3A+%22Lower+Band%22%2C+%22cold_label%22%3A+%22Upper+Band+BB%22%2C+%22indicator_label%22%3A+%22Bollinger+4+hr%22%2C+%22mute_cold%22%3A+false%7D%2C+%22status%22%3A+%22cold%22%7D%2C+%22status%22%3A+%22cold%22%2C+%22last_status%22%3A+%22cold%22%2C+%22prices%22%3A+%22+Open%3A+0.000989+High%3A+0.000998+Low%3A+0.000980+Close%3A+0.000998%22%2C+%22lrsi%22%3A+%22%22%2C+%22creation_date%22%3A+%222020-05-10+16%3A16%3A23%22%2C+%22hot_cold_label%22%3A+%22Upper+Band+BB%22%2C+%22indicator_label%22%3A+%22Bollinger+4+hr%22%2C+%22price_value%22%3A+%7B%22open%22%3A+0.000989%2C+%22high%22%
3A+0.000998%2C+%22low%22%3A+0.00098%2C+%22close%22%3A+0.000998%7D%2C+%22decimal_format%22%3A+%22%25.6f%22%7D%5D"
parsed = urllib.parse.unquote_plus(stuff) # <<< encoded form, get rid of +
as_json = json.loads(parsed)
print(as_json)
gives me
[{'values': {'momentum': '0.00'}, 'exchange': 'binance', 'market': 'BNT/ETH', 'base_currency': 'BNT', 'quote_currency': 'ETH', 'indicator': 'momentum', 'indicator_number': 0, 'analysis': {'config': {'enabled': True, 'alert_enabled': True, 'alert_frequency': 'once', 'signal': ['momentum'], 'hot': 0, 'cold': 0, 'candle_period': '4h', 'period_count': 10}, 'status': 'hot'}, 'status': 'hot', 'last_status': 'hot', 'prices': ' Open: 0.000989 High: 0.000998 Low: 0.000980 Close: 0.000998', 'lrsi': '', 'creation_date': '2020-05-10 16:16:23', 'hot_cold_label': '', 'indicator_label': '', 'price_value': {'open': 0.000989, 'high': 0.000998, 'low': 0.00098, 'close': 0.000998}, 'decimal_format': '%.6f'}, {'values': {'leading_span_a': '0.00', 'leading_span_b': '0.00'}, 'exchange': 'binance', 'market': 'BNT/ETH', 'base_currency': 'BNT', 'quote_currency': 'ETH', 'indicator': 'ichimoku', 'indicator_number': 1, 'analysis': {'config': {'enabled': True, 'alert_enabled': True, 'alert_frequency': 'once', 'signal': ['leading_span_a', 'leading_span_b'], 'hot': True, 'cold': True, 'candle_period': '4h', 'hot_label': 'Bullish Alert', 'cold_label': 'Bearish Alert', 'indicator_label': 'ICHIMOKU 4 hr', 'mute_cold': False}, 'status': 'cold'}, 'status': 'cold', 'last_status': 'cold', 'prices': ' Open: 0.000989 High: 0.000998 Low: 0.000980 Close: 0.000998', 'lrsi': '', 'creation_date': '2020-05-10 16:16:23', 'hot_cold_label': 'Bearish Alert', 'indicator_label': 'ICHIMOKU 4 hr', 'price_value': {'open': 0.000989, 'high': 0.000998, 'low': 0.00098, 'close': 0.000998}, 'decimal_format': '%.6f'}, {'values': {'bbp': '0.96', 'mfi': '98.05'}, 'exchange': 'binance', 'market': 'BNT/ETH', 'base_currency': 'BNT', 'quote_currency': 'ETH', 'indicator': 'bbp', 'indicator_number': 1, 'analysis': {'config': {'enabled': True, 'alert_enabled': True, 'alert_frequency': 'once', 'candle_period': '4h', 'period_count': 20, 'hot': 0.09, 'cold': 0.8, 'std_dev': 2, 'signal': ['bbp', 'mfi'], 'hot_label': 'Lower Band', 'cold_label': 'Upper Band BB', 'indicator_label': 'Bollinger 4 hr', 'mute_cold': False}, 'status': 'cold'}, 'status': 'cold', 'last_status': 'cold', 'prices': ' Open: 0.000989 High: 0.000998 Low: 0.000980 Close: 0.000998', 'lrsi': '', 'creation_date': '2020-05-10 16:16:23', 'hot_cold_label': 'Upper Band BB', 'indicator_label': 'Bollinger 4 hr', 'price_value': {'open': 0.000989, 'high': 0.000998, 'low': 0.00098, 'close': 0.000998}, 'decimal_format': '%.6f'}]
Whereas if you want a JSON string to POST somewhere, call as_string = json.dumps(as_json) (or simply reuse parsed, which is already valid JSON text).
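For example, posting the decoded data with the requests library could look like this (a minimal sketch; the URL is a placeholder):
import requests

url = "https://example.com/api/messages"  # placeholder endpoint

# requests serialises the Python object to JSON and sets the Content-Type header
resp = requests.post(url, json=as_json)
print(resp.status_code)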

How do I remove an item from my dictionary?

I am trying to remove an item ("logs") from my dictionary using the del statement.
This is my code:
del response.json() ["logs"]
print(response.json())
This is my JSON dictionary:
{'count': 19804,
'next': {'limit': 1, 'offset': 1},
'previous': None,
'results':
[{'id': '334455',
'custom_id': '112',
'company': 28,
'company_name': 'Sunshine and Flowers',
'delivery_address': '34 olive beach house, #01-22, 612345',
'delivery_timeslot': {'lower': '2019-12-06T10:00:00Z', 'upper': '2019-12-06T13:00:00Z', 'bounds': '[)'},
'sender_name': 'Edward Shine',
'sender_email': '',
'sender_contact': '91234567',
'removed': None,
'recipient_name': 'Mint Shine',
'recipient_contact': '91234567',
'notes': '',
'items': [{'id': 21668, 'name': 'Loose hair flowers', 'quantity': 1, 'metadata': {}, 'removed': None}, {'id': 21667, 'name': "Groom's Boutonniere", 'quantity': 1, 'metadata': {}, 'removed': None}, {'id': 21666, 'name': 'Bridal Bouquet', 'quantity': 1, 'metadata': {}, 'removed': None}],
'latitude': '1.28283838383642000000',
'longitude': '103.2828037266201000000',
'created': '2019-08-15T05:40:30.385467Z',
'updated': '2019-08-15T05:41:27.930110Z',
'status': 'pending',
'verbose_status': 'Pending',
'logs': [{'id': 334455, 'order': '50c402d8-7c76-45b5-b883-e2fb887a507e', 'order_custom_id': '112', 'order_delivery_address': '34 olive beach house, #01-22, 6123458', 'order_delivery_timeslot': {'lower': '2019-12-06T10:00:00Z', 'upper': '2019-12-06T13:00:00Z', 'bounds': '[)'}, 'message': 'Order was created.', 'failure_reason': None, 'success_code': None, 'success_description': None, 'created': '2019-08-15T05:40:30.431790Z', 'removed': None}, {'id': 334455, 'order': '50c402d8-7c76-45b5-b883-e2fb887a507e', 'order_custom_id': '112', 'order_delivery_address': '34 olive beach house, #01-22, 612345', 'order_delivery_timeslot': {'lower': '2019-12-06T10:00:00Z', 'upper': '2019-12-06T13:00:00Z', 'bounds': '[)'}, 'message': 'Order is pending.', 'failure_reason': None, 'success_code': None, 'success_description': None, 'created': '2019-08-15T05:40:30.433139Z', 'removed': None}],
'reschedule_requests': [],
'signature': None}]}
but it is raising this error:
KeyError: 'logs'
What am I doing wrong? Please assist.
Every time you call response.json(), it returns a new dict, so the key you delete from response.json() won't be reflected in the next call to response.json().
You should instead save the returning value of response.json() to a variable before deleting the desired key:
data = response.json()
del data['results'][0]['logs']
print(data)
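If the response can contain more than one entry in results, a variant that strips logs from each of them, using pop with a default so a missing key does not raise a KeyError:
data = response.json()
for result in data.get('results', []):
    result.pop('logs', None)  # remove 'logs' if present, do nothing if absent
print(data)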

pass array of arrays from routes to view node.js

I am trying to pass an array of arrays from my routes to the view page. The data I am trying to pass is:
[
[10, 10],
[20, 50],
[30, 120],
[40, 80],
[50, 90],
[60, 50],
[70, 70],
[80, 90],
[90, 150],
[100, 50],
[110, 40],
[120, 70],
[130, 20],
[140, 40],
[200, 30]
]
I am getting it in the format below:
["10,10,20,50,30,120,40,80,50,90,60,50,70,70,80,90,90,150,100,50,110,40,120,70,130,20,140,40,200,30"]
but I need it in the same format I am sending it.
My index.js (routes file) is:
router.get('/', function(req, res, next) {
    var dataset = [
        [10, 10],
        [20, 50],
        [30, 120],
        [40, 80],
        [50, 90],
        [60, 50],
        [70, 70],
        [80, 90],
        [90, 150],
        [100, 50],
        [110, 40],
        [120, 70],
        [130, 20],
        [140, 40],
        [200, 30]
    ];
    console.log(dataset);
    res.render('index', {"data": [dataset]});
});

module.exports = router;
and in my view file, I am trying to read it like this:
<div class="hold_data" data-info={{data}}></div>
Please suggest if anyone knows how this can be achieved. Thanks in advance :)
Try JSON.stringify as below:
res.render('index',{"data" : JSON.stringify(dataset)});
Hope this helps.
