I'm getting 0 qty in the depth stream on Binance (Rust)

When connected to wss://stream.binance.com:9443/ws/ethusdt@depth, in place of a quantity I sometimes (in fact quite often) get 0, as in this example message:
{
"e": "depthUpdate",
"E": 1623439441427,
"s": "ETHUSDT",
"U": 8550164870,
"u": 8550165574,
"b": [
[
"2406.49000000",
"0.00000000"
],
[
"2406.48000000",
"0.00000000"
],
[
"2406.35000000",
"0.00000000"
],
[
"2406.25000000",
"0.00000000"
],
[
"2406.23000000",
"0.00000000"
],
[
"2406.21000000",
"0.00000000"
],
[
"2406.20000000",
"0.00000000"
],
[
"2406.19000000",
"0.00000000"
],
[
"2406.17000000",
"0.00000000"
],
[
"2406.16000000",
"0.00000000"
],
[
"2406.15000000",
"0.00000000"
],
[
"2406.14000000",
"0.00000000"
],
[
"2406.01000000",
"0.00000000"
],
[
"2405.99000000",
"0.00000000"
],
[
"2405.97000000",
"0.00000000"
],
[
"2405.96000000",
"0.00000000"
],
[
"2405.95000000",
"0.00000000"
],
[
"2405.94000000",
"0.00000000"
],
[
"2405.93000000",
"0.00000000"
],
[
"2405.86000000",
"0.00000000"
]
],
"a": [
[
"2405.59000000",
"0.00000000"
],
[
"2405.60000000",
"0.00000000"
],
[
"2405.61000000",
"0.00000000"
],
[
"2405.64000000",
"0.00000000"
],
[
"2405.65000000",
"0.00000000"
],
[
"2405.66000000",
"0.95380000"
],
[
"2405.67000000",
"8.00000000"
],
[
"2405.83000000",
"0.00000000"
],
[
"2406.10000000",
"0.98850000"
],
[
"2406.11000000",
"3.31912000"
],
[
"2406.12000000",
"3.28392000"
],
[
"2406.13000000",
"0.00000000"
],
[
"2406.16000000",
"0.00000000"
],
[
"2406.17000000",
"0.00000000"
],
[
"2406.18000000",
"0.00000000"
],
[
"2406.19000000",
"0.32100000"
],
[
"2406.20000000",
"0.00000000"
],
[
"2406.21000000",
"0.00000000"
],
[
"2406.22000000",
"0.00000000"
],
[
"2406.23000000",
"0.00000000"
],
[
"2406.26000000",
"1.00000000"
],
[
"2406.29000000",
"3.00000000"
],
[
"2406.30000000",
"0.75917000"
],
[
"2406.31000000",
"0.00000000"
],
[
"2406.36000000",
"0.00000000"
],
[
"2406.37000000",
"0.54000000"
],
[
"2406.46000000",
"7.81255000"
],
[
"2406.47000000",
"10.38000000"
],
[
"2406.48000000",
"3.00000000"
],
[
"2406.49000000",
"1.50103000"
],
[
"2406.50000000",
"0.00000000"
],
[
"2406.51000000",
"1.48621000"
],
[
"2406.62000000",
"0.00000000"
],
[
"2406.63000000",
"9.40536000"
],
[
"2406.64000000",
"17.91499000"
],
[
"2406.70000000",
"2.76583000"
],
[
"2406.78000000",
"0.00000000"
],
[
"2406.79000000",
"0.00000000"
],
[
"2406.80000000",
"0.06000000"
],
[
"2406.81000000",
"0.00000000"
],
[
"2406.83000000",
"0.00000000"
]
]
}
Here the second number in each pair is the bid or ask quantity.
I get exactly the same behavior in Postman and with the Rust client library, on other trading pairs as well.
Is this intentional, or some kind of bug?

This is normal, not a bug. A quantity of 0 means that price level has been removed from the book. The Binance API docs spell this out under "How to manage a local order book correctly":
If the quantity is 0, remove the price level.
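That rule can be sketched in Python (a minimal, hypothetical handler; the book is assumed to be a plain dict mapping price string to quantity, and `apply_depth_update` is an illustrative name, not part of any Binance library):

```python
def apply_depth_update(book, levels):
    """Apply one side ("b" or "a") of a depthUpdate event to a local book.

    book:   dict mapping price string -> quantity (float)
    levels: list of [price, qty] string pairs from the event
    """
    for price, qty in levels:
        if float(qty) == 0.0:
            # Quantity 0 means the level no longer exists: remove it.
            # pop with a default also covers levels we never had locally.
            book.pop(price, None)
        else:
            # Otherwise the event carries the new absolute quantity,
            # not a delta, so we simply overwrite.
            book[price] = float(qty)
    return book

bids = {"2406.49000000": 1.2}
apply_depth_update(bids, [["2406.49000000", "0.00000000"],
                          ["2406.66000000", "0.95380000"]])
# "2406.49000000" is removed; "2406.66000000" is set to 0.9538
```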

Related

How to convert LabelMe output to COCO format?

I am new to this; can anyone please help me convert LabelMe JSON output to COCO format?
Here is the LabelMe output file. Thanks in advance.
{
"version": "5.1.1",
"flags": {},
"shapes": [
{
"label": "bean",
"points": [
[
3183.5227272727275,
459.65909090909093
],
[
3174.431818181818,
468.1818181818182
],
[
3162.5,
476.1363636363636
],
[
3151.1363636363635,
487.5
],
[
3144.318181818182,
508.52272727272725
],
[
3142.0454545454545,
543.75
],
[
3144.8863636363635,
571.0227272727273
],
[
3151.7045454545455,
589.7727272727273
],
[
3159.090909090909,
602.8409090909091
],
[
3168.181818181818,
614.2045454545455
],
[
3178.9772727272725,
621.5909090909091
],
[
3192.6136363636365,
628.4090909090909
],
[
3208.5227272727275,
628.4090909090909
],
[
3231.25,
619.8863636363636
],
[
3256.25,
604.5454545454545
],
[
3271.590909090909,
583.5227272727273
],
[
3277.2727272727275,
553.9772727272727
],
[
3277.2727272727275,
524.4318181818181
],
[
3271.0227272727275,
496.59090909090907
],
[
3257.3863636363635,
471.59090909090907
],
[
3238.068181818182,
459.09090909090907
],
[
3212.5,
455.6818181818182
]
],
"group_id": null,
"shape_type": "polygon",
"flags": {}
},
{
"label": "bean",
"points": [
[
2968.75,
320.45454545454544
],
[
2952.840909090909,
327.27272727272725
],
[
2935.7954545454545,
333.52272727272725
],
[
2922.7272727272725,
341.47727272727275
],
[
2915.340909090909,
358.52272727272725
],
[
2911.931818181818,
371.59090909090907
],
[
2913.068181818182,
388.6363636363636
],
[
2919.318181818182,
405.6818181818182
],
[
2928.409090909091,
423.8636363636364
],
[
2947.7272727272725,
448.8636363636364
],
[
2967.6136363636365,
460.22727272727275
],
[
2986.3636363636365,
463.6363636363636
],
[
3003.409090909091,
464.77272727272725
],
[
3020.4545454545455,
463.6363636363636
],
[
3036.931818181818,
454.54545454545456
],
[
3049.431818181818,
446.02272727272725
],
[
3056.818181818182,
431.8181818181818
],
[
3063.068181818182,
411.9318181818182
],
[
3065.340909090909,
390.34090909090907
],
[
3056.818181818182,
372.15909090909093
],
[
3044.318181818182,
353.40909090909093
],
[
3026.7045454545455,
338.6363636363636
],
[
3013.6363636363635,
328.40909090909093
],
[
2992.6136363636365,
318.1818181818182
],
[
2982.9545454545455,
318.75
]
],
"group_id": null,
"shape_type": "polygon",
"flags": {}
},
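No answer is shown for this question here, but as a rough sketch of the conversion (assuming a single image and polygon shapes like the ones above; `labelme_to_coco`, the file name, and the image entry are illustrative placeholders — real LabelMe files carry imagePath/imageHeight/imageWidth fields to use instead):

```python
def labelme_to_coco(labelme, image_id=1, file_name="image.jpg"):
    """Convert one LabelMe dict into a minimal COCO-style dict."""
    categories, cat_ids = [], {}
    annotations = []
    for ann_id, shape in enumerate(labelme["shapes"], start=1):
        label = shape["label"]
        if label not in cat_ids:
            cat_ids[label] = len(cat_ids) + 1
            categories.append({"id": cat_ids[label], "name": label})
        pts = shape["points"]
        xs = [p[0] for p in pts]
        ys = [p[1] for p in pts]
        # Shoelace formula for the polygon's area.
        area = abs(sum(pts[i][0] * pts[(i + 1) % len(pts)][1]
                       - pts[(i + 1) % len(pts)][0] * pts[i][1]
                       for i in range(len(pts)))) / 2.0
        annotations.append({
            "id": ann_id,
            "image_id": image_id,
            "category_id": cat_ids[label],
            # COCO stores each polygon as a flat [x1, y1, x2, y2, ...] list.
            "segmentation": [[c for p in pts for c in p]],
            # COCO bbox format is [x, y, width, height].
            "bbox": [min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)],
            "area": area,
            "iscrowd": 0,
        })
    return {"images": [{"id": image_id, "file_name": file_name}],
            "categories": categories,
            "annotations": annotations}
```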

How to get all details of a particular userID in a JSON file in Python

I want to access the details for a particular userID:
[
{
"userID": 998926445,
"contentID": [
[
"5bbae768c1df412352000004"
],
[
"5ba8d4fac1df413dae0002cf"
],
[
"5ca61afced8f7d3a5f00102d"
],
[
"5b9c9cacc1df41453400003f"
],
[
"5c8a8044a58c4046b30030f2"
],
[
"5ba9070bc1df413dae0003c3"
],
[
"5bbb1087c1df4140a6000162"
],
[
"5c95142bed8f7d5ede004ef4"
],
[
"5ba905e5c1df413dae0003b9"
],
[
"5bb89799c1df41262300062a"
]
]
},
{
"userID": 998926445,
"contentID": [
[
"5baa8ef5c1df41479a0004b8"
],
[
"5c8a8063a58c4046c8000e89"
],
[
"5bbc7a16c1df412a82000008"
],
[
"5bb8964ec1df41262300060c"
],
[
"5bbc4f92c1df4140a6000abe"
],
[
"5bbb0ecbc1df4140a60000fc"
],
[
"5ba90aa2c1df413dae000429"
],
[
"5bf2bb06c1df411238003054"
],
[
"5cb0c006ed8f7d6a1d00146a"
],
[
"5bbc9825c1df41384100024c"
]
]
},
{
"userID": 998926445,
"contentID": [
[
"5bb8974cc1df412623000622"
],
[
"5b9c9cadc1df414534000047"
],
[
"5b8e5b32c1df412918000048"
],
[
"5b9c9cacc1df41453400003f"
],
[
"5bb8ac8ac1df4126230008a0"
],
[
"5b9fad7bc1df4145340000a7"
],
[
"5bbb1171c1df4140a600016c"
],
[
"5c8a8071a58c4046c8000e8d"
],
[
"5ba90dbac1df413dae00043d"
],
[
"5ba8f905c1df413dae000397"
]
]
}
]
Try something like this. You will get a dict mapping each userID to its lists of content IDs, which you can then work with. Assume json_list is your original JSON list.
from collections import defaultdict

dd = defaultdict(list)
for i in json_list:
    # Flatten each [["id"], ["id"], ...] structure into one list per record.
    dd[i['userID']].append([j[0] for j in i['contentID']])
dd = dict(dd)
print(dd)
Your output will be something like this:
{998926445: [['5bbae768c1df412352000004', '5ba8d4fac1df413dae0002cf', '5ca61afced8f7d3a5f00102d', '5b9c9cacc1df41453400003f', '5c8a8044a58c4046b30030f2', '5ba9070bc1df413dae0003c3', '5bbb1087c1df4140a6000162', '5c95142bed8f7d5ede004ef4', '5ba905e5c1df413dae0003b9', '5bb89799c1df41262300062a'], ['5baa8ef5c1df41479a0004b8', '5c8a8063a58c4046c8000e89', '5bbc7a16c1df412a82000008', '5bb8964ec1df41262300060c', '5bbc4f92c1df4140a6000abe', '5bbb0ecbc1df4140a60000fc', '5ba90aa2c1df413dae000429', '5bf2bb06c1df411238003054', '5cb0c006ed8f7d6a1d00146a', '5bbc9825c1df41384100024c'], ['5bb8974cc1df412623000622', '5b9c9cadc1df414534000047', '5b8e5b32c1df412918000048', '5b9c9cacc1df41453400003f', '5bb8ac8ac1df4126230008a0', '5b9fad7bc1df4145340000a7', '5bbb1171c1df4140a600016c', '5c8a8071a58c4046c8000e8d', '5ba90dbac1df413dae00043d', '5ba8f905c1df413dae000397']]}
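If you only need the details for one particular userID rather than the whole merged dict, a small helper along these lines works directly on the original list (`content_for_user` is an illustrative name, not from any library):

```python
def content_for_user(json_list, user_id):
    """Return a flat list of all content IDs for the given userID.

    Records for the same userID may appear multiple times in the list,
    so we collect across all matching records.
    """
    ids = []
    for record in json_list:
        if record["userID"] == user_id:
            # Each contentID entry is a one-element list like ["5bbae768..."].
            ids.extend(entry[0] for entry in record["contentID"])
    return ids

# Usage: content_for_user(json_list, 998926445)
```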

GROK pattern to match URIPATH

Here is my sample URL
http://localhost:8080/abc2/query/errorLogs
I was trying to extract only /query/errorLogs. For this I tried the GROK pattern below:
(%{URIPROTO}://%{URIHOST}(?<path>/[^/]+/[^/]+/[^/]+))
Below is the output I am getting:
{
"URIPROTO": [
[
"http"
]
],
"URIHOST": [
[
"localhost:8080"
]
],
"IPORHOST": [
[
"localhost"
]
],
"HOSTNAME": [
[
"localhost"
]
],
"IP": [
[
null
]
],
"IPV6": [
[
null
]
],
"IPV4": [
[
null
]
],
"port": [
[
"8080"
]
],
"path": [
[
"/abc2/query/errorLogs"
]
]
}
but I was expecting path to be "/query/errorLogs".
Try this:
(%{URIPROTO}://%{URIHOST}(?<first_path>/[^/]+)%{GREEDYDATA:path})
result:
port 8080
first_path /abc2
path /query/errorLogs
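Grok named captures compile down to ordinary regular expressions, so the same idea can be sanity-checked outside Logstash. A rough Python equivalent of that pattern (with deliberately simplified stand-ins for %{URIPROTO} and %{URIHOST}; for illustration only):

```python
import re

# Simplified equivalent of:
# (%{URIPROTO}://%{URIHOST}(?<first_path>/[^/]+)%{GREEDYDATA:path})
pattern = re.compile(
    r"(?P<proto>\w+)://(?P<host>[^/]+)(?P<first_path>/[^/]+)(?P<path>/.*)"
)

m = pattern.match("http://localhost:8080/abc2/query/errorLogs")
# m.group("first_path") -> "/abc2", m.group("path") -> "/query/errorLogs"
```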

Grok help for Logstash

My logs look like this:
00009139 2015-03-03 00:00:20.142 5254 11607 "HTTP First Line: GET /?main&legacy HTTP/1.1"
I tried using the grok debugger to get this information formatted, with no success. Is there any way to parse this format using grok? The quoted string would be the message.
So I used the following pattern, built from the grok patterns page.
%{NUMBER:Sequence} %{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}? %{NUMBER:Process}%{NUMBER:Process2}%{WORD:Message}
This is the closest I could get with the current info.
%{INT}%{SPACE}%{TIMESTAMP_ISO8601}%{SPACE}%{INT:pid1}%{SPACE}%{INT:pid2}%{SPACE}%{GREEDYDATA:message}
With the above grok pattern, this is what the grok debugger catches:
{
"INT": [
[
"00009139"
]
],
"SPACE": [
[
" ",
" ",
" ",
" "
]
],
"TIMESTAMP_ISO8601": [
[
"2015-03-03 00:00:20.142"
]
],
"YEAR": [
[
"2015"
]
],
"MONTHNUM": [
[
"03"
]
],
"MONTHDAY": [
[
"03"
]
],
"HOUR": [
[
"00",
null
]
],
"MINUTE": [
[
"00",
null
]
],
"SECOND": [
[
"20.142"
]
],
"ISO8601_TIMEZONE": [
[
null
]
],
"pid1": [
[
"5254"
]
],
"pid2": [
[
"11607"
]
],
"message": [
[
""HTTP First Line: GET /?main&legacy HTTP/1.1""
]
]
}
Hope I was of some help.
Try replacing %{WORD:Message} at the end of your grok with %{QS:message}.
Hope this helps :)
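As above, the grok pattern can be checked as a plain regex. A rough Python equivalent of %{INT} %{TIMESTAMP_ISO8601} %{INT:pid1} %{INT:pid2} %{QS:message} for this log line (the timestamp and quoted-string pieces are simplified stand-ins, for illustration only):

```python
import re

log = ('00009139 2015-03-03 00:00:20.142 5254 11607 '
       '"HTTP First Line: GET /?main&legacy HTTP/1.1"')

pattern = re.compile(
    r'(?P<sequence>\d+)\s+'
    # Simplified stand-in for %{TIMESTAMP_ISO8601}.
    r'(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid1>\d+)\s+(?P<pid2>\d+)\s+'
    # Rough stand-in for %{QS}: a double-quoted string.
    r'"(?P<message>[^"]*)"'
)

m = pattern.match(log)
# m.group("message") -> 'HTTP First Line: GET /?main&legacy HTTP/1.1'
```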

MultiPolygon search on MongoDB

I can't seem to figure this out and would appreciate any and all help.
I am using Node.js to make a query to MongoDB using the MultiPolygon filter. I am connecting to the native driver rather than using Mongoose because it doesn't seem like Mongoose supports MultiPolygon (absent in docs).
I receive a malformed query error, though:
GatheredListings.collection.find({
$and: [
{ 'rentAmount': { $gte: filterParameters.minPrice } },
{ 'rentAmount': { $lte: filterParameters.maxPrice } },
{ 'availabilityDate': { $gte: filterParameters.date } },
{ 'propertyType': filterParameters.propertyType },
{ 'bedrooms': filterParameters.bedrooms },
{ 'location': {
$geoWithin: {
$geometry: {
type : "MultiPolygon" ,
coordinates: [
geoArrayCollection
]
}
}}
}
]
}).toArray(function(err, properties) {
if (err || !properties)
{
console.log('err ' + err);
callback("No properties found", null);
}
else {
console.log('Number of properties retrieved: ' + properties.length);
callback(null, properties);
}
});
The error is :
err MongoError: Malformed geo query: { $geoWithin: { $geometry: { type: "MultiPolygon", coordinates: [ [ [ [ 43.77307711737606, -79.53517913818359 ], [ 43.79191518340848, -79.4476318359375 ], [ 43.75559702541283, -79.43887710571289 ], [ 43.7501411993079, -79.46514129638672 ], [ 43.7541091221655, -79.46943283081055 ], [ 43.75547303488856, -79.47406768798828 ], [ 43.75510106177428, -79.47715759277344 ], [ 43.75435710860915, -79.48093414306641 ], [ 43.75200119590339, -79.48282241821289 ], [ 43.74592499302, -79.48316574096678 ], [ 43.74282465186857, -79.49621200561523 ], [ 43.7404682852067, -79.49604034423828 ], [ 43.73873195568971, -79.49295043945312 ], [ 43.73525914559611, -79.49501037597655 ], [ 43.73538317799622, -79.50410842895508 ], [ 43.73290248118248, -79.51148986816406 ], [ 43.73947610307701, -79.51320648193359 ], [ 43.73649945803657, -79.52642440795898 ], [ 43.77307711737606, -79.53517913818359 ] ], [ [ 43.79203909839882, -79.4476318359375 ], [ 43.80380985089954, -79.39647674560547 ], [ 43.76334592336985, -79.38712120056152 ], [ 43.76179622406369, -79.39501762390137 ], [ 43.73600333614323, -79.43381309509277 ], [ 43.79203909839882, -79.4476318359375 ] ] ] ] } } }
Thank you
The MultiPolygon wasn't formatted correctly. You wanted this:
{
"type": "MultiPolygon",
"coordinates": [
[
[
[
43.77307711737606,
-79.5351791381836
],
[
43.79191518340848,
-79.4476318359375
],
[
43.75559702541283,
-79.43887710571289
],
[
43.7501411993079,
-79.46514129638672
],
[
43.7541091221655,
-79.46943283081055
],
[
43.75547303488856,
-79.47406768798828
],
[
43.75510106177428,
-79.47715759277344
],
[
43.75435710860915,
-79.4809341430664
],
[
43.75200119590339,
-79.48282241821289
],
[
43.74592499302,
-79.48316574096678
],
[
43.74282465186857,
-79.49621200561523
],
[
43.7404682852067,
-79.49604034423828
],
[
43.73873195568971,
-79.49295043945312
],
[
43.73525914559611,
-79.49501037597655
],
[
43.73538317799622,
-79.50410842895508
],
[
43.73290248118248,
-79.51148986816406
],
[
43.73947610307701,
-79.5132064819336
],
[
43.73649945803657,
-79.52642440795898
],
[
43.77307711737606,
-79.5351791381836
]
]
], // <--- missing this
[ // <--- missing this
[
[
43.79203909839882,
-79.4476318359375
],
[
43.80380985089954,
-79.39647674560547
],
[
43.76334592336985,
-79.38712120056152
],
[
43.76179622406369,
-79.39501762390137
],
[
43.73600333614323,
-79.43381309509277
],
[
43.79203909839882,
-79.4476318359375
]
]
]
]
}
You were missing a layer of closing and opening brackets, highlighted above: a MultiPolygon's coordinates are a list of polygons, each of which is itself a list of rings, so each polygon needs its own wrapping array. The structures are big and deeply nested, so it's hard to eyeball what is going on; I'd suggest using tools like jsonlint and geojsonlint to compare the corrected structure with your original. That's how I spotted the problem.
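Per the GeoJSON format, MultiPolygon coordinates must be nested exactly four levels deep: polygons, then rings, then positions, then numbers. A small structural check in Python (a sketch only; `valid_multipolygon_coords` is a hypothetical helper, and geojsonlint checks far more than this) would have caught the missing bracket layer:

```python
def valid_multipolygon_coords(coords):
    """Check MultiPolygon nesting: polygons -> rings -> positions.

    Each ring must have at least 4 positions and be closed
    (first position equal to the last).
    """
    if not isinstance(coords, list) or not coords:
        return False
    for polygon in coords:
        if not isinstance(polygon, list) or not polygon:
            return False
        for ring in polygon:
            if not isinstance(ring, list) or len(ring) < 4:
                return False
            for position in ring:
                if (not isinstance(position, list) or len(position) < 2
                        or not all(isinstance(n, (int, float)) for n in position)):
                    return False
            if ring[0] != ring[-1]:
                return False
    return True
```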
