Building a tree structure with Gun - node.js

I am trying to implement, with Gun, a tree structure that expresses relationships between companies, their subsidiaries, and their products.
The structure I have designed is as follows:
there is a root node: the main seller of some product (for example, sensors); all its children (nodes at depth 1) are its customers;
there are internal (non-leaf) nodes from depth 1 onward: these are the customers of the root node; they can have several branches, a branch can have sub-branches, sub-sub-branches and so on, and there is no limit to the nesting depth;
there are leaf nodes: these are the products contained in a branch, a sub-branch, a sub-sub-branch, and so on.
Here's how I thought of doing this:
create a clients set;
put client nodes inside the clients set;
create a branches set;
put branch nodes inside the branches set;
link the branches set to the clients set;
create a sensors set;
put sensor nodes inside the sensors set;
link the sensors set to the branches set.
Something like this:
              [clients]
              /        \
             /          \
     (MecDoland)    (FrankMirros)
          |               |
      [branches]      [branches]
      /    |    \         |
   (NY)  (CA)  (DA)      ...
     |     |     |
     |  [sensors]...
     |     |
     |    ...
 [sensors]
    /|\
   ......
Here's my implementation:
const Gun = require('gun')
const gun = new Gun()
// Just a util function
function printError(ack) {
  if (ack.err) {
    console.log(ack.err)
  }
}
// [clients]
const clients = gun.get('clients') // group
// (MecDoland)
const mac = clients.get('MecDoland').put({ data: 12 })
// (FrankMirros)
const pm = clients.get('FrankMirros').put({ data: 13 })
console.log('Adding to "clients" set MecDoland...')
// [clients] -> (MecDoland)
clients.set(mac, printError)
console.log('Adding to "clients" set philip morris...')
// [clients] -> (FrankMirros)
clients.set(pm, printError)
// [branches]
const branches = gun.get('branches') // group
// (MecDolandNY)
const macny = gun.get('MacDonaldNY').put({ data: 1 }, printError)
// (MecDolandCA)
const macca = gun.get('MecDolandCA').put({ data: 2 }, printError)
// [branches] -> (MecDolandNY)
branches.set(macny, printError)
// [branches] -> (MecDolandCA)
branches.set(macca, printError)
// [MecDoland] -> [branches]
mac.set(branches, printError)
//clients.map().once(v => console.log('CLIENT:',v))
//branches.map().once(v => console.log('BRANCH:',v._))
// [sensors] (the sensor set for the MecDolandNY branch)
const sensors = gun.get('sensorsMecDolandNY') // group
// (temperature) and (pressure) sensor nodes
const temp = gun.get('temperature').put({ measure: 'celsius', value: 37 })
const press = gun.get('pressure').put({ measure: 'Bar', value: 2 })
// [sensors] -> (temperature), [sensors] -> (pressure)
sensors.set(temp)
sensors.set(press)
// (MecDolandNY) -> [sensors]
macny.set(sensors)
// Show all clients
gun
  .get('clients')
  .map()
  .once(v => console.log(v))
There are two points that I haven't been able to clarify:
1 - I don't know if I am using Gun sets correctly: I tried to add some relationships but they didn't work out; consider the following instructions:
macDoland.get('client').put(rootNode)
frankMirrors.get('client').put(rootNode)
I tried to put a client relationship on both customer nodes, but printing rootNode showed that it had a client relationship only with the frankMirrors node! The first relationship, with the macDoland node, had disappeared, so I started using Gun sets instead.
My code works (more or less), but I don't know whether I am using Gun sets properly and whether this is the right way to model the relationships I want between the main vendor, its customers, the customers' branches, and the sensors in each branch.
2 - I'd like to make this approach more generic and scalable: for now there are clients, branches and sensors; this may change in the future, and the number of levels between clients and sensors may increase.
This means that there may be an arbitrary number of levels between the client nodes (depth 1) and the sensors (maximum depth). For example, a very small client may have no branches at all, so its client node smallCompany would be linked directly to its sensors; a gigantic client, on the other hand, may have (say) five levels of depth, and I don't want to hard-code Gun set names such as macDolandNySubSubSubSubBranch.
How could I provide scalability for an arbitrary number of sub-branches?
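To make the question concrete, a generic linking helper along these lines is roughly what I'm after (just a sketch with a made-up addChild helper, not code I have verified):
// Hypothetical helper: create/update a child node and link it into a named
// set on the parent, so no set name has to encode its depth.
function addChild(parent, setName, childKey, data) {
  const child = gun.get(childKey).put(data)
  parent.get(setName).set(child)
  return child
}
// Usage at arbitrary depth, reusing the nodes from the code above:
const ny = addChild(mac, 'branches', 'MecDolandNY', { data: 1 })
const downtown = addChild(ny, 'branches', 'MecDolandNYDowntown', { data: 11 })
addChild(downtown, 'sensors', 'temperature', { measure: 'celsius', value: 37 })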

Related

One to many relationship between two assets hyperledger composer

I am trying to test a use case with two assets: a car and parts. I want to link the car with different instances of parts using a transaction. My model and JS files are below:
namespace org.sample.test

asset Part identified by partId {
  o String partId
  o String partName
  o String partManufacturer
}

asset Car identified by Vin {
  o String Vin
  --> Part part optional
  o String modelNumber
}

transaction MakeCar {
  o String carid
  o String carmodel
  o String[] PartId
}
/**
 * Sample transaction processor function.
 * @param {org.sample.test.MakeCar} tx The sample transaction instance.
 * @transaction
 */
async function makecar(tx) { // eslint-disable-line no-unused-vars
    var factory = getFactory();
    var vehicle = factory.newResource('org.sample.test', 'Car', tx.carid);
    vehicle.modelNumber = tx.carmodel;
    var part = factory.newRelationship('org.sample.test', 'Part', tx.PartId);
    vehicle.part = part;
    const assetRegistry = await getAssetRegistry('org.sample.test.Car');
    await assetRegistry.add(vehicle);
    // Update the asset in the asset registry.
}
I also tried first creating the asset using getFactory, then creating the relations by traversing the PartIds one by one in a loop, but since my Car asset was not created yet it threw an error.
I updated my transaction function:
async function makecar(tx) { // eslint-disable-line no-unused-vars
    var factory = getFactory();
    var part;
    var vehicle = factory.newResource('org.sample.test', 'Car', tx.carid);
    vehicle.modelNumber = tx.carmodel;
    var i = 0;
    while (i < tx.PartId.length) {
        part = factory.newRelationship('org.sample.test', 'Part', tx.PartId[i]);
        vehicle.part = part;
        i++;
    }
    const assetRegistry = await getAssetRegistry('org.sample.test.Car');
    await assetRegistry.add(vehicle);
}
Now it's giving this error: t: Instance org.sample.test.Car#OOOO has property part with type org.sample.test.Part that is not derived from org.sample.test.Part[]
The problem is this line:
var part = factory.newRelationship('org.sample.test','Part',tx.PartId);
It should be [something like]:
var part = factory.newRelationship('org.sample.test','Part',tx.PartId[0]); // 1st element of the array
That's because you've defined tx.PartId as an array (String[]) in your transaction definition, so you need to access the relevant element.
At this point, I'm not sure how you want to move forward, but your Car (vehicle) asset has an optional one-to-one relationship with Part, which is the optional field in your model. Perhaps it needs to be an array of relationships instead? --> Part[] part optional. Replacing the line above will at least get it working in its present form. An example of using an array of relationships is shown in the answer to this SO question: Creating new participant and adding array of assets by reference to it (in particular the shares array in the model there).
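For reference, a rough sketch of the array-of-relationships variant (assuming the Car model field is changed to --> Part[] parts optional; untested):
async function makecar(tx) {
    const factory = getFactory();
    const vehicle = factory.newResource('org.sample.test', 'Car', tx.carid);
    vehicle.modelNumber = tx.carmodel;
    // One relationship per PartId, assigned as a whole array
    vehicle.parts = tx.PartId.map(function (id) {
        return factory.newRelationship('org.sample.test', 'Part', id);
    });
    const assetRegistry = await getAssetRegistry('org.sample.test.Car');
    await assetRegistry.add(vehicle);
}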

baconjs: throttle consecutive events with criteria

I'm coding a messaging app with Node.js and I need to detect when the same user sends N consecutive messages in a group (to avoid spammers). I'm using a bacon.js Bus where I push the incoming messages from all users.
A message looks like this:
{
  "text": "My message",
  "user": { "id": 1, "name": "Pep" }
}
And this is my working code so far:
const Bacon = require('baconjs')
const _ = require('lodash')

const bus = new Bacon.Bus();
const CONSECUTIVE_MESSAGES = 5;

bus.slidingWindow(CONSECUTIVE_MESSAGES)
  .filter((messages) => {
    return messages.length === CONSECUTIVE_MESSAGES &&
      _.uniqBy(messages, 'user.id').length === 1;
  })
  .onValue((messages) => {
    console.log(`User ${_.last(messages).user.id}`);
  });

// ... on every message
bus.push(message);
It creates a sliding window to keep only the number of consecutive messages I want to detect. On every event it filters the array, letting the data flow to the next step only if all the messages in the window belong to the same user. Finally, in the onValue, it takes the last message to get the user id.
The code looks quite dirty/complex to me:
The filter doesn't look very natural with streams. Is there a better way to emit an event when N consecutive events match some criteria?
Is there a better way to receive just a single event with the user (instead of an array of messages) in the onValue function?
It doesn't really throttle. If a user sends N messages in one year, he or she shouldn't be detected. The stream should forget old events somehow.
Any ideas to improve it? I'm open to migrating it to rxjs if that helps.
Maybe start with
const latestMsgsP = bus.slidingWindow(CONSECUTIVE_MESSAGES)
  .map(msgs => msgs.filter(msg => msgAge(msg) < AGE_LIMIT))
Then see if we should be blocking someone:
let blockedUserIdP = latestMsgsP.map(getUserToBlock)
Where you can use something shamelessly imperative such as
function getUserToBlock(msgs) {
  if (msgs.length < CONSECUTIVE_MESSAGES) return
  let prevUserId;
  for (var i = 0; i < msgs.length; i++) {
    let userId = msgs[i].user.id
    if (prevUserId && prevUserId != userId) return
    prevUserId = userId
  }
  return prevUserId
}
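(msgAge and AGE_LIMIT aren't defined above; presumably something along these lines, assuming each message gets stamped with a time when it is pushed, which the question's message format doesn't include:)
const AGE_LIMIT = 60 * 1000 // e.g. only consider messages from the last minute

function msgAge(msg) {
  // assumes messages are pushed as bus.push({ ...message, timestamp: Date.now() })
  return Date.now() - msg.timestamp
}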
Consider mapping the property you’re interested in as early as possible, then the rest of the stream can be simpler. Also, equality checks on every item in the sliding window won’t scale well as you increase the threshold. Consider using scan instead, so you simply keep a count which resets when the current and previous values don’t match.
bus
  .map('.user.id')
  .scan([0], ([n, a], b) => [a === b ? n + 1 : 1, b])
  .filter(([n]) => n >= CONSECUTIVE_MESSAGES)
  .onValue(([count, userId]) => void console.log(`User ${userId}`));
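To also cover the "forget old events" requirement with this approach, one variation (a sketch, not part of the original answer; it reuses AGE_LIMIT from above) is to reset the count whenever too much time has passed since the previous message:
bus
  .map(msg => ({ id: msg.user.id, t: Date.now() }))
  .scan([0, null, 0], ([n, prevId, prevT], { id, t }) =>
    [(id === prevId && t - prevT < AGE_LIMIT) ? n + 1 : 1, id, t])
  .filter(([n]) => n >= CONSECUTIVE_MESSAGES)
  .onValue(([n, userId]) => console.log(`User ${userId}`));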

GDAX api, create candle graph

I would like to create a candle graph using the GDAX API. I am currently using the HTTP request for historical data (https://docs.gdax.com/#get-historic-rates), but it is marked that I should use the websocket API instead. Unfortunately I don't know how to handle historic data through the GDAX websocket API (https://github.com/coinbase/gdax-node). Could someone help me?
Here are 1m candlesticks built from the match channel using the GDAX websocket:
"use strict";
const
WebSocket = require('ws'),
PRECISION = 8
function _getPair(pair) {
return pair.split('-')
}
let ws = new WebSocket('wss://ws-feed.pro.coinbase.com')
ws.on('open', () => {
ws.send(JSON.stringify({
"type": "subscribe",
"product_ids": [
"ETH-USD",
"BTC-USD"
],
"channels": [
"matches"
]
}))
})
let candles = {}
let lastCandleMap = {}
ws.on('message', msg => {
msg = JSON.parse(msg);
if (!msg.price)
return;
if (!msg.size)
return;
// Price and volume are sent as strings by the API
msg.price = parseFloat(msg.price)
msg.size = parseFloat(msg.size)
let productId = msg.product_id;
let [base, quote] = _getPair(productId);
// Round the time to the nearest minute, Change as per your resolution
let roundedTime = Math.floor(new Date(msg.time) / 60000.0) * 60
// If the candles hashmap doesnt have this product id create an empty object for that id
if (!candles[productId]) {
candles[productId] = {}
}
// If the current product's candle at the latest rounded timestamp doesnt exist, create it
if (!candles[productId][roundedTime]) {
//Before creating a new candle, lets mark the old one as closed
let lastCandle = lastCandleMap[productId]
if (lastCandle) {
lastCandle.closed = true;
delete candles[productId][lastCandle.timestamp]
}
// Set Quote Volume to -1 as GDAX doesnt supply it
candles[productId][roundedTime] = {
timestamp: roundedTime,
open: msg.price,
high: msg.price,
low: msg.price,
close: msg.price,
baseVolume: msg.size,
quoteVolume: -1,
closed: false
}
}
// If this timestamp exists in our map for the product id, we need to update an existing candle
else {
let candle = candles[productId][roundedTime]
candle.high = msg.price > candle.high ? msg.price : candle.high
candle.low = msg.price < candle.low ? msg.price : candle.low
candle.close = msg.price
candle.baseVolume = parseFloat((candle.baseVolume + msg.size).toFixed(PRECISION))
// Set the last candle as the one we just updated
lastCandleMap[productId] = candle
}
})
What they're suggesting is to get the historic rates from that endpoint (https://docs.gdax.com/#get-historic-rates), then keep your candles up to date using the websocket feed messages: whenever a 'match'/ticker message is received, you update the last candle accordingly.
From the docs: "The maximum number of data points for a single request is 300 candles. If your selection of start/end time and granularity will result in more than 300 data points, your request will be rejected." That is probably why you can't get more than two days' worth of data.
PS: I have a live order book and a basic candlestick chart implemented here; lots is not fully hooked together yet, but it's at least available for preview until it's complete: https://github.com/robevansuk/gdax-java/
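For the backfill step, a minimal sketch of fetching the historic rates over plain REST (the endpoint and the [time, low, high, open, close, volume] candle layout are from the GDAX/Coinbase Pro docs; the rest is just an illustration):
const https = require('https')

// Fetch up to 300 candles for a product (granularity 60 = 1m candles).
function getHistoricRates(productId, start, end, granularity = 60) {
  const url = `https://api.pro.coinbase.com/products/${productId}/candles` +
    `?start=${start.toISOString()}&end=${end.toISOString()}&granularity=${granularity}`
  return new Promise((resolve, reject) => {
    https.get(url, { headers: { 'User-Agent': 'candle-demo' } }, res => {
      let body = ''
      res.on('data', chunk => { body += chunk })
      res.on('end', () => resolve(JSON.parse(body)))
    }).on('error', reject)
  })
}

// Usage: backfill the last 300 minutes, then let the websocket handler above
// keep the newest candle updated.
getHistoricRates('BTC-USD', new Date(Date.now() - 300 * 60000), new Date())
  .then(candles => console.log(`got ${candles.length} candles`))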

Hyperledger getParticipants

function Exchange(exchange) {
    // We do the actual exchange here:
    // We first need to get both actual nodes:
    var nodeIdFrom = exchange.nodeIdFrom;
    var quantity = exchange.quantity;
    var price = exchange.price;
    var nodeIdTo = exchange.nodeIdTo;
    return getParticipantRegistry('org.acme.mynetwork.Node')
        .then(function(ParticipantRegistry) {
            ParticipantRegistry.get(nodeIdFrom)
                .then(function(Participant) {
                    Participant.Need = Participant.Need + quantity;
                    Participant.Balance_account = Participant.Balance_account + quantity * price;
                    return ParticipantRegistry.update(Participant);
                });
        });
}
I'm trying to execute a transaction defined as:
transaction Exchange {
    o String nodeIdFrom
    o String nodeIdTo
    o Double quantity
    o Double Price
}
The goal is to execute the exchange (take money from one place and put it somewhere else) with only the ids of the nodes as parameters.
But right now the function does not work: I can execute it in the Playground, but my node is not modified.
Is it possible to apply a transaction without passing the node as a Node (a Node is a Participant)?
It should work. Here's an example (using a fictitious sample 'Trader' network; like you, I have defined 'qty' as a Double in the transaction model definition itself) of updating an asset by a specific identifier (you're doing something similar: a Participant by ID) and then updating the asset's quantity using the promise chain below. I suggest using console.log() for output too when debugging.
So - given Transaction model:
transaction TraderById {
    o String tradeId
    o String tradingSymbol
    o Double qty
}
and an Asset modeled as:
asset Commodity identified by tradingSymbol {
    o String tradingSymbol
    o String description
    o String mainExchange
    o Double quantity
    --> Trader owner
}
you can update the Asset quantity ('quantity') as follows:
/**
 *
 * @param {org.acme.trading.TraderById} tradeById - the trade to be processed
 * @transaction
 */
function TradeById(tradeById) {
    var commodityRegistry;
    return getAssetRegistry('org.acme.trading.Commodity')
        .then(function(registry) {
            commodityRegistry = registry;
            return commodityRegistry.get(tradeById.tradingSymbol);
        })
        .then(function(result) {
            result.quantity -= tradeById.qty;
            return commodityRegistry.update(result);
        });
}
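Applied to the question's Exchange transaction, the same promise chain would look roughly like this (a sketch only; the field names are taken from the question, and the key point is returning each promise in the chain):
function Exchange(exchange) {
    var nodeRegistry;
    return getParticipantRegistry('org.acme.mynetwork.Node')
        .then(function(registry) {
            nodeRegistry = registry;
            return nodeRegistry.get(exchange.nodeIdFrom);
        })
        .then(function(node) {
            node.Need = node.Need + exchange.quantity;
            node.Balance_account = node.Balance_account + exchange.quantity * exchange.price;
            return nodeRegistry.update(node);
        });
}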

How to set conditions/filters on references when using Load* methods

I have two tables: Customer and Orders. The Customer has a reference to Orders like this:
[Reference]
public List<Order> Orders { get; set; }
The Order class has an attribute Deleted. I'd like to load all Customers (or a subset), and include the Orders, but not the ones with Deleted=true. Can this be done with LoadSelect methods, or what is the recommended way?
Something that would roughly equal the following SQL:
select * from Customers C
join Orders O on O.CustomerId = C.Id
where O.Deleted = False
Is this the best way?
var orderIds = resp.Customers.Select(q => q.Id).ToList();
var allOrders = Db.Select<Order>(o => orderIds.Contains(o.CustomerId) && !o.Deleted);
foreach (var order in allOrders)
{
    var cust = resp.Customers.First(q => q.Id == order.CustomerId);
    if (cust.Orders == null) cust.Orders = new List<Order>();
    cust.Orders.Add(order);
}
I've just added a new Merge API in this commit that automatically joins a parent collection with its child references, which will make this a little easier.
With the new API you can select customers and orders separately and merge the collections together, e.g:
//Select only Customers which have valid orders
var customers = db.Select<Customer>(q =>
    q.Join<Order>()
     .Where<Order>(o => o.Deleted == false)
     .SelectDistinct());

//Select valid orders
var orders = db.Select<Order>(o => o.Deleted == false);

customers.Merge(orders); //merge the results together
customers.PrintDump();   //print the results of the merged collection
This change is available from v4.0.39+, which is now available on MyGet.
