I need a legacy Java application to pull information from a Meteor collection.
Ideally, I need a simple service from which my app can download the latest list of item prices. For example, an HTTP GET to:
www.mystore.com/listOfPrices
would return a json with an array
[{"item":"beer", price:"2.50"}, {"item":"water":, price:"1"}]
The problem is that I cannot make a Meteor page print the result as-is, because Meteor assumes the client supports JavaScript. Note that I do plan to implement the Java DDP client at a later stage, but here I would like to start with a very simple service.
Idea: I thought of running my own Node.js service alongside the Meteor service in order to retrieve a snapshot of the collection. This service would use a server-side JavaScript DDP client to subscribe and filter, then return the collection, once loaded, as a JSON document (array).
Any idea on how to achieve this?
Looks like you want to provide a REST interface. See the MeteorPedia page on REST for how to expose collection data. It might be as simple as
var prices = new Mongo.Collection('prices');

// Add access points for `GET`, `POST`, `PUT`, `DELETE`
HTTP.publish({collection: prices}, function (data) {
    // here you have access to this.userId, this.query, this.params
    return prices.find({});
});
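Once the collection is published over plain HTTP, any client that can issue a GET and parse JSON (including a legacy Java app) can consume it without running JavaScript. As a minimal sketch of the consuming side, here is a hypothetical helper that turns the JSON body from the question into a name-to-price lookup (the field names `item` and `price` follow the example response above):

```javascript
// Hypothetical consumer-side helper: given the JSON body the endpoint
// returns, build an item-name -> numeric-price lookup table.
function priceTable(jsonBody) {
  const table = {};
  for (const entry of JSON.parse(jsonBody)) {
    table[entry.item] = Number(entry.price);
  }
  return table;
}
```

The same parsing logic ports directly to Java with any JSON library once the endpoint is in place.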
I am using the SmartAPI provided by Angel Broking.
I want to make a stock ticker that can display the real-time price of stocks, like this one:
https://www.tickertape.in/screener?utm_source=gads&utm_medium=search&utm_campaign=screener&gclid=Cj0KCQiA8ICOBhDmARIsAEGI6o1xfYgsbvDEB6c2OFTEYRp9e5UDnJxgCyBJJphdKTduZ_EOHCAchpoaAp-WEALw_wcB
I am able to connect to the websocket using the SDK provided in the documentation, but I don't know how to display that data in my HTML page.
Please suggest how to get the JSON data from the Node.js process into the HTML.
The Node.js code is:
let { SmartAPI, WebSocket } = require("smartapi-javascript");

let web_socket = new WebSocket({
    client_code: "P529774",
    feed_token: "0973308957"
});

web_socket.connect()
    .then(() => {
        web_socket.runScript("nse_cm|2885", "cn"); // SCRIPT: nse_cm|2885, mcx_fo|222900 TASK: mw|sfi|dp
        web_socket.runScript("nse_cm|2885", "mw");
        /*setTimeout(function() {
            web_socket.close();
        }, 60000)*/
    });

web_socket.on('tick', receiveTick);

function receiveTick(data) {
    console.log("receiveTick:::::", data);
}
The response I get is similar to this:
[{"ak":"ok","task":"mw","msg":"mw"}]
[{"lo":"1797.55","ts":"ACC-EQ","tp":null,"ltp":"1800.05","ltq":"27","bs":"16","tk":"22","ltt":"31\/08\/2017 11:32:01",
"lcl":null,"tsq":"76435","cng":"-11.15","bp":"1800.00","bq":"510","mc":"34012.01277(Crs)","isdc":"18.77872
(Crs)","name":"sf","tbq":"76497","oi":null,"yh":"1801.25","e":"nse_cm","sp":"1800.90","op":"1814.00","c": "1811.20",
"to":"145093696.35","ut":"31-Aug-2017 11:32:01","h":"1817.55","v":"80391","nc":"- 00.62","ap":"1804.85","yl":"1800.00","ucl":null,"toi":"16654000" }]
The GitHub repo for the SmartAPI Node.js SDK:
https://github.com/angelbroking-github/smartapi-javascript
The API docs:
https://smartapi.angelbroking.com/docs/Introduction
There are many ways; here are two:
Cache the last message + HTTP polling
This is not the most efficient solution, but perhaps the simplest. Each time your receiveTick() callback fires, save the response message in a global object/collection (cache it). Better yet, pre-process the message and cache only the info you actually care about, saving bandwidth on the connection between your frontend HTML and your backend.
Then, add an HTTP endpoint to your backend that serves up the last info relevant to a given ticker. You could use Express.js or some other simple HTTP server library. That way when your frontend calls
http://<backend_host>:<backend_port>/tickers/<ticker>
Your backend will read from the cached data and serve up the needed data.
Create your own websocket and forward the data
This is a better solution, especially if your data provider's API has a fast (sub-second) refresh rate. Create your own websocket server that makes a websocket connection with your frontend. Then, when you get a message from the data provider's websocket, process it however you like (to get it into the format your frontend wants) and forward it to the frontend via your websocket server. This would also be done inside the receiveTick() function.
There are many websocket tools for Node.js. For help with the websocket side, check out https://ably.com/blog/web-app-websockets-nodejs
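The forwarding step can be sketched in a framework-agnostic way. Here, `clients` is a Set that your websocket server's connection handler would populate (with whatever library you pick); the message fields `ts` and `ltp` are taken from the sample response above:

```javascript
// Set of connected frontend sockets; any object with a send() method works.
// Your websocket server's connection handler would add/remove entries here.
const clients = new Set();

// Reshape the provider's raw message into what the frontend wants.
function toFrontendFormat(raw) {
  return JSON.stringify({ ticker: raw.ts, price: raw.ltp });
}

// Call this from receiveTick(): process the message, then fan it out
// to every connected frontend client.
function forwardTick(raw) {
  const msg = toFrontendFormat(raw);
  for (const client of clients) client.send(msg);
  return msg;
}
```

The frontend's websocket `onmessage` handler then parses each message and updates the DOM.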
Also, a quick note: in your question you said "...how to get the json data from nodejs console to html". This suggests you want to write the data to the console and then read it from the console into the HTML. That isn't the way to think about it: the console is one destination and the HTML is another, both originating from the websocket callback.
I have about 20 documents currently in my collection (and I'm planning to add many more, probably into the hundreds). I'm using the MongoDB Node.js driver's forEach() method to iterate through each one and, based on each document's data, hit three different endpoints: two APIs (Walmart and Amazon) and one website scrape (name not relevant). Each document contains the data needed to execute the requests, and I then update the documents with the returned data.
The problem I'm encountering is that the Walmart API and the website scrape stop returning data toward the end of the iteration, or at least my database is not getting updated. My assumption is that the forEach method fires off a bunch of simultaneous requests, and either I'm bumping up against some limit on simultaneous requests allowed by the endpoint, or the endpoints simply can't handle this many requests and ignore anything beyond their capacity. I've run some of the documents that were not updating through the same code, but in a different collection containing just a single document, and they did update, so I don't think it's bad data inside the documents.
I'm running this on Heroku (and locally for testing) using Node.js. Results are similar both on the Heroku instance and locally.
If my assumption is correct, I need a better way to structure this so that there is some separation between requests, or so that only x records are processed in a single pass.
It sounds like you need to throttle your outgoing web requests. There's a fantastic node module for doing this called limiter. The code looks like this:
var RateLimiter = require('limiter').RateLimiter;
var limiter = new RateLimiter(1, 1000); // at most 1 action per second

var throttledRequest = function() {
    limiter.removeTokens(1, function() {
        console.log('Only prints once per second');
    });
};

throttledRequest();
throttledRequest();
throttledRequest();
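If you would rather avoid a dependency, a plain sequential loop with a delay between outgoing requests gives the same separation. This is a sketch; `processDoc` stands in for your per-document API calls and database update (an assumption, since that code isn't shown):

```javascript
// Resolve after ms milliseconds; used to space out requests.
function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

// Process documents strictly one at a time, waiting gapMs between each,
// so only one request is ever in flight against the endpoints.
async function processSequentially(docs, processDoc, gapMs) {
  const results = [];
  for (const doc of docs) {
    results.push(await processDoc(doc));
    await delay(gapMs);
  }
  return results;
}
```

You would fetch the documents into an array first (e.g. with `collection.find().toArray()`) rather than firing requests from inside forEach.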
I'm trying to decide between two methods for inserting a new document into a collection from the client using Meteor.js: calling a server method, or using the db API directly.
So, I can either access the db api directly on the client:
MyCollection.insert(doc)
Or, I can create a new Server Method (under the /server dir):
Meteor.methods({
    createNew: function(doc) {
        check(doc, etc);
        var id = MyCollection.insert(doc);
        return id;
    }
});
And then call it from the client like this:
Meteor.call('createNew', doc, function(error, result) {
    // Carry on
});
Both work, but as far as I can tell from testing, I only benefit from latency compensation (the local cache updating and showing on screen before the server responds) if I hit the db API directly, not if I use a server method, so my preference is for doing things that way. But I also get the impression the most secure approach is to use a method on the server (mainly because Emily Stark gave it as an example in her video here). Yet the db API is available on the client no matter what, so why would a server method be better?
I've seen both approaches taken when reading source code elsewhere so I'm stumped.
Note. In both cases I have suitable Allow/Deny rules in place:
MyCollection.allow({
    insert: function(userId, doc) {
        return isAllowedTo.createDoc(userId, doc);
    },
    update: function(userId, doc) {
        return isAllowedTo.editDoc(userId, doc);
    },
    remove: function(userId, doc) {
        return isAllowedTo.removeDoc(userId, doc);
    }
});
In short: Which is recommended and why?
The problem was that I had the method declarations under the /server folder, so they were not available to the client, which broke latency compensation (the client creates stubs of these methods to simulate the action, but in my case it could not because it couldn't see them). After moving them out of this folder, I am able to use server methods in a clean, safe, and latency-compensated manner (even with all my allow/deny rules set to false; they do nothing and only apply to direct db API access from the client, not the server).
In short: don't use the db API on the client or allow/deny rules on the server. Forget they ever existed and just write server methods, make sure they're accessible to both client and server, and use those for CRUD instead.
I'm still finding my feet with Node.js / Express.js. I would like to pass the result of one REST web service to another, or create a new routed service from two web service functions:
function A connects to mysql and builds a json object
e.g.
url:port/getMySQLRecord/:recordid
function B add a new document (JSON object) to a mongoDB collection
it accepts an AJAX POST
e.g.
url:port/insertMongoDoc
Functions A and B both currently work as REST web services. How can I best pipe the result of A to B?
It seems inefficient for the HTTP client to call A and pass the results to B: using 2x the bandwidth when the server already has the object data doesn't seem like the best option.
If this were *nix, I'd be using | ...
// pseudocode: loadRecord middleware just queries MySQL for req.params.recordid
//   and stores the result as req.body, then calls next()
// storeReqDotBodyInMongo just takes req.body and does a mongo insert, then calls next()
// sendResponse is just res.send(req.body)

app.get('/getMySQLRecord/:recordid', loadRecord, sendResponse);
app.post('/insertMongoDoc', express.bodyParser(), storeReqDotBodyInMongo, sendResponse);
app.get('/getMySQLAndInsertMongo/:record', loadRecord, storeReqDotBodyInMongo, sendResponse);
Note the similarity of connect middleware to unix pipes. Just instead of stdio they use req/res/next.
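To make the pipe analogy concrete, here is a dependency-free sketch of the mechanism connect/Express uses: each stage reads and writes a shared req/res pair, then calls next(). The stage bodies are stubs (the real ones would hit MySQL and Mongo), mirroring the hypothetical middleware names above:

```javascript
// Run stages in order; each stage receives (req, res, next) and decides
// whether to pass control on, exactly like connect middleware.
function runPipeline(stages, req, res) {
  let i = 0;
  (function next() {
    const stage = stages[i++];
    if (stage) stage(req, res, next);
  })();
}

// Stubbed stages: loadRecord would query MySQL, storeReqDotBodyInMongo
// would insert into Mongo; here they just move data through the pipe.
const loadRecord = (req, res, next) => {
  req.body = { id: req.params.recordid };
  next();
};
const storeReqDotBodyInMongo = (req, res, next) => {
  res.stored = req.body;
  next();
};
const sendResponse = (req, res) => {
  res.sent = req.body;
};
```

Composing `loadRecord` and `storeReqDotBodyInMongo` on one route is the server-side equivalent of `A | B`: the data never leaves the server between stages.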
I asked a question a few months ago, to which Meteor seems to have the answer.
Which, if any, of the NoSQL databases can provide stream of *changes* to a query result set?
How does Meteor receive updates to the results of a MongoDB query?
Thanks,
Chris.
You want query.observe() for this. Say you have a Posts collection with a tags field, and you want to get notified when a post with the important tag is added.
http://docs.meteor.com/#observe
// collection of posts that includes array of tags
var Posts = new Meteor.Collection('posts');

// DB cursor to find all posts with 'important' in the tags array.
var cursor = Posts.find({tags: 'important'});

// watch the cursor for changes
var handle = cursor.observe({
    added: function (post) { /* ... */ },   // run when a post is added
    changed: function (post) { /* ... */ }, // run when a post is changed
    removed: function (post) { /* ... */ }  // run when a post is removed
});
You can run this code on the client if you want to do something in each browser when a post changes. Or you can run it on the server if you want to, say, send an email to the team when an important post is added.
Note that added and removed refer to the query, not the document. If you have an existing post document and run
Posts.update(my_post_id, {$addToSet: {tags: 'important'}});
this will trigger the 'added' callback, since the post is getting added to the query result.
Currently, Meteor works really well with one instance/process. In that case all queries go through that instance, and it can broadcast changes back to the other clients. Additionally, it polls MongoDB every 10s for changes made to the database by outside queries. There are plans for 1.0 to improve scalability and hopefully allow multiple instances to inform each other about changes.
DerbyJS, on the other hand, uses Redis PubSub.
From the docs:
On the server, a collection with that name is created on a backend Mongo server. When you call methods on that collection on the server, they translate directly into normal Mongo operations.
On the client, a Minimongo instance is created. Minimongo is essentially an in-memory, non-persistent implementation of Mongo in pure JavaScript. It serves as a local cache that stores just the subset of the database that this client is working with. Queries on the client (find) are served directly out of this cache, without talking to the server.
When you write to the database on the client (insert, update, remove), the command is executed immediately on the client, and, simultaneously, it's shipped up to the server and executed there too. The livedata package is responsible for this.
That explains client-to-server.
Server-to-client is, from what I can gather, handled by the livedata and mongo-livedata packages.
https://github.com/meteor/meteor/tree/master/packages/mongo-livedata
https://github.com/meteor/meteor/tree/master/packages/livedata
Hope that helps.