Beginner here. I'm using the Firebase Realtime Database and I need my API to continuously return values as new data is added; see my code below.
apiCalls.get('/api/getallusers', function (req, res) {
  userFunc.getAllUsers(function (err, result) {
    if (err) return res.status(500).send('internal server error!');
    res.status(200).write(JSON.stringify(result));
    res.end();
    return res;
  });
});
This returns the error
Error [ERR_STREAM_WRITE_AFTER_END]: write after end
but if I remove res.end() it shows one record and then keeps loading until the page times out.
Is what I'm doing possible, or is there a different way to do it?
I'm also using Firebase Cloud Functions for this API.
UPDATE:
I deployed the API, but it does not return anything...
Here is the link: https://us-central1-testproject-e6819.cloudfunctions.net/api1/api/getUser
I tried axios and EventSource.
Firebase Functions logs the values, but they are never returned to the client.
If you're viewing the API response like a web page, your browser buffers the data it has received until there's enough of it to render. Your browser expects content that ends, not an endless stream of data.
You should remove .end() if you expect to be able to continue to write to the output stream.
Also, I recommend using the Server-Sent Events (SSE) protocol for this. https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events It provides a nice standards-based abstraction that makes it very easy to handle event streams client-side.
const eventSource = new EventSource('https://api.example.com/someApi');
eventSource.addEventListener('userupdate', (e) => {
console.log(e.data);
});
Server-side, there are a couple Express-based middlewares to make this even easier than it already is.
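If you'd rather not pull in a middleware, a hand-rolled SSE endpoint in plain Express is only a few lines. A minimal sketch (the /someApi route and the userupdate event name just mirror the client snippet above; the payload and interval are placeholders):

const express = require('express');
const app = express();

app.get('/someApi', (req, res) => {
  // SSE needs these headers and an open, never-ending response.
  res.set({
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  res.flushHeaders();

  // Wherever your data actually comes from, write one event block per update.
  const timer = setInterval(() => {
    const payload = JSON.stringify({ users: [] }); // placeholder payload
    res.write(`event: userupdate\ndata: ${payload}\n\n`);
  }, 5000);

  // Stop writing when the client disconnects.
  req.on('close', () => clearInterval(timer));
});

app.listen(3000);

Keep in mind this still needs a runtime that allows long-lived responses; as the next answer points out, that rules out Cloud Functions.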
Operations in Cloud Functions must be relatively short-lived and end deterministically. There is no way to keep a connection open from Cloud Functions to the client.
Instead, consider what triggers the need to send new data. For example, if it is triggered by a new user registering, you can trigger your Cloud Function from Firebase Authentication. The function could then write to the Realtime Database (or Cloud Firestore), and your client/app listens to the database for realtime updates (a sketch of this is shown below). That way you're using all the pieces of Firebase the way they're designed: Cloud Functions for short-lived work triggered by events in the system, and the Realtime Database or Cloud Firestore for sending realtime updates.
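A minimal sketch of that pattern, assuming the firebase-functions and firebase-admin SDKs (the /users path is just an illustration):

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Runs every time a user signs up; writes a record the clients can listen to.
exports.onUserCreated = functions.auth.user().onCreate((user) => {
  return admin.database().ref(`/users/${user.uid}`).set({
    email: user.email || null,
    createdAt: admin.database.ServerValue.TIMESTAMP,
  });
});

On the client, a plain Realtime Database listener then gives you the "constantly returning" behavior you were after:

firebase.database().ref('/users').on('child_added', (snapshot) => {
  console.log('new user:', snapshot.val());
});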
If that doesn't work for your use-case, you'll need a runtime environment that allows you to keep processes alive. Something like App Engine flex, Kubernetes, or many other options come to mind for that.
Related
I've been trying out MongoDB as a database for my Flutter project lately, since I want to migrate away from a pure Firebase database (some Firebase limitations are an issue for my project, like the "in-array" limit of 10 for queries).
I already made some CRUD operation methods in some Firebase Cloud Functions, using MongoDB. I'm now able to save data and display it as a Future in a Flutter app (a simple ListView of Users in a FutureBuilder).
My question is: how could I create a StreamBuilder backed by MongoDB and Firebase Cloud Functions? I saw some material about collection.watch() and change streams, but nothing clear enough for me (usually I read a lot of examples or tutorials to understand).
Maybe some of you have some clues, or a tutorial I could read/watch to learn a bit more about the subject?
For now, I have this as an example (a Node.js Cloud Function deployed on Firebase), which obviously produces a Future in my Flutter app (not realtime):
const functions = require('firebase-functions');
const { MongoClient } = require('mongodb');

exports.getUsers = functions.https.onCall(async (data, context) => {
  const uri = "mongodb+srv://....";
  const client = new MongoClient(uri);
  await client.connect();
  // One-shot query: connect, read everything, close, return.
  const results = await client.db("myDB").collection("user").find({}).toArray();
  await client.close();
  return results;
});
What would you advise to obtain a Stream instead of a Future, perhaps using MongoDB's collection.watch() and change streams? An example would be much appreciated!
Thank you very much!
Cloud Functions are meant for short-lived operations, not for long-term listeners. It is not possible to create long-lived connections from Cloud Functions, neither to other services (such as the connection you're trying to keep open to MongoDB here) nor from Cloud Functions back to the calling client.
Also see:
If I implement onSnapshot real-time listener to Firestore in Cloud Function will it cost more?
Can a Firestore query listener "listen" to a cloud function?
the documentation on EventArc, which is the platform that allows you to build custom triggers. It'll be a lot more involved though.
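For reference, in a runtime that does allow long-lived processes (App Engine flex, a container, a plain Node server), a MongoDB change stream looks roughly like this. A minimal sketch, reusing the myDB/user names from the question; note that change streams require a replica set or an Atlas cluster:

const { MongoClient } = require('mongodb');

async function watchUsers() {
  const client = new MongoClient('mongodb+srv://....');
  await client.connect();

  // Emits one event per insert/update/delete on the collection.
  const changeStream = client.db('myDB').collection('user').watch();
  changeStream.on('change', (change) => {
    // change.operationType is 'insert', 'update', 'delete', ...
    console.log('change:', change);
  });
}

watchUsers().catch(console.error);

You would then push those changes to the Flutter app yourself (websocket, SSE, or by writing them into Firestore/RTDB and letting the client listen there), rather than trying to hold the stream open from a Cloud Function.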
I am using the SmartAPI provided by Angel Broking.
I want to make a stock ticker that displays the realtime price of stocks, like this one:
https://www.tickertape.in/screener?utm_source=gads&utm_medium=search&utm_campaign=screener&gclid=Cj0KCQiA8ICOBhDmARIsAEGI6o1xfYgsbvDEB6c2OFTEYRp9e5UDnJxgCyBJJphdKTduZ_EOHCAchpoaAp-WEALw_wcB
I am able to connect to the websocket using the SDK provided in the documentation, but I don't know how to display that data in my HTML page.
Please suggest how to get the JSON data from the Node.js console to HTML.
The Node.js code is:
let { SmartAPI, WebSocket } = require("smartapi-javascript");

let web_socket = new WebSocket({
  client_code: "P529774",
  feed_token: "0973308957"
});

web_socket.connect()
  .then(() => {
    web_socket.runScript("nse_cm|2885", "cn"); // SCRIPT: nse_cm|2885, mcx_fo|222900  TASK: mw|sfi|dp
    web_socket.runScript("nse_cm|2885", "mw");

    /*setTimeout(function() {
        web_socket.close()
    }, 60000)*/
  });

web_socket.on('tick', receiveTick);

function receiveTick(data) {
  console.log("receiveTick:::::", data);
}
The response I get is similar to this:
[{"ak":"ok","task":"mw","msg":"mw"}]
[{"lo":"1797.55","ts":"ACC-EQ","tp":null,"ltp":"1800.05","ltq":"27","bs":"16","tk":"22","ltt":"31\/08\/2017 11:32:01",
"lcl":null,"tsq":"76435","cng":"-11.15","bp":"1800.00","bq":"510","mc":"34012.01277(Crs)","isdc":"18.77872
(Crs)","name":"sf","tbq":"76497","oi":null,"yh":"1801.25","e":"nse_cm","sp":"1800.90","op":"1814.00","c": "1811.20",
"to":"145093696.35","ut":"31-Aug-2017 11:32:01","h":"1817.55","v":"80391","nc":"- 00.62","ap":"1804.85","yl":"1800.00","ucl":null,"toi":"16654000" }]
The GitHub repo for the SmartAPI Node.js SDK:
https://github.com/angelbroking-github/smartapi-javascript
The API Docs
https://smartapi.angelbroking.com/docs/Introduction
There are many ways; here are two:
Cache the last message + HTTP polling
This is not the most efficient solution, but perhaps the simplest. Each time your receiveTick() callback fires, save the response message in a global object/collection (cache it). Better yet, pre-process the message and cache only the info you actually care about, saving bandwidth on the connection between your frontend HTML and your backend.
Then, add an HTTP endpoint to your backend that serves up the last info relevant to a given ticker. You could use Express.js or some other simple HTTP server library. That way when your frontend calls
http://<backend_host>:<backend_port>/tickers/<ticker>
your backend reads from the cached data and serves up what's needed (a sketch of this follows below).
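A minimal sketch of that approach with Express, building on the web_socket connection from your question. The tick field names (ts, ltp, cng, ut) are taken loosely from your sample response, and the port is arbitrary:

const express = require('express');
const app = express();

// Latest tick per symbol, keyed by the "ts" field (e.g. "ACC-EQ").
const lastTicks = {};

function receiveTick(data) {
  // The feed delivers arrays of tick objects; ignore ack messages without "ts".
  for (const tick of data) {
    if (tick.ts) lastTicks[tick.ts] = { ltp: tick.ltp, cng: tick.cng, ut: tick.ut };
  }
}

web_socket.on('tick', receiveTick);

app.get('/tickers/:ticker', (req, res) => {
  const tick = lastTicks[req.params.ticker];
  if (!tick) return res.status(404).send('no data yet');
  res.json(tick);
});

app.listen(3001);

The frontend then just polls http://<backend_host>:<backend_port>/tickers/ACC-EQ every second or so with fetch or axios and updates the page.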
Create your own websocket and forward the data
This is a better solution, especially if your data provider's API has a quick (sub-second) refresh rate. Create your own websocket server that maintains a websocket connection with your frontend. Then, when you get a message from the data provider's websocket, process it however you like (to get it into the format your frontend wants) and forward it to the frontend through your websocket server. This would also be done inside the receiveTick() function.
There are many websocket tools for Node.js. For help with the websocket side of things, check this out: https://ably.com/blog/web-app-websockets-nodejs. A minimal forwarding sketch follows below.
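A sketch of the forwarding approach using the ws package (required as WS here to avoid clashing with the WebSocket name from smartapi-javascript; the port is a placeholder, and web_socket is the SmartAPI connection from your question):

const WS = require('ws');

// Your own websocket server that the browser connects to.
const wss = new WS.Server({ port: 8080 });

function receiveTick(data) {
  // Reshape the provider's message however your frontend wants it,
  // then fan it out to every connected browser.
  const message = JSON.stringify(data);
  for (const client of wss.clients) {
    if (client.readyState === WS.OPEN) client.send(message);
  }
}

web_socket.on('tick', receiveTick);

On the page, the browser side is just new WebSocket('ws://localhost:8080') plus an onmessage handler that updates the DOM.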
Also, a quick note: in your question you said "...how to get the JSON data from the Node.js console to HTML". This suggests you want to write the data to the console and then read it from the console into the HTML. That isn't the way to think about it: the console is one destination and the HTML is another, both fed from the websocket callback.
I have a Firebase Cloud Function that is an endpoint for an external API, and it handles POST requests.
This external API POSTs data to my Cloud Function endpoint at random intervals (the function gets pinged with a POST request whenever a result is returned from the external API; there can be multiple at once, and it's unpredictable).
exports.handleResults = functions.https.onRequest((req, res) => {
  if (req.method === 'POST') {
    // run code here that handles the POST payload
  }
});
What happens when there is more than one POST request that come in at the same time?
Is there a queue? Does it finish the first request before moving on to the next?
Or if another request comes in while the function is running, does it block/ignore the request until the function is done?
Cloud Functions will automatically scale up the server instances running your functions when it determines that more capacity is needed. Those instances will run your function concurrently. The instances will be scaled down when they are no longer needed. The exact behavior is not documented - it should be considered an implementation detail that may change over time.
To learn more about this, watch my video about Cloud Functions scaling and isolation.
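One way to see this behavior for yourself: a small sketch that tags each response with an instance-level counter (the variable names are just for illustration). Because concurrent requests may land on separate instances, each with its own module-level state, two simultaneous POSTs can both report a count of 1:

const functions = require('firebase-functions');

// Module-level state lives only as long as this particular instance does.
let requestsHandledByThisInstance = 0;

exports.handleResults = functions.https.onRequest((req, res) => {
  if (req.method !== 'POST') return res.status(405).send('POST only');

  requestsHandledByThisInstance += 1;
  // Compare this value across concurrent requests to observe instance isolation.
  res.json({ handledByThisInstance: requestsHandledByThisInstance });
});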
We are trying to migrate our zip microservice from a regular Node.js Express application to AWS API Gateway integrated with AWS Lambda.
Our current application sends a request to our API, gets a list of attachments, then visits those attachments and pipes their content back to the user in the form of a zip archive. It looks something like this:
module.exports = function requestHandler(req, res) {
  //...
  //irrelevant code
  //...
  return getFileList(params, token).then(function (fileList) {
    const filename = `attachments_${params.id}`;
    res.set('Content-Disposition', `attachment; filename=${filename}.zip`);
    streamFiles(fileList, filename).pipe(res); // <-- here the magic happens
  }, function (error) {
    errors[error](req, res);
  });
};
I have managed to do everything except the part where I have to stream content out of the Lambda function.
I think one possible solution is to use aws-serverless-express, but I'd like a more elegant solution.
Does anyone have any ideas? Is it even possible to stream out of Lambda?
Unfortunately, Lambda does not support streams as events or return values. (It's hard to find this stated explicitly in the documentation, except by noting how invocations and contexts/callbacks are described there.)
In the case of your example, you will have to await streamFiles and then return the completed result (a sketch of this follows below).
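A minimal sketch of what that could look like in a Lambda handler behind API Gateway (proxy integration), assuming streamFiles returns a readable stream as in your snippet; extractParams and token are stand-ins for whatever your current handler does. The zip is buffered in memory and returned base64-encoded, which only works while the archives stay within Lambda's memory and API Gateway's payload limits:

exports.handler = async (event) => {
  const params = extractParams(event); // however you derive params/token today
  const fileList = await getFileList(params, token);
  const filename = `attachments_${params.id}`;

  // Drain the stream into a buffer instead of piping it to a response.
  const chunks = [];
  for await (const chunk of streamFiles(fileList, filename)) {
    chunks.push(chunk);
  }

  return {
    statusCode: 200,
    headers: {
      'Content-Type': 'application/zip',
      'Content-Disposition': `attachment; filename=${filename}.zip`,
    },
    body: Buffer.concat(chunks).toString('base64'),
    isBase64Encoded: true,
  };
};

You'd also need to enable binary media types on the API Gateway side so the base64 body is decoded before it reaches the client.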
(aws-serverless-express would not help here, if you check the code they wait for your pipe to finish before returning: https://github.com/awslabs/aws-serverless-express/blob/master/src/index.js#L68)
n.b. There's a nuance here that a lot of the language SDK's support streaming for requests/responses, however this means connecting to the stream transport, e.g. the stream downloading the complete response from the lambda, not listening to a stream emitted from the lambda.
Had the same issue; not sure how you can do stream/pipe via native Lambda + API Gateway directly... but it's technically possible.
We used the Serverless Framework and were able to use XX.pipe(res) with this starter kit (https://github.com/serverless/examples/tree/v3/aws-node-express-dynamodb-api).
What's interesting is that this just wraps native Lambda + API Gateway, so technically it is possible, as they have done it.
Good luck
I'm working on a Google Cloud project that involves moving data from Pub/Sub, splitting the packet into several pieces, and submitting each individual piece to Datastore. Datastore and the splitting parts work just fine, but when it comes to getting the message from Pub/Sub, the page just keeps loading with no result. I've let it run for a while now and the request does not time out.
I tried wrapping the subscription.on inside a setInterval call so that I could switch it off and on over a span of time, but this does not prevent the app from stalling.
Could this be happening because the subscription.on listener runs continuously, never exiting and never letting Express (the framework generating the App Engine page) render the page? If so, how can I avert this? What other approach could I take to successfully get data from Pub/Sub?
I'm willing to provide more details if necessary. Thanks in advance.
EDIT: Including the Pub/Sub call
var psMessage = '';

subscription.on('message', function (message) {
  psMessage = message;
  message.ack();
});