I want to perform actions on a push received from the server. I have a custom service worker with an event listener for push. Now I want to dispatch Redux actions when I receive the push from the server. My custom service worker lives inside the public folder at the moment, and I am unable to import the store inside this file. Any help would be highly appreciated. Thanks!
It is not possible to dispatch Redux actions inside your Service Worker code.
However, what you actually want to do is communicate using the postMessage API. Using postMessage, you can send the browser JS context a message when the SW receives a push from the server. You can find more info e.g. here: https://developer.mozilla.org/en-US/docs/Web/API/Client/postMessage.
Why is it like this?
This is a consequence of the different execution contexts. Your normal JS code (React, Redux, actions, etc.) runs in the browser context, whereas your Service Worker code runs in the SW context. Those two contexts don't share ANY variables or state. For that reason, you cannot call any functions (e.g. Redux actions) inside your SW code. You need to communicate between the two contexts and then, upon receiving a message in the browser context, call whatever functions you wish.
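As a concrete illustration, here is a minimal sketch of both sides. The action type `PUSH_RECEIVED`, the payload shape, and the imported `store` are assumptions for the example, not part of any API:

```javascript
// service-worker.js (SW context)
// Build a ready-made Redux action from the push payload.
function pushDataToAction(data) {
  return { type: 'PUSH_RECEIVED', payload: data };
}

if (typeof self !== 'undefined' && self.addEventListener) {
  self.addEventListener('push', (event) => {
    const data = event.data ? event.data.json() : {};
    event.waitUntil(
      self.clients.matchAll({ type: 'window' }).then((clients) => {
        // Forward the payload to every open tab via postMessage.
        clients.forEach((client) => client.postMessage(pushDataToAction(data)));
      })
    );
  });
}

// index.js (browser context) — here you CAN import the store and dispatch.
if (typeof navigator !== 'undefined' && navigator.serviceWorker) {
  navigator.serviceWorker.addEventListener('message', (event) => {
    store.dispatch(event.data); // event.data is already an action object
  });
}
```

The browser-side listener is where the two contexts meet: the SW only sends plain serializable data, and the page turns it into a Redux dispatch.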
Setup: typescript 4.9.4, next.js 13.0.6
What I’m trying to do: a messenger app. The messaging system is based on Server-Sent Events (SSE), not WebSockets.
What’s the idea: incoming messages are handled by SSE endpoint: https://github.com/sincerely-manny/messenger.nextjs/blob/main/pages/api/messenger/incoming.ts
outgoing messages are being accepted as POST-requests here: https://github.com/sincerely-manny/messenger.nextjs/blob/main/pages/api/messenger/outgoing.ts
Singleton-class is collecting list of clients/connections and response-objects: https://github.com/sincerely-manny/messenger.nextjs/blob/main/lib/sse/serverSentEvents.ts
Whenever anyone needs to send a message, they grab the instance of the SSE class and call the "send" method.
Front-end part: https://github.com/sincerely-manny/messenger.nextjs/blob/main/app/messenger/page.tsx
Expected behaviour: upon establishing the first connection, an instance of the SSE class is created. Then every call to the send method finds the response object corresponding to the client and puts the message into the stream.
Actual behaviour: upon connecting to the SSE endpoint, instance (1) of the class is created and the client is registered in the list. But (!) sending a message creates another instance (2) of the singleton class with an empty clients list, so the sent message is lost. But (!!) after refreshing the page and creating a new connection, the app picks up this second instance (2), registers the client there, and everything starts working as expected.
The question: how is that possible, and what should I do to avoid this unwanted behaviour?
Update: it turns out the problem only occurs in dev mode, while pages are compiled on the fly. That narrows it down, but doesn't explain why it happens.
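One hedged explanation: in Next.js dev mode, each on-the-fly compilation can get a fresh module registry, so a module-level singleton is re-instantiated per compiled entry point. Stashing the instance on `globalThis` survives those recompiles. A sketch (the class is simplified from the question's code, and the `__sseInstance` key is an arbitrary name):

```javascript
// lib/sse/serverSentEvents — simplified, dev-mode-safe singleton sketch
class SSE {
  constructor() {
    this.clients = new Map(); // clientId -> ServerResponse
  }
  addClient(id, res) {
    this.clients.set(id, res);
  }
  send(id, message) {
    const res = this.clients.get(id);
    // Write one SSE frame to the client's open response stream.
    if (res) res.write(`data: ${JSON.stringify(message)}\n\n`);
  }
}

// In dev mode Next.js may evaluate this module once per compiled page,
// so `new SSE()` at module level yields several "singletons". Keeping
// the instance on globalThis makes every module copy share one object.
function getSSE() {
  if (!globalThis.__sseInstance) {
    globalThis.__sseInstance = new SSE();
  }
  return globalThis.__sseInstance;
}
```

With this, the SSE endpoint and the outgoing-message endpoint both call `getSSE()` and see the same clients list, even when they were compiled separately.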
I am new to the JavaScript world. I need to design a system where a React UI calls a Node server REST API, which internally triggers a Python REST API hosting a long-running (min. 5 minutes) optimization model. We don't intend to make a blocking call; instead we want to notify the React UI using a push notification (observer pattern). Something similar to a Stack Overflow page, which shows how many new questions, answers, or comments have been posted, so the page can then be refreshed and reviewed.
Sifting through RxJS and Socket.IO, what I understood is that we can do the following:
Make an async call using an Observable; on the server, use Socket.IO to subscribe and notify the client, handling the communication between the Node server and the React client; and follow a similar push notification system at the Python server level to notify the Node server. The Node server also executes a DB operation once the optimization job is complete.
Another option is implementing a queuing service between Node and Python, which provides more durability and resilience, processing the data using a pub-sub model.
Though our overall requirements are not enterprise oriented (we have low concurrency, and resilience is not a critical factor), can someone verify whether the above design options are the right considerations, or whether there are simpler options that can help us meet our push notification use case?
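For what it's worth, the first option can stay quite small. A sketch of the Node side (the route, the event names, and the callback contract with Python are all assumptions; requires `express` and `socket.io` to be installed):

```javascript
// Turn the Python callback body into the payload pushed to browsers.
function buildNotification(job) {
  return { event: 'job:done', jobId: job.id, status: job.status };
}

// Not run at import time; call startServer(3000) from your entry point.
function startServer(port) {
  const express = require('express');        // npm i express socket.io
  const { Server } = require('socket.io');

  const app = express();
  app.use(express.json());
  const httpServer = require('http').createServer(app);
  const io = new Server(httpServer);

  // Python POSTs here when the long-running optimization job finishes.
  app.post('/internal/job-complete', (req, res) => {
    io.emit('notification', buildNotification(req.body)); // push to all clients
    res.sendStatus(204);
  });

  httpServer.listen(port);
  return httpServer;
}
```

For low concurrency, a plain HTTP callback from Python to Node like this is usually enough; the queue in the second option mainly buys you durability if the Node process is down when the job finishes.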
Given an event driven micro service architecture with asynchronous messaging, what solutions are there to implementing a 'synchronous' REST API wrapper such that requests to the REST interface wait for a response event to be published before sending a response to the client?
Example: POST /api/articles
Internally this would send a CreateArticleEvent in the services layer, eventually expecting an ArticleCreatedEvent in response containing the ID of the persisted article.
Only then would the REST interface respond to the end client with this ID.
Dealing with multiple simultaneous requests: is keeping an in-memory map of in-flight requests in the REST API layer, keyed by some correlating identifier, conceptually a workable approach?
How can we deal with timing out requests after a certain period?
Generally you don't need to maintain a map of in-flight requests, because this is basically done for you by node.js's http library.
Just use express as it's intended, and this is probably something you never really have to worry about, as long as you avoid any global state.
If you have a weirder pattern in mind to build and are not sure how to solve it, it might help to share a simple example. Chances are that it's not hard to rebuild and avoid global state.
With express, have you tried middleware? You can chain a series of callback functions with a certain timeout after the article is created.
I assume you are in the context of Event Sourcing and microservices? If so, I recommend that you don't publish a CreateArticleEvent to the event store, and instead directly create the article in the database and then publish the ArticleCreatedEvent to the event store.
Why, you ask? Generally this pattern is used to orchestrate different microservices. In the example shown in the link above, it was used to orchestrate how the Customer service should react when an Order is created. Note the past tense: the Order service created the order, and the Customer service reacts to it.
In your case it is easier (and probably better) to just insert the article into the database (by calling the ArticleService directly) and respond with the article ID. Then just publish the ArticleCreatedEvent to your event store to trigger other microservices that may want to listen to it (like, for example, triggering a notification to the editor for review).
Event Sourcing is a good pattern, but we don't need to apply it to everything.
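A sketch of that persist-first flow; `articleService` and `eventStore` stand in for your own persistence layer and event store client:

```javascript
// Persist first, then publish the past-tense event, then respond with the ID.
function makeCreateArticle({ articleService, eventStore }) {
  return async function createArticle(req, res) {
    const article = await articleService.create(req.body);            // write to DB
    await eventStore.publish('ArticleCreatedEvent', { id: article.id }); // notify others
    res.status(201).json({ id: article.id });                          // reply now
  };
}
```

The request no longer waits on any other service: the ID is known as soon as the insert succeeds, and downstream consumers react to ArticleCreatedEvent asynchronously.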
I have a node server, which needs to:
Serve the web pages
Keep querying an external REST API and save data to database and send data to clients for certain updates from REST API.
Task 1 is just a normal node task. But I don't know how to implement task 2. This task won't expose any interface to the outside; it's more like a background task.
Can anybody suggest? Thanks.
To make a second node.js app that runs at the same time as your first one, you can just create another node.js app and then run it from your first one using child_process.spawn(). It can regularly query the external REST API and update the database as needed.
The part about "send data to clients for certain updates from REST API" is not so clear; it's hard to tell exactly what you're trying to do there.
If you're using socket.io to send data to connected browsers, then the browsers have to be connected to your web server which I presume is your first node.js process. To have the second node.js process cause data to be sent through the socket.io connections in the first node.js process, you need some interprocess way to communicate. You can use stdout and stdin via child_process.spawn(), you can use some feature in your database or any of several other IPC methods.
Because querying a REST API and updating a database are both asynchronous operations, they don't take much of the CPU of a node.js process. As such, you don't really have to do these in another node.js process. You could just have a setInterval() in your main node.js process, query the API every once in a while, update the database when results are received and then you can directly access the socket.io connections to send data to clients without having to use a separate process and some sort of IPC mechanism.
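A sketch of that single-process variant; `fetchUpdates`, `saveToDb`, and the 'updates' event name are placeholders for your own API client, database code, and protocol:

```javascript
// Poll the external REST API on an interval, persist the results, and
// push them to every connected socket.io client in the same process.
function startPolling({ fetchUpdates, saveToDb, io, intervalMs = 60000 }) {
  return setInterval(async () => {
    try {
      const updates = await fetchUpdates(); // query the external REST API
      await saveToDb(updates);              // persist to the database
      io.emit('updates', updates);          // push to all connected clients
    } catch (err) {
      console.error('poll failed:', err);   // keep polling despite one failure
    }
  }, intervalMs);
}
```

Because both steps are awaited inside one callback, a slow API call simply delays that tick's emit; the event loop stays free for serving pages in between.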
Task 1:
Express is good way to accomplish this task.
You can explore:
http://expressjs.com/
Task 2:
If you are done with Express.js, then you can write your logic within the Express framework.
This task can then be done with the node module forever. It's a simple tool that runs your background scripts forever. You can use forever to run scripts continuously (whether they are written in node.js or not).
Have a look:
https://github.com/foreverjs/forever
I'm trying to run some server-only code from an event on the client in derby.js.
I'm using x-bind to bind the event on the view like so:
click me
and on the app:
exports.func = function (e, el, next) {
  // I want to run some server code here, but it runs on the client only
};
So:
Can this be done in any way?
If not, is there any way to use sockets in a 'native' way in derby.js?
I simply don't want to fall back to ajax with server routes when all the rest is real time.
You can route the request onto the server via the model (model.fetch() & model.subscribe()). If it's just retrieving some data from the server, you are basically all set. Keep a reference to the model for when you need it (in the app.ready callback, as pointed out by switz).
To use sockets directly or to extend the model (which uses sockets in the background) see
https://groups.google.com/forum/?pli=1#!topic/derbyjs/60gouek7334