In my Sails project, I have a User model/controller and a Request model/controller, as well as a Dashboard controller. A user can make a request for data using RequestController.create, and an administrator can approve it using RequestController.grant.
What I want to do is to notify a user whenever one of his/her requests is approved (updated). In RequestController.grant, I call Request.publishUpdate(...), and in my DashboardController.display, I call
Request.find(req.session.user.id, function(err, requests) {
...
Request.subscribe(req, requests, ['update'])
...
});
Then, in the view /dashboard/display, I put in <script> tags:
<script>
// Socket handling for notifications
io.socket.on("request", function(obj) {
alert(obj.verb);
alert(obj.data);
alert(obj.previous);
});
</script>
However, upon approving a user's request and going to the dashboard, no alerts show up. The script for sails.io is already loaded, with no errors in the console. Am I doing something wrong?
The likely problem here is that you're calling Request.subscribe in an HTTP request. I'm assuming that's the case since you're probably using DashboardController.display to render a view. In that case, Request.subscribe does nothing at all (it really should log a warning) because it can't possibly know which socket to subscribe!
If you'd like to keep your controller logic the same (you might be using the requests array as a view local to bootstrap some data on the page), that's fine; a quick refactor is to test whether or not it's a socket call in the action:
// Also note the criteria for the `find` query below--just sending an
// ID won't work for .find()!
Request.find({user: req.session.user.id}, function(err, requests) {
  ...
  // If it's a socket request, subscribe and return success
  if (req.isSocket) {
    Request.subscribe(req, requests, ['update']);
    return res.send(200);
  }
  // Otherwise continue on displaying the view
  else {
    ...
    return res.view('dashboard', {requests: requests});
  }
});
Then somewhere in your view call io.socket.get('/dashboard/display') to subscribe to request instances.
You could also use blueprints to subscribe to the requests from the front end, doing something like io.socket.get('/request?user='+userId), if you put the user ID in the locals and add it somewhere in your view template, e.g. <script>var userId=<%=user.id%></script>.
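For the front-end side, a minimal sketch (assuming the sails.io.js client is loaded on the page; the handler is factored out as a plain function so the notification logic is easy to follow):

```javascript
// Turn a Sails comet message for the `request` model into a display string.
// The message shape (`verb`, `id`, `data`, `previous`) follows what
// publishUpdate sends; treat it as an assumption if your Sails version differs.
function handleRequestEvent(obj) {
  if (obj.verb === 'updated') {
    return 'Request ' + obj.id + ' was updated';
  }
  return null; // ignore creates/destroys in this sketch
}

// In the browser, after the page loads:
// io.socket.get('/dashboard/display');   // runs the action above over the socket
// io.socket.on('request', function(obj) {
//   var msg = handleRequestEvent(obj);
//   if (msg) alert(msg);
// });
```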
What I want to do
I'm trying to intercept a third party website's fetch events and modify its request body in a Chrome extension. Modifying the request body is not allowed by the chrome.webRequest.onBeforeRequest event handler. But it looks like regular service workers do have the ability to listen for and modify fetch events and manually respond to the request using my own response, which means I should be able to intercept the request, modify the body, send the modified request to the API, and then respond to the original request with my modified request's response.
The problem
It looks like neither of these event handlers ever get triggered, despite plenty of fetch events being triggered by the website, as I can see in the Network panel.
// background.js
self.onfetch = (event) => console.log(event.request); // never shows

// or
self.addEventListener("fetch", (event) => {
  console.log(event.request); // never shows
});
I can verify that the service worker is running by seeing other console.logs appear in the service worker console, both top-level logs and logs triggered by the "install" event:
// background.js
console.log(self); // works

self.addEventListener("install", (event) => {
  console.log(event); // works
});
Hypothesis
Do the fetch event handlers not get triggered because extension service workers are not allowed access to these for security reasons? That would make sense, I just haven't seen this documented anywhere explicitly so it would be good to know if this is indeed a platform limitation or if I'm doing something wrong.
Alternate solutions?
If this is indeed a limitation of the extensions platform, is there any way other way I can use a Chrome extension to modify request bodies on a third party website?
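One workaround that is often used when the webRequest API can't change the body: inject a script into the page's MAIN world from a content script and wrap window.fetch. This is a sketch of the wrapper only (the injection mechanics and the rewriteBody function are up to you); here baseFetch stands in for the original window.fetch so the wrapper can be exercised outside a browser:

```javascript
// Wrap fetch so string request bodies can be rewritten before they are sent.
// In an extension this would run in a script injected into the page's MAIN
// world, followed by: window.fetch = wrapFetch(window.fetch.bind(window), fn);
function wrapFetch(baseFetch, rewriteBody) {
  return function fetchWithRewrite(input, init) {
    if (init && typeof init.body === "string") {
      // Copy init rather than mutating the caller's object.
      init = Object.assign({}, init, { body: rewriteBody(init.body) });
    }
    return baseFetch(input, init);
  };
}
```

Note this only covers calls that go through window.fetch with a string body; XHR traffic and streamed bodies would need separate handling.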
I'd like to know how does NodeJS process multiple GET requests from different users/browsers which have event emitted to return the results? I'd like to think of it as each time a user executes the GET request, it's as if a new session is started for that user.
For example if I have this GET request
var tester = require('./tester-class');

app.get('/triggerEv', async function(req, res, next) {
  // Start the data processing
  tester.startProcessing('some-data');
  // tester has event emitters that are triggered when processing is complete (success or fail)
  tester.on('success', function(data) {
    return res.send('success');
  });
  tester.on('fail', function(data) {
    return res.send('fail');
  });
});
What I'm thinking is: if I open a browser and run this GET request, passing some-data, the processing starts. If I then open another browser and execute this GET request with different data (to simulate multiple users accessing it at the same time), it will overwrite the previous startProcessing call and rerun it with the new data.
So if multiple users execute this GET request at the same time, will it handle each user separately, as if each had a different and independent session, and return a response for each user's session? Or will it behave as I described above (in which case I would have to somehow manage different sessions for each user that triggers this GET request)?
I want to make it so that each user that executes this GET request doesn't interfere with other users that also execute this GET request at the same time and the correct response is returned for each user based on their own data sent to the startProcessing function.
Thanks, I hope I'm making sense. Will clarify if not.
If you're sharing the global tester object among different requests, then the 2nd request will interfere with the first request. Since all incoming requests share the same global environment in node.js, the usual model is that any request that may be "in-flight" for a while needs to create its own resources and keep them for itself. Then, if some other request arrives while the first one is still waiting for something to complete, it will also create its own resources and the two will not conflict.
The server environment does not have a concept of "sessions" in the way you're using the term. There is no separate server-session or server state that each request lives in other than the request and response objects that are created for each incoming request. This is not like PHP - there is not a whole new interpreter state for each request.
I want to make it so that each user that executes this GET request doesn't interfere with other users that also execute this GET request at the same time and the correct response is returned for each user based on their own data sent to the startProcessing function.
Then, don't share any resources between requests and don't use any objects that have global state. I don't know what your tester is, but one way to keep multiple requests separate from each other is to just make a new tester object for each request so they can each use it to their heart's content without any conflict.
To implement a password reset request in loopback (send an email to the user with a reset link), we need to handle the resetPasswordRequest event.
This is a possible implementation below
Client.on('resetPasswordRequest', function(info) {
  var options = {
    type: 'email',
    to: info.email,
    from: '....',
    ...
  };
  Client.email.send(options, function(err, res) {
    if (err) console.log(err);
  });
});
With this approach, if an error occurs it is simply logged to the console. Throwing an error that won't be handled doesn't feel like a better solution either.
Why don't the docs mention using an afterRemote hook to add this logic, or even creating a new custom endpoint? Both solutions seem better at handling errors.
I think your code is based on the example application, isn't it? If so, this approach was chosen by the developer of the example application, but it is not a required implementation. You may use any other appropriate solution, including the ones you mention in your question.
As for emitting the event, it has an advantage: you emit the event and immediately send the response to the request, so the client app doesn't have to wait while the email is sent, which can take from seconds to tens of seconds.
You could also implement an email-sending log and poll it with another request while the user is waiting for the password reset email, thus notifying them if an error occurs.
On the other hand, this is only an example, not a required implementation for production use.
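If you stick with the event-based approach, one way to make failures observable without blocking the response is to record them somewhere another endpoint can read. This is a sketch, not LoopBack API: sendEmail stands in for Client.email.send, and failures for a real persistent log:

```javascript
// Record failed sends instead of only console.log-ing them, so an admin
// endpoint (or a retry job) can surface them later.
var failures = [];

function sendResetEmail(sendEmail, info, done) {
  var options = { type: 'email', to: info.email, from: 'noreply@example.com' };
  sendEmail(options, function(err) {
    if (err) {
      // Keep enough context to diagnose or retry later.
      failures.push({ to: info.email, error: err.message, at: Date.now() });
    }
    if (done) done(err);
  });
}
```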
I'm building a node.js & express app that connects to an IoT device over TCP. On the index page of the app I am rendering the page and running a function that starts to ping the device. Eventually the device responds, I open a TCP socket, and I use socket.io to emit an event to the front end. This process takes much longer than the time to render the page.
When I refresh the page, I do not want to re-ping the device. I need to "save" the state of the connection. Knowing that the device is already connected, I should not need to re-run my connection function.
Possible solutions. My thoughts:
1. Boolean variable for TCP socket status. In the node.js net documentation I do not see a variable for socket connection status. Another stackoverflow answer said ._connected is undocumented and could work, but 'this is not the node.js way'.
2. Sessions. I could save device state in a session and keep track of it on re-load. However, based on my reading I can't save the session information after res.render is called. I specifically want to save the connection status after reload.
3. Use a local variable. However, this is 'reset' on page load.
4. Save state in a JSON file. Use a separate deviceState.js file with state information. I could export that file and use it as a required module in my index page.
My question is - how can I save the state of the device connection even when the page is reloaded? My hunch is there is some combo of session and local variable but I am not sure how these could work based on my points above.
Here's a simplified version of the index route. Let me know if it is missing anything that would help solve this problem:
router.get('/', function(req, res, next) {
  function connectToDevice() {
    // ping device and open TCP socket...
    // eventually the following function is called as an event listener to
    // a net socket.on('connect')...
    function onConnect(socket) {
      res.io.emit('machine-online');
    }
  }
  connectToDevice();
  res.render('index', {
    title: 'Page title'
  });
});
This is my first time posting on stackoverflow. I am still learning the relevant key words and have been unable to find a solution to this problem.
The way I solved this is #4, save state in external JSON.
deviceStatus.js: File at the root of the app structure that holds some information in JSON object.
var status = {};

var deviceStatus = function() {
  status = {
    "isOnline": false
  };
  return status;
};

module.exports = deviceStatus();
Then in my index.js: Require the deviceStatus module.
var status = require('../deviceStatus');
And I am using this to render the page: in the (not shown) connectToDevice() function, I set status.isOnline = true. So here, if the device is offline, I connect and then render the page; if not, I only render the page without connecting.
if (status.isOnline == false) {
  connectToDevice();
}
res.render('index', {
  title: 'Page title',
  machineOnline: status.isOnline
});
There might be a better way to do this, but this is the method that works for me. When the app re-loads the status.isOnline starts as false, which works since it is not connected yet.
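This works because Node caches modules: every require('../deviceStatus') returns the same object from the module cache, so a flag flipped during one request is visible to the next. A minimal sketch of the pattern (the local object below stands in for the required module):

```javascript
// deviceStatus.js effectively exports one shared state object; every
// require() of it returns this same object from the module cache.
var status = { isOnline: false }; // stands in for require('../deviceStatus')

// Called from the (not shown) connect logic once the TCP socket is up.
function markOnline() { status.isOnline = true; }

// Route logic: only connect when the shared flag says we're offline.
function shouldConnect() { return status.isOnline === false; }
```

Note this state lives in a single Node process; it resets when the server restarts and isn't shared across multiple processes.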
I'm using SocketStream to send data to the logged-in user with the following code:
var ss = require('socketstream');
....
....
ss.api.publish.user('userId', content);
but I'm getting an error saying that ss.api.publish is undefined.
Where am I going wrong? Please advise.
The API for publishing to a user is:
Sending to Users
Once a user has been authenticated (which basically means their session now includes a value for req.session.userId), you can message the user directly by passing the userId (or an array of IDs) to the first argument of ss.publish.user as so:
// in a /server/rpc file
ss.publish.user('fred', 'specialOffer', 'Here is a special offer just for you!');
Important: When a user signs out of your app, you should call req.session.setUserId(null, cb) to prevent the browser from receiving future events addressed to that userId. Note: This command only affects the current session. If the user is logged in via other devices/sessions these will be unaffected.
The above is taken from the original document describing the socketstream pub/sub api.
As you can see, you need to supply one more argument than you thought. That is because, on the client side, you need to subscribe to a message channel in order to get the message. In the above example, you need to do this in your client-side code:
ss.event.on('specialOffer', function(message){
alert(message);
});