In an express/mongoose server application, I want to receive a request, then change state to one which can receive another type of request, and when the second request completes successfully I want to respond to the first one.
The way I've implemented this is to use a placeholder document in MongoDB, and to register a mongoose post-save hook upon receiving the first request. This async middleware is a closure holding a reference to the first request's response object.
The second request modifies this placeholder document with new information from another remote client. Upon saving this, the post-save hook gets run, which determines whether this is the correct document and validates the change with respect to the first request. If that passes, the first response is sent. Otherwise, the hook continues waiting for the correct change, checking every save that happens to that schema.
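The pattern described above can be sketched roughly as follows. Everything here is hypothetical naming, not the asker's actual code: `registerResponder`, `validateChange`, and the response shape are stand-ins.

```javascript
// Sketch: a post-save hook that closes over the first request's
// response object and answers it when the awaited change arrives.
function registerResponder(schema, placeholderId, firstRes, validateChange) {
  schema.post('save', function respondWhenReady(doc) {
    // Ignore saves to unrelated documents of the same schema.
    if (String(doc._id) !== String(placeholderId)) return;
    // Validate the change with respect to the first request.
    if (!validateChange(doc)) return; // keep waiting
    // The awaited change arrived: answer the first request.
    firstRes.send({ ok: true, doc: doc });
  });
}
```

As the question notes, the hook registered this way keeps firing for every subsequent save on the schema, even after the response has been sent.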
My problem is that even after the correct and accepted changes happen and the response is returned to the first client, the (stale) post-save hook still remains. Now, it does return instantly upon seeing that the response has already been sent, but it bothers me that it still exists and gets called for all saves.
This is an application that's meant to run with an anticipated 1k-10k such requests over its lifetime. So unless the application is periodically restarted, we might see a significant slowdown from all the post-save hooks getting called.
Now, onto the questions:
Is there a better/easier/straightforward architecture to solve this problem?
If not, should I be worried about all the stale post-save hooks for this use case?
If so, how do I delete a freaking hook?
This is a far more infuriating issue than usual for this sort of thing, because mongoose has 'remove' hooks: every search for removing a hook turns up results about those instead of about disabling/deleting/unhooking middleware functions. Nothing in the docs either.
The best I can come up with is to use a single-argument middleware function, and then overwrite that function with {} or undefined (or another closure function if we encounter another request-type-1). Is this the only solution? With this, I lose the ability to make and retain responses of multiple request-type-1s.
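That overwrite workaround needs one level of indirection, because mongoose keeps the function reference it was given at registration time; reassigning your variable afterwards doesn't unregister anything. A minimal sketch (all names hypothetical):

```javascript
// Register a thin wrapper once; the real work lives in a reassignable
// variable. "Deleting" the hook is then just nulling that variable.
let pendingHandler = null;

function installWrapper(schema) {
  schema.post('save', function (doc) {
    // When pendingHandler is null the hook is effectively disabled,
    // though the wrapper itself is still called on every save.
    if (pendingHandler) pendingHandler(doc);
  });
}

// On a request-type-1: arm the handler with the response to answer.
function arm(res) {
  pendingHandler = function (doc) {
    res.send(doc);
    pendingHandler = null; // disarm after responding
  };
}
```

As noted above, this only retains one pending request-type-1 at a time; a second `arm` call overwrites the first.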
I found two methods of doing this, one from this answer, which deletes specific entries in the schema's call queue.
EDIT: The following doesn't work with Mongoose as of 5.1.5 - removePost isn't defined.
The other (better) one was found when perusing the codebase used to implement hooks.
You can remove a post hook by using:

```javascript
Document.post('set', someFn);       // Setting it
Document.removePost('set', someFn); // Removing it
```
Related
I have prepared a simple demo with react-router-dom 6 and react query.
I have a couple of routes and a fetch call that takes place on the first route (Home).
What I want to achieve is to navigate to the About page (or any other page) without performing another request for a certain time period (maybe never again), but if I refresh the page I want to be able to re-trigger the request to get the data.
I have tried using staleTime, but if I refresh the page I get no results, just a blank page. refetchInterval works on refresh but does not keep the data when I change routes.
I have also tried the pattern in this article, but still I can't get the job done.
It's probably something I don't understand, but the question is: how do I avoid making the same request over and over again, performing it only once, while still being able to get the data if I refresh the page when navigating between different routes?
Demo
The solution to the problem eventually came from one of the maintainers on the official GitHub repo, and it boils down to passing placeholderData as an empty array instead of initialData, and setting staleTime to Infinity (in my case, as I only want to perform the request once).
Setting placeholderData gives you the opportunity to show some dummy data until you fetch the real data; in my case it does the job. More to read regarding this at this source.
```javascript
const { isFetching, data: catImages } = useQuery({
  queryKey: ["catImages"],
  queryFn: getCatImages,
  placeholderData: [],
  staleTime: Infinity
});
```
I'm looking for a way (using SuiteScript 2.0) to handle real-time persistent (stored) field updates, where a field might have changed in NetSuite (for example a lead time was just updated), and it doesn't matter if a user saved the change, or some other automated process changed that field. I just want to be able to pick up on that change:
The moment that it's done, and
Without regard for who or what kicked it off (e.g. it could be a person, but it could also be an automated change from a workflow, or a formula on the field itself which pulls values from another field)
Doing some research I found some options that looked somewhat promising at first: one being the afterSubmit event in a client script, and the other being the fieldChanged event. My issue, however, is that from what I understood those only seem to be triggered by a user manually going in and making changes. That is only one part of the puzzle, and doesn't seem to cover changes made outside the scope of a user's edits. Is that correct? Or would one of those events still be able to capture changes to that field regardless of who (or what) triggered the change, right at the moment the change was saved/persisted to the database?
User events are basically triggers. On their deployment records you can set the contexts in which they fire, so you can get them to fire in all circumstances (called contexts in NetSuite) but one.
That one exception: user events are not fired for record saves made in user event scripts. I.e., if an afterSubmit user event script loads, changes and saves your record, a fresh user event will not be fired.
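In an afterSubmit user event script you can compare the old and new record images to detect a change to a specific field, regardless of who or what saved the record. The SuiteScript module wiring (`define(['N/record'], ...)`) is omitted here; the comparison itself is a plain function, and the field ID is a made-up example:

```javascript
// Helper usable inside an afterSubmit user event script: returns true
// when fieldId's value differs between the old and new record images
// (context.oldRecord / context.newRecord in SuiteScript 2.0).
function fieldWasChanged(oldRecord, newRecord, fieldId) {
  // On record creation there is no old record image.
  if (!oldRecord) return true;
  return oldRecord.getValue({ fieldId: fieldId }) !==
         newRecord.getValue({ fieldId: fieldId });
}
```

Inside the entry point you would call something like `fieldWasChanged(context.oldRecord, context.newRecord, 'custitem_lead_time')`, where the field ID is an assumption for illustration.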
I'm interested if anyone has managed to pass the currently logged-in user to lifecycle events.
I would like to pass the user to the lifecycle event functions, to keep track of who is making the changes.
I don't think there is a clean way to do this. Even if you don't mind digging into (and modifying) the framework code, I think you'd have to make a few modifications in a few places:
1 - You'd have to override Model.create to accept a req parameter (or something else you can use to get the logged in user).
2 - You'd have to make sure that all uses in the code (yours AND the sails library's) of Model.create pass in the required parameter. In particular...
3 - You'd have to modify the default action caught by POST /create/[modelname] so that it passed in the appropriate parameters to Model.create
Given that that is a lot of steps (and I likely missed some), I'd recommend another approach:
1 - Disable the REST create route from your /config/blueprints.js file.
2 - Funnel all model creation through a custom method. (Could be a directly exposed controller method, or something in /api/services, or in the model itself). Make sure the method gets access to the logged in user, attach it as data to the model, THEN call the default Model.create from within your custom method.
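Step 2 above can be sketched as a small funnel function. The `createdBy` attribute name, the shape of `user`, and the injected `create` function are all assumptions for illustration, not Sails API:

```javascript
// Hypothetical funnel for all model creation: attach the logged-in
// user to the data, THEN delegate to the default create function
// (e.g. Model.create). `user` would come from your auth, such as
// req.session.
function createWithUser(data, user, create) {
  // Copy rather than mutate the caller's object.
  const record = Object.assign({}, data, { createdBy: user.id });
  return create(record);
}
```

A controller would then call something like `createWithUser(req.body, req.session.user, Model.create.bind(Model))`, with the REST blueprint route for create disabled as described in step 1.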
In an AngularJS/Express/Postgres app, I want to load a big list of JSON objects from the DB to the client.
Each JSON object is itself pretty big, and each one is stored in a separate row.
So I want to display each JSON object as soon as it is read from the DB.
I've found the EventSource API to progressively send the JSON objects from server to client.
Which works fine.
Then I want to display them in my view as soon as possible.
Working with EventSource involves working with event listeners.
As explained here https://groups.google.com/forum/?fromgroups=#!topic/angular/xMkm81VkR9w
Angular won't notice the change to the model, since the scope modification occurs outside the Angular world, inside an event listener.
There is a way to trigger the dirty checking by calling $scope.$apply().
But since my list has more than 200 elements, this error is triggered:
Error: 10 $digest() iterations reached. Aborting!
So I'm wondering if there is another way to trigger the dirty checking, or maybe another way to approach my issue.
EDIT:
The title was changed after reflecting on the real problem.
In fact the issue came from the partial, where I added a filter expression in an ng-show directive.
My bad.
I need to delete some records related to the current record when it is deactivated. I can get the event when the record is deactivated, but I have looked around for some time on Google and this site for the code to delete records in JavaScript and I can't find any, though I know there must be some out there.
Can anyone help?
Thanks
I would be alright with doing this with a plugin; all I would need to know is how to pick up that the record has been deactivated.
You can register a plugin on the SetState and SetStateDynamic messages (recommend the Pre event in your scenario). Each of these messages will pass an EntityMoniker in the InputParameters property bag which refers to the record that is being deactivated.
In your code you will need to:
Check that the new state in the SetState request is deactivated (since of course a record can usually be reactivated and you don't want to try deleting things then too, presumably)
Pick up the EntityMoniker from IPluginExecutionContext.InputParameters
Run your query to identify and delete related records
Exit the plugin to allow the SetState transaction to complete
If you really want to delete a record with JavaScript there is a sample on the MSDN.
It's a little long-winded (it's a CRUD example - create, retrieve, update & delete), but it should contain the information you need.
Note there is also an example on that page which doesn't use jQuery (if using jQuery is a problem).
That said, I think for this operation you will find it easier to implement, test and maintain a plugin (so I would go for Greg's answer).
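For a rough idea of the shape of that non-jQuery approach, a delete against the CRM 2011 OData endpoint looks something like this sketch. The entity set name and GUID are placeholders, and the header set follows the pattern in the SDK's REST samples (DELETE tunnelled through POST):

```javascript
// Build the OData URI for a record, then issue the delete with a
// plain XMLHttpRequest (no jQuery). Entity set and id are placeholders.
function buildRecordUri(serverUrl, entitySet, id) {
  return serverUrl + "/XRMServices/2011/OrganizationData.svc/" +
         entitySet + "(guid'" + id + "')";
}

function deleteRecord(serverUrl, entitySet, id, onSuccess) {
  var req = new XMLHttpRequest();
  // The OData endpoint accepts DELETE tunnelled through POST
  // via the X-HTTP-Method header.
  req.open("POST", buildRecordUri(serverUrl, entitySet, id), true);
  req.setRequestHeader("Accept", "application/json");
  req.setRequestHeader("Content-Type", "application/json; charset=utf-8");
  req.setRequestHeader("X-HTTP-Method", "DELETE");
  req.onreadystatechange = function () {
    // 204 No Content signals a successful delete.
    if (req.readyState === 4 && req.status === 204) onSuccess();
  };
  req.send();
}
```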
Additionally a plugin will apply in all contexts, e.g. if you deactivate the record in a workflow your JavaScript will not run, but a plugin will.