component I'm making:
I'm making a booking component with a modal. It has a timesheet and options to choose between various locations. Every time the user changes the location, a new timesheet is fetched for the selected location.
network flow of the page:
download locations: it fetches all locations customers can choose from in the DB and stores them in locations state (this takes about 900ms).
then download timesheets: once locations is set, it selects a default location and fires further requests to download 7 days' worth of timesheets (this takes another 900ms, but only 6 async calls are made at a time, so the last one is executed after some delay; in total the whole fetch takes about 1800ms). this piece executes after locations updates, using
useEffect(() => { /* some code */ }, [locations])
then the page loads up
problem
evidently, this is too slow: the whole thing takes about 2.5 to 3 seconds to load. I am thinking about ways to improve the page load time.
solutions I was thinking of
I could just start downloading all the timesheet data before the location data has finished downloading, if I store the default location information somewhere in the app.
option 1) I think I can store the default location data in Redux (it's stored in the user schema and gets copied into Redux when a user logs in, so the timesheet component can load without having to wait for the location data to be downloaded). the problem is I'm now replicating the same location data from the DB. maybe it's not a problem, but my OCD wants to do without replicating anything.
option 2) I can get rid of the user-specific default data and just store something like var defaultLocation = { id: /* objectId of a location */ } in the component code itself, but then I'm losing the user specificity, which would be a very nice feature to have.
option 3) ??? I feel like I'm facing a trade-off here, but I'm hoping there are other solutions. Any tips and help would be much appreciated!
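For what it's worth, the parallel-kickoff idea behind option 1 can be sketched roughly like this. This is only a sketch under assumptions: fetchLocations, fetchTimesheets, and cachedDefaultLocationId are hypothetical names standing in for the real API calls and the default-location value kept in Redux.

```javascript
// Sketch: start both requests at once instead of chaining them via useEffect.
// fetchLocations / fetchTimesheets are placeholders for the real API calls;
// cachedDefaultLocationId stands in for the value stored in Redux (option 1).
async function loadBookingData(fetchLocations, fetchTimesheets, cachedDefaultLocationId) {
  // Kick off the timesheet fetch immediately using the cached default location,
  // in parallel with the (slower) locations fetch.
  const timesheetPromise = fetchTimesheets(cachedDefaultLocationId);
  const locationsPromise = fetchLocations();

  // Both requests overlap, so total wait is max(900ms, 900ms) rather than the sum.
  const [locations, timesheets] = await Promise.all([locationsPromise, timesheetPromise]);
  return { locations, timesheets };
}
```

If the cached default ever disagrees with what the locations fetch returns, the component could simply re-fetch the timesheets at that point; the common case stays fast.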
Related
Background: I'm using Python 3 (with Flask and Bootstrap) to create a website with fields to capture data. An API pulls data from a record in a HighQ database and, once the form is submitted, puts the updated/new data back into the same HighQ record (and also into an SQL database). This is all working successfully.
This is a tool being used internally within the company I work for, hosted on the intranet, so I haven't set up user logins; I want it to be as quick and easy as possible for the team to use this form and update records.
Issue: When two or more instances of the form are being used at the same time (whether on one person's computer, or with two people testing on their own computers), the data submitted from one person's form overwrites the destination record that the other person has called into their form. I'm struggling to find a solution that ring-fences each launched form's data so that this does not happen.
I've tried a number of things to resolve this (including lots of Stack Overflow searches):
Originally I used global variables to pass data across different routes. I thought the issue was that everyone launching the form could access the same global variables, so I changed the code to remove all globals. Instead, I now read the content of each field on the site and save it as a local variable within the route that needs it, and any variables not obtainable from a field I save to the session.
The problem still persists and I'm now at a loss on what to try next. Any help on this would be much appreciated.
I am implementing a movie search application in which movies are returned after clicking search.
In the result list, users can give a "like" to the movies displayed to them.
However, if the user selects "sort by like", my sort function actually sorts the "old" search result, even though some movies' like counts may have changed due to user input.
So, I would like to reload the page so that it goes back to the server, loads fresh data, and then does the sorting again.
May I know if there is a way to reload the page by calling my current URL with an extra query param (e.g. sort by number of likes), so the server-side function knows it should sort by like count?
Or is there a simpler approach to deal with this case?
Edit for better illustration
the search result is a list of movies

stage 1
1. Titanic (like: 3)
2. Spiderman (like: 3)

stage 2 (user gives a like to the second one)
1. Titanic (like: 3)
2. Spiderman (like: 4)  <----- number of likes increased because the user raised it

stage 3 (user suddenly wants to sort the results by number of likes, so I sort the results on the fly)
1. Titanic (like: 3)
2. Spiderman (like: 3)  <----- not updated, since the result is old
the routing function I am using:
router.push({
  pathname: "/search",
  query: { sort: sort }
});
but the reload is not clean; the component is still cached.
[edit 2]
Each item in the result list is actually wrapped in a result card component.
So, to sum up, the server passes the search results to the search page.
Then I iterate over the results and return a number of result cards.
The result card allows users to take an action, like raising the like count.
The reason I can keep it dynamic without calling the API is that I keep these two hooks:
const [likeCount, setLikeCount] = useState(movie.likeBy.length);
const [liked, setLiked] = useState(false);
But the problem is that when the user clicks "sort by like", even though it gets the newly sorted search result, the result card components still keep their previous state values, which makes the new render look weird. That's why I want a complete reload that clears all components' state.
And I found that router.push doesn't do the job.
Hope this explains things clearly.
why do you need to reload the page to get new data? you can do it with the same URL
let's assume your base URL is ===>>> "getmovies?sortby="
if you don't pass a value in sortby, it will fetch all the data
after adding a value to the sortby query, it will fetch the sorted data
sort URL ===>>> "getmovies?sortby=like"
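The query-param idea above can be sketched as a plain handler function. This is a sketch under assumptions: getMovies is a hypothetical name, and the array argument stands in for whatever the real database query returns.

```javascript
// Sketch of one endpoint with an optional sortby query param.
// `movies` is a hypothetical in-memory stand-in for the real database result.
function getMovies(movies, sortby) {
  // sortby=like: return a copy sorted by like count, highest first.
  if (sortby === 'like') {
    return movies.slice().sort((a, b) => b.like - a.like);
  }
  // No (or unknown) sortby value: return the results as-is.
  return movies.slice();
}
```

The route handler would read sortby from the request query string and pass it straight into a function like this, so one URL serves both the unsorted and the sorted case.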
You can use an optimistic UI (which persists the change); I have used it with GraphQL, and Facebook uses something similar. As soon as a user likes a movie, React updates the cache it uses to render the UI, and makes the API call in the background; if the call succeeds, the UI remains the same (hence "optimistic").
Edit:
Now that you have updated your question, I think we can look into state. Currently you update state when a user likes a movie, but when they sort again, you use the old results object or array returned from the server the first time. If you are not calling the server every time a user sorts the results, you can store a copy of the results fetched from the API in component state and treat it as the source of truth for dynamic rendering. Now if a user likes a movie, you update that local results state, which causes a render and updates the UI. When the user tries to sort the movies, you sort the local state, which has all the latest information.
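A framework-free sketch of this answer's idea: one local copy of the results acts as the single source of truth, likes mutate that copy, and sorting reads from the same copy. All names here (createResultsStore, like, sortedByLikes) are illustrative, not from the original code.

```javascript
// One local copy of the results is the single source of truth for rendering.
function createResultsStore(initialResults) {
  // Copy what the API returned once, so later mutations don't touch the raw response.
  const results = initialResults.map((m) => ({ ...m }));
  return {
    like(title) {
      const movie = results.find((m) => m.title === title);
      // Update local state immediately; the API call can fire in the background.
      if (movie) movie.like += 1;
    },
    sortedByLikes() {
      // Sort the up-to-date local copy, not the stale original server response.
      return results.slice().sort((a, b) => b.like - a.like);
    },
  };
}
```

In React, the same shape maps onto a single useState holding the results array, with the per-card likeCount hooks removed so the cards just render props.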
Has anyone figured out a way to delete a time entry record from a timesheet record without actually deleting the timesheet? My use case: I have been syncing work time from JIRA to NetSuite for over a year now. When there is an error or someone needs to update their time, my integration just deletes the entries and recreates everything. This was never an issue, since the timesheets are not yet submitted or approved at that point.
Now, we have installed this SuitePeople bundle (sadly, the project managers working on that did not test anything... :/ ), which has completely changed time tracking. Aside from custom fields no longer showing up in the columns (a whole different issue), it now generates generic timesheets for people to show time off. Those timesheets cannot be deleted, and their time entries cannot even be edited (presumably because they were created by the system; at least that's what NetSuite says).
My last hope is to add/edit/delete time entry records once the additional system-generated timesheets have been added. But any time I try to delete a time entry, I get the error that timeentry is not a valid record type (since it is a subrecord).
Any thoughts? Feeling completely at a loss here...
This is good to know, since I have a direct integration with JIRA worklogs as well, but no SuitePeople.
Can you cancel/reject the timesheet?
It turned out that I had to run this through a RESTlet, where SuiteScript is able to directly search and delete time entry records. Here are the important parts of that script, in case anyone else runs into this.
// SuiteScript 2.x RESTlet; requires the N/search, N/record and N/log modules.
var responseArray = [];

search.create({
    type: 'timeentry',
    columns: [{name: 'employee'}, {name: 'hours'}, {name: 'internalid'}, {name: 'memo'}],
    filters: [
        {name: 'date', operator: 'within', values: [startDate, endDate]},
        {name: 'employee', operator: 'is', values: [userID]}
    ]
}).run().each(function (result) {
    log.debug('results', JSON.stringify(result));
    // Only collect the entries my integration created (tagged in the memo field).
    var memoField = result.getValue({name: 'memo'});
    if (memoField.indexOf('JIRA Time') != -1) {
        responseArray.push(result);
    }
    return true; // keep iterating
});

for (var el in responseArray) {
    try {
        log.debug('Deleting', JSON.stringify(responseArray[el].id));
        record.delete({type: 'timeentry', id: responseArray[el].id});
    } catch (deleteErr01) {
        log.debug('ERROR[deleteErr01]', JSON.stringify(deleteErr01));
        continue;
    }
}
I am building an application that should be able to list several thousand articles with pagination/infinite scrolling. The user should be able to filter/sort this list, and currently I am experiencing performance problems when sorting articles.
I set up a very basic application to demonstrate the problem: http://meteor-paginated-subscription-example.meteor.com/ (see GitHub: https://github.com/lacco/meteor-paginated-subscription-example ). If you open the Firefox/Chrome console and click on "created at"/"priority" to initiate sorting, you will see that Template.articles.rendered is called several hundred times for one click. You will also see that the table takes some time to become "final"; during loading and rendering, the order of the rows changes very often.
I am sure that I am doing something wrong in my code, but I can't figure out where the cause of my problems is. Can you help me out?
The collection cursor, Models.Articles.find({}, ...), is a reactive data source. Any change to the items in that cursor will cause the template articles_table and the subtemplates inside {{#each articles}} to rerender.
Your re-sort changes your data subscription, so the client calls the server asking for new data. The flickering you see is caused by displaying the re-sorted intermediate steps as new data arrives and old data is deleted. Try checking handle.ready() before returning anything from the helper Template.articles_table.articles, so the table is only displayed when your dataset is complete.
Update: In response to additional feature requests in the comments. If you store "previousSort" (and likewise "previousLimit") whenever the sort or limit changes, you can display your old query with reactivity turned off while the client waits for the subscription to become ready:
if (handle.ready()) {
  return Models.Articles.find({}, {sort: sort, limit: limit});
} else {
  return Models.Articles.find({}, {sort: previousSort, limit: previousLimit, reactive: false});
}
This seems like a pretty simple thing but I can't find any discussions that really explain how to do it.
I'm building a scraper with MongoDB and Node.js. It runs once daily, scraping several hundred URLs and recording them to the database. Example:
Scraper goes to this google image search page for "stack overflow"
Scraper gets the top 100 links from this page
A record of the link's URL, img src, page title, and domain name is saved to MongoDB.
Here's what I'm trying to achieve:
If the image is no longer in the 100 scraped links, I want to delete it from the database
If the image is still in the 100 scraped links, but details have changed (e.g. new page title) I want to find the mongodb record and update it.
If the image doesn't exist already, I want to create a new record
The bit I'm having trouble with is deleting entries that haven't been scraped. What's the best way to achieve this?
So far my code successfully checks whether entries exist and updates them; it's deleting records that are no longer relevant that I'm having trouble with. Pastebin link is here:
http://pastebin.com/35cXcXzk
You either need to timestamp items (updating the timestamp on every scrape) and periodically delete items that haven't been updated in a while, or you need to associate items with a particular query. In the latter case, you gather all of the items previously associated with the query and mark them off as the new results come in; any items not marked off the list at the end need to be deleted.
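The "mark off" bookkeeping above can be sketched as pure set logic; staleUrls is a hypothetical name, and the real version would feed its output into a MongoDB deleteMany on the url field.

```javascript
// Given the URLs previously stored for this query and the URLs found in
// today's scrape, return the ones that should be deleted.
function staleUrls(previousUrls, scrapedUrls) {
  const seen = new Set(scrapedUrls);
  // Anything stored before but absent from the fresh top-100 is stale.
  return previousUrls.filter((url) => !seen.has(url));
}
```

In the scraper this would run once per query at the end of a scrape, after all upserts for that query have completed.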
another possibility is to use the new TTL index option in MongoDB 2.4, which allows you to set a time-to-live on documents:
http://docs.mongodb.org/manual/tutorial/expire-data/
This will let the server expire them over time instead of you having to perform big, expensive remove executions.
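As a sketch, assuming a links collection with a createdAt date field (both hypothetical names), the TTL index might be created like this in the mongo shell:

```
// documents expire 24 hours after their createdAt value
db.links.createIndex({ createdAt: 1 }, { expireAfterSeconds: 86400 })
```

Note the scraper would then need to refresh createdAt (or a lastSeen field) on every scrape so that still-current records never expire.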
Another optimization is to use the usePowerOf2Sizes option for collections, to avoid the heavy memory fragmentation that write/remove cycles create:
http://docs.mongodb.org/manual/reference/command/collMod/