I'm working on a Node.js web app that's mainly an image-based bulletin board. I read somewhere that we should always cache since it helps improve performance, so I added my own caching, which basically works like this: when a user visits the bulletin board, the number of uploaded posts is cached; on the next visit, that cached number is compared with the current number of posts; if they match, the cached query result is served, otherwise the new posts are fetched. Now, this isn't a matter of code or anything. My code works perfectly and does what I want it to do, even if I might've explained it wrong or given a wrong impression somehow. The problem is that I read, after doing all the work, that caching frequently updated data is wrong, and this is a bulletin board I'm talking about, where users are always uploading images. So is it best to remove the cache or keep it?
"There are two hard things in computer science: cache invalidation,
naming things, and off-by-one errors."
— Phil Karlton
Caching depends heavily on the kind of application. Just because you've heard "caching is always better" doesn't mean you should always cache. A good rule of thumb:
Lean toward caching the data that is read-heavy but seldom updated.
Have a strategy for cache invalidation on write-heavy operations.
If you believe you will be doing crazy amounts of read-writes against the same entity, consider microcaching, if you're comfortable with a bit of stale data for, say, less than a minute. Just set a one-minute expiry on your cache.
If you feel like you will benefit from serving from the cache (and, by the way, what kind of cache are you talking about? In-memory? HTTP caching? Some kind of backend, such as Redis/Memcached?), have a plan to always check the cache first, and if you can't find the data there, serve from your data store and then cache it. Then, on POST/PATCH/DELETE calls, invalidate by that key.
In the case of your bulletin board, you might want to consider invalidating the cache any time someone posts or uploads anything, then caching the subsequent call. But, as I mentioned, be sure to have a strategy for invalidating.
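If it helps to see the shape of it, here's a minimal sketch of that check-cache-then-store pattern with invalidation on upload, using a plain in-memory Map in an Express app. The /posts routes, the getPostsFromDb helper, and the one-minute TTL are placeholders for illustration, not a description of your actual code:

const express = require('express');
const app = express();

// Placeholder for your real query (e.g. SELECT ... FROM posts ORDER BY created_at DESC).
async function getPostsFromDb() {
  return [{ id: 1, image: 'example.png' }];
}

const cache = new Map();          // key -> { value, expires }
const TTL_MS = 60 * 1000;         // microcache: accept up to a minute of staleness

function getCached(key) {
  const entry = cache.get(key);
  return entry && entry.expires > Date.now() ? entry.value : undefined;
}

function setCached(key, value) {
  cache.set(key, { value, expires: Date.now() + TTL_MS });
}

app.get('/posts', async (req, res) => {
  let posts = getCached('posts');
  if (!posts) {                   // miss: serve from the data store, then cache it
    posts = await getPostsFromDb();
    setCached('posts', posts);
  }
  res.json(posts);
});

app.post('/posts', async (req, res) => {
  // ... persist the new post/image here ...
  cache.delete('posts');          // invalidate on write so the next read is fresh
  res.status(201).end();
});

app.listen(3000);

The Map only works for a single process; if you run multiple instances, the same pattern applies but the cache needs to live somewhere shared (Redis, Memcached, or HTTP caching in front).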
Adding caching makes your application more complex. You shouldn't add it just because some random people on the internet say you should. There are tons of situations where caching makes things better, and others where it makes things much worse.
Caching is a solution to a problem. So what problem are you seeing? Slow client experience because of repeated downloads? Is your server load high? Bandwidth high? Then perhaps you should add in some kind of caching.
Too often people hear of solutions and then go looking for a problem. Identify your problems first, and then look into solutions.
Related
I am using Yandex Metrika on my site Thegoldlive.com and facing a Core Web Vitals issue because of it. I believe it's the main reason my site is getting slow. Is there any way to get rid of the issue, or should I remove Yandex Metrika from the site?
When I remove it from the site, the speed of my site gets better. But I don't want to remove it, because it helps me analyze visitors to the site better than anything else. So that's why I'm asking: is there any way to keep both?
Running your site through PageSpeed Insights it appears your issues are with loading time (TTFB, FCP, and LCP) and shifting content (CLS).
I'm not familiar with Yandex Metrika, but it seems unlikely an analytics solution will slow down these metrics. Mostly they affect responsiveness metrics like FID and INP.
I can't quite see the reason for slow TTFBs (it seems fast to me!), which directly affect the other loading metrics. You seem to be using a CDN (Cloudflare), and the server response time from lab tests seems fast.
It could be you just get a lot of visitors on slow networks/devices? If so, one thing that can help here is ensuring pages are eligible for the back/forward cache, so at least visitors get a fast (instant!) load when going back and forwards within the site. Testing your site for this shows it is using an unload handler, meaning you can't benefit from this performance gain. It looks like you are using Cloudflare's Rocket Loader - ironically, something that's supposed to improve performance but that might be holding you back here. I'd turn that off.
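If any of your own scripts also register an unload handler, swapping it for pagehide keeps the page eligible for the back/forward cache. A rough sketch, where flushAnalytics and the /analytics endpoint are made-up stand-ins for whatever your handler currently does:

// Stand-in for whatever cleanup/reporting you currently do in an unload handler.
function flushAnalytics() {
  navigator.sendBeacon('/analytics', JSON.stringify({ event: 'pagehide' }));
}

// Before: window.addEventListener('unload', flushAnalytics);  // blocks the bfcache
// After: pagehide fires in the same situations and is bfcache-friendly.
window.addEventListener('pagehide', (event) => {
  // event.persisted is true when the page is going into the back/forward cache
  flushAnalytics();
});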
For your layout shift issues (CLS), it's much more obvious. You have an advertisement that pops in and out and pushes all the content down. You'd do better to reserve a block of white space for it to slot into, rather than have it dynamically inserted and moving the text around, which is an irritating experience for site visitors.
I want to save the "state" of my application each time it is changed, and load it each time the application is booted up.
The "state" will be a simple object with a handful of variables in it, the idea is to JSON.stringify it to a file, and JSON.parse it when needed.
From what I understand, this cannot be done using Node's fs, since files on Heroku are not permanent.
I cannot use S3 either, because it's not free (free plan only lasts a year), and this is a hobby project of mine - I am not willing to pay for it.
Another recurring suggestion is to use some sort of a database, but I think that is a waste of time, since I will only be dealing with one very small file.
Essentially, my question is, how can I achieve something that is closest to this?:
WRITE("filename.txt",JSON.stringify(x));
x=JSON.parse(READ("filename.txt"));
(P.S.: I've read somewhere, can't seem to remember where, that Heroku gives 100MB for free (which would be way more than enough). What is that? Does it have anything to do with my code?)
I can think of a few ways to do this for free. They all pretty much boil down to "What free service allows me to read/write arbitrary file content, and access via an API?"…
Do you use or already pay for Dropbox (or something similar)? If so, you could use the Dropbox API for Node.js to save/load your application state.
You could use the Github Gist API and just update the same Gist over and over.
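A rough sketch of the Gist route, assuming Node 18+ (for the global fetch) and a personal access token with the gist scope; the gist ID, filename, and environment variable names are placeholders:

const GIST_ID = process.env.GIST_ID;        // the one gist you keep updating
const TOKEN = process.env.GITHUB_TOKEN;     // personal access token with the gist scope
const FILENAME = 'state.json';

async function saveState(state) {
  await fetch(`https://api.github.com/gists/${GIST_ID}`, {
    method: 'PATCH',
    headers: { Authorization: `Bearer ${TOKEN}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ files: { [FILENAME]: { content: JSON.stringify(state) } } }),
  });
}

async function loadState() {
  const res = await fetch(`https://api.github.com/gists/${GIST_ID}`, {
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  const gist = await res.json();
  return JSON.parse(gist.files[FILENAME].content);
}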
Otherwise, you mentioned databases. Sure, a database would be overkill tech-wise, but given your constraints (and the fact that you can get a small db for free on Heroku), and how much overhead implementing one of the aforementioned APIs would be, it might be the best option.
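For the database route, the whole thing can stay pretty close to your WRITE/READ pseudocode. A sketch assuming a Heroku Postgres add-on (which sets DATABASE_URL for you) and the pg package; the table and column names are just placeholders:

const { Pool } = require('pg');

// Heroku Postgres typically requires SSL; adjust to your setup.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: false },
});

// One-time setup: a single-row table holding the serialized state.
async function init() {
  await pool.query('CREATE TABLE IF NOT EXISTS app_state (id int PRIMARY KEY, data text)');
}

async function saveState(x) {
  await pool.query(
    'INSERT INTO app_state (id, data) VALUES (1, $1) ON CONFLICT (id) DO UPDATE SET data = EXCLUDED.data',
    [JSON.stringify(x)]
  );
}

async function loadState() {
  const { rows } = await pool.query('SELECT data FROM app_state WHERE id = 1');
  return rows.length ? JSON.parse(rows[0].data) : null;
}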
Hope this helps.
I was wondering if anyone can educate me on an area of perf testing an API. I have an API which caches all its operations for extensive periods, let's say a day. Based on this, I felt there was no need to perf test the API, as I would just be testing infrastructure, configuration, and HTTP caching. Am I correct in my thinking? I am eager to hear people's opinions, and any useful docs and papers would be beneficial.
The cache should be load-tested too, just to know your system's overall capacity (yes, most of the load goes to infrastructure, but at least you'll know that it does actually cache).
Second (but not by priority), there is the cache-miss branch and its capacity: if there are a lot of cache misses, the cache will not save your app; it still has to perform.
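One concrete way to measure both branches is to hit the same endpoint under load with and without the cache in play and compare the numbers. A minimal sketch using the autocannon load-testing package; the URL and the cache-bypass parameter are placeholders and assume your API exposes some way to force a miss (a test-only parameter, a separate uncached endpoint, etc.):

const autocannon = require('autocannon');

async function run() {
  // Cache-hit path: repeatedly requesting the same cached resource.
  const hit = await autocannon({
    url: 'https://api.example.com/items',               // placeholder endpoint
    connections: 50,
    duration: 30,
  });

  // Cache-miss path: placeholder parameter that your API would treat as cache-busting.
  const miss = await autocannon({
    url: 'https://api.example.com/items?bypassCache=1',
    connections: 50,
    duration: 30,
  });

  console.log('hit  req/s:', hit.requests.average, ' latency ms:', hit.latency.average);
  console.log('miss req/s:', miss.requests.average, ' latency ms:', miss.latency.average);
}

run();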
We are currently in the process of organising a student conference.
The issue is that we offer several different events at the same time over the course of a week. The conference runs the whole day.
It's currently been operating on a first come, first served basis, however this has led to dramatic problems in the past, namely the server crashing almost immediately, as 1000+ students all try to get the best events as quickly as they can.
Is anyone aware of the best way to handle this so that each user has a fair chance of enrolling in the events they wish to attend: firstly without the server crashing, and secondly while still enforcing each event's maximum capacity, all within a few minutes? Perhaps by somehow staggering the registration process, or something similar?
I'm aware this is a very broad question, but I'm not sure where to look when trying to solve this problem...
Broad questions have equally broad answers. There are broadly two ways to handle it:
Write more performant code so that a single server can handle the load.
Optimize backend code; cache data; minimize DB queries; optimize DB queries; optimize third party calls; consider storing intermediate things in memory; make judicious use of transactions trading off consistency with performance if possible; partition DB.
Horizontally scale - deploy multiple servers. Put a load balancer in front of your multiple front end servers. Horizontally scale DB by introducing multiple read slaves.
There are no quick fixes. It all starts with analysis first - which parts of your code are taking most time and most resources and then systematically attacking them.
Some quick fixes are possible; e.g. search results may be cached. The cache might be stale, so there would be situations where the search page shows that there are seats available when in reality the event is full. You handle such cases at registration time. For caching web pages, use a caching proxy.
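To give a feel for "handle it at registration time": the registration write itself has to enforce capacity atomically, whatever the (possibly stale) cached search page said. A sketch in Node with the pg package, where the table and column names are invented for illustration:

const { Pool } = require('pg');
const pool = new Pool();   // reads connection settings from the standard PG* environment variables

async function register(eventId, studentId) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    // Lock the event row so concurrent registrations serialize on the capacity check.
    const { rows } = await client.query(
      'SELECT capacity, (SELECT COUNT(*) FROM registrations WHERE event_id = $1) AS taken FROM events WHERE id = $1 FOR UPDATE',
      [eventId]
    );
    if (rows.length === 0 || Number(rows[0].taken) >= rows[0].capacity) {
      await client.query('ROLLBACK');
      // The stale cache may have said "seats left"; the source of truth says no.
      return { ok: false, reason: 'event full' };
    }
    await client.query(
      'INSERT INTO registrations (event_id, student_id) VALUES ($1, $2)',
      [eventId, studentId]
    );
    await client.query('COMMIT');
    return { ok: true };
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}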
I am working on a personal project and I have been considering the security of sensitive data. I want to use an API for accessing the backend, and I want to keep the backend on a different server from the one the user will log on to. This then requires cross-domain access to the data.
Considering that a lot of access and transactions will happen, I have the following questions to help guide me down the right path, answered by those who have tried and tested cross-domain access. I don't want to assume, implement, and then run into trouble and redesign after I have launched the service, thereby losing sleep. I know there is no single right way to do many things in programming, but there are plenty of wrong ways.
How safe is it for handling sensitive data (even with HTTPS)?
Does it have issues handling a lot of user transactions?
Does it have any downsides I have not mentioned?
These questions are asked because some posts I read this evening discouraged the use of cross-domain access, while others encouraged it. I decided to hear from professionals who have actually used it at a bigger scale.
I am actually building a Mobile App, using Laravel as the backend.
Thanks..
How safe is it for handling sensitive data (even with HTTPS)?
SSL/TLS is generally considered safe (it's used everywhere and is the standard). Hitting a different server doesn't make it any less safe: the data still has to traverse the pipes and reach its destination, which carries the same risks regardless of which server it is.
Does it have issues handling a lot of user transactions?
I don't see why it would. A server is a server. Ultimately, your server's ability to handle volume transactions is going to be based on its power, the efficiency of your code, and your application's ability to scale.
Does it have any downsides I have not mentioned?
Authentication is the only thing that comes to mind. I'm not clear from your question on how users would log into one server but access data from another; it seems that would all just be one application. If you revise your question, I'll update my answer.
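On the authentication point: the usual pattern for an API on a different domain is stateless token auth plus an explicit CORS allow-list, rather than shared session cookies. A minimal Express sketch of the shape, just to illustrate the idea; the origin, the verifyToken helper, and the route are placeholders, and Laravel has its own CORS and token-auth equivalents:

const express = require('express');
const cors = require('cors');
const app = express();

// Only allow the front-end origin that users actually log in from (placeholder origin).
app.use(cors({ origin: 'https://app.example.com' }));

// Hypothetical token check; in practice verify a JWT or look the token up in a store.
function verifyToken(token) {
  return token === process.env.API_TOKEN ? { userId: 1 } : null;
}

app.use((req, res, next) => {
  const auth = req.get('Authorization') || '';
  const user = verifyToken(auth.replace('Bearer ', ''));
  if (!user) return res.status(401).json({ error: 'unauthorized' });
  req.user = user;
  next();
});

app.get('/api/transactions', (req, res) => {
  res.json({ userId: req.user.userId, transactions: [] });   // placeholder data
});

app.listen(3000);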