I've had an issue with my browsers not showing an updated version of a file. Occasionally I will make a change, upload the file via FTP, and the file will immediately show. Other times, I'll clear the cache and hard refresh, and the updated file will take half an hour to show. Here is some info:
Tried on both Chrome and Firefox (incognito and private too).
When I try it on my phone, it shows up fine, and immediately the other browsers start showing correctly too.
Sometimes, even after the browsers start showing correctly, I'll refresh and it will go back to a previous version.
Originally had mod_expires enabled in my .htaccess; commented it out. No change.
Website using Sucuri.net and MaxCDN. Sucuri caching is disabled.
Sucuri GZIP compression is enabled. I don't think this should affect anything.
MaxCDN is having trouble immediately purging their cached files and replacing them with the new files when I upload and purge. I assumed whatever is causing my issue is also behind this, because I don't believe the MaxCDN setup itself could be the cause. (MaxCDN doesn't run all requests through their servers the way Cloudflare does.)
Sometimes MaxCDN will update immediately when I purge the file. As far as I can tell, there is no consistent correlation with any action I take.
Is there something I have overlooked that could be making this issue happen?
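In case it helps narrow things down, a check along these lines should show which layer is returning the stale copy (the hostname, file path, and origin IP below are placeholders):
# response as visitors see it (through Sucuri/MaxCDN); note Last-Modified, Age,
# and any X-Sucuri-Cache or X-Cache headers that show up
curl -sI https://example.com/js/app.js
# the same request forced straight to the origin server, bypassing the proxy
curl -sI --resolve example.com:443:203.0.113.10 https://example.com/js/app.js
# and the copy the CDN pull zone itself is serving (zone hostname is a placeholder)
curl -sI https://yourzone.netdna-cdn.com/js/app.js
If the origin response is fresh but the proxied or CDN responses are stale, the old copy is cached upstream of your server rather than in the browser.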
Related
I've been struggling to get my update to go through. I incremented the version in the manifest and updated a lot of code; I upload the new package successfully and submit, but I never see the manifest version update in the Chrome Web Store, and there are no errors. The draft of the package always shows the same version as well (0.0.0.1, as I have never had a successful update). When I submit, the status goes to pending and then public.
My only guess is that something is wrong with my account. This update also includes moving from Manifest V2 to V3, although I feel like I'd get an error if something was wrong there.
I've tried to resubmit multiple times over the past 3 weeks, always with the same lack of update.
It turned out there were a few .pem files in a random node module. I hadn't realized I should just remove node_modules from the zipped-up package, and for some reason it only sometimes errored out and warned me about including .pem files. Most of the time updating the package seemed to work fine and didn't error. Very strange. Regardless, it is fine now!
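For anyone who lands here with the same problem, a packaging step along these lines keeps node_modules (and any stray key files) out of the upload zip entirely; the directory and file names are just examples:
# build the upload zip from the extension directory, excluding node_modules and .pem files
cd my-extension
zip -r ../extension.zip . -x "node_modules/*" -x "*.pem"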
I'm using Firefox and use console.log quite a bit.
It just got updated to version 100, which might have something to do with it.
But now file updates stop appearing in the browser's localhost pages upon refresh.
It's like the browser is still just seeing the older version of the files.
Through trying and testing, I have found out that by going:
History
->Clear Recent History
->Select only cache, over the last 1 hour
And then clicking OK.
After that, I can update the file and the browser's localhost page will show the new version after F5.
But after some time it stops working again for that given file.
And I need to repeat this.
I have tried:
Restarting the computer.
Reinstalling Firefox.
Renaming the given file I'm using each time.
Running sudo apt-get update/upgrade.
All to no avail.
It's like there is some cache storing the file and Firefox has just stopped updating that cache as I update the file.
Chromium appears to have the same problem.
I pressed F12 and manually cleared several of the storages there; that appears to solve the problem, but only temporarily. After a bit of time, it's back.
It's like it's emptying some cache which, once filled, does not get updated again.
My OS is Linux Mint, 20.2.
And my browser is Firefox, version 100.0.
What's going on, how can I solve this?
Is anybody else having this issue?
Also, whatever the solution is, I can expect this to be a problem for anyone trying to use my website. The question is how I can solve it in a manner that also solves it for anyone using my future website.
I think I found the solution
In Firefox, go to:
about:config
And set the following values:
browser.cache.disk.enable = false
browser.cache.memory.enable = false
network.http.use-cache = false
I think you should try removing the user data from /home//.config; search for a Mozilla or Firefox folder.
For Firefox, I like to use the Clear Cache add-on, especially when doing web development, in order to get the most recent version of pages. You may also look at changing the way your browser caches by modifying your code, as outlined here.
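On the part of the question about sparing future visitors from this: the usual fix is to fingerprint asset filenames so a changed file always gets a new URL. A minimal sketch, assuming a static site where index.html references app.js:
# derive a short content hash and publish the file under a fingerprinted name
HASH=$(sha1sum app.js | cut -c1-8)
cp app.js "app.${HASH}.js"
# point index.html at the new name so browsers fetch it as a brand-new URL
sed -i "s/app\.js/app.${HASH}.js/g" index.html
Because the URL changes whenever the content changes, a stale browser or proxy cache can never hide an update, and the fingerprinted files can then be cached as aggressively as you like.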
I'm making an app in the Cloud9 IDE using Node.js with the Express.js framework. Something very odd is happening to a specific .ejs file where if I try to update it (like typing some mumbo jumbo in an h1 tag and then saving and restarting the server), it NEVER gets reflected in the browser no matter what I do. For example, if I delete my jumbotron, save, restart the server, and then refresh the browser, I still see the same page with the jumbotron. I also tried deleting this entire file and then restarting the server and I still see the page and it doesn't break my application which is bizarre. All other .ejs files are fine and I can see the changes that I make.
I've spent about 4 hours trying to figure this out and no one else seems to have my specific issue. I tried clearing my browser cache, using different browsers, logging in/out of Cloud9, creating a new database, going back to older versions of my code, etc., and nothing seems to be working. I'm not even sure what code to post on here since my entire app is about 2000 lines of code so far. Does anyone have any suggestions? This is really frustrating.
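One thing worth ruling out, purely a guess since I can't see the workspace, is a second, stale Node process that is still bound to the port and serving the old template. From the Cloud9 terminal, something like this would show it (the port number is an assumption):
# list any node processes still running in the workspace
ps aux | grep [n]ode
# see which process actually owns the port the app listens on (8080 is a guess)
lsof -i :8080
# if a stale process shows up, stop it (replace <PID> with the number from above) and restart the app
kill <PID>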
When I update my code and refresh the page, my server doesn't reflect the changes! It is still serving the old file (this is running locally). I really don't understand why!
What I mean is: say I make a mistake, or decide to change a line of code. I fix the mistake, or change the line of code. But when I refresh the page, it is as though I have not done so. The mistake, or the old line of code, still remains and that old code is run!
Why?! I am using node's http-server to run my server on a Mac.
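For what it's worth, http-server sends caching headers by default (an hour, if I remember right), so the browser can legitimately keep the old file. While developing, you can start it with caching disabled:
# -c-1 sets the cache time to -1, i.e. disables http-server's caching headers
http-server -c-1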
Angular sometimes comes with minification/build tools. If this is the case, you have to build your app using these tools to regenerate the files your browser should load.
If not, then probably it is a simple caching issue and you should clear your browser's caches.
In Firefox, press Ctrl+Shift+I, then click on the gear (settings icon).
Look for the option Disable Cache under Advanced Settings and check it, so that the browser doesn't cache your old code and instead displays the new code.
Angular caches your template files. You need to clear the cache, Chrome Developer Tools comes with a setting to disable the cache while the developer tools are open.
See: Disabling Chrome cache for website development
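Before blaming the browser, it can also be worth confirming what the server itself is handing out, completely outside any browser cache (the port and the search text are placeholders):
# fetch the page straight from the dev server and look for the change you just made
curl -s http://localhost:3000/ | grep -n "text from the new version"
If the new text shows up here, the stale copy lives in the browser cache (or a service worker); if it doesn't, the server is still serving the old build and no amount of cache clearing will help.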
Our current deploy process goes something like this:
Use grunt to create production assets.
Create a datestamp and point files at our CDN (e.g. /scripts/20140324142354/app.min.js).
Sidenote: I've heard this process called "versioning" before but I'm not sure if it's the proper term.
Commit build to github.
Run git pull on the web servers to retrieve the new code from github.
This is a node.js site and we are using forever -w to watch for file changes and update the site accordingly.
We have a route setup in our app to serve the latest version of the app via /scripts/*/app.min.js.
The reason we version like this is because our CDN is set to cache JavaScript files indefinitely and this purposely creates a cache miss so that the code is updated on the CDN (and also in our users' browsers).
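(To make the "versioning" step concrete, it is roughly this shape; the paths below are illustrative rather than our exact build.)
# generate the datestamp used in the asset path
STAMP=$(date +%Y%m%d%H%M%S)
# copy the built bundle into a datestamped directory and rewrite the reference in the page
mkdir -p "build/scripts/$STAMP"
cp build/app.min.js "build/scripts/$STAMP/app.min.js"
sed -i "s|/scripts/[0-9]*/app.min.js|/scripts/$STAMP/app.min.js|" build/index.html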
This works fine most of the time, but it breaks down if one of the servers lags a bit in checking out the new code.
Sometimes a client hits the page while a deploy is in progress and tries to retrieve the new JavaScript code from the CDN. The CDN tries to retrieve it but hits a server that isn't finished checking out the new code yet and caches an old or partially downloaded file causing all sorts of problems.
This problem is exacerbated by the fact that our CDN has many edge locations and so the problem isn't always immediately visible to us from our office. Some edge locations may have pulled down old/bad code while others may have pulled down new/good code.
Is there a better way to do these deployments that will avoid this issue?
As a general rule of thumb:
Don't do live upgrades (unless the language supports it, but even then, think twice).
Pulling code using git pull and then waiting for the app to notice changes to files sounds a lot like the '90s: uploading PHP files to an Apache web server using FTP (or SFTP if you are cool) and waiting for Apache to notice that they were updated. It can't happen atomically, so of course there is a race condition. Some users WILL get a half-built and broken site.
I recommend only upgrading your live and running application while no one is using it. Hopefully you have a pool of servers behind a load balancer of some sort, which will allow you to remove them one at a time and upgrade them.
This will mean that users will be able to use both the old and the new site at the same time, depending on how and when they access it, but that is much better than not being able to access it at all.
Ideally you would be able to spin up copies of each of the web servers you have running, with the new version of the site. Check that the new version works, and then atomically update the load balancer so that everyone gets bumped to the new site at the same time. Only once everything is verified to be working perfectly are the old machines shut down and decommissioned, or reused.
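As a rough illustration of the "atomically update the load balancer" step, assuming nginx is the balancer, the new instances are already running and verified on another port, and the upstream block includes a small file like the one below (all paths and ports are assumptions):
# health-check the new instances before touching the balancer
curl -fsS http://127.0.0.1:3001/ > /dev/null || exit 1
# repoint the upstream include at the new instances, then reload nginx gracefully
echo "server 127.0.0.1:3001;" > /etc/nginx/conf.d/app_upstream.inc
nginx -t && nginx -s reload
The reload is graceful, so in-flight requests finish against the old instances while new requests go to the new ones.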
Step 4 in your procedure should be:
# export a clean copy of the code into a fresh, timestamped directory
git archive --remote=$yourgithubrepo --prefix=$timestamp/ HEAD | tar -xf -
stop-server    # placeholder for however you stop your app
# atomically repoint the "current" symlink at the new release directory
ln -sfn $timestamp current
start-server   # placeholder for however you start it again
Your server would use the current directory (well, a symlink) at all times. No matter how long the deploy takes, your application is in a consistent state.
I'll go ahead and post our far-from-ideal monkey-patch that we're using right now.
We deploy once, which may or may not go as planned; once we're sure the code is deployed on all the servers, we do another build where the only thing that changes is the version number.
Then we deploy again, server by server.
The race condition still exists, but because the application code is the same between the two versions, this masks the issue: no matter which server the CDN hits, it gets the "latest" code.
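In pseudo-shell, the two passes look something like this; deploy-all and the --stamp flag are stand-ins for our actual grunt and git pull steps, not real commands:
# pass 1: new application code, but keep the OLD asset datestamp
grunt build --stamp=20140324142354    # hypothetical flag
deploy-all                            # hypothetical stand-in for committing and pulling on each server
# ...wait until every web server has checked out this build...
# pass 2: identical code, new datestamp, rolled out server by server
grunt build --stamp=$(date +%Y%m%d%H%M%S)
deploy-all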