I have an avatar upload page on my site, but after the user successfully uploads a new avatar, the browser keeps showing the old one. How can I prevent the avatar image from being cached by the browser?
Haven't tried it myself, but this might work:
if ($request_uri ~* "^/image/location.png$") {
    add_header Cache-Control "no-cache, no-store, must-revalidate";
    add_header Pragma "no-cache";
}
(Pragma is an HTTP/1.0 header; Cache-Control is what modern browsers actually honor.)
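Another common trick is cache-busting on the client: append a changing query string to the avatar URL so the browser treats each upload as a new resource. A minimal sketch (the avatarUrl argument and the "avatar" element id are hypothetical):
// After a successful upload, reload the avatar with a timestamp query
// string so the browser bypasses its cached copy.
function refreshAvatar(avatarUrl) {
  var img = document.getElementById("avatar"); // hypothetical element id
  img.src = avatarUrl + "?t=" + Date.now();
}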
I need to use res.redirect() to redirect to a data URL using Express.
This is my code:
app.get("/",(req,res)=>{
res.redirect("data:text/plain;base64,hhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh");
})
When I use another URL, e.g. https://www.google.com, it works fine.
But when I use a data URL, it does not work.
How should I fix it?
I am new to programming and Stack Overflow, so please be kind to me.
You could redirect the user to a Blob created from the data URL; search for "data URL to Blob in JS" for ways to convert one.
Also, since you are using Node.js, you could use http.request and pipe the response to the client, just like a proxy.
Or you could fetch the file with http.request, save it on your server, and serve it when the client needs it.
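As a rough sketch of that last idea, assuming the payload is base64-encoded, you could decode the data URL on the server and send its contents directly instead of redirecting (the example payload below is hypothetical):
const express = require("express");
const app = express();
app.get("/", (req, res) => {
  // Hypothetical payload: base64 for "hello world"
  const dataUrl = "data:text/plain;base64,aGVsbG8gd29ybGQ=";
  const [meta, payload] = dataUrl.slice("data:".length).split(",");
  const mime = meta.replace(";base64", "");
  // Decode and serve the content instead of redirecting to it
  res.type(mime).send(Buffer.from(payload, "base64"));
});
app.listen(3000);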
This is likely a Chrome issue rather than a Node.js one:
https://groups.google.com/a/chromium.org/g/blink-dev/c/GbVcuwg_QjM/m/GsIAQlemBQAJ
In practice, these will be blocked:
- navigations when the user clicks on links in the form of <a href="data:…">
- window.open("data:…")
- window.location = "data:…"
- meta redirects
I want to play music files from Google Drive on a web page. I have the link for each file, but the response cache headers for those calls are 'no-cache, no-store, max-age=0, must-revalidate', so the files will never be saved in the browser cache. Is there any way to make requests for Google Drive files cacheable?
The problem:
When you use the Drive link for a music file (mp3), https://drive.google.com/a/pucp.pe/uc?id=1kYYS9FZ9Vxif5WJM9ZQcY4SR35NMgoIE&export=download, the GET call receives a 302 response that redirects to another URL, in this case to 'https://doc-0o-28-docs.googleusercontent.com/docs/securesc/bgp95l3eabkkpccn0qi0qopvc4e7d4mq/us95e8ush1v4b7vvijq1vj1d7ru4rlpo/1556330400000/01732506421897009934/01732506421897009934/1kYYS9FZ9Vxif5WJM9ZQcY4SR35NMgoIE?h=14771753379018855219&e=download'. Each of these calls has no-cache in its headers.
I tried using Workbox (the Cache API), but I couldn't find a way to cache redirects; I probably need to cache both calls (the first GET and the redirect). If I use the redirected URL directly, caching works, but I don't have access to that URL until the first call is made.
I also tried a proxy from my Node.js server:
app.get("/test", (req, res) => {
try {
https.get(
URL,
function(response) {
res.writeHead(response.statusCode, {...response.headers,
"Cache-Control": "public, max-age=120",
"Expires": new Date(Date.now() + 120000).toUTCString() })
response.pipe(res);
}
);
} catch (e) {
console.log(e);
}
});
I tried using the first URL with no luck.
I tried using the redirect URL, but I get a "Status Code: 302 Found".
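I suspect this happens because https.get does not follow redirects on its own, so the 302 is passed straight through. A sketch of chasing the redirect manually (one level deep) would be something like:
https.get(URL, function (response) {
  // https.get does not follow redirects, so follow one level by hand
  if (response.statusCode === 302 && response.headers.location) {
    https.get(response.headers.location, function (redirected) {
      res.writeHead(redirected.statusCode, {
        ...redirected.headers,
        "Cache-Control": "public, max-age=120"
      });
      redirected.pipe(res);
    });
  } else {
    res.writeHead(response.statusCode, response.headers);
    response.pipe(res);
  }
});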
One solution could be to download the files and serve them directly from my server, but that would defeat the point of using Drive storage; I really don't want to duplicate every file on my server.
Is there a recommended way to handle caching in this case? Maybe there is some Google Drive configuration I'm missing, or another approach I could take?
You should be able to cache redirected responses using Workbox (and the Cache Storage API in general). HTTP 30x redirects are followed automatically by default, so you should only have to route the original, non-redirected URL.
Here's a live example of a Glitch project that uses Workbox to cache that MP3 file: https://glitch.com/edit/#!/horn-sausage?path=sw.js:9:0
The relevant snippet of code, which also accounts for the fact that the files are served without CORS (so you get back an opaque response with a status of 0), is:
workbox.routing.registerRoute(
  new RegExp('https://drive.google.com/'),
  new workbox.strategies.CacheFirst({
    plugins: [
      new workbox.cacheableResponse.Plugin({statuses: [0, 200]})
    ],
  })
);
I have a Node.js server with Express running on the same machine as nginx. This is my nginx config in sites-enabled:
upstream project {
    server 192.168.0.102:3100;
}
server {
    listen 80;
    location / {
        proxy_pass http://project;
    }
}
With this config, when I type my domain on a public computer, the first page of my website shows up, so that's okay. But the page is a form for uploading data to my server, and when I press Upload on my website, nothing happens. This is my pseudocode:
app.get("/", function(request,response){
//Code to send html page. This is the part where it runs fine.
});
app.post("/package/upload", function(request,response){
//Code to read from request.body and request.files and save uploaded file to my server.
//Nothing happens when I try to upload through a normal XMLHTTP object in my website.
});
I've been working with servers for some time, but it's my first time using nginx for server optimization and load balancing. Can somebody tell me what I'm missing?
(I can't comment, not enough rep)
It looks like you've set up proxy_pass only on location /. Have you tried defining a location /package/upload block for your POST, with a corresponding proxy_pass?
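Untested, but something along these lines, reusing your existing upstream block (also note that nginx rejects request bodies over its default 1 MB client_max_body_size with a 413, which can look like "nothing happens" on upload):
location /package/upload {
    proxy_pass http://project;
    client_max_body_size 20M; # default is 1M; larger uploads get a 413
}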
What do you see in the Network panel of your browser's developer tools when you try to upload? HTTP status codes help a ton when debugging.
I currently have the transmission-daemon web UI served by nginx:
server {
    listen 2324;
    server_name torrent.example.com;
    location /rpc {
        proxy_pass http://127.0.0.1:9091/transmission/rpc;
    }
    location /transmission {
        proxy_pass http://127.0.0.1:9091/transmission;
    }
    location / {
        proxy_pass http://127.0.0.1:9091/transmission/web/;
    }
}
I am trying to display this page inside the dashboard/user interface of https://github.com/stormpath/stormpath-express-sample.
In routes/index.js I have:
router.get('/torrent', function (req, res, next) {
  if (!req.user || req.user.status !== 'ENABLED') {
    return res.redirect('/login');
  }
  var newurl = 'http://127.0.0.1:2324';
  request(newurl).pipe(res);
});
I see the HTML when I go to /torrent, but no images/CSS/JS. I am thinking request is not the right tool for this purpose; could someone offer a better solution?
Many thanks.
Your HTML probably refers to CSS/images/etc. using URLs such as /index.css. The browser resolves these into fully-qualified URLs like http://torrent.example.com/index.css, which is not proxied by nginx the way you have it set up.
You probably want to either use URLs such as /transmission/index.css for your CSS (where it is referenced in the HTML), or alternatively add a <base> tag (e.g. <base href="/transmission/web/">) to your HTML.
OK, so I have made progress with the HTML/CSS: I moved the transmission interface into the Express root and imported the HTML into Jade. But now I have a new problem:
when I load the /torrent page, I can see in the web console that it makes a request to /rpc, for which I have made a route:
router.get('/rpc|/rpc/', function (req, res) {
  var newurl = 'http://127.0.0.1:9091/torrent/rpc/';
  request(newurl).pipe(res);
});
But this comes up with a 404, and when I change router.get to router.post I get a 405 error.
I have removed the 409 response (transmission's session-id check) from transmission, so this should work.
I have solved the issue:
- I imported the transmission index.html into a Jade template
- I routed /torrent to render that template
- I made a new route for /rpc|/rpc/ and had it do a POST request to the backend transmission-daemon, as sketched below
- I also changed /js/remote.js to look for RPC._Root at the actual domain
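For anyone else hitting this, a sketch of what that POST route might look like with the request library (the path and backend URL are taken from the posts above):
router.post('/rpc|/rpc/', function (req, res) {
  // Pipe the incoming RPC POST through to transmission-daemon
  // and pipe its response back to the client.
  req.pipe(request.post('http://127.0.0.1:9091/transmission/rpc')).pipe(res);
});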
I am currently using AngularJS and Node.js.
I have some Angular templates on my server. When Angular requests a template, it is rendered in the browser and cached by the browser. If I change the layout of a template and update it on the server, users who return to that template still get the old version rendered from the browser cache.
How do I overcome this problem?
NOTE: I don't want to clear the whole browser cache, since that would affect my overall website performance.
Practical
On some page you are not caching, you can append ?ver= to the URL to "break" the cache. Whenever you change the URL, the browser reloads the resource. Add ?ver= to cached resources and change its value whenever you want a reload.
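A minimal sketch of that idea (the APP_VERSION value and template path are hypothetical; bump the version whenever you deploy changed templates):
// Hypothetical build version; change it whenever templates change.
var APP_VERSION = '2015-06-01-1';
function versioned(url) {
  return url + '?ver=' + encodeURIComponent(APP_VERSION);
}
// e.g. in a directive or route definition:
// templateUrl: versioned('/templates/profile.html')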
Interesting but less practical
Creating a single-page app, I solved this sort of issue using AppCache. With the application cache you can ask the page to reload a resource. From the linked page:
// Check if a new cache is available on page load.
window.addEventListener('load', function (e) {
  window.applicationCache.addEventListener('updateready', function (e) {
    if (window.applicationCache.status == window.applicationCache.UPDATEREADY) {
      // Browser downloaded a new app cache.
      // Swap it in and reload the page to get the new hotness.
      window.applicationCache.swapCache();
      if (confirm('A new version of this site is available. Load it?')) {
        window.location.reload();
      }
    } else {
      // Manifest didn't change. Nothing new from the server.
    }
  }, false);
}, false);
Note: this only works in newer browsers such as Chrome, Firefox, Safari, Opera, and IE10. I wanted to suggest a new approach.