Are web page data files stored locally for a time?

When we hit a URL (like facebook.com) in any browser (like Chrome), the page uses many resources, such as JS files, images, property files, etc. So, are they stored locally, even temporarily?

Yes, it's called the browser cache :-) Websites can also use local storage to keep some data on your machine. Additionally, along the way, various servers might cache the resources; ISPs do this a lot.
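The local storage mentioned above is just a small key-value store that the page's own JavaScript can read and write. A minimal sketch (the key and value here are made-up examples):

    // Runs in the browser: persist a small piece of data across visits.
    localStorage.setItem("theme", "dark");

    // Later, even after the tab was closed, read it back (null if never set).
    const theme: string | null = localStorage.getItem("theme");
    console.log(theme); // "dark"

    // Remove it again when it is no longer needed.
    localStorage.removeItem("theme");

The browser cache, by contrast, is managed by the browser itself based on the HTTP caching headers the server sends; the page doesn't have to do anything.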

Related

Saving images to the file system in ASP.NET Core

I'm building an application that saves a lot of images using the C# Web API of ASP.NET.
The best way seems to be to save the images in the file system and store their paths in the database.
However, I am concerned about load balancing. Since every server will put the image in its own file system, how can another server behind the same load balancer retrieve the image?
If you have the resources for it, I would say that
"the best way is to save them in the file system and save the image path into the database"
is not true at all.
Instead, I'd say using an existing file storage service is probably going to produce the best results, if you are willing to pay for the service.
For .NET the 'go to' would be Azure Blob Storage, which is ideal for non-streamed data like images.
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet
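As a rough illustration of that approach, here is a sketch that uploads an image and returns a URL you can store in the database; it uses the JavaScript SDK (@azure/storage-blob) to show the shape of the flow, while the .NET quickstart linked above covers the equivalent C# calls. The container name, blob naming scheme and connection-string variable are assumptions.

    // Sketch only: push image bytes to Blob Storage, get back a stable URL.
    import { BlobServiceClient } from "@azure/storage-blob";

    async function saveImage(id: string, bytes: Buffer): Promise<string> {
      const service = BlobServiceClient.fromConnectionString(
        process.env.AZURE_STORAGE_CONNECTION_STRING!, // assumed env var
      );
      const container = service.getContainerClient("images"); // assumed container
      await container.createIfNotExists();

      const blob = container.getBlockBlobClient(`${id}.jpg`); // illustrative naming
      await blob.uploadData(bytes, {
        blobHTTPHeaders: { blobContentType: "image/jpeg" },
      });

      // Store blob.url (or just the blob name) in the database instead of a
      // local file path, so any server behind the load balancer can serve it.
      return blob.url;
    }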
Otherwise, you can try to create your own file storage service from scratch, in which case you will effectively be creating a separate service API apart from your main cluster that handles your actual app; this secondary API just handles file storage and runs on its own dedicated server.
You then simply create an association between Id <-> File Data on the File Server, and your App Servers can request and push files to the File Server via those Ids.
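A minimal sketch of that Id <-> file association, using Express; the routes, port and storage directory are illustrative only, and real code would need validation, auth and streaming for large files.

    // Bare-bones file server: app servers push and pull raw bytes by Id.
    import express from "express";
    import { promises as fs } from "node:fs";
    import path from "node:path";

    const app = express();
    const STORAGE_DIR = "/var/fileserver/blobs"; // assumed storage root

    // Push: store the request body under the given Id.
    app.put("/files/:id", express.raw({ type: "*/*", limit: "50mb" }), async (req, res) => {
      await fs.writeFile(path.join(STORAGE_DIR, req.params.id), req.body);
      res.sendStatus(204);
    });

    // Pull: return the bytes for an Id.
    app.get("/files/:id", (req, res) => {
      res.sendFile(path.join(STORAGE_DIR, req.params.id));
    });

    app.listen(3000);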
It's possible, but a File Server is for sure one of those projects that seems straightforward at first, and very quickly you realise it's a very, very difficult task; it may have been easier to just pay for an existing service.
There might be existing self-hosted file server options out there as well!

Best way to implement a server-side cache in Node.js

I'm trying to implement a server-side cache in Node.js. I've read about express-redis-cache, but how would this solution work with load-balanced Node servers? I might use something like the AWS Redis service, but that seems to defeat the whole purpose of Redis, since putting it on some external server increases latency. Can you suggest the best approach for this?
PS - I have some .md and .json files from which I generate .html files to return. Instead of regenerating them on every request, I want some caching that stores the generated .html files. I'll update the cached content only when my .md and .json files are updated.
I've read about express-redis-cache, but how would this solution work with load balanced node servers?
It wouldn't be a problem, because all your load-balanced Node servers would connect to the same Redis host, which is fine.
I might use something like AWS Redis Service, but it loses the whole purpose of using Redis on some external server as it increases latency
It depends on how you architect your app. If you are fully hosted on AWS, ElastiCache is designed for exactly this; latency would be minimal because the connection stays within the VPC, which is fast. If you need to connect to ElastiCache from an on-premises client, you still have options: a VPN (not ideal) or Direct Connect, which would be much faster than a VPN.
Having said that, if you are looking to cache .html files, then look at CloudFront instead of a bespoke caching solution using Redis.
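For the Redis route, a rough sketch of the shared-cache idea with node-redis: every load-balanced instance connects to the same Redis host, and the rendered HTML is cached under a key derived from the page. REDIS_URL, the key scheme and renderPage() are placeholders for whatever the real app uses.

    // Sketch: cache generated HTML in a Redis instance shared by all servers.
    import { createClient } from "redis";

    const redis = createClient({ url: process.env.REDIS_URL }); // same host everywhere
    await redis.connect();

    async function getPage(slug: string): Promise<string> {
      const key = `html:${slug}`;
      const cached = await redis.get(key);
      if (cached !== null) return cached;  // cache hit: skip regeneration

      const html = await renderPage(slug); // regenerate from the .md/.json sources
      await redis.set(key, html);          // no TTL: DEL the key when the sources change
      return html;
    }

    // Hypothetical stand-in for the real .md/.json -> .html rendering step.
    async function renderPage(slug: string): Promise<string> {
      return `<html><body>${slug}</body></html>`;
    }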

Host and Deliver Big Files on Nodejs Nodejitsu

I have a website hosted on Nodejitsu using Node.js.
I want people to be able to download files. Overall, there are about 1k files of 1MB each, for a total of 1GB. Those files are in the same directory as the regular code.
When I try to deploy, there's the message: "Snapshot is larger than 70M!"
How are you supposed to deliver files with Node.js? Do I need to host them on a separate website (e.g. MediaFire) and redirect people there? Or is there a special place to put them?
Services like Nodejitsu are meant for hosting your application. It sounds like these are static files, not something generated by your application. I recommend putting them on a CDN if they will get a lot of traffic; CloudFront can easily sit in front of S3. Otherwise, buy a cheap VPS and dump your files there.
I also recommend not using your application server (your Node.js server) to host static content. While it can certainly do this, software like Nginx is often faster. (Unless of course you have a specific reason to serve these from your application...)
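If you do keep a thin Node front end, one common pattern is to have it hand out CDN URLs rather than the file bytes themselves. A sketch with Express; the CloudFront domain is a placeholder.

    // Sketch: the app never streams the files; it redirects to the CDN copy.
    import express from "express";

    const app = express();

    app.get("/download/:name", (req, res) => {
      // CloudFront (in front of S3) serves the actual bytes.
      res.redirect(302, `https://d111111abcdef8.cloudfront.net/files/${encodeURIComponent(req.params.name)}`);
    });

    app.listen(3000);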

What solutions are there to backup millions of image files and sub-directories on a webserver efficiently?

I have a website that I host on a Linux VPS which has been growing over the years. One of its primary functions is to store images/photos, and these image files are typically around 20-40kB each. The way the site is organised at the moment is that all images are stored in a root folder ‘photos’, and under that root folder are many subfolders determined by a random filename. For example, an image with the file name abcdef1234.jpg would be stored in the folder photos/ab/cd/ef/. The advantage of this is that there are no directories with excessive numbers of images in them, and accessing files is quick.

However, the entire photos directory is huge and is set to grow. I currently have almost half a million photos in tens of thousands of sub-folders, and whilst the system works fine, it is fairly cumbersome to back up. I need advice on what I could do to make life easier for back-ups.

At the moment, I am backing up the entire photos directory each time, by compressing the folder and downloading it. It takes a while and puts some strain on the server. I do this because every FTP client I have used takes ages to sift through all the files and find the most recent ones by date. Also, I would like to be able to restore the entire photo set quickly in the event of a catastrophic webserver failure, so even if I could back up the data incrementally, how cumbersome would it be to upload it all back stage by stage?
Does anyone have any suggestions, perhaps from experience? I am not a webserver administrator and my experience of Linux is very limited. I have also looked into CDNs and Amazon S3, but these would require a great deal of change to my site to make them work; perhaps I'll use something like that in the future.
Since you indicated that you run a VPS, I assume you have shell access which gives you substantially more flexibility (as opposed to a shared webhosting plan where you can only interact with a web frontend and an FTP client). I'm pretty sure that rsync is specifically designed to do what you need to do (sync large numbers of files between machines, and do so efficiently).
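As a sketch of what that might look like in practice, the snippet below drives rsync from a small Node script (Node is used here only as a convenient runner; the rsync arguments are the interesting part and work the same from a cron job). The paths and host are placeholders, and it assumes rsync plus SSH key access to the backup machine.

    // Sketch: copy only new/changed photos to a backup host instead of
    // re-compressing and downloading the whole tree every time.
    import { execFileSync } from "node:child_process";

    execFileSync(
      "rsync",
      [
        "-a",        // archive mode: recurse, preserve permissions and timestamps
        "-z",        // compress data in transit
        "--delete",  // remove files from the backup that were deleted locally
        "/var/www/photos/",                           // trailing slash: sync folder contents
        "backup@backup.example.com:/backups/photos/", // placeholder destination
      ],
      { stdio: "inherit" }, // show rsync's output
    );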
This gets into Superuser territory, so you might get more advice over on that forum.

Uploading large files in JSF

I want to upload a file that is >16GB. How can I do this in JSF?
When using HTTP, you'll face two limitations: one on the client side (the web browser) and one on the server side (the web server). The average web browser (IE/FF/Chrome/etc.) has a limit of 2~4GB, depending on the make/version/platform. You cannot control this from the server side; the end user has to change the browser settings themselves (and sometimes this isn't possible at all). The average web server (Tomcat/JBoss/Glassfish/etc.) in turn has a limit of 2GB. You can configure this, but it still won't and can't remove the limitation on the web browser.
Your best bet is FTP. If you want to do this from a web page, consider an applet which utilizes Apache Commons Net FTPClient. There are several ready-to-use open source/commercial ones, by the way.
You do, however, still need to take into account whether the file system on the FTP server side supports files that large. FAT32, for example, has a limit of 4GB per file; NTFS and several *nix file systems can go up to 16EB.
