I'm looking for a solution to what I would have thought is a fairly common industry scenario, but I may be approaching it incorrectly.
I have a website which includes areas for member file uploads. The files (PDFs, MP3s, docs, etc.) can then be shared with other members according to various authorisation criteria. I'd like the files to be stored and served from a cloud service such as Rackspace Cloud Files, separate from the website, so that file streaming traffic does not affect web performance and so that the site doesn't need to be hosted on an expensive cloud package.
I need a solution to ensure that the URLs for files on the file server require authorisation checks for access. I had hoped to install a script on Rackspace Cloud Files to do an API check back to my website before serving, but apparently Rackspace Cloud Files doesn't allow this.
Imagine a site where you purchase MP3s. I've done this before, where the MP3 file is streamed via a PHP web script which checks you've paid, but I don't want my web server to be affected by streaming large or frequently downloaded files.
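For reference, that "stream it through PHP" approach looks roughly like the sketch below; user_owns_track() and the file location are placeholders for illustration, not real code, and the point is that the web server stays busy for the whole download:

    <?php
    // Illustrative sketch of the "stream it through PHP" approach described above.
    // user_owns_track() and the file location are placeholders, not real code.
    session_start();

    if (empty($_SESSION['user_id']) || !user_owns_track($_SESSION['user_id'], $_GET['track_id'])) {
        http_response_code(403);
        exit('Not authorised.');
    }

    $file = '/srv/private-files/' . basename($_GET['track_id']) . '.mp3';
    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($file));
    readfile($file); // the web server is tied up for the entire download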
Surely this is common and simple?!
Rackspace TempURL functionality might fit these needs.
Then the architecture of the solution looks like:
Host authorization/roles/permissions code on your (expensive) application server hosting
Upon successful auth for the resource, generate a TempURL that expires after a few seconds (see the sketch below)
Use a 302 redirect to send the client to the TempURL for the resource
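A rough PHP sketch of steps 2 and 3, assuming the TempURL secret has already been set on the Cloud Files account (X-Account-Meta-Temp-URL-Key); the storage URL, account and object path below are placeholders:

    <?php
    // Cloud Files/Swift TempURL signature: HMAC-SHA1 over "METHOD\nexpires\npath".
    // The key is the secret previously set on the account; values below are placeholders.
    $key        = getenv('TEMP_URL_KEY');
    $storageUrl = 'https://storage101.example.clouddrive.com';
    $path       = '/v1/MossoCloudFS_abc123/music/track.mp3';
    $expires    = time() + 30;                        // link dies after ~30 seconds

    $sig = hash_hmac('sha1', sprintf("GET\n%d\n%s", $expires, $path), $key);

    $tempUrl = sprintf('%s%s?temp_url_sig=%s&temp_url_expires=%d',
                       $storageUrl, $path, $sig, $expires);

    header('Location: ' . $tempUrl, true, 302);       // step 3: bounce the client to the file
    exit;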
Now, with this solution there's nothing preventing a malicious authorized user from distributing the URL for multiple downloads within the small window that it's active. You can decide whether that's a problem, given that you intended to serve the file to them and they could just as easily distribute it out of band afterward.
We develop a service much like Dropbox or Google Drive or S3 where our customers can host their files and get direct links to them with our domain, e.g. www.ourservice.com/customer_site/path/to/file.
Recently we started receiving emails from our server provider saying that we host phishing .html files. It turned out that some of our customers had indeed uploaded such HTML files; of course we removed the files and blocked those customers.
But we would like to prevent this from happening at all, mainly because it lowers our Google Search ranking, and of course we never wanted to be a host for phishing scripts.
How do other file hosting providers solve this? I guess we could run some checks (anti-virus, anti-phishing, etc.) when customers upload their files, but that would be pretty tedious considering how many uploads there are. Another option is to periodically check all the files, or all new files, but I'd still like to ask whether I'm missing an easier solution.
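For illustration, a minimal scan-on-upload hook with ClamAV might look like the following; it assumes the clamd daemon is installed and clamdscan is on the PATH, and everything else is made up for the example:

    <?php
    // Illustrative scan-on-upload hook. clamdscan exits 0 when clean, 1 when a signature matched.
    function upload_is_clean(string $tmpPath): bool
    {
        exec('clamdscan --no-summary ' . escapeshellarg($tmpPath), $output, $exitCode);
        return $exitCode === 0;
    }

    if (!upload_is_clean($_FILES['upload']['tmp_name'])) {
        http_response_code(422);
        exit('File rejected by the malware scan.');
    }
    // ...otherwise continue with the normal upload handling...

A signature scanner will only catch known phishing pages, so in practice this tends to be combined with URL reputation checks (for example Google Safe Browsing) and abuse reporting, but even a basic hook raises the bar.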
I host a software product on Azure, and store the downloads themselves in a public container, which the website links to via URL. You can see my downloads page here: https://flyinside-fsx.com/Download
Normally I get somewhere in the range of 200-500 MB worth of downloads per day, with the downloaded files themselves being 15-30 MB. Starting this week, I've seen spikes of up to 220 GB per day from this storage container. It hasn't harmed the website in any way, but the transfer is costing me money. I'm certainly not seeing an increase in website traffic that would accompany 220 GB worth of downloads, so this appears to be either some sort of DoS attack or a broken automated downloader.
Is there a way to remedy this situation? Can I set the container to detect and block malicious traffic? Or should I be using a different type of file hosting entirely, which offers these sorts of protections?
To see what's going on with your storage account, the best approach is to use Storage Analytics, in particular the storage activity logs. These logs are stored in a special blob container called $logs. You can download them with any storage explorer that supports browsing that container.
I would highly recommend starting there and identifying exactly what is going on. Based on the findings, you can take corrective action. For example, if the traffic is coming from bots, you can put a simple CAPTCHA on the download page.
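If you suspect a handful of clients are hammering the container, a quick tally like this can confirm it. This is only an illustrative PHP sketch, assuming the blobs from $logs have been downloaded into a local logs/ directory; the requester-IP column position varies by log version, so verify $ipField against the Storage Analytics log format documentation:

    <?php
    // Tally requests per client IP from downloaded Storage Analytics log files.
    $ipField = 15;                 // assumed position of the requester-IP column
    $counts  = [];

    foreach (glob(__DIR__ . '/logs/*.log') as $file) {
        foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            $fields = str_getcsv($line, ';');            // log lines are semicolon-delimited
            $ip     = isset($fields[$ipField])
                    ? explode(':', $fields[$ipField])[0] // strip the :port suffix
                    : 'unknown';
            $counts[$ip] = ($counts[$ip] ?? 0) + 1;
        }
    }

    arsort($counts);
    print_r(array_slice($counts, 0, 10, true));          // top 10 requesters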
So I have finished creating my first website that I will be hosting online. It has PHP, HTML, and JavaScript. Now I am looking for a way to host my website securely. I have looked at sites like GoDaddy and Web Hosting Hub. I was wondering what the best hosting service would be for my needs.
My needs:
Able to run PHP
Have an actual domain name, like www.noahhuppert.com
Be able to obscure the code so people cannot just copy it (my website is for my web design company and includes example templates people can use, but I don't want people just stealing those templates with a simple right click + Inspect Element)
Run server-side scripts (like slowing down connections for users who fail to log in too many times, to prevent brute-force cracking attempts)
Deny access to people reading files (I don't want people downloading my password hash files or anything like that)
Be able to host files on the service's servers; I don't just want a DNS record pointing back to my computer.
This question is asking for an opinion about which hosting site is best, and I cannot answer that. Basically any Linux web host will provide most of what you're looking for.
What I do want to warn you about is this:
From your question, you're concerned with:
- security: this is not a web host provider feature, but a feature of secure web code. See https://www.owasp.org/index.php/Top_10_2013 for a great introduction to website security.
- obscure code: you cannot prevent someone from stealing your CSS. They will not get to your raw templates (I'm assuming you're using templates) if you set your file permissions right on the web server.
- brute force protection: you'll need to code that up yourself (a rough sketch follows this list). The web host provider would not (and should not) rate limit your connections.
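A rough sketch of that kind of throttling, assuming you can add a failed_logins table (ip, attempts, last_attempt) to whatever database the site already uses; the table, columns, and limits are made up for the example:

    <?php
    // Slow down and then reject clients that keep failing to log in.
    function too_many_failures(PDO $db, string $ip, int $max = 5, int $windowSec = 900): bool
    {
        $stmt = $db->prepare(
            'SELECT COALESCE(SUM(attempts), 0) FROM failed_logins
             WHERE ip = ? AND last_attempt > DATE_SUB(NOW(), INTERVAL ? SECOND)'
        );
        $stmt->execute([$ip, $windowSec]);
        $attempts = (int) $stmt->fetchColumn();

        if ($attempts >= $max) {
            sleep(min($attempts, 10));   // slow the client down before rejecting it
            return true;
        }
        return false;
    }

    // if (too_many_failures($db, $_SERVER['REMOTE_ADDR'])) { exit('Try again later.'); }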
My web host is asking me to speed up my site and reduce the number of file calls.
OK, let me explain a little. My website is used 95% of the time as a bridge between my database (on the same hosting) and my Android applications (I have around 30 that need information from my DB). The information only goes one way (for now): the app calls a JSON string like this one on the site:
http://www.guiasitio.com/mantenimiento/applinks/prlinks.php
and this webpage is shown in a WebView as a welcome message:
http://www.guiasitio.com/movilapp/test.php
This page has some images and jQuery, so I think those are what's generating most of the load. They have told me to use some code to make those files be cached in the visitor's browser (that is all Greek to me since I don't understand it). Can someone give me an idea and point me to a tutorial on how to get this done? Can the WebView in an Android app keep a cache of these files?
All your help is highly appreciated. Thanks
Using a CDN, or content delivery network, would be an easy solution if it works well for you. Essentially you are off-loading the work of storing and serving static files (mainly images and CSS files) to another server. In addition to reducing the load on your current server, it will speed up your site because files will be served from the location closest to each site visitor.
There are many good CDN choices. Amazon CloudFront is one popular option, though in my opinion the prize for the easiest service to set up goes to CloudFlare: they offer a free plan; simply fill in the details, change the DNS settings on your domain to point to CloudFlare, and you will be up and running.
With some fine-tuning, you can expect to reduce the requests on your server by up to 80%
I use both Amazon and CloudFlare, with good results. I have found that the main thing to be cautious of is to carefully check all the scripts on your site and make sure they are working as expected. CloudFlare has a simple setting where you can specify the cache settings as well, so there's another detail on your list covered.
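As for the code your host mentioned for caching files in the visitor's browser, it mostly comes down to sending cache headers. A minimal sketch for a PHP-generated page such as test.php (the one-week lifetime is an arbitrary example value):

    <?php
    // Tell the browser (or an Android WebView) it may reuse this response
    // for a week instead of re-downloading it on every visit.
    $maxAge = 7 * 24 * 3600;
    header('Cache-Control: public, max-age=' . $maxAge);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');

    // ...then render the page as before...

Static files such as the images and the jQuery library are usually given the same treatment through Apache's mod_expires rules (or by letting CloudFlare add the headers for you), and the Android WebView has its own HTTP cache and will generally honour these headers.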
Good luck!
Currently at my job, we distribute installers for our Windows-based software via an Apache web server on an Ubuntu server, using Apache authentication. When we initially started doing this we only had 3 projects to distribute and, as such, 3 htpasswd files to manage. Since then we have grown and are now distributing 8 projects as well as several sponsor-specific variants of each. In addition, we have started distributing videos, which are quite large, to potential research sponsors along with the software. Managing all of the htpasswd files has become a huge burden. So my question is: what is a better way to provide password-protected access to large file sets in a web-based manner? I am thinking that a CMS might be appropriate in this case, but I am interested in other ideas people may have. My specific requirements are:
Run on Apache/Linux. Specifically Ubuntu 6.06/Apache2
Free or relatively cheap, research doesn't provide for expensive enterprise software
Ability to easily create users and set an expiration date for their account
Ability to create logical collections of files, and restrict users to only see these specific collections
Able to handle relatively large files (upwards of hundreds of megabytes though this is rare). In addition, there should be an easy method to add files outside of a web interface as uploading a 300 MB video wouldn't be feasible via the web. A command line client would probably be best.
Any suggestions for software that can handle the above requirements are greatly appreciated.
Set up Apache to use LDAP for authentication. Then you can use a pre-existing LDAP frontend, or roll your own, to manage access rights and account expiration.
With LDAP, you could have a group for each project, so that users can have access to several projects by being in several groups.
Some info on setting up Apache for LDAP can be found here.
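To give a rough idea of what that configuration looks like, here is a hedged sketch using mod_authnz_ldap (Apache 2.2 syntax); the hostname, DNs, and group names are placeholders for your own directory layout:

    # One block per project; only members of the matching LDAP group get access.
    <Directory /var/www/downloads/project-a>
        AuthType Basic
        AuthName "Project A downloads"
        AuthBasicProvider ldap
        AuthLDAPURL "ldap://ldap.example.com/ou=people,dc=example,dc=com?uid"
        Require ldap-group cn=project-a,ou=groups,dc=example,dc=com
    </Directory>

Repeating the block with a different group per project gives you the per-collection restrictions, and account expiry is then handled in the directory rather than in htpasswd files.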
The directory can be managed with, for example, phpLDAPadmin, or the old but good LDAP Browser/Editor if you prefer an offline Java app.
I might consider Plone with the LDAP plugin.
As a side note, I'd also suggest updating to the more recent LTS release of Ubuntu, but it's not mandatory :)
It might be worth thinking about Amazon S3. It's not free, but it is very cheap.
You can't have users, but you can generate individually signed URLs for each file - URLs which will permit access for some pre-determined period of time.
So rather than having to register users, worry about distributing passwords, and expire them after some time, simply generate URLs for the files you need to share and give those URLs to your users.
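As an illustration, with the AWS SDK for PHP (v3) generating such a link takes only a few lines; the bucket, key, and region below are placeholders:

    <?php
    // Generate a time-limited (presigned) download URL for one object.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',
    ]);

    $command = $s3->getCommand('GetObject', [
        'Bucket' => 'research-downloads',
        'Key'    => 'project-a/installer-1.2.exe',
    ]);

    $presigned = $s3->createPresignedRequest($command, '+24 hours'); // link expires in a day
    echo (string) $presigned->getUri(), PHP_EOL;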
And there are any number of clients for putting files on S3 - if you want a command line interface, just mount it as a filesystem and "cp" the files there.