As a file hosting provider, how do you prevent phishing?

We develop a service much like Dropbox, Google Drive, or S3, where our customers can host their files and get direct links to them under our domain, e.g. www.ourservice.com/customer_site/path/to/file.
Recently we started receiving emails from our server provider saying that we host phishing .html files. It turned out that some of our customers had indeed uploaded such files; of course we removed them and blocked those customers.
But we would like to prevent this from happening at all, mainly because it hurts our Google Search ranking, and of course we never wanted to be a host for phishing scripts.
How do other file hosting providers solve this? I guess we could run some checks (anti-virus, anti-phishing, etc.) when customers upload their files, but that seems tedious given the volume of uploads. Another option is to periodically scan all files, or all new files, but I'd still like to ask whether I'm missing an easier solution.
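For illustration, the kind of upload-time check I have in mind would shell out to a scanner such as ClamAV's clamscan (a sketch, assuming clamscan is installed on the host; note that plain phishing HTML often slips past signature-based scanners, so a reputation service like Google Safe Browsing would be a useful second layer):

```php
<?php
// Sketch of an upload-time scan hook using ClamAV's clamscan CLI.
// clamscan exit codes: 0 = clean, 1 = infected, 2 = error.
function isUploadClean(string $tmpPath): bool
{
    exec('clamscan --no-summary ' . escapeshellarg($tmpPath), $output, $exitCode);
    return $exitCode === 0;
}

if (!isUploadClean($_FILES['upload']['tmp_name'])) {
    http_response_code(422);
    exit('Upload rejected: file failed the malware scan.');
}
// ...otherwise move_uploaded_file() and continue as usual
```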

Related

Need some ideas on how one can spam a website, crawl a website, and waste its resources

I am working on a startup that basically serves websites. Sorry, I can't reveal many details about it.
I need some ideas on how spammers and crawler devs think about attacking a website, and, if possible, ways to prevent such attacks too.
We have come up with some basic ideas like:
1. Include a small JS file in the sites that sends an ACK to our servers once all the assets are loaded. Some crawlers/bots only fetch specific things from websites, like images or articles; in such cases our JS won't be triggered. When we study our logs, which record the resources requested by each IP and whether our JS was triggered, we can whitelist or blacklist IPs based on that.
2. Like email services do, we will load a 1x1 px image on the client side via an API call. In simple words, we won't put the "img" tag directly in our HTML, but rather JS that calls an API on our server, which returns the image to the client (a sketch of such an endpoint follows this list).
3. We also have a method to detect good bots, like Google's indexer, so we can differentiate between good bots and bad bots that just waste our resources (also sketched below).
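To make points 2 and 3 concrete, here is a rough sketch (the endpoint name and logging are illustrative; the reverse-then-forward DNS lookup is the verification method Google documents for Googlebot):

```php
<?php
// pixel.php - sketch of the 1x1 tracking-pixel endpoint from point 2.
// Page JS would request this instead of a hard-coded <img> tag.
$ip = $_SERVER['REMOTE_ADDR'];
$ua = $_SERVER['HTTP_USER_AGENT'] ?? '';

// Point 3: verify a "good" bot with a reverse-then-forward DNS lookup,
// since the User-Agent header alone is trivially spoofed.
function isVerifiedGooglebot(string $ip): bool
{
    $host = gethostbyaddr($ip);                 // reverse lookup
    if ($host === false || !preg_match('/\.(googlebot|google)\.com$/', $host)) {
        return false;
    }
    return gethostbyname($host) === $ip;        // forward-confirm
}

// Illustrative logging; the real system would write to Elasticsearch.
error_log(sprintf('pixel hit ip=%s googlebot=%d ua=%s',
                  $ip, isVerifiedGooglebot($ip), $ua));

// Return a transparent 1x1 GIF.
header('Content-Type: image/gif');
header('Cache-Control: no-store');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```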
We are at a very basic level. In fact, all our code does right now is log the IPs, and the assets requested by each IP, in Elasticsearch.
So we need ideas on how people spam/crawl websites via crawlers/bots/etc., so that we can come up with a solution. If possible, please also mention the pros and cons of your ideas and ways to defend against them.
Thanks in advance. If you share your ideas, you'll be helping a startup that will be doing a lot of good stuff.

Using another server to store files: Good or bad idea?

I am thinking of using another, "less" important server to store the files that our clients want to upload, and handling the data validation, copying, insertion, etc. at that end.
I would display the whole upload feature through an iframe on our website, building it with HTML, PHP, and SQL.
Now I would like to ask your opinion: is this a good or a bad idea?
As I see it, the pros and cons are:
**Pros:**
- The other server is "less" valuable, meaning that if something malicious were uploaded there it would not be the end of the world
- Since the other server has fewer events/users/functions/data, it would lessen the load on our main web server
- If the less important server goes down, the functionality on the main server would keep working
- The firewall blocks outside traffic (at least to a certain point)
- Users need to be logged in through the main website
**Cons:**
- It does not have any CMS + plugins, so it might be more vulnerable
- It might attract more malicious traffic
- It makes the upkeep of the main website that much more complicated for future developers
Generally I'm not fond of the idea of letting users upload files, but it is not up to me.
Thanks for your input. I'm looking forward to hearing your opinions.
Servers have file quotas and bandwidth defined/allocated for them. If you move your "less" used files to another server, it will help your main server's performance. There will also be fewer maintenance headaches with the main server if all files are uploaded to the other one.
Conclusion: it is a good idea.
Well, I guess most importantly, you will need a single sign-on (SSO) solution in place between the two web applications. I assume you don't want user A to be able to read or delete user B's files.
SSO between two servers is a lot more complicated than for a single web application, unless the site is only deployed on an intranet with an Active Directory domain controller, in which case you can use Kerberos.
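If full SSO is overkill, a common lightweight pattern is a short-lived HMAC-signed token: the main site signs a user ID plus an expiry, and the file server verifies the signature with a shared secret before trusting the identity. A minimal sketch (the names and secret handling are illustrative):

```php
<?php
// Shared between both servers; in practice load this from config, not source.
const SSO_SECRET = 'replace-with-a-long-random-secret';

// Main server: issue a token that the iframe passes to the upload server.
function issueToken(string $userId, int $ttl = 300): string
{
    $payload = $userId . '|' . (time() + $ttl);
    $sig = hash_hmac('sha256', $payload, SSO_SECRET);
    return base64_encode($payload . '|' . $sig);
}

// Upload server: verify the token before accepting or serving any file.
function verifyToken(string $token): ?string
{
    $decoded = base64_decode($token, true);
    if ($decoded === false) return null;
    $parts = explode('|', $decoded);
    if (count($parts) !== 3) return null;
    [$userId, $expires, $sig] = $parts;
    $expected = hash_hmac('sha256', "$userId|$expires", SSO_SECRET);
    if (!hash_equals($expected, $sig) || time() > (int)$expires) return null;
    return $userId;   // trusted identity for per-user authorization checks
}
```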
I'm not sure it's worth it just for the advantages you name.

Setting up a secure webserver

So I have finished creating my first website that I will be hosting online. It has PHP, HTML, and JavaScript. Now I am looking for a way to host my website securely. I have looked at sites like GoDaddy and Web Hosting Hub. I was wondering what the best hosting service would be for my needs.
My needs:
- Able to run PHP
- Has an actual domain name, like www.noahhuppert.com
- Able to obscure the code so people cannot just copy it (my website is for my web design company and shows example templates people can use, but I don't want people just stealing those templates with a simple right click + Inspect Element)
- Able to run server-side scripts (like slowing down connections for users who fail to log in too many times, to prevent brute-force cracking attempts)
- Denies access to people reading files (I don't want people downloading my password hash files or anything like that)
- Able to host files on the service's servers; I don't just want a DNS record pointing back to my computer.
This question is asking for an opinion. Basically any Linux web host will provide most of what you're looking for. You're asking for an opinion about which hosting site is the best, and I cannot answer that.
What I do want to warn you about is this. From your question, you're concerned with:
- Security: this is not a web-host feature but a property of secure web code. See https://www.owasp.org/index.php/Top_10_2013 for a great introduction to website security.
- Obscuring code: you cannot prevent someone from stealing your CSS. They will not get at your raw templates (I'm assuming you're using templates) if you set your file permissions correctly on the web server.
- Brute-force protection: you'll need to code that up yourself. The web host provider would not (and should not) rate-limit your connections.
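For example, a minimal per-IP throttle in the login script might look like this (a sketch; it assumes the APCu extension is available, and the key scheme and thresholds are arbitrary):

```php
<?php
// Sketch: reject logins from an IP after too many recent failures.
function tooManyAttempts(string $ip, int $max = 5): bool
{
    return (apcu_fetch("login_fail:$ip") ?: 0) >= $max;
}

function recordFailure(string $ip, int $window = 900): void
{
    // apcu_add() sets the 15-minute TTL on the first failure only;
    // later failures just increment the existing counter.
    if (!apcu_add("login_fail:$ip", 1, $window)) {
        apcu_inc("login_fail:$ip");
    }
}

$ip = $_SERVER['REMOTE_ADDR'];
if (tooManyAttempts($ip)) {
    http_response_code(429);
    exit('Too many failed logins; try again later.');
}
// ...check credentials; call recordFailure($ip) on a bad password.
```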

I need to speed up my site and reduce the number of files calls

My web host is asking me to speed up my site and reduce the number of file requests.
OK, let me explain a little. My website is used 95% of the time as a bridge between my database (on the same hosting) and my Android applications (I have around 30 that need information from the db). For now the information only goes one way: an app calls a JSON string like this one on the site:
http://www.guiasitio.com/mantenimiento/applinks/prlinks.php
and this webpage, shown in a web view as a welcome message:
http://www.guiasitio.com/movilapp/test.php
This page has some images and jQuery, so I think these are what use a lot of the resources. They have told me to use some code to cache those files in the visitor's browser to save resources (which is all Greek to me, since I don't understand it). Can someone give me an idea and point me to a tutorial on how to get this done? And can the web view in an Android app keep a cache of these files?
All your help is highly appreciated. Thanks.
Using a CDN (content delivery network) could be an easy solution if it works well for you. Essentially you are off-loading the work of storing and serving static files (mainly images and CSS files) to another server. In addition to reducing the load on your current server, it will speed up your site, because files will be served from a location close to each site visitor.
There are many good CDN choices. Amazon CloudFront is one popular option, though in my opinion the prize for the easiest service to set up goes to CloudFlare. They offer a free plan: simply fill in the details, change the DNS settings on your domain to point to CloudFlare, and you will be up and running.
With some fine-tuning, you can expect to reduce the requests hitting your server by up to 80%.
I use both Amazon and CloudFlare, with good results. I have found that the main thing to be cautious about is to carefully check all the scripts on your site and make sure they are working as expected. CloudFlare also has a simple setting where you can specify the cache behaviour, so there's another detail on your list covered.
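One more detail: the "cache those files in the visitor's browser" advice from your host just means sending HTTP caching headers. In PHP, before any output, something like this (the lifetime is illustrative) tells browsers, and the Android web view, to reuse the files instead of re-downloading them:

```php
<?php
// Sketch: tell clients they may cache this response for 7 days.
// Must be sent before any page output.
$maxAge = 7 * 24 * 3600;
header('Cache-Control: public, max-age=' . $maxAge);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
```

For static files served directly by Apache, the same effect usually comes from mod_expires rules rather than PHP.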
Good luck!

Authorised access to externally hosted files

I'm looking for a solution to what I would have thought is a fairly common industry scenario, but I may be approaching it incorrectly.
I have a website which includes areas for member file uploads. The files (PDFs, MP3s, docs, etc.) can then be shared with other members according to various authorisation criteria. I'd like the files to be stored and served from a cloud service such as Rackspace Cloud Files, separate from the website, so that file-streaming traffic does not affect web performance and so that the site doesn't need to be hosted on an expensive cloud package.
I need a solution that ensures the URLs for files on the file server require an authorisation check before access. I had hoped to install a script on Rackspace Cloud Files that does an API check back to my website before serving, but apparently Rackspace Cloud Files doesn't allow this.
Imagine a site where you purchase MP3s. I've done this before, where the MP3 file is streamed via a PHP web script that checks you've paid, but I don't want my web server to be affected by streaming large or frequently downloaded files.
Surely this is common and simple?!
Rackspace TempURL functionality might fit these needs.
The architecture of the solution then looks like this:
1. Host the authorization/roles/permissions code on your (expensive) application-server hosting
2. Upon successful authorization for the resource, generate a TempURL that expires after a few seconds (see the sketch after this list)
3. Use a 302 redirect to send the client to the TempURL for the resource
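A sketch of steps 2 and 3, using the Swift-style TempURL signing that Rackspace Cloud Files supports (the key, account, and object path below are illustrative, and the key must match the Temp-URL-Key metadata set on the storage account):

```php
<?php
// Sketch: sign a short-lived TempURL and redirect the client to it.
$tempUrlKey = 'my-secret-temp-url-key';            // assumption: configured on the account
$method  = 'GET';
$expires = time() + 30;                            // link is valid for 30 seconds
$path    = '/v1/AUTH_account/container/song.mp3';  // hypothetical object path

// Swift TempURLs are an HMAC-SHA1 over "METHOD\nexpires\npath".
$sig = hash_hmac('sha1', "$method\n$expires\n$path", $tempUrlKey);
$url = "https://storage.example.com{$path}?temp_url_sig={$sig}&temp_url_expires={$expires}";

header('Location: ' . $url, true, 302);           // step 3: redirect to the file
exit;
```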
Now, with this solution there's nothing preventing a malicious authorized user from distributing the URL for multiple downloads within the small window in which it's active. You can decide whether that's a problem, given that you intended to serve the file to them and they could just as easily distribute it out of band afterward.
