So I have finished creating my first website, which I will be hosting online. It uses PHP, HTML, and JavaScript. Now I am looking for a way to host it securely. I have looked at sites like GoDaddy and Web Hosting Hub, and I was wondering what the best hosting service would be for my needs.
My needs:
- Able to run PHP
- Have an actual domain name, like www.noahhuppert.com
- Be able to obscure the code so people cannot just copy it (my website is for my web design company and shows example templates people can use, but I don't want people simply stealing those templates with a right click + Inspect Element)
- Run server-side scripts (like slowing down connections for users who fail to log in too many times, to prevent brute-force cracking attempts)
- Deny access to people reading files (I don't want people downloading my password hash files or anything like that)
- Be able to host files on the service's servers; I don't just want a DNS record pointing back to my own computer.
This question is asking for an opinion. Basically any Linux web host will provide most of what you're looking for. You're asking for an opinion about which hosting site is the best, and I cannot answer that.
What I do want to warn you about is this:
From your question, you're concerned with:
- Security: this is not a feature of the web host provider, but of securely written code. See https://www.owasp.org/index.php/Top_10_2013 for a great introduction to website security.
- Obscuring code: you cannot prevent someone from stealing your CSS. They will not get to your raw templates (I'm assuming you're using templates) if you set the file permissions right on the web server.
- Brute-force protection: you'll need to code that up yourself, as sketched below. The web host provider would not (and should not) rate limit your connections.
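A minimal sketch of that kind of throttling in PHP, assuming a PDO connection and a hypothetical `login_attempts` table; none of these names come from a particular framework:

```php
<?php
// Hypothetical login throttle: count recent failures per IP in a
// login_attempts(ip, attempted_at) table and block once a limit is hit.
function tooManyAttempts(PDO $pdo, string $ip, int $max = 5, int $windowSeconds = 900): bool
{
    $stmt = $pdo->prepare(
        'SELECT COUNT(*) FROM login_attempts WHERE ip = :ip AND attempted_at > :cutoff'
    );
    $stmt->execute([
        ':ip'     => $ip,
        ':cutoff' => date('Y-m-d H:i:s', time() - $windowSeconds),
    ]);
    return (int) $stmt->fetchColumn() >= $max;
}

function recordFailedAttempt(PDO $pdo, string $ip): void
{
    $stmt = $pdo->prepare('INSERT INTO login_attempts (ip, attempted_at) VALUES (:ip, NOW())');
    $stmt->execute([':ip' => $ip]);
}

// In the login handler, before checking the password:
// if (tooManyAttempts($pdo, $_SERVER['REMOTE_ADDR'])) {
//     http_response_code(429);
//     exit('Too many failed logins. Please try again later.');
// }
```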
I'm hoping to get some kind of idea if what I have in mind is even possible or if I'm looking in the wrong place.
Basically, my company provides a website which users access online with credentials we sell and provide to them. We have another potential customer who would like to access this website. Sadly, this customer is very stuck in the past, and they don't allow their users any internet access at all.
For a number of reasons, I don't want them to host their own version of this website. However, I considered that we might configure a web proxy on their network (one that is given internet access) that forwards connections on to our website. Is this even possible? Should it be attempted? Or are there better ways to achieve this?
Yes, it's possible. You can install a simple proxy script on their intranet, for example
https://github.com/Athlon1600/php-proxy-app
and modify its index.php so that it only allows a single host: your website.
I don't know what technology you can use on their intranet, but such software is available for virtually every web language.
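If you'd rather roll something even simpler than php-proxy-app, a bare-bones pass-through restricted to a single host could look roughly like this (a sketch only, not php-proxy-app's API; the host name is a placeholder, and it ignores POST bodies, cookies, and most headers):

```php
<?php
// Minimal illustrative pass-through proxy that only ever talks to one host.
$allowedHost = 'www.example.com'; // placeholder for your website's host

$path = $_GET['path'] ?? '/';
$url  = 'https://' . $allowedHost . '/' . ltrim($path, '/');

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => false, // never follow redirects off the allowed host
    CURLOPT_TIMEOUT        => 15,
]);
$body   = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
$type   = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

http_response_code($status ?: 502);
if ($type) {
    header('Content-Type: ' . $type);
}
echo $body;
```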
Here is some discussion related to accessing sites blocked on a network; it is about Gmail specifically, but it should help you too:
https://superuser.com/questions/453825/how-to-bypass-web-url-filtering-service-to-access-blocked-websites-proxy
For bypassing a firewall and getting access to blocked sites:
http://www.makeuseof.com/tag/how-to-get-into-blocked-websites-in-school-with-freeproxy/
I am thinking of using another, "less" important server to store files that our clients want to upload, and handling the data validation, copying, insertion, etc. on that end.
I would display the whole upload feature on our website through an iframe, building it with HTML, PHP, and SQL.
Now I would like to ask your opinions on whether this is a good or bad idea.
I figure the pros and cons are:
**Pros:**
- The other server is "less" valuable, meaning that if something malicious were uploaded there it would not be the end of the world
- Since the other server has fewer events/users/functions/data, it would lessen the load on our main website server
- If the less important server goes down, the rest of the functionality on the main server would still work
- A firewall prevents outside traffic (at least to a certain point)
- Users need to be logged in through the main website
**Cons:**
- It does not have any CMS + plugins, so it might be more vulnerable
- It might attract more malicious traffic.
- It makes the upkeep of the main website that much more complicated for future developers
Generally I'm not fond of the idea of letting users upload files at all, but that is not up to me.
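To make the idea concrete, here is roughly what I have in mind for the handler on the "less" important server, embedded on the main site via an iframe (all names are made up, and this is only a sketch):

```php
<?php
// Illustrative upload handler on the "less important" server
// (hypothetical handle_upload.php). Assumes the main site has
// already authenticated the user.

$allowedTypes = ['image/png', 'image/jpeg', 'application/pdf']; // example whitelist
$maxBytes     = 5 * 1024 * 1024;                                // example 5 MB limit

if (!isset($_FILES['upload']) || $_FILES['upload']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('No file uploaded.');
}

$file = $_FILES['upload'];

if ($file['size'] > $maxBytes) {
    http_response_code(413);
    exit('File too large.');
}

// Check the real MIME type of the uploaded bytes, not the client-supplied one.
$finfo = new finfo(FILEINFO_MIME_TYPE);
$type  = $finfo->file($file['tmp_name']);
if (!in_array($type, $allowedTypes, true)) {
    http_response_code(415);
    exit('File type not allowed.');
}

// Store under a generated name, ideally outside the web root.
$target = '/var/uploads/' . bin2hex(random_bytes(16));
if (!move_uploaded_file($file['tmp_name'], $target)) {
    http_response_code(500);
    exit('Could not store file.');
}
echo 'Upload accepted.';
```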
Thanks for your input. I'm looking forward to hearing your opinions.
Servers have file quotas and bandwidth defined/allocated for them.
If you move your "less" used files to another server, it will help your main server's performance.
There also won't be as many maintenance headaches with the main server if all uploads go to the other server.
Conclusion: it is a good idea.
Well, I guess most importantly, you will need a single sign-on (SSO) solution in place between the two web applications. I assume you don't want user A to be able to read or delete files belonging to user B.
SSO between two servers is a lot more complicated than for a single web application, unless the site is only deployed on an intranet with an Active Directory domain controller, in which case you can use Kerberos.
I'm not sure it's worth it just for the advantages you name.
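If Kerberos isn't an option, one lightweight pattern is an HMAC-signed, expiring token that the main site appends to the iframe URL and the file server verifies before doing anything. A sketch, assuming both servers share a secret (all names here are illustrative):

```php
<?php
// Shared-secret token handoff between the main site and the upload server.
// Both sides must know SSO_SECRET; everything else here is illustrative.
const SSO_SECRET = 'replace-with-a-long-random-shared-secret';

// On the main site: build a short-lived token for the logged-in user.
function issueToken(string $userId, int $ttl = 300): string
{
    $payload = $userId . '|' . (time() + $ttl);
    $sig     = hash_hmac('sha256', $payload, SSO_SECRET);
    return base64_encode($payload . '|' . $sig);
}

// On the upload server: verify the token before accepting an upload.
// Returns the user id, or null if the token is invalid or expired.
function verifyToken(string $token): ?string
{
    $parts = explode('|', base64_decode($token, true) ?: '');
    if (count($parts) !== 3) {
        return null;
    }
    [$userId, $expires, $sig] = $parts;
    $expected = hash_hmac('sha256', $userId . '|' . $expires, SSO_SECRET);
    if (!hash_equals($expected, $sig) || time() > (int) $expires) {
        return null;
    }
    return $userId;
}
```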
I want to create a dynamic website without IIS. The place where I work does not allow anything to be installed on the server. They have a Windows-based server, and I would like to create a dynamic website on it. IIS is not allowed, and server-side languages like ASP.NET and PHP are not allowed. They did not say anything about client-side code. Is it possible to do this?
In short, a general answer to your question "Is it possible?" would be: no, it's not. And if you still find a way, it's not going to be worth the effort.
For one thing, even without programming languages like ASP.NET or PHP, you still need a web server such as IIS to serve static content. There are of course alternatives to IIS specifically, but no web server at all means no serving websites at all.
If you were given the opportunity to serve static content, you could possibly produce a website that is dynamic, at least on a per-visit basis, using client-side scripting and cookies, but the things you could make that site do would be very limited. With nothing beyond static content, there is no saving things between sessions, nor any way of affecting the server side of the application.
You have to ask yourself why you need to serve this website. Is this something your company would benefit from? If so, could you convince the IT department to set up an environment to serve it? Are there any other alternatives? And, perhaps the most important one: there are lots of free or almost-free web hosting solutions out there. Why not just use one of them?
There are many excellent reasons why you would want to create a dynamic website without using a web server. Here are a couple:
- You are creating a website as a means of presenting a dataset with hyperlinks that you want to be able to archive on read-only media, ignore for 10 years or more (as you can do with books), and still be able to read (IIS is very poor at backwards compatibility).
- You need to present your data to people who have no access to servers or the internet and have no idea how to turn their PC into a web server (there are many millions of such people in the developing world).
Yes, it's challenging, but if you want something to be readable by anyone, anywhere, anytime, and all you can count on is a web browser, there's no other option.
By saying you want to do it without IIS, I'm assuming you mean without Apache as well (since you rule out server-side languages).
It depends what you mean by 'dynamic'. Essentially you'll be limited to:
- JavaScript, which means you can manipulate information and elements already on the page.
- iframes, which let you load external pages into elements on the page. These could be dynamic, and if they came from the same server you could manipulate them as well. If they came from an external server, you wouldn't have control over them from the parent page.
If you are able to set up an HTTP proxy, you can use JavaScript together with a service like CouchOne. You will need the proxy because browsers restrict cross-origin AJAX calls.
I need dev and beta sites hosted on the same server as the production environment (let's let that fly for practical reasons).
To keep things simple, I can accept the same protections in place on both dev and beta -- basically don't let it get spidered, and put something short of user names and passwords in place to prevent everyone and their brother from gaining access (again, there's a need to be practical). I realize that many people would want different permissions on dev than on beta, but that's not part of the requirements here.
Using a robots.txt file is a given, but then the question: should the additional host(s) (aka "subdomain(s)") be submitted to Google Webmaster Tools as an added preventive measure against inadvertent spidering? It should go without saying, but there will be no linking into the dev/beta sites directly, so you'd have to type in the address perfectly (with no augmentation by URL rewriting or other assistance).
How could access be restricted to just our team? IP addresses won't work because of the various methods of internet access (meetings at lunch spots with wifi, etc.).
Perhaps having dev/beta and production INCLUDE a small file (or call a component) that looks for a URL variable to be set (on the dev/beta sites) or does not look for the URL variable (on the production site). This way you could leave a different INCLUDE or component (named the same) on the respective sites, and the source would otherwise not require a change when it's moved from development to production.
I really want to avoid full-on user authentication at any level (app level or web server), and I realize that leaves things pretty open, but the goal is really just to prevent inadvertent browsing of pre-production sites.
Usually I see web-server-based authentication with a single shared username and password for all users; this should be easy to set up. An interesting trick might be to check for a cookie instead, and then just have a well-hidden page that sets that cookie. You can remove that page once everyone has visited it, or implement authentication just for that file, or allow access to it only from the office and require people working from home to use a VPN or visit the office if they clear their cookies.
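As a sketch of the cookie trick combined with the include-file idea from the question (the file name, cookie name, and values are all made up):

```php
<?php
// dev_guard.php - include this at the top of every page on dev/beta.
// On production, ship an empty file with the same name so the page code
// doesn't change between environments. Requires PHP 7+ for the ?? operator.

const GUARD_COOKIE = 'team_access';
const GUARD_VALUE  = 'some-hard-to-guess-value';

// Let a team member in by visiting any page once with ?letmein=<value>;
// after that, the cookie carries them through.
if (isset($_GET['letmein']) && $_GET['letmein'] === GUARD_VALUE) {
    setcookie(GUARD_COOKIE, GUARD_VALUE, time() + 60 * 60 * 24 * 30, '/');
    $_COOKIE[GUARD_COOKIE] = GUARD_VALUE;
}

if (($_COOKIE[GUARD_COOKIE] ?? '') !== GUARD_VALUE) {
    http_response_code(403);
    exit('This is a pre-production site.');
}
```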
I have absolutely no idea if this is the "proper" way to go about it, but we place all dev and beta sites on very high port numbers that crawlers/spiders/indexers never go to (in fact, I don't know of any off the top of my head that go beyond port 80 unless they're following a direct link).
We then have a reference index page listing all of the sites with links to their respective port numbers, with only that page being password-protected. For sites involving real money transactions or other sensitive data, we display a short red bar on top of the website explaining that it is just a demo server, on the very rare chance that someone would directly go to a Dev URL and Port #.
The index page is also on a non-standard (!= 80) port. But even if a crawler were to reach it, it wouldn't get past the password input and would never find the direct links to all the other ports.
That way your developers can access the pages with direct URLs and ports, and they have a password-protected index as a backup in case they forget.
How can it be done?
Have you ever experienced something like this?
If you're finding JavaScript injected into your web site content (not via XSS but actually present in the file contents) you've most likely been hit by a worm or virus.
A good example is the Gumblar virus, which spread very rapidly indeed a few months ago; it used FTP password sniffing to find FTP details of people's sites and modified them, injecting malicious JavaScript to send site visitors to malware sites etc.
The specifics of removing such a virus depend on the virus in question, but a good start is:
- Replace the contents of the site with a known clean backup.
- Make sure all security patches are applied to your server and to all software you're running on it, as well as to any modules or third-party libraries used on the site.
- Make sure all computers used to access the site (via FTP or an administration interface, for example) have been marked as clean by a reputable and up-to-date virus scanner, so no more passwords get sniffed.
- As the passwords for your site may already have leaked out into the big wide world via (say) a botnet, change all your FTP and administration passwords so you don't end up right back at the start again.
Good luck!
You have probably experienced Cross Site Scripting (XSS).
From Wikipedia:
Cross-site scripting (XSS) is a type of computer security vulnerability typically found in web applications which enables malicious attackers to inject client-side script into web pages viewed by other users.
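For completeness, the standard defence against XSS is to escape anything user-supplied before echoing it into HTML. A minimal PHP illustration (the form field name is hypothetical):

```php
<?php
// Escape user-supplied data before placing it in HTML so that an injected
// <script> tag is rendered as harmless text instead of being executed.
$comment = $_POST['comment'] ?? '';
echo '<p>' . htmlspecialchars($comment, ENT_QUOTES, 'UTF-8') . '</p>';
```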