I have multiple domains, such as:
abc.example.com
bcd.example.com
cde.example.com
And a root directory:
D:\websites\example.com
I want to point each to its own directory, as in:
D:\websites\example.com\abc
D:\websites\example.com\bcd
D:\websites\example.com\cde
I can obviously do this manually by setting each one up as a separate website. But is there a way to do this in some sort of wildcard fashion, so I can add more domains as necessary, create a subfolder, and have it set up automatically?
"But is there a way to do this in some sort of wildcard fashion". No, there isn't.
If you don't want to manually create all those sites, write some PowerShell scripts to automate the steps.
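As a rough sketch of what such a script could look like, assuming IIS 7+ with the WebAdministration module and that each subfolder of D:\websites\example.com should become <foldername>.example.com (the binding and port are examples; adjust to your setup):

# Sketch only: create one IIS site per subfolder of D:\websites\example.com,
# binding each to <foldername>.example.com.
Import-Module WebAdministration

$root = 'D:\websites\example.com'

Get-ChildItem -Path $root -Directory | ForEach-Object {
    $hostName = "$($_.Name).example.com"
    # Skip folders that already have a matching site
    if (-not (Get-Website -Name $hostName)) {
        New-Website -Name $hostName -PhysicalPath $_.FullName -HostHeader $hostName -Port 80
    }
}

Run it (or schedule it) after dropping in a new subfolder and the corresponding site gets created for you.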
Related
I want to disable access to a specific folder of my domain, say mydomain.com/newsletters/.
But since I will be adding files inside /newsletters, I want all of the files themselves to remain accessible, e.g. mydomain.com/newsletters/april-newsletter.pdf.
Thank you in advance.
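No answer is included in the thread, but if the site runs on Apache, the usual approach is to switch off the automatic directory listing for that folder while leaving direct file URLs reachable. A minimal .htaccess sketch for the /newsletters/ folder:

# /newsletters/.htaccess -- sketch, assumes Apache
# Hide the auto-generated folder listing for /newsletters/ ...
Options -Indexes
# ... while direct links such as /newsletters/april-newsletter.pdf keep working.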
I have an ASP.NET project that I use for a couple of different purposes. We have addresses that access the same virtual directory via different paths (use1.company.com and use2.company.com). I do not want to break the project up, as they use similar functionality that it seems redundant to have in two places. Nonetheless, as it stands, use1.company.com/default.aspx and use2.company.com/default.aspx are both the same. I want to make it so that use2.company.com/default.aspx is not accessible. Is there a way to do that from the App Pool/Virtual Directory settings, or do I just have to hope that external users don't type /default.aspx?
I know I can set the default document to something like survey.aspx (the purpose of the second URL), but that does not prevent some savvy users from typing in default.aspx just to see what it does. Any assistance here would be great.
Since they point to the same .aspx file, could you not include an if statement at the start of the file that grabs the URL and, if it includes use2, redirects the request back?
I wish to protect the folder containing the core files of my CMS, along with its subfolders and files, from being accessed via the web, and I tried an .htaccess file with this:
order deny,allow
deny from all
The problem I have is that I can protect the folder, but then some scripts from that folder or its subfolders no longer work properly.
I also tried this:
order deny,allow
deny from all
allow from 127.0.0.1
allow from 76.xx.xx.xx
In this case, 76.xx.xx.xx is the static IP of the site.
Is there any way to prevent access to the files in that folder while still keeping everything working?
Another question.
I also wish to better secure my site against hackers. Is there any way to prevent malicious files and code from being injected into my scripts/files, and/or to block my site from executing files from other sites or hosts, so that it only works with local files?
I prefer an .htaccess file, but if needed I have access to WHM for editing other files (though in that case I will need a step-by-step guide). I am running the site on a Linux VPS with CentOS 5.
The usual way to do this is to put the accessible files in an Apache-accessible directory and everything else in a directory out of Apache's reach. For example:
/usr/
  local/
    mycms/
      public/
      lib/
/var/
  www/
    mycms -> softlink to /usr/local/mycms/public
Or better yet, make mycms an alias in Apache config, pointing at the public directory. This way, the files that should be accessible are, those that shouldn't be aren't, and you can still reference all your other files simply by ../lib/ etc.
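For instance, a sketch of the Alias approach in the Apache config (the /mycms URL prefix and paths just follow the example layout above; Apache 2.2 syntax to match the .htaccess snippets in the question):

# httpd.conf / vhost file -- sketch only
Alias /mycms /usr/local/mycms/public

<Directory "/usr/local/mycms/public">
    Order allow,deny
    Allow from all
</Directory>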
I know this does not really answer your question literally, and if the CMS directory structure is not under your control, this may not be the best way to do it.
Another way is through rewrites - simply rewrite all requests to your CMS directory except for your CMS's entry script into requests for the entry script.
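A rough sketch of that rewrite approach, assuming mod_rewrite is available and that the CMS's entry script is index.php (substitute your CMS's actual entry script):

# .htaccess in the CMS directory -- sketch only
RewriteEngine On
# Let the entry script itself through untouched
RewriteRule ^index\.php$ - [L]
# Rewrite every other request in this directory to the entry script
RewriteRule ^ index.php [L]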
So, I decided to try to break my website... I googled my site by typing in site:mysite.com/whatever and, behold, all of the users' uploaded files were available for viewing under a specific directory.
What kind of script/countermeasure should I use to block these files from being viewed? I already have a script that checks the path and the logged-in status; however, this doesn't seem to be working. I've looked all over for solutions, but I can't quite find one. I'm using ColdFusion 8.
This isn't a ColdFusion issue so much as a web server configuration issue.
You should either:
configure your web server not to show a directory of files when using a URL without a filename (e.g., http://www.example.com/files/)
drop a blank default web document (index.html, index.htm, default.htm, index.cfm, whatever) into that directory so that it displays that document rather than the list of files. If you use index.cfm, it'll fire your Application.cfm/cfc in your file path and use whatever other security you've built.
(or, better, do both)
The best way to secure your file listings and the files themselves is to store them in another folder outside of the Web site root folder. You can then serve them up using CFDIRECTORY and CFCONTENT. The pages that display the files can check your access controls and only serve the files to those allowed to see them.
I have created a new website for a client (Joomla), which is currently running within a folder in the root called /new/ (original, I know).
I am now trying to work out the best way to switch over the sites. There are a lot of files (PDFs, DOCs, etc.), so moving everything would take some time; plus, the old site is very messy and would also take some time to remove in one go.
What about .htaccess and a redirect? I could then remove the old site and its files from the root over time.
One approach could be the use of symbolic links (ln -s, basically), but you won't be able to do that for files that have the same name in both the root and your new/ subdirectory. You will encounter the same issue with Apache redirects, so basically you need to decide beforehand what you want to serve from the root versus new/ and resolve the possible conflicts.
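If you go the .htaccess route, a rough sketch in the document root could look like this (assumes mod_rewrite; /legacy-docs/ is a made-up example of a folder you would keep serving from the old root until it's migrated):

# .htaccess in the document root -- sketch only
RewriteEngine On
# Leave the new site and any not-yet-migrated folders alone
RewriteCond %{REQUEST_URI} !^/new/
RewriteCond %{REQUEST_URI} !^/legacy-docs/
# Send everything else to the new Joomla site
RewriteRule ^(.*)$ /new/$1 [L]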