Deny Access to directories from unauthorized users - .htaccess

I am not being paid for this and I would like to know the quickest way to do the following. A former client has a page which only members can access. This page links to a number of galleries which he only wants members to access. The galleries are not protected by any kind of authentication.
What I assume is the quickest way to do this is to create a .htaccess file which only allows people to view the site when they come from a certain referrer. Would this work?
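Something along these lines is what I am picturing - just a sketch, and domain.com/members is a placeholder for wherever the real members page lives (I realise the Referer header can be faked or stripped, so this is more of a deterrent than real authentication):
SetEnvIf Referer "^http://(www\.)?domain\.com/members" from_members
Order Deny,Allow
Deny from all
Allow from env=from_members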
My current thinking is that I could use a PHP script to deploy a .htaccess file into each of the gallery directories. (There are around 100 at the moment.)
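Roughly like this, assuming the galleries all sit under a single galleries/ folder next to the script (the paths are placeholders):
<?php
// Sketch only: copy a prepared .htaccess template into every gallery directory.
$template  = __DIR__ . '/htaccess.template';
$galleries = glob(__DIR__ . '/galleries/*', GLOB_ONLYDIR);

foreach ($galleries as $dir) {
    if (!copy($template, $dir . '/.htaccess')) {
        echo "Failed to write .htaccess in $dir\n";
    }
}
echo 'Deployed to ' . count($galleries) . " directories\n";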
I found this link, which might be what I am after, but to be honest I really don't get it. Is my thinking sound? Could someone either link me to a tutorial that covers this or show me how it is done?
http://perishablepress.com/press/2006/01/10/stupid-htaccess-tricks/#sec7
As always, a massive thank you for any help.
Jason

Related

Editing a webpage with no source

I am a new developer (as in just graduated on the 10th) and was hired by a company to do web development. I was asked to make some minor changes to a site that this company acquired. The problem is that we do not have access to the source code (apparently the company had a bad break-up with the previous developers and cannot get the source; I'm not exactly sure). Is there a way I can add links to a site and have it change live? I have Visual Studio, the address, the links, and the videos they will go to; it's not a hard fix, but I don't know how to edit the site without the source code. Any suggestions? Thanks in advance!
I advise you to talk to a senior or superior and get more information on how to proceed, because getting that code in a less than professional (or legal) way (e.g. using website rippers or something) would be a bad career move ;)
good luck.
Interesting situation, I should say; the company definitely didn't do its homework before the break-up.
I am presuming you would answer "yes" to the questions below:
Is your company the legal owner of this website?
Can you change the name servers, CNAMEs, etc.?
The current website is not Flash or Silverlight?
If you are still here, you have said "yes" to all the above.
First of all, navigate to every page of this website and use File > Save As to save each page as HTML (make sure you choose "Webpage, complete" - this will save all the images as well). I realise this will be static, but there is not much you can do here.
Get all resources (stylesheets, XSDs if any, any other images).
Enrich this content based on requirements (i.e. add dynamic content, change logos, etc.).
Modify the CNAME or name server to point to the location (web server) you control.
Deploy your enriched and tested code
Educate your company to treat developers well, and when things go wrong, ensure the transition is done properly.
I hope this helps, and good luck.
Krishna

Dynamic .htaccess subdomain security

I have yet another .htaccess question, simple for you, not so much for me.
Let's say my main site is found at http://domain.com. I do all of my pre-release testing at sandbox.domain.com and sandbox.domain.co. I just realized that Google has gone ahead and indexed my sandbox sites... Ugggh!
The document root folder on my Apache server with the live site is always called ALIVE, and in order to make my sandbox contents live I quickly rename the folders, i.e. ALIVE -> OLDx, SANDBOX -> ALIVE.
My goal is to prevent indexers and users from accessing my sandbox pages. I am trying to design a .htaccess file for the document root that only allows my IP address when the site is accessed from a sandbox subdomain (sandbox.domain.com), and otherwise allows everyone when it is accessed from the main domain (domain.com). This would eliminate the need to remember to update the .htaccess file each time I release a new site.
This doesn't seem too difficult, but I haven't been able to find the right combination. Any pointers in the right direction will be much appreciated!
Create a .htaccess inside each folder (like sandbox):
Order Deny,Allow
Deny from all
Allow from YOUR.IP.HERE
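If you would rather keep a single .htaccess in the document root so it survives the folder renames, one sketch - assuming mod_setenvif is enabled and 203.0.113.10 stands in for your real IP - is to key the rules off the Host header:
SetEnvIfNoCase Host ^sandbox\. is_sandbox
Order Deny,Allow
Deny from env=is_sandbox
Allow from 203.0.113.10
Requests to domain.com never match the Deny line and fall through to the default allow, while requests to a sandbox hostname are refused unless they come from your IP. (On Apache 2.4 you would express this with Require directives instead.)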

Google index - will Google index my logs?

I have some txt log files where I print out some important activities for my site.
These files ARE NOT referenced from any link within my site, so I am the only one who knows the URLs
(they contain the current date in the filename, so I have one for each day).
Question: will Google index these kinds of files?
I think Google indexes only the pages whose URLs appear on the site.
Can you confirm my assumption? I just do not want others to find the link through Google, etc. :)
In theory they shouldn't. If they aren't linked from anywhere, Google shouldn't be able to find them. However, I'm not sure whether pages can make their way into the index by virtue of having the Google Toolbar installed. I've definitely had some unexpected stuff turn up in search engines. The only safe way would be to password-protect the folder.
Google cannot index pages it doesn't know exist, so it won't index these unless someone posts the URLs to Google or places them on some website.
If you want to be sure, just disallow indexing for the files (in /robots.txt).
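For example, a minimal robots.txt - assuming the logs sit under a /logs/ folder, which is just a placeholder path - would look like:
User-agent: *
Disallow: /logs/
Keep in mind that robots.txt itself is publicly readable, so listing the path there tells people where to look; it keeps well-behaved crawlers out but is not a substitute for protecting the folder.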
Best practice is to use robots.txt to prevent the Google crawler from indexing files you don't want to show up.
This description from Google Webmaster Tools is very helpful and leads you through the process of creating such a file:
https://support.google.com/webmasters/answer/6062608
Edit: as was pointed out in the comments, there is no guarantee that robots.txt will be honoured, so password-protecting the folders is also a good idea.

Changing page location after Google Analytics setup

The current website structure is set up such that all the ASPX pages are in the main folder. It's becoming increasingly difficult to maintain, so I would like to create new folders and move the relevant pages. This would change the URL from, say:
http://mydomain.com/DoStuff.aspx
to
http://mydomain.com/DoingFolder/DoStuff.aspx
I fear that this will skew the Google Analytics results. Is it recommended that I make this change? If so, is there a way to link the page locations from before and after the change?
Also, what would happen when I implement the URL rewrite? Would I run into the same issue again? Anyone?
In general I think it is a good idea to add the folder, both so your users can see which section they are in via the URL and to help the search engines figure out the site's areas - and who knows, you may even get a (small) SEO benefit out of it.
What I would advise is to set up a second profile in Analytics and then add a filter which removes the folder name from the request; this will leave you with the same flat structure in your reports as you have currently. (NB: do this under a new profile with the same tracking code, to avoid major mess-ups that you can't undo.)
Cheers
Z

Prevent site deletion

In our SharePoint implementation, users have been granted site collection admin rights. On a few occasions they've managed to delete a subsite or even the entire site collection. I'd like to be able to block this, but not being a developer I'm finding it pretty tricky.
I've had a look at the MSIT site delete capture tool to try to understand how it works, and it seems fairly straightforward. I want to override the delete function and either block it entirely or have the user type a password. What I can't see is any way to fully override the default behavior, as it looks like the MSIT tool simply adds some functionality (backs up the site) and then falls back into the default behavior.
So my question is, can I prevent the default behavior or can I only add actions before or after it fires?
Thanks in advance
Changing the user permissions may be the best way to go. Site collection admin is a crazy level of access for normal users.
Two answers:
You cannot prevent site deletes without either coding up something yourself, or buying a product to help you with "site lifecycle management" or "site governance" or some other vague term they use to describe this sort of thing.
The Site Delete Capture Tool may be good enough for you. It doesn't prevent any kind of deletions, but it does take a crude backup that (hopefully) allows you to restore anything they delete. We're using this tool in production and it works.
You could try to edit the site settings ASPX file and comment out the delete site link; I don't have a setup around to try that. While users could still delete the site in other ways, it would prevent the most common method.
Another option for important sites would be to make sure the site has a sub-site; if one does not already exist, create one and don't give users any access to it. The sub-site would not be seen by the users, and its presence would prevent them from deleting the parent site.
As for programming, in the "before" event you can cancel the action to stop the deletion. Just be sure to leave yourself a workaround so you can still delete a site when you need to.
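If you do go the code route, the hook for that "before" behavior is a web event receiver. A minimal sketch - the class and namespace names are made up, and you would still need to package and bind it as a feature - might look like:
using Microsoft.SharePoint;

namespace Contoso.SiteProtection
{
    // Synchronous "before" event receiver: runs when someone tries to delete a web.
    public class BlockWebDelete : SPWebEventReceiver
    {
        public override void WebDeleting(SPWebEventProperties properties)
        {
            // Cancelling the event stops the deletion before it happens.
            properties.Cancel = true;
            properties.ErrorMessage = "Deleting this site is disabled. Please contact an administrator.";
        }
    }
}
Deactivating the feature (or scoping it to specific site collections) gives you the workaround mentioned above for when a deletion is genuinely needed.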
A site collection administrator has the permission to delete sites, and it should stay that way. We have modified the MSIT tool to do additional stuff.
The best way to limit user privileges is to put users in the right SharePoint group (i.e. Owners, Members, or Visitors), or you could create a new group with the right permissions/permission levels.
