Okay, I searched and I can't find a solution.
I have a folder on my website where I keep some scripts I use. My server has to access this folder all the time via a cron job. But if someone goes to the folder's URL they can see the index page, and I would like to prevent that. I want it to throw a 403 or a 404 or a 500 or any error page; I don't care which.
I tried this in .htaccess:
deny from all
But this also blocks my own server and breaks the Cron job. I tried a few other solutions I found here and there, but they didn't work.
One of the things I saw is that you can block everyone but allow your own server access by IP. But I'm on HostGator shared hosting, so my IP isn't static (as far as I know). I don't want to have to worry that my server's IP may change at any time and break my cron job.
Isn't there a nice, elegant, simple, and permanent solution for this? Block access to the folder for everyone, but allow my own server/cron to access it at will.
Thank you :)
It seems you want to call a script that is stored on the same server.
You don't need curl for this; you can run the script directly by providing the path to the PHP binary and the path to the script.
You can prevent outside access to the script simply by not saving it in your public_html directory.
Your cron job command will look something like this:
/opt/php70/bin/php /home/<USERNAME>/cron.php
The exact directory structure differs depending on your webhost.
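If the script must stay inside public_html for some reason, a sketch of an alternative (assuming Apache 2.4 syntax): deny all HTTP access to the folder via .htaccess. The cron job keeps working because invoking the PHP binary directly never goes through Apache:
# .htaccess in the script folder — every HTTP request gets a 403
# (the CLI cron job is unaffected, since it bypasses Apache entirely)
Require all denied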
I want Apache to serve the website's root directory for every URL, regardless of the path. The reason is that I have an Angular app at the root directory that handles routing itself.
I've tried stuff like this:
AliasMatch ^/(.+) /var/www/html/mywebsite.com
But it always results in an infinite loop.
Essentially, I just want to disable path-directory resolution.
EDIT: I should also clarify that I have multiple sites hosted on the same machine and still want that to function. I just don't want directory routing from within a single website.
I figured out how to accomplish what I want while also having the neat side effect of allowing me to still have assets that can be reached through directory navigation:
FallbackResource /
This serves the root document without changing the URL whenever the requested path doesn't match an existing file or directory.
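For a single-page app specifically, a common variant (a sketch, assuming the Angular entry point is index.html at the web root) points the fallback at that file directly; real asset files are still served as usual:
# hypothetical .htaccess at the web root — SPA fallback
FallbackResource /index.html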
I have read a few answers to try and find a solution to a ridiculous problem.
I don't have access to the server, so I can't log on to phpMyAdmin.
What is supposed to happen is that the URL is supposed to be served over https, and in most cases this happens.
The exception is a particular PC I have at home, which never seems to open the site over https. Why this happens on this one machine is a complete mystery.
Is there a way I can set up a rule on my local machine that will ALWAYS convert http://pathtomysite.com to https://pathtomysecuresite.com (possibly via a 'hosts' entry)? And yes, it is a Windows machine running Win10.
I could do this on the web server itself (I know how to do this), but the problem is that I don't have, nor am I allowed to have, access to the server to update the .htaccess or webconfig.xml there. (I am 99% sure it's Apache, not nginx or IIS.)
Any help is, as always, gratefully received.
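(For reference, the server-side rule mentioned above would look roughly like this in .htaccess — a sketch, assuming Apache with mod_rewrite enabled, and not deployable here since the asker has no server access:)
RewriteEngine On
RewriteCond %{HTTPS} off
# redirect any plain-http request to its https equivalent
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]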
I am running a Raspberry Pi using Raspbian Linux. I have the Apache web server installed, and when I type my IP address into a browser's address bar it loads the default Apache page saying it all works.
I have another folder located at /home/Client5 on this device from which I am trying to load an index.html page, but I am receiving a 404 Not Found error, e.g.
192.304.0.22/home/Client5/index.html
Not Found
The requested URL /home/Client5/task5.html was not found on this server.
I gather that there's something wrong in the above web address, or is it that I have to place this folder within the Apache folder?
You have two solutions.
1/ If you want to keep the default Apache pages, add a virtual host (there are tons of docs on this, I don't think it needs to be repeated here).
2/ If you don't care about the default Apache pages, edit /etc/apache2/sites-available/default and change DocumentRoot to make it point to /home/Client5/ (see the sketch below). Add an index.html file in there, hit the Raspberry Pi's IP in your browser, and you should see your page.
You might need to chmod -R ugo+rwX /home/Client5.
I don't know what you have under /home/Client5, but if it's a regular user, this setup is highly insecure. There are a bunch of additional steps to take if you want to host under home directories (first step, don't put pages in $HOME but create a subdir). It is safer to have a dedicated place with proper perms outside home dirs unless you really know what you're doing.
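A sketch of option 2/ (the exact file name is assumed from a stock Raspbian install; on Apache 2.4 the <Directory> block is required or you'll get 403s):
# in /etc/apache2/sites-available/default (000-default.conf on newer installs)
DocumentRoot /home/Client5
<Directory /home/Client5>
    Options -Indexes
    Require all granted
</Directory>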
Is it working locally (XXX.XXX.X.XX:80)? If so, take a look at your router.
If you have apache2, your local IP points to the folder '/var/www/'.
If you want to host a page in '/home/Client5' you have to make a virtual host :)
Edit: read about virtual hosts in the Apache documentation.
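A minimal sketch of such a virtual host (the ServerName and paths are assumptions; on Raspbian you would save it as e.g. /etc/apache2/sites-available/client5.conf):
<VirtualHost *:80>
    ServerName client5.local
    DocumentRoot /home/Client5
    <Directory /home/Client5>
        Require all granted
    </Directory>
</VirtualHost>
Then enable it and reload Apache:
sudo a2ensite client5
sudo systemctl reload apache2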
Let's say there's a website www.example.com/user/john. Accessing that link takes you to www.example.com/user/john/index.html.
There are files like www.example.com/user/john/picture.png and www.example.com/user/john/document.html. These are also accessible to the public, but there's no link to these from index.html.
Is there a systematic way to find these files? I'm asking because I'm going to set up my website, and I also want to put up a few files that I don't necessarily want everyone to see, only people I give the link to. So I'm wondering how easy or hard it is to find out that those files exist in my directory.
Most importantly, you have to switch off the ability to simply browse the directory with a browser. Every server has its own way to switch this off; for Apache, see the sketch below. Then you can use the proposed "security through obscurity" approach.
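In Apache, for example, directory listings can be switched off with a one-line .htaccess (a sketch; it assumes the host permits Options overrides, and other servers have their own equivalents):
# .htaccess — disable automatic directory listings
Options -Indexes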
Another option is to have a specific folder whose access is restricted by HTTP basic authentication. This can be configured in a .htaccess file placed in the root of the directory you want to share only with specific people.
Just google ".htaccess" and "basic authentication".
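A minimal sketch of that setup (the paths, realm, and username are placeholders; the .htpasswd file should live outside the web root):
# .htaccess in the restricted folder
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/user/.htpasswd
Require valid-user
The password file is created with the htpasswd utility, e.g. htpasswd -c /home/user/.htpasswd alice (hypothetical path and username).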
HTTP does not provide a directory listing method as such. Most web servers can be set up to return an HTML-formatted directory listing if an index file such as index.html does not exist. If you provide an index file so that autoindexing does not happen (or disable autoindexing in the web server configuration), and if your "hidden" file names are hard to guess, they should be pretty hard to find. I sometimes publish such files in a directory with a random gibberish name.
"Share links" used by Dropbox, Picasa and other services do the same, they just use longer random file/directory names or random parameters in the URL.
To provide real security you'll want to set up https (SSL/TLS) so that any eavesdroppers on the network cannot easily look at the requested URLs, and authentication such as HTTP Basic Authentication with username/password. For less sensitive stuff, http to a random hidden directory will often do fine.
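If you go the random-name route, one quick way to generate an unguessable directory name (just one option among many):
# prints a random 32-character hex string to use as the directory name
openssl rand -hex 16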
I'm testing moving my site to a new Linux server using cPanel which requires you to put in your IP and username (e.g. http://123.xxx.xxx.xxx/~username/). The problem is, all my image/JS/CSS links use paths like /css/style.css or /images/picture.jpg so none of the styles, scripts or images show up properly.
How do I set up a RewriteRule to prefix ~username to all requests?
Before the move, if the site was working with just the domain name (and redirection), it should now also work with the server IP and username. Make sure the permissions and ownership of /css/style.css and /images/picture.jpg are correct.
Also check by entering the exact path manually, like 'http://123.xxx.xxx.xxx/~username/css/style.css'.
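As for the rewrite the question asks about, a rough sketch (untested; it assumes mod_rewrite is enabled and that the rule goes in the server or virtual-host config rather than the site's own .htaccess):
RewriteEngine On
# prefix /~username/ to any request that doesn't already start with it
RewriteCond %{REQUEST_URI} !^/~username/
RewriteRule ^/(.*)$ /~username/$1 [PT]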
I was able to get around the issue by setting the domain to a dedicated IP instead of shared, so I could access the site using 123.xxx.xxx.xxx instead of 123.xxx.xxx.xxx/~username.