On my Apache 2.x server at home, I have a number of virtual directories. I've set up my router so that I can access Apache from the internet. I need to keep one of those virtual dirs (/private) from being accessed outside my home network LAN. So given /private, how do I configure Apache to only serve requests to /private from 192.168.4.x?
<Directory /users/me/private>
Order deny,allow
Allow from 192.168.4
Deny from all
</Directory>
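Note that Order/Deny/Allow is Apache 2.2 syntax. If you are on Apache 2.4 or later, the equivalent (using mod_authz_core) would be something like this — a sketch; adjust the directory path to your own layout:

```
<Directory /users/me/private>
    Require ip 192.168.4
</Directory>
```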
I have a problem with my Raspberry Pi 4 Model B (4 GB).
I have installed Ubuntu Server 21.10 and Apache2, and I have basic SSH access.
I can access it (SSH and web) via its local IP (192.168.1.90).
However, it is impossible to connect or to load a web page via its public IP address, even with the firewall on my PC completely disabled.
I don't understand the problem, because previously, with Wamp installed on my computer, lowering the firewall and loading an Apache page via the public IP worked without any problem (I just retested and it still works fine).
I also opened port 80 on my router for testing, and still nothing.
Is there anything else to configure on my router or directly on my server?
Thanks
<Directory />
Options FollowSymLinks
AllowOverride None
Require all denied
</Directory>
<Directory /usr/share>
AllowOverride None
Require all granted
</Directory>
<Directory /var/www/>
Options Indexes FollowSymLinks
AllowOverride None
Require all granted
</Directory>
#<Directory /srv/>
# Options Indexes FollowSymLinks
# AllowOverride None
# Require all granted
#</Directory>
Here is the relevant part of /etc/apache2/apache2.conf, but I don't think that's the problem, since I can't connect to anything at all with this IP.
I managed to solve my problem thanks to the "port forwarding" option on my router, not the firewall: it is enough to redirect port 80 to the server's local IP. I didn't know at all that this option existed; it can be done with any port. The firewall doesn't seem to influence this. Thanks to you all.
The Apache configuration is correct.
Can you check Apache's /var/log/apache2/error.log file and post it, so we can see what the error is?
Run tail -f on the log and refresh the page in the browser by accessing the public IP.
It is possible that your router is blocking the traffic.
Have you opened the ports?
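Concretely, on the default Debian/Ubuntu layout you can follow the log like this while you retry the request from the public IP:

```
sudo tail -f /var/log/apache2/error.log
```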
Edit:
There is an error in the image: the open port has to be redirected to a local IP, otherwise the router does not know where to send the traffic.
This question has probably been asked already, but I would appreciate confirmation that I'm doing things right.
I run XAMPP 5.6, and all PCs connected to my Wi-Fi are trusted. I'm behind a router, and no virtual server or port forwarding is enabled.
(On this version of XAMPP the security.php page is gone.)
I would like XAMPP to allow access to the htdocs folder (mysites) ONLY from PCs connected to my Wi-Fi, and to stay locked from the internet (rejecting access to htdocs and to all XAMPP settings folders and files).
QUESTION 1:
If I do nothing (not even add a password for root), is XAMPP (the settings folders and/or my sites in the htdocs folder) open to the internet, or just to my local Wi-Fi PCs?
QUESTION 2:
Adding this to httpd.conf
<Directory />
Options Indexes FollowSymLinks Includes ExecCGI
AllowOverride All
Order deny,allow
Deny from all
Allow from 127.0.0.1 ::1 localhost 192.xxx.xxx
</Directory>
will this reinforce security by allowing connections only from the specified local IPs (and rejecting ANY connection from the internet)?
As a test, I tried the following addresses (from another internet connection):
1) my-isp-ip/routerip/mysite
2) my-isp-ip:80/routerip/mysite
and the pages simply did not load.
Is this a good test?
Thanks!
I'm coming from a non-cloud hosting background on Red Hat Enterprise Linux and/or CentOS and trying to set up an Apache (2.2) server on Amazon EC2. I typically host my files from a user's home directory and create a VirtualHost like so:
<VirtualHost *:80>
ServerName userdomain.com
DocumentRoot /home/myuser/public_html
<Directory /home/myuser/public_html>
AllowOverride All
<Limit DELETE>
Order Deny,Allow
Deny from All
</Limit>
</Directory>
</VirtualHost>
However, on Amazon EC2 that doesn't seem to work at all, no matter how I set the file permissions.
Is this something that just isn't allowed? Do I have to host files from /var/www? What am I missing?
It turns out the Linux instance had SELinux enabled, which I had not encountered before.
The simplest solution was to put it in "permissive" mode as described at
https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/6/html/Security-Enhanced_Linux/sect-Security-Enhanced_Linux-Enabling_and_Disabling_SELinux-Disabling_SELinux.html
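For reference, a sketch of the permissive-mode switch described on that page (the setenforce call lasts only until reboot; editing /etc/selinux/config makes the change persistent):

```
# temporary, until the next reboot
sudo setenforce 0

# persistent: change SELINUX=enforcing to SELINUX=permissive
sudo sed -i 's/^SELINUX=enforcing/SELINUX=permissive/' /etc/selinux/config
```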
I am trying to create a URL for a site hosted through WampServer, but no matter what I do I am unable to get the URL to work. The site is online, though, because I am able to connect through the server's IP address.
(I should also mention that this site is only available on an intranet.)
hosts file:
# Copyright (c) 1993-2009 Microsoft Corp.
#
# This is a sample HOSTS file used by Microsoft TCP/IP for Windows.
#
# This file contains the mappings of IP addresses to host names. Each
# entry should be kept on an individual line. The IP address should
# be placed in the first column followed by the corresponding host name.
# The IP address and the host name should be separated by at least one
# space.
#
# Additionally, comments (such as these) may be inserted on individual
# lines or following the machine name denoted by a '#' symbol.
#
# For example:
#
# 102.54.94.97 rhino.acme.com # source server
# 38.25.63.10 x.acme.com # x client host
# localhost name resolution is handled within DNS itself.
# 127.0.0.1 localhost
# ::1 localhost
127.0.0.1 localhost
127.0.0.1 www.socialclub.com #also tried public/private IP, still only works locally
vhosts.conf:
# Virtual Hosts
#
# Required modules: mod_log_config
# If you want to maintain multiple domains/hostnames on your
# machine you can setup VirtualHost containers for them. Most configurations
# use only name-based virtual hosts so the server doesn't need to worry about
# IP addresses. This is indicated by the asterisks in the directives below.
#
# Please see the documentation at
# <URL:http://httpd.apache.org/docs/2.4/vhosts/>
# for further details before you try to setup virtual hosts.
#
# You may use the command line option '-S' to verify your virtual host
# configuration.
#
# VirtualHost example:
# Almost any Apache directive may go into a VirtualHost container.
# The first VirtualHost section is used for all requests that do not
# match a ServerName or ServerAlias in any <VirtualHost> block.
#
<VirtualHost *:80>
ServerName localhost
DocumentRoot "E:\Data\Users Apps\wamp\www\socialclub"
</VirtualHost>
<Directory "E:\Data\Users Apps\wamp\www\socialclub">
AllowOverride All
Order Allow,Deny
Allow from all
Options Indexes FollowSymLinks Includes ExecCGI
</Directory>
<VirtualHost *:80>
DocumentRoot "E:\Data\Users Apps\wamp\www\socialclub"
ServerName www.socialclub.com
</VirtualHost>
Every guide I've looked at says that this should work, but it only works locally. What do I need to do for the URL to work from other computers?
OK, I think the problem is that you are not understanding what the HOSTS file is used for and what its scope is.
The HOSTS file only affects the single PC that it lives on. It is used to seed the Windows DNS cache at boot time, so whatever you put in this file will have no effect on any other PC on your intranet.
There are a couple of solutions.
Let's assume your PC running WAMPServer has the IP address 192.168.1.10:
You could go to each PC on your intranet and make this change to the
HOSTS file on each PC:
192.168.1.10 socialclub.com
People normally think this is too much hassle, especially if they have more than 5-6 PCs to modify.
You could install a local DNS server, or make use of an existing
local DNS server. Then, as long as all the PCs on your intranet are
using that DNS server, you add the domain name to it.
People normally think this is a good idea, but it can be quite complicated to get right without losing access to the real DNS servers out on the web.
A couple of changes I would suggest to your httpd-vhost.conf file:
First, leave localhost pointing to the original WAMPServer homepage. The tools on that homepage can be very useful for debugging/diagnostics/etc., but access to localhost should be allowed only from the PC running WAMPServer.
Second, put the <Directory></Directory> block inside the VirtualHost definition. This allows you to make each virtual host's security specific to that virtual host.
# Should be the first VHOST definition so that it is the default virtual host
# Also access rights should remain restricted to the local PC and the local network
# So that any random IP address attack will receive an error code and not gain access
<VirtualHost *:80>
DocumentRoot "c:/wamp/www"
ServerName localhost
<Directory "c:/wamp/www">
AllowOverride All
Require local
</Directory>
</VirtualHost>
<VirtualHost *:80>
DocumentRoot "E:\Data\Users Apps\wamp\www\socialclub"
ServerName www.socialclub.com
<Directory "E:\Data\Users Apps\wamp\www\socialclub">
AllowOverride All
Options Indexes FollowSymLinks Includes ExecCGI
# assuming your subnet equates to this range
# and you are using Apache 2.4.x
# it's not necessary to allow access from all in an intranet
# in fact it might be dangerous
Require ip 192.168.1
</Directory>
</VirtualHost>
We have a WordPress website on our Linux system. It hosts a number of private files which should be visible only to users who can log in. We set the permissions of those files to world-readable (rwxr-xr-x). The problem is that all of the content is accessible to anyone who has the direct link! The main question is: how can I restrict access via these direct links to logged-in users only?
Is there any WordPress plugin or httpd configuration that does this?
I would place a .htaccess file with a deny from all clause in the root of the uploads folder.
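A minimal sketch of such a file, assuming Apache 2.2 syntax (the 2.4 equivalent is shown as a comment); note that AllowOverride must permit these directives for the .htaccess file to take effect:

```
# wp-content/uploads/.htaccess
# Apache 2.2:
Deny from all

# Apache 2.4 would be:
# Require all denied
```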
I tackled the problem temporarily by modifying httpd.conf to add a special policy for the uploads sub-directories:
<Directory /var/www/wp-content/uploads>
Options FollowSymLinks
AllowOverride All
Order deny,allow
Allow from XXX.XXX.XX
Deny from all
</Directory>
This permits only our local users to access the content. However, I am still curious how WordPress could permit this access only to logged-in users.
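One common pattern for the logged-in-only requirement is to rewrite all requests for uploaded files to a small PHP gatekeeper that loads WordPress and checks is_user_logged_in(). This is a sketch under assumptions, not a drop-in solution: the script name dl-file.php and the paths are invented here.

```
# wp-content/uploads/.htaccess: route every file request through the gatekeeper
RewriteEngine On
RewriteRule ^(.*)$ /dl-file.php?file=$1 [QSA,L]
```

```
<?php
// dl-file.php (hypothetical name), placed in the WordPress root.
// Loads WordPress, rejects anonymous visitors, then streams the file.
require_once __DIR__ . '/wp-load.php';

if (!is_user_logged_in()) {
    status_header(403);
    exit('You must be logged in to access this file.');
}

// Resolve the requested path and make sure it cannot escape uploads/.
$base = realpath(__DIR__ . '/wp-content/uploads');
$path = realpath($base . '/' . (isset($_GET['file']) ? $_GET['file'] : ''));
if ($path === false || strpos($path, $base) !== 0 || !is_file($path)) {
    status_header(404);
    exit('Not found.');
}

header('Content-Type: ' . (mime_content_type($path) ?: 'application/octet-stream'));
header('Content-Length: ' . filesize($path));
readfile($path);
```

Several plugins package this same idea; the sketch only shows the mechanism.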