Understanding server structure [closed] - linux

I have recently purchased a dedicated server running Apache on CentOS. I want to understand the complete file structure on the server. I am accessing it through a PuTTY shell prompt.
What I need to understand is how and where the files are stored. For example, through WHM I can log in to cPanel and create subdomains, and through FTP I can put content in a subdomain's directory.
Now, how do I access those subdomain directories from the shell, and where are the individual website accounts stored? I want to get a clear picture of the complete system through the shell, since I prefer using it to view the file system hierarchy. Any help would be really great.

Most/many Linux distributions adhere to a standard layout called the FHS - the Filesystem Hierarchy Standard. That lays out some of the ground rules for where major things go - things like /etc for configuration and /var for things that, well, vary.
The most relevant parts for you are probably that Apache's configuration lives somewhere under /etc/httpd, the actual HTML files are in /var/www, and the logs are in /var/log.
http://www.centos.org/docs/5/html/Deployment_Guide-en-US/s1-filesystem-fhs.html.
The question "How to list the contents of a package using YUM?" will show you where the Apache package puts things by default.
Note that all of this is just convention - if you install Apache from source you can put everything wherever you like, but it may confuse other people who have to support your box.
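As a quick illustration over your PuTTY session, here are a few shell commands for exploring that layout on a stock CentOS box. The paths assume the distribution's packaged httpd; a cPanel/WHM server additionally keeps per-account sites under each account's home directory, commonly /home/<user>/public_html:
# List every file the packaged Apache (httpd) installed, and where
rpm -ql httpd | less
# Typical CentOS locations, per the FHS and the httpd package
ls /etc/httpd/conf /etc/httpd/conf.d    # configuration
ls /var/www                             # default document roots
ls /var/log/httpd                       # access and error logs
# On a cPanel/WHM box, individual account sites commonly live under /home
ls -d /home/*/public_html 2>/dev/null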

Related

User-owned image folders (security against hackers) [closed]

I'm creating a browser-based image cloud service, and every user will have his own picture folder.
My question is how to make sure that other users or hackers cannot access folders that aren't theirs.
What do I need to consider?
Is it, for example, enough to check session variables?
Thanks in advance!
I'm not 100% sure what you mean by just checking the session variables.
I would create a setup like this:
\root
    \userImages
        \user1
            \img1.png
            \img2.png
        \user2
            \img1.png
            \img2.png
    \public
        \index.php
I'm assuming you would use PHP or ASP.NET or something similar, running behind a server like nginx or Apache. You can set the server's document root to the public folder. This means only your code has access to the user images.
You can use PHP (or whatever language) to look at the session information and check whether the user is authenticated. If you can, I would recommend encrypting the cookie data with Mcrypt. Once you have checked authentication, you can read the file in a script and send it back with the appropriate headers. Here's a really in-depth article that I think would help you if you actually want a how-to: Protecting Images with PHP.
If you are using PHP, Laravel handles sessions and protecting images really nicely.
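As a small sketch of that layout on a Linux host (the /var/app paths and the apache user are illustrative assumptions, not from the question): keep userImages outside the public document root and restrict it to the web server user, so the only route to a browser is through your authenticated script:
# Illustrative paths - only /var/app/public would be exposed as the document root
sudo mkdir -p /var/app/public /var/app/userImages/user1 /var/app/userImages/user2
# Hand the image tree to the web server user (apache on CentOS/RHEL,
# www-data on Debian/Ubuntu) and remove access for everyone else
sudo chown -R apache:apache /var/app/userImages
sudo chmod -R u=rwX,go= /var/app/userImages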

Full Apache config migration [closed]

I searched a lot and didn't find an applicable answer.
I have a working LAMP setup on an Ubuntu machine and I have to migrate to a new server in a different country.
The old server is 11.10, the new server is 12.04 LTS.
My problem is that I simply cannot remember the steps I followed when I configured the current server, which is not a basic LAMP install.
It is Apache with FastCGI, suEXEC, the GD library and the worker MPM, all sitting on top of an mhddfs file system. There are also other configs I've changed that I cannot recall.
Because of the complexity of the setup, my attempts to migrate to the new server fail: I get permission errors, CGI problems, etc.
Therefore my question is:
Is there a sane way to simply tar a full backup of the current web server installation, including MySQL, PHP and Apache with all configs, and then move it to the new machine?
I shall be forever thankful for any advice. So far none of the answers I found here gave me what I need.
Thanks!
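As a rough sketch of the approach the question describes - archiving the databases, configs and web roots and carrying them over by hand - something like the following could be a starting point. The paths are assumptions based on a default Ubuntu LAMP layout, and configs copied this way still have to be reconciled against the newer Apache/PHP versions shipped with 12.04:
# 1. Dump all MySQL databases on the old server
mysqldump --all-databases --events -u root -p > /root/all-databases.sql
# 2. Archive the Apache, PHP and MySQL configs plus the web roots
tar czpf /root/lamp-backup.tar.gz /etc/apache2 /etc/php5 /etc/mysql /var/www /root/all-databases.sql
# 3. Ship the archive to the new machine
scp /root/lamp-backup.tar.gz user@new-server:/root/
# 4. On the new server, unpack it somewhere safe for review - never straight over /etc
mkdir -p /root/old-server && tar xzpf /root/lamp-backup.tar.gz -C /root/old-server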

How to create a downloadable public link for files on a server [closed]

I am afraid my question could be very stupid, and may also be a duplicate, but I didn't manage to find what I want after looking through similar questions on this site.
My question is simple: I have a big file (about 1 GB) on my Ubuntu server, and I want to share it with other users. How can I create a URL for public users, so that when someone clicks the URL the download starts automatically, without demanding a username and password - just like downloading files (PDFs, music) from a usable URL found through Google?
Someone suggested setting up anonymous FTP. I think that's a possible solution, but I didn't manage to get it working. Can someone give me more details on how to achieve my goal (with or without FTP is fine)?
Thanks for any help; I'd be very grateful for examples or tutorials!
Install Apache2
sudo apt-get install apache2
Place your file into the /var/www/ directory (you might need root privileges for this)
sudo cp yourfile /var/www/yourfile
Access the file with the following link:
http://your-ip-address/yourfile
If you're running behind a router or firewall, you might have to open port 80 and forward it to your PC.
Let's assume your filename is foobar.iso.
You could just place it in your web root and give people the link example.com/foobar.iso. This will download the file.
Optionally, place it in a downloads directory. The download link will then be example.com/downloads/foobar.iso.
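A minimal sketch of that second option, assuming Apache's default document root of /var/www (as in the answer above) and the example filename foobar.iso:
# Create a downloads directory under the document root
sudo mkdir -p /var/www/downloads
# Copy the file in and make sure Apache can read it
sudo cp foobar.iso /var/www/downloads/
sudo chmod 644 /var/www/downloads/foobar.iso
# Quick sanity check from the server itself
curl -I http://localhost/downloads/foobar.iso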

How to make a new domain on a local PC? [closed]

Can someone point me to an application/service used for creating NEW domain names?
If I want to create a domain, for example somethingnew.com, and I don't want to purchase it somewhere on the net and point it at my PC, I want a program/service that gives me the same kind of registration those online services offer, just locally on my PC...
Is this possible?
I would like to use it with a WAMP (Apache) server, if that is possible...
If anyone has any pointers to a useful program or service, I would appreciate it...
I tried the Simple DNS Plus application, but it isn't working properly...
Does anyone have any suggestions?
You mean like with the hosts file? Just add your local domain names to your local PC's hosts file.
Edit the file C:\Windows\system32\drivers\etc\hosts
and add a line like this:
127.0.0.1 somethingnew.com www.somethingnew.com
This will cause your local system to resolve the above domain names to the local IP address.
Access to this file is usually restricted, so you might need to be an admin user.
I would like to use it with WAMP ( Apache ) server, if that is possible...
The SimpleDNS Plus tool you mentioned looks like a full DNS nameserver, which would only work with registered domains (nameservers are assigned to your registered domain).
WampDeveloper Pro, a WAMP application, has a LocalDNS tab that does the above, but is not a free tool.
HostsMan and HostsFileEditor are some other options, though I've not used these two before.
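To confirm a hosts-file entry is being picked up, a quick check from a command prompt (the command works the same on Windows and Linux):
ping somethingnew.com
The name should now resolve to 127.0.0.1. Note that nslookup queries a DNS server directly and bypasses the hosts file, so it will not reflect this change.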

How to get a root shell after logging in as a normal desktop user, and how to patch it? [closed]

I am new to Linux, and I am also interested in security. I learned from a friend that it is possible to get into a root shell after logging in as a normal desktop user. I don't know how to do it; I only know it's possible after exploiting some services.
When I searched this topic, I found that the first step is to find the SUID files, which are executables with special permissions. I used the command find / -perm +4000 to get a list of such files.
I don't know what to do after that to get into a root shell. I need to find such issues in my OS and patch them. Could you please help me?
After you get your list of root SUID programs (this is just one of many starting points when trying to get root on a system), you have to find out whether any of them are vulnerable to buffer overflows (you can start by searching the CVE database for their names) and get a shell payload executed when the hole is exploited (so you end up with a root shell).
There are many resources on the topic; googling for "buffer overflow" will get you to them.
There's no generic way to do this; it depends on what vulnerabilities exist on the system you've logged into, and that can vary from machine to machine. You need to look at what version of the OS is running, what vulnerabilities are known in that OS version, and which patches haven't been installed on the machine in question.
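For the auditing and patching side, here is a minimal sketch, assuming an rpm/yum-based system such as CentOS (on current versions of find, the -perm -4000 form replaces the deprecated +4000 syntax):
# List all SUID-root files, silencing errors from unreadable directories
find / -xdev -type f -perm -4000 -user root 2>/dev/null | sort > suid-files.txt
# Map each SUID binary to the package that owns it
xargs -a suid-files.txt -r rpm -qf --qf '%{NAME}\n' | sort -u > suid-packages.txt
# See whether any of those packages have updates waiting to be installed
yum list updates $(cat suid-packages.txt)
Anything that shows a pending update here, especially for a setuid binary, is a good candidate to patch first.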
