I have two Linux web servers (X and Y) serving my website behind a load balancer. A user can upload a file (an image, for example) via a web form, and the file is written to
/var/www/files/token/filename.ext
Now the question is:
How can I keep the files directory synchronized in real time between the two servers, given that it contains subdirectories nested several levels deep? I don't want to use NFS (for high-availability reasons).
Any suggested scenario is highly appreciated.
The Linux kernel has a feature called "inotify" that reports inode changes; here it can be used to detect changes in the directory's contents. The inotify-tools package provides CLI tools for it, such as inotifywait.
Once a change is detected, we can use a common file synchronization tool such as rsync to copy new or changed files to the other server.
This approach is push-based, which is more responsive than polling at a regular interval.
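A minimal sketch of that push loop, assuming the inotify-tools package is installed and passwordless SSH to the peer is already set up (the host name serverY is an example):

#!/bin/bash
# Watch the upload tree recursively and push every change to the peer.
WATCH_DIR=/var/www/files
PEER=serverY   # example peer host

inotifywait -m -r -e close_write,create,delete,move --format '%w%f' "$WATCH_DIR" |
while read -r changed; do
    # Re-syncing the whole tree is simple and safe; for very large trees
    # you could rsync only the changed path ($changed) instead.
    rsync -az --delete "$WATCH_DIR"/ "$PEER:$WATCH_DIR/"
done

lsyncd packages exactly this inotify-plus-rsync pattern as a daemon, if you would rather not maintain the script yourself.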
I know that TrueCrypt isn't capable of creating a hidden OS on Linux, but in another post someone describes the steps to do it manually and says he does it on all of his Linux computers all the time. Can anyone elaborate on those steps so that someone less experienced (like myself) could accomplish this?
I would just ask this individual to provide more details, but it appears their account is "anonymous" or something.
I developed something like what you are describing.
At https://github.com/antonio-petricca/buddy-linux you can find all the information and the installation script.
Buddy Linux allows you to install Linux on (hidden) loop files (as in the link you provided), while the GRUB loader is provided by an external USB drive. So, removing the drive results in a Windows boot.
The other nice thing is that it is based on LVM, so you can extend the file system "simply" by adding loop files as your needs grow.
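To illustrate the LVM part, here is a hypothetical sketch of growing such a setup with one more loop file; the backing-file path and the vg_buddy/lv_root names are invented for the example, not buddy-linux's actual layout:

# Create a 10 GB sparse backing file and attach it as a loop device.
dd if=/dev/zero of=/hidden/extra.img bs=1M count=0 seek=10240
LOOP=$(losetup --find --show /hidden/extra.img)

# Add it to the volume group, then grow the logical volume and the
# filesystem (resize2fs assumes an ext2/3/4 filesystem).
pvcreate "$LOOP"
vgextend vg_buddy "$LOOP"
lvextend -l +100%FREE /dev/vg_buddy/lv_root
resize2fs /dev/vg_buddy/lv_root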
Regards.
All my pictures and important info got deleted, along with my Exodus wallet recovery phrase and everything! Is there a way I could get that data back?
If you didn't explicitly wipe out the drives, you can still get your stuff. The OS just lost track of where everything is.
If you load a Windows image onto a USB stick and boot from it, you will probably find your files. They won't be inside the Documents folder or whatever default folders Windows comes with, though; you will have to go into the old drive and search for the folder. The path will probably be under Users or something similar.
I want to download a directory with hundreds of large files (500 MB to 1.5 GB each). The only problem is that there is a download speed limit, and it takes nearly an hour just to download a single file.
What built-in command or package on Linux could download all the files in a web directory over multiple download streams? If I ever have to restart, I need the program to skip already-downloaded files.
Read the man page for wget. It supports exactly what you want.
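A sketch of the wget invocation, assuming the files are exposed through a plain HTTP directory index (https://example.com/files/ is a placeholder URL):

# --continue resumes partial downloads and skips files that are already
# complete; --no-parent keeps the crawl inside the target directory.
wget --recursive --level=1 --no-parent --continue https://example.com/files/

wget itself fetches files one at a time; to download several files in parallel you can split the URL list across a few wget processes (for example with xargs -P).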
Note that most sites will ban you for downloading too many files too quickly. Someone is paying for that bandwidth, and if you leech too much, it becomes a tragedy of the commons.
In addition to wget, you can use axel to open multiple streams for a single file, in case you need more speed on individual files as well.
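For example, fetching one file over eight connections (the URL is a placeholder):

axel -n 8 https://example.com/files/big-file-001.bin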
Since I am not that experienced with Linux, this might be an easy, if not too simple, question for you. Recently I met an old friend of mine, and I want to exchange some files with him. I could send the files by email or share them via Dropbox or something like that, but I want to make use of Linux and my Raspberry Pi.
The Raspberry Pi can be accessed via SSH, and I want my friend to be able to access one specific directory: the one where I place the files.
I don't want him to mess around in the system. Ideally, he should be able to see only this one directory.
Is it enough if I create a user and put the files in his home directory?
Thanks in advance
See this introduction to permission management on Linux.
To answer your question:
Is it enough if I create a user and put the files in his home directory?
Yes, but it's not a perfect solution, because a user's home folder contains other subfolders as well.
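If he really should see only that one directory, a common approach is OpenSSH's built-in SFTP chroot; here is a minimal sketch, where the user name friend and the paths are examples:

# In /etc/ssh/sshd_config:
Match User friend
    ChrootDirectory /home/friend     # must be owned by root and not writable by friend
    ForceCommand internal-sftp       # SFTP only, no shell access
    AllowTcpForwarding no
    X11Forwarding no

# Give him a writable subdirectory inside the chroot:
#   sudo mkdir -p /home/friend/files
#   sudo chown friend:friend /home/friend/files
# Then restart sshd, e.g.: sudo systemctl restart ssh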
I would like to ask how to copy files from one of my Linux server accounts to another account. If anyone knows how, please help me.
Take a look at the man pages for scp. This is a very useful command that I use rather often at my job.
It works easiest if you are logged into the server that has the file you want to transfer (IMO). The syntax is scp src_file username@remote_host:dst_file, where the text that comes after the : in the second part is the destination path on the other server.
For example, if you have a file called "file.txt" on server1, and you want to put it on server2, you would type:
scp file.txt username@server2.name.or.ip:/home/other_username
or wherever you want to put the file. I would recommend copying the file to your home directory on the other server first, as that minimizes permission issues, in my experience.
EDIT: If you want to log into the server that is going to receive the file, you can just swap the first and second arguments to copy from the remote server to the local one.
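For example, pulling a file from the remote server into the current directory (host and path are placeholders):

scp username@server1.name.or.ip:/home/username/file.txt .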