How to download whole web directory with multiple download streams? [closed] - linux

I want to download a directory with hundreds of large files (500 MB to 1.5 GB each). The only problem is that there is a download speed limit, and it takes nearly an hour to download a single file.
What built-in command or package on Linux can download all files in a web directory over multiple download streams? If I ever have to restart, the program needs to skip the files it has already downloaded.

Read the man page for wget. It supports most of what you want: recursive downloads and skipping files that have already been downloaded. (wget itself won't open multiple streams per file; for that, see the axel suggestion below.)
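For example, a minimal sketch (the URL is a placeholder):

    # -r  recurse through the directory listing
    # -np do not ascend to the parent directory
    # -nc no-clobber: skip files that already exist locally,
    #     so a restart picks up where you left off
    wget -r -np -nc https://example.com/files/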
Note that most sites will ban you for downloading too many files too quickly. Someone is paying for that bandwidth, and if you leech too much, it becomes a tragedy of the commons.

In addition to wget, you can use axel to open multiple streams for a single file, in case you need more speed on individual files as well.
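For instance (the URL and connection count are placeholders):

    # Download a single file over 8 parallel connections
    axel -n 8 https://example.com/files/file1.iso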

Related

Creating hidden OS with linux [closed]

I know that TrueCrypt isn't capable of creating a hidden OS, but in another post someone describes the steps to do it manually and says he does it to all of his Linux computers all the time. Can anyone elaborate on those steps so that someone less experienced (like myself) could accomplish this?
I would just ask this individual to provide more details, but it appears their account is "anonymous" or something.
I developed something like what you are describing.
Here, https://github.com/antonio-petricca/buddy-linux, you will find all the information and the installation script.
Buddy Linux lets you install Linux on (hidden) loop files, as in the link you provided, while serving the GRUB loader from an external USB drive. Remove the drive, and the machine boots straight into Windows.
The other nice part is that it is based on LVM, so you can extend the file system "simply" by adding loop files as your needs grow, as sketched below.
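To illustrate that LVM step, here is a rough sketch; the volume group and logical volume names (vg_buddy, lv_root) and the backing-file path are my assumptions, not necessarily Buddy Linux's actual layout:

    # Create a 10 GB backing file and attach it to a loop device
    # (device number is arbitrary; pick a free one)
    dd if=/dev/zero of=/host/buddy-extra.img bs=1M count=10240
    losetup /dev/loop1 /host/buddy-extra.img
    # Fold it into the volume group, then grow the root LV and its filesystem
    pvcreate /dev/loop1
    vgextend vg_buddy /dev/loop1
    lvextend -l +100%FREE /dev/vg_buddy/lv_root
    resize2fs /dev/vg_buddy/lv_root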
Regards.

how to bulk download text files from a webpage [closed]

I want to download all text files from this webpage...
https://ftp.ncbi.nlm.nih.gov/dbgap/studies/phs001672/analyses/
How can I do that and place the download in a zip file?
Use wget to download all the files, then zip them yourself.
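A minimal sketch (these flags are one reasonable combination, not the only one): -A '*.txt' keeps only the text files, -np avoids climbing to parent directories, and -nH with --cut-dirs=3 trims the saved path down to analyses/:

    wget -r -np -nH --cut-dirs=3 -A '*.txt' https://ftp.ncbi.nlm.nih.gov/dbgap/studies/phs001672/analyses/
    zip -r analyses.zip analyses/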
Or you can mount that FTP server as a drive in Windows and deal with the files in Explorer however you want (e.g., drag and drop them to another folder on your computer and zip them). Alternatively, skip the mounting and use a graphical FTP client to connect to the server and copy the files down.

ALL MY PICTURES DELETED UPON SWITCHING FROM WINDOWS TO UBUNTU [closed]

All my pictures and important info got deleted, along with my Exodus wallet recovery phrase and everything!! Is there a way I could get that data back??
If you didn't explicitly wipe the drives, you can probably still get your stuff back. The OS just lost track of where everything is.
If you load a Windows image onto a USB stick and boot from it, you will probably find your files.
They won't be inside the Documents folder or whatever default folders Windows comes with, though. You will have to go into the drive and search for the folder; the path will probably be under Users or something similar.

GitLab configuration issue [closed]

We have an on-premises GitLab instance. Is it possible to stop people from uploading certain kinds of files, e.g., to prevent people from pushing .exe files?
I've already looked at GitLab's APIs but couldn't find anything relevant.
Yes, you can prohibit certain file names in the push rules, or reduce the maximum file size to discourage people from uploading .exe files:
https://docs.gitlab.com/ee/push_rules/push_rules.html
As Iron Bishop pointed out, this feature is not available in the free plan.
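For illustration, a hedged sketch using the Push Rules API (paid tiers only; the host, project ID, and token are placeholders):

    # Reject any pushed file whose name ends in .exe
    curl --request POST \
         --header "PRIVATE-TOKEN: <your_access_token>" \
         --data 'file_name_regex=\.exe$' \
         "https://gitlab.example.com/api/v4/projects/42/push_rules"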
Alternatively, if you are so inclined, you could contribute a filter to the GitLab code base and then use it. It should be fairly simple to add, since the framework for filtering is already in place.
Comment here if you need further help or want to explore the custom-code option.
Cheers and all the best.

Real time synchronization for directory in Linux [closed]

I have two Linux web servers (X and Y) serving my website behind a load balancer. A user can upload a file (an image, for example) via a web form, and the file goes to
/var/www/files/token/filename.ext
NOW THE QUESTION IS:
How could I keep the files directory synchronized in real time between the two servers, given that it contains sub- and sub-sub-directories? I don't want to use NFS (for high-availability reasons).
Any scenario is highly appreciated.
The Linux kernel has a feature called inotify which detects inode changes; here it can be used to detect changes in the directory's contents. The inotify-tools package provides a CLI front end for it, inotifywait.
Once a change is detected, you can use a common file-synchronization tool such as rsync to push the new or changed files to the other server.
This idea is push-based, which is more responsive than polling at a regular interval.
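A minimal sketch of that push approach, assuming inotify-tools is installed and password-less SSH to the other server (the hostname serverY is a placeholder):

    #!/bin/bash
    # Watch the upload tree and mirror every change to the peer server.
    WATCH_DIR=/var/www/files
    REMOTE=serverY:/var/www/files/
    while inotifywait -r -e create,modify,move,delete "$WATCH_DIR"; do
        # -a preserves ownership/permissions/times, -z compresses in transit,
        # --delete removes files on Y that were removed on X
        rsync -az --delete "$WATCH_DIR"/ "$REMOTE"
    done

Run the mirror-image script on Y pointing back at X if the load balancer can route uploads to either server.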
