Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 2 years ago.
I am downloading many files from a password-protected server. They suggested using:
wget -i urllist.txt --user name --ask-password
to download the files. I can download them, but I want to rename each file as the script runs so that I end up with the correct file names.
For a single file the -O (--output-document) option worked, but with many files I am having a problem. Can you help me out?
I am not sure what you mean with:
I want to rename each file as the script runs so that I end up with the correct file names.
How about downloading all the files into one folder and later renaming them the way you want?
The wget manual says:
-P prefix
--directory-prefix=prefix
    Set directory prefix to prefix. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e. the top of the retrieval tree. The default is . (the current directory).
Therefore you could use:
wget -P download-folder -i urllist.txt --user name --ask-password
Now you can manually rename the files in download-folder.
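As a self-contained sketch of that workflow (the date-prefix renaming scheme is just an assumed example, and two placeholder files stand in for the actual downloads):

```shell
#!/bin/sh
# After something like:
#   wget -P download-folder -i urllist.txt --user name --ask-password
# the files sit in download-folder. Here two placeholder files stand in
# for the downloads, and a date prefix is one possible renaming scheme.
mkdir -p download-folder
touch download-folder/a.dat download-folder/b.dat   # stand-ins for downloads
stamp=$(date +%Y%m%d)
for f in download-folder/*; do
    mv "$f" "download-folder/${stamp}_$(basename "$f")"
done
ls download-folder
```

Any other renaming rule can go in place of the date prefix; the point is that downloading and renaming are two separate, simple steps.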
Closed 1 year ago.
I downloaded a file, info.tar.gz, into a directory called root.
In a Linux/Ubuntu terminal I did:
cd ./root
and ran ls, which shows the file:
info.tar.gz
Then, from inside the root directory, I ran:
tar -zxvf info.tar.gz
but the file is still in the root directory, zipped/tarred. Any help?
I also tried cd ./root/info.tar.gz, but it tells me that info.tar.gz isn't a directory.
How can I untar/unzip the file on Linux/Ubuntu?
You can just double-click the tar file in the file manager and choose 'Extract' to unpack it.
Another way is via the terminal: change into the directory where the archive is stored, then run:
tar -xf info.tar.gz
Note that extraction does not remove the archive itself; the extracted files appear alongside it.
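A self-contained sketch (a sample archive is created on the spot) showing that extraction leaves the archive in place, and that -C controls where the contents land:

```shell
#!/bin/sh
# Build a sample info.tar.gz, then extract it elsewhere with -C.
mkdir -p demo/src && echo hello > demo/src/info.txt
tar -czf demo/info.tar.gz -C demo/src info.txt   # make a sample archive
mkdir -p demo/out
tar -xzf demo/info.tar.gz -C demo/out            # extract into demo/out
ls demo/out           # the extracted info.txt
ls demo/info.tar.gz   # the archive itself is NOT removed by extraction
```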
Closed 7 years ago.
I just got a Raspberry Pi for Christmas and I wanted to delete some built-in programs because I want to set up a Linux server for home use. So far I have had to do this from the terminal every time, as root, to delete the files:
rm ./files/*
rmdir files
Is there any way to use the rmdir command when there are still files in the directory?
rm -rf files will remove the files directory and all subdirectories and not prompt you with questions about file permissions.
Sure, just delete recursively :)
rm -r files
In your terminal, change to the directory just above the one in question. Then:
$ mv ./dir_to_del/* . ; rmdir ./dir_to_del
This moves all the files out of the directory you want to delete, then deletes the now-empty folder. (Note that * does not match hidden dotfiles, so rmdir will still fail if the directory contains any.)
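A quick sketch contrasting the two commands: rmdir refuses a non-empty directory, while rm -r removes it together with its contents:

```shell
#!/bin/sh
# rmdir only removes empty directories; rm -r removes a whole tree.
mkdir -p files/sub
touch files/a.txt files/sub/b.txt
rmdir files 2>/dev/null || echo "rmdir refused: directory not empty"
rm -r files
test -e files || echo "files is gone"
```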
Closed 8 years ago.
While compiling C++ programs that use the libxml library, I was getting errors at the header files: no such file or directory. I had installed the library, but it still showed the errors. So I typed the command below, and after that everything worked fine, but I didn't understand it.
What is the meaning of "../" in UNIX? My command was "sudo cp -r libxml ../". What does it mean? How do I give relative paths in UNIX, and what are the different wildcards?
.. represents the parent directory. For example, if the current directory is /home/user/, the parent directory is /home.
. represents the current directory.
The command sudo cp -r libxml ../ copies the entire libxml directory into the parent directory.
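A minimal sketch of . and .. in action (the directory and file names are made up for illustration):

```shell
#!/bin/sh
# . is the current directory; .. is its parent.
mkdir -p demo/proj/libxml
echo '/* header */' > demo/proj/libxml/parser.h
cd demo/proj
cp -r libxml ../        # copies the libxml directory into demo/ (the parent)
test -f ../libxml/parser.h && echo "copied into the parent directory"
cd ../..                # back to where we started
```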
Closed 9 years ago.
I have a user account on a cluster (a server), and can only install programs like Python in my home folder. In case I might accidentally delete the bin, lib, share, and include folders that come with the Python installation in my home folder, I changed the permissions of those folders like this:
chmod -w folder
But I am worried that when a program needs to write or delete files in those folders, it might not work because of the removed write permission. Am I right? Or does a running program, including when it writes files in the folder, have permissions different from those of the user?
By the way, is there a way to hide the folders without changing their names?
Wouldn't this stop python from running altogether? For example:
$ cd ~
$ mkdir -p python/bin/
$ echo "echo 'hi'" > python/bin/python
$ python/bin/python
hi
$ chmod -x python
$ python/bin/python
bash: python/bin/python: Permission denied
As for your second question: no, there is no way to selectively hide a folder without changing its name.
Edit: re-reading, I may have misread what you were saying about the folders. You could always apply chmod -r folder, and nothing inside will be visible. This is not hiding it, just turning off permission to view it.
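To see concretely what removing write permission on a directory does, here is a minimal sketch (the file names are made up; note that root bypasses these permission checks, so try it as a normal user):

```shell
#!/bin/sh
# Removing write permission on a directory blocks creating or deleting
# entries inside it, but existing files can still be read.
mkdir -p safe && echo data > safe/file.txt
chmod -w safe
touch safe/new.txt 2>/dev/null || echo "cannot create files in safe/"
cat safe/file.txt          # reading existing files still works
chmod +w safe              # restore, so the folder can be cleaned up later
```

This matches the worry in the question: a program running as your user has exactly your permissions, so it cannot create or delete files inside a directory you made non-writable, though it can still read (and even modify) files that already exist and are themselves writable.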
Closed 9 years ago.
I am trying to do an scp (on SuSE LINUX) and seeing something that I did not expect.
scp -q -r /home/dir1/mydir host:/var/home/dirx/BACKUPS
If /var/home/dirx/BACKUPS/mydir already exists on the destination host, I see that the existing directories under it (including their modification times) are left untouched. Only new directories are created.
If there are files in the destination directory that do not exist in the source directory, they are preserved.
After the copy, I was expecting to see the destination directory as an exact copy of the source directory. Seems like more of a merge.
Is that how scp is supposed to work?
That's standard behavior for a copy command in pretty much any system. Files which exist in both locations will cause the destination to get refreshed with a source copy. Files which don't exist in the destination will be created/copied from the source.
Files which exist ONLY in the destination will not be affected, because it's not copy/cp's job to delete "stale" files - it has no way of knowing what a stale file is.
If you want to remove old/obsolete files in the destination, you'll need some other tool.
The other tool is ssh combined with tar. This will do the trick if you want to completely scrap the target directory and recreate it.
( cd /home/dir1/mydir && tar cf - . ) | ssh host "cd /var/home/dirx && rm -rf BACKUPS && mkdir BACKUPS && cd BACKUPS && tar xvf -"
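Here is the same tar pipeline as a local sketch, with the ssh hop replaced by a plain subshell so the scrap-and-recreate behavior can be seen end to end (directory names are made up):

```shell
#!/bin/sh
# The tar-pipe approach, run locally: the destination BACKUPS directory is
# removed and recreated, so destination-only ("stale") files disappear.
mkdir -p src && echo hello > src/file.txt
mkdir -p dst/BACKUPS && echo stale > dst/BACKUPS/old.txt   # destination-only file
( cd src && tar cf - . ) | \
  ( cd dst && rm -rf BACKUPS && mkdir BACKUPS && cd BACKUPS && tar xf - )
ls dst/BACKUPS   # old.txt is gone; file.txt has been copied over
```

In practice, rsync with the --delete option (not mentioned in the original answer) is the usual tool for mirroring a directory like this without recreating it from scratch.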