I am trying to build Linux From Scratch (LFS), following version 7.8 of the book, but I'm stuck because wget is not working.
When I execute the command
"wget --input-file=wget-list --continue --directory-prefix=$LFS/sources"
it returns an error
"wget-list: No such file or directory
No URLs found in wget-list."
I have created $LFS/sources directory.
Kindly let me know what I can do to get past this. Any help is appreciated.
You need to give a file path to --input-file, in this case wget-list. You can get the file from the LFS website: http://www.linuxfromscratch.org/lfs/view/stable/wget-list
Once you have downloaded wget-list into your current directory, you can try this:
"wget --input-file=wget-list --continue --directory-prefix=$LFS/sources"
It seems you don't have a file called wget-list in the current directory where you run the wget command.
The other possibility is that the wget-list file doesn't contain the URLs in a form wget can read.
I had this problem too, but I solved it by doing the following.
First, save wget-list in /mnt/lfs/sources with this command:
sudo wget "http://www.linuxfromscratch.org/lfs/downloads/7.7/wget-list" --continue --directory-prefix=/mnt/lfs/sources
Then use this command to download all the files:
sudo wget -i /mnt/lfs/sources/wget-list --directory-prefix=$LFS/sources
The command that you are executing cannot find the input file. The file should be placed in the same directory from where you are trying to execute the command. Or else, you can simply execute the following command to fetch all the packages:
sudo wget --input-file="http://www.linuxfromscratch.org/lfs/downloads/7.8/wget-list" --continue --directory-prefix=$LFS/sources
I had a similar problem where it was telling me it couldn't find any of the URLs, yet it downloaded one file. The solution was to enter the following command:
wget -nc --input-file="http://www.linuxfromscratch.org/lfs/view/stable/wget-list" --continue --directory-prefix=$LFS/sources
The -nc flag means "no clobber"; it stops wget from downloading the same file twice, which was the problem I was having.
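Putting the answers together, the whole fetch comes down to two commands. This is only a sketch, assuming $LFS is already exported and $LFS/sources exists and is writable:
# fetch the URL list into the current directory
wget http://www.linuxfromscratch.org/lfs/view/stable/wget-list
# download every package into $LFS/sources, resuming any partial files
wget --input-file=wget-list --continue --directory-prefix=$LFS/sources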
So I want to start the Tomcat server. To do this, I have to run a script whose path is the following:
/usr/local/Cellar/tomcat/9.0.6/libexec/bin/startup.sh
Since this is tedious to remember, I made a symbolic link:
tomcatsh/startup.sh
With the ln command, tomcatsh points to /usr/local/Cellar/tomcat/9.0.6/libexec/bin.
There is a problem when I run the shortened version: it yields an error saying that the startup.sh script couldn't find setclasspath.sh.
That other script is in the same folder, and it is not missing. Why doesn't startup.sh find it? What can I do to solve this problem?
If that symlink was previously defined to point at the file's folder, you have to call the command with the force parameter so the link gets updated:
ln -sf <file> <symlink>
rather than the plain creation parameter:
ln -s <file> <symlink>
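For example, to point a link at the script itself rather than at its directory, something like this should work; the link name ~/tomcat-startup is made up for illustration, and the Tomcat path is the one from the question:
# create (or force-update with -f) a symlink to the script itself
ln -sf /usr/local/Cellar/tomcat/9.0.6/libexec/bin/startup.sh ~/tomcat-startup
# run Tomcat's startup script through the link
~/tomcat-startup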
I have a problem executing the command below:
tar -xvf arch.tar.gz -s '/^bundle//'
It could be that
-s '/^bundle//'
is the problem, as I get errors like:
$ tar -xvf arch.tar.gz -s '/^bundle//'
tar: /^bundle: Not found in archive
tar: Exiting with failure status due to previous errors
I tried to run the command under Cygwin on Windows 10.
It's part of a longer script, but I'm not sure what the original author's idea was. The archive does include a 'bundle' folder inside... and it's the only first-level entry there.
Thank you in advance :)
In GNU tar, -s does not mean substitution, which is how you seem to be trying to use it. You probably want --xform='s/^bundle//'
-s has the following entry in the help listing:
-s, --preserve-order, --same-order
member arguments are listed in the same order as
the files in the archive
With your command, tar is actually trying to find a file named /^bundle// in the archive, which does not exist even though bundle does. The --xform option I gave will rewrite the names of extracted files to strip the string bundle from the front. If you are instead trying not to extract bundle at all, you want the flag --exclude='bundle'
In this case, if bundle is a top-level directory in the archive and it's the only one, you could also use the flag --strip-components=1. Note that this strips the first path component from every entry, so it might not be exactly what you want depending on your archive
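To see the behavior concretely, here is a small sketch with a throwaway archive (the file names are invented for the demonstration, assuming GNU tar):
# build a test archive whose only top-level entry is bundle/
mkdir -p bundle && echo hello > bundle/file.txt
tar -czf arch.tar.gz bundle
# extract while stripping the bundle/ prefix from member names
tar -xvf arch.tar.gz --xform='s|^bundle/||'
# file.txt now lands in the current directory instead of ./bundle/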
Thanks, all.
I solved the problem another way, without 'tar', but for those who may be interested, here is the answer I found on the web:
If you are developing on Linux, or using GNU tar, this command should work:
tar -xvf arch.tar.gz --transform 's|^bundle/||'
For Mac or BSD-based operating systems:
tar -xvf arch.tar.gz -s '/^bundle//'
Yes, the idea was to remove the bundle/ folder from the file paths.
I am running wget through a cron job to execute a script on a schedule. Every time, the output is downloaded and saved as a new file. I want to append the output to the same file. How can I do that?
To be clear, I am talking about the content downloaded from the URL, not the log of the execution.
You can do it using the following command:
wget <URL> -O - >> <FILE_NAME>
Here -O - writes the downloaded content to standard output, and the shell's >> redirection appends it to the file.
My first approach would be to download to a temporary file, append the content of the newly downloaded file to the previously downloaded one, and then delete the temporary file.
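A sketch of that two-step approach, written so it can run unattended from cron; the URL and paths are placeholders:
# fetch quietly into a temporary file
wget -q "http://example.com/data" -O /tmp/latest.out
# append the fresh content to the accumulating file, then clean up
cat /tmp/latest.out >> /var/data/all.out
rm -f /tmp/latest.out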
I want to download a number of files which are as follows:
http://example.com/directory/file1.txt
http://example.com/directory/file2.txt
http://example.com/directory/file3.txt
http://example.com/directory/file4.txt
.
.
http://example.com/directory/file199.txt
http://example.com/directory/file200.txt
Can anyone help me with this using shell scripting? Here is what I'm using, but it downloads only the first file.
for i in {1..200}
do
    exec wget http://example.com/directory/file$i.txt;
done
wget http://example.com/directory/file{1..200}.txt
should do it. That expands to wget http://example.com/directory/file1.txt http://example.com/directory/file2.txt ....
Alternatively, your current code should work fine if you remove the call to exec, which is unnecessary and doesn't do what you seem to think it does: exec replaces the shell process with wget, so the loop never gets past the first iteration.
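For reference, here is the question's loop with exec removed; the URL is the same placeholder used in the question:
for i in {1..200}
do
    # each wget runs as a normal child process, so the loop continues
    wget "http://example.com/directory/file$i.txt"
done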
To download a list of files you can use wget -i <file>, where <file> is a file containing the list of URLs to download.
For more details you can review the help page: man wget
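As an illustration with the numbered URLs from the question, the list file can be generated and handed to wget -i like this (the name url-list.txt is made up):
# write the 200 URLs, one per line, into a list file
printf 'http://example.com/directory/file%d.txt\n' {1..200} > url-list.txt
# hand the list to wget
wget -i url-list.txt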
OK, so I need to run wget, but I'm prohibited from creating 'dot' files in the location where I need to run it. So my question is: can I get wget to use a name other than .listing, one that I can specify?
Further clarification: this is to sync/mirror an FTP folder with a local one, so using the -O option is not really useful, as I need all files to keep their original names.
You can use the -O option to set the output filename, as in:
wget -O file http://stackoverflow.com
You can also use wget --help to get a complete list of options.
For folks who come along afterwards and are surprised by an answer to the wrong question, here is a copy of one of the comments from below:
@FelixD, yes, unfortunately I misunderstood the question. Looking at the code for wget version 1.19 (Feb 2017), specifically ftp.c, it appears that the .listing file name is hardcoded in the macro LIST_FILENAME, with no override possible. There are probably better options for mirroring FTP sites; maybe take a look at lftp and its mirror command, which also supports parallel downloads: lftp.yar.ru
#Paul: You can use that -O option specified by spong
No. You can't do this.
wget/src/ftp.c
/* File where the "ls -al" listing will be saved. */
#ifdef MSDOS
#define LIST_FILENAME "_listing"
#else
#define LIST_FILENAME ".listing"
#endif
I have the same problem:
wget seems to save the .listing file in the current directory where wget was called from, regardless of -O path/output_file.
As an ugly/desperate workaround, we can try running wget from throwaway directories:
cd /temp/random_1; wget ftp://example.com/ -O /full/save_path/to_file_1.txt
cd /temp/random_2; wget ftp://example.com/ -O /full/save_path/to_file_2.txt
Note: the manual says that using the --no-remove-listing option will cause wget to create .listing.1, .listing.2, and so on, so that might be an option to avoid conflicts.
Note: the .listing file is not created at all if the FTP login fails.
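Following the lftp suggestion quoted in the comment above, mirroring an FTP folder without wget's hardcoded .listing file might look like this; the host, credentials, and directory names are placeholders:
# mirror a remote FTP tree into a local directory, four transfers at a time
lftp -u user,password -e "mirror --parallel=4 /remote/dir /local/dir; quit" ftp://example.com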