Most efficient way to get the latest version of an rpm via web - linux

This is my attempt: use wget to pull down the web page, dig out the latest tar file, and run a second wget to download it. In this example I'm fetching pip.
wget https://pypi.org/project/pip/#files
wget $(grep tar.gz index.html | head -1 | awk -F= '{print $2}' | sed 's/>//' | sed 's/\"//g')
gunzip -c $(ls | grep tar |tail -1) | tar xvf -
yum install -y python-setuptools
cd $(ls -d */ | grep pip)
python setup.py install
cd ..
I'm sure there is a better way, perhaps using only one wget or similar.

Do you mean something like this?
wget $(curl -s "https://pypi.org/project/pip/#files"|grep -o 'https://[^"]*tar\.gz')
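For what it's worth, PyPI also exposes the same information as JSON, which avoids scraping HTML entirely. A minimal sketch, assuming the pypi.org JSON endpoint and python3 are available:
curl -s https://pypi.org/pypi/pip/json |
  python3 -c 'import json, sys
data = json.load(sys.stdin)
# print the sdist URL for the latest release
print(next(u["url"] for u in data["urls"] if u["url"].endswith(".tar.gz")))' |
  xargs wget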

Linux curl : no url found (or) curl: malformed url

I am setting up Docker on my Linux VM and have to run this command as part of the steps, but even though the command clearly builds a URL, curl reports these errors. I already tried changing -o to -O once, but I still get the same errors. What should I do?
This is the command I'm running:
sudo curl -L $(curl -L https://api.github.com/repos/docker/compose/releases/latest | grep "browser_download_url" | grep "$(uname -s)-$(uname -m)\"" | sed -nr 's/\s+"browser_download_url":\s+"(https.*)"/\1/p') -o /usr/local/bin/docker-compose
The inner grep that filters for your system compares against the output of uname -s, which is Linux with an uppercase L, while the release asset URLs use lowercase linux. Nothing matches, the command substitution comes back empty, and the outer curl is left with no URL. Make the match case-insensitive with grep -i:
sudo curl -L $(curl -L https://api.github.com/repos/docker/compose/releases/latest | grep "browser_download_url" | grep -i "$(uname -s)-$(uname -m)\"" | sed -nr 's/\s+"browser_download_url":\s+"(https.*)"/\1/p') -o /usr/local/bin/docker-compose
Hope this helps!
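If jq is available (an extra dependency, so treat this as an optional sketch rather than the canonical fix), parsing the release JSON directly is sturdier than chaining grep and sed:
sudo curl -L "$(curl -sL https://api.github.com/repos/docker/compose/releases/latest \
  | jq -r --arg suffix "$(uname -s)-$(uname -m)" \
      '.assets[] | select(.name | test($suffix + "$"; "i")) | .browser_download_url')" \
  -o /usr/local/bin/docker-compose
The anchored, case-insensitive test also skips checksum assets such as *.sha256.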

How to automate installation of missing GPG keys on Linux

I've been working with Linux containers for several years. I am surprised that I wasn't able to find a thread about this question. Scenario:
I've just added a new package index (/etc/apt/sources.list.d/example.list) and want to install a package, let's call it snailmail.
I run the commands:
apt-get update && apt-get install -y snailmail
I get the following error:
W: GPG error: https://example.com/snailmail/debian stable InRelease:
The following signatures couldn't be verified because the public key is not available:
NO_PUBKEY 7EF2A9D5F293ECE4
What is the best way to automate the installation of GPG keys?
apt-key now seems to be deprecated, so I created a script that detects and fetches the missing keys:
#!/bin/sh -e
tmp="$(mktemp)"
sudo apt-get update 2>&1 | sed -En 's/.*NO_PUBKEY ([[:xdigit:]]+).*/\1/p' | sort -u > "${tmp}"
cat "${tmp}" | xargs sudo gpg --keyserver "hkps://keyserver.ubuntu.com:443" --recv-keys # to /usr/share/keyrings/*
cat "${tmp}" | xargs -L 1 sh -c 'sudo gpg --yes --output "/etc/apt/trusted.gpg.d/$1.gpg" --export "$1"' sh # to /etc/apt/trusted.gpg.d/*
rm "${tmp}"
Here's a handy script that can be called during the build process to download and install common GPG keys (from the Ubuntu keyserver):
Prerequisites:
wget
for PUBKEY in $(apt-get update 2>&1 | grep NO_PUBKEY | awk '{print $NF}')
do
  wget -q "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x${PUBKEY}" -O - | sed -n '/BEGIN/,/END/p' | apt-key add - 2>/dev/null
done
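Since apt-key is deprecated, the current replacement is a per-repository keyring referenced via signed-by. A minimal sketch using the question's hypothetical snailmail repository (the component name main is an assumption):
# fetch the missing key and store it dearmored in its own keyring
wget -qO- "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x7EF2A9D5F293ECE4" \
  | gpg --dearmor | sudo tee /usr/share/keyrings/snailmail.gpg > /dev/null
# reference that keyring explicitly in the source entry
echo "deb [signed-by=/usr/share/keyrings/snailmail.gpg] https://example.com/snailmail/debian stable main" \
  | sudo tee /etc/apt/sources.list.d/example.list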

Why did this command destroy my Ubuntu 14.04 installation?

I successfully used this command to remove all the old kernels from my system:
dpkg --list |
grep linux-image |
awk '{ print $2 }' |
sort -V |
sed -n '/'"linux-image-3.13.0-100-generic"'/q;p' |
xargs sudo apt-get -y purge
But when I used this modified version to un-install cups, dpkg started to remove packages unrelated to cups:
dpkg --list |
grep cups |
awk '{ print $2 }' |
sort -V |
xargs sudo apt-get -y purge
By the time I realized what was happening, my system had already become unbootable. I don't know whether this is expected behavior with xargs, but I could not stop the execution with Ctrl+C.
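For anyone hitting the same trap: dpkg --list prints a description column, so grep cups also matches packages whose descriptions merely mention CUPS, and purging those drags in their reverse dependencies. A safer sketch that matches package names only (the pattern is illustrative, adjust to taste):
# list bare package names, anchor the match, and preview before purging
dpkg-query -W -f '${Package}\n' \
  | grep -E '^(lib)?cups' \
  | xargs --no-run-if-empty sudo apt-get purge  # no -y: review apt's removal list first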

How do you use wget to download the most up-to-date file on a site?

Hello, I am trying to use wget to download the most up-to-date McAfee DAT patch, and I am having issues singling out the .tar file. This is what I have:
wget -q -O - ftp://ftp.mcafee.com/pub/antivirus/datfiles/4.x/ | grep -o -m 2 "avvdat-[^\']*"
However, when I run the above command it gives me:
avvdat-8065.tar">avvdat-8065.tar</a> (95191040 bytes)
avvdat-8066.tar">avvdat-8066.tar</a> (95385600 bytes)
But I need just the most recent .tar file between the <a> </a> tags, which in this case would be avvdat-8066.tar. Can someone please help me grep out the correct .tar file? I am not too good with regex or sed.
Try this:
wget $(wget -q -O - ftp://ftp.mcafee.com/pub/antivirus/datfiles/4.x/ | grep -Eo "ftp://[^\"\]+" | sort | tail -n1)
I'd suggest modifying your grep regex so it retrieves only the file name, then using sort to sort the results and tail to discard all but the last one.
wget -q -O - ftp://ftp.mcafee.com/pub/antivirus/datfiles/4.x/ | grep -o -m 2 "avvdat-[^\'\"]*" | sort | tail -1
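One caveat: plain sort is lexicographic, so this only works while the number keeps the same width (avvdat-9999 would sort after avvdat-10000 and win the tail). GNU sort's version sort avoids that:
wget -q -O - ftp://ftp.mcafee.com/pub/antivirus/datfiles/4.x/ | grep -o "avvdat-[^\'\"]*" | sort -V | tail -1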

How to create an offline repository for debian non-free?

I am using Debian squeeze and want to create an offline repository or a CD/DVD for the Debian non-free branch. I looked around the internet, and all I found is that there are neither ISO images nor jigdo files for creating such an image, so I had the idea of fetching the packages from one of the Debian package servers using:
wget -r --no-parent -nH -A*all.deb,*any.deb,*i386.deb \
ftp://debian.oregonstate.edu/debian/pool/non-free/
I know that I must use file: in my */etc/apt/sources.list* to point to local repositories, but how do I actually create one so that apt or aptitude understands it?
(Answered in a question edit. Converted to a community wiki answer. See What is the appropriate action when the answer to a question is added to the question itself?)
The OP wrote:
Update: With a few ugly tricks I was able to extract the needed data from pool and the dist folder.
I used the unzipped Packages.gz to do this:
grep '^Package\:.*' Packages|awk '{print $2}' >> Names.lst
grep '^Version\:.*' Packages|awk '{print $2}' >> Versions.lst
grep '^Architecture\:.*' Packages|awk '{print $2}' >> Arch.lst
With vim I find and remove the ':' in Versions.lst and generate a shorter Content.lst that is easier to parse with bash tools:
paste Names.lst Versions.lst Arch.lst >> Content.lst
Now I do this:
cat Content.lst | while read line; \
do echo "$(echo $line|awk '{print $1}')\
_$(echo $line|awk '{print $2}')_$(echo $line|awk '{print $3}')";\
done >> Content.lst.tmp && mv Content.lst.tmp Content.lst
which generates the file names in the debian directory that I need. After finishing my downloads with wget, I find and rsync the needed files. mv does not work here because I need the structure as it is referred to in Packages.gz:
cat Content.lst | while read line; \
do find debian/ -type f -name ${line}.deb -exec \
rsync -rtpog -cisR {} debian2/ \; ;done
rm -r debian && mv debian2 debian
To receive the complete dists tree structure I used wget again:
wget -c -r --no-parent -nH -A*.bz2,*.gz,Release \
ftp://debian.oregonstate.edu/debian/dists/squeeze/non-free/binary-i386/
I think the only thing I have to do now is to create the Contents.gz file.
The Contents.gz file can easily be created using the apt-ftparchive program (it needs the path to the pool to scan):
apt-ftparchive contents debian/pool/non-free > Contents-i386 && gzip -f Contents-i386
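To make apt actually use the local tree (the original question), the Packages index can be regenerated the same way so it matches exactly what was mirrored; a minimal sketch, run from the directory containing debian/ (the file: path below is an assumption, adjust to where the mirror lives):
apt-ftparchive packages debian/pool/non-free \
  | gzip -9 > debian/dists/squeeze/non-free/binary-i386/Packages.gz
# then point apt at it in /etc/apt/sources.list:
# deb file:/path/to/mirror/debian squeeze non-free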
