docker container exec -i [container_name] stat -c '%a' /path - linux

I got this code from a comrade, but I don't even know what the "stat -c '%a'" part means.
I looked through Google and the docker man pages, but nothing was clear.
#!/bin/bash
set -e
commandDocker=$(docker exec -i name)
filepermissions=$(docker exec -i name stat -c '%a' /etc/ssl/certs/certificate.pem)
$commandDocker mkdir /var/www/certificate
No error, but I have to update the work wiki with this code and explain it, and I don't know what that means :/

I already figured it out: -c applies a custom output format, and '%a' prints the file's permissions in octal.
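For illustration, running the same stat against a world-readable file prints the permission bits in octal (the path and the 644 output here are just an example):
stat -c '%a' /etc/ssl/certs/certificate.pem
644
So the filepermissions line of the script captures the certificate's permissions inside the container, e.g. 644, for later use.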

Related

The script sometimes doesn't run after wget

The script sometimes fails after the wget step. Perhaps it is necessary to wait for wget to complete?
#!/usr/bin/env bash
set -Eeuo pipefail
# Installing tor-browser
echo -en "\033[1;33m Installing tor-browser... \033[0m \n"
URL='https://tor.eff.org/download/' # Official mirror https://www.torproject.org/download/, may be blocked
LINK=$(wget -qO- $URL | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz')
URL='https://tor.eff.org'${LINK}
curl --location $URL | tar xJ --extract --verbose --preserve-permissions
sudo mv tor-browser /opt
sudo chown -R $USER /opt/tor-browser
cd /opt/tor-browser
./start-tor-browser.desktop --register-app
There are pitfalls associated with set -e (aka set -o errexit). See BashFAQ/105 (Why doesn't set -e (or set -o errexit, or trap ERR) do what I expected?).
If you decide to use set -e despite the problems then it's a very good idea to set up an ERR trap to show what has happened, and use set -E (aka set -o errtrace) so it fires in functions and subshells etc. A basic ERR trap can be set up with
trap 'echo "ERROR: ERR trap: line $LINENO" >&2' ERR
This will prevent the classic set -e problem: the program stops suddenly, at an unknown place, and for no obvious reason.
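Putting the pieces together, a minimal sketch (the fetch function and its failure are invented for illustration):
#!/usr/bin/env bash
set -Eeuo pipefail
trap 'echo "ERROR: ERR trap: line $LINENO" >&2' ERR

fetch() {
    false   # simulated failure; without -E the ERR trap would not fire inside this function
}
fetch
Running this prints the failing line number to stderr instead of dying silently.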
Under set -e, the script stops on any error. That is the e in your:
set -Eeuo pipefail
#      ^
Maybe the site is sometimes unavailable, or the fetched page doesn't match the expression grep is searching for.
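One defensive tweak worth considering (a sketch, not part of the original script): fail with a clear message when the link cannot be extracted, so a temporarily unavailable site or a changed page doesn't silently hand an empty LINK to the next step:
LINK=$(wget -qO- "$URL" | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz') || {
    echo "ERROR: could not extract the download link from $URL" >&2
    exit 1
}
Because the script runs under set -Eeuo pipefail, a failure of either wget or grep makes the whole pipeline fail, and the || branch reports it.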
You are doing
wget -qO- $URL
according to wget man page
-q
--quiet
Turn off Wget's output.
This is counterproductive for finding the objective cause of the malfunction. By default wget is verbose and writes information to stderr; if you wish to keep that, you can redirect stderr to a file. Consider the following simple example:
wget -O - http://www.example.com 2>>wget_out.txt
It downloads Example Domain and writes its content to standard output (-), while stderr is appended to a file named wget_out.txt. So if you run that command e.g. 3 times, you will have the information from all 3 runs in wget_out.txt.
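Once a failure reproduces, the accumulated log can be searched for the server's responses; the 503 line below is only an example of what a transient outage might look like:
grep -E 'HTTP request sent|ERROR' wget_out.txt
HTTP request sent, awaiting response... 503 Service Unavailable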

Passing attributes to chef via command line

This is driving me nuts, any help is massively appreciated.
Currently I'm using a recipe to run an ssh command, whereby the command takes in args and then uses them.
The escaping of the string's quotes is quite literally sending me insane; please help me SO, you're my only hope. :D
This is the literal string that I need for my ssh:
ssh -i /home/ec2-user/.ssh/Test-Key.pem -o StrictHostKeyChecking=no ec2-user@ipAddress echo '{\"attr\":\"value\"}' | sudo chef-client -o solr-restart -j /dev/stdin
it's wrapped in a command within the recipe like so:
command "ssh -i /home/ec2-user/.ssh/Test-Key.pem -o StrictHostKeyChecking=no ec2-user#ipAddress echo '{\"attr\":\"value\"}' | sudo chef-client -o solr-restart -j /dev/stdin"
no matter how I try and manipulate the string I cannot get the output to be correct, it either removes the escaped characters in the json, or adds in additional ones.
I've tried to echo '#{madness}'
where madness = '{\"portAttribute\":\"'+"#{portNumber}"+'\"}'
but still no luck. Thanks for any help.
IMHO your string interpolation looks fine, but as you want to run the following command on the remote machine:
echo '{\"portAttribute\":\"#{portNumber}\"}' | sudo chef-client -o solr-restart -j /dev/stdin
the command should be tweaked a bit more and passed in the recipe as:
command "ssh -i /home/ec2-user/.ssh/Test-Key.pem -o StrictHostKeyChecking=no ec2-user#ipAddress 'echo \'{\\\"portAttribute\\\":\\\"#{portNumber}\\\"}\' | sudo chef-client -o solr-restart -j /dev/stdin' "
This works:
{\\\"attr\\\":\\\"value\\\"}'
You reeeeeeally probably don't mean to be using -j; that totally overwrites whatever data is on the node already and is only intended for initial bootstrapping. After that, you don't pass data in on the command line, it comes from the Chef Server.

Remote SSH commands not working in Linux

Regardless of why, I am trying to write a script that will let me send a command to various addresses. There is a shared key for the user, so there is no need for logging in. But this isn't working.
So, the following will not work...
#!/bin/bash
ip=$1
shift
args="'$#'"
cmd="ssh user#$ip -C $args"
output=$($cmd)
If I execute it with the following:
./myscript.sh 10.0.1.2 /bin/ls -l /var
I get the error of "ls -l /var: No such file or directory"
If I run that command by hand (ssh user@10.0.1.2 -C '/bin/ls -l /var'), it works fine.
What am I doing wrong? These are the same installs of RHEL6.
Apparently, the quotes were confusing bash. The following works...
ip=$1
shift
output=$(ssh -o ConnectTimeout=1 "User@$ip" "$@")
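For the record, the underlying issue is the classic BashFAQ/050 pitfall: quote characters stored inside a variable are passed along as literal characters, never re-parsed as quoting. A small demonstration (10.0.1.2 stands in for any host with the shared key):
cmd="ssh user@10.0.1.2 -C '/bin/ls -l /var'"
$cmd
# The remote shell receives the literal string '/bin/ls -l /var', quotes included,
# and tries to execute a single program with that whole name, hence the
# "/bin/ls -l /var: No such file or directory" error.
Passing "$@" straight to ssh, as in the working version above, sidesteps the problem because each argument stays a separate word.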

wget -O for non-existing save path?

I can't wget when the path to save to doesn't exist yet. I mean, wget doesn't work for non-existing save paths. For example:
wget -O /path/to/image/new_image.jpg http://www.example.com/old_image.jpg
If /path/to/image/ does not already exist, it always returns:
No such file or directory
How can I make it automatically create the path and save the file?
Try curl
curl http://www.site.org/image.jpg --create-dirs -o /path/to/save/images.jpg
mkdir -p /path/i/want && wget -O /path/i/want/image.jpg http://www.com/image.jpg
To download a file with wget, into a new directory, use --directory-prefix without -O:
wget --directory-prefix=/new/directory/ http://www.example.com/old_image.jpg
Using -O new_file in conjunction with --directory-prefix will not create the new directory structure, and will save the new file in the current directory.
It may even fail with a "No such file or directory" error if you specify -O /new/directory/new_file.
I was able to create the folder if it doesn't exist with this command:
wget -N http://www.example.com/old_image.jpg -P /path/to/image
wget only gets the file; it does NOT create the directory structure for you (mkdir -p /path/to/image/), so you have to do this yourself:
mkdir -p /path/to/image/ && wget -O /path/to/image/new_image.jpg http://www.example.com/old_image.jpg
You can tell wget to create the directory (so you don't have to use mkdir) with the parameter --force-directories.
Altogether this would be:
wget --force-directories -O /path/to/image/new_image.jpg http://www.example.com/old_image.jpg
After searching a lot, I finally found a way to make wget download to a non-existing path.
wget -q --show-progress -c -nc -r -nH -i "$1"
=====
Clarification
-q
--quiet (combined here with --show-progress)
Kill the annoying output but keep the progress bar
-c
--continue
Resume the download if the connection is lost
-nc
--no-clobber
Skip files that already exist (do not overwrite them)
-r
--recursive
Download in recursive mode, which creates the directory structure (what the topic creator asked for!)
-nH
--no-host-directories
Tell wget not to use the domain as a directory (e.g. for https://example.com/what/you/need,
without this option it would download into "example.com/what/you/need")
-i
--input-file
File with the URLs to download (in case you want to download a lot of URLs,
otherwise just remove this option)
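For instance (urls.txt is a hypothetical file with one URL per line):
cat urls.txt
https://example.com/what/you/need/a.jpg
https://example.com/what/you/need/b.jpg
wget -q --show-progress -c -nc -r -nH -i urls.txt
With -r and -nH, wget creates what/you/need/ under the current directory and saves the files there.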
Happy wget-ing!

WGET without the log file

Every time I use wget http://www.domain.com a log file is saved automatically on my server. Is there any way to run this command without logging?
Thanks,
Joel
You could try -o and -q
-o logfile
--output-file=logfile
Log all messages to logfile. The messages are
normally reported to standard error.
-q
--quiet
Turn off Wget's output.
So you'd have:
wget ... -q -o /dev/null ...
This will print the site contents to standard output; is this what you mean when you say you don't want logging to a file?
wget -O - http://www.domain.com/
I personally found that @Paul's answer was still creating a log file, regardless of the -q command-line option.
So I added -O /dev/null on top of the -o output-file argument:
wget [url] -q -o /dev/null -O /dev/null
