Editing a file with vim without typing the path (similar to autojump) - Linux

A few months back, I installed a utility on my Mac so that instead of typing something like this:
vim /type/path/to/the/file
I could just type:
v file
Nine times out of ten it would guess the right file based on past history, similar to the way autojump works. And instead of typing vim I could just type the letter v.
I can't remember how I set this up, though. It still works on my Mac, but I don't see anything in my .bash_profile that shows how I did it.
I'm trying to get this to work on my Linux box.

This can be found here:
https://github.com/rupa/v/blob/master/v
It should work on Linux too. It is a bash script that uses the viminfo history file to fill in partial strings.
It can be installed on macOS with brew install v
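For context, the entries the script scans are the '> filename' lines Vim records in ~/.viminfo; they look roughly like this (paths invented for illustration):
> ~/projects/app/config.yml
> ~/notes/todo.txt
With entries like these, typing v config would open the first file without needing the full path.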

Ah! I tracked the command down with which. Here is the magical script; I can't determine where I got it.
#!/usr/bin/env bash
# default editor and viminfo location, overridable via the environment
[ "$vim" ] || vim=vim
[ $viminfo ] || viminfo=~/.viminfo
usage="$(basename $0) [-a] [-l] [-[0-9]] [--debug] [--help] [regexes]"
# with no arguments, list the matches instead of opening one
[ $1 ] || list=1
fnd=()
for x; do case $x in
    -a) deleted=1;;
    -l) list=1;;
    -[1-9]) edit=${x:1}; shift;;
    --help) echo $usage; exit;;
    --debug) vim=echo;;
    --) shift; fnd+=("$@"); break;;
    *) fnd+=("$x");;
esac; shift; done
set -- "${fnd[@]}"
# if the argument is already a real file, just open it
[ -f "$1" ] && {
    $vim "$1"
    exit
}
# collect the '> filename' entries from the viminfo file that match every regex
while IFS=" " read line; do
    [ "${line:0:1}" = ">" ] || continue
    fl=${line:2}
    [ -f "${fl/\~/$HOME/}" -o "$deleted" ] || continue
    match=1
    for x; do
        [[ "$fl" =~ $x ]] || match=
    done
    [ "$match" ] || continue
    i=$((i+1))
    files[$i]="$fl"
done < "$viminfo"
# pick the requested entry, the single/most recent match, or prompt for a choice
if [ "$edit" ]; then
    resp=${files[$edit]}
elif [ "$i" = 1 -o "$list" = "" ]; then
    resp=${files[1]}
elif [ "$i" ]; then
    while [ $i -gt 0 ]; do
        echo -e "$i\t${files[$i]}"
        i=$((i-1))
    done
    read -p '> ' CHOICE
    resp=${files[$CHOICE]}
fi
[ "$resp" ] || exit
$vim "${resp/\~/$HOME}"

Related

How to fix '[: too many arguments' for jpg files [duplicate]

This question already has answers here: Meaning of "[: too many arguments" error from if [] (square brackets).
I am making a script that should play any media file in the Downloads folder and then shred it if I do not want to keep it. It works for the file types .swf, .webm, .gif and .png, but not for .jpg. For jpg I get this error:
'[: too many arguments'
If I change it to .png without changing anything else, it works. I have tried changing *jpg to *png, and that works; changing it back to *jpg breaks it again. I can't find anything on Google that helps me with this.
#!/bin/bash
cd Downloads
get_files () {
    for i in *.*; do
        [ -f "$i" ] || break
        echo "Playing '$i'"
        if [ "$i" == *swf ]; then
            ./flashplayer $i
            shred_file $i
        elif [ "$i" == *webm ]; then
            vlc $i
            shred_file $i
        elif [ "$i" == *gif ]; then
            xdg-open $i
            shred_file $i
        elif [ "$i" == *jpg ]; then
            xdg-open $i
            shred_file $i
        elif [ "$i" == *png ]; then
            xdg-open $i
            shred_file $i
        fi
    done
}
shred_file () {
    echo ""
    echo "Do you want to shred the file?"
    read r
    if [ "$r" == "y" ]; then
        shred -uvz $1
    else
        echo keep
    fi
}
get_files
I expect this script to be able to open .jpg files and any other file type defined in it; I do not expect this error to occur at all.
In Bash (and other POSIX shells), [ ... ] is not particularly magical; in fact, it can be implemented as a completely separate program, though in practice most shells do provide it as a builtin. (If you run type -a [, you'll most likely see that your system has both a builtin and a separate program.)
So the problem is that if your current directory contains files whose names end with jpg, such as foojpg and barjpg, then this command:
[ "$i" == *jpg ]
expands to something like this:
[ my.file == foojpg barjpg ]
which is obviously not what you want.
And even if you escaped the * to prevent the filename-expansion ([ "$i" == \*jpg ] or [ "$i" == '*jpg' ]), it still wouldn't do what you want, because [ ... ] doesn't support this sort of glob comparison.
Since you're specifically using Bash, your best bet is to use its special [[ ... ]] syntax, which is magical, and has special support for globs:
[[ "$i" == *jpg ]]
(And likewise for all of your other tests.)
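To see the difference concretely, here is a small illustrative test (file names invented for the example):
$ cd "$(mktemp -d)" && touch foo.jpg bar.jpg
$ i=my.jpg
$ [ "$i" == *jpg ]        # *jpg expands to foo.jpg bar.jpg before [ ever runs
bash: [: too many arguments
$ [[ "$i" == *jpg ]] && echo "glob matched"
glob matched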
Bash expands *jpg to all the file names in the directory that end in 'jpg', and likewise for the other suffixes. The jpg test causes the error most likely because more than one file matches *jpg (a single match would merely compare the wrong strings rather than error out).
Better to test the suffix itself:
suffix=${i#*.} # if $i is file.jpg yields 'jpg'
And then do tests for the suffix value:
if [ "$suffix" == "swf" ]; then
./flashplayer $i
shred_file $i
elif [ "$suffix" == "webm" ]; then
vlc $i
shred_file $i
...
...
Incidentally, for the for loop, you could (if appropriate) specify the file types the loop should process:
shopt -s nullglob
for i in *.{swf,gif,png,jpg,jpeg}; do
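Putting both suggestions together, a corrected version of the loop might look roughly like this (shred_file is the function from the question, and the player commands are unchanged):
#!/bin/bash
cd Downloads || exit 1
shopt -s nullglob                        # if nothing matches, the loop simply does not run
for i in *.{swf,webm,gif,png,jpg,jpeg}; do
    echo "Playing '$i'"
    if [[ "$i" == *.swf ]]; then
        ./flashplayer "$i"
    elif [[ "$i" == *.webm ]]; then
        vlc "$i"
    else
        xdg-open "$i"                    # gif, png, jpg, jpeg
    fi
    shred_file "$i"
done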

How to write an abbreviated version of the if statement for this code?

How do I write an abbreviated version of the if statement for this code? The task is to rewrite the same script as a one-liner (if/else).
Linux Red Hat 4-5
#!/bin/bash
for file in /etc/*; do
    if [ -f "$file" ]
    then
        echo "$file is a regular file"
    elif [ -d "$file" ]
    then
        echo "$file is a directory"
    else
        echo "$file is something else"
    fi
done
Not sure if this is better, but this will work:
([ -f "$file" ] && echo "regular") || ( [ -d "$file" ] && echo "dir" ) || echo "wtf"

How to compare two stat values of the same file?

I am trying to make a shell program that tells me when a file has been created, when it has been modified, and when it has been deleted. I think I can solve this, but my only issue is that I can't compare the stat values: it tells me that I have "too many arguments". Any help would be much appreciated :)
#!/bin/bash
run=yes
if [ -f $1 ]
then
    while [ run=yes ]
    do
        time1=$(stat -c %y $1)
        time2=$(stat -c %y $1)
        if [ ! $time2 ]
        then
            echo "The file "$1" has been deleted."
            run=no
        elif [ $time2 -gt $time1 ]
        then
            echo "The file "$1" has been modified."
            run=no
        fi
    done
else
    while [ run=yes ]
    do
        sleep 2
        if [ -f $1 ]
        then
            echo "The file "$1" has been created."
            run=no
        fi
    done
fi
The output of stat -c %y ... includes spaces, which is what the shell uses to separate arguments. When you then run:
if [ ! $time2 ]; then
This translates into something like:
if [ ! 2017-09-02 08:57:19.449051182 -0400 ]; then
Which is an error. The ! operator only expects a single argument. You could solve it with quotes:
if [ ! "$time2" ]; then
Or by using the bash-specific [[ ... ]] conditional:
if [[ ! $time2 ]]; then
(See the bash(1) man page for details on that second solution).
Separately, you're not going to be able to compare the times with -gt as in:
elif [ $time2 -gt $time1 ]
This (a) has the same problem as the earlier if statement, and (b) -gt can only be used to compare integers, not time strings.
If you were to use %Y instead of %y, you would get the time as an integer number of seconds since the epoch, which would solve all of the above problems.
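To illustrate the difference between the two formats (file name and output invented to match the timestamp above):
$ stat -c %y watched.txt
2017-09-02 08:57:19.449051182 -0400
$ stat -c %Y watched.txt
1504357039
The %Y value is a single integer (seconds since the epoch), so it survives word splitting and can be compared with -gt.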
The code is now working and I thought I would share the final result if anyone wanted to know.
#!/bin/bash
run=true
if [ -f $1 ]
then
    while [ "$run" = true ]
    do
        time1=$(stat -c %Y $1 2>/dev/null)
        sleep $2
        time2=$(stat -c %Y $1 2>/dev/null)
        if [ ! "$time2" ]
        then
            echo "The file "$1" has been deleted."
            run=false
        elif [ $time2 -gt $time1 ]
        then
            echo "The file "$1" has been modified."
            run=false
        fi
    done
else
    while [ "$run" = true ]
    do
        sleep 2
        if [ -f $1 ]
        then
            echo "The file "$1" has been created."
            run=false
        fi
    done
fi

Hashing a string using a passphrase in Bash

I need a way of hashing an existing filename with a passphrase (an ASCII string), but being able to revert it afterwards using the same passphrase.
I know that ciphers can do this, since they encrypt the string, but their output length depends on the filename length, which is exactly what I do not want: it sometimes doubles the filename length, and the output strings are not always compatible with the filesystem (e.g. "\n" in a filename).
To be clear, I did a lot of research and even wrote some scripts, but all of the solutions are either slow or don't work for my application at all.
The goal of this is to get constant-length filenames that can all be 'decrypted' at once using a single passphrase, without the need to create additional 'metadata-like' files.
I've come full circle on my initial question. There seems to be only one real solution to the problem above, and that is (as James suggested) format-preserving encryption. Although, as far as I can tell, there are no existing commands that do this.
So I did exactly what was my very first option: hashing the filename, putting the hash and the filename into a plain file (one file per directory), and encrypting that file with a passphrase.
I'll post my code here. It's probably not the prettiest nor the most portable code, but it does the job and is (IMO) really simple.
#!/usr/bin/env bash
man="Usage: namecrypt [ -h ] [ -e || -d ] [ -F ] [ -D ] [DIRECTORY] [PASSPHRASE]\n
-h, --help display this message
-e, --encrypt encrypt the specified directory
-d, --decrypt decrypt the specified directory
-F, --files include files
-D, --dir include directories
[DIRECTORY] relative or absolute path to a directory/symbolic link
[PASSPHRASE] optional - specify the user passphrase on command line";
options=();
for Arg in "$#"; do
if [ "$Arg" == "-h" ] || [ "$Arg" == "--help" ]; then
echo -e "$man"; exit;
elif [ "$Arg" == "-e" ] || [ "$Arg" == "--encrypt" ]; then
options[0]="E";
elif [ "$Arg" == "-d" ] || [ "$Arg" == "--decrypt" ]; then
options[0]="D";
elif [ "$Arg" == "-F" ] || [ "$Arg" == "--files" ]; then
options[1]="${options[1]}F";
elif [ "$Arg" == "-D" ] || [ "$Arg" == "--dir" ]; then
options[1]="${options[1]}D";
elif [ -d "$Arg" ]; then
options[2]="$(realpath "$Arg")";
else
options[3]="$Arg";
fi;
done;
if [ "${options[0]}" == "" ]; then echo "No Mode specified!"; exit 1; fi;
if [ "${options[1]}" == "" ]; then options[1]="F"; fi;
if [ "${options[2]}" == "" ]; then echo "No such directory!"; exit 2; fi;
if [ "${options[3]}" == "" ]; then echo "Enter a passphrase: "; read options[3]; fi;
shopt -s nullglob dotglob;
# rename an entry to the md5 hash of its name and record the mapping in the directory's .names file
function hashTarget
{
BASE="$(basename "$1")";
DIR="$(dirname "$1")/";
if [ -a "$1" ]; then
oldName="$BASE";
newName=$(echo "$oldName" | md5sum);
echo "$oldName||$newName" >> "$DIR.names";
mv "$1" "$DIR$newName";
else echo "Skipping '$1' - No such file or directory!";
fi;
}
# look the hashed name up in the (decrypted) .names file and restore the original name
function dehashTarget
{
BASE="$(basename "$1")";
DIR="$(dirname "$1")/";
[ -f "$DIR.names.cpt" ] && ccdecrypt -K "${options[3]}" "$DIR.names.cpt";
if [ -f "$DIR.names" ]; then
oldName="$BASE";
newName=$(grep "$oldName" "$DIR.names" | awk -F '|' '{print $1}');
[[ ! -z "${newName// }" ]] && mv "$1" "$DIR$newName";
else
echo "Skipping '$1' - Hash table not found!";
fi;
}
# recurse through the directory tree, processing entries according to the options and encrypting or removing each .names file
function mapTarget
{
DIR="$(dirname "$1")/";
for Dir in "$1"/*/; do
mapTarget "$Dir";
done;
for Item in "$1"/*; do
if ([ -f "$Item" ] && [[ "${options[1]}" == *"F"* ]]) ||
([ -d "$Item" ] && [[ "${options[1]}" == *"D"* ]]); then
if [ "${options[0]}" == "E" ]; then
hashTarget "$Item";
else
dehashTarget "$Item";
fi;
fi;
done;
[ "${options[0]}" == "D" ] && [ -f "$DIR.names" ] && rm "$DIR.names";
[ "${options[0]}" == "E" ] && [ -f "$DIR.names" ] && ccencrypt -K "${options[3]}" "$DIR.names";
}
mapTarget "${options[2]}";
Probably the only reason it is so long is that I didn't bother with any OOP, and I also added a lot of checks and steps to make sure that most of the time no names get mangled beyond recovery; user error can still cause that.
This is the command used to encrypt the hash-table files: CCrypt
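For reference, invoking the script follows the usage string above, something like this (directory name and passphrase are just examples):
# scramble both file and directory names under ./vault
./namecrypt -e -F -D ./vault "my secret passphrase"
# restore the original names later with the same passphrase
./namecrypt -d -F -D ./vault "my secret passphrase"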

Bash null binary operators

A recent test I took had a question on the output of the following bash command:
var=; [ -n $var ]; echo $?; [ -z $var ]; echo $?
The results are 0 and 0, indicating that both unary-operator tests returned success. This means $var is treated as both null (empty) and 'non-null' (not empty), correct?
How is this possible?
No, it means that [ is unfixably broken. In both cases $var evaluates to nothing, and the commands simply execute [ -n ] and [ -z ] respectively, both of which result in true. If you want to test the value in the variable itself then you must quote it to have it handled properly.
$ var=; [ -n "$var" ]; echo $?; [ -z "$var" ]; echo $?
1
0
You will need to surround $var with double quotes:
$ [ -n "$var" ]; echo $?
1
Remember that for [ the closing square bracket carries no meaning of its own: it only marks the end of the arguments and is stripped before the test is evaluated. That means your line:
$ [ -n $var ]; echo $?
will expand to (since $var is empty):
$ [ -n ]; echo $?
which, once the ] is stripped, is a test with a single argument. A one-argument test simply asks: "is that string non-empty?" The string '-n' is not empty, so the answer is yes.
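The number of arguments is really all [ cares about here; a quick illustration from an interactive shell:
$ [ ]; echo $?          # zero arguments: false
1
$ [ -n ]; echo $?       # one argument: true, because the string "-n" is non-empty
0
$ [ -n "" ]; echo $?    # two arguments: -n applied to an empty string, so false
1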
It's surprising indeed. If you try the same with the bashism [[ syntax, you get 1 and 0 as results, because [[ parses its contents itself and does not word-split $var away, so -n really is applied to an empty string:
var=; [[ -n $var ]]; echo $?; [[ -z $var ]]; echo $?
or, as Ignacio points out and as in fact I have always been doing intuitively, with defensive coding and quoting:
var=; [[ -n "$var" ]]; echo $?; [[ -z "$var" ]]; echo $?
It's surprising to me that [ behaves this way, because it's a builtin:
$ type [
[ is a shell builtin
I just did a little test and the external [ command behaves the same way as the builtin, so at least the two are consistent:
var=; /usr/bin/\[ -n $var ]; echo $?; /usr/bin/\[ -z $var ]; echo $?
