Resize a list of images from the command line - linux

I would like to resize a list of images, all in the same directory. To achieve that, I use convert from ImageMagick. I would like to resize
image1.jpg
image2.jpg
...
into
image1-resized.jpg
image2-resized.jpg
...
I was wondering if there is a way to achieve this in a single command line. An elegant solution would often be useful, not only in this case.
EDIT:
I would like a non-script-like solution, i.e., without a for loop.

If you want to resize them to 800x600:
for file in *.jpg; do convert -resize 800x600 -- "$file" "${file%%.jpg}-resized.jpg"; done
(works in bash)

ls *.jpg|sed -e 's/\..*//'|xargs -I X convert X.jpg whatever-options X-resized.jpg
You can eliminate the sed and be extension-generic if you're willing to accept a slightly different final filename, 'resized-image1.jpg' instead of 'image1-resized.jpg':
ls|xargs -I X convert X whatever-options resized-X

GNU Parallel is even easier than for loops, and it's often faster:
parallel convert -resize 800x600 -- "{}" "{.}-resized.jpg" ::: *.jpg
A few things going on here, from right to left:
::: *.jpg means run the command for every jpg file
{.} means insert the current filename without the suffix (.jpg)
{} means insert the current filename
parallel means run the following command many times in parallel. By default it runs as many jobs at once as your computer has CPU cores. As each job finishes it launches the next one until all the jpg files are converted.
This runs the command convert -resize 800x600 -- foo.jpg foo-resized.jpg for each file. The -- tells convert to stop processing flags, in case a file name happens to start with a -.
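If you want to see exactly what would be run before converting anything, GNU Parallel's --dry-run flag prints the generated commands instead of executing them:
parallel --dry-run convert -resize 800x600 -- "{}" "{.}-resized.jpg" ::: *.jpg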
P.S. On my mac I have Homebrew installed, so I was able to install parallel and convert with
brew install parallel
brew install imagemagick

If your image files have different extensions:
for f in *; do convert -resize 800x600 -- "$f" "${f%.*}-resized.${f##*.}"; done

Related

Is it possible to display a file's contents and delete that file in the same command?

I'm trying to display the output of an AWS Lambda function that is being captured in a temporary text file, and I want to remove that file as I display its contents. Right now I'm doing:
... && cat output.json && rm output.json
Is there a clever way to combine those last two commands into one command? My goal is to make the full combined command string as short as possible.
For cases where:
it is possible to control the name of the temporary text file, and
the file is not used by other code,
it is possible to pass "/dev/stdout" as the name of the output file.
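A rough sketch, assuming the temporary file is written by aws lambda invoke (which takes the output file name as an argument; the function name my-function here is made up):
aws lambda invoke --function-name my-function /dev/stdout
Keep in mind that the CLI also prints the invocation metadata to stdout, so the payload and the status JSON may end up interleaved.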
Regarding portability: see the Stack Exchange question about how portable /dev/stdout is.
POSIX 7 says they are extensions.
Base Definitions,
Section 2.1.1 Requirements:
The system may provide non-standard extensions. These are features not required by POSIX.1-2008 and may include, but are not limited to:
[...]
• Additional character special files with special properties (for example, /dev/stdin, /dev/stdout, and /dev/stderr)
Using the mandatorily supported /dev/tty would force the output to the "current" terminal, making it impossible to pipe the output of the whole command into a different program (or a log file), or to use the program when there is no connected terminal (cron jobs or other automation tools).
No, you cannot easily remove the lines of a file while displaying them. It would be highly inefficient, as it would require removing characters from the beginning of the file each time you read a line. Current filesystems are pretty good at truncating a file at its end, but not at its beginning.
A simple but extremely slow method would look like this:
while [ -s output.json ]
do
head -1 output.json
sed -i 1d output.json
done
While this algorithm is plain and simple, you should know that each time you remove the first line with sed -i 1d, it copies the whole content of the file except the first line into a temporary file, resulting in approximately 0.5*n² lines written in total (where n is the number of lines in your file).
In theory you could avoid this by doing something like this:
while [ -s output.json ]
do
line=$(head -1 output.json)
printf -- '%s\n' "$line"
fallocate -c -o 0 -l $((${#line}+1)) output.json
done
But this does not account for varying newline conventions (namely DOS-formatted newlines), and fallocate does not always work on XFS, among other issues.
Since you are trying to consume a file alongside its creation without leaving a trace of its existence on disk, you are essentially asking for a pipe functionality. In my opinion you should look into how your output.json file is produced and hopefully you can pipe it to a script of your own.
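For instance, a minimal sketch using a named pipe (FIFO) in place of the temporary file, with produce_output standing in for whatever currently writes output.json:
mkfifo output.json
produce_output > output.json &   # hypothetical producer writing into the FIFO
cat output.json                  # reads the data as it arrives; nothing persists on disk
rm output.json                   # removes the FIFO itself afterwards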

Convert a bunch of images from svg to png

I need to convert from svg to png all the images in a folder. Let's say the images are called test1.svg, test2.svg, ... , testn.svg. Using the following script:
for i in *.svg
do
convert "$i" PNG24:"$i".png
done
does the job correctly, and the images are called test1.svg.png, test2.svg.png, ... , testn.svg.png. My questions are:
1) Is it possible to make the output images be called test1.png, test2.png, ... , testn.png, essentially removing the 'svg' part from the name?
2) Is it possible to send them directly into some other directory?
Thanks!
Yes. You can make another directory and send them there like this:
mkdir other
for i in *.svg; do
convert "$i" PNG24:other/"${i%svg}png"
done
If you have lots of images to do, and you are on macOS or Linux, I would recommend GNU Parallel to get the job done faster:
mkdir other
parallel convert {} PNG24:other/{.}.png ::: *.svg

Regarding comparison of 2 image sequences in Linux/Ubuntu

I have images in 2 different folders, 100 images in each. The images are photographs taken from 2 different simulations; the 100 images are the 100 time steps of each simulation. I wish to compare the images frame by frame. Can they be displayed on the screen with some software, such that I just need to press the arrow keys (up/down) and the images from the 2 sequences will BOTH move forward/backward by one step, so that I can compare the 2 sequences frame by frame simultaneously? I do not wish to mathematically subtract the images, just compare them visually by eye.
Windows, I have come to know, has AviSynth and Pdplayer for this. AVXSynth is the Linux version of AviSynth, but it is unstable on my computer.
This is the only question I found before posting this, and it is off-topic:
How to list an image sequence in an efficient way? Numerical sequence comparison in Python
Can anyone please suggest any other option?
Have you considered using ImageMagick, specifically montage, and a simple shell script to create a new set of images? (Where each new image consists of your two previous images glued (montaged) together side by side.)
ImageMagick will also support subtraction of images, which can be montaged into the new set too, should you change your mind about that.
Or the creation of animations where the images oscillate between your two runs so you can compare them more easily. (i.e. create a 3rd folder with 100 new images each of which is an animation alternating between the two runs.)
You may want to consider generating a little HTML with a shell script, and putting each image or set of images into its own web page along with forward and backward buttons. It's pretty trivial, and gives you a nice little web-browser slideshow. You can pick up the necessary HTML in a couple of minutes; you wouldn't need much more than the A HREF and IMG tags. Web browsers support local file:// URLs. (Again, a 3rd directory with 100 HTML files linking to images in the first 2 directories.)
Or you could generate one big webpage with all the images in it, and just scroll up and down...
I am new to shell scripting; can you please give me a shell script? I could not figure out how to write it, but I could make use of montage. Suppose the 2 directories are dir1 and dir2, and each of them has five files file_001, file_002, file_003, file_004, file_005. Can you please post the shell script?
Sure. I happen to like TCSH (or CSH) for this, just for the :t option...
Note: montage output filename needs an extension to tell montage what the output graphics filetype is... (e.g. .jpeg or .gif or whatever...)
% mkdir dir3
% ls -a *
dir1:
./ ../ file_001 file_002 file_003 file_004 file_005
dir2:
./ ../ file_001 file_002 file_003 file_004 file_005
dir3:
./ ../
% foreach VAR ( dir1/file* )
montage -background '#000000' -geometry +4+4 $VAR dir2/$VAR:t dir3/out_$VAR:t.jpeg
end
% ls d*
dir1:
./ ../ file_001 file_002 file_003 file_004 file_005
dir2:
./ ../ file_001 file_002 file_003 file_004 file_005
dir3:
./ out_file_001.jpeg out_file_003.jpeg out_file_005.jpeg
../ out_file_002.jpeg out_file_004.jpeg
Nothing to it... For HTML you could just echo text into a file...
% set Q = '"'
% mkdir dirfoo
% foreach VAR ( dir1/file* )
echo "<html><head></head><body><img src=${Q}../$VAR${Q}></img></body></html>" >> dirfoo/$VAR:t.html
end
That sort of thing...
Perhaps:
% foreach VAR ( dir1/file* )
echo "<html><head></head><body><table><tr><td><img src=${Q}../$VAR${Q}></img></td><td><img src=${Q}../dir2/$VAR:t${Q}></img></td></tr></table></body></html>" >> dirfoo/$VAR:t.html
end

Combining all the images in folder into a video

I have a script that takes tons of pictures and names them with a time-stamp. These images are all put into one folder. I want to create a script that takes all the pictures in the folder, combines them into a 10fps video, saves this video named with the date and time from when it started to when it ended, and deletes the original pictures. So far, I've seen some people use FFmpeg or MEncoder, but I'm not sure how to use them or do what I want with them. Any help is appreciated! Thanks.
You can use the FFmpeg command-line interface. You invoke it from the shell. Download the binary and run it by pointing it at the desired directory. %05d is simply string formatting for numbers: it says pad the frame number with leading zeros to five digits, e.g. 00001.jpg.
# Create a directory and copy the original images there for manipulation:
mkdir temp
cp *.JPG temp/.
# Resize the images:
mogrify -resize 200x200 temp/*.JPG
# Create the morph images
convert temp/*.JPG -delay 10 -morph 5 temp/%05d.jpg
# Stitch them together into a video
ffmpeg -r 50 -qscale 2 -i temp/%05d.jpg output.mp4
from http://www.itforeveryone.co.uk/image-to-video.html
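Closer to the original question (timestamp-named pictures, 10 fps output), here is a rough sketch assuming an ffmpeg build with glob support and file names whose timestamps sort in chronological order:
first=$(ls *.jpg | head -n 1); last=$(ls *.jpg | tail -n 1)
ffmpeg -framerate 10 -pattern_type glob -i '*.jpg' -c:v libx264 -pix_fmt yuv420p "${first%.jpg}_to_${last%.jpg}.mp4"
rm *.jpg   # only after confirming the video plays back correctly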

An efficient way to detect corrupted png files?

I've written a program to process a bunch of png files that are generated by a separate process. The capture mostly works, however there are times when the process dies and is restarted, which leaves a corrupted image. I have no way to detect when the process dies or which file it dies on (there are ~3000 png files).
Is there a good way to check for a corrupted png file?
I know this is a question from 2010, but I think this is a better solution: pngcheck.
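For example (the -q flag keeps pngcheck quiet unless it finds a problem, and the find variant sweeps a whole tree):
pngcheck -q *.png
find . -name '*.png' -exec pngcheck -q {} +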
Since you're on a Linux system you probably already have Python installed.
An easy way would be to try loading and verifying the files with PIL (Python Imaging Library) (you'd need to install that first).
from PIL import Image
v_image = Image.open(file)   # 'file' is the path of the image being checked
v_image.verify()             # raises an exception if the image is broken
(taken verbatim from my own answer in this thread)
A different possible solution would be to slightly change how your processor processes the files: Have it always create a file named temp.png (for example), and then rename it to the "correct" name once it's done. That way, you know if there is a file named temp.png around, then the process got interrupted, whereas if there is no such file, then everything is good.
(A variant naming scheme would be to do what Firefox's downloader does -- append .partial to the real filename to get the temporary name.)
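A minimal sketch of the rename-when-done scheme, with generate_frame standing in for whatever actually produces the image (assumed here to write the PNG to stdout):
generate_frame > temp.png && mv temp.png "frame_$(date +%s).png"
# if temp.png is still lying around afterwards, the producer was interrupted mid-write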
Kind of a hack, but it works. If you are running on Linux or something like it, you might have the "convert" command:
$ convert --help
Version: ImageMagick 5.5.6 04/01/03 Q16 http://www.imagemagick.org
Copyright: Copyright (C) 2003 ImageMagick Studio LLC
Usage: convert [options ...] file [ [options ...] file ...] [options ...] file
If you make an invalid png, and then try to convert, you'll get an error:
$ date> foo.png
$ convert foo.png foo.gif
convert: NotAPNGImageFile (foo.png).
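To sweep a whole directory with this hack, you can rely on convert exiting non-zero when it cannot read a file (null: is ImageMagick's pseudo-format for discarding the output):
for f in *.png; do
convert "$f" null: 2>/dev/null || echo "corrupt: $f"
done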
Find all non-PNG files:
find . -type f -print0 | xargs -0 file --mime | grep -vF image/png
Find all corrupted PNG files:
find . -type f -print0 | xargs -0 -P0 sh -c 'magick identify +ping "$@" > /dev/null' sh
The file command only checks the magic number. Having the PNG magic number doesn't mean the file is a well-formed PNG.
magick identify is a tool from ImageMagick. By default, it only checks the headers of the file for better performance. Here we use +ping to disable that behaviour and make identify read the whole file.
