Linux: Delay between command executions? / Reading every file in directory? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 9 years ago.
I have a directory on a Linux machine that has hundreds of files. They are relatively short files and can easily be displayed and read at the command line in a second or two. Is there a way to print all of the files to the command line in an automated way, similar to typing "cat *", but with a one- or two-second delay between each print so that I can read each of the files?
I have tried making a bash script with:
cat $1
sleep 2
And then calling "bash script.sh *" but it only prints one of the files and then stops.
Any suggestions?

Use a loop in the script:
#!/usr/bin/env bash
for f
do
cat "$f"
sleep 2
done

while [ -e "$1" ]
do
cat "$1"
shift 1
sleep 2
done
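Either loop can be driven by a glob at the call site: the shell expands * into individual arguments before the script ever runs, so the loop sees one filename per iteration. A self-contained sketch of the invocation (file and script names are made up for the demo):

```shell
#!/usr/bin/env bash
# Work in a scratch directory so nothing in the real tree is touched.
cd "$(mktemp -d)" || exit 1

# Two small sample files (hypothetical names).
printf 'first\n'  > a.txt
printf 'second\n' > b.txt

# Save the for-loop from the answer above as a script.
cat > show-slowly.sh <<'EOF'
#!/usr/bin/env bash
for f
do
    cat "$f"
    sleep 2
done
EOF

# The shell expands *.txt into separate positional arguments
# before show-slowly.sh starts, so each file is printed in turn.
bash show-slowly.sh *.txt
```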

for i in $(find /path/to/files -type f -name \*.\*); do
    cat "$i"
    sleep 2
done
This should work fine:
find /path/to/files -type f
If there is a specific extension you wish to focus on, for example, then try something like this:
find /path/to/files -type f -name \*.cfg
And rather than looping, you could do it all within find:
find /path/to/files -type f -name \*.cfg -print -exec cat {} \; -exec sleep 2 \;
Edited to add: Kevin is not off his head. I had commented that cat * might not work in a large folder containing lots of files, and he responded to my comment, which I had removed whilst he was writing his comment :)

In your shell prompt (provided it's sh-compatible - most likely):
for i in *; do
cat "$i"
sleep 2
done
perhaps? This loops over all the files (*) and cats each one out. You don't really need a separate shell script for something so simple.

Related

Bash Script: Check if multiple files are executable and output them? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 months ago.
Can you make a script that iterates through a folder (for example the home directory) and prints the names of all regular files that are executable?
Are you aware that the find command has an -executable switch?
If you want to see all executable files in all subdirectories, you might do:
find ./ -type f -executable
If you want to see all executable files, but just in your directory, you might do:
find ./ -maxdepth 1 -type f -executable
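To see those switches in action, here is a small self-contained sketch (the directory and file names are made up; -executable is a GNU find test, so this assumes GNU findutils):

```shell
#!/usr/bin/env bash
# Scratch directory with one executable and one plain regular file.
dir=$(mktemp -d)
touch "$dir/notes.txt" "$dir/tool.sh"
chmod +x "$dir/tool.sh"

# -type f keeps directories out of the results (directories are
# "executable" too); -maxdepth 1 restricts the search to this level only.
find "$dir" -maxdepth 1 -type f -executable
```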
I can.
#!/bin/bash
for d in "$@"
do
    [[ -d "$d" ]] || { printf '\n"%s" not a directory\n' "$d"; continue; }
    for f in "$d"/* "$d"/.*; do
        [[ -f "$f" && -x "$f" ]] && ls -l "$f"
    done
done
But use find as Dominique advised.
Why reinvent the wheel?
Still, there's a lot going on there that could be useful.
Let me know if you have questions.

Create file in all subdirectories [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I'm trying to create a file called 1 in all subdirectories.
E.g. the main directory is abc, and the sub-directories are abc/xyz and abc/ghi/123.
Each created file must contain its own full path and file name.
You can do something along the lines of
for d in $(find abc -type d); do
    touch "$d"/1
done
(Note that the unquoted $(find ...) undergoes word splitting, so this breaks on directory names containing spaces.)
Use:
find abc -type d -print0 | while IFS= read -r -d '' dir; do
readlink -f "$dir"/1 > "$dir"/1
done
It finds all directories recursively in the directory named abc.
For each directory found, it reads the name into the "dir" variable.
readlink -f prints the full path of "$dir"/1.
> "$dir"/1 writes that path to the file "$dir"/1, truncating it before writing and creating it if it does not exist.
The -print0 and -d '' are for handling spaces and newlines in directory names.
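To see why the null-delimited form matters, here is a self-contained sketch using a made-up directory name that contains a space; a plain for $(find ...) loop would split it apart, while this pipeline keeps it intact:

```shell
#!/usr/bin/env bash
# Scratch tree with an awkward directory name (invented for the demo).
root=$(mktemp -d)
mkdir -p "$root/abc/sub dir"

# Null-delimited find/read survives the space in "sub dir".
find "$root/abc" -type d -print0 | while IFS= read -r -d '' dir; do
    readlink -f "$dir"/1 > "$dir"/1
done

# Each created file now contains its own absolute path.
cat "$root/abc/sub dir/1"
```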
And a version using xargs:
find abc -type d -print0 | xargs -0 -n1 -- sh -c 'readlink -f "$1"/1 > "$1"/1' --
But we can also use find's -exec, which should probably be the fastest:
find abc -type d -exec sh -c 'readlink -f "$1"/1 > "$1"/1' -- {} \;

finding files and moving their folders [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
I have a huge number of text files, organized in a big folder tree, on Debian Linux. What I need is to find all text files having a specific name pattern and then move the containing folder to a destination.
Example:
/home/spenx/src/a12/a1a22.txt
/home/spenx/src/a12/a1a51.txt
/home/spenx/src/a12/a1b61.txt
/home/spenx/src/a12/a1x71.txt
/home/spenx/src/a167/a1a22.txt
/home/spenx/src/a167/a1a51.txt
/home/spenx/src/a167/a1b61.txt
/home/spenx/src/a167/a1x71.txt
The commands:
find /home/spenx/src -name "a1a2*txt"
mv /home/spenx/src/a12 /home/spenx/dst
mv /home/spenx/src/a167 /home/spenx/dst
The result:
/home/spenx/dst/a12/a1a22.txt
/home/spenx/dst/a167/a1a22.txt
Thank you for your help.
SK
A combination of find, dirname and mv, along with xargs, should solve your problem:
find /home/spenx/src -name "a1a2*txt" | xargs -n 1 dirname | xargs -I list mv list /home/spenx/dst/
find will fetch the list of matching files.
dirname will extract the directory part of each file name; note that it can only take one argument at a time, hence -n 1.
mv will move the source directories to the destination.
xargs is the key: it allows the output of one command to be passed as arguments to the next command.
For details of the options used with xargs, refer to its man page, or just run man xargs in a terminal.
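The same chain can be made robust against spaces in path names by keeping every stage null-delimited. A hedged sketch, assuming GNU findutils and coreutils (the dirname -z and sort -z flags are GNU extensions); the scratch tree below mirrors the question's layout with made-up paths:

```shell
#!/usr/bin/env bash
# Scratch copy of the question's layout.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/a12" "$src/a167"
touch "$src/a12/a1a22.txt" "$src/a12/a1x71.txt" "$src/a167/a1a22.txt"

# find -> dirname -> dedupe -> mv, null-delimited end to end.
find "$src" -name "a1a2*txt" -print0 \
    | xargs -0 -n1 dirname -z \
    | sort -zu \
    | xargs -0 -I{} mv {} "$dst"/
```

Deduplicating with sort -zu matters because two matches in the same directory would otherwise make mv run twice on the same folder.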
You can execute:
find /home/spenx/src -name "a1a2*txt" -exec mv {} /home/spenx/dst \;
(Note that this moves the matching files themselves, not their containing folders.)
Source: http://www.cyberciti.biz/tips/howto-linux-unix-find-move-all-mp3-file.html
Create this mv.sh script in the current directory that will contain this:
o=$1
d=$(dirname "$o")
mkdir "/home/spenx/dst/$d" 2>/dev/null
mv "$o" "/home/spenx/dst/$d"
Make sure it is executable by this command:
chmod +x mv.sh
Next call this command:
find /home/spenx/src -name "a1a2*txt" -exec ./mv.sh {} \;
find /home/spenx/src -name "a1a2*txt" -exec mv "{}" yourdest_folder \;
There are probably multiple ways to do this, but, since it seems you might have multiple matches in a single directory, I would probably do something along this line:
find /home/spenx/src -name "a1a2*txt" -print0 | xargs -0 -n 1 dirname | sort -u |
while IFS= read -r d
do
    mv "${d}" /home/spenx/dst
done
It's kind of long, but the steps are:
Find the list of all matching files (the find part), using -print0 to compensate for any names that have spaces or other odd characters in them
extract the directory part of each file name (the xargs ... dirname part)
sort and uniquify the list to get rid of duplicates
Feed the resulting list into a loop that moves each directory in turn

find a pattern in files and rename them [closed]

Closed. This question is not reproducible or was caused by typos. It is not currently accepting answers.
Closed 5 years ago.
I use this command to find files with a given pattern and then rename them to something else
find . -name '*-GHBAG-*' -exec bash -c 'echo mv $0 ${0/GHBAG/stream-agg}' {} \;
As I run this command, I see some outputs like this
mv ./report-GHBAG-1B ./report-stream-agg-1B
mv ./report-GHBAG-0.5B ./report-stream-agg-0.5B
However at the end, when I run ls, I see the old file names.
You are echo'ing your 'mv' command, not actually executing it. Change to:
find . -name '*-GHBAG-*' -exec bash -c 'mv "$0" "${0/GHBAG/stream-agg}"' {} \;
I would suggest using the rename command to perform this task. rename renames the filenames supplied according to the rule specified as a Perl regular expression.
In this case, you could use:
rename 's/GHBAG/stream-agg/' *-GHBAG-*
In reply to anumi's comment, you could in effect search recursively down directories by matching '**':
rename 's/GHBAG/stream-agg/' **/*-GHBAG-*
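Note that in bash the ** pattern only recurses once the globstar shell option is enabled; without it, ** behaves like a single *. And since the Perl rename may not be installed everywhere, here is a hedged sketch of the same recursive rename using only bash built-ins (the scratch tree and file names are invented):

```shell
#!/usr/bin/env bash
shopt -s globstar   # let ** match across directory levels

# Scratch tree with files matching the question's pattern.
root=$(mktemp -d)
mkdir -p "$root/deep/nested"
touch "$root/report-GHBAG-1B" "$root/deep/nested/report-GHBAG-0.5B"

cd "$root" || exit 1
for f in **/*-GHBAG-*; do
    [ -e "$f" ] || continue            # skip the literal pattern if nothing matched
    mv "$f" "${f/GHBAG/stream-agg}"    # same substitution the rename rule applies
done
```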
This works for my needs, replacing all matching files or file types. Be warned, this is a very greedy search
# bashrc
function file_replace() {
    for file in $(find . -type f -name "$1*"); do
        mv "$file" "$(echo "$file" | sed "s/$1/$2/")"
    done
}
I will usually run with find . -type f -name "MYSTRING*" in advance to check the matches out before replacing.
For example:
file_replace "Slider.js" "RangeSlider.ts"
renamed: packages/react-ui-core/src/Form/Slider.js -> packages/react-ui-core/src/Form/RangeSlider.ts
renamed: stories/examples/Slider.js -> stories/examples/RangeSlider.ts
or ditch the filetype to make it even greedier
file_replace Slider RangeSlider
renamed: packages/react-ui-core/src/Form/Slider.js -> packages/react-ui-core/src/Form/RangeSlider.js
renamed: stories/examples/Slider.js -> stories/examples/RangeSlider.js
renamed: stories/theme/Slider.css -> stories/theme/RangeSlider.css

Change filenames to lowercase in Ubuntu in all subdirectories [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I know it's been asked, but what I've found has not worked out so far.
The closest I came is this: rename -n 'y/A-Z/a-z/' *
which works for the current directory. I'm not too good with the Linux terminal, so what should I add to this command to apply it to all of the files in all the sub-directories below where I am? Thanks!
Here's one way using find and tr:
for i in $(find . -type f -name "*[A-Z]*"); do mv "$i" "$(echo "$i" | tr A-Z a-z)"; done
Edit: added -name "*[A-Z]*".
This ensures that only files with capital letters are found. For example, if files with only lowercase letters were matched and moved onto themselves, mv would display an "are the same file" error.
Perl has a locale-aware lc() function which might work better:
find . -type f | perl -n -e 'chomp; system("mv", $_, lc($_))'
Note that this script handles whitespace in filenames, but not newlines. And there's no protection against collisions: if you have "ASDF.txt" and "asdf.txt", one is going to get clobbered.
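The clobbering risk is easy to guard against: skip any move whose lowercased target already exists. A hedged sketch (the sample file names are invented; tr's character classes keep the mapping locale-friendly):

```shell
#!/usr/bin/env bash
# Scratch directory with one collision pair and one safe rename.
root=$(mktemp -d)
touch "$root/ASDF.txt" "$root/asdf.txt" "$root/README.md"

find "$root" -type f -name "*[A-Z]*" | while IFS= read -r f; do
    # Lowercase only the file name, not the directory part of the path.
    lower=$(dirname "$f")/$(basename "$f" | tr '[:upper:]' '[:lower:]')
    if [ -e "$lower" ]; then
        echo "skipping $f: $lower already exists" >&2
    else
        mv "$f" "$lower"
    fi
done
```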
