Replacement on xargs variable returns empty string

I need to search for XML files inside a directory tree and create links to them in another directory (staging_ojs_pootle), naming these links after the file path (replacing slashes with dots).
The bash command below is not working; I got stuck on the replacement part. It seems the variable from xargs, named 'file', is not accessible inside the replacement expression (${file/\//.}):
find directory/ -name '*.xml' | xargs -I 'file' echo "ln" file staging_ojs_pootle/${file/\//.}
The replacement inside ${} gives me an empty string.
I also tried sed, but my regular expressions ended up replacing either every slash on the line or just the last one :/
find directory/ -name '*.xml' | xargs -I 'file' echo "ln" file staging_ojs_pootle/file |sed -e '/^ln/s/\(staging_ojs_pootle.*\)[\/]\(.*\)/\1.\2/g'
regards

Try this:
$ find directory/ -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 staging_ojs_pootle/\1|e'
For example:
$ mkdir -p /tmp/test
$ touch /tmp/test/{1,2,3,4}.xml
# use /tmp/test as staging_ojs_pootle
$ find /tmp/test -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 /tmp/test/\1|e'
$ ls -al /tmp/test
total 8
drwxr-xr-x. 2 root root 4096 Jun 15 13:09 .
drwxrwxrwt. 9 root root 4096 Jun 15 11:45 ..
-rw-r--r--. 2 root root 0 Jun 15 11:45 1.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 2.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 3.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 4.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.1.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.2.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.3.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.4.xml
# if we do NOT use the e modifier of the s command, we can see the final commands
$ find /tmp/test -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 /tmp/test/\1|'
ln /tmp/test/1.xml /tmp/test/.tmp.test.1.xml
ln /tmp/test/2.xml /tmp/test/.tmp.test.2.xml
ln /tmp/test/3.xml /tmp/test/.tmp.test.3.xml
ln /tmp/test/4.xml /tmp/test/.tmp.test.4.xml
Explanation:
For each xml file, use h to keep the original filename in the hold space.
Then use s|/|.|g to change every / to . in the filename.
Use G to append the hold space to the pattern space, so the pattern space becomes CHANGED_FILENAME\nORIGINAL_FILENAME.
Use s|([^\n]+)\n(.+)|ln \2 staging_ojs_pootle/\1|e to assemble the command from CHANGED_FILENAME and ORIGINAL_FILENAME; the e modifier of the s command then executes the assembled command, which does the actual work.
Hope this helps!

If you can be sure that the names of your XML files do not contain any word-splitting characters, you can use something like:
find directory -name "*.xml" | sed 'p;s/\//./g' | xargs -n2 echo ln
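For completeness: the reason the original attempt prints an empty string is that the shell expands ${file/\//.} before xargs ever runs, and at that point no shell variable named file is set, so the expansion is empty; the xargs -I placeholder only does literal text substitution on the arguments it receives. A minimal sketch of the same idea done entirely in the shell, assuming bash and GNU find (${f//\//.} replaces every slash, not just the first):
find directory/ -name '*.xml' -print0 |
while IFS= read -r -d '' f; do
    # turn every / in the path into a . to build the link name
    ln "$f" "staging_ojs_pootle/${f//\//.}"
done
Prefix the ln with echo on a first run to check the generated names before creating any links.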

Related

How to get previous date files and pass ls output to an array in gawk

I have log files generated like below, and I need to run a daily script which will list them and then do 2 things:
1- get the previous day's / yesterday's files and transfer them to the x server
2- get files older than one day and transfer them to the y server
The files are like below, and I am trying the code below, but it is not working.
How can we pass ls -altr output to gawk? Can we build an associative array like below?
array[index]=ls -altr | awk '{print $6,$7,$8}'
Code I am trying, to retrieve the previous date's files, but it is not working:
previous_dates=$(date -d "-1 days" '+-%d')
ls -altr |gawk '{if ( $7!=previous_dates ) print $9 }'
-r-------- 1 root root 6291563 Jun 22 14:45 audit.log.4
-r-------- 1 root root 6291619 Jun 24 09:11 audit.log.3
drwxr-xr-x. 14 root root 4096 Jun 26 03:47 ..
-r-------- 1 root root 6291462 Jun 26 04:15 audit.log.2
-r-------- 1 root root 6291513 Jun 27 23:05 audit.log.1
drwxr-x---. 2 root root 4096 Jun 27 23:05 .
-rw------- 1 root root 5843020 Jun 29 14:57 audit.log
To select files modified yesterday, you could use
find . -daystart -type f -mtime 1
and to select older files, you could use
find . -daystart -type f -mtime +1
possibly adding a -name test to select only files like audit.log*, for example. You could then use xargs to process the files, e.g.
find . -daystart -type f -mtime 1 | xargs -n 1 -I{} scp {} user@server
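Putting it together for the audit logs, a sketch (the log directory, user name and host names here are placeholders, and GNU find is assumed):
# yesterday's files go to the x server
find /var/log/audit -daystart -type f -mtime 1 -name 'audit.log*' -exec scp {} user@x-server:/some/target/dir/ \;
# files older than one day go to the y server
find /var/log/audit -daystart -type f -mtime +1 -name 'audit.log*' -exec scp {} user@y-server:/some/target/dir/ \;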

How to find files modified in last x minutes (find -mmin does not work as expected)

I'm trying to find files modified in last x minutes, for example in the last hour. Many forums and tutorials on the net suggest to use the find command with the -mmin option, like this:
find . -mmin -60 |xargs ls -l
However, this command did not work for me as expected. As you can see from the following listing, it also shows files modified earlier than 1 hour ago:
-rw------- 1 user user 9065 Oct 28 23:13 1446070435.V902I67a5567M283852.harvester
-rw------- 1 user user 1331 Oct 29 01:10 1446077402.V902I67a5b34M538793.harvester
-rw------- 1 user user 1615 Oct 29 01:36 1446078983.V902I67a5b35M267251.harvester
-rw------- 1 user user 72365 Oct 29 02:27 1446082022.V902I67a5b36M873811.harvester
-rw------- 1 user user 69102 Oct 29 02:27 1446082024.V902I67a5b37M142247.harvester
-rw------- 1 user user 2611 Oct 29 02:34 1446082482.V902I67a5b38M258101.harvester
-rw------- 1 user user 2612 Oct 29 02:34 1446082485.V902I67a5b39M607107.harvester
-rw------- 1 user user 2600 Oct 29 02:34 1446082488.V902I67a5b3aM465574.harvester
-rw------- 1 user user 10779 Oct 29 03:27 1446085622.V902I67a5b3bM110329.harvester
-rw------- 1 user user 5836 Oct 29 03:27 1446085623.V902I67a5b3cM254104.harvester
-rw------- 1 user user 8970 Oct 29 04:27 1446089232.V902I67a5b3dM936339.harvester
-rw------- 1 user user 165393 Oct 29 06:10 1446095400.V902I67a5b3eM290158.harvester
-rw------- 1 user user 105054 Oct 29 06:10 1446095430.V902I67a5b3fM265065.harvester
-rw------- 1 user user 1615 Oct 29 06:24 1446096244.V902I67a5b40M55701.harvester
-rw------- 1 user user 1620 Oct 29 06:24 1446096292.V902I67a5b41M337769.harvester
-rw------- 1 user user 10436 Oct 29 06:36 1446096973.V902I67a5b42M707215.harvester
-rw------- 1 user user 7150 Oct 29 06:36 1446097019.V902I67a5b43M415731.harvester
-rw------- 1 user user 4357 Oct 29 06:39 1446097194.V902I67a5b56M446687.harvester
-rw------- 1 user user 4283 Oct 29 06:39 1446097195.V902I67a5b57M957052.harvester
-rw------- 1 user user 4393 Oct 29 06:39 1446097197.V902I67a5b58M774506.harvester
-rw------- 1 user user 4264 Oct 29 06:39 1446097198.V902I67a5b59M532213.harvester
-rw------- 1 user user 4272 Oct 29 06:40 1446097201.V902I67a5b5aM534679.harvester
-rw------- 1 user user 4274 Oct 29 06:40 1446097228.V902I67a5b5dM363553.harvester
-rw------- 1 user user 20905 Oct 29 06:44 1446097455.V902I67a5b5eM918314.harvester
Actually, it just listed all files in the current directory. We can take one of these files as an example and check if its modification time is really as displayed by the ls command:
stat 1446070435.V902I67a5567M283852.harvester
File: ‘1446070435.V902I67a5567M283852.harvester’
Size: 9065 Blocks: 24 IO Block: 4096 regular file
Device: 902h/2306d Inode: 108680551 Links: 1
Access: (0600/-rw-------) Uid: ( 1001/ user) Gid: ( 1027/ user)
Access: 2015-10-28 23:13:55.281515368 +0100
Modify: 2015-10-28 23:13:55.281515368 +0100
Change: 2015-10-28 23:13:55.313515539 +0100
As we can see, this file was definitely last modified more than 1 hour ago! I also tried find -mmin 60 and find -mmin +60, but those did not work either.
Why is this happening and how to use the find command correctly?
I can reproduce your problem if there are no files in the directory that were modified in the last hour. In that case, find . -mmin -60 returns nothing. The command find . -mmin -60 |xargs ls -l, however, returns every file in the directory, which is consistent with what happens when ls -l is run without an argument.
To make sure that ls -l is only run when a file is found, try:
find . -mmin -60 -type f -exec ls -l {} +
The problem is that
find . -mmin -60
outputs:
.
./file1
./file2
Note the line with one dot?
That makes ls list the whole directory exactly the same as when ls -l . is executed.
One solution is to list only files (not directories):
find . -mmin -60 -type f | xargs ls -l
But it is better to use directly the option -exec of find:
find . -mmin -60 -type f -exec ls -l {} \;
Or just:
find . -mmin -60 -type f -ls
Which, by the way, is safe even when including directories:
find . -mmin -60 -ls
To search for files in /target_directory and all its sub-directories, that have been modified in the last 60 minutes:
$ find /target_directory -type f -mmin -60
To find the most recently modified files, sorted in the reverse order of update time (i.e., the most recently updated files first):
$ find /etc -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort -r
Manual of find:
Numeric arguments can be specified as
+n for greater than n,
-n for less than n,
n for exactly n.
-amin n
File was last accessed n minutes ago.
-anewer file
File was last accessed more recently than file was modified. If file is a symbolic link and the -H option or the -L option is in effect, the access time of the file it points to is always
used.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to
have been accessed at least two days ago.
-cmin n
File's status was last changed n minutes ago.
-cnewer file
File's status was last changed more recently than file was modified. If file is a symbolic link and the -H option or the -L option is in effect, the status-change time of the file it points
to is always used.
-ctime n
File's status was last changed n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file status change times.
Example:
find /dir -cmin -60 # status change time
find /dir -mmin -60 # modification time
find /dir -amin -60 # access time
I am working through the same need and I believe your timeframe is incorrect.
Try these:
15min change: find . -mtime -.01
1hr change: find . -mtime -.04
12 hr change: find . -mtime -.5
You should be using 24 hours as your base. The number after -mtime should be relative to 24 hours. Thus -.5 is the equivalent of 12 hours, because 12 hours is half of 24 hours.
Actually, there's more than one issue here. The main one is that xargs by default executes the command you specified, even when no arguments have been passed. To change that you might use a GNU extension to xargs:
--no-run-if-empty
-r
If the standard input does not contain any nonblanks, do not run the command. Normally, the command is run once even if there is no input. This option is a GNU extension.
Simple example:
find . -mmin -60 | xargs -r ls -l
But this might also match subdirectories, including . (the current directory), and ls will list the contents of each of them. So the output will be a mess. Solution: pass -d to ls, which prevents listing the directory contents:
find . -mmin -60 | xargs -r ls -ld
Now you don't like . (the current directory) in your list? Solution: exclude the first directory level (0) from find output:
find . -mindepth 1 -mmin -60 | xargs -r ls -ld
Now you'd need only the files in your list? Solution: exclude the directories:
find . -type f -mmin -60 | xargs -r ls -l
Now you have some files with names containing white space, quote marks, or backslashes? Solution: use null-terminated output (find) and input (xargs) (these are also GNU extensions, afaik):
find . -type f -mmin -60 -print0 | xargs -r0 ls -l
This may work for you. I used it for cleaning folders during deployments, to delete old deployment files.
clean_anyfolder() {
    local temp2="$1/**"    # PATH
    temp3=( $(ls -d $temp2 -t | grep "`date | awk '{print $2" "$3}'`") )
    j=0
    while [ $j -lt ${#temp3[@]} ]
    do
        echo "to be removed ${temp3[$j]}"
        delete_file_or_folder ${temp3[$j]} 0    # DELETE HERE
        j=`expr $j + 1`
    done
}
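A hypothetical call, assuming delete_file_or_folder is defined elsewhere in the same script:
clean_anyfolder /opt/deployments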
This command may help you:
find -type f -mmin -60

Linux: combine sorting files by date created with a given file name

I need to combine these two commands in order to get a list of files with the specified "filename", sorted by date created.
I know that sorting files by date can be achieved with:
ls -lrt
and finding a file by name with
find . -name "filename*"
I don't know how to combine these two. I tried with a pipeline but I don't get the right result.
[EDIT]
Not sorted
find . -name "filename" -printf '%TY:%Tm:%Td %TH:%Tm %h/%f\n' | sort
Forget xargs. "Find" and "sort" are all the tools you need.
My best guess would be to use xargs:
find . -name 'filename*' -print0 | xargs -0 /bin/ls -ltr
There's an upper limit on the number of arguments, but it shouldn't be a problem unless they occupy more than 32kB, in which case you will get blocks of sorted files :)
find . -name "filename" -exec ls --full-time \{\} \; | cut -d' ' -f7- | sort
You might have to adjust the cut command depending on what your version of ls outputs.
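A variant that sidesteps the field layout of ls entirely, as a sketch (assumes GNU find): print the modification time as epoch seconds, sort numerically, then drop the timestamp column:
find . -name 'filename*' -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-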
Check the commands shared below:
1) List files in a directory with last modified date/time
To list files and show the most recently modified files at the top, we will use the -lt option with the ls command.
$ ls -lt /run
output
total 24
-rw-rw-r--. 1 root utmp 2304 Sep 8 14:58 utmp
-rw-r--r--. 1 root root 4 Sep 8 12:41 dhclient-eth0.pid
drwxr-xr-x. 4 root root 100 Sep 8 03:31 lock
drwxr-xr-x. 3 root root 60 Sep 7 23:11 user
drwxr-xr-x. 7 root root 160 Aug 26 14:59 udev
drwxr-xr-x. 2 root root 60 Aug 21 13:18 tuned
https://linoxide.com/linux-how-to/how-sort-files-date-using-ls-command-linux/

counting number of directories in a specific directory

How do I count the number of folders in a specific directory? I am using the following command, but it always gives one extra.
find /directory/ -maxdepth 1 -type d -print| wc -l
For example, if I have 3 folders, this command provides 4. If it contains 5 folders, the command provides 6. Why is that?
find is also printing the directory itself:
$ find .vim/ -maxdepth 1 -type d
.vim/
.vim/indent
.vim/colors
.vim/doc
.vim/after
.vim/autoload
.vim/compiler
.vim/plugin
.vim/syntax
.vim/ftplugin
.vim/bundle
.vim/ftdetect
You can instead test the directory's children directly and not descend into them at all:
$ find .vim/* -maxdepth 0 -type d
.vim/after
.vim/autoload
.vim/bundle
.vim/colors
.vim/compiler
.vim/doc
.vim/ftdetect
.vim/ftplugin
.vim/indent
.vim/plugin
.vim/syntax
$ find .vim/* -maxdepth 0 -type d | wc -l
11
$ find .vim/ -maxdepth 1 -type d | wc -l
12
You can also use ls:
$ ls -l .vim | grep -c ^d
11
$ ls -l .vim
total 52
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
-rw-rw-r-- 1 anossovp anossovp 48 Aug 29 2012 filetype.vim
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
-rw-rw-r-- 1 anossovp anossovp 2505 Aug 29 2012 README.rst
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
$ ls -l .vim | grep ^d
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
Get a count of only the directories in the current directory
echo */ | wc
You will get output like 1 309 4594.
The 2nd number is the number of directories.
or
tree -L 1 | tail -1
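If only the number itself is wanted, wc -w gives it directly (a sketch; like the echo version above it counts non-hidden directories only, directory names containing spaces inflate the count, and without nullglob it reports 1 when there are no directories at all):
echo */ | wc -w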
find . -mindepth 1 -maxdepth 1 -type d | wc -l
For find, -mindepth 1 means do not include the starting directory itself,
-maxdepth 1 means do not descend into subdirectories,
-type d means match only directories.
And wc -l counts the lines of the input.
If you only have directories in the folder and no files this does it:
ls | wc -l
Run stat -c %h folder and subtract 2 from the result. This employs only a single subprocess as opposed to the 2 (or even 3) required by most of the other solutions here (typically find or ls plus wc).
Using sh/bash:
echo $((`stat -c %h folder` - 2))   # 'echo' is a shell builtin
Using csh/tcsh:
@ cnt = `stat -c %h folder` - 2; echo $cnt   # 'echo' is a shell builtin
Explanation: stat -c %h folder prints the number of hardlinks to folder, and each subfolder under folder contains a ../ entry which is a hardlink back to folder. You must subtract 2 because there are two additional hardlinks in the count:
folder's own self-referential ./ entry, and
folder's parent's link to folder
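A quick sanity check of the hardlink arithmetic (hypothetical paths):
mkdir -p /tmp/hlinkdemo/{a,b,c}                # a folder with 3 subfolders
stat -c %h /tmp/hlinkdemo                      # prints 5: its own '.', the parent's entry, plus a/.., b/.., c/..
echo $(( $(stat -c %h /tmp/hlinkdemo) - 2 ))   # prints 3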
The best way is to navigate to your drive and simply execute:
ls -lR | grep ^d | wc -l
And to find all folders in total, including subdirectories:
find /mount/point -type d | wc -l
...or to find all folders in the root directory (not including subdirectories):
find /mount/point -maxdepth 1 -type d | wc -l
Cheers!
I think the easiest is
ls -ld images/* | wc -l
where images is your target directory. The -d flag makes ls list the matched entries themselves rather than their contents, and the -l flag gives a one-entry-per-line listing, which pairs with the familiar wc -l line count.
The number of directories can be found using the command below:
ls -l | grep "^d" | wc -l
Some useful examples:
count files in current dir
/bin/ls -lA | egrep -c '^-'
count dirs in current dir
/bin/ls -lA | egrep -c '^d'
count files and dirs in current dir
/bin/ls -lA | egrep -c '^-|^d'
count files and dirs in one subdirectory
/bin/ls -lA subdir_name/ | egrep -c '^-|^d'
I have noticed a strange thing (at least in my case): when I tried ls instead of /bin/ls, the -A parameter (do not list implied . and ..) did NOT work as expected. Plain ls showed ./ and ../, which gave a wrong count. SOLUTION: use /bin/ls instead of ls.
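To check whether ls is being shadowed by an alias, and to bypass it without spelling out /bin/ls, a sketch:
type ls                          # shows whether ls is an alias, a function, or /bin/ls
command ls -lA | egrep -c '^d'   # 'command' skips aliases and shell functions
\ls -lA | egrep -c '^d'          # a leading backslash also suppresses alias expansion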
To get the number of directories, go to the directory and execute:
ls -l | grep -c ^d
A pure bash solution:
shopt -s nullglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[#]} (non-hidden) directories"
If you also want to count the hidden directories:
shopt -s nullglob dotglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[#]} directories (including hidden ones)"
Note that this will also count links to directories. If you don't want that, it's a bit more difficult with this method.
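One way to exclude links with the same pure-bash approach, as a sketch:
shopt -s nullglob
count=0
for d in /path/to/directory/*/; do
    # ${d%/} strips the trailing slash so -L tests the entry itself,
    # not the directory it resolves to
    [[ -L ${d%/} ]] && continue
    count=$((count + 1))
done
echo "There are $count (non-hidden) directories, links excluded"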
Using find:
find /path/to/directory -type d \! -name . -prune -exec printf x \; | wc -c
The trick is to output an x to stdout each time a directory is found, and then use wc to count the number of characters. This will count the number of all directories (including hidden ones), excluding links.
The methods presented here are all safe with respect to funny characters that can appear in file names (spaces, newlines, glob characters, etc.).
Using zsh:
a=(*(/N)); echo ${#a}
The N is a nullglob, / makes it match directories, # counts. It will neatly cope with spaces in directory names as well as returning 0 if there are no directories.
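To include hidden directories as well, the D glob qualifier can be added (a sketch):
a=(*(/DN)); echo ${#a}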
The best answer to what you want is
echo `find . -maxdepth 1 -type d | wc -l`-1 | bc
This subtracts one to remove the unwanted '.' directory that find lists (as patel deven mentioned above).
If you want to count subfolders recursively, then just leave off the maxdepth option, so
echo `find . -type d | wc -l`-1 | bc
PS If you find command substitution ugly, subtracting one can be done as a pure stream using sed and bc.
Subtracting one from count:
find . -maxdepth 1 -type d | wc -l | sed 's/$/-1\n/' | bc
or, adding count to minus one:
find . -maxdepth 1 -type d | wc -l | sed 's/^/-1+/' | bc
Count all files and subfolders, windows style:
dir=/YOUR/PATH;f=$(find $dir -type f | wc -l); d=$(find $dir -mindepth 1 -type d | wc -l); echo "$f Files, $d Folders"
If you want to use regular expressions, then try:
ls -l | grep "^d" | wc -l
If you want to count folders that have similar names, like folder01, folder02, folder03, etc., then you can do:
ls -l | grep ^d | grep -c folder
Best way to do it:
ls -la | grep -v total | wc -l
This gives you the perfect count.

Linux - Keep only the 10 most recent folders and delete the rest

I have a folder that contains versions of my application. Each time I upload a new version, a new sub-folder is created for it; the sub-folder name is the current timestamp. Here is a printout of the main folder (ls -l | grep ^d):
drwxrwxr-x 7 root root 4096 2011-03-31 16:18 20110331161649
drwxrwxr-x 7 root root 4096 2011-03-31 16:21 20110331161914
drwxrwxr-x 7 root root 4096 2011-03-31 16:53 20110331165035
drwxrwxr-x 7 root root 4096 2011-03-31 16:59 20110331165712
drwxrwxr-x 7 root root 4096 2011-04-03 20:18 20110403201607
drwxrwxr-x 7 root root 4096 2011-04-03 20:38 20110403203613
drwxrwxr-x 7 root root 4096 2011-04-04 14:39 20110405143725
drwxrwxr-x 7 root root 4096 2011-04-06 15:24 20110406151805
drwxrwxr-x 7 root root 4096 2011-04-06 15:36 20110406153157
drwxrwxr-x 7 root root 4096 2011-04-06 16:02 20110406155913
drwxrwxr-x 7 root root 4096 2011-04-10 21:10 20110410210928
drwxrwxr-x 7 root root 4096 2011-04-10 21:50 20110410214939
drwxrwxr-x 7 root root 4096 2011-04-10 22:15 20110410221414
drwxrwxr-x 7 root root 4096 2011-04-11 22:19 20110411221810
drwxrwxr-x 7 root root 4096 2011-05-01 21:30 20110501212953
drwxrwxr-x 7 root root 4096 2011-05-01 23:02 20110501230121
drwxrwxr-x 7 root root 4096 2011-05-03 21:57 20110503215252
drwxrwxr-x 7 root root 4096 2011-05-06 16:17 20110506161546
drwxrwxr-x 7 root root 4096 2011-05-11 10:00 20110511095709
drwxrwxr-x 7 root root 4096 2011-05-11 10:13 20110511100938
drwxrwxr-x 7 root root 4096 2011-05-12 14:34 20110512143143
drwxrwxr-x 7 root root 4096 2011-05-13 22:13 20110513220824
drwxrwxr-x 7 root root 4096 2011-05-14 22:26 20110514222548
drwxrwxr-x 7 root root 4096 2011-05-14 23:03 20110514230258
I'm looking for a command that will leave the last 10 versions (sub-folders) and deletes the rest.
Any thoughts?
There you go.
ls -dt */ | tail -n +11 | xargs rm -rf
First list the directories, most recently modified first, then take all of them except the first 10, and send those to rm -rf.
ls -dt1 /path/to/folder/*/ | sed -n '11,$p' | xargs rm -r
This assumes those are the only directories and that no others are present in the working directory.
ls -dt1 alone would normally only print the newest directory; however, the /*/ glob will only match directories and print their full paths. The 1 ensures one line per match/listing, and t sorts by time with the newest at the top.
sed -n '11,$p' takes the 11th line down to the bottom and prints only those lines, which xargs then passes to rm.
For testing you may wish to remove | xargs rm -r to see if the directories are listed properly first.
If the directory names contain the date, one can delete all but the last 10 directories using the default alphabetical sort:
ls -d */ | head -n -10 | xargs rm -rf
ls -lt | grep ^d | sed -e '1,10d' | awk '{sub(/.* /, ""); print }' | xargs rm -rf
Explanation:
list all contents of current directory in chronological order (most recent files first)
filter out all the directories
skip the first 10 lines / directories
use awk to extract the file names from the remaining 'ls -l' output
remove the files
EDIT:
find . -maxdepth 1 -type d ! -name . | sort | tac | sed -e '1,10d' | xargs rm -rf
I suggest the following sequence. I use a similar approach on my Synology NAS to delete old backups. It doesn't rely on the folder names; instead, it uses the last modified time to decide which folders to delete. It also uses zero-termination in order to correctly handle quotes, spaces and newline characters in the folder names:
find /path/to/folder -maxdepth 1 -mindepth 1 -type d -printf '%Ts\t' -print0 \
| sort -rnz \
| tail -n +11 -z \
| cut -f2- -z \
| xargs -0 -r rm -rf
IMPORTANT: This will delete any matching folders! I strongly recommend doing a test run first by replacing the last command xargs -0 -r rm -rf with xargs -0 which will echo the matching folders instead of deleting them.
A short explanation of each step:
find /path/to/folder -maxdepth 1 -mindepth 1 -type d -printf '%Ts\t' -print0
Find all directories (-type d) directly inside the backup folder (-maxdepth 1) except the backup folder itself (-mindepth 1), print (-printf) the Unix time (%Ts) of the last modification followed by a tab character (\t, used in step 4) and the full file name followed by a null character (-print0).
sort -rnz
Sort the zero-terminated items (-z) from the previous step using a numerical comparison (-n) and reverse the order (-r). The result is a list of all folders sorted by their last modification time in descending order.
tail -n +11 -z
Print the last lines (tail) from the previous step starting from line 11 (-n +11) considering each line as zero-terminated (-z). This excludes the newest 10 folders (by modification time) from the remaining steps.
cut -f2- -z
Cut each line from the second field to the end (-f2-), treating each line as zero-terminated (-z), to obtain a list containing the full path of every folder except the 10 newest.
xargs -r -0 rm -rf
Take the zero-terminated (-0) items from the previous step (xargs), and, if there are any (-r avoids running the command passed to xargs if there are no nonblank characters), force delete (rm -rf) them.
Your directory names are sorted in chronological order, which makes this easy. The list of directories in chronological order is just *, or [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9] to be more precise. So you want to delete all but the last 10 of them.
set [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]/
while [ $# -gt 10 ]; do
  rm -rf "$1"
  shift
done
(While there are more than 10 directories left, delete the oldest one.)
