How do I list one filename per output line in Linux?

I'm using the ls -a command to get the file names in a directory, but the output comes out on a single line.
Like this:
. .. .bash_history .ssh updater_error_log.txt
I need a built-in alternative to get filenames, each on a new line, like this:
.
..
.bash_history
.ssh
updater_error_log.txt

Use the -1 option (note this is the digit "one", not a lowercase letter "L"), like this:
ls -1a
First, though, make sure your ls supports -1. GNU coreutils (installed on standard Linux systems) and Solaris do, but if in doubt, run man ls or ls --help, or check the documentation. E.g.:
$ man ls
...
-1 list one file per line. Avoid '\n' with -q or -b
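For example (the scratch directory and file names here are invented for the demo):

```shell
# Demo: ls -1 forces one entry per line even when writing to a terminal.
dir=$(mktemp -d)
touch "$dir/a.txt" "$dir/b.txt" "$dir/c.txt"

ls -1a "$dir"    # lists ., .., a.txt, b.txt, c.txt on separate lines
rm -rf "$dir"
```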

Yes, you can easily make ls output one filename per line:
ls -a | cat
Explanation: The ls command detects whether its output is going to a terminal or to a file or pipe, and adjusts its format accordingly.
So if you pipe ls -a into a Python script, it should work without any special measures.
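A quick way to see this (throwaway scratch directory; bash process substitution is used for the comparison):

```shell
# Demo: once ls writes to a pipe, the output is already one name per line,
# so `ls | cat` matches `ls -1` exactly.
dir=$(mktemp -d)
touch "$dir/one" "$dir/two" "$dir/three"

ls "$dir" | cat                               # one name per line
diff <(ls "$dir" | cat) <(ls -1 "$dir") && echo identical
rm -rf "$dir"
```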

ls is designed for human consumption, and you should not parse its output.
In shell scripts, there are a few cases where parsing the output of ls works and is the simplest way of achieving the desired effect. Since ls might mangle non-ASCII and control characters in file names, these cases are a subset of those that do not require obtaining a file name from ls.
In Python, there is absolutely no reason to invoke ls. Python has all of ls's functionality built in. Use os.listdir to list the contents of a directory and os.stat or os.lstat to obtain file metadata. Other functions in the os module are likely to be relevant to your problem as well.
If you're accessing remote files over ssh, a reasonably robust way of listing file names is through sftp:
echo ls -1 | sftp remote-site:dir
This prints one file name per line, and unlike the ls utility, sftp does not mangle nonprintable characters. You will still not be able to reliably list directories where a file name contains a newline, but that's rarely done (remember this as a potential security issue, not a usability issue).
In Python (beware that shell metacharacters must be escaped in remote_dir):
import os
command_line = "echo ls -1 | sftp " + remote_site + ":" + remote_dir
remote_files = os.popen(command_line).read().split("\n")
Note that the trailing newline leaves an empty string at the end of remote_files, which you may want to discard.
For more complex interactions, look up sftp's batch mode in the documentation.
On some systems (Linux, Mac OS X, perhaps some other unices, but definitely not Windows), a different approach is to mount a remote filesystem through ssh with sshfs, and then work locally.

You can use ls -1.
ls -l will also do the job, since the long-listing format prints one entry per line as well.

You can also use ls -w1
This lets you set the output width in columns; a width of 1 forces one entry per line.
From manpage of ls:
-w, --width=COLS
set output width to COLS. 0 means no limit
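For instance (GNU coreutils assumed, since -w/--width is a GNU option; the scratch directory is made up for the demo):

```shell
# Demo: a width of 1 can never fit two columns, so each entry gets its own line.
dir=$(mktemp -d)
touch "$dir/a" "$dir/b" "$dir/c"

ls -w1 "$dir"    # same one-per-line output as ls -1
rm -rf "$dir"
```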

ls | tr " " "\n"
(Note that this breaks for file names that contain spaces.)

Easy, as long as your filenames don't include newlines:
find . -maxdepth 1
If you're piping this into another command, you should probably prefer to separate your filenames by null bytes, rather than newlines, since null bytes cannot occur in a filename (but newlines may):
find . -maxdepth 1 -print0
Printing that on a terminal will probably display as one line, because null bytes are not normally printed. Some programs may need a specific option to handle null-delimited input, such as sort's -z. Your own script similarly would need to account for this.
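A runnable sketch of the null-delimited approach (file names invented; xargs -0 stands in for a consuming program):

```shell
# Demo: null-delimited names survive spaces (and even newlines) in filenames.
dir=$(mktemp -d)
touch "$dir/plain" "$dir/with space"

# Null-separated list, consumed safely by xargs -0 (one name per invocation):
find "$dir" -maxdepth 1 -type f -print0 | xargs -0 -n1 basename

# sort -z keeps the null delimiters intact (GNU sort); tr makes it visible:
find "$dir" -maxdepth 1 -type f -print0 | sort -z | tr '\0' '\n'
rm -rf "$dir"
```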

The -1 switch is the obvious way of doing it, but just to mention another option: use echo with a command substitution inside double quotes, which preserves the whitespace (here, the newlines):
echo "$(ls)"
The documentation also describes how the ls command behaves:
If standard output is a terminal, the output is in columns (sorted
vertically) and control characters are output as question marks;
otherwise, the output is listed one per line and control characters
are output as-is.
Now you can see why redirecting or piping the output gives one entry per line.
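To see what the double quotes buy you (scratch directory invented for the demo):

```shell
# Demo: quoting preserves the newlines inside the command substitution;
# unquoted, word splitting collapses them onto one line.
dir=$(mktemp -d)
touch "$dir/a" "$dir/b"
cd "$dir"

echo "$(ls)"   # two lines: a, then b
echo $(ls)     # one line: a b

cd - >/dev/null && rm -rf "$dir"
```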

Related

Getting the most recent filename where the extension name is case *in*sensitive

I am trying to get the most recent .CSV or .csv file name among other comma separated value files where the extension name is case insensitive.
I am achieving this with the following command, provided by someone else without any explanation:
ls -t ~(i:*.CSV) | head -1
or
ls -t -- ~(i:*.CSV) | head -1
I have two questions:
What is the use of ~ and -- in this case? Does -- help here?
How can I get a blank response when there is no .csv or .CSV file in
the folder? At the moment I get:
/bin/ls: cannot access ~(i:*.CSV): No such file or directory
I know I can test the exit code of the last command, but I was wondering maybe there is a --silent option or something.
Many thanks for your time.
PS: I made my research online quite thorough and I was unable to find an answer.
The ~ is just a literal character; the intent would appear to be to match filenames starting with ~ and ending with .csv, with i: being a flag to make the match case-insensitive. However, I don't know of any shell that supports that particular syntax. The closest thing I am aware of would be zsh's globbing flags:
setopt extended_glob # Allow globbing flags
ls ~(#i)*.csv
Here, (#i) indicates that anything after it should be matched without regard to case.
Update: as @baptistemm points out, ~(i:...) is syntax defined by ksh.
The -- is a conventional argument, supported by many commands, to mean that any arguments that follow are not options, but should be treated literally. For example, ls -l would mean ls should use the -l option to modify its output, while ls -- -l means ls should try to list a file named -l.
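A quick demonstration of `--` (a throwaway directory holding a file literally named -l):

```shell
# Demo: `--` stops option parsing, so a file named "-l" can be listed.
dir=$(mktemp -d)
cd "$dir"
touch -- -l        # -- also protects touch from treating -l as an option

ls -- -l           # lists the file named "-l"
# Without --, ls would interpret -l as the long-listing option instead.

cd - >/dev/null && rm -rf "$dir"
```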
~(i:*.CSV) tells the shell (this is apparently only supported in ksh93) that the pattern after the : must be matched case-insensitively, so in this example it could match any of these possibilities:
*.csv or
*.Csv or
*.cSv or
*.csV or
*.CSv or
*.CSV
Note this could have been written ls -t *.[Cc][Ss][Vv] in bash.
To silence the errors, I suggest you search this site for "standard error /dev/null"; that will help.
I tried running commands like what you have in both bash and zsh and neither worked, so I can't help you out with that, but if you want to discard the error, you can add 2>/dev/null to the end of the ls command, so your command would look like the following:
ls -t ~(i:*.CSV) 2>/dev/null | head -1
This will redirect anything written to STDERR to /dev/null (i.e. throw it out), which, in your case, would be /bin/ls: cannot access ~(i:*.CSV): No such file or directory.
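For instance, with a glob that matches nothing (so ls fails), the redirect keeps the pipeline quiet:

```shell
# Demo: the error goes to stderr; 2>/dev/null discards it, leaving stdout empty.
ls -t no-such-file-*.csv 2>/dev/null | head -1    # prints nothing
```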

Linux command line, reverse polish notation

ls /tmp
How can I run the same command but using reverse polish notation?
Is there a mode that would allow me to do this or something similar to that?
I could use xargs but that's a lot more typing:
echo /tmp | xargs ls
This would be ideal:
/tmp ls
or
/tmp | ls
Bash (I assume you are using it) is a shell for unixoid systems.
As far as I know, bash doesn't provide such a mode. You could use a different shell that provides this feature. Searching in the web, this was my first result: https://github.com/iconmaster5326/RPOS, but maybe it is far from stable ;)
Alternatively, you can write a command that reverses its argument list and executes the result.
The usage would be like this:
reversex /tmp ls
reversex A.txt B.txt cp
Here is an example of such a command:
#!/bin/bash
# Build the command line in reverse argument order, then run it.
CMDLINE=""
for i in "$@"
do
    CMDLINE="$i $CMDLINE"
done
$CMDLINE
If you name it /usr/local/bin/reversex and make it executable, you should be able to run simple reversed commands with the reversex prefix. I can't guarantee that it works. Note that the arguments are parsed twice and therefore have to be escaped twice, too.
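An alternative sketch (my variant, not from the original answer) builds the reversed command in a bash array, so the arguments are not word-split a second time:

```shell
#!/bin/bash
# reversex: execute the last argument as a command, with the preceding
# arguments reversed, e.g. `reversex /tmp ls` runs `ls /tmp`.
args=()
for arg in "$@"; do
    args=("$arg" "${args[@]}")   # prepend, reversing the order
done
"${args[@]}"                     # execute without re-parsing
```

With this variant, `reversex "my file.txt" cat` runs `cat "my file.txt"` with the space intact.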

How to take advantage of filters

I've read here that
To make a pipe, put a vertical bar (|) on the command line between two commands.
then
When a program takes its input from another program, performs some operation on that input, and writes the result to the standard output, it is referred to as a filter.
So I've first tried the ls command whose output is:
Desktop HelloWord.java Templates glassfish-4.0
Documents Music Videos hs_err_pid26742.log
Downloads NetBeansProjects apache-tomcat-8.0.3 mozilla.pdf
HelloWord Pictures examples.desktop netbeans-8.0
Then ls | echo which outputs absolutely nothing.
I'm looking for a way to take advantages of pipelines and filters in my bash script. Please help.
echo doesn't read from standard input; it only writes its command-line arguments to standard output. The cat command is what you want: it copies what it reads from standard input to standard output.
ls | cat
(Note that the pipeline above is a little pointless, but does demonstrate the idea of a pipe. The command on the right-hand side must read from standard input.)
Don't confuse command-line arguments with standard input.
echo doesn't read standard input. To try something more useful, try
ls | sort -r
to get the output sorted in reverse,
or
ls | grep '[0-9]'
to only keep the lines containing digits.
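The same filters can be tried on fixed input (printf stands in for ls here, so the results are predictable):

```shell
# Demo: sort -r reverses the order; grep '[0-9]' keeps lines with digits.
printf 'apple\nbanana2\ncherry\n' | sort -r       # cherry, banana2, apple
printf 'apple\nbanana2\ncherry\n' | grep '[0-9]'  # banana2
```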
In addition to what others have said - if your command (echo in this example) does not read from standard input you can use xargs to "feed" this command from standard input, so
ls | echo
doesn't work, but
ls | xargs echo
works fine.
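The difference is easy to see with fixed input in place of ls:

```shell
# Demo: echo ignores its standard input, so the piped data is lost;
# xargs converts the input lines into arguments for echo.
printf 'a\nb\nc\n' | echo          # prints only an empty line
printf 'a\nb\nc\n' | xargs echo    # prints: a b c
```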

How do I use the filenames output by "grep" as argument to another program

I have this grep command which outputs the names of files (which contains matches to some pattern), and I want to parse those files with some file-parsing program. The pipechain looks like this:
grep -rl "{some-pattern}" . | {some-file-parsing-program} > a.out
How do I get those file names as command line arguments to the file-parsing program?
For example, let's say grep returns the filenames a, b, c. How do I pass the filenames so that it's as if I'm executing
{some-file-parsing-program} a b c > a.out
?
It looks to me as though you're wanting xargs:
grep -rl "{some_pattern" . | xargs your-command > a.out
I'm not convinced a.out is a good output file name, but we can let that slide. The xargs command reads white-space separated file names from standard input and then invokes your-command with those names as arguments. It may need to invoke your-command several times; unless you're using GNU xargs and you specify -r, your-command will be invoked at least once, even if there are no matching file names.
Without xargs, you could not use sed for this job, and using awk would be clumsy. Perl (and Python) could manage it 'trivially': it would be easy to write code that reads file names from standard input and then processes each file in turn.
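A runnable sketch of the pattern (file names invented; wc -l stands in for the file-parsing program):

```shell
# Build two files, only one containing the pattern, then hand the
# matching file names to another command via xargs.
dir=$(mktemp -d)
printf 'needle\n' > "$dir/a.txt"
printf 'hay\n'    > "$dir/b.txt"

grep -rl "needle" "$dir" | xargs wc -l   # wc -l receives only a.txt
rm -rf "$dir"
```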
I don't know of any Linux programs that cannot read from stdin. Depending on the program, the default input may be stdin, or you may need to specify stdin with a command-line option (often - by itself). Do you have anything particular in mind?

Terminal command to find lines containing a specific word?

I was just wondering what command I need to put into the terminal to read a text file, eliminate all lines that do not contain a certain keyword, and then print those lines to a new file. For example, the keyword is "system": I want to print all lines that contain "system" to a new, separate file. Thanks.
grep is your friend.
For example, you can do:
grep system <filename> > systemlines.out
Run man grep for additional useful options as well (e.g. printing line numbers, showing one or more lines of context before or after a match, negation, i.e. all lines that do not match, etc.).
If you are running Windows, you can either install cygwin or you can find a win32 binary for grep as well.
grep '\<system\>'
Will search for lines that contain the word system, and not system as a substring.
The grep command below will solve your problem:
grep -i yourword filename1 > filename2
With -i the match is case-insensitive; without -i it is case-sensitive.
To learn how grep works on your server, refer to its man page with the following command:
man grep
grep "system" filename > new-filename
You might want to make it a bit cleverer to not include lines with words like "dysystemic", but it's a good place to start.
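One way to do that is word-boundary matching, shown here with grep -w (equivalent to the \< \> anchors mentioned above); the sample file is invented:

```shell
# Demo: -w matches "system" only as a whole word, so "dysystemic" is excluded.
f=$(mktemp)
printf 'the system is up\na dysystemic case\n' > "$f"

grep -w "system" "$f"     # prints: the system is up
rm -f "$f"
```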
