Linux utility to find the command for a particular task - linux

In my early days of using Linux I usually had to search Google to find the command for doing a particular task. Once I had the command name, I could view its usage with man command-name.
Along those lines, I was thinking of a utility that takes a description of the task as an argument, tells you the command for doing it, and opens the man page for that command.
e.g.:
findUtility "find all files in a directory"
output:
ls
find
I want to know if a utility of that kind exists. If so, it would be very handy, especially for newbies.
If not, I may like to implement it.
Thanks,

Not as nice as you are asking about, but
apropos <keyword>
and
man -k <keyword>
can be very useful.
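For example (the exact hits depend on which man pages are installed, so treat the keyword as a placeholder):
# search man-page names and one-line descriptions for a keyword
apropos directory

# man -k does the same search
man -k directory

# then read the manual for whichever result looks right
man find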

Parsing natural language is hard because there are thousands of ways to rephrase one sentence. Google does it best as far as I know, so there is no such tool. There are handy and practical manuals that make it easy to find the right tool for the job. Also, there is a huge community behind coreutils (and Linux in general), so try both forums and IRC. Often, the latter is the fastest. And people tend to parse natural language as expected :)

apropos will do something like you suggest.

I guess what you want is the List of Unix utilities on Wikipedia.

On Debian (and presumably derived systems) this is also useful:
apt-cache search <keyword>
(no sudo is needed, since it only reads the local package cache)

A few years ago NetBSD decided to rewrite its apropos. The new implementation does a full-text search with results ranked in order of relevance. It comes close to what you have asked for. See the output here:
https://man-k.org/search?q=find+all+files+in+directory


Preferred terminal scripting language [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
What language do you prefer for writing scripts for common tasks (backup, sync, etc.), and why? I'm not talking about programming web pages or applications.
I came up with this question when thinking about why bash is still popular. For example, Python looks more comfortable to me. Do you use bash just because you know it, or for some special reason?
If it's "create this directory, run this command, if that worked then run this"-level, I just use bash shell scripts.
Anything more complicated, say something that parses the output of a command and acts upon it, becomes a Python script. I find it just as quick to write, mainly because shell scripts are difficult to debug (bash error messages aren't exactly useful compared to Python's tracebacks), and the end code is much more readable.
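To give a rough idea of that level, a sketch of the pattern (the directory and make targets are only placeholders for whatever you actually run):
#!/bin/bash
# "create this directory, run this command, if that worked run this"
mkdir -p /tmp/nightly-build || exit 1     # create the directory, bail out on failure
cd /tmp/nightly-build || exit 1

if make all; then                         # run a command and branch on its exit status
    echo "build succeeded"
    make install
else
    echo "build failed" >&2
    exit 1
fi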
...why bash is still popular?
Well, because Bourne shell (sh, and not necessarily bash) is pretty much available in any *nix installation.
A good command of sh and vi is extremely helpful when connecting to remote servers via telnet/ssh.
For local admin (when you own the server) you can use python/perl/ruby and customize them at your will. But most certainly, any day you could be asked to "quickly fix" another server where only the two defaults are installed: sh + vi.
That's why.
Unix has a philosophy of small tools which do one particular job and do it well. Often the easiest way to solve problems is to use a combination of such tools. Shell scripting is the king for this, no questions about that.
Of course, there's also the "when all you have is a hammer" syndrome :)
This really depends on the type of script. I am starting to use ruby for many sysadmin type tasks however bash is still my first choice for quick and dirty scripts. The advantage of bash, in my eyes, is the interactive nature of it.
To give an example. The other day I was searching for some particular values in approx 200 compressed log files, re-formatting the output and mailing the results.
It was very easy to use bash to do this iteratively: zcat one file, pipe the output to grep, retry a few times to get the regex correct; then take that output and reformat it with awk, again retrying several times to get the format right.
This process took a couple of minutes, after which I wrote the bash commands into a script file, parameterized it, wrapped a for loop around it, mailed the result, and the job was done.
I find this process much simpler in bash just using command editing and retrying the regexes etc than I would in a separate script file where I have to keep editing the script and retrying etc.
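The end result of that kind of session tends to look roughly like the sketch below; the log path, the awk fields and the mail address are invented here, the real ones came out of the retrying described above:
#!/bin/bash
# scan compressed logs for a pattern, reformat, and mail the summary
# usage: ./scanlogs.sh 'some-regex' you@example.com
pattern=$1
recipient=$2

for f in /var/log/myapp/*.log.gz; do      # iterate over the compressed logs
    zcat "$f" | grep -E "$pattern"        # decompress and filter
done | awk '{ print $1, $NF }' \
     | mail -s "log scan results" "$recipient"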
G'day,
Different tasks call for different languages. I tend to use either shell, usually bash, or Perl depending on the task.
Now that I'm getting more comfortable with Ruby, I'll use that for tasks that might suit an OO approach.
HTH
cheers,
Perl would be best in handling system administration tasks. I have never come across a *nix system that does not have Perl installed.
Python for me at the moment. I like using Python because it has an interactive interpreter that I can use to build up and execute the script as I go along, but I used Perl in the past.
Bash, or various sh dialects in the broader sense can be assumed to be present on pretty much any unix system. Often, production Unix systems (Solaris, HP/UX, AIX etc.) have a very plain vanilla install; quite often they will not have perl or python installed. There may be company policies restricting this, so getting it installed may not be an option either. If you want something that will work on this type of platform, you will probably be limited to sh/sed/awk.
Bash is quite good for tasks that primarily involve running other commands, so you shouldn't underrate it. However, it rapidly becomes a write-only language at fairly trivial levels of complexity, so Perl or Python might be a better choice if you are programming something with a lot of internal processing.
For scheduling a backup, a bash script run from cron is quite possibly the best way to do the job. For something that involves parsing multiple log files, formatting the output to a summary status file and sending you an email notification if it notices certain types of events you might be better off with perl or python.
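For the cron-driven backup case, the whole thing can stay very small; something along these lines (the paths and schedule are made up):
#!/bin/bash
# nightly-backup.sh -- archive a directory with a datestamped name
tar -czf "/backup/home-$(date +%F).tar.gz" /home/alice

# and a crontab entry to run it at 02:30 every night:
#   30 2 * * * /usr/local/bin/nightly-backup.sh >> /var/log/nightly-backup.log 2>&1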
Bash is the preferred scripting language for these kinds of tasks. It's pretty ubiquitous, and it's intended to be a glue language, in the sense that you can glue together a bunch of commands that you would normally do in the terminal pretty much unchanged.
I use Ruby for most of my shell scripting tasks. I can never remember some of the nuances/gotchas of Bash scripting.
I use Ruby because I am most comfortable in it. It's one of the few languages in which I find myself struggling with the logic of my problem, rather than the syntax or restrictions of the language. Compare this to C++ or Perl, in which I get frustrated over pointers and sigils. I find recursive directory traversal and running system commands very easy to do in Ruby, e.g. using Ruby to rename files and edit their content.
I use perl, typically. The module library at CPAN makes many tasks simple. Net::SSH is a great tool for automating system administration tasks.

Best practices to put into a man page

Is there a best practices guideline for writing man pages?
What should be included in the layout? The standard ones are:
NAME
SYNOPSIS
DESCRIPTION
EXAMPLES
SEE ALSO
There are others like OPTIONS, AUTHOR.
As a user what would be useful to have? What isn't helpful?
If you cannot find any old bound copies of 1970s Bell Labs "troff" documentation, which had some nice sections on writing man pages, :-) then I'd suggest trying out Jens's "HOWTO" on writing man pages over at his site.
The Unix 7th Edition manuals are available online in a variety of formats.
A BUGS section is nice, and an EXAMPLES section is always useful. Some man pages contain a FILES section, which lists related configuration files, or an ENVIRONMENT section detailing any influential environment variables.
To be clear, what sections or type of information are useful to users depends on the nature of the program or utility that you are documenting.
There is a canonical man page outline distributed with UNIX systems, or at least usually there is. In general, I'd put in all the fields, and include a line like "none" if it doesn't apply.
One thing which people sometimes forget to put in manual pages is the meaning of the return value of the function. It's easy to forget, but the omission can make life much harder for people who have to use your function. Also, a simple code segment in the SYNOPSIS or a good minimal working EXAMPLE is very useful.
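For reference, a bare-bones skeleton with those standard sections, written as a small script you can run to generate and preview the page, might look like this (the tool name and all the text are placeholders):
#!/bin/bash
# write a minimal man page skeleton using man(7) macros, then preview it
cat > mytool.1 <<'EOF'
.TH MYTOOL 1 "January 2012" "mytool 0.1" "User Commands"
.SH NAME
mytool \- do one small thing well
.SH SYNOPSIS
.B mytool
[\-v] FILE
.SH DESCRIPTION
Describe what the tool does and how it behaves by default.
.SH OPTIONS
.TP
.B \-v
Verbose output.
.SH EXAMPLES
mytool \-v input.txt
.SH SEE ALSO
.BR man (7)
EOF

man -l mytool.1    # preview the rendered page (on some systems: man ./mytool.1)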
One thing that I often do with man pages is to try to find a related command, even though I know the thing I'm looking at doesn't do what I want. In this case, the SEE ALSO is great.
It depends on what your software does. If it is a small stand-alone application, I would certainly put the AUTHOR section in the man page so that if users find bugs they can easily find an email address to report the bug to you.
As for best practices, none that I know of, other than that the man page should be concise and detailed but not contain information that is not required; if it is just a tool, the inner workings are not needed, for example.

What commands must I learn to become an effective Linux shell script programmer?

I have recently started moving into the world of Linux development. I wanted to learn some new things and thought bash might be fun. As I learn more about bash programming I have found that there is quite an assortment of useful tools to be used (such as grep, tr, awk, etc.). There are so many that I just do not know which ones are "vital" to learn.
Shell scripting commands depend heavily on the configuration of the system itself, and can change drastically over time, unlike most programming languages (where a core library ships with the language itself and represents the "core" set of commands that a programmer would use when interacting with the outside world). Therefore,
As a modern Linux shell script programmer, which command line tools should I be familiar with?
Compressing and uncompressing various archives.
Using the man pages
alias is always helpful
As mentioned by others: sed & grep (regex is good to know in general), sort, head, tr, cut
echo & printf (their differences and when to use what)
Getting the exit status of the last command via $? (handy when writing scripts)
top, ps, kill, how to background/foreground/suspend a process
The important thing is combining the many tools that exist; that is where most of them become extremely useful. Using man whenever you are stuck is probably the most important thing.
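A tiny illustration of combining the tools and checking $? (notes.txt is just a placeholder file):
# combine small tools: list the most frequent words in a file
tr -cs '[:alpha:]' '\n' < notes.txt | sort | uniq -c | sort -rn | head

# branch on a command's exit status via $?
grep -q "TODO" notes.txt
if [ $? -eq 0 ]; then
    echo "still work to do"
fi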
I'd recommend especially that you become familiar with locate, grep and find. sed, awk and vim are next, and around these are cat, less, tail / head, ls (yes, ls!), and the many ways in which bash can help you.
Especially about Bash: beware of bashisms!
Depends on what you're doing, obviously, but I get a lot of mileage out of find, grep, rsync, and ssh. The simple ones are useful, too: cat, tail, wc, ps. There's a lot you can do with a for loop, too, and wildcard syntax is essential. For example,
$ for i in {app,web}{01,02}; do ssh $i date; done
That will ssh into hosts app01, app02, web01, and web02 and execute the date command on each one.
Try looking at commandlinefu. People come up with all sorts of things there, and you're bound to find examples of stuff which may be useful in the future.
But generally, the "top used commands" list by John is a nice guide.
And of course, here be dragons - there is also a list of stuff you shouldn't do: deadly ones
You should know some console-based text editor. Pico might suffice. I myself am a vi guy, though Emacs is also acceptable. (Though I will recommend vi: it is the de facto standard on nearly any Unix platform, and tools like grep/sed behave very similarly to vi.)
Others:
Screen: extremely useful when you don't have a GUI or don't want to/can't open up many terminal windows or PuTTY sessions. Allows you to have multiple shell sessions open, and you can toggle between them (and many other things).
top: good for monitoring processes, CPU usage, and memory usage
watch: runs a command every n seconds and displays its output. E.g., watch -n 1 "ls -aio" executes "ls -aio" every second.
You should probably know everything on this list:
http://www.faculty.ucr.edu/~tgirke/Documents/UNIX/linux_manual.html
Maybe not everything is essential all the time, but having at least a cursory overview of each helps a lot with basic functionality.
perl, xargs, lsof, find, grep, bash, tar, gzip, tr, tail, diff, patch, and bc.
And everything that is in SUS2 (Single UNIX Specification).
Like you mentioned, learn awk, sed and grep. They will be very good friends of yours.
Also, very importantly, learn to use a text editor such as vim properly.
I would also recommend getting familiar with a good scripting language such as Perl or Python.
Don't worry about the commands directly. Rather when you find yourself struggling with something try a few quick Google and man page searches and see how you can improve what you're trying to do right then and there. Keep it relevant and you will get more useful results.

How to learn your way through Linux's shell

I want to stop losing precious time when dealing with Linux/Unix's shell.
If I could get to understand it well, that would be great. Otherwise:
I may end up losing a day just for setting up a crontab.
I'll keep wondering why the shebang in this script doesn't work.
I'll keep wondering what the real difference is between:
. run.sh
./run.sh
. ./run.sh
sh run.sh
sh ./run.sh
You see, it's these kinds of things that cripple my Linux/Unix life.
As a programmer, I want to get much better at this. I suppose it's best to stay with the widely-used bash shell, but I might be wrong. Whatever tool I use, I need to understand it down to its guts.
What is the ultimate solution?
Just for fun:
. run.sh --- "Source" the code in run.sh. Usually, this is used to get environment variables into the current shell process. You probably don't want this for a script called run.sh.
./run.sh --- Execute the run.sh script in the current directory. Generally the current directory is not in the default path (see $PATH), so you need to call out the relative location explicitly. The . character is used differently than in item #1.
. ./run.sh --- Source the run.sh script in the current directory. This combines the use of . from items #1 and #2.
sh run.sh --- Use the sh shell interpreter on run.sh. Bourne shell is usually the default for running shell scripts, so this is probably the same as item #2, except it finds the first run.sh in the $PATH rather than the one in the current directory.
sh ./run.sh --- And this is usually the same as #2 except wordier.
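A quick way to see the difference between sourcing and executing for yourself; run.sh here is a toy script invented for the demonstration:
#!/bin/bash
# create a toy run.sh that only sets a variable
cat > run.sh <<'EOF'
#!/bin/bash
GREETING="hello from run.sh"
EOF
chmod +x run.sh

./run.sh                                  # executes in a child process...
echo "after ./run.sh: '$GREETING'"        # ...so GREETING is still empty here

. ./run.sh                                # sourcing runs it in the current shell...
echo "after . ./run.sh: '$GREETING'"      # ...so now GREETING is set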
Command line interfaces, such as the various shell interpreters, tend to be very esoteric since they need to pack a lot of meaning into a small number of characters. Otherwise typing takes too long.
In terms of learning, I'd suggest using bash or ksh and don't let anyone talk you into something else until you are comfortable. Please don't learn with csh or you will need to unlearn too much when you start with a Bourne-type shell later.
Also, crontab entries are a bit trickier than other uses of shell. My guess is you lost time because your environment was set differently than on the command line. I would suggest starting somewhere else if possible.
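On the crontab point, the classic time-sink is that cron runs your script with a minimal environment (different PATH, no aliases), so something that works at the prompt fails silently from cron. A defensive crontab entry looks roughly like this (the script path and times are placeholders):
# edit with: crontab -e
# set an explicit PATH and capture output so failures are visible
PATH=/usr/local/bin:/usr/bin:/bin
30 2 * * * /home/alice/bin/sync-photos.sh >> /home/alice/cron.log 2>&1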
Man pages are probably the "ultimate" solution. I'm always amazed at what gems they contain.
You can even use man bash to answer some of the questions you raise in this question.
Try the Linux Documentation Project's BASH pages here [tldp.org].
PS: The difference is that . is the same as the source command. This only requires read permission. To use ./run.sh, you need execute permission. When you use sh, you are explicitly specifying the command you want to run the script with (here, you only need read permission). If you are using a shell to execute it, there shouldn't be a problem there. If you want to use a different program, such as python, you could use python run.py. Another trick is to add a line #!<program> to the beginning of your script. In your case, #!/bin/sh would do, and for Python, #!/usr/bin/env python is best.
I think immersing yourself in it is the answer. It's like learning to walk, to type, to use an ergonomic keyboard, or type in dvorak. Commit to it completely. And yes, you will absolutely slow down. And you'll be forced to look at your hands or google things constantly. But eventually it will come to you.
Funny story: I killed my apartment's internet access by running an ifconfig release. I completely didn't know that ifconfig renew was not a command. Had to call a friend when Google didn't load ;) One dhcpcd later and I was back to googling everything.
O'Reilly has an older book that teaches you about bash, with bash scripting at the end. Most everything you need to know is on the web, but spread out.
The best way to learn is by reading, and then trying things out. Man pages are often very helpful, and there are tons of shell scripting tutorials out there. If shell scripting is what you are after, just read, and then practice the things you read by writing little scripts that do something neat or fun. If you are looking for more info on all of the different command line applications that can be run from a shell, those are more distro dependent, so look in the documentation for your favorite distro.
There are some decent online guides that will help you feel more comfortable in the shell(s). For example:
Steve Parker's Unix/Linux Shell Scripting Tutorial
UNIX Shell Programming, by Stephen G. Kochan and Patrick H. Wood, available online at Google Books. Also available in hardback. Amazon has it.
UNIX shell scripting with sh/ksh. Apparently used as part of a class at Dartmouth College. ksh is the Korn shell, and it's close enough to bash that the info will be useful.
Spend some time reading those kinds of tutorials and, above all, playing in the shell. Pretty soon, it'll start to feel like $HOME. (Okay, sorry for the bad pun...)
Use the shell a lot, type "[prog-name] --help" a lot, type "man [prog-name]" a lot, and make notes on what works and what does not -- even if those notes seem obvious at the time. By tomorrow, they might not be so obvious again. OTOH, in a couple of weeks they definitely should be!
Have a gander at some of the many books on working with shells, like From Bash to Z Shell (which covers bash and zsh), slog through a shell-scripting tutorial, or read GNU's Bash manual.
While sticking with bash might be best while just learning, the fish shell might be a little bit easier to use. I played around with it for a few weeks, and while it didn't seem as powerful as bash, it did seem pretty user friendly.
The way I really learned my way around in Linux was to install gentoo. It takes forever but you begin to see how everything ties together.
Go grab the latest instructions and start following them. Do it enough times and it starts to stick.
After a while you get comfortable building everything from scratch.
I find trying to learn a new command using the man pages sometimes a bit overwhelming.
I prefer man pages for refreshing my memory.
When I want to learn a command I look for examples then use the man pages to fine tune what I want it to do.

Which Scripting language is best? [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 11 years ago.
For writing scripts for process automation on the Linux platform, which scripting language is better: shell script, Perl, or Python, or is there anything else? I am new to all of them, so I am just wondering which one to go for.
The answer is: Whatever best fits the job!
My rule of thumb:
Bash - for a short script that might need a for loop to do something repetitively.
Perl - anything to do with some kind of text processing or file processing, especially if it's a one-off. Just do a dirty nasty Perl script and be done with it.
Python - if it's something you might want to do again, or something very much like it. Then at least you have a chance of being able to reuse the script.
Go for all three of them: start with bash/awk/sed plus the fileutils (grep, find, and so on) and then move up the abstraction hierarchy with Perl and Python.
That way you will be able to decide for yourself which one fits your needs best. I say start with bash and friends because they are ubiquitous; some machines will not have Perl or Python installed and you'll feel helpless there, especially in traditional Unix land (i.e., not Linux).
When choosing a scripting language to help automate your linux / unix environment, the most important thing in my opinion is... your replacement :-)
By which I mean the next / other sysadmins who may have to maintain your scripts. I am currently working in an environment where the lead Unix guy is a real script head, but he has mainly restricted himself to bash, with some Perl and Windows VBScript thrown in for good luck. At least it has forced me to brush up my Perl.
While agreeing with the other comments here, my suggestion would be to master bash - where possible, do as much as possible in bash, as most people know it and can maintain / debug it. And it will be the most portable. Used together with sed & awk, it is particularly powerful.
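Just to give a flavour of that bash-with-sed-and-awk combination (the data file and columns are invented):
# strip comments with sed, then sum the third column with awk
sed 's/#.*//' data.txt | awk 'NF { total += $3 } END { print "total:", total }'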
When you have that mastered, you can come back here and ask "What scripting language should I learn after bash?" :-)
JB
I use Perl for anything beyond extremely simple scripts.
I also 'use warnings' and 'use strict', avoid backticks, and call system as 'system($command, @and_args)'. And because I like it to be maintainable: IPC::Run (for pipes), File::Fu (for filenames, tempfiles, etc.), YAML (for configs or misc data), and Getopt::Helpful (so I can remember what the options were).
I think it depends on how complex the tasks are you want to automate. Personally, I've always gone with shell-scripts, which enables you to call on awk, sed, grep, find, ls, cat, etc. which can be combined together to do pretty much anything you can achieve using perl or python. On the other hand, if the processes you want to automate are complex (e.g., not just a linear sequence of steps) then you'll probably find that writing the scripts in perl or python (or even ruby!) is much quicker and makes them easier to maintain.
I'd recommend bash, awk, and sed.
bash - http://tldp.org/LDP/abs/html/
awk - http://www.uga.edu/~ucns/wsg/unix/awk/
http://www.grymoire.com/Unix/Awk.html
sed - http://www.ce.berkeley.edu/~kayyum/unix_tips/sedtips.html
http://www.grymoire.com/Unix/Sed.html
Just some ideas.
Depends on the complexity and problem domain of the task at hand.
Bash scripts are quick and dirty for simple system automation tasks. For anything more complex than moving files around and running commands, I'd personally say Perl is next in line as the de facto sysadmin go-to automation tool. For more focus on code reuse and readability/maintainability I'd want to step up to Python or Ruby.
PHP can also be used to automate tasks, however it is not widely accepted for this purpose in my experience.
It really comes down to what language you are most interested in learning, most can be used for automation, in addition to many other things.
I prefer shell scripts only for very small tasks. Writing robust shell scripts requires a lot of knowledge about possible pitfalls, which you only learn by doing. But learning even the basics will increase your productivity a lot!
If I need complex logic, I usually use Python. By complex I mean anything that has more than two if-statements =)
Perl is okay for its original purpose, but be warned that many of the perlisms you learn are not applicable anywhere else.
Python and Ruby are roughly equivalent. I'd recommend you learn one of them well and check out a tutorial on the other. I prefer Python but it really comes down to personal preference.
To summarize: Learn basics of shell scripts. Learn at least Python or Ruby well.
If you want a minimalistic, compact and fast solution (faster than Python/Ruby), then go for the Lua scripting language :-)
However, Lua's speed and code compactness are achieved with a relatively small language core, so if you want "batteries included" (i.e., very big "standard" libraries) then Lua is not for you. Otherwise, people who come from the C/C++ world really enjoy Lua's speed :-)
p.s.
Lua vs Ruby 1.9 benchmark (you can also look at Lua vs Python 3):
http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=lua&lang2=yarv
I have been getting Python recommended all the time; it's supposed to let you do anything. For small tasks I use shell scripts, though.
I would usually say: the one you know best which can achieve the results you want. Like all religious wars, and after learning a large number of languages, you realise that you can do most things in most languages (note I did say most). I use Perl. It is maybe not as up to date as Python or Ruby, but it does have massive library support from CPAN. And I have not found anything I can't do in it yet. When I do, I will look at other languages to find out which one can fill that gap.
If I was starting today, maybe I would pick Python or Ruby, but I don't know enough about them to make a judgement call. Do any of your friends/colleagues know scripting languages? That could help you massively, as support when learning a new language is very important.
Good luck
Well, it's like this:
Perl is not the most user-friendly scripting language, but it has CPAN (the Comprehensive Perl Archive Network), which contains thousands of libraries that implement almost anything you can think of, and Perl is really powerful when it comes to text processing. The disadvantage is that Perl code is kinda hard to maintain (if you don't know it very well).
Python is a scripting language that is becoming more and more popular among scripters. It doesn't have a community like CPAN (yet), but it's more readable and easier to maintain. It's about as fast as Perl.
Ruby is the newest trend in scripting languages. Ruby is fully OO, which means that everything is an object. Its advantage is that the code is very readable and it's pretty easy to learn if you are a beginner. The main disadvantage is its execution speed, which is rather poor.
That depends on which type of automation you are doing. If it is test automation, Perl is suggested because Perl has powerful extension modules via CPAN, an online Perl module inventory. If you only need a handy tool to process a simple source file, awk is very convenient. If you are planning to use the scripts to automate a big project, Perl is a better choice, with more features.
Then again, Python was designed from the start as an object-oriented language. Perl 5 has some OO features added on, but it looks to me like an awkward retrofit. Python has well-implemented OO features for multiple inheritance, polymorphism, and encapsulation. In summary, it seems to me that Python dominates Perl in most applications except for fairly short shell-script sorts of applications, and there they are roughly comparable.
If I had to pick one, it would have to be AWK. It's lightweight, has a small learning curve and has many useful functions like index and substr.
Depends on what you want to do, I regularly use all of them:
Shell for simple batching of commands with perhaps a loop or an if-statement.
Perl when I'm munging files and doing some text replacement and such things.
Python when I need more logic.
Under *nix you should use the right tool for the right work, which can be hard for the beginner since there are so many things to learn (after some 15 years as a *nix user I still find new things). My recommendation is to look at all the languages quickly to see what they can do, then start with using shell for everything; when your scripts get clunky, move them to something else.
Just write your commands one after the other, put them in a file and run that file with
prompt> bash file
and you have your first automation. Then learn about bash variables, loops and control structures.
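For instance, a first "automation file" might be nothing more than this (the directory names are only an illustration):
# tidy-downloads.sh -- a few commands, one after the other
mkdir -p ~/Downloads/archive
mv ~/Downloads/*.iso ~/Downloads/archive/ 2>/dev/null
echo "archive now contains:"
ls ~/Downloads/archive

# run it with: bash tidy-downloads.sh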
I second Python - powerful, simple, performant, and... actually quite fun, compared to Perl or bash. Also, if you know it, you'll find other uses; it's used in a lot of projects.
And not just as a "classic" scripting language - take, for example, the Twisted project. That's true for Perl too, I guess, but I like Python better by orders of magnitude myself.
Bottom line, though, is as has been said before: make sure you have the right tool for the job...
If you aim at having a simple script program "controlling" another (command-line, of course) program, then you should review Tcl/Tk, especially its dialect expect - they're simple and oriented towards that goal - it's very easy to create a script that controls ftp and even does a su with them!
Awk is very nice for processing text files - not as powerful as Perl, yet much simpler and more straightforward (and without the horrible syntax).
Of course, your mileage may vary, so I guess the best answer would be to ask you: what do you want to write scripts for? And then: Are you familiar with any language script? The answers to these questions will point you to the scripting language you should use, according to the pros/cons of each one and their main target.
On Linux? Choose your poison, basically. I like Python, others Ruby, still others Perl. Pick one and go for it. :-)
I'd say Python - it has very high readability, it is simple (no curly brackets, keywords as close to English as possible, etc.) and you can do almost everything in it, from simple to very complex things. It is also popular and fun to code.
This may sound a little odd: I had been using bash for over 10 years. I started using PHP5 and it was difficult at first, but now I have a much better reusable code base.
I wouldn't recommend it as a starting point though!
