What language do you prefer for writing scripts for common tasks (backup, sync, etc.) and why? I'm not talking about programming web pages or applications.
I came up with this question while thinking about why bash is still popular. Python, for example, looks more comfortable to me. Do you use bash just because you know it, or for some special reasons?
If it's "create this directory, run this command, if that worked do run this"-level, I just use bash shell-scripts..
Anything more complicated, say something that parses the output of a command and acts upon it, becomes a Python script. I find it just as quick to write, mainly because shell scripts are difficult to debug (bash error messages aren't exactly useful compared to Python's tracebacks), and the end code is much more readable.
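To make the dividing line concrete, here is a minimal sketch of the "bash-level" case (the directory, command and archive path are made up for illustration):

#!/bin/bash
# create this directory, run this command, if that worked run this
mkdir -p /tmp/nightly &&
    tar czf /tmp/nightly/projects.tar.gz "$HOME/projects" &&
    echo "archive finished: $(date)"

Anything that would need to pick apart the tar output or branch on its contents is where the switch to Python happens.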
...why bash is still popular?
Well, because the Bourne shell (sh, and not necessarily bash) is available on pretty much any *nix installation.
A good command of sh and vi is extremely helpful when connecting to remote servers via telnet/ssh.
For local admin (when you own the server) you can use python/perl/ruby and customize them at will. But most certainly, any day you could be asked to "quickly fix" another server where only the two defaults are installed: sh + vi.
That's why.
Unix has a philosophy of small tools which do one particular job and do it well. Often the easiest way to solve problems is to use a combination of such tools. Shell scripting is the king for this, no questions about that.
Of course, there's also the "when all you have is a hammer" syndrome :)
This really depends on the type of script. I am starting to use ruby for many sysadmin type tasks however bash is still my first choice for quick and dirty scripts. The advantage of bash, in my eyes, is the interactive nature of it.
To give an example. The other day I was searching for some particular values in approx 200 compressed log files, re-formatting the output and mailing the results.
It was very easy to use bash to do this iteratively: zcat one file, pipe the output to grep, retry a few times to get the regex correct. Then take that output and reformat it using awk, again retrying several times to get the format correct.
This process took a couple of minutes, after which I wrote the bash commands into a script file, parameterized it, wrapped a for loop around it, mailed the result, and the job was done.
I find this process much simpler in bash, just using command editing and retrying the regexes, than it would be in a separate script file where I have to keep editing the script and retrying.
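Roughly, the end result of such a session looks like this once written down (the log path, the regex and the awk format are placeholders for whatever the real job needed):

#!/bin/bash
# report.sh PATTERN RECIPIENT -- hypothetical parameterized version of the one-off job
pattern=$1
recipient=$2
for f in /var/log/app/*.gz; do
    zcat "$f" | grep -E "$pattern"
done | awk '{ printf "%-20s %s\n", $1, $NF }' | mail -s "log report" "$recipient"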
G'day,
Different tasks call for different languages. I tend to use either shell, usually bash, or Perl depending on the task.
Now I'm getting more comfortable with Ruby, for those tasks that might suit an OO approach, I'll use that.
HTH
cheers,
Perl would be best in handling system administration tasks. I have never come across a *nix system that does not have Perl installed.
Python for me at the moment. I like using Python because it has an interactive terminal that I can use to build up and execute the script as I go along - but I used perl in the past.
Bash, or various sh dialects in the broader sense can be assumed to be present on pretty much any unix system. Often, production Unix systems (Solaris, HP/UX, AIX etc.) have a very plain vanilla install; quite often they will not have perl or python installed. There may be company policies restricting this, so getting it installed may not be an option either. If you want something that will work on this type of platform, you will probably be limited to sh/sed/awk.
Bash is quite good for tasks that primarily involve running other commands, so you shouldn't underrate it. However, it rapidly becomes a write-only language at fairly trivial levels of complexity, so Perl or Python might be a better choice if you are programming something with a lot of internal processing.
For scheduling a backup, a bash script run from cron is quite possibly the best way to do the job. For something that involves parsing multiple log files, formatting the output to a summary status file and sending you an email notification if it notices certain types of events you might be better off with perl or python.
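For instance, a minimal cron-driven backup along those lines might look like this (the paths and schedule are placeholders):

# crontab entry: run the backup script every night at 02:30
30 2 * * * /usr/local/bin/backup.sh

# /usr/local/bin/backup.sh
#!/bin/bash
# archive the home directory, date-stamped, logging errors for later inspection
tar czf "/backups/home-$(date +%F).tar.gz" /home/user 2>> /var/log/backup.err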
Bash is the preferred scripting language for these kinds of tasks. It's pretty ubiquitous, and it's intended to be a glue language, in the sense that you can glue together a bunch of commands that you would normally do in the terminal pretty much unchanged.
I use Ruby for most of my shell scripting tasks. I can never remember some of the nuances/gotchas of Bash scripting.
I use Ruby because I am most comfortable in it. It's one of the few languages in which I find myself struggling with the logic of my problem, rather than the syntax or restrictions of the language. Compare this to C++ or Perl, in which I get frustrated over pointers and sigils. I find recursive directory traversal and running system commands very easy to do in Ruby, e.g. using Ruby to rename files and edit their content.
I use perl, typically. The module library at CPAN makes many tasks simple. Net::SSH is a great tool for automating system administration tasks.
I want to make automation scripts for things like:
automated rpm installation
long package installations (a package list with many questions during the process)
automatically answering questions during application installation
Which is better for this task, Tcl or Expect?
What is unique to Tcl, and what makes it better than Expect?
Expect is actually just Tcl plus a few extra commands (notably spawn, expect and send), and it is designed for automating things. Tcl is just a programming language designed for building scriptable tools.
Given that, for automating RPM installation (especially when there's quite a few interactive questions) the right choice is definitely Expect. Just remember: you can use the power of Tcl inside Expect where you need it. You've got a full programming language available to you. That lets you do really complicated stuff if you're inventive…
I am starting to get proficient in a Linux environment and I'm trying to pick a weapon of choice in terms of command shell scripting (as I'm still a big n00b at this) that will help me (and others) manage, test and administer a set of server-side applications running in a *NIX environment.
My question is: What is(are) the preferred command shell(s) out there when the following criteria are considered:
How easy is it to learn/understand for a junior dev who has never had an exposure to shell scripting?
Is there a big pool of developers out there that know this shell script?
Is it safe and easy to use - will script errors be silent or give intelligent error output, and will it let the uninitiated shoot themselves in the foot?
How portable is it? Can I expect the same script to run on OpenSolaris as well as Red Hat or FreeBSD? (Granted, command syntax and options for a specific OS will change accordingly.)
How standard is it? Is it expected to be included on most *NIX distros, or does it have to be installed additionally?
I understand that there are camps out there who hold strong feelings for/against specific command shells; I am just looking for an informed opinion.
These days, just about any non-embedded (or large embedded) operating system has a POSIX:2001 a.k.a. Single Unix v3 compatibility layer. This is native on unix platforms (Linux, Mac OS X, Solaris, *BSD, etc.) and installable on other platforms such as Windows and Android. POSIX specifies a shell language, usually known as POSIX sh. This language is derived from the Bourne shell.
Most unix systems have one of two implementations of POSIX sh: ksh or bash, which have additional useful features compared to POSIX. However some less mainstream systems (especially embedded ones) may have only POSIX-mandated features.
Given your objectives, I see three choices:
Restrict yourself to POSIX sh. Pro: you don't have to worry about differing variants, since there's a standard and compliant implementations are readily available. Con: you don't benefit from bash and ksh's extensions.
Use the intersection of ksh and bash. This is attractive in appearance, but it does mean you have to use two reference documents instead of just one — and even the features that bash and ksh have in common don't always use the same syntax. Figuring out which one you want to use on a given system is also a pain.
Choose one of ksh or bash. Both bash and ksh are available on all unix-like platforms and on Windows. Both have an open source implementation (the only one for bash, ATT ksh93 for ksh) that can be installed on most platforms. I'd go for bash over ksh for two reasons. First, it's the default on Linux, so you'll find more people who're used to it. Second, there are systems that come with an older, less-featured implementation of ksh; even if you can install ksh93, it's another thing you have to think about when deploying.
Forget about csh for scripting, and forget about zsh if you want common default availability.
See also What are the fundamental differences between the mainstream *NIX shells?, particularly the “for scripting” part of my answer.
Note that shell programming involves other utilities beyond the shell. POSIX specifies those other utilities. “Bash plus other POSIX utilities” is a reasonable choice, distinct from “POSIX utilities (including sh)”.
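To make the POSIX sh versus bash/ksh trade-off concrete, here is a small sketch of a bashism next to a portable equivalent (host names are made up):

# bash/ksh only: arrays and the [[ ]] test
hosts=(web01 web02 db01)
if [[ ${#hosts[@]} -gt 2 ]]; then echo "more than two hosts"; fi

# POSIX sh: no arrays; use the positional parameters and [ ]
set -- web01 web02 db01
if [ "$#" -gt 2 ]; then echo "more than two hosts"; fi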
csh is almost always wrong.
Z shell (zsh)
It's said that zsh is currently the most powerful shell, so I would recommend trying it.
No matter which shell you learn, their syntax is very similar. Only built-in commands may differ slightly. But don't choose the old and unmaintained ones.
Bash is the most popular. But almost every command in bash works in zsh the same way. There are some exceptions of course.
AFAIK, every shell handles it the same way. But be warned - shells are stupid, they are not as smart as programming languages.
I saw zsh working on all Linuxes, FreeBSD and OpenSolaris.
See 4. Distros have zsh in their repos.
Why I prefer zsh (Z shell) to bash:
file matching like this: for file in ./**/*.java; do ... (I mean the ./**/*.ext pattern)
it wants me to confirm when I do rm * :)
tab-autocompletion is a lot better; I can write dmdomi[tab] and it suggests dnddomainname. java wants a class name as the first parameter, and zsh will suggest all classes available in the package and all subpackages.
But you are not limited to zsh only. If something does not work for you, you just write it in bash or sh. This is what the "#!/bin/bash" at the top of the script is for. :-)
To start quickly, use my .zshrc config: http://www.rozne.geozone.pl/.zshrc The only thing you should change there is export LANG="pl_PL.UTF-8". You probably don't want Polish locale.
Shell scripts for any *nix shell are generally deceptively simple. Easy things are usually easy, sometimes hard things are easy, sometimes easy-seeming things are hard. No shell is particularly better than the others in this area but some are worse (I can't seriously recommend csh). Some will say that bash is the worst 'modern' shell, which may be true but you can't completely escape it anyway.
There's an argument to be made that using the most 'popular' shell is best for maintainability for the same reason Windows is best (and I'm not saying that it is): It's easy to find people you can hire who know how to use it. There are simply more people who have at least a passing familiarity with bash-specific features, say, than ksh or zsh. Finding people who actually understand what they're doing is another matter.
All shells have various gotchas, corner-cases and weird behaviors. Mostly it comes down to what you're used to. Shooting yourself in the foot is what I'd call a grand Unix tradition and no *nix shell can truly keep you safe.
Nearly every shell you'll see is highly portable to almost every platform. Even though this is true you won't necessarily be able to run the same (say) bash script on three different boxes unless you were careful about what utilities you used and which options you passed them. Writing portable shell scripts is hard for reasons having nothing to do with which shell they're written for.
Nearly every Linux distribution uses bash by default and has most shells available. FreeBSD includes sh, csh and tcsh by default, with bash and others in ports. A long time ago Mac OS X used tcsh by default, but it now uses bash, and includes zsh along with most common shells. Beyond that I cannot comment.
Personally I use bash out of (mostly) inertia. If I weren't so familiar with it already I would use zsh instead.
bash is the standard and is very good at interactive use (good completion supporting many programs, history, readline support, many kinds of string expansion). It is also good at scripting, for a shell (arrays and hashes, quoting, string manipulation); though writing reliable scripts requires you to learn a lot more.
If you want your programs to be able to grow, work with elaborate data structures, and use some useful libraries, you should learn a language like python, ruby or perl. Most of those have interactive interpreters as well, not as convenient as a shell but useful for quick testing. IPython, for Python, is particularly useful; it lets you explore documentation very easily, can load and reload source, includes a debugger. It also includes some standard shell commands and can pass the rest to a standard shell by prefixing them with a !.
Thanks to being interactive, most shells are easy enough to learn once you start using them exclusively.
I believe bash, and the posix subset, is better known by a wide margin. But the languages I mentioned are as well known as many shells.
You can easily shoot yourself in the foot, convenience often makes undesirable things easy.
and 5. Portability of the shell itself shouldn't be a problem; you may need to recompile to get more modern features on some of the OSes you mention. Using a full-blown language with its own libraries will help smooth over the variation across your multiplicity of platforms.
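A minimal sketch of the bash scripting features mentioned at the start of this answer (arrays, hashes, string manipulation); the associative array needs bash 4, and the paths are just for illustration:

files=(/var/log/*.log)                  # indexed array filled from a glob
declare -A sizes                        # associative array ("hash"), bash 4+
for f in "${files[@]}"; do
    sizes[$f]=$(wc -c < "$f")
done
name=${files[0]##*/}                    # string manipulation: strip the directory part
echo "$name is ${sizes[${files[0]}]} bytes"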
I use bash, and have done so for over a decade - but occasionally I wonder whether there has been any significant new developments in the world of Linux shells.
A few years back Microsoft released PowerShell, which seemed very interesting. Is there any comparable innovation going on in Linux shells?
You do realize bash 4 has very recently been released with a load of new features and language additions?
The globstar shell option (**/foo) does a recursive search; dirspell fixes typos during pathname expansion.
Associative arrays: map strings to strings, instead of just numbers to strings.
The autocd shell option allows changing directories by just typing the directory path instead of having to put cd in front.
Coprocesses
&>> and |& redirection operators that redirect both stdout and stderr
Loads of additions to existing builtins for improved scripting convenience.
Check out:
The "official" changelog: http://tiswww.case.edu/php/chet/bash/CHANGES
A short guide to some of the new features: http://bash-hackers.org/wiki/doku.php/bash4
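A quick sketch exercising a few of the bash 4 additions listed above (the build command is just a placeholder):

shopt -s globstar autocd               # enable ** recursive globbing and auto-cd
declare -A owner                       # associative array: strings map to strings
owner[/etc/passwd]=root
for f in **/*.sh; do echo "found $f"; done
some_build_command |& tee build.log    # |& is shorthand for 2>&1 |
some_build_command &>> build.log       # &>> appends both stdout and stderr to the file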
I'd take a look at zsh or fishshell.
One of the least touted features of Bash (and several other shells) is the ability to write your own loadables, and have the shell run them as builtins.
Let's say you write the loadable 'on', and you want it to work like this:
on node 123 run some command
on class nodes run some command
on all nodes run some command
... etc ..
You can follow simple examples on how to write a loadable, then enable it as a bash built in via enable -f /path/to/loadable loadable_name
So in our case, enable -f /opt/bash/loadables/on on
...in your bashrc, and you've got it.
So, if you want to have bash interpret your spiffy new language natively, you would write a loadable named 'use' or 'switch_to', then modify the parser to load a different grammar / runtime if a certain environment variable was set.
I.e.:
#!/bin/bash
switch_to my-way-cool-language
funkyfunc Zippy(int p) [[
jive.wassup(p) ]]
Most people are not going to want to hack their shell, however. I did want to point out that facilities exist to take Bash and make it the way you want it, without fiddling too much with core code.
See /path-to-bash-source/examples/loadables, you might be able to get that to fly where you work, since you're still using Bash.
You can run PowerShell on Linux via Pash. It uses Mono the way PowerShell uses .NET.
I think the "original improved shell" is ksh93. bash came into existence at a time when the ksh source code was proprietary; if ksh had been open-source then, it might not have been deemed necessary to have a new shell (though with the FSF you never know). ksh is worth studying, especially for its ability to be extended through C modules, but it's not a clear win over bash. bash's autocompletion is clearly superior, which may be enough to make bash a win overall. In any case bash and ksh have made substantial effort to converge, so differences are minor.
The other interesting shell is zsh, which attempts to be everything that ksh is while also including csh. Since I never saw any point or use to csh, I am not the right person to advocate for zsh. I will point out one unusual incompatibility: by default, in zsh a variable $var always expands to a single token, even if it contains spaces. This behavior is incompatible with all other sh-derived shells, and it is occasionally inconvenient, but really it makes a lot more sense than the original, and it saves a hell of a lot of quoting.
csh was the first shell to have job control, but in my mind it (and its successors) has been superseded by bash and ksh. It was never much fun to write scripts in.
Finally, there are many tiny shells designed for rescue floppies (!) and other Spartan environments, but it sounds like you have little interest in those.
(In the matter of innovation, I should add that more than half the scripts I used to write as shell scripts are now Lua scripts. Others could say the same for Python or Ruby, or back in the day, Perl or Tcl. So I think the real innovation is migration away from the shell for programmable interaction at the command line.)
IIRC, PowerShell is object-oriented, whereas most unix shells and utilities operate on text. In that regard, Squirrel Shell might interest you. I've never used it, though.
If you’re willing to lose sh compatibility, you could look at using a scripting language like Python or Tcl as your shell. rlwrap can be very handy if the interpreter doesn't provide line editing, command history, completion, etc.
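For example, Tcl's tclsh has no line editing of its own, so a common trick (assuming rlwrap is installed) is:

rlwrap tclsh                   # adds readline-style editing and history to the Tcl REPL
alias tclsh='rlwrap tclsh'     # make the wrapper the default for interactive use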
One philosophy regarding shells is that they should primarily only be used to connect processes with files (here is one page that espouses that approach). That said, people have written some remarkably complex software using them.
Shells don't come much more innovative than the Scheme Shell. All the power of Scheme combined with the ability to run Unix commands and an embedded awk interpreter (written in Scheme, of course). The only drawback is that it needs a tiny bit of patching to build on 64-bit Linux.
It's not exactly Bourne-shell, but it's different. Of course, you have to learn Scheme - bonus!
If you like Ruby, you can use rush (a Ruby-Unix shell, not irb).
See the presentation here:
http://www.slideshare.net/adamwiggins/rush-the-ruby-shell-and-unix-integration-library
or the official website for more examples:
http://rush.heroku.com/
I have recently started moving into the world of Linux development. I wanted to learn some new things and thought bash might be fun. As I learn more about bash programming I have found that there is quite an assortment of useful tools to be used (such as grep, tr, awk, etc.). There are so many that I just do not know which ones are "vital" to learn.
Shell scripting commands depend heavily on the configuration of the system itself, and can change drastically over time, unlike most programming languages (where a core library ships with the language itself and represents the "core" set of commands that a programmer would use when interacting with the outside world). Therefore,
As a modern Linux shell script programmer, which command line tools should I be familiar with?
Compressing and uncompressing various archives.
Using the man pages
alias is always helpful
As mentioned by others: sed & grep (regular expressions are good to know in general), sort, head, tr, cut
echo & printf (their differences and when to use what)
Getting the return value (not as useful but still handy when writing scripts) via $?
top, ps, kill, and how to background/foreground/suspend a process (a short example follows this list)
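A few of those in action (the long-running command is just a placeholder):

sleep 300 &              # start a (placeholder) long-running job in the background
jobs                     # list background jobs
fg %1                    # bring job 1 to the foreground; Ctrl-Z suspends it, bg resumes it in the background
grep -q root /etc/passwd
echo $?                  # exit status of the last command: 0 means grep found a match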
The important thing is combining the many tools that exist; that is where most of them become extremely useful. Using man whenever you are stuck is probably the most important thing.
I'd recommend especially that you become familiar with locate, grep and find. sed, awk and vim are next, and around these are cat, less, tail / head, ls (yes, ls!), and the many ways in which bash can help you.
Especially about Bash: beware of bashisms!
Depends on what you're doing, obviously, but I get a lot of mileage out of find, grep, rsync, and ssh. The simple ones are useful, too: cat, tail, wc, ps. There's a lot you can do with a for loop, too, and wildcard syntax is essential. For example,
$ for i in {app,web}{01,02}; do ssh $i date; done
That will ssh into hosts app01, app02, web01, and web02 and execute the date command on each one.
Try looking at commandlinefu. People come up with all sorts of things there, and you're bound to find examples of stuff which may be useful in the future.
But generally, the top used commands (by John) are nice as guidance.
And of course, here be dragons - a list of stuff you shouldn't do: the deadly ones.
You should know some console-based text editor. Pico might suffice. I myself am a vi guy, though Emacs is also acceptable. (Though I will recommend vi: it is a de facto standard on nearly any Unix platform, and things like grep/sed behave very similarly to vi.)
Others:
Screen: extremely useful when you don't have a GUI or don't want to/can't open up many terminal windows or PuTTY sessions. It allows you to have multiple shell sessions open, and you can toggle between them (and many other things).
top: good for monitoring processes, CPU usage, and memory usage
watch: runs a command every "n" seconds and displays its output. E.g., watch -n 1 "ls -aio" runs "ls -aio" every 1 second.
You should probably know everything on this list:
http://www.faculty.ucr.edu/~tgirke/Documents/UNIX/linux_manual.html
Maybe not everything is essential all the time, but having at least a cursory overview of each helps a lot with basic functionality.
perl, xargs, lsof, find, grep, bash, tar, gzip, tr, tail, diff, patch, and bc.
And everything that is in SUS2 (Single UNIX Specification).
Like you mentioned, learn awk, sed and grep. They will be very good friends of yours.
Also, very important: learn to properly use a text editor such as vim.
I would also recommend you to get familiar with a good scripting language such as perl or python.
Don't worry about the commands directly. Rather when you find yourself struggling with something try a few quick Google and man page searches and see how you can improve what you're trying to do right then and there. Keep it relevant and you will get more useful results.
For writing scripts for process automation on the Linux platform, which scripting language is better: shell script, Perl, Python, or something else? I am new to all of them, so I'm just wondering which one to go for.
The answer is: Whatever best fits the job!
My rule of thumb:
Bash - for a short script that might need a for loop to do something repetitively.
Perl - anything to do with some kind of text processing or file processing, especially if it's a one-off. Just do a dirty, nasty perl script and be done with it.
Python - if it's something you might want to do again, or something very like it. Then at least you have a chance of being able to reuse the script.
Go for all three of them, start with bash/awk/sed plus fileutils (grep, find, and so on) and then move up the abstraction hierarchy with perl and python.
That way you will be able to decide for yourself which one fits your needs best. I say start with bash and friends because they are ubiquitous; some machines will not have perl or python installed and you'll feel helpless there, especially in traditional unix land (i.e., not Linux).
When choosing a scripting language to help automate your linux / unix environment, the most important thing in my opinion is... your replacement :-)
By which I mean the next / other sysadmins who may have to maintain your scripts. I am currently working in an environment where the lead Unix guy is a real script head, but he has mainly restrained himself to using bash, with some perl and windows vbscript thrown in for good luck. At least it has forced me to brush up my perl.
While agreeing with the other comments here, my suggestion would be to master bash - where possible, do as much as possible in bash, as most people know it and can maintain/debug it. And it will be the most portable. Used with sed & awk it is particularly powerful.
When you have that mastered, you can come back here and ask "What scripting language should I learn after bash?" :-)
JB
I use Perl for anything beyond extremely simple scripts.
I also 'use warnings', 'use strict', avoid backticks, and call system as 'system($command, @and_args)'. And because I like it to be maintainable: IPC::Run (for pipes), File::Fu (for filenames, tempfiles, etc.), YAML (for configs or misc data), and Getopt::Helpful (so I can remember what the options were).
I think it depends on how complex the tasks are you want to automate. Personally, I've always gone with shell-scripts, which enables you to call on awk, sed, grep, find, ls, cat, etc. which can be combined together to do pretty much anything you can achieve using perl or python. On the other hand, if the processes you want to automate are complex (e.g., not just a linear sequence of steps) then you'll probably find that writing the scripts in perl or python (or even ruby!) is much quicker and makes them easier to maintain.
I'd recommend bash, awk, and sed.
bash - http://tldp.org/LDP/abs/html/
awk - http://www.uga.edu/~ucns/wsg/unix/awk/
http://www.grymoire.com/Unix/Awk.html
sed - http://www.ce.berkeley.edu/~kayyum/unix_tips/sedtips.html
http://www.grymoire.com/Unix/Sed.html
Just some ideas.
Depends on the complexity and problem domain of the task at hand.
Bash scripts are quick and dirty for simple system automation tasks. For anything more complex than moving files around and running commands, I'd personally say Perl is next in line as the de facto go-to sysadmin automation tool. For more focus on code reuse and readability/maintainability, I'd want to step up to Python or Ruby.
PHP can also be used to automate tasks, however it is not widely accepted for this purpose in my experience.
It really comes down to what language you are most interested in learning, most can be used for automation, in addition to many other things.
I prefer shell scripts only for very small tasks. Writing robust shell scripts requires a lot of knowledge about possible pitfalls, which you only learn by doing. But learning even the basics will increase your productivity a lot!
If I need complex logic, I usually use Python. By complex I mean anything that has more than two if-statements =)
Perl is okay for its original purpose, but be warned that many of the perlisms you learn are not applicable anywhere else.
Python and Ruby are roughly equivalent. I'd recommend you learn one of them well and check out a tutorial on the other. I prefer Python but it really comes down to personal preference.
To summarize: Learn basics of shell scripts. Learn at least Python or Ruby well.
If you want a minimalistic, compact and fast solution (faster than Python/Ruby), then go for the Lua scripting language :-)
However, Lua's speed and code compactness are achieved by a relatively small language core, so if you want "batteries included" (i.e., a very big "standard" library), then Lua is not for you. Otherwise, people who come from the C/C++ world really enjoy Lua's speed :-)
p.s.
Lua vs Ruby 1.9 benchmark (you can also look at Lua vs Python 3):
http://shootout.alioth.debian.org/u32/benchmark.php?test=all&lang=lua&lang2=yarv
I have been getting Python recommended all the time. It's supposed to let you do anything. For small tasks I use shell scripts, though.
I would usually say the one you know best which can achieve the results you want. Like all religious wars, and after learning a large number of languages, you realise that you can do most things in most languages (Note I did say most). I use Perl. It is maybe not as up to date as Python or Ruby, but it does have massive library support from CPAN. And I have not found anything I can't do in it yet. When I do I will look at other languages to find out which one can fill that gap.
If I were starting today, maybe I would pick Python or Ruby, but I don't know enough about them to make a judgement call. Do any of your friends/colleagues know scripting languages? That could help you massively, as support when learning a new language is very important.
Good luck
Well, it's like this:
Perl is not the most user-friendly scripting language, but it has CPAN (the Comprehensive Perl Archive Network), which contains thousands of libraries that implement almost anything you may think of, and Perl is really powerful when it comes to text processing. The disadvantage is that perl code is kinda hard to maintain (if you don't know it very well).
Python is a scripting language that is becoming more and more popular among scripters. It doesn't have a community like CPAN (yet), but it's more readable, and it's easier to maintain. It's as fast as perl.
Ruby is the newest trend in scripting languages. Ruby is fully OOP, which means that everything is an object. Its advantage is that the code is very readable, and it's pretty easy to learn if you are a beginner. The main disadvantage is its execution speed, which is rather poor.
That depends on which type of automation you are doing. For test automation, Perl is suggested because it has powerful extension modules via CPAN, an online Perl module inventory. If you only need a handy tool to process a simple source file, awk is very convenient. If you are planning to use the scripts to automate a big project, Perl is a better choice, with more features.
Again, Python was designed from the start as an object-oriented language. Perl 5 has some OO features added on, but it looks to me like an awkward retrofit. Python has well-implemented OO features for multiple inheritance, polymorphism, and encapsulation. In summary, it seems to me that Python dominates Perl in most applications except for fairly short shell-script sorts of applications, and there they are roughly comparable.
If I had to pick one, it would have to be AWK. It's lightweight, has a small learning curve and has many useful functions like index and substr.
Depends on what you want to do, I regularly use all of them:
Shell for simple batching of commands with perhaps a loop or an if-statement.
Perl when I'm munging files and doing some text replacement and such things.
Python when I need more logic.
Under *nix you should use the right tool for the right job, which can be hard for a beginner since there are so many things to learn (after some 15 years as a *nix user I still find new things). My recommendation is to look at all the languages quickly to see what they can do, then start with using shell for everything; when your scripts get clunky, move them to something else.
Just write your commands one after the other, put them in a file and run this file with
prompt> bash file
and you have your first automation. Then learn about bash variables, loops and control structures.
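For example, a first script might be nothing more than this (the file name and commands are made up):

# mytasks.sh -- commands written one after the other
mkdir -p ~/reports
date >> ~/reports/run.log
df -h >> ~/reports/run.log

Running it with "bash mytasks.sh" gives you a repeatable record of what was done; variables and loops can replace the fixed paths as you learn them.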
I second Python - powerful, simple, performant, and... actually quite fun compared to perl or bash. Also, if you know it, you'll find other uses for it; it's used in a lot of projects.
And not just as a "classic" scripting language - take for example the Twisted project. That's true for Perl too, I guess, but I like Python better by orders of magnitude myself.
The bottom line, though, as has been said before, is to make sure you have the right tool for the job...
If you aim to have a simple script "controlling" another (command-line, of course) program, then you should review Tcl/Tk, especially its dialect Expect - they're simple and oriented towards that goal - it's very easy to create a script that controls ftp or even does a su with them!
Awk is very nice for processing text files - not as powerful as perl, yet much simpler and more straightforward (and without the horrible syntax).
Of course, your mileage may vary, so I guess the best answer would be to ask you: what do you want to write scripts for? And then: are you familiar with any scripting language? The answers to these questions will point you to the scripting language you should use, according to the pros/cons of each one and their main target.
On Linux? Choose your poison, basically. I like Python, others Ruby, still others Perl. Pick one and go for it. :-)
I'd say Python - it has very high readability, it is simple (no curly brackets, keywords as close to English as possible, etc.), and you can do almost everything in it, from simple to very complex things. It is also popular and fun to code.
This may sound a little odd - I had been using bash for over 10 years. I started using PHP5 and it was difficult at first, but now I have a much better reusable code base.
I wouldn't recommend it as a starting point though!