Does cron have an authoritative standards document for cron syntax? - cron

Searching through examples of cron usage and cron-parsing tools, I see some tools with five fields and some with six (the sixth being a seconds field). Furthermore, some flavors support the ? character as valid input while others do not.
Does cron have an authoritative standards document for cron syntax? (E.g. an RFC or an ISO document). Or is cron just a loose collection of similar tools each with their own syntax rules?
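For reference, here is roughly the same schedule in the two most common flavors (the script path is just a placeholder). The first line is a classic five-field crontab entry (minute, hour, day of month, month, day of week); the second is a Quartz-style expression, which adds a leading seconds field and uses ? for "no specific value":
30 2 * * 1 /usr/local/bin/weekly-report.sh
0 30 2 ? * MON
Both mean "02:30 every Monday"; only the second flavor accepts ? and a seconds field.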

Related

building a quartz cron expression

Basically, I need to create complex Quartz cron expression strings with .NET. Do you have any idea how to achieve that?
For example: every day, every (2 hours, 1 minute, 10 seconds), between 10 AM and 7 PM.
Or at least, can anyone please help me understand cron strings so I can create them manually?
Of course I did my homework before asking here and looked at Quartz.NET tutorials, several jQuery plugins, and some cron string generator web sites, but all of them generate or give examples of simple cron expressions like "every second/minute/hour" or "on Fridays at 11 AM".
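Not a full answer, but for concreteness: as far as I understand Quartz syntax (seconds, minutes, hours, day of month, month, day of week), the closest single expression to the example above would be something like:
10 1 10-18/2 * * ?
which fires at 10:01:10, 12:01:10, 14:01:10, 16:01:10 and 18:01:10 every day. A true "every 2 hours, 1 minute, 10 seconds" interval cannot be expressed in one cron string; you would need a different trigger type for that.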

Stata programming language without syntax?

I recently got into Stata coming from a procedural/OO/functional background, and am having trouble understanding the basic elements of the language.
For example, I discovered that there is a syntax command which "allows programs to interpret the arguments the user types according to a grammar, such as standard Stata syntax". I infer this is the reason why some commands require a list of variables given as arguments to be separated by whitespace while others require a comma-separated list. But the idea of a program defining its own syntax, instead of the (parameter) syntax being enforced by the language, seems plain weird.
Another quite interesting construct is the syntax for macro definition and expansion (`macro') and the apparent absence of local variables as known in other languages.
Is there something like a "Stata for Java developers" document explaining the basic concepts of the language to people with my background?
PS: Apologies if this question seems unclear. Unfortunately, I can't formulate more concrete/clear questions at this point :(
I'm not exactly sure what you are looking for... but here are a few related points. Stata is kind of like writing a Unix shell script or a Windows batch file. Each line executes a command, and the first word is the command name. By convention, most commands have the following structure:
command [varlist] [=exp] [if expression] [in range] [weight] [using filename] [, options]
Brackets [ ] mean the element is optional (or unavailable, depending on the command). Some commands can be prefixed (such as by:, xi:, or svy:). The syntax of commands written by StataCorp and by experienced users is pretty consistent. But, because Stata users also write commands, you occasionally see things that are wacky.
When Stata users write commands, they are saved in .ado files (not .do) and are defined using the program command. (See help program and the "Ado files" section of the manual.) Writing a command is akin to writing a function in other languages (e.g., MATLAB).
The syntax command is used to help you write your own command. When you execute a command, everything following the command's name (command above) is passed to the program in the local macro `0'. The syntax command parses this local macro, so that you can reference `varlist' or `if' and so on. In theory, you could parse `0' yourself, but the syntax command makes it much easier for you and your users (as long as you are following the conventional syntax). I put an example at the bottom.
I don't know exactly what you mean by "apparent absence of local variables as known in other languages." Macros store a single string or a single number in memory. Here's a comment I wrote about Stata's local/global macros. They are indeed a unique feature of Stata's programming language. As their names imply, "local" macros are only available within a specific program (command) or .do file, while "global" macros are available throughout a Stata session.
I found that, once I got used to macros in Stata, I started to miss them in other languages. They are pretty handy. In addition to (local/global) macros and the main data set, you can also store "things" in memory with the scalar and matrix commands (and one or two other obscure things).
I hope that helps. Here's a list of resources that might help.
Example:
program define myprogram
    // parse what the user typed: a required varlist, an optional if
    // qualifier, and two options: hello() takes a string, yes is a flag
    syntax varlist [if], [hello(string) yes]
    // list the local macros that -syntax- just filled in
    macro list _0 _varlist _if _hello _yes
    summarize `varlist' `if'
    display "Here's the string in my hello option: `hello'"
    if !missing("`yes'") di "Yes is on"
    else di "Yes is off"
end
sysuse auto.dta, clear
myprogram rep78 headroom if price > 5000 , hello("world") yes
A few books offer an "X for Y users" approach, but generally between statistics packages. Regarding your question, I would recommend trusting your instincts first.
I started reading (programming and markup) code about ten years ago, and even though I cannot code in a large number of languages, I can read a few rather easily. I found Stata easy because most of its core commands are straightforward, with recurrent optional statements like over, if, or replace (to take a deliberately diverse set of examples) that are easy to understand and then apply.
When I teach Stata, I always have trouble getting students to use the help pages as much as I do (and I love the fact that they can be accessed so easily, just as in R). I explain the paradox by the fact that I can read the syntax notation straight away. Syntax is very well covered by the previous reply to your question.
The extra mile consists of opening the [R], [U], and especially [P] manuals that come with Stata in the utilities folder. There is a wealth of detail there, which will interest both programmers and statisticians in training. This is where I learnt to use macros and loops, beyond the obvious logic of commands like local/global and foreach/while (if I understand the term correctly, Stata is Turing-complete).
Stata is sometimes a bit of a pain when it comes to using single/double quotes in macro loops, but it's pretty straightforward otherwise. Have fun!

Linux utility to find the command for a particular task

In my initial days of using Linux, I usually had to search Google to find the command for
doing a particular task. Once I have the command name, I can view its usage using man command-name.
Similarly, I was thinking of a utility which, given a description of the task as an argument, tells you the command to do it and opens the man page for that command,
e.g.:
findUtility "find all files in a directory"
output:
ls
find
I want to know if a utility of that kind exists; if so, it would be very handy, especially for newbies.
If not, I might like to implement it.
Thanks!
Not as nice as you are asking about, but
apropos <keyword>
and
man -k <keyword>
can be very useful.
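For example, on a typical GNU/Linux box (the exact output varies with the installed man pages):
apropos "list directory"
dir (1)              - list directory contents
ls (1)               - list directory contents
vdir (1)             - list directory contents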
Parsing natural language is hard because there are thousands of ways to rephrase one sentence. Google does it best as far as I know. So, no, there is no such tool. There are handy and practical manuals that make it easy to find the right tool for the job. Also, there is a huge community behind coreutils (and Linux in general), so try both forums and IRC. Often, the latter is the fastest. And people tend to parse natural language as expected :)
apropos will do something like you suggest.
I guess what you want is Wikipedia's List of Unix utilities.
On Debian (and presumably derived systems), this is also useful:
sudo apt-cache search <keyword>
A few years ago, NetBSD decided to rewrite its apropos. The new implementation does a full-text search, with results ranked in order of relevance. It comes close to what you have asked for. See the output here:
https://man-k.org/search?q=find+all+files+in+directory

Preferred terminal scripting language [closed]

What language do you prefer for writing scripts for common tasks (backup, sync, etc.) and why? I'm not talking about programming web pages or applications.
I came up with this question when thinking about why bash is still popular. For example, Python looks more comfortable to me. Do you use bash just because you know it, or for some special reason?
If it's at the "create this directory, run this command, if that worked run this" level, I just use bash shell scripts.
Anything more complicated, say something that parses the output of a command and acts upon it, becomes a Python script. I find it just as quick to write, mainly because shell scripts are difficult to debug (bash error messages aren't exactly useful compared to Python's tracebacks), and the end code is much more readable.
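For illustration, the first level looks something like this (the paths are made up):
mkdir -p /backup/nightly &&
    rsync -a ~/projects/ /backup/nightly/ &&
    echo "backup finished OK"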
...why bash is still popular?
Well, because the Bourne shell (sh, and not necessarily bash) is pretty much available in any *nix installation.
A good command of sh and vi is extremely helpful when connecting to remote servers via telnet/ssh.
For local administration (when you own the server) you can use python/perl/ruby and customize them at will. But most certainly, any day you could be asked to "quickly fix" another server where the two defaults are installed: sh + vi.
That's why.
Unix has a philosophy of small tools which do one particular job and do it well. Often the easiest way to solve a problem is to use a combination of such tools. Shell scripting is the king of this, no question about it.
Of course, there's also the "when all you have is a hammer" syndrome :)
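The classic illustration of that philosophy is a pipeline of small tools, each doing one job, e.g. listing the most frequent words in a file (the filename is made up):
tr -cs '[:alpha:]' '\n' < report.txt | tr '[:upper:]' '[:lower:]' | sort | uniq -c | sort -rn | head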
This really depends on the type of script. I am starting to use Ruby for many sysadmin-type tasks; however, bash is still my first choice for quick and dirty scripts. The advantage of bash, in my eyes, is its interactive nature.
To give an example. The other day I was searching for some particular values in approx 200 compressed log files, re-formatting the output and mailing the results.
It was very easy to use bash to do this iteratively: zcat one file, piping the output to grep, retrying a few times to get the regex correct; then take that output and reformat it using awk, again retrying several times to get the format correct.
This process took a couple of minutes, after which I wrote the bash commands into a script file, parameterized it, wrapped a for loop around it, mailed the result, and the job was done.
I find this process much simpler in bash, just using command editing and retrying the regexes, than it would be with a separate script file that I have to keep editing and re-running.
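A hypothetical reconstruction of that kind of one-off script (the log path, default pattern, and address are all made up):
#!/bin/bash
# grep a pattern out of compressed logs, tag each hit with its file name,
# and mail the combined result
pattern=${1:-timeout}
for f in /var/log/app/*.log.gz; do
    zcat "$f" | grep -E "$pattern" | awk -v file="$f" '{ print file ": " $0 }'
done | mail -s "log summary" admin@example.com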
G'day,
Different tasks call for different languages. I tend to use either shell, usually bash, or Perl depending on the task.
Now I'm getting more comfortable with Ruby, for those tasks that might suit an OO approach, I'll use that.
HTH
cheers,
Perl would be best for handling system administration tasks. I have never come across a *nix system that does not have Perl installed.
Python for me at the moment. I like using Python because it has an interactive terminal that I can use to build up and execute the script as I go along, but I used Perl in the past.
Bash, or various sh dialects in the broader sense, can be assumed to be present on pretty much any Unix system. Often, production Unix systems (Solaris, HP/UX, AIX, etc.) have a very plain vanilla install; quite often they will not have Perl or Python installed. There may be company policies restricting this, so getting it installed may not be an option either. If you want something that will work on this type of platform, you will probably be limited to sh/sed/awk.
Bash is quite good for tasks that primarily involve running other commands, so you shouldn't underrate it. However, it rapidly becomes a write-only language at fairly trivial levels of complexity, so Perl or Python might be a better choice if you are programming something with a lot of internal processing.
For scheduling a backup, a bash script run from cron is quite possibly the best way to do the job. For something that involves parsing multiple log files, formatting the output to a summary status file and sending you an email notification if it notices certain types of events you might be better off with perl or python.
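For example, the backup case can be as small as a single crontab entry driving a script (paths hypothetical):
0 3 * * * /usr/local/bin/nightly-backup.sh >> /var/log/nightly-backup.log 2>&1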
Bash is the preferred scripting language for these kinds of tasks. It's pretty ubiquitous, and it's intended to be a glue language, in the sense that you can glue together a bunch of commands that you would normally do in the terminal pretty much unchanged.
I use Ruby for most of my shell scripting tasks. I can never remember some of the nuances/gotchas of bash scripting.
I use Ruby because I am most comfortable in it. It's one of the few languages in which I find myself struggling with the logic of my problem, rather than the syntax or restrictions of the language. Compare this to C++ or Perl, in which I get frustrated over pointers and sigils. I find recursive directory traversal and running system commands very easy to do in Ruby, e.g. using Ruby to rename files and edit their content.
I use perl, typically. The module library at CPAN makes many tasks simple. Net::SSH is a great tool for automating system administration tasks.

A better Linux shell? [closed]

I use bash, and have done so for over a decade, but occasionally I wonder whether there have been any significant new developments in the world of Linux shells.
A few years back Microsoft released PowerShell, which seemed very interesting. Is there any comparable innovation going on in Linux shells?
You do realize bash 4 has very recently been released with a load of new features and language additions?
The globstar shell option (**/foo) does a recursive search, and dirspell fixes typos during pathname expansion.
Associative arrays map strings to strings, instead of just numbers to strings.
The autocd shell option allows changing directories by just typing the directory path instead of having to put cd in front.
Coprocesses
&>> and |& redirection operators that redirect both stdout and stderr
Loads of additions to existing builtins for improved scripting convenience (a few of these features are sketched below).
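A minimal sketch of a few of those features (bash 4 or later):
shopt -s globstar autocd       # enable ** recursion and bare-path cd
declare -A capital             # associative array: string keys
capital[Japan]=Tokyo
echo "${capital[Japan]}"       # prints: Tokyo
ls **/*.sh                     # every .sh file anywhere under the current dir
make &>> build.log             # append both stdout and stderr to build.log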
Check out:
The "official" changelog: http://tiswww.case.edu/php/chet/bash/CHANGES
A short guide to some of the new features: http://bash-hackers.org/wiki/doku.php/bash4
I'd take a look at zsh or the fish shell.
One of the least touted features of Bash (and several other shells) is the ability to write your own loadables, and have the shell run them as builtins.
Let's say you write a loadable called on, and you want it to work like this:
on node 123 run some command
on class nodes run some command
on all nodes run some command
... etc ..
You can follow the simple examples on how to write a loadable, then enable it as a bash builtin via enable -f /path/to/loadable loadable_name
So in our case: enable -f /opt/bash/loadables/on on
... in your bashrc, and you've got it.
So, if you wanted bash to interpret your spiffy new language natively, you would write a loadable named 'use' or 'switch_to', then modify the parser to load a different grammar/runtime if a certain environment variable was set.
I.e.:
#!/bin/bash
switch_to my-way-cool-language
funkyfunc Zippy(int p) [[
jive.wassup(p) ]]
Most people are not going to want to hack their shell, however. I did want to point out that facilities exist to take Bash and make it the way you want it, without fiddling too much with core code.
See /path-to-bash-source/examples/loadables; you might be able to get that to fly where you work, since you're still using Bash.
You can run PowerShell on Linux via Pash. It uses Mono the way PowerShell uses .NET.
I think the "original improved shell" is ksh93. bash came into existence at a time when the ksh source code was proprietary; if ksh had been open-source then, it might not have been deemed necessary to have a new shell (though with the FSF you never know). ksh is worth studying, especially for its ability to be extended through C modules, but it's not a clear win over bash. bash's autocompletion is clearly superior, which may be enough to make bash a win overall. In any case bash and ksh have made substantial effort to converge, so differences are minor.
The other interesting shell is zsh, which attempts to be everything that ksh is while also including csh. Since I never saw any point or use to csh, I am not the right person to advocate for zsh. I will point out one unusual incompatibility: by default, in zsh a variable $var always expands to a single token, even if it contains spaces. This behavior is incompatible with all other sh-derived shells, and it is occasionally inconvenient, but really it makes a lot more sense than the original, and it saves a hell of a lot of quoting.
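To make that concrete, the same two lines behave differently in the two families of shells:
var="two words"
printf '%s\n' $var
# bash/ksh split the unquoted expansion and print two lines;
# zsh (by default) prints the single line "two words"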
csh was the first shell to have job control, but in my mind it (and its successors) has been superseded by bash and ksh. It was never much fun to write scripts in.
Finally, there are many tiny shells designed for rescue floppies (!) and other Spartan environments, but it sounds like you have little interest in those.
(In the matter of innovation, I should add that more than half the scripts I used to write as shell scripts are now Lua scripts. Others could say the same for Python or Ruby, or back in the day, Perl or Tcl. So I think the real innovation is migration away from the shell for programmable interaction at the command line.)
IIRC, PowerShell is object-oriented, whereas most Unix shells and utilities operate on text. In that regard, Squirrel Shell might interest you. I've never used it, though.
If you’re willing to lose sh compatibility, you could look at using a scripting language like Python or Tcl as your shell. rlwrap can be very handy if the interpreter doesn't provide line editing, command history, completion, etc.
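For example, tclsh has no line editing of its own, so wrapping it gives you history and editing for free:
rlwrap tclsh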
One philosophy regarding shells is that they should primarily only be used to connect processes with files (here is one page that espouses that approach). That said, people have written some remarkably complex software using them.
Shells don't come much more innovative than scsh, the Scheme shell. All the power of Scheme combined with the ability to run Unix commands and an embedded awk interpreter (written in Scheme, of course). The only drawback is that it needs a tiny bit of patching to build on 64-bit Linux.
It's not exactly a Bourne-style shell, but it's different. Of course, you have to learn Scheme - bonus!
If you like Ruby, you can use rush (a Ruby-Unix shell, not irb).
See the presentation here:
http://www.slideshare.net/adamwiggins/rush-the-ruby-shell-and-unix-integration-library
or the official website for more examples:
http://rush.heroku.com/
