What is the purpose of MaxL in Hyperion?

I am new to Essbase. I am creating an application using Essbase Administration Services (EAS), Planning, and Workspace.
If I am able to create and manage an application using the above services alone, then what additional features does MaxL offer?

There is excellent documentation on MaxL that Oracle provides.
If you like working from the command line, you might prefer MaxL to EAS and Workspace because there are no mouse clicks involved. Secondly, you are connecting directly to the Essbase server, which makes it that much faster as well.
Raam is correct that you can automate tasks using MaxL. For instance, if you have four calculation scripts that you need to run in a row, you can forgo kicking them off sequentially in EAS and instead write a MaxL script that runs them all in a row for you.
Another benefit of scripting the process is being able to take advantage of some nice logging features Essbase offers, writing the log to a directory that is convenient for you instead of one buried deep within the Hyperion or EPM system directories.
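For instance, a MaxL script along these lines chains the calculations and spools the log wherever you like; the application, database, calc, credential, and path names here are all made up for illustration:

```
/* run_calcs.msh - hypothetical example of chaining calc scripts */
spool on to '/home/essadmin/logs/nightly_calcs.log';
login admin password on localhost;
execute calculation 'Sample'.'Basic'.'CalcAll';
execute calculation 'Sample'.'Basic'.'CalcAgg';
logout;
spool off;
exit;
```

You would run it with the MaxL shell, e.g. `essmsh run_calcs.msh`, from the command line or a scheduler.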
However, since you're new to Essbase, you should know that MaxL does not provide all the functionality you have in the Hyperion Planning and Workspace UIs. There are command-line utilities for Planning and Workspace, but they are not based on MaxL.

MaxL is a command-line tool for administering Essbase, while EAS is a GUI-based tool. As with all command-line tools, MaxL can be used to automate tasks by creating scripts. For example, you can create a data-load script and run it whenever needed instead of doing the same from the UI.


Is there an integration between crontab and Rundeck?

I'm trying to find a tool where you can easily monitor cronjobs for the company I work at. Rundeck seems like the perfect tool for this but I can't figure out if it's possible to integrate the existing cronjobs into Rundeck. It's too much work to do this manually because there are hundreds of them.
If you know of another tool that can do this, feel free to recommend it!
I'd prefer it to be open source, but if it's paid and works correctly, I'm open to it.
Rundeck works as a "very enhanced cron replacement," so maybe the best approach in your case is to migrate your cron jobs to Rundeck and use its notifications for monitoring. This looks like a good starting point for you; don't forget to visit the official documentation.
What Mega mentions is correct: Rundeck can be used as a replacement for cron.
Even better, there is no need to configure all of those manually: Rundeck allows you to import job definitions via API call.
Steps:
Get Rundeck installed
Set up a job to run one of your cron jobs manually
Export that job
Use a script to make many copies of that file, each containing a different entry from your crontab*
Import all of those files via API calls
*You'll need to change at least the job name and the workflow (the sequence/commands section in the job definition file; you'll recognize it from the workflow step you configured)
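The copy-generation step above can be sketched in Python. The YAML layout and project name below are guesses based on a typical exported job; replace the template with your actual exported job file, keeping the name/schedule/command slots (note also that Rundeck schedules use Quartz-style cron, so some crontab expressions may need converting):

```python
# make_jobs.py - sketch: one Rundeck job definition per crontab entry.
# The TEMPLATE below is a hypothetical stand-in for your exported job file.
TEMPLATE = """\
- name: {name}
  project: migrated-cron        # hypothetical project name
  schedule:
    crontab: '{schedule}'
  sequence:
    commands:
      - exec: {command}
"""


def cron_to_jobs(crontab_lines):
    """Turn 'min hour dom mon dow command' lines into YAML job documents."""
    jobs = []
    for i, line in enumerate(crontab_lines):
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        fields = line.split(None, 5)      # 5 schedule fields + the command
        if len(fields) < 6:
            continue                      # skip env-var lines like PATH=...
        schedule, command = " ".join(fields[:5]), fields[5]
        jobs.append(TEMPLATE.format(name=f"cron-job-{i}",
                                    schedule=schedule, command=command))
    return jobs
```

Feed it the output of `crontab -l`, write each document to its own file, and then import the files through Rundeck's job import API endpoint.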

Automating the process of setting up a new server

I'm maintaining the servers of a web game. Whenever we add a new server to our game, I have to configure many environment details and install software on the new machine (for example, testing whether some ports on the new machine can be reached from other places, installing mysql-client, pv..., copying the game server files from another machine, and changing the MySQL server connection URL).
So my question is: "How can I automate the whole process of setting up a new server?" Most of the work I do is repetitive, and I don't want to do this kind of job whenever a new machine comes in.
Is there a tool that allows me to save the state of a Linux machine, so that next time we buy a new server I can copy the state of an old Linux machine to the new one? I think this is one way to automate the process of setting up a new game server.
I've also tried using some *.sh scripts to automate the process, but it's not always possible to get the return value of every command I execute. This is why I came here to ask for help.
Have you looked at Docker, Ansible, Chef, or Puppet?
In Docker you can build a new container by describing the required operations in a Dockerfile, and you can easily move containers between machines.
Ansible, Chef, and Puppet are systems-management automation tools.
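As a taste of the Ansible approach, here is a minimal playbook sketch covering the kinds of steps mentioned in the question (package install, copying game files, templating the MySQL connection URL); the host group, paths, and file names are all invented:

```yaml
# site.yml - hypothetical playbook for bringing up a new game server
- hosts: game_servers
  become: true
  tasks:
    - name: Install required packages
      apt:
        name: [mysql-client, pv]
        state: present

    - name: Copy the game server files
      copy:
        src: files/gameserver/
        dest: /opt/gameserver/

    - name: Render the MySQL connection URL into the config
      template:
        src: templates/db.conf.j2
        dest: /opt/gameserver/db.conf
```

Each task reports success or failure, which addresses the "return value of every command" problem from the question.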
I doubt you'll find such a tool to automate an entire customisation process, because it's rather difficult to define/obtain a one-size-fits-all Linux machine state, especially if the customisation includes logical/functional sequences.
But with good scripting you can obtain a possibly more reliable customisation from scratch (rather than copying it from another machine). I'd recommend a higher-level scripting language, though; IMHO regular bash/zsh/csh scripting is not good/convenient enough. I prefer Python, which gives easy access to every command's return code, stdout, and stderr, and with the pexpect module it can drive interactive commands.
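To illustrate the return-code point, here is a minimal Python sketch of running one setup command and capturing its result, which is the part plain .sh scripting makes awkward; the apt-get line in the comment is just a hypothetical real-world step:

```python
# run_step.py - sketch: run one setup command and inspect its result.
import subprocess
import sys


def run_step(cmd):
    """Run a command, returning (exit_code, stdout, stderr)."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode, result.stdout, result.stderr


# Portable demo; on a real server a step might look like
#   run_step(["apt-get", "install", "-y", "mysql-client"])   (hypothetical)
code, out, err = run_step([sys.executable, "-c", "print('ok')"])
assert code == 0 and out.strip() == "ok"
```

For interactive commands (password prompts and the like), the pexpect module mentioned above is the usual complement to this.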
There are tools to handle specific types of customisation (software package installations, config files), but not all I needed, so I didn't bother and went straight for custom scripts (more work, but total control). Personal preference, though; others will advise against that.

Extract a SAS Enterprise Guide project into a batch job runnable on a Unix server?

We have built a project in Enterprise Guide for the purpose of creating easily understandable and maintainable code. The project contains a set of process flows which should be run in a specific order. We need to run this project on a Linux server machine, where the SAS Metadata Server is running.
The basic idea is to extract this project into SAS code, which we would be able to run from the command line in Linux as a batch job.
Question 1:
Is there any other way to schedule a batch job on a Linux-hosted SAS server? I have read about VBS scripting for scheduling/running batch jobs, but for this to be done on a Linux server, an installation of WINE is required, which is almost completely out of the question on a production machine that already runs a number of other important applications.
Is there a way to specify a complete project export into SAS code, provided that I give the specific order in which to run the process flows? I have tried the ordered list feature, which lets you build a list of tasks to run in order (although there is no way to choose a whole process flow as a single task), but unfortunately the ordered list itself cannot later be exported as SAS code.
Our current solution is the following:
We export each single process flow of the SAS EG project into SAS code, and then create another SAS program with %include lines to run all the extracted code in the order we want. This is of course a possible solution, but definitely not the most elegant one.
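That wrapper approach might look something like this; the file names and paths are invented for illustration:

```
/* run_project.sas - hypothetical wrapper for the exported flows */
%include "/sas/code/flow1_extract.sas";
%include "/sas/code/flow2_transform.sas";
%include "/sas/code/flow3_report.sas";
```

The wrapper can then be run in batch from the Linux command line, e.g. `sas -sysin /sas/code/run_project.sas -log /sas/logs/run_project.log`, which makes it easy to drive from cron or another scheduler.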
Question 2:
Since I don't know how exactly the code is exported, are there any dangers I should bear in mind with the solution I chose?
Is there any other, more elegant way?
You have a couple of options from what I'm familiar with, plus I suspect if Dom happens by he'll know more. These answers are based on EG 6.1, which is the current version (ships with 9.4); it's possible some of these things may not be true in earlier versions.
First, if you're running Enterprise Guide from Windows, you can schedule the job locally (on any Windows machine with Enterprise Guide). You're not scheduling the server directly, you schedule Windows to launch an EG process that connects to the server and does its magic. That's how I largely interact with scheduling (because I have a fairly 'light' scheduling need).
Second, from the blog post "Four Ways to Schedule SAS Tasks", options 3 and 4 may be helpful for you: the SAS Platform Suite is designed in part for scheduling, and using SAS Management Console to schedule via operating-system tools is also very helpful.
Third, you may want to look into SAS Stored Processes, which should be schedulable. A process flow can be converted into a stored process.
For your specific questions:
Question 1: When you export a process flow or a project, at least in 6.1 you have the option to change the order in which the programs are exported. It's manual, so it's probably not perfect, but it does give you that option. (The code seems to be in creation order by default, which is sub-optimal.) The project export does group process flows together, but you don't have the option of manipulating the order of the process flows themselves; you have to move each program around, which would be tedious. It also, of course, gives you less flexibility if you need to run programs multiple times.
Question 2: As Stig Eide points out in the comments, make sure your system option LRECL is greater than 256 (the default), or you run some risk of code being cut off. In 9.2+ this is modifiable; just place LRECL=32767 in your config.sas file.

Testing automation tool suited for an operations team

I would like to start using a testing framework that does the following:
contains a process (the process can be a test) management engine; it is able to start processes (tests) with the help of a scheduler
it is distributed; processes can run locally or on other machines
tests can be anything:
a simple telnet to a given port (infrastructure testing)
a disk I/O or MySQL benchmark
a jar exported from Selenium that does acceptance testing
I will need to know whether the test passed or not
it has the capability to get real-time data from the test (something like graphite) -- this is optional
it allows processes to be built in many programming languages: Perl, Ruby, C, bash
it has a graphical interface
open source
written in any language as long as it doesn't use many resources; I would prefer C, Perl, or Ruby
runs on Linux
What it should not be:
an add-on to a browser: Selenium, BITE...
I do not want something focused on web development
I would like to use such a tool, or maybe collaborate on building one. I hope I was explicit enough. Thank you.
You might want to look at Robot Framework combined with Jenkins. Robot Framework is a tool written in Python for doing keyword-based acceptance testing. Jenkins is a continuous-integration tool which allows you to schedule jobs and distribute them amongst a grid of nodes.
Robot Framework tests can do anything Python can do, plus a whole lot more. It has a remote interface, which means you can write test keywords in just about any language; for example, I had a job where we had keywords written in Java, and in another job we used Robot Framework with .NET-based keywords. Alternatively, you can run Robot under Jython to run on the JVM, or under IronPython to run in a .NET environment.
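As a flavor of how small a keyword library can be, here is a minimal Python library sketch for the "telnet on a given port" style of infrastructure test from the question; the class name and keyword name are invented:

```python
# PortCheck.py - hypothetical Robot Framework keyword library for an
# infrastructure-style test: checking that a TCP port is reachable.
import socket


class PortCheck:
    """Robot Framework maps 'Port Should Be Open' to the method below, so a
    test case could read:
        Port Should Be Open    localhost    22
    """

    def port_should_be_open(self, host, port, timeout=2.0):
        # Try a plain TCP connect; raise (test failure) if unreachable.
        try:
            with socket.create_connection((host, int(port)),
                                          timeout=float(timeout)):
                return True
        except OSError:
            raise AssertionError(f"Port {port} on {host} is not reachable")
```

Because the keyword is just a Python method, the same class is equally usable from plain Python, and the pass/fail result surfaces naturally in Robot Framework's reports.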

Front End for Running Talend Jobs

I am looking for a front end for our operator to run our Talend jobs. We do not want him to have the ability to delete or modify jobs, only to run them and monitor their results. Any suggestions for tools for doing this?
Thanks
The subscription version of Talend (called Talend Integration Suite, or TIS) has precisely that. It's a web-based console called Talend Administration Center (TAC) and it allows an operator to run jobs and monitor their results -- among many other things. The permissions can be set in the way you described, so that the operator is not able to delete or modify the jobs.
In my company we use SOS Job Scheduler, which you can use with TOS out of the box as a general DIY scheduling solution. Just these days I'm working on a Talend custom component to integrate deeply between the two (log propagation, context parameter sharing, and so on; basically it lets you use the SOS API from a TOS job). I can speed up and put it on GitHub in a few days if you need it :)
You can export a job as a shell script (.sh); anyone can execute it but will not be able to delete or modify the job.
To see the execution details, put some tLog components (such as tLogRow) into your Talend jobs.
This is just a workaround, not a full solution, if you only have Open Studio (the free version).
I'm using Rundeck for manual and scheduled execution of Talend jobs. I find job setup and scheduling to be far easier in Rundeck than in SpagoBI (which I frequently use for reports and BI).
You could also use something like the Automic (UC4) Automation Engine. It has fine-grained security and is a commercial product. I've used it in the past for similar work.
