Change survival plot template back to default - graphics

How do I change the proc lifetest survival plot template back to the default template?

Generally, when you change a template, you change it in a particular template store. If you run
ods path show;
SAS writes the current template search path to the log, with the store you are currently writing to marked (UPDATE). For me this returns:
WORK.TEMPLAT(UPDATE)
SASUSER.TEMPLAT(READ)
SASHELP.TMPLMST(READ)
The original template should be stored in SASHELP.TMPLMST. If you overwrote it, it would be in the (UPDATE) store - in my case, WORK.TEMPLAT, although it's often in SASUSER.TEMPLAT.
Assuming your (UPDATE) is still the same as it was when you made the change, this should work:
proc template;
delete Stat.Lifetest.Graphics.X;
run;
where X is the template you modified. That will remove the 'customized' version and SAS will use the base version.
If your customized version is in SASUSER.TEMPLAT, and that store is currently READ but was UPDATE when you made the change, you can make it writable again by running
ods path sasuser.templat(update) sashelp.tmplmst(read);
and then rerunning the delete.
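Putting the two pieces together, a minimal sketch (it assumes the customized copy lives in SASUSER.TEMPLAT; for the survival plot, X is typically Stat.Lifetest.Graphics.ProductLimitSurvival):
ods path sasuser.templat(update) sashelp.tmplmst(read);
proc template;
   /* drop the customized copy so the SASHELP original is found */
   delete Stat.Lifetest.Graphics.ProductLimitSurvival;
run;
ods path reset;  /* restore the default ODS path search order */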

Related

How to reference the most current Physical Sequential (PS) file in JCL

I wanted to create a job where I need to consider the latest file available as the input file.
The file name format is as below: FILE1.TEST.TYYMMDD
Is there any way to identify the latest file, based on the date present in the file name, via JCL?
P.S. GDG versions are not created in the existing process; only a PS file is created.
Thank you
No.
You indicate that GDGs are not created in the existing process. GDGs would be the best way to accomplish your goal. Absent GDGs, you must write code.
You could accomplish your goal by writing (C, clist, COBOL, PL/I, Rexx) code using the LMDINIT and LMDLIST ISPF services. Then you would execute your code by running ISPF in batch. Many mainframe shops have a cataloged procedure to execute ISPF in batch.
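For illustration, a rough Rexx sketch of that approach (untested; LMDINIT/LMDLIST/LMDFREE are the documented ISPF services, but the level filter is an assumption, and a two-digit Tyymmdd only sorts correctly within one century):
/* Find the newest FILE1.TEST.Tyymmdd data set */
ADDRESS ISPEXEC
"LMDINIT LISTID(LID) LEVEL(FILE1.TEST.T*)"
latest = ''
DO FOREVER
  "LMDLIST LISTID("lid") OPTION(LIST) DATASET(DSN)"
  IF rc > 0 THEN LEAVE               /* RC=8: end of data set list */
  IF dsn > latest THEN latest = dsn  /* Tyymmdd compares lexically */
END
"LMDLIST LISTID("lid") OPTION(FREE)"
"LMDFREE LISTID("lid")"
SAY 'Latest data set:' latest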
I agree with @cschneid that there is no platform way to handle this. However, I want to point out that GDGs are the platform's way of managing PS files for access in a relative form.
Your comment:
GDG versions are not created in existing process. Only PS file is created.
That statement didn't make sense to me. A GDG is not a file type like physical sequential (PS) or partitioned (PO); it's a convention that allows relative references to files created over time, which sounds like exactly what you want. I've only seen GDGs used for PS files.
Putting the date in the file name can have its uses, but to z/OS it's only part of the file name, not metadata that the system operates on (like the G0000V00 generation suffixes in GDGs).
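For contrast, here is roughly what the GDG route looks like (a sketch; the base name FILE1.TEST.DAILY and the limit are assumptions). Define the base once with IDCAMS:
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  DEFINE GDG (NAME(FILE1.TEST.DAILY) LIMIT(30) SCRATCH)
/*
After that, each run writes a new generation as DSN=FILE1.TEST.DAILY(+1), and any job can read the most recent one with DSN=FILE1.TEST.DAILY(0) - no date parsing needed.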

SVG export with MiKTeX using IguanaTex / TeX2img

I'm trying to transfer some formulas from a ".tex" document into a PowerPoint presentation. Using the IguanaTex add-in, I'm able to insert my LaTeX code into the presentation, which is then compiled and by default converted to a ".png" graphic. That works fine.
But I would like to have the output vectorized with TeX2img to make the presentation look better; however, the conversion either fails (MiKTeX doesn't recognize the document class) or the size and location of the output change, distorting it.
What could be the problem? (Or: how do I reset the TeX2img settings so I can start looking for the cause myself?)

Can I export a matrix to Excel from SAS University Edition on a PC under VMware?

I wish to write a "big" matrix of p rows and c columns, e.g.
3,000 rows and 20 columns, to Excel on my PC. It's not easy, and I'm wondering if I can simplify it by using fixed numbers for the rows and columns instead of:
array mat {&periods,&columns};
Right now, I'm on the free version of SAS called "SAS University Edition", which has only community help.
I would like to output it to Excel, but when using VMware on a PC to run SAS Studio, you can't write directly to disk (although there is a myfolders folder).
I tried this, but got this error log:
proc export data=WORK.CPAPMONTE1
file= "/folders/myfolders/outfile1.xlsx"
DBMS=xlsx
;
run;
ERROR: XLSX file can not be created -> /folders/myfolders//outfile1.xlsx. Make sure the path name is correct and that you have
write permission.
ERROR: Too many variables for the output file
I figure that the second error is just a consequence of the first, which shows a // instead of a /.
I have defined a special folder for my data in SAS University Edition as:
/folders/myfolders/CPAP1
but I haven't figured out how to point there.
You can write directly to disk; you need to set up a shared folder similar to myfolders and then reference it as
/folders/myshortcuts/myname
The folder and shortcut names must be exactly correct, and all lower case, because the path is case sensitive. If you have myfolders set up, all you need to do is right-click the folder > Properties and you'll get the path to the folder. Use that in your export. A similar process works for a custom shared folder you set up.
SAS University Edition Help Center/FAQ
https://support.sas.com/software/products/university-edition/faq/main.htm
Your specific question - How do I create a folder shortcut to my existing SAS files?
https://support.sas.com/software/products/university-edition/faq/shared_folder_access_existing.htm
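For example, a corrected version of the export from the question, assuming the standard myfolders shortcut (note OUTFILE= is the documented option rather than FILE=, and REPLACE lets you rerun it without errors):
proc export data=work.cpapmonte1
    outfile="/folders/myfolders/outfile1.xlsx"  /* path from Properties */
    dbms=xlsx
    replace;
run;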

save MATLAB code file along with results in one folder?

I'm processing a data set and running into a problem - although I xlswrite all the relevant output variables to a big Excel file that is timestamped, I don't save the code that actually generated that result. So if I try to recreate a certain set of results, I can't do it without relying on memory (which is obviously not a good plan). I'd like to know if there's a command(s) that will help me save the m-files used to generate the output Excel file, as well as the Excel file itself, in a folder I can name and timestamp so I don't have to do this manually.
In my perfect world I would run the master code file that calls 4 or 5 other function m-files, then all those m-files would be saved along with the Excel output to a folder named results_YYYYMMDDTIME. Does this functionality exist? I can't seem to find it.
There's no such functionality built in.
You could build a dependency tree of your main function by using depfun with mfilename.
depfun(mfilename()) will return a list of all functions/m-files that are called by the currently executing m-file.
This will include all files that ship with MATLAB as builtins; you might want to remove those (and only record the MATLAB version in your Excel sheet).
As a sketch:
% Get all files the currently executing m-file depends on:
dependencies = depfun(mfilename());
% Skip anything under the MATLAB installation (builtins/toolboxes):
for k = 1:numel(dependencies)
    if isempty(strfind(dependencies{k}, matlabroot))
        copyfile(dependencies{k}, your_folder);
    end
end
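And a hypothetical wrapper for the folder naming the question asks about (output.xlsx stands in for whatever name you gave the xlswrite output):
% Create the timestamped results folder, then run the loop above:
your_folder = ['results_' datestr(now, 'yyyymmddHHMM')];
mkdir(your_folder);
copyfile('output.xlsx', your_folder);  % archive the Excel output too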
As a "long term" solution you might want to check if using a version control system like subversion, mercurial (or one of many others) would be applicable in your case.
In larger projects this is preferred way to record the version of source code used to produce a certain result.

executing script file from azure blob and write its results to file

I'll explain the task that was requested of me:
I have two containers in Azure, one called "data" and one called "script". In the "data" container there's a txt file with data, and in the "script" container there's a script file.
Now, I need to programmatically (with a WorkerRole) execute the script file, with the content of the data file as its parameters (for example: a script that accepts a string 's' and prints "Hello, 's'", where 's' is the given string and the data file contains that string), and save the result of the run into another file, which needs to be saved in another container called "result".
How do I do all this? I've already uploaded the files and created the blobs programmatically, but I can't seem to understand how to execute the file or how to save its result to another file.
Can I please have some help?
Thanks in advance
Here are the steps in pseudocode:
1. Retrieve the script from the blob (using DownloadToStream()).
2. Compile the script (I will leave this to you, as I have no idea what format your script is).
3. Load the parameters from the blob (same as step 1).
4. Execute the script with those parameters.
If your scripts can be written as lambda expressions, then this becomes a lot easier, as you can turn them into Actions.
Edit based on your questions:
DownloadText() is no longer included in Azure Storage 2.0; you only have access to DownloadToStream(). Even if you are using an older version (say 1.7), I would recommend using DownloadToStream() in the event you ever upgrade in the future. This will prevent having to refactor your code.
In terms of executing your script, it depends on what type of script it is. If it is C# code, you can use this example: Is it possible to dynamically compile and execute C# code fragments?. If you need to execute a different type of script, you would need to run it using Process.Start; you can look at this example: http://www.dotnetperls.com/process-start
I do not have much experience with point number 2 but those are the processes I have heard and seen used.
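Putting steps 1, 3, and 4 together for a non-C# script, here is a rough sketch (untested; the connection string and the blob names run.cmd, data.txt, and result.txt are placeholders - only the container names come from the question):

using System.Diagnostics;
using System.IO;
using Microsoft.WindowsAzure.Storage;

class ScriptRunner
{
    static void Main()
    {
        var client = CloudStorageAccount
            .Parse("<your storage connection string>")
            .CreateCloudBlobClient();

        // Step 1: download the script blob to local disk.
        var script = client.GetContainerReference("script")
                           .GetBlockBlobReference("run.cmd");
        using (var f = File.Create("run.cmd"))
            script.DownloadToStream(f);

        // Step 3: load the parameter data from the "data" container.
        string args;
        var data = client.GetContainerReference("data")
                         .GetBlockBlobReference("data.txt");
        using (var ms = new MemoryStream())
        {
            data.DownloadToStream(ms);
            ms.Position = 0;
            args = new StreamReader(ms).ReadToEnd().Trim();
        }

        // Step 4: run the script with the data as its argument,
        // capturing standard output as the result.
        var psi = new ProcessStartInfo("run.cmd", args)
        {
            UseShellExecute = false,
            RedirectStandardOutput = true
        };
        string result;
        using (var p = Process.Start(psi))
        {
            result = p.StandardOutput.ReadToEnd();
            p.WaitForExit();
        }

        // Save the captured output into the "result" container.
        var outBlob = client.GetContainerReference("result")
                            .GetBlockBlobReference("result.txt");
        using (var outStream = new MemoryStream(
                   System.Text.Encoding.UTF8.GetBytes(result)))
        {
            outBlob.UploadFromStream(outStream);
        }
    }
}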
