Cognos Analytics - save weekly reports to local drive

Currently, weekly scheduled reports are saved to the Cognos server. However, I was asked whether there is a way to save them to a local network drive instead. Is this possible (documentation on this issue would help too)? We're using Cognos Analytics 11.0.13 on-premise.
Thanks in advance!

Yes. There are two ways.
Using Archive Location File System Root:
Within the Archive Location File System Root (in Cognos Configuration), a folder is created for each user or business unit to write to. In my case (on MS Windows) I use a junction pointing to their folder on the network (see the sketch after these steps).
In Define File System Locations (in Cognos Administration), I configure an alias to the folder and apply permissions for certain users, groups, or roles to use (write to) that alias.
Users schedule or use "run as" to get their reports to run and write output to their location.
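A minimal sketch of how such a junction could be created, with hypothetical paths (the archive root and network share below are examples, not values from the original answer); note that a plain junction (/J) needs a local target, while a UNC target usually needs a directory symbolic link (/D) instead:

    import subprocess
    from pathlib import Path

    # Example paths -- adjust to your Archive Location File System Root
    # (set in Cognos Configuration) and the business unit's network share.
    archive_root = Path(r"D:\CognosArchive")
    link = archive_root / "FinanceTeam"
    target = r"\\fileserver\shares\FinanceTeam"

    # mklink /J creates a junction (local targets only); for a UNC path a
    # directory symbolic link (mklink /D) is usually required instead.
    link_type = "/D" if target.startswith("\\\\") else "/J"

    subprocess.run(
        ["cmd", "/c", "mklink", link_type, str(link), target],
        check=True,
    )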
Using CM.OutPutLocation and CM.OutputScript:
For the folder or report in Cognos, apply permissions for certain users, groups, or roles to create versions. (write or full permission?)
Users schedule or use "run as" to save their report output as a version of the report.
Cognos saves a copy to the folder defined in CM.OutPutLocation (in Cognos Administration).
The script defined in CM.OutputScript runs. This script can:
Identify which file was just created.
Read the description (xml) file to determine the report path.
If the report path matches a specific pattern or value, do something (like copy it to the subfolder/junction and rename it to something meaningful like the report name).
Delete the original output file.
The second option seems like more coding, but it is far more flexible. The script must be a .bat file, but that .bat can call anything (PowerShell, Windows Scripting Host, an executable) to do the actual work, so a thorough knowledge of batch programming isn't even needed.
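As a rough illustration of what that work could look like, here is a minimal sketch in Python that the .bat could call. It assumes the batch file passes the newly created output file's path as the first argument, and the descriptor element names (reportName, reportPath), destination folder, and pattern are placeholders, since the exact descriptor schema and your paths will differ:

    import shutil
    import sys
    import xml.etree.ElementTree as ET
    from pathlib import Path

    # Placeholder destination and report-path pattern -- adjust as needed.
    DEST_DIR = Path(r"D:\CognosArchive\FinanceTeam")
    REPORT_PATTERN = "Weekly Sales"

    def main() -> None:
        # Assumption: the .bat set in CM.OutputScript passes the new output
        # file's full path as the first argument.
        output_file = Path(sys.argv[1])

        # Assumption: Cognos writes an XML descriptor next to the output file
        # with the same base name; the element names below are placeholders.
        descriptor = output_file.with_suffix(".xml")
        tree = ET.parse(descriptor)
        report_name = tree.findtext(".//reportName", default="report")
        report_path = tree.findtext(".//reportPath", default="")

        if REPORT_PATTERN in report_path:
            # Copy to the team's folder/junction with a meaningful name.
            DEST_DIR.mkdir(parents=True, exist_ok=True)
            shutil.copy2(output_file, DEST_DIR / (report_name + output_file.suffix))

        # Delete the original output file and its descriptor.
        output_file.unlink(missing_ok=True)
        descriptor.unlink(missing_ok=True)

    if __name__ == "__main__":
        main()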

Down the line, if the requirement changes from saving to the local file system to saving to the cloud, there is also this functionality, which has been added:
https://www.ibm.com/support/knowledgecenter/SSEP7J_11.1.0/com.ibm.swg.ba.cognos.ag_manage.doc/t_ca_manage_cloud_storage_add_location.html

Related

Lotus Notes: Fetch information from a shared-folder text file and store it in a Lotus Notes database automatically

I am working remotely on a client system. I don't have knowledge of Lotus Notes.
There are some automated jobs executed in the database which read a text file from a shared folder and store the information in a Lotus Notes database. Another automated process generates a report from that information.
Problem:
I have no idea which automated jobs are executed, and when I checked in the database I didn't find anything. Can someone please tell me where I have to check, or suggest any other way to resolve this problem?
The way I would start looking for this is by using the Design Synopsis feature to generate a listing of all agents in the NSF file for the application. You can search that listing for the name of the folder. If you find a hit, backtrack a bit in the listing to find the name of the agent.
The thing is, this job could be running as an agent in any database, not just the one where the data resides. (I have often designed systems with "Agent Container" databases.) The job could also be running on an ETL server somewhere else on the network, or it could be running as a script launched by the operating system's scheduler elsewhere on the network. Hopefully, it's not one of those things, but even if it is in an agent in the target NSF file, you still might not find the folder name. The agent could be reading the folder name from a profile document in the NSF file, or from a configuration document in some other NSF file.
But that's where I'd start.

Setting up a trigger to watch new folders in Azure Logic Apps

I am trying to create a logic app that will transfer files, as they are created, from my FTP server to my Azure file share. The folder my trigger is watching is structured by date (see below). Each day that a file is added, a new folder is created, so I need the trigger to check new subfolders, but I don't want to go into the app every day to change which folder the trigger looks at. Is this possible?
Here is how my folder (called DATA) is structured; each day that a file is added, a new folder is created:
-DATA-
2016-10-01
2016-10-02
2016-10-03
...
The FTP Connector uses configurable polling, where you set how often it should look for a file. The trigger currently does not support dynamic folders. However, what you could try is the following (a rough sketch of the logic follows after the list):
Trigger your logic app by recurrence (same principle as the FTP trigger in fact)
Action: Create a variable to store the current date/time (in the format used in your folder naming)
Action: Do a list files in folder (here you should be able to dynamically set the folder name using the variable you created)
For-each file in folder
Action: Get File Content
Whatever you need to do with the file (calling a nested logic app is smart if you need to run multiple processing actions on each file or handle resubmits of the flow per file)
To avoid picking up every file on each run, you will need a way to exclude files that were processed in an earlier run: either rename each file after it is processed to an extension you can exclude in the next run, or move it to a subfolder "Processed\datetime" in the root.
This solution will require more actions and thus will be more expensive. I haven't tried it out, but I think it should work; at least it's the approach I would try to set up.
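Here is the equivalent logic, sketched in Python rather than as Logic Apps designer actions; the paths and the Processed-subfolder convention are assumptions, and the real implementation would use FTP connector actions instead of filesystem calls:

    import shutil
    from datetime import datetime, timezone
    from pathlib import Path

    # Example local stand-in for the FTP "DATA" root.
    DATA_ROOT = Path("/mnt/ftp/DATA")

    now = datetime.now(timezone.utc)

    # Build today's folder name in the same format the FTP folders use.
    today_folder = DATA_ROOT / now.strftime("%Y-%m-%d")

    # Move handled files into Processed/<datetime> so the next recurrence
    # does not pick them up again.
    processed_dir = DATA_ROOT / "Processed" / now.strftime("%Y-%m-%dT%H%M%S")

    if today_folder.exists():
        processed_dir.mkdir(parents=True, exist_ok=True)
        for f in today_folder.iterdir():          # "List files in folder"
            if f.is_file():
                content = f.read_bytes()          # "Get File Content"
                # ... copy content to the Azure file share here ...
                shutil.move(str(f), str(processed_dir / f.name))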
Unfortunately, what you're asking is not possible with the current FTP Connector, and there isn't any really great solution right now... :(
As an aside, I've seen this pattern several times and, as you are seeing, it just causes more problems than it solves, which realistically is zero. :)
If you own the FTP server, the best thing to do is put the files in one folder.
If you do not own the FTP server, politely mention to the owner that this pattern is causing problems and doesn't help you in any way, so please put the files in one folder ;)

How to get the internal table data in excel format if the report runs in background?

Could you please tell me how to get internal table data into an Excel file saved on the local desktop (presentation server)? Remember that the report runs in the background.
The two requirements contradict each other, because a background job does not have any connection to any SAP GUI. That is, there is no SAP GUI connection linked to a background job, and thus it is not possible to determine onto which local desktop the Excel file should be saved.
A possibility would be to store the results created in the background job somewhere and externalize the save to the local desktop into another program. When creating files on the SAP AS, you should keep in mind what happens to these files once they are no longer needed, i.e. you need to do the file maintenance (deleting files that are no longer needed) yourself.
Theoretically you can do this if you create the file on the SAP AS and move it using any shell file-move command. However, it is bad practice to make any connection from the SAP AS to a user's machine.
The best solution here is to create the file on the SAP AS and have the user download it manually from there. You can also send the file to the user, for example by e-mail; in that case the user does no manual work at all.
Another good practice is to use PI server functionality, which can deliver your file in whatever way the user wants to receive it.

Renaming an executable's image name is giving it write permission

Dear community members,
We have three Windows 7 Professional computers with identical hardware. None of them is connected to a domain or a directory service.
We run the same executable image on all three computers. On one of them I had to rename it, because with my application's original filename it has no write access to its working directory.
I manually set up full access permissions for the USER group on the working directory, but this did not solve the problem.
I suspect some kind of deny mechanism in Windows based on the executable's name.
I searched the registry for the executable's name but did not find anything relevant or meaningful.
This situation occurred after a lot of crashes and updates of my program on that computer (I am a developer). One day it suddenly stopped being able to open its files. I did not touch the registry or change anything else in the OS.
My executable's name is karbon_tart.exe
When it starts, it calls CreateFile (open mode if the file exists, create mode if not) to open the karbon_tart.log and karbon_tart.ini files.
I tried it both with the files existing and without them existing, and in neither case can the program open the files.
But if I just rename the executable to karbon_tart_a.exe, the program can open the files whether they exist or not.
Thank you for your interest
Regards
Ömür Ölmez.
I figured it out in the end.
It was because of an old copy of my application's files in the VirtualStore.
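For anyone hitting the same problem: UAC file virtualization silently redirects writes to protected directories into the per-user VirtualStore, so a stale copy there can shadow the files the program expects to open. A small sketch of how to check for such a copy (the install directory below is an example, not the asker's actual path):

    import os
    from pathlib import Path

    # Example install directory -- adjust to where karbon_tart.exe actually lives.
    install_dir = Path(r"C:\Program Files (x86)\KarbonTart")

    # UAC file virtualization mirrors writes to protected paths under
    # %LOCALAPPDATA%\VirtualStore, keeping the path relative to the drive root.
    local_appdata = Path(os.environ["LOCALAPPDATA"])
    virtual_copy = local_appdata / "VirtualStore" / install_dir.relative_to(install_dir.anchor)

    if virtual_copy.exists():
        # Stale copies of karbon_tart.log / karbon_tart.ini here can shadow
        # the real working directory.
        for item in virtual_copy.iterdir():
            print("Found virtualized file:", item)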

Source code location for debugging multiple instances of an application

Hi, I have an application running separately (one instance per customer) in different folders, one per customer.
Each customer is a separate user on my machine.
At the moment I have the source code in each of these folders, and I rebuild the code for each instance. Would it be better if I did something like the following?
create a shared folder where I build the code
deploy the binary into each user folder.
allow each user permission to access the source code in READ-ONLY mode.
when it is time to debug, running gdb in each user folder will be able to read the source code and debugging will work.
Do you think this would be a better approach, or are there better practices?
My only concern is that each user would be able to read the source code, but since users do not access their folders directly (they are under my control), this should not trouble me.
I am using CENTOS 6.4, SVN and G++/GDB.
in different folders
There are no "folders" on UNIX; they are called directories.
I rebuild the code per each instance
Why would you do that?
Is the code identical (it sounds like it is)? If so, build the application once. There is no reason at all to have multiple copies of the resulting binary, or the sources.
If you make the directory with sources and binaries world-readable, then every user will be able to debug it independently.
