How to get internal table data in Excel format if the report runs in background?

Could you please tell me a way to get internal table data into an Excel file saved on the local desktop (presentation server)? Remember that the report runs in the background.

The two requirements contradict each other, because a background job does not have a connection to any SAP GUI. Since no SAP GUI session is linked to a background job, there is no way to determine onto which local desktop the Excel file should be saved.
One possibility would be to store the results created by the background job somewhere, for example as a file on the SAP application server (AS), and to externalize the save to a local desktop into another program. When creating files on the SAP AS, keep in mind what happens to these files once they are no longer needed, i.e. you have to take care of file maintenance (deleting obsolete files) yourself.
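A minimal ABAP sketch of the first half of that approach, writing the results as a semicolon-separated file on the application server; the file path, the structure of the result table, and the separator are assumptions for illustration:

    * Minimal sketch: write the report results to a CSV file on the
    * application server. This works in a background job because no
    * SAP GUI connection is needed. Path and fields are assumptions.
    TYPES: BEGIN OF ty_result,
             field1 TYPE string,
             field2 TYPE string,
           END OF ty_result.
    DATA: lt_results TYPE STANDARD TABLE OF ty_result,
          ls_result  TYPE ty_result,
          lv_file    TYPE string VALUE '/tmp/results.csv',
          lv_line    TYPE string.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT lt_results INTO ls_result.
      CONCATENATE ls_result-field1 ls_result-field2 INTO lv_line
        SEPARATED BY ';'.
      TRANSFER lv_line TO lv_file.
    ENDLOOP.
    CLOSE DATASET lv_file.

A separate dialog program can later read the file with READ DATASET and hand it to the user's desktop via cl_gui_frontend_services=>gui_download.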

Theoretically you can do this if you create the file on the SAP AS and then move it with a shell file-move command. However, it is bad practice to make any connection from the SAP AS to a user's machine.
The best solution here is to create the file on the SAP AS and have the user download it manually from there. You can also send the file to the user, for example by e-mail; in that case the user has no manual work at all.
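A rough sketch of the e-mail option using the standard BCS classes; the recipient address, the subject, and the table lt_csv holding the file contents are assumptions:

    * Rough sketch: mail the generated file from the background job.
    * Recipient, subject, and attachment contents are assumptions.
    DATA: lo_request   TYPE REF TO cl_bcs,
          lo_document  TYPE REF TO cl_document_bcs,
          lo_recipient TYPE REF TO if_recipient_bcs,
          lt_body      TYPE bcsy_text,
          lt_csv       TYPE soli_tab.

    TRY.
        lo_request  = cl_bcs=>create_persistent( ).
        lo_document = cl_document_bcs=>create_document(
                        i_type    = 'RAW'
                        i_text    = lt_body
                        i_subject = 'Report results' ).
        lo_document->add_attachment(
          i_attachment_type    = 'CSV'
          i_attachment_subject = 'results'
          i_att_content_text   = lt_csv ).
        lo_request->set_document( lo_document ).
        lo_recipient = cl_cam_address_bcs=>create_internet_address(
                         'user@example.com' ).
        lo_request->add_recipient( i_recipient = lo_recipient ).
        lo_request->send( ).
        COMMIT WORK.
      CATCH cx_bcs.
        " Log or handle the send error as appropriate.
    ENDTRY.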
Another good practice is to use the PI server functionality, which can deliver the file in whatever way the user wants to receive it.

Related

Lotus Notes: fetch information from a shared-folder text file and store it in a Lotus Notes database automatically

I am working remotely on a client system and have no knowledge of Lotus Notes.
There are automated jobs executed in the database which read text files from a shared folder and store the information in a Lotus Notes database, and another automated process generates a report from that information.
Problem:
I have no idea which automated jobs are executed, and although I checked in the database, I didn't find anything. Can someone please tell me where I have to check, or suggest any other way to resolve this problem?
The way I would start looking for this is by using the Design Synopsis feature to generate a listing of all agents in the application's NSF file. You can search that listing for the name of the folder; if you find a hit, backtrack a bit in the listing to find the name of the agent.
The thing is, this job could be running as an agent in any database, not just the one where the data resides. (I have often designed systems with "Agent Container" databases.) The job could also be running on an ETL server somewhere else on the network, or it could be running as a script launched by the operating system's scheduler elsewhere on the network. Hopefully, it's not one of those things, but even if it is in an agent in the target NSF file, you still might not find the folder name. The agent could be reading the folder name from a profile document in the NSF file, or from a configuration document in some other NSF file.
But that's where I'd start.

Cognos Analytics - save weekly reports to local drive

In the current situation, weekly scheduled reports are saved to the Cognos server. However, I was asked whether there is a way to save them to a local network drive instead. Is this possible (documentation on this issue would help too)? We're using Cognos Analytics 11.0.13 on-premises.
Thanks in advance!
Yes. There are two ways.
Using Archive Location File System Root:
Within the Archive Location File System Root (in Cognos Configuration), a folder is created for each user or business unit to write to. In my case (on MS Windows) I use a junction pointing to their folder on the network.
In Define File System Locations (in Cognos Administration), I configure an alias to the folder and apply permissions for certain users, groups, or roles to use (write to) that alias.
Users schedule or use "run as" to get their reports to run and write output to their location.
Using CM.OutputLocation and CM.OutputScript:
For the folder or report in Cognos, apply permissions for certain users, groups, or roles to create versions. (write or full permission?)
Users schedule or use "run as" to save their report output as a version of the report.
Cognos saves a copy to the folder defined in CM.OutputLocation (in Cognos Administration).
The script defined in CM.OutputScript runs. This script can:
Identify which file was just created.
Read the description (xml) file to determine the report path.
If the report path matches a specific pattern or value, do something (like copy it to the subfolder/junction and rename it to something meaningful like the report name).
Delete the original output file.
The second option involves more coding but is far more flexible. The script must be a .bat file, but it can call anything (PowerShell, Windows Scripting Host, an executable) to do the actual work, so a thorough knowledge of batch programming isn't even needed; see the helper sketched below.
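Purely as an illustration of "can call anything": a small C# helper executable that the .bat defined in CM.OutputScript could delegate to. The specifics here are assumptions rather than documented Cognos behavior: that the batch file passes the new output file's path as the first argument, that the descriptor sits next to it named <file>_desc.xml, and the report name "Weekly Sales" and target share.

    // Hypothetical helper called from the CM.OutputScript .bat file, e.g.:
    //   CopyReportOutput.exe "%1"
    // Assumptions: args[0] is the file Cognos just wrote, and its XML
    // descriptor sits beside it as <name>_desc.xml containing the report path.
    using System;
    using System.IO;

    class CopyReportOutput
    {
        static void Main(string[] args)
        {
            string output = args[0];
            string desc = Path.ChangeExtension(output, null) + "_desc.xml";
            string descText = File.ReadAllText(desc);

            // If the descriptor mentions the report we care about, copy the
            // output to the team share under a meaningful name (step 3).
            if (descText.Contains("Weekly Sales"))           // report name: assumption
            {
                string target = Path.Combine(@"\\fileserver\reports",  // assumption
                    "Weekly Sales" + Path.GetExtension(output));
                File.Copy(output, target, true);
                File.Delete(output);                         // step 4: drop the original
            }
        }
    }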
Down the line, if the requirement changes from saving to the local file system to saving to the cloud, there is also this functionality, which has been added:
https://www.ibm.com/support/knowledgecenter/SSEP7J_11.1.0/com.ibm.swg.ba.cognos.ag_manage.doc/t_ca_manage_cloud_storage_add_location.html

MS-Access 2013 unable to remove .laccdb locking file

This is a long shot, but does anyone know how to remove the lock file (file type .laccdb) created by Access 2013?
I have an Excel sheet which is connected to the Access database via Power Query. The Access database is on a shared drive. However, even when this file is closed, the locking file for the Access database is not deleted.
When I try to remove the lock file, it just says it cannot be removed because another program is using it.
I've shut down the machine, removed all temp files, checked that nothing is running, and also checked for any open files in Computer Management within the Administrative Tools.
I know the database should be split to stop this happening; however, this is not my database, and the user refuses to split it.
Any help would be gratefully received.
You can open and read the lock file with a text editor (I use Notepad++); within the file you should find the computer name (or some similar identifier) of the one(s) who have it open. You could then take that identifier to IT and see if they can work out who the user is; you should be able to close the file from their computer. Hope this helps.

How to programmatically detect a new file in a SharePoint shared folder

I am using WSS 3.0 and I need a way to listen on a shared folder library for file changes coming from users, check out those files, and copy them somewhere else on disk. It's almost like the alert functionality, but instead of emailing people every time it fires, it needs to run some code that checks out the new files and copies them to a network location.
The best solution I can come up with is creating a custom timer job and checking which files have changed since my last successful run, but then I would need to save the date and time of my last successful run somewhere.
If anybody has a better idea, they are more than welcome to share it.
You can add an Event Receiver to this library; it will run every time an item is added. Inside the Event Receiver you can then copy the file to your disk location.
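A minimal WSS 3.0 sketch of such a receiver; the UNC target path is an assumption, and in real use you would add error handling:

    // Minimal sketch of an ItemAdded event receiver for WSS 3.0.
    // The UNC target path is an assumption; add error handling for real use.
    using System.IO;
    using Microsoft.SharePoint;

    public class CopyOnAddReceiver : SPItemEventReceiver
    {
        public override void ItemAdded(SPItemEventProperties properties)
        {
            SPFile file = properties.ListItem.File;
            byte[] contents = file.OpenBinary();

            // Copy the new document to the network location.
            string target = Path.Combine(@"\\server\dropzone", file.Name);
            File.WriteAllBytes(target, contents);
        }
    }

In WSS 3.0 you bind the receiver to the library in code, e.g. list.EventReceivers.Add(SPEventReceiverType.ItemAdded, assemblyFullName, className), typically from a feature receiver.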

How to use an exe in SharePoint on ItemAdded?

I need to convert any document that gets uploaded into an image.
I downloaded the exe (with all its DLLs) onto my local machine (it does not have to be installed).
The syntax "export.exe sourcefile.doc destinationfile.gif" works from my local DOS prompt.
How do I run the same command, "export.exe exampledoc.doc exampledoc.gif", when an item is added to a SharePoint document library?
Also, do I need to put the folder containing the exe and DLLs on the SharePoint front-end server so it's accessible? If yes, where should this folder reside, and do the folder and files need to be accessible to the SharePoint service account?
I am totally new to this and would really appreciate it if someone could shed some light on it (step by step if possible).
Thanks
Justin...
In order to do this from a SharePoint event handler, each WFE on the farm would need to have your conversion application available. Your event handler code would then need to: place the uploaded file in a temporary location on disk; invoke the conversion application (look at the .NET Process class for this); cancel the addition of the original, unconverted document; add the output of your processed file to the library (use the DisableEventFiring() method of the event handler so that it does not trigger itself when adding the new file); and finally clean up after itself.
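A rough C# sketch of that flow; the converter path and temp-file handling are assumptions. Note that ItemAdded fires after the add has completed, so the sketch deletes the original afterwards instead of cancelling the addition (cancelling would require the synchronous ItemAdding event):

    // Rough sketch of the flow described above (WSS 3.0). The converter
    // location and paths are assumptions; error handling is omitted.
    using System.Diagnostics;
    using System.IO;
    using Microsoft.SharePoint;

    public class ConvertOnAddReceiver : SPItemEventReceiver
    {
        public override void ItemAdded(SPItemEventProperties properties)
        {
            SPFile file = properties.ListItem.File;

            // 1. Stage the uploaded document in a temp location on disk.
            string source = Path.Combine(Path.GetTempPath(), file.Name);
            string result = Path.ChangeExtension(source, ".gif");
            File.WriteAllBytes(source, file.OpenBinary());

            // 2. Invoke the converter via the .NET Process class.
            using (Process p = Process.Start(@"C:\Converter\export.exe",
                       "\"" + source + "\" \"" + result + "\""))
            {
                p.WaitForExit();
            }

            // 3./4. Swap the original for the converted image without
            // re-triggering this handler.
            DisableEventFiring();
            try
            {
                SPFolder folder = properties.ListItem.ParentList.RootFolder;
                folder.Files.Add(Path.GetFileName(result),
                    File.ReadAllBytes(result), true);
                file.Delete();
            }
            finally
            {
                EnableEventFiring();
            }

            // 5. Clean up the temp files.
            File.Delete(source);
            File.Delete(result);
        }
    }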
Having said that, this seems like an operation you really wouldn't want to tax a web server receiving any real traffic with in real time. You may want to look into batching the jobs to run daily during traffic lulls on another system, or on one dedicated resource on the farm.
