Schedule weekly Excel file download to a unique name - excel

We have a server where every Monday an Excel file gets uploaded by a client. The file always has the same name, so if we forget to download it, we lose it. Is there a way to make a script that renames the file and adds the date or a number to it?
We're using FileZilla to get the files now.

FileZilla does not allow any kind of automation.
You can use this WinSCP script:
open ftp://user:password@host/
get "/path/sheet.xls" "c:\archive\sheet-%TIMESTAMP#yyyy-mm-dd%.xls"
exit
The script connects to the server and downloads the sheet to a local archive folder under a name like sheet-YYYY-MM-DD.xls.
For details see:
Automate file transfers (or synchronization) to FTP server or SFTP server
Downloading file to timestamped-filename
Then create a task in the Windows Task Scheduler to run winscp.exe every Monday with these arguments:
/script="c:\path_to_script\script.txt" /log="c:\path_to_script\script.log"
Logging (/log=...) is optional, but it's recommended.
For details, see Schedule file transfers (or synchronization) to FTP/SFTP server.
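If you would rather create the scheduled task from the command line than through the Task Scheduler GUI, a schtasks one-liner along these lines should work (the WinSCP installation path and the 07:00 start time are placeholders to adjust):
schtasks /create /tn "Weekly sheet download" /sc weekly /d MON /st 07:00 ^
    /tr "\"C:\Program Files (x86)\WinSCP\winscp.exe\" /script=c:\path_to_script\script.txt /log=c:\path_to_script\script.log"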
(I'm the author of WinSCP)

Related

Automating Macros for an Excel file in an SFTP location

I have an Excel file with built-in macros in an SFTP location. I want to automate it so that the macro runs at a certain time daily on the SFTP server.
Initially I did this with Task Scheduler, but now that the file is not available locally and resides in the SFTP location, I am unsure what to do.
Help.
You can't run the macro on the SFTP server. The SFTP server is a file store, not an application server.
So, download the file in question to a local folder, run your macro against this local copy, and, when done, upload it back to the SFTP server, overwriting the old file.
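A minimal sketch of that download-run-upload loop as a Windows batch file driven by WinSCP scripting; the host, credentials, paths and the run_macro.vbs helper (a hypothetical script that would start Excel, open the workbook and call Application.Run on your macro) are placeholders, not something from the question:
@echo off
rem Download the workbook from the SFTP server (host, credentials and paths are placeholders)
"C:\Program Files (x86)\WinSCP\winscp.com" /ini=nul /command ^
    "open sftp://user:password@example.com/ -hostkey=""ssh-rsa 2048 xxxx...""" ^
    "get /remote/workbook.xlsm c:\work\workbook.xlsm" ^
    "exit"
rem Run the macro against the local copy (run_macro.vbs is a hypothetical helper)
cscript //nologo c:\work\run_macro.vbs
rem Upload the updated workbook, overwriting the old copy on the server
"C:\Program Files (x86)\WinSCP\winscp.com" /ini=nul /command ^
    "open sftp://user:password@example.com/ -hostkey=""ssh-rsa 2048 xxxx...""" ^
    "put c:\work\workbook.xlsm /remote/workbook.xlsm" ^
    "exit"
Scheduled daily with the Task Scheduler, this gives much the same effect as running the macro "on" the SFTP server.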

Cognos Analytics - save weekly reports to local drive

In the current situation weekly scheduled reports are saved to the Cognos server. However, I was asked if there is a way to save them to a local network drive. Is this possible (documentation on this issue will help too)? We're using Cognos Analytics 11.0.13 on-premise.
Thanks in advance!
Yes. There are two ways.
Using Archive Location File System Root:
Within the Archive Location File System Root (in Configuration), a folder is created for a user or business unit to write to. In my case (on MS Windows) I use a junction pointing to their folder on the network (see the sketch after these steps).
In Define File System Locations (in Cognos Administration), I configure an alias to the folder and apply permissions for certain users, groups, or roles to use (write to) that alias.
Users schedule or use "run as" to get their reports to run and write output to their location.
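For example, the link mentioned above can be created on the Cognos server with mklink; the paths below are placeholders, and note that a /J junction only accepts local targets, so for a UNC network path a /D directory symbolic link is the usual equivalent:
mklink /D "D:\cognos\archive\finance" "\\fileserver\departments\finance\reports"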
Using CM.OutputLocation and CM.OutputScript:
For the folder or report in Cognos, apply permissions for certain users, groups, or roles to create versions. (write or full permission?)
Users schedule or use "run as" to save their report output as a version of the report.
Cognos saves a copy to the folder defined in CM.OutputLocation (in Cognos Administration).
The script defined in CM.OutputScript runs. This script can:
Identify which file was just created.
Read the description (xml) file to determine the report path.
If the report path matches a specific pattern or value, do something (like copy it to the subfolder/junction and rename it to something meaningful like the report name).
Delete the original output file.
The second option looks like more coding, but it is far more flexible (a sketch follows below). It must be a .bat file, but that can call anything (PowerShell, Windows Scripting Host, an executable) to do the actual work, so a thorough knowledge of batch programming isn't even needed.
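A minimal sketch of what such a CM.OutputScript batch file might look like; the argument convention (the output file assumed to arrive as %1), the _desc.xml descriptor naming and the "Sales" pattern are assumptions to verify against the IBM documentation, not the definitive interface:
@echo off
rem %1 is assumed to be the report output file Cognos just wrote
set OUTFILE=%~1
set DEST=\\fileserver\reports
rem Read the companion descriptor file (assumed to be named <output>_desc.xml)
rem and only route outputs whose report path contains "Sales"
findstr /i /c:"Sales" "%~dpn1_desc.xml" >nul
if %errorlevel%==0 (
    copy "%OUTFILE%" "%DEST%\Weekly Sales%~x1" >nul
    del "%OUTFILE%"
)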
Down the line, if the requirement changes from saving to the local file system to saving to the cloud, there is also this functionality, which has been added:
https://www.ibm.com/support/knowledgecenter/SSEP7J_11.1.0/com.ibm.swg.ba.cognos.ag_manage.doc/t_ca_manage_cloud_storage_add_location.html

How to get internal table data in Excel format if the report runs in the background?

Could you please tell me how to get internal table data into an Excel file saved on the local desktop (presentation server)? Remember that the report runs in the background.
Both requirements are contradictory, because a background job does not have any connection to any SAP GUI. I.e. there is no SAP GUI connection linked to a background job, and thus it is not possible to determine onto which local desktop the Excel file should be saved.
A possibility would be to store the results created in the background job somehow and externalize the save to a local desktop into another program. When creating files on the SAP AS you should keep in mind what happens with these files once they are no longer needed; i.e. you need to do the file maintenance (deleting files that are no longer needed) yourself.
Theoretically you can do this if you create the file on the SAP AS and move it using any shell file-move command. However, it is bad practice to make any connection from the SAP AS to a user's machine.
The best solution here is to create the file on the SAP AS and have the user download it manually from there. You can also send the file to the user, for example by e-mail; in that case the user has no manual work to do at all.
Another good practice is to use PI server functionality; it can deliver your file in whatever way the user wants to receive it.

How to update an Azure Website via FTP transfer

My requirement is to transfer files like DLLs and config files to the Azure Website from my local system.
I have gone through the links and followed these steps:
Open FTP_Server_name_from_Publish_profile
Supplied user name and password and was able to connect
Cd Site\wwwroot
put "Some_File"
I am getting the following error:
200 PORT command successful.
150 Opening ASCII mode data connection.
When I connect via the WinSCP client and transfer files via the GUI, it works fine.
I also tried the above steps for transferring files to sample FTP servers, and it works fine.
Let me know where I am going wrong.
It's not clear what you are asking for. I assume you want to automate an update of an Azure Website.
As you have succeeded using the WinSCP GUI, you can do the same using WinSCP scripting. The WinSCP script would look like:
open ftps://user:mypassword@waws-prod-am1-234.ftp.azurewebsites.windows.net/
put C:\myjob\* /site/wwwroot/App_Data/jobs/type/name/
exit
For details see Automating WebJob Update in WinSCP documentation.
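Saved to a file (say c:\path\azure_upload.txt, a placeholder name), the script can then be run non-interactively from the command line or a scheduled task:
winscp.com /ini=nul /script=c:\path\azure_upload.txt /log=c:\path\azure_upload.log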

Retain file server's original creation / modified dates of a file when downloading it from a website

How can I keep the original 'Created' and 'Modified' date and timestamps of a file when I download it from a website?
When I download a file from a website, the 'Created' and 'Modified' dates are taken at the time of the download. I want to have the original values from the file server on my downloaded file.
In Windows, there is a tool that can do this called the Internet Download Manager (IDM) but is there something that can do the same for Linux and Mac OSX?
I know that it can also depend on the file system that the file server uses in order to interpret date and timestamps for files. For example, a Windows-based file server will probably be using NTFS, so its interpretation of the date and timestamps of its files might be different from that of a Linux-based server. I don't know if this will have any impact on the end user being able to download the original dates and times, regardless of which file server they are downloading them from.
wget can retrieve the timestamp where possible - it is completely dependent on the file server and how it serves the file. For example, I cannot retrieve the timestamp of a file if I download it from the Internet Archive (http://www.archive.org); this site does not provide the original file creation date and timestamp.
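On Linux and Mac OS X, both wget and curl can preserve the remote timestamp when the server sends a Last-Modified header: wget applies it to the downloaded file by default, while curl needs the -R (--remote-time) switch. The URL below is just a placeholder:
wget https://example.com/files/report.xls
curl -R -O https://example.com/files/report.xls
Note that HTTP only carries a modification time; the original creation date is not transmitted, so only the 'Modified' timestamp can be restored this way.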
