Automating macros for an Excel file in an SFTP location

I have an Excel file with built-in macros, and the file sits in an SFTP location. I want to set up automation so that the macros run at a certain time every day against the file on the SFTP server.
Initially I did this using Task Scheduler, but now that the file is no longer available locally and resides in the SFTP location, I am unsure what to do.
Help.

You can't. An SFTP server is a file store, not an application server.
So download the file in question to a local folder, run your macro against that local copy, and, when done, upload it back to the SFTP server, overwriting the old file.
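For illustration, on a Windows machine with Excel installed, the whole download-run-upload cycle could be scripted along these lines. This is a minimal sketch, assuming the paramiko and pywin32 packages; the host, credentials, paths, and macro name are all placeholders:

import paramiko
import win32com.client

HOST, USER, PASSWORD = "sftp.example.com", "user", "password"  # placeholders
REMOTE_PATH = "/reports/workbook.xlsm"                         # hypothetical paths
LOCAL_PATH = r"C:\jobs\workbook.xlsm"

# 1. Download the workbook from the SFTP server.
transport = paramiko.Transport((HOST, 22))
transport.connect(username=USER, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get(REMOTE_PATH, LOCAL_PATH)

# 2. Open the local copy in Excel and run the macro.
excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
workbook = excel.Workbooks.Open(LOCAL_PATH)
excel.Run("MyMacro")  # hypothetical macro name in the opened workbook
workbook.Save()
workbook.Close()
excel.Quit()

# 3. Upload the updated file, overwriting the old one.
sftp.put(LOCAL_PATH, REMOTE_PATH)
sftp.close()
transport.close()

Schedule a script like this with Task Scheduler, as before, and the macro effectively runs against the SFTP-hosted file at the chosen time.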

Related

What is the best way to retain and download locally a .txt file that is generated during an Azure Devops release?

I am generating a .txt file with information about the deployment during my release, at a specific stage; the file is located in $(System.BuildDirectory). How can I obtain this .txt file and download it locally, for example onto my desktop?
Thanks
As a workaround, you can try to add a Windows Machine File Copy task to copy the .txt file to your local machine.
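If you are on YAML pipelines, such a task might look roughly like the sketch below; the machine name, credentials, source file name, and target path are all placeholders for your environment:

# Rough sketch of a Windows Machine File Copy task; all values are placeholders.
- task: WindowsMachineFileCopy@2
  inputs:
    SourcePath: '$(System.BuildDirectory)/deployment-info.txt'
    MachineNames: 'my-desktop.example.local'
    AdminUserName: '$(machineUser)'
    AdminPassword: '$(machinePassword)'
    TargetPath: 'C:\DeploymentLogs'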

File archival using spring integration smb

I have a use case where I need to read a file from a remote Windows location, with the following steps:
Read the file from a Windows share location, say an input folder.
Process the file and upload its content to a REST service.
If the call is successful, move the remote file to an archive folder; the archive folder will also be on the remote Windows share.
I am using Spring Integration SMB. Please help me understand how I can achieve the archival use case.
Thanks,
barvepan
If it's on the same share, use the SmbRemoteFileTemplate's rename() method.
If it's a different share, you'll have to copy the file and then remove it (you can use the remote file template for that too).
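Outside of Spring, the same pattern can be illustrated with a short Python sketch using the smbprotocol package's smbclient module (server, credentials, and paths are hypothetical); the point is the same: rename within one share, copy and then delete across shares.

import smbclient

# Hypothetical server and credentials.
smbclient.register_session("fileserver", username="user", password="password")

def archive(src, dst, same_share):
    if same_share:
        # Same share: a server-side rename moves the file.
        smbclient.rename(src, dst)
    else:
        # Different share: copy the bytes over, then remove the original.
        with smbclient.open_file(src, mode="rb") as fin, \
                smbclient.open_file(dst, mode="wb") as fout:
            fout.write(fin.read())
        smbclient.remove(src)

archive(r"\\fileserver\input\report.csv",
        r"\\fileserver\input\archive\report.csv", same_share=True)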

How to create or use Local Folder in Azure?

I have a requirement to download a file from an SFTP server, and the downloaded file is stored in a local folder, say "D:\Data\tempData.csv".
I then have to read the data from the local file and consume it in my application for further data manipulation.
This job is created using the web hooks scheduler in Azure WebJobs.
I am unable to download the file to Azure and then read it from there.
Can someone help me find a location for temp data in the Azure environment that is equivalent to "D:\Data\tempData.csv" on a local system?
Please suggest a place in Azure where I can download the file to and then read it from.
Thanks in advance.
What I tried:
Using the SSH.NET DLL to download the file from SFTP to a local folder,
then reading from the local folder into my application.
I also looked at Blob storage, but its use was not approved by Tech Arch.
In an Azure Web App, you can create files anywhere under d:\home (for persistent files) or under d:\local (for temporary files). See this page for more details on the file system. Try using the Kudu Console to explore those locations.
How you get the file into that location seems mostly unrelated to your primary question about which location you can use.
In the Azure environment, WebJobs are stored in the local folder known as "D:\home", and "D:\local" is the local folder used by the web hooks.
I needed a folder for temporary use: downloading a file from the SFTP server, then reading the file from that temporary location and consuming it in my application.
I used "D:\local\Temp" as the temporary folder. The code checks whether the folder exists and creates it if needed, then downloads the file from the server to this location, reads it from the same location, and finally deletes the file from the temporary folder.
Thanks all for your help, @David Ebbo.
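For illustration, the flow described above looks roughly like this as a Python sketch (the original used SSH.NET; the host, credentials, and remote path here are placeholders):

import os
import paramiko

TEMP_DIR = r"D:\local\Temp"                  # temporary storage in an Azure Web App
LOCAL_FILE = os.path.join(TEMP_DIR, "tempData.csv")

os.makedirs(TEMP_DIR, exist_ok=True)         # create the folder if it does not exist

# Download the file from the SFTP server (hypothetical connection details).
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="password")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get("/data/tempData.csv", LOCAL_FILE)
sftp.close()
transport.close()

# Read the data, consume it, then delete the temporary file.
with open(LOCAL_FILE) as f:
    data = f.read()
os.remove(LOCAL_FILE)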

Schedule weekly Excel file download to an unique name

We have a database where a client uploads an Excel file every Monday. The file always has the same name, so if we forget to save it aside, we lose the previous one. Is there a way to make a script that renames the file, giving it the date or a number?
We're using FileZilla to get the files now.
FileZilla does not allow any kind of automation.
You can use this WinSCP script:
open ftp://user:password@host/
get "/path/sheet.xls" "c:\archive\sheet-%TIMESTAMP#yyyy-mm-dd%.xls"
exit
The script connects to the server and downloads the sheet to a local archive folder under a name like sheet-YYYY-MM-DD.xls.
For details see:
Automate file transfers (or synchronization) to FTP server or SFTP server
Downloading file to timestamped-filename
Then create a task in Windows Task Scheduler to run winscp.exe every Monday with these arguments:
/script="c:\path_to_script\script.txt" /log="c:\path_to_script\script.log"
Logging (/log=...) is optional, but it's recommended.
For details, see Schedule file transfers (or synchronization) to FTP/SFTP server.
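For example, such a task can be created from a command prompt with something like the following one-liner (the script paths are placeholders, the WinSCP installation path may differ, and the inner quotes must be escaped):

schtasks /create /tn "Weekly sheet download" /sc weekly /d MON /st 07:00 /tr "\"C:\Program Files (x86)\WinSCP\winscp.exe\" /script=\"c:\path_to_script\script.txt\" /log=\"c:\path_to_script\script.log\""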
(I'm the author of WinSCP)

Notepad++ upload to FTP and keep file

So I have a domain, and a directory on my computer that can be accessed using my IP address as the link. I want it so that when I click save, the updated version is uploaded to the FTP server AND the file on my computer is updated, so that both my domain website and my IP-address website get the new changes. Is that possible?
I'm sorry if the text is hard to understand; it's kind of hard to explain.
Use WinSCP.
Install it, then either:
Open the file directly from the server; when you save, the file will be uploaded.
Or click "Keep remote directory up to date"; once you save the file locally, WinSCP will upload it for you.
To use Notepad++ with WinSCP, you need to configure WinSCP to use Notepad++ as its editor.
