I have a requirement to download a file from an SFTP server, and the downloaded file is stored to a local folder, say "D:\Data\tempData.csv".
I then have to read the data from that local file and consume it in my application for further data manipulation.
This job is created using the WebHooks scheduler in Azure WebJobs.
I am unable to download the file to Azure and then read it from there.
Can someone help me find a location for temp data in the Azure environment that is equivalent to "D:\Data\tempData.csv" on a local system?
Please suggest a place in Azure where I can download the file and then read it from there.
Thanks in advance.
What I tried:
Tried using the SSH.NET DLL to download the file from the SFTP server to a local folder.
Then read the file from the local folder into my application.
Looked at using Blob storage, which was not approved by Tech Arch.
In an Azure Web App, you can create files anywhere under d:\home (for persistent files) or under d:\local (for temporary files). See this page for more details on the file system. Try using the Kudu Console to explore those locations.
How you get the file in that location sounds mostly unrelated to your primary question about what location you can use.
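As a minimal sketch of resolving a writable path from a WebJob, assuming the %TEMP% and %HOME% environment variables that App Service sets (the fallback paths and folder names below are illustrative only):

using System;
using System.IO;

// %TEMP% points under d:\local (non-persistent); %HOME% points to d:\home (persistent).
string tempRoot = Environment.GetEnvironmentVariable("TEMP") ?? @"D:\local\Temp";
string persistentRoot = Environment.GetEnvironmentVariable("HOME") ?? @"D:\home";

string workDir = Path.Combine(tempRoot, "Data"); // illustrative subfolder
Directory.CreateDirectory(workDir);              // safe if it already exists
string localFile = Path.Combine(workDir, "tempData.csv");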
In the Azure environment, WebJobs are stored under the folder known as "D:\home", and "D:\local" is the local folder used by the WebHooks.
I needed a folder for temporary use: downloading a file from the SFTP server, then reading the file back from that temporary location and consuming it in my application.
I used "D:\local\Temp" as the temporary folder. The code checks whether the folder exists and creates it if needed, then downloads the file from the server to this location, reads it from the same location, and finally deletes the file from the temporary folder. A sketch of that flow is below.
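A minimal sketch of that flow with SSH.NET (the host, credentials, and remote path are placeholders):

using System.IO;
using Renci.SshNet;

const string tempDir = @"D:\local\Temp";
string localFile = Path.Combine(tempDir, "tempData.csv");

Directory.CreateDirectory(tempDir); // no-op if the folder already exists

// Placeholder connection details.
using (var sftp = new SftpClient("sftp.example.com", "user", "password"))
{
    sftp.Connect();
    using (var output = File.Create(localFile))
    {
        sftp.DownloadFile("/outbound/tempData.csv", output); // placeholder remote path
    }
    sftp.Disconnect();
}

string[] lines = File.ReadAllLines(localFile); // consume the data
File.Delete(localFile);                        // clean up the temp folder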
Thanks all for your help, especially @David Ebbo.
I verified the user has permission, but I do not see it showing up in the list of available Logic Apps actions:
Here is a screenshot of what I can do, and it is missing Create Folder.
There is no Create Folder action available in the SFTP connector. You can share your idea here:
https://feedback.azure.com/forums/287593-logic-apps
You can get around this problem by using the "SFTP Extract archive to folder" action. Put a zip file containing a single file on your SFTP server, then use that action to unzip the archive into the new folder you want to create; the required folder is created as part of the extraction.
So I have a zip file called CreateFolder.zip which I unzip to a new folder. The SFTP server must have unzip installed; it won't work with gunzip.
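For reference, a hypothetical way to produce such an archive locally with System.IO.Compression (the file names are placeholders, not anything the connector requires):

using System.IO;
using System.IO.Compression; // reference System.IO.Compression.FileSystem

// Zip a single placeholder file; upload CreateFolder.zip to the SFTP server,
// then point "Extract archive to folder" at the folder you want created.
File.WriteAllText("placeholder.txt", "dummy");
using (var zip = ZipFile.Open("CreateFolder.zip", ZipArchiveMode.Create))
{
    zip.CreateEntryFromFile("placeholder.txt", "placeholder.txt");
}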
I have a use case where I need to read files from a Windows remote location, with the following steps:
Read a file from a Windows share location, say an input folder.
Process the file and upload its content to a REST service.
If the call is successful, move the remote file to an archive folder; the archive folder will also be on the Windows remote share.
I am using Spring Integration SMB. Please help me achieve the archival use case.
Thanks,
barvepan
If it's on the same share, use the SmbRemoteFileTemplate's rename() method.
If it's a different share, you'll have to copy the file and remove it (you can use the remote file template for that too).
Bit of a loose question, so if it gets marked down I'll remove it, but...
I'm using PrimeFaces/Spring/Hibernate on the Java server.
My application knows a load of file names I need to upload. Those files are on my local computer. Is it possible to tell the application the root directory of these files, so it can then set up uploads for each of them without me needing to browse for each file individually?
I assume this is a browser security issue, i.e. the user needs to explicitly state which files the application is allowed to know about?
If not, I'll have to do it in a local application, but I was hoping a mass upload could be kicked off from the browser just by setting the local directory of the files.
I decided to use the PrimeFaces uploader, upload all the files in the directory, and let the application sort them out once it has them on the server.
I have created an MVC 4 application with a text file. I write some data to a file, "ddd.txt". On my PC I used the path "#D://Project//ddd.txt" and it worked fine. However, when I deploy my website to "azurewebsites.net", I need to use a different path.
This text file is very important to me, and I would like to read data from it.
What directory path should I use in my application so that it works on the Windows Azure server?
I would try using App_Data (you can just add the directory to your project if it doesn't exist).
You could then load the file like this:
string path = HttpContext.Server.MapPath("~/App_Data/ddd.txt");
string contents = System.IO.File.ReadAllText(path); // fully qualified to avoid clashing with Controller.File
The other option would be to store it in Blob storage; there is a good walkthrough of the different features here: http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
I'd like to force SharePoint to save files in a directory. Is there a way to do that?
I'm thinking of this scenario: users upload files to some list/library in SharePoint, and automatically or by pushing "publish" the files are copied to a directory on the local server.
Edit:
In other words, I would like to connect a SharePoint library with a physical directory on the server that runs IIS, so that files uploaded to the library show up in that folder.
I'm new to SharePoint.
Are you talking about Remote BLOB Storage? I have not tried this, and I assume that RBS can be enabled only at the site level, not for individual document libraries. If you want this for a particular document library, you can write an event handler that saves the uploaded documents to the file system and then removes the uploaded file; a rough sketch follows.
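A sketch of such an event receiver, assuming server-side Microsoft.SharePoint code (the target directory is a placeholder, and error handling is omitted):

using System.IO;
using Microsoft.SharePoint;

public class CopyToDiskReceiver : SPItemEventReceiver
{
    public override void ItemAdded(SPItemEventProperties properties)
    {
        SPFile uploaded = properties.ListItem.File;
        if (uploaded == null) return; // item has no file attached

        string targetDir = @"D:\SharePointCopies"; // placeholder path on the IIS server
        Directory.CreateDirectory(targetDir);

        byte[] bytes = uploaded.OpenBinary();
        File.WriteAllBytes(Path.Combine(targetDir, uploaded.Name), bytes);

        // To move rather than copy, the list item could be deleted here, as suggested
        // above; the app pool account needs write access to targetDir either way.
    }
}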
Most likely you don't want to do this. If you're doing it in order to access the files from other applications, or to have them show up in a user's home directory or something, you can just map the document library as a network drive/web folder.