I wrote a script that uses the Slack API to parse AWS S3 files, looking for strings or samples. Since this is still in testing, I'm running it on my local machine and using ngrok to forward localhost traffic.
The thing is that the generated files are getting stored on my machine, and they will be stored on a server once the script is ready for production.
Ideally, I'd like to avoid users having to grab files from the server. Do you think it's possible to store them directly on the user's local machine?
No. Slack does not allow you to access the local machine of its users through a Slack app / API.
Solution 1: Download via browser
The easiest solution would be to offer a direct download link in a Slack message, e.g. via a link button. Once the user clicks it, they are prompted to download the file to their local machine.
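A link button like this can be added to a Slack message with Block Kit; the button text and download URL below are placeholders:

```json
{
  "blocks": [
    {
      "type": "actions",
      "elements": [
        {
          "type": "button",
          "text": { "type": "plain_text", "text": "Download" },
          "url": "https://example.com/download.php?file=demo.png"
        }
      ]
    }
  ]
}
```

When the user clicks the button, Slack simply opens the URL in the browser, which then triggers the download.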
To enable downloading via browser you need to set appropriate headers and send the file contents to the browser.
One approach is to have a helper script perform the actual download and include a link to that script in the link button (you may also want to include parameters in the link that define which file is downloaded).
The helper script then does the following:
Fetch the file to be downloaded (e.g. a PNG image)
Set headers to enable downloading via browser
Send the file to the browser
Here is an example in PHP:
<?php
$filename = "demo.png";

// Tell the browser to download the file instead of displaying it
header('Content-Disposition: attachment; filename="' . $filename . '"');
header('Content-Type: image/png');
header('Content-Length: ' . filesize($filename));

// Stream the file contents to the browser and stop the script
readfile($filename);
exit();
For more info on download headers, see also this answer on SO.
Solution 2: Upload to Slack
Alternatively, you could upload the file to the user's Slack workspace via the files.upload API method. That way the user does not need to download anything, and you can remove the file from your server after your app has finished processing.
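A sketch of calling files.upload from Node.js (the token, channel ID, and file contents are placeholders; Node 18+ provides fetch, FormData, and Blob globally):

```javascript
// Build the multipart form that files.upload expects.
const buildUploadForm = (channelId, filename, contents) => {
  const form = new FormData();
  form.append('channels', channelId);
  form.append('filename', filename);
  form.append('file', new Blob([contents], { type: 'text/plain' }), filename);
  return form;
};

// POST the form to Slack with a bot token.
async function uploadToSlack(token, form) {
  const res = await fetch('https://slack.com/api/files.upload', {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}` },
    body: form,
  });
  return res.json(); // Slack responds with { ok: true, file: {...} } on success
}

const form = buildUploadForm('C0123456789', 'report.txt', 'generated contents');
if (process.env.SLACK_TOKEN) {
  // Only attempt the network call when a real token is provided.
  uploadToSlack(process.env.SLACK_TOKEN, form).then((r) => console.log(r.ok));
}
```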
Related
I would like to generate a one-time download link in Node.js and email it to the user so they can download the file. I want the link to expire after a while, say one day or one week. How can I do this using Node.js?
Thanks!
I can download the file using res.download, but it sends the file directly to the client and does not generate a download link.
This depends on where you are saving the file.
If you save the file in a static folder on your own server, you can expose it through your server's URL.
This is explained in "Uploading File to a static folder in the Server": https://www.bezkoder.com/node-js-express-file-upload/
Now, you also want the link to expire. This is more complicated, since you have to store a timestamp for each link in the database and invalidate the link once the allotted duration passes.
Amazon S3 already does all of this, so if it's possible to use it, you should do so instead of implementing everything yourself.
In AWS S3, you can store your file and generate signed links that expire after a set duration.
I tried uploading a file to file.io, which has an API, using curl -F "file=@filename.txt" "https://file.io/"
How can I achieve this using easyupload.io, validate that the file was uploaded properly, and also display the URL of the uploaded file?
Easyupload.io uses Google reCAPTCHA in the background to prevent automated uploading.
So I guess that if there is no official API for this site, it's not meant to be used from a script / program.
You can reverse-engineer the upload process by opening the network tab in your browser's developer tools.
I have a Node.js backend, and I want to send a file download link to the client such that the file is directly accessible by the client. The file types are JPEG and PNG. Currently, I am serving these files as data URIs but, due to a change in requirements, I must send a download link in response to the file request, and the client can download the file later using that link.
The current workflow exposes a path /getAvatar. This path should send a response back to the client with the file link. The file is stored in /assets/avatars relative to the server root. I know I can use the express.static middleware to send back static resources. However, the methods I have seen so far, res.send() and res.download(), both try to send the file as an attachment rather than a link that can be used later to download it.
Basically, the behavior is like a regular file-sharing site where, once a file is clicked, a link to it is generated, which is then used to download the file. How can I do this?
I am currently using the Dropbox file picker to download files. I get the download link after selecting a file with the picker.
Is there any possibility that we can save it as a byte stream in the browser and upload it to the server (Node.js) using an HTTP POST call?
Or is there any alternative to this scenario?
Any help would be appreciated.
Instead of downloading and re-uploading the file in the browser, I would process this step on the server side.
You can use the Dropbox and S3 SDKs and follow the steps below:
Make a call to the server that returns the list of file IDs available in Dropbox.
Let the user select a file in the Angular app and send the selected file's resource identifier back to the server.
On the server side, download the file and re-upload it to S3.
Display the result/status back to the user.
Is there any reason you want this to be done in the frontend?
The requirement is to upload an Excel file, initially stored on an FTP server, to Google Drive.
I would like to know if it is possible to achieve this through Google Apps Script. If not Apps Script, is there any way we can fetch files from the FTP server and then upload them to Google Drive?
I found out about the UrlFetchApp class.
var response = UrlFetchApp.fetch("http://www.google.com/");
This makes a request to fetch a URL.
UrlFetchApp.fetch("http://example.com/upload_form.cgi", parameters);
This makes a request to fetch a URL using optional advanced parameters.
I don't know if the above two methods would be of any use.
Well, I found in this thread that FTP access is currently unavailable for Google Drive.
But I found a tutorial here that shows how to connect or integrate Google Drive with FTP. The tutorial uses a multi-cloud storage manager that easily combines the two services.
So if you are interested, just read the tutorial link to learn more about MultCloud.