Heroku doesn't let me create a file even temporarily - node.js

I just discovered that Heroku uses an ephemeral file system, so I opted to store uploaded images in AWS S3. The problem I ran into is creating the file before uploading it.
My app is built with Next.js and hosted on a basic dyno on Heroku. It is an API with no front-end code.
Here is the code I use to create the file, using the 'fs' library:
if (
  !fs.existsSync(
    path + new Date().toISOString().split("T")[0] + ".csv"
  )
) {
  fs.writeFileSync(
    path + new Date().toISOString().split("T")[0] + ".csv",
    await csv.toString(true)
  );
} else {
  fs.appendFileSync(
    path + new Date().toISOString().split("T")[0] + ".csv",
    await csv.toString(false),
    "utf-8"
  );
}
This code creates the file normally on my computer.
The 'path' variable is created using either:
let folder = await fs.promises.mkdir(path, {
  recursive: true,
  mode: 0o755
});
or I pass it in the request body.
I tried different paths such as the Next.js public folder and the "/tmp" folder, and I also created the folders manually in both locations, but with no result.
After a bit of research I found that you can create files, they just get erased when the dyno restarts, which usually happens every 24 hours or so.
So my question is: do you know how I can create a file even temporarily, just long enough to upload it to AWS S3? Or a way to upload a file (png, csv) without creating it in the file system at all?
Thanks in advance for your help.
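For what it's worth, one way around this is to skip the local file entirely and hand the CSV content to S3 straight from memory. The following is only a sketch under assumptions not in the original post: it uses the AWS SDK v3 package @aws-sdk/client-s3, and the bucket name and region are placeholders read from environment variables.

// Sketch: upload the CSV string to S3 without touching the dyno's filesystem.
// Assumes @aws-sdk/client-s3 is installed and AWS credentials/region come
// from the environment; the bucket name is a placeholder.
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: process.env.AWS_REGION });

async function uploadCsv(csvString) {
  const key = new Date().toISOString().split("T")[0] + ".csv";
  await s3.send(
    new PutObjectCommand({
      Bucket: process.env.S3_BUCKET, // placeholder bucket name
      Key: key,
      Body: csvString,               // a string or Buffer works; no file on disk needed
      ContentType: "text/csv",
    })
  );
  return key;
}

If the file really has to exist on disk first, writing it under "/tmp" (or os.tmpdir()) and uploading it immediately also works, since the dyno's filesystem is writable between restarts.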

Related

Is there a way to save a file to Heroku from a click via Puppeteer - NodeJs

I am using Node.js & Puppeteer to access a web page and click a button which downloads a CSV file. Locally this all works fine: I assign a path for the download and it saves into the targeted folder. However, when running this deployed on Heroku, I can see that the Puppeteer script runs, yet no file is downloaded. I assume this is an issue with saving files to Heroku. I realise Heroku uses ephemeral disks, and I am fine with the CSV file being deleted after a short period of time; I just need to read the data, handle it, and then pass it to the frontend.
I have this bit of code which saves the csv file to my local folder destination.
const client = await page.target().createCDPSession();
await client.send('Page.setDownloadBehavior', {
  behavior: 'allow',
  downloadPath: downloadLocation, // Set download location
});

const csvButtonName = '.widget-actions-item';
await page.waitForSelector(csvButtonName);
await page.hover(csvButtonName);
await page.click(csvButtonName); // Downloads the csv file
I can see my same directory structure on Heroku but cannot find the attempted CSV download file.
Any help is appreciated.
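A sketch of one approach, under assumptions not in the original post: point downloadPath at a writable temp directory (Heroku dynos can write to /tmp between restarts) and poll until the CSV appears before reading it. The helper below is hypothetical.

const fs = require("fs");
const path = require("path");
const os = require("os");

// Assumption: use the OS temp directory as the Puppeteer download location.
const downloadLocation = os.tmpdir();

// Poll the download directory until a .csv file shows up (hypothetical helper).
async function waitForCsv(dir, timeoutMs = 30000) {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    const csv = fs.readdirSync(dir).find((name) => name.endsWith(".csv"));
    if (csv) return path.join(dir, csv);
    await new Promise((resolve) => setTimeout(resolve, 500));
  }
  throw new Error("CSV download did not appear in time");
}

// After page.click(csvButtonName):
// const csvPath = await waitForCsv(downloadLocation);
// const data = fs.readFileSync(csvPath, "utf-8");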

How to use node:fs inside of a vue app to save images into source directory

I'm building a personal portfolio website using Vue.js, and I'm attempting to build a form that lets me add to my portfolio later. I'm storing the text data in Firebase, but I also want to be able to upload and access pictures. I'm attempting to upload through a form and save with node:fs using the following:
import { writeFile } from 'node:fs'

export function saveImages (data: FileList, toDoc: string) {
  const reader = new FileReader()
  const imageNames = []
  console.log(toDoc)
  for (let i = 0; i < data.length; i++) {
    imageNames.push(toDoc + '/' + data[i].name)
    reader.readAsBinaryString(data[i])
    reader.onloadend = function (e) {
      if (e.target?.readyState === FileReader.DONE) {
        const imageFile = e.target.result as string
        if (imageFile) {
          writeFile('./assets/' + data[i].name, imageFile, 'binary', (err) =>
            console.log('was unable to save file ' + data[i].name + ' => ' + err)
          )
        }
      }
    }
  }
  return imageNames
}
When I attempt to call saveImages, I get the error
ERROR in node:fs
Module build failed: UnhandledSchemeError: Reading from "node:fs" is not handled by plugins (Unhandled scheme).
Webpack supports "data:" and "file:" URIs by default.
You may need an additional plugin to handle "node:" URIs.
As pointed out in the comments, the Node.js fs module cannot be used in the frontend. Here is why:
While developing your Vue.js app, remember that you always have to run a development server for the app to compile into a browser-readable format. The resulting page will not be delivered as .vue and .js source files; everything is bundled into an HTML file and some additional .js files.
While the development server is running, the source directory of your app lives on a server, and it is not even the directory that is delivered to the browser.
On a production server, static assets are built for your Vue.js app, and those also consist only of .html and .js files.
When a client (browser) accesses a route, static files are delivered to the browser, and all the code you wrote in your .vue files runs on the client's machine, not on a server. So you cannot interact with server-side directories.
Therefore, you should look into a backend framework that you can connect to your frontend, so users can upload files to a server where they will be saved. You will then set up your Vue app to communicate with the backend. Here are some additional resources:
Node modules with vue: StackOverflow
Express.js: popular backend framework for Node.js
Express.js: Deliver files
Blog article on Express.js file upload (Attacomsian)
You might also want to take a look at how to serve static files with Express, because once the files are uploaded and the server receives them, it could store them in a static directory where you can access them without separate API routes.
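To make that suggestion concrete, here is a minimal sketch of such a backend, assuming Express and multer (neither is prescribed by the answer above; the route name and the uploads/ folder are placeholders):

const express = require("express");
const multer = require("multer");

const app = express();
const upload = multer({ dest: "uploads/" }); // hypothetical upload folder

// Receive a single image sent from the Vue frontend.
app.post("/api/images", upload.single("image"), (req, res) => {
  res.json({ filename: req.file.filename });
});

// Serve the uploaded files back as static assets.
app.use("/uploads", express.static("uploads"));

app.listen(3000);

On the Vue side, the form would then send the file to this endpoint with a FormData POST (for example fetch('/api/images', { method: 'POST', body: formData })) instead of calling node:fs.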

Issues with code modifying file structure when deploying

There is a site that contains data I want to parse in my application. The JSON file is inside a tar.gz. My code issues a request to that site, downloads the tar.gz file, extracts the JSON and then parses the information.
This is how the code looks so far, but I have not added it to my backend yet.
const fs = require("fs");
const rp = require("request-promise");
const tar = require("tar");

(async function main() {
  try {
    const url = "https://statics.koreanbuilds.net/bulk/latest.tar.gz";
    const arcName = "latest.tar.gz";
    const response = await rp.get({ uri: url, encoding: null });
    fs.writeFileSync(arcName, response, { encoding: null });
    tar.x({ file: arcName, cwd: ".", sync: true });
    let text = fs.readFileSync("latest1.json");
    let fullText = JSON.parse(text);
    let championsObj = {};
    // Following logic that parses the json file
    // .......
  } catch (err) {
    console.error(err);
  }
})();
I plan on storing my parsed JSON object in MongoDB. I also want to perform the above operation and refresh the JSON and tar.gz file every 24 hours.
I am worried that these operations will have consequences when deploying this project. This is my first time deploying a full-stack application, and I am almost positive that having code that touches the file structure of the overall project will cause some issues. But I don't know exactly what I should be worried about or how to tackle it. I believe there will be a problem with CORS, but I am more worried about the application actually working and updating correctly. The entire application is built with the MERN stack.
When you deploy your code on a VPS, saving to and reading from the filesystem is completely fine. When you deploy to a PaaS like Heroku, you have to keep in mind that the filesystem is ephemeral, which means you get a fresh copy on each deploy. Files that are not part of version control will disappear after a release. You can't rely on the filesystem for storage and have to use an external service to store images/files (e.g. AWS S3).
Having said that, your code will work on Heroku because you're saving and reading the file right away. One thing I'd do is add a date/timestamp to the downloaded file name so you don't run into problems on the second run when a file with that name already exists. You could also look into extracting the archive in memory so you don't have to use the filesystem at all.
Other than that you shouldn't be worried. CORS is not relevant in this context.
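As a starting point for the in-memory idea, here is a rough sketch (my assumption, not code from the answer, and not tested against that archive) using Node's zlib and node-tar's Parse class; the entry name latest1.json is taken from the question's code:

const zlib = require("zlib");
const tar = require("tar");

// Extract one JSON entry from a .tar.gz Buffer without writing anything to disk.
function extractJsonFromTarGz(gzBuffer, entryName) {
  return new Promise((resolve, reject) => {
    const parser = new tar.Parse();
    parser.on("entry", (entry) => {
      if (entry.path === entryName) {
        const chunks = [];
        entry.on("data", (chunk) => chunks.push(chunk));
        entry.on("end", () => resolve(JSON.parse(Buffer.concat(chunks).toString())));
      } else {
        entry.resume(); // skip other entries
      }
    });
    parser.on("error", reject);
    zlib.gunzip(gzBuffer, (err, tarBuffer) => {
      if (err) return reject(err);
      parser.end(tarBuffer);
    });
  });
}

// Usage with the code above, where `response` is the downloaded Buffer:
// const fullText = await extractJsonFromTarGz(response, "latest1.json");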

Node/Express: File not downloading using fs.pipe()

I'm having an issue downloading files to my local machine using fs.pipe() in my Node.js/Express Cloud Foundry application.
Basically, a POST request is sent to my server-side code containing the name of the file my user wants to download. I access the file using the GET command from the npm module ssh2-sftp-client. Finally, the file gets saved to the user's local downloads folder, using the npm module downloads-folder to identify that location. The code looks like this:
app.post('/download-file', function(req, res) {
  // Declare the files remote and local path as a variable.
  const remoteFilename = 'invoice/csv/' + req.body.file;
  const localFilename = downloadsFolder() + '/' + req.body.file;

  // Use the SFTP GET command to get the file passing its remote path variable.
  sftp.get(remoteFilename).then((stream) => {
    // Download the file to the users machine.
    stream.pipe(fs.createWriteStream(localFilename));
    // Redirect user.
    res.redirect('/invoice');
  });
})
This works perfectly when running locally and the file gets downloaded with no issues.
However, when I push this to our Cloud Foundry provider using cf push, the application still works fine, but when I try to download the file it fails. I get no errors from my error catching; the only thing that's changed is the destination file path shown in the output.
I have no idea why this is. The code works fine in Chrome and Safari when running locally, but when hosted it doesn't do anything. Can anyone explain what's going wrong here?
Many thanks,
G
// Download the file to the users machine.
It doesn't do that, though: it downloads the file to the machine on which the server code is running. That's why it seems to work when you run the server on localhost, because the server machine and the user machine are the same.
Offering the file as a download involves streaming the file through the response object, making sure that you set the correct header. Something like this:
sftp.get(remoteFilename).then((stream) => {
  res.set('content-disposition', `attachment; filename="${ req.body.file }"`);
  stream.pipe(res);
});
When offering a file as "attachment", it typically opens the download window of the browser and the browser stays on the same page. You can't both stream and perform a redirect, but because the browser doesn't change the URL, that should be okay.
Also, you have no control over where the user will download the file to. It may be the downloads folder, but they are free to choose another folder.

node.js on Heroku keeps losing images after a while

My Node.js app keeps losing static images (.jpg, .png, ...) after a while. It doesn't lose any images on my local Win10 desktop, and even on Heroku my webpack bundle.js is served from the same static route (/pub or /dist) and works just fine. Somehow only the static images are affected: they are served fine for the first few minutes after I upload them, then after a while they disappear. I am using express.static for the static route declaration and multer for file upload. The files used for testing were all lowercase .jpg (since I've heard Heroku arbitrarily changes all uppercase extensions), so I don't know what's causing the problem.
server code:
const storage = multer.diskStorage({
  destination: (req, file, cb) => {
    cb(null, 'pub/')
  },
  filename: (req, file, cb) => {
    cb(null, Date.now() + file.originalname)
  }
})

const upload = multer({ storage: storage })

// access to static files
app.use('/pub', express.static(pubDir))
app.use('/dist', express.static(dstDir))

app.post('/modwimg', upload.any(), (req, res, next) => {
  // here I connect filenames from the files array to the db
})
Then, if there's a client request, the server fetches the filename from the db and puts '/pub/' in front of it. It works just fine on both my local machine and Heroku; it's only that the images on Heroku disappear after a while.
The Heroku file system is ephemeral. If you want to allow users to upload files to your app, you'll need to use external storage like S3, database blobs, or a hosted service like Cloudinary. See this thread for more information: https://www.reddit.com/r/rails/comments/2k9sq4/heroku_any_files_you_upload_will_not_be_saved/
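A rough sketch of what the external-storage route could look like, assuming multer's memoryStorage plus the AWS SDK v3; the bucket name is a placeholder and none of this is prescribed by the answer above:

const multer = require("multer");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

// Keep uploads in memory instead of writing them to 'pub/'.
const upload = multer({ storage: multer.memoryStorage() });
const s3 = new S3Client({ region: process.env.AWS_REGION });

app.post("/modwimg", upload.any(), async (req, res, next) => {
  try {
    for (const file of req.files) {
      const key = Date.now() + file.originalname; // same naming scheme as before
      await s3.send(new PutObjectCommand({
        Bucket: process.env.S3_BUCKET, // placeholder bucket name
        Key: key,
        Body: file.buffer,
        ContentType: file.mimetype,
      }));
      // store `key` (or the resulting S3 URL) in the db instead of a local filename
    }
    res.sendStatus(200);
  } catch (err) {
    next(err);
  }
});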
