I'm having trouble storing CSV files on a Heroku server. I have a JavaScript application running Express and a few other Node packages, and I'm wondering what I should do in order to maintain those CSV files. Heroku will always wipe new data and revert to the last deployed code (https://help.heroku.com/K1PPS2WM/why-are-my-file-uploads-missing-deleted), so any new files added while the app is running will be erased. Should I look for another dedicated service to deploy the project on, or store the data in a database?
I deployed an app on Heroku that can save files along with a database record. The problem is that when I deploy some changes, the folder with the files is recreated without the files that were there before the deploy (the rows in the database stay, of course). How can I resolve this? I want the "files" folder in the app root to stay the same across all future deploys on Heroku.
App: Node.js + Express + React + PostgreSQL
This is not possible: Heroku provides an ephemeral filesystem which is wiped out after each deployment.
You need to persist data in a database (MongoDB Atlas, the Heroku Postgres add-on) or use external file storage (AWS S3, Dropbox).
Check out Files on Heroku to see some options and examples.
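If you go the S3 route, a minimal sketch in Node.js might look like the following (the region, bucket name and file paths are placeholders, and it assumes the official @aws-sdk/client-s3 package with credentials supplied through environment variables):

// Minimal sketch: persist a CSV the app just wrote to S3 instead of relying on the dyno's disk.
// The region, bucket name and paths below are placeholders for your own values.
const fs = require("fs");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

async function persistCsv(localPath, key) {
  // Read the file while it still exists on the ephemeral filesystem and push it to the bucket.
  const body = fs.readFileSync(localPath);
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-app-csv-bucket", // placeholder bucket name
      Key: key,
      Body: body,
    })
  );
  console.log(`Uploaded ${localPath} as ${key}`);
}

persistCsv("./data/report.csv", "csv/report.csv").catch(console.error);

On later requests you would read the CSV back from the bucket (or from the database) rather than from the local folder.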
Is there any possibility to download a file from the running app?
I'm using quick.db for my Discord bot's database, and I need the running JSON database file with user data before restarting. When I deploy a new build, the old JSON database file from GitHub will load and I will lose all the new data in that database.
Thanks for anything...
You can programmatically upload the JSON file to remote storage (S3, Dropbox): you can do this at restart (if you can intercept the event) or at regular intervals (this might not work for you; it depends on how often the db is updated).
You can read about some possible options on Heroku Files.
The download option is not a good idea, because you need to secure the download endpoint (I believe you don't want anybody to be able to grab your data) and also because the dyno restarts at least every 24 hrs (if you don't deploy for a day, the dyno will restart without notice and without giving you time to download your data).
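As a rough illustration of the "upload at regular intervals" approach, here is a minimal sketch assuming the Dropbox JavaScript SDK (the dropbox npm package), a DROPBOX_TOKEN environment variable, and a placeholder path for wherever quick.db keeps its data file:

// Minimal sketch: back up the bot's database file to Dropbox on a timer so a
// dyno restart or redeploy doesn't lose the latest data. The local file name
// is a placeholder for wherever quick.db stores its data in your setup.
const fs = require("fs");
const { Dropbox } = require("dropbox");

const dbx = new Dropbox({ accessToken: process.env.DROPBOX_TOKEN });

async function backupDatabase() {
  const contents = fs.readFileSync("./json.sqlite"); // placeholder local path
  await dbx.filesUpload({
    path: "/backups/json.sqlite", // placeholder remote path
    contents,
    mode: { ".tag": "overwrite" },
  });
  console.log("Database backup uploaded at", new Date().toISOString());
}

// Back up every 15 minutes; tune the interval to how often the db changes.
setInterval(() => backupDatabase().catch(console.error), 15 * 60 * 1000);

On startup you would do the reverse: download the latest backup before the bot starts serving, so the fresh deploy picks up where the old dyno left off.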
I have a backend Node.js application and I am fetching and streaming files in the background when a certain event happens on the client.
I have deployed the backend to Google App Engine.
The file downloading is working fine, but I am a bit confused about where the files are downloaded and stored. In the app I am creating a folder relative to the deployed app folder and storing them there with createWriteStream. I also init a git repository where the files are (using the simple-git npm module).
It seems the files are not accessible via Cloud Shell, since I cannot find them there.
Can I, for example, create a storage bucket and use "normal" file operation commands there (and init the repo there)?
-Jani
To persist downloaded data you should store it in Cloud Storage; you can find a complete guide in the Using Cloud Storage documentation.
Under almost no circumstances do you want to download files into the App Engine deployment, since the instances don't have much memory to store data, and when the deployment scales up and down you are prone to losing data.
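For example, a minimal sketch that streams a fetched file straight into a bucket instead of the instance's local disk might look like this (it assumes the @google-cloud/storage package and a placeholder bucket/object name; on App Engine the client authenticates with the default service account):

// Minimal sketch: pipe an HTTP download directly into a Cloud Storage object
// so nothing has to live on the App Engine instance's filesystem.
const https = require("https");
const { Storage } = require("@google-cloud/storage");

const storage = new Storage(); // uses App Engine's default service account
const bucket = storage.bucket("my-download-bucket"); // placeholder bucket name

function saveToBucket(url, objectName) {
  return new Promise((resolve, reject) => {
    https
      .get(url, (res) => {
        res
          .pipe(bucket.file(objectName).createWriteStream())
          .on("finish", resolve)
          .on("error", reject);
      })
      .on("error", reject);
  });
}

saveToBucket("https://example.com/data.csv", "downloads/data.csv")
  .then(() => console.log("Stored in Cloud Storage"))
  .catch(console.error);

Objects in the bucket are then visible from Cloud Shell and the console, unlike files written to the instance. Note that you cannot init a git repository inside a bucket; it is object storage, not a normal filesystem.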
I am using Node.js to upload files to my Heroku server. Everything works fine, but when the Heroku server restarts or goes down, all the uploaded files disappear and requests to their URLs return 'Not Found'.
I experienced this months ago. You need to host the images somewhere else, as Heroku does not support persisting uploaded files on its dynos. I ended up using Cloudinary to store files, and later on getting a VPS server.
Media files and other content uploaded by users should not be stored on Heroku. Heroku was built to run your application code, and it only cares about the files of your application that are in your repository.
Heroku discards your previous environment on every deploy, and launches a new one based on your code repository.
So only application code should stay there; everything else should be delegated to other services. In this case, for file storage, you should use something like S3 or a similar service.
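As an illustration, a minimal Express upload route that hands files off to Cloudinary instead of keeping them on the dyno could look like the sketch below (it assumes the multer and cloudinary packages and that CLOUDINARY_URL is set in the environment; the folder name is a placeholder):

// Minimal sketch: accept an upload, push it to Cloudinary, and return the
// hosted URL. The dyno's disk is only used as a temporary scratchpad.
const express = require("express");
const multer = require("multer");
const cloudinary = require("cloudinary").v2; // reads CLOUDINARY_URL from the env

const app = express();
const upload = multer({ dest: "/tmp/uploads" }); // temporary location, safe to lose

app.post("/upload", upload.single("image"), async (req, res) => {
  try {
    const result = await cloudinary.uploader.upload(req.file.path, {
      folder: "user-uploads", // placeholder folder name
    });
    // Persist result.secure_url in your database, not the local file path.
    res.json({ url: result.secure_url });
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(process.env.PORT || 3000);

The same route shape works with S3 or any other object store; the key point is that the response you save and serve later is the remote URL, never a path on the dyno.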
Heroku has something called Ephemeral filesystem.
From the documentation:
Each dyno gets its own ephemeral filesystem, with a fresh copy of the most recently deployed code. During the dyno's lifetime its running processes can use the filesystem as a temporary scratchpad, but no files that are written are visible to processes in any other dyno and any files written will be discarded the moment the dyno is stopped or restarted.
I am creating a Windows 10 Universal App which uses a local SQLite Database.
In order for the app to use the database file, it must be placed in:
C:\Users\<Username>\AppData\Local\Packages\<Name of Package>\Local State
Now I understand this is the 'local' file structure for the application. However, I have a pre-made database that the app needs to interact with, and it should therefore be bundled as part of the app on install.
Is there a method of including my database in a usable fashion when distributing my application via a side-load install?
Furthermore, this problem is of paramount importance, as this 'C:\' directory will not exist when pushing my application to a mobile phone or another Windows 10 (non-desktop) device.
You cannot package the database directly as read-write data (local state). If you only ever need to read from the database, you can just include it in your project and read it from Package.Current.InstalledLocation.
If you need to write to the database, but it contains some initial values you want to ship with your app, then you still need to include the database in your project, but then copy it from the InstalledLocation to ApplicationData.Current.LocalFolder if it doesn't exist when your app starts up.
You can always export your existing database as an SQL script and save it in your project assets.
On the first run of your application you can create the SQLite file in your LocalFolder and run the script with its CREATE and INSERT queries.