Pushing Bluemix app wipes files in the public folder - node.js

I have a Bluemix app (with a Node.js backend) to which I upload some files (into the folder public/uploads).
Whenever I change my server code and cf push the app, the files in the uploads folder are wiped. How can I publish my app without overwriting the files and folders I want to keep?
Thanks in advance.

This is happening because of the way Cloud Foundry works (Bluemix runs on Cloud Foundry): the file system is ephemeral and should not be used to store uploaded files.
Whenever an app restarts, crashes, scales, or you push a new version, the file system is wiped.
Additionally, if you scale your app to, say, 5 instances, each instance would have its own separate set of uploads.
I would highly encourage you to check out the 12 Factor App. One of its tenets is not storing files on disk.
Instead, use a shared object store such as OpenStack Swift, which is available in Bluemix as the Object Storage service.
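As a rough illustration, here is a minimal sketch of streaming an upload to a Swift-backed Object Storage container using the pkgcloud Node.js library. The credentials, region, and container name below are placeholders; in Bluemix you would read the real values from the service's VCAP_SERVICES entry.

var fs = require('fs');
var pkgcloud = require('pkgcloud');

// Placeholder credentials -- substitute the values from VCAP_SERVICES.
var client = pkgcloud.storage.createClient({
  provider: 'openstack',
  username: 'YOUR_USERNAME',
  password: 'YOUR_PASSWORD',
  authUrl: 'https://identity.open.softlayer.com', // placeholder auth endpoint
  region: 'dallas'                                // placeholder region
});

// Pipe the file into a Swift container instead of writing to public/uploads.
var readStream = fs.createReadStream('photo.png');
var writeStream = client.upload({ container: 'uploads', remote: 'photo.png' });

writeStream.on('error', function (err) { console.error('Upload failed:', err); });
writeStream.on('success', function (file) { console.log('Stored', file.name); });

readStream.pipe(writeStream);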

Restaging will wipe your files on Cloud Foundry, as there is no permanent local storage available. Try using Bluemix Live Sync to edit your code on the fly -- no restaging required. You can either download the Bluemix Live CLI or use IBM DevOps Services to take advantage of Live Edit. The documentation covers all the options.
For more permanent storage solutions, check out the Bluemix catalog for services like Cloudant and Redis.

The file system for Cloud Foundry applications is ephemeral. Every time you push, restart, or scale, you get a fresh file system. Your application should not store files on disk (except cache/temp files); it should store uploaded files in some kind of database or blob store. Look into the Cloudant service.
Considerations for Designing and Running an Application in the Cloud
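To make the Cloudant suggestion concrete, here is a minimal sketch of saving an upload as a document attachment using the nano-based `cloudant` npm package. The account, database, and document id are placeholders, not values from the question.

var fs = require('fs');
var Cloudant = require('cloudant');

// Placeholder credentials -- take them from your Cloudant service binding.
var cloudant = Cloudant({ account: 'YOUR_ACCOUNT', password: 'YOUR_PASSWORD' });
var db = cloudant.db.use('uploads');

// Stream the file into the database as an attachment instead of
// leaving it on the ephemeral local disk.
fs.createReadStream('photo.png')
  .pipe(db.attachment.insert('photo-doc', 'photo.png', null, 'image/png'));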

Related

How can I save images uploaded to the file system across deploys on Google App Engine

In my app deployed on Google App Engine, I save images uploaded by users in the public folder of my file system (using Multer).
Now, if I deploy a new version of my app, the uploaded images are lost. How do I solve this issue?
Is there any method by which I can deploy a new version of my app while keeping the uploaded images intact?
Basically, how can I back up my file system?
App Engine is a serverless and stateless product. You lose the images when you deploy a new version, but also when the service scales up or down, or when your instance is stopped for maintenance and restarted on another server (this is the case for App Engine Flex, which restarts instances at least once a week).
In this context, your design is not correct for a serverless product. You need to save the files elsewhere, typically in Cloud Storage, and load them from there. If you need to index, search, or list your files, a common pattern is to save the metadata in a database, Cloud Firestore for example, so you can easily search the files and then download them from Cloud Storage.
Note: there is no persistent file system; in a serverless environment it is an in-memory file system. You can also hit out-of-memory errors if you store too many files.
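Since the question uses Multer, one way to apply this advice is to keep uploads in memory and write them straight to a bucket. A minimal sketch with @google-cloud/storage follows; the bucket name is a placeholder, and on App Engine the client picks up credentials automatically.

const express = require('express');
const multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // no disk writes
const bucket = new Storage().bucket('YOUR_BUCKET_NAME');    // placeholder

app.post('/upload', upload.single('image'), async (req, res) => {
  const file = bucket.file(req.file.originalname);
  // Persist the buffer in Cloud Storage; it survives every redeploy.
  await file.save(req.file.buffer, { contentType: req.file.mimetype });
  res.send('Stored gs://' + bucket.name + '/' + file.name);
});

app.listen(8080);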

Why are my file uploads missing/deleted? Azure App Service

I'm currently developing a web application using Flask. The app takes information from a SQL database to generate a .pdf that the end user can download from the browser.
My problem is that some hours after I deploy the app using Azure App Service, make some changes to the SQL database, and generate some .pdfs, the app automatically resets to its original state. I lose all the changes to the SQL database and the generated documents, as if the app had some kind of ephemeral memory.
Is there any way I can store these files without losing them after a while?
In this case, what Azure service/solution do you recommend for storing these files?
The data generated once the app is in use should be fairly small; the SQL database will be updated once a month, with a couple of .pdfs generated.

Migrating Large Content Web App to Azure

I have a large web app with around 20 gigabytes of images and mp3s. It currently uses standard file IO libraries to read and write the images and mp3s. I'd like to migrate it to Azure, but I have concerns about storing that much content. Is it possible to use an App Service to host the web app, with some sort of storage mounted at the root of the site for the assets, without rewriting all of the file access to use blobs or some other API?
If you look at the App Service plans here, you will notice that with Standard and better plans you get more than 20 GB of storage (50 GB+), so it is certainly possible to take your app as-is and run it in Azure. However, it is not a recommended practice.
What you should do is use Azure Blob Storage for the media content. You will need to make some code changes, as you can't simply mount Azure Blob Storage as a network drive.
There is also Azure Files, which can be mounted as a network drive, but as of today you can't mount a File Share as a network drive in an Azure Web App. You would need to deploy your application in a Virtual Machine (IaaS) or rewrite it to run as a Cloud Service.
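To give a feel for the shape of the code change, here is a minimal Node.js sketch (the question doesn't state its stack, so this is illustrative only) of uploading an asset with the `azure-storage` package. Account name, access key, and container are placeholders.

var azure = require('azure-storage');

// Placeholder credentials -- use your storage account's name and access key.
var blobService = azure.createBlobService('YOUR_ACCOUNT', 'YOUR_ACCESS_KEY');

blobService.createContainerIfNotExists('media', { publicAccessLevel: 'blob' },
  function (err) {
    if (err) { return console.error(err); }
    // Upload the mp3 once; serve it afterwards from its blob URL
    // rather than from the App Service file system.
    blobService.createBlockBlobFromLocalFile('media', 'song.mp3',
      'assets/song.mp3', function (uploadErr) {
        if (uploadErr) { return console.error(uploadErr); }
        console.log('Uploaded song.mp3');
      });
  });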

Best way to store private files in a node.js application hosted on Windows Azure

I'm planning to host a simple node.js app on the website tier of Azure.
The application uses a .pem file to sign responses. The issue is that I'd like to deploy the application by simply pushing the git repo, but I don't want to include the .pem file in that repo, because that seems like a big security issue.
Is there a way I can manually upload one file? What's the best way to store a .pem file on Windows Azure? What are common ways to handle situations like this?
This question is a bit open-ended, as I'm sure there are several viable ways to securely transfer a file.
Having said that, from an Azure-specific standpoint: you should be able to upload a file to your Web Site via FTP. Alternatively, you could push the file to a specific blob and have your app check that blob periodically. To upload (or download later), you'd need the storage account's access key, and as long as you're the only one with that key, you should be fine. When uploading from outside Azure, you can connect to storage over SSL, further securing the upload.
While you could probably use a queue (either Storage or Service Bus) with the .pem file as the payload of a message that your node app monitors, you'd need to ensure the file fits within the message size limits (64 KB for an Azure Storage queue, 256 KB for a Service Bus queue).
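A minimal sketch of the blob approach, using the `azure-storage` package: the node app fetches the .pem from a private container at startup, so the key material never lives in the git repo. Account, container, blob name, and the `startServer` bootstrap function are all placeholders.

var azure = require('azure-storage');

// Placeholder credentials -- only you hold the storage access key.
var blobService = azure.createBlobService('YOUR_ACCOUNT', 'YOUR_ACCESS_KEY');

blobService.getBlobToText('secrets', 'signing-key.pem', function (err, pem) {
  if (err) { throw err; }
  // `pem` now holds the key text in memory; hand it to your signing code.
  startServer(pem); // hypothetical bootstrap function for your app
});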

Azure: Start large app quickly on demand?

I don't really understand how cloud services work.
In particular, I would like to know if it's possible to:
Upload a big application (>1 GB) only once, pay little (only for the storage), and on demand quickly spawn instances of it (with at most a few minutes of startup time).
So: only pay when the application actually runs, without having to upload it all again on every stop/start.
Thanks!
Greets,
soacc32
It is certainly possible to do so; in fact, this is how cloud service deployment works. When you deploy your application from your local computer, the application package files are first uploaded to blob storage and then deployed from there. You can upload the package to blob storage separately, and when you want to create the deployment, just specify the blob URL. If you're creating the deployment through the portal, for the package file and configuration file location you would choose "FROM STORAGE" instead of "FROM LOCAL".
Hope this helps.
