Where are files downloaded in Google App Engine? - node.js

I have a backend Node.js application, and when a certain event happens on the client I fetch and stream files in the background.
I have deployed the backend to Google App Engine.
The file downloading works fine, but I am a bit confused about where the files are downloaded and stored. In the app I create a folder relative to the deployed app folder and store them there with createWriteStream. I also init a git repository where the files are (using the simple-git npm module).
The files don't seem to be accessible via the Cloud Shell, since I cannot find them there.
Could I, for example, create a storage bucket and use "normal" file operation commands there (and init the repo there)?
-Jani

To store downloaded data you want to use Cloud Storage; you can find a complete guide in the Using Cloud Storage documentation.
Under almost no circumstances do you want to download files into the App Engine deployment itself: the instances don't have much memory to store data, and when the deployment scales up and down you are prone to losing that data.
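As a rough sketch of the Cloud Storage approach (the SDK calls are shown in comments and assume `@google-cloud/storage` is installed with credentials available; the bucket name and the naming helper are assumptions, not from the question), the existing `createWriteStream` code can be pointed at a bucket object instead of the local filesystem:

```javascript
// Sketch only: assumes @google-cloud/storage is installed and the app
// has credentials; "my-app-files" is a placeholder bucket name.
//
// const { Storage } = require('@google-cloud/storage');
// const storage = new Storage();
//
// function saveToBucket(readable, objectName) {
//   return new Promise((resolve, reject) =>
//     readable
//       .pipe(storage.bucket('my-app-files').file(objectName).createWriteStream())
//       .on('finish', resolve)
//       .on('error', reject));
// }

// A pure helper (the naming scheme is an assumption) to keep per-client
// downloads from colliding in the bucket:
function objectNameFor(clientId, fileName) {
  return `downloads/${clientId}/${fileName}`;
}

console.log(objectNameFor('client-42', 'report.csv')); // → downloads/client-42/report.csv
```

The `pipe` target is the only thing that changes relative to writing to local disk; the object then persists across instance restarts and scaling events.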

Related

Azure Windows App Service files not available across nodes

Our application can generate a file on request, which is then downloaded by the client. We are seeing issues where, when our app service has more than one node, the generated files are not available on the other nodes.
E.g.:
A POST request to generate a file and save it to d:\home\site\wwwroot\app_data is sent by the user to machine-1 and succeeds.
A GET request from the user to download this file is received by machine-2 and fails because the file cannot be found.
My reading of the Microsoft docs is that anything in d:\home is backed by Azure Storage and is not local to the machine: https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#file-access
File access across multiple instances: The home directory contains an app's content, and application code can write to it. If an app runs on multiple instances, the home directory is shared among all instances so that all instances see the same directory. So, for example, if an app saves uploaded files to the home directory, those files are immediately available to all instances.
But this doesn't seem to be happening. Is there something else that needs configuring?

Download Firebase Storage folder to a Cloud Functions Temporary Folder

I'm trying to download an entire folder from Firebase Storage so I can zip those files and upload them back to Firebase Storage while generating a download link.
I have read a lot of posts and code, but everything seemed outside my scope.
Do you have a clear example of how to download a Firebase Storage folder to a Cloud Functions temporary folder, or at least some hints on how I could do it, keeping in mind that I am targeting a specific folder?
There is no bulk download operation provided by the SDK. The general pattern for downloading all files with a shared prefix using the Node.js SDK for Cloud Storage is:
list the files using getFiles at the prefix (folder) of interest
iterate them
download each one separately

storing csv files on a dedicated server

I'm having trouble storing CSV files on a Heroku server. I have a JavaScript application running Express and a few other Node packages, and I'm wondering what I should do to maintain those CSV files. Heroku will always wipe new data and revert to the last deployed code (https://help.heroku.com/K1PPS2WM/why-are-my-file-uploads-missing-deleted), so any new files added while the app is running will be erased. I'm wondering whether I should look for another dedicated service to deploy the project on, or store the data in a database.

Where are source files stored on Google Cloud Platform when deployed from local machine

I have just deployed a basic Node.js Express app to Google Cloud Platform from IntelliJ IDEA. However, I cannot find or browse the source files. I have searched the Development tab and the App Engine tab; they show the project but not the actual files. I can access the application from my browser and it is running fine, and I can see all the activity and requests coming into the application, but I cannot see the source files. I tried searching for them in the Google Cloud Console terminal and I cannot locate the files there either. It's puzzling because I don't know where the files are being served from.
AFAIK, seeing the live app code/static content directly in the developer console is not possible (at least not yet), not even for standard environment apps.
For apps using the flexible environment (which includes Node.js apps), accessing the live app source code may be even more complex, as what's actually executed on GAE is a container image (as opposed to the plain app source files of a standard environment app). From Deploying your program:
Deploy your app using the gcloud app deploy command. This command automatically builds a container image for you by using the Container Builder service (Beta) before deploying the image to the App Engine flexible environment control plane. The container will include any local modifications you've made to the runtime image.
Since the deployed artifacts are fundamentally Docker containers, it might be possible to extract their content using the docker export command:
Usage: docker export [OPTIONS] CONTAINER
Export a container's filesystem as a tar archive
Options:
      --help            Print usage
  -o, --output string   Write to a file, instead of STDOUT
The docker export command does not export the contents of volumes associated with the container. If a volume is mounted on top of an existing directory in the container, docker export will export the contents of the underlying directory, not the contents of the volume.
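As a sketch, assuming you can reach the running container with the docker CLI (the container name `gae-app` is a placeholder), exporting and inspecting the filesystem could look like:

```shell
# Export the container's filesystem to a tar archive
# ("gae-app" is a placeholder container name/ID)
docker export -o app.tar gae-app

# List the archive contents to locate the deployed app sources
tar -tf app.tar | less
```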
One way of checking the exact structure of the deployed app (at least in the standard environment) is to download the app code and inspect it locally, which may be useful if a suspected bad deployment puts a question mark over the local development repository from which the deployment originated. I'm not sure if this is possible with the flexible environment, though.
The accepted answer to the recent Deploy webapp on GAE then do changes online from GAE console post appears to indicate that reading and maybe even modifying live app code might be possible (but I didn't try it myself and it's not clear if it would also work for the flexible environment).

AWS to host assets and client side static code at central repository which should be accessible from node to upload files to central location

I am using AWS to run my application, which is based on the MEAN stack. I have a load balancer in front of three Node application server instances and a three-instance MongoDB cluster. My application lets users upload file content to the server, mainly images, audio, video, etc. I want the following:
I want one central content repository accessible from all three of my Node application servers, so that my Node code can upload files to it.
I want one URL for accessing this central content repository, which the user interface can use to load and display assets.
I also want to re-purpose this same central repository to host all of my client-side JavaScript, CSS and images, with index.html referring to client-side assets via the central repository URL.
I was looking into AWS options but got confused, and I'm not able to work out the best and easiest approach. The server architecture is attached for reference.
Please advise.
I couldn't figure out how to use EC2 to solve this problem, so I changed my implementation approach: rather than hosting files on the file system, I am using MongoDB GridFS to store files. That makes them available to each of the Node.js hosts.
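A minimal sketch of that GridFS approach, assuming the official `mongodb` Node.js driver (the driver calls are shown in comments, and the `uploads` bucket name is a placeholder), with a small runnable helper illustrating how GridFS chunks files:

```javascript
// Sketch only: assumes the official `mongodb` driver and a connected
// `db` handle; "uploads" is a placeholder bucket name.
//
// const { GridFSBucket } = require('mongodb');
// const fs = require('fs');
//
// function storeUpload(db, localPath, name) {
//   const bucket = new GridFSBucket(db, { bucketName: 'uploads' });
//   return new Promise((resolve, reject) =>
//     fs.createReadStream(localPath)
//       .pipe(bucket.openUploadStream(name))
//       .on('finish', resolve)
//       .on('error', reject));
// }

// GridFS stores files as chunks (255 KiB by default in the driver), so
// every app server behind the load balancer reads the same chunks from
// the shared cluster rather than a node-local disk:
function gridfsChunkCount(fileSizeBytes, chunkSizeBytes = 255 * 1024) {
  return Math.ceil(fileSizeBytes / chunkSizeBytes);
}

console.log(gridfsChunkCount(1024 * 1024)); // a 1 MiB file → 5 chunks
```

The trade-off versus an object store like S3 is that files are served through the application and database rather than directly over HTTP, which is simpler to wire up but adds load to the cluster.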
