AWS: host assets and client-side static code in a central repository accessible from Node.js application servers - node.js

I am using AWS to run my application based on the MEAN stack. I am using a load balancer with three instances of Node application servers and three instances of MongoDB database servers in a cluster. My application has a feature to upload file content to the server, mainly images, audio, videos, etc. I want the following:
I want to create one central content repository which should be accessible from all three of my Node application servers, so that my Node code can upload files to the central content repository.
I want one URL for accessing this central content repository, which can be used in the user interface to load and display assets.
I also want to re-purpose this same central repository to host all of my client-side JavaScript, CSS, and images, and would like my index.html to reference client-side assets via the central repository URL.
I was looking into options in AWS, however I got confused. I am not able to understand what the best and easiest approach is. The following is the server architecture, for reference.
Please advise.

I couldn't figure out how to use EC2 to solve this problem, so I changed my implementation approach: rather than hosting files on the file system, I am using MongoDB GridFS to store files. This made them available to each Node.js host.

Related

Remotely upload a file to WordPress from a Node.js instance

I’m running a Node app on Heroku. It gathers a chunk of data and sticks it into a set of JSON files. I want to POST this data to my WordPress server (right now it’s on SiteGround, but it’s moving to WP Engine).
I thought the WordPress REST API would provide what I wanted, but after reading the docs I’m not so sure.
Does anyone have any advice on this? It’s not the kind of thing I’ve done before.
Naturally, I could download the generated files and manually upload them in the right place… but I want it to be automated!
Can anyone point me in the right direction?
Looked at WordPress REST API but don’t think that’s the answer.
One option would be to set up a cron job on your WordPress site to download the already-generated JSON files. This would let you download the files via HTTP if they're publicly available (or with an API key), or even via SFTP.
You could also go the opposite way: set up a cron job on the server running your Node app and use SFTP to deliver the updated files to your WordPress site.
If you are really committed to using the WordPress REST API, the closest thing that comes to my mind (I'm not an expert in WordPress) would be to upload the JSON files as media objects: https://developer.wordpress.org/rest-api/reference/media/.

Can we set the pull frequency to get more immediate updates from the config server, using the JHipster Registry central-config approach with the file system?

Currently, in one of my projects, I am using the JHipster microservice configuration centralization approach with a central-config folder, via the JHipster Registry app with the native file system backend.
I have two questions:
Can we set the pull frequency so we get more immediate updates from the config server?
How does the config server treat the data it pulls from a source (Git or a binary repository)? Does it copy the files to a local directory, or keep the information in memory?
This is not how it works: applications do not poll the config server on a regular basis. The configuration gets loaded at startup time. If you make a change on the config server and want the applications to reload their application context (see also the @RefreshScope bean annotation), it's up to you to either call /management/refresh on each client or use Spring Cloud Bus (if you use it) to send refresh events; see https://cloud.spring.io/spring-cloud-config/reference/html/#_push_notifications_and_spring_cloud_bus
For the Git backend, the server clones remote repositories when configuration is first requested, or at startup, and then refreshes according to the spring.cloud.config.server.git.refreshRate value; see https://cloud.spring.io/spring-cloud-config/reference/html/#_git_refresh_rate
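As a sketch of the Git refresh-rate knob for a Git-backed config server (a hypothetical application.yml fragment; the repository URI and rate are placeholders):

```yaml
# application.yml on the config server (e.g. JHipster Registry) - hypothetical values
spring:
  cloud:
    config:
      server:
        git:
          uri: https://github.com/example/central-config
          refreshRate: 60   # re-fetch from Git at most every 60 seconds (default 0 = every request)
```

A single client's @RefreshScope beans can then be reloaded on demand with something like `curl -X POST http://client-host:8080/management/refresh`, the endpoint mentioned above.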

How to set up file upload from Node.js/Vue.js to another server?

I've created a Vue.js app for file uploading, which will be my admin panel (for a CMS), with Node.js as the back end. Now I want to take the files that were passed to Node.js and move them to another Vue.js app, which is going to be my primary website, so that I can access the files locally. How can I do it? Any suggestions or a different approach will do.
You have many options on how to move the files to another site.
You could store them in a shared bucket or a shared directory between your two back ends, or you could add another route to download the file.
You could configure a cron job to scp or rsync those files to your target machine.
This is really more of a question of how to sync a directory to some place else.
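The cron-plus-rsync suggestion can be sketched as a single crontab entry on the admin-panel server (the schedule, paths, user, and host are all placeholders):

```shell
# Hypothetical crontab entry: every 5 minutes, mirror the Node app's uploads
# directory to the primary site over SSH. Assumes key-based SSH auth is set up.
*/5 * * * * rsync -az --delete /var/app/uploads/ deploy@primary-site.example.com:/var/www/static/uploads/
```

`rsync -az` only transfers changed files, and `--delete` keeps the target an exact mirror, so the primary site's copy stays in sync without re-uploading everything.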

Where are files downloaded in Google App Engine?

I have a backend Nodejs application and I am fetching and streaming files in the background when a certain event happens in the client.
I have deployed the backend to Google App Engine.
The file downloading works fine, but I am a bit confused about where the files are downloaded and stored. In the app I create a folder relative to the deployed app folder and store the files there with createWriteStream. I also init a Git repository where the files are (using the simple-git npm module).
It seems the files are not accessible via Cloud Shell, since I cannot find them there.
Can I, for example, create a storage bucket and use "normal" file operation commands there (and init the repo there)?
-Jani
To store downloaded data you want to use Cloud Storage; you can find a complete guide in the Using Cloud Storage documentation.
Under almost no circumstances do you want to download files into the App Engine deployment, since the instances don't have much storage, and when the deployment scales up and down you are prone to losing data.
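A hedged sketch of streaming a downloaded file into Cloud Storage instead of the App Engine filesystem, assuming the `@google-cloud/storage` client library — the bucket name, "downloads" prefix, and source URL are placeholders (on App Engine the credentials are picked up automatically):

```javascript
// Hedged sketch: stream a fetched file into a Cloud Storage bucket rather
// than writing it next to the deployed app with createWriteStream.
function bucketPath(folder, filename) {
  // Object names in a bucket are flat; "folders" are just name prefixes.
  return `${folder.replace(/\/+$/, '')}/${filename}`;
}

async function saveToBucket(bucketName, folder, filename, readableStream) {
  // Lazy require so the pure helper above works without the library installed.
  const { Storage } = require('@google-cloud/storage');
  const file = new Storage().bucket(bucketName).file(bucketPath(folder, filename));
  await new Promise((resolve, reject) => {
    readableStream
      .pipe(file.createWriteStream())
      .on('finish', resolve)
      .on('error', reject);
  });
}

if (process.env.GCS_BUCKET) {
  const https = require('https');
  https.get('https://example.com/video.mp4', (res) => {
    saveToBucket(process.env.GCS_BUCKET, 'downloads', 'video.mp4', res);
  });
}
```

Objects stored this way survive instance scaling and are visible from Cloud Shell and the console with "normal" commands like `gsutil ls gs://my-bucket/downloads/`.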

How to serve images with a dokku application and nginx

I am new to Dokku and nginx.
I want to serve images from my server. Right now I have an application running on Dokku that serves various JSON from a database. Now I want to upload files and images and serve those images to users.
I found some documentation about Dokku persistent storage: mounting local storage and redirecting all the application's data to that directory.
My question is: how can I serve the images to users? If I use http://app-name.host.com/storage/image-url.jpg, will that do the trick?
Or should I use some configuration in the nginx file to serve the files directly from the server's local storage?
Maybe someone could point me down the right path, because I cannot find any clear information about this.
Thank you!
You can either:
ship a customized nginx.conf.sigil with your repository to make the external nginx route to the storage directory directly
modify your application to serve it. Most frameworks have this functionality.
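For the first option, a hypothetical fragment of such an nginx.conf.sigil — the app name and host path are placeholders, and the path assumes the persistent storage was mounted from Dokku's default host directory; the `location` block would sit inside the server block of the full template that nginx.conf.sigil replaces:

```nginx
# Hypothetical nginx.conf.sigil fragment: serve /storage/* straight from the
# persistent-storage mount on the host, bypassing the app container entirely.
location /storage/ {
    alias /var/lib/dokku/data/storage/app-name/;
    expires 30d;   # let browsers cache static images
}
```

Serving via nginx avoids tying up the app for static files; the trade-off is that you now maintain a copy of Dokku's generated nginx template in your repository.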
