I have an application that looks like this one:
https://github.com/heroku-examples/node-workers-example
In short, I have 2 processes:
Server: it serves users, handles requests, and adds them to Redis
Worker: it pulls requests from Redis and works on them
Should I use only one Docker image for both processes, or should I have two Docker images (one for the server and one for the worker)? What is the best practice?
I personally think it's better to have two images. In that case, can my project structure be like this one:
Project Folder
-node_modules
-utils
-server.js
-package.json
-Dockerfile
-docker-compose.yml
-/worker
-/worker/index.js
-/worker/Dockerfile
Is there any advice?
Thanks a lot.
Disclaimer: this is a very opinionated response, but the author is asking for an opinion.
You can do it with one image or two, but it all depends on how you want to schedule the processes.
If you want to stay flexible about the number of processes of each kind, I would go with two Docker images; otherwise you will have to spin up a fixed number of each every time, or tweak that setting via environment variables or other means.
Hence one image for the frontend part and one for the background worker.
Since you end up with two different images, I usually prefer to separate them into two distinct projects, but that's a matter of taste. That said, because of how Node.js manages dependencies (node_modules), it is easier to have two distinct folders when the dependencies are very different.
I would go with the following:
.
├── docker-compose.yml
├── front
│ ├── Dockerfile
│ ├── node_modules
│ ├── package.json
│ └── src
│ └── main.js
└── worker
├── Dockerfile
├── node_modules
├── package.json
└── src
└── main.js
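For example, the docker-compose.yml at the root could tie the two images together with Redis. This is only a sketch: it assumes each Dockerfile's CMD runs src/main.js and that both processes read the Redis address from a REDIS_URL environment variable (both assumptions, not taken from the linked example).

version: "3.8"
services:
  front:
    build: ./front
    ports:
      - "3000:3000"
    environment:
      - REDIS_URL=redis://redis:6379
    depends_on:
      - redis
  worker:
    build: ./worker
    environment:
      - REDIS_URL=redis://redis:6379
    depends_on:
      - redis
  redis:
    image: redis:7-alpine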
I am trying to serve a React frontend using an Actix server with the following service:
service(actix_files::Files::new("/app", "./react-front").index_file("./react-front/index.html"))
And I have the following structure in react-front, which is a React app built using npm run build:
react-front/
├── asset-manifest.json
├── favicon.ico
├── index.html
├── logo192.png
├── logo512.png
├── manifest.json
├── precache-manifest.70429bbe96a34ea56e1cb1a6570602b0.js
├── robots.txt
├── service-worker.js
└── static
├── css
│ ├── main.09af38e2.chunk.css
│ └── main.09af38e2.chunk.css.map
└── js
├── 1.e6893027.chunk.js
├── 1.e6893027.chunk.js.map
├── main.dc35c614.chunk.js
├── main.dc35c614.chunk.js.map
├── runtime~main.229c360f.js
└── runtime~main.229c360f.js.map
Visiting /app, I am greeted with my index.html ("Your React App loaded successfully" is the actual text in it), but the rest of the files, as the network tab shows, get a 404. I have placed this service before any of the routers in my App configuration; were I to put this service after my routers, I would just get that text and a 204 No Content, with no more 404 errors for the React js scripts whatsoever.
This leads me to two questions:
1. What effect does the order in which the file-serving service is registered on the App builder have?
2. How can one serve a complete folder such that the browser can obtain all the relevant CSS and JS files necessary to display the React app?
Currently, here's how I am dealing with it: I use the crates actix-web-static-files and static-files to embed the contents of the react-front folder into the final executable and then run it.
This works now, but I want to achieve it without using these crates, just actix-files, if possible.
Here's the working code for this:
HttpServer::new(move || {
    let generated = generate();
    App::new()
        .wrap(Cors::permissive())
        .route("/book", web::get().to(get_phonebook_handler))
        .route("/book/{id}", web::get().to(get_by_id))
        .route("/{name}", web::get().to(get_by_name))
        .route("/book/{id}", web::delete().to(delete_id))
        .route("/book", web::post().to(post_phonebook_handler))
        .route("/book/{id}", web::put().to(put_update))
        // the embedded static files are registered last, after all the routes
        .service(ResourceFiles::new("/", generated))
})
.listen(tcp)?
.run()
.await
I have a folder structure like so:
.
└── client
├── components
└── routes
├── index.js
├── Login
│ ├── index.js
│ ├── assets
│ ├── components
│ ├── container
│ └── modules
└── UpdatePassword
├── index.js
├── assets
├── components
├── container
└── modules
I would like to see if anyone is importing files from the UpdatePassword folder to the Login folder and vice versa.
Basically, I'm following a fractal project structure where I want components that are related to the UpdatePassword or Login route to exist only in their respective folders. Shared components would live in the client/components subdirectory. To maintain a structure like this, I would like to write a test that fails when an 'unacceptable' import or require is used, i.e. if a file in UpdatePassword imports from Login/components.
Is there a way to test or check whether an import is coming from specific folders?
Try madge: I usually run it as madge --image /path-to-folder/dependencies.png routes (there is also an exclude option if you need it).
You'll get a visual graph that shows you the dependencies between files.
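If you want this as a failing test rather than a picture, madge also has a programmatic API; a rough sketch (the folder names follow the question, and the exact rule is up to you):

// check-imports.js -- fail the build when Login and UpdatePassword import from each other
const madge = require('madge');

madge('client/routes').then((res) => {
  const graph = res.obj(); // { 'Login/index.js': ['Login/components/...'], ... }
  const offenders = Object.entries(graph).filter(([file, deps]) =>
    (file.startsWith('Login/') && deps.some((d) => d.startsWith('UpdatePassword/'))) ||
    (file.startsWith('UpdatePassword/') && deps.some((d) => d.startsWith('Login/')))
  );
  if (offenders.length > 0) {
    console.error('Cross-route imports found:', offenders);
    process.exit(1);
  }
});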
I don't know of a native way to do it, but you can wrap the "require" function and decide inside the wrapper whether the load is allowed. For example (the Login/UpdatePassword rule below is just one possible check):

const path = require('path');

function myRequire(fromPath, requiredPath) {
    // Judge whether or not requiredPath can be loaded from fromPath.
    // Example rule (an assumption from the question): Login and UpdatePassword
    // must not import from each other.
    const isRelative = requiredPath.startsWith('.');
    const target = isRelative ? path.resolve(path.dirname(fromPath), requiredPath) : requiredPath;
    const can = !(
        (fromPath.includes('/Login/') && target.includes('/UpdatePassword/')) ||
        (fromPath.includes('/UpdatePassword/') && target.includes('/Login/'))
    );
    if (can) {
        return require(target);
    } else {
        throw new Error(`you cannot load ${requiredPath} from ${fromPath}`);
    }
}
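A hypothetical call site, to show how it would be used:

// inside client/routes/UpdatePassword/components/Form.js (hypothetical file)
const LoginForm = myRequire(__filename, '../../Login/components/LoginForm'); // throws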
I want to deploy my own version of git client-side hooks. For this, I need to know whether the client-side hooks are generated on the client's system or are also cloned from the server.
If they are cloned from the server, where are the client-side hooks located?
The hooks are located under the .git folder in your cloned repository:
.git
├── branches
├── COMMIT_EDITMSG
├── config
├── description
├── HEAD
├── hooks
├── index
├── info
├── logs
├── objects
└── refs
There is a hooks folder, where you can find the pre and post hook samples:
├── applypatch-msg.sample
├── commit-msg.sample
├── post-update.sample
├── pre-applypatch.sample
├── pre-commit.sample
├── prepare-commit-msg.sample
├── pre-push.sample
├── pre-rebase.sample
└── update.sample
Rename the .sample files (e.g. post-update.sample to post-update) to enable the pre and post hooks.
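As an illustration, a hook can be any executable file, so a client-side pre-commit hook could be a small Node script like the sketch below (the console.log check is just an example rule, not something git requires):

#!/usr/bin/env node
// .git/hooks/pre-commit  (make it executable: chmod +x .git/hooks/pre-commit)
// Reject the commit when a staged .js file still contains "console.log".
const { execSync } = require('child_process');

const staged = execSync('git diff --cached --name-only --diff-filter=ACM', { encoding: 'utf8' })
  .split('\n')
  .filter((f) => f.endsWith('.js'));

const offenders = staged.filter((file) => {
  const contents = execSync(`git show ":${file}"`, { encoding: 'utf8' });
  return contents.includes('console.log');
});

if (offenders.length > 0) {
  console.error(`pre-commit: remove console.log from: ${offenders.join(', ')}`);
  process.exit(1); // a non-zero exit aborts the commit
}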
The client-side hooks are not stored on the server. The server has nothing to do with client-side hooks in git, i.e. client-side hooks and server-side hooks are not inter-related. They are generated inside your local repository by your own local system (git populates .git/hooks with the .sample files when the repository is initialized or cloned).
In short, client-side git hooks are not related to the server and aren't stored there.
I'm a newbie developing MEAN apps; I'm coming from a Laravel structure for building web applications. What I don't like about Laravel is that it is not very modular: everything is separated, especially Models, Views, and Controllers, and to navigate through the folders you have to scroll a lot when building large apps...
My question is all about Express under the MEAN stack. What I'm looking for is a structure where I create modules for every entity of an app. For example:
I have three modules: users, questions, and answers; each module will contain the routes, the model, and the controller for that specific module, for example:
├── node_modules
├── src
│   ├── client
│   │   └── ...  //Frontend things managed by angular (like views, etc...)
│   └── server
│       ├── modules  //By module i mean an entity
│       │   ├── users
│       │   │   ├── users.model.js
│       │   │   ├── users.controller.js
│       │   │   ├── users.routes.js
│       │   │   └── index.js
│       │   ├── questions
│       │   └── answers
│       ├── config
│       └── etc...  //Suggestions please...
├── package.json
└── server.js
How could I set up that structure? So far I've found this tutorial about modularizing an Express app, but I would like to extend it so that I keep it DRY and make use of the LIFT principle described by John Papa in his Angular Style Guide, but on the backend.
Why am I asking this? Simple: I don't like to scroll through large folders to open a file and then scroll again to open another related file. I want to work with a structure that is easy to maintain, easy to understand, etc.
If possible, could someone explain an example of how to set up an application using this structure and upload it to GitHub?
Place one node_modules at the top level.
Give each module a package.json
Now write a script in your bootstrap that finds all the unique dependencies required across those package.json files.
There, node_modules is solved. Make sure the script also catches version mismatches between modules.
Build a universal run script that reads a universal configuration file pointing to all of the other modules.
Angular (2, 4, 5, whatever) has a great modularized file structure to model after.
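To make that concrete, here is a minimal sketch of how the pieces could be wired together; the /api prefix, config/modules.json, and the controller function names are assumptions, not part of the question:

// src/server/modules/users/index.js -- one self-contained entity module
const express = require('express');
const controller = require('./users.controller'); // hypothetical list/create handlers

const router = express.Router();
router.get('/', controller.list);
router.post('/', controller.create);

module.exports = router;

// server.js -- a universal bootstrap that reads a config file listing the modules
const express = require('express');
const modules = require('./config/modules.json'); // e.g. ["users", "questions", "answers"]

const app = express();
modules.forEach((name) => {
    // every module's index.js exports an express.Router with that module's routes
    app.use(`/api/${name}`, require(`./src/server/modules/${name}`));
});

app.listen(3000);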
I found that Express has an application generator; however, the documentation does not explain the purpose of each directory and file. If someone could just give me a short explanation of which files I should put where, that would be much appreciated. Here's the generated app structure:
├── app.js
├── bin
│ └── www
├── package.json
├── public
│ ├── images
│ ├── javascripts
│ └── stylesheets
│ └── style.css
├── routes
│ ├── index.js
│ └── users.js
└── views
├── error.jade
├── index.jade
└── layout.jade
7 directories, 9 files
The app.js file is the entry-point of your application.
The package.json file contains all of your dependencies and various details regarding your project.
The bin folder should contain the various configuration startup scripts for your application.
For example, instead of applying all the Express middleware in the app.js file, you can module.exports them from their own configuration file and require them in app.js (see the sketch below).
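A small sketch of that idea (the bin/middleware.js file name is an assumption, not something the generator creates):

// bin/middleware.js -- keep the middleware wiring out of app.js
const logger = require('morgan');
const express = require('express');

module.exports = function applyMiddleware(app) {
    app.use(logger('dev'));
    app.use(express.json());
    app.use(express.static('public'));
};

// app.js
// require('./bin/middleware')(app);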
The views folder contains all of your server-side views.
The public folder contains all of your front-end code.
The routes folder contains all the routes that you have created for your application.
As stated in the official documentation, be aware that this is just one way to organize your code.
You should test it out and see if it fits your project.
This thread gives a deeper answer about the www file specifically: What does "./bin/www" do in Express 4.x?
Basically, running your app from the www file (which calls app.js) allows you to start your app with different configurations. You might have a "www" file representing the way the app should be run when on the web, a "dev" file that you, as the developer, would run, a "test" file you would run when running tests, etc. Read the thread linked above for more detail!
This structure is a standard organization for a web app:
public contains all the client's static files (CSS, client-side JavaScript such as jQuery, images, fonts...).
routes contains the main back-end code (server side), which computes data before calling a template engine (see below) or responds to the client directly (via JSON or XML).
views contains each page template (see Jade templates). These files are used by the scripts in routes, as in the example at the end of this answer.
app.js contains the Express core setup, such as the URI parser, modules, database connections...
package.json is the project descriptor file (used by npm for dependencies and sharing).
If the application generator provides a full example, don't hesitate to open each file (starting from app.js) to understand the project's organization.
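For instance, a route module along the lines of what the generator puts in routes/index.js (shown here as a sketch, using the generated views/index.jade):

// routes/index.js -- compute (or just pass) data, then render a template from views/
const express = require('express');
const router = express.Router();

router.get('/', function (req, res) {
    // res.render looks up views/index.jade and fills in the passed data
    res.render('index', { title: 'Express' });
});

module.exports = router;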