Where are client-side Git hooks located on the server? (Linux)

I want to deploy my own version of Git client-side hooks. For this, I need to know whether the client-side hooks are generated on the client's system or are also cloned from the server.
If they are cloned from the server, where on the server are the client-side hooks located?

The hooks live under the .git folder in your cloned repository:
.git
├── branches
├── COMMIT_EDITMSG
├── config
├── description
├── HEAD
├── hooks
├── index
├── info
├── logs
├── objects
└── refs
There is a hooks folder, where you can find the sample pre and post hooks:
├── applypatch-msg.sample
├── commit-msg.sample
├── post-update.sample
├── pre-applypatch.sample
├── pre-commit.sample
├── prepare-commit-msg.sample
├── pre-push.sample
├── pre-rebase.sample
└── update.sample
To activate a hook, remove the .sample suffix (e.g. rename post-update.sample to post-update); the file must also be executable.
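A hook can be any executable, not just a shell script. As an illustration, here is a minimal pre-commit sketch in Node (the .tmp rule is made up for the example):
#!/usr/bin/env node
// .git/hooks/pre-commit — Git runs this before each commit;
// a non-zero exit status aborts the commit (the file needs chmod +x).
const { execSync } = require('child_process');
const staged = execSync('git diff --cached --name-only', { encoding: 'utf8' })
  .split('\n')
  .filter(Boolean);
if (staged.some((f) => f.endsWith('.tmp'))) {
  console.error('pre-commit: refusing to commit .tmp files');
  process.exit(1);
}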

Client-side hooks are not stored on the server; the server has nothing to do with client-side hooks in Git, i.e. client-side and server-side hooks are not inter-related. They are generated inside your local repository by your own local system (git init and git clone populate .git/hooks with the samples).
In short, client-side Git hooks are not related to the server and aren't stored there.

Related

Unable to serve react app using actix files crate

I am trying to serve a React frontend using an Actix server with the following service:
service(actix_files::Files::new("/app", "./react-front").index_file("./react-front/index.html"))
And I have the following structure in react-front, which is a React app built using npm run build:
react-front/
├── asset-manifest.json
├── favicon.ico
├── index.html
├── logo192.png
├── logo512.png
├── manifest.json
├── precache-manifest.70429bbe96a34ea56e1cb1a6570602b0.js
├── robots.txt
├── service-worker.js
└── static
    ├── css
    │   ├── main.09af38e2.chunk.css
    │   └── main.09af38e2.chunk.css.map
    └── js
        ├── 1.e6893027.chunk.js
        ├── 1.e6893027.chunk.js.map
        ├── main.dc35c614.chunk.js
        ├── main.dc35c614.chunk.js.map
        ├── runtime~main.229c360f.js
        └── runtime~main.229c360f.js.map
Visiting /app I am greeted with a page where "Your React App loaded successfully" (the actual text in my index.html) renders, but the rest of the files, as the network tab shows, get a 404. I have placed this service before any of the other routes in my App configuration; were I to put this service after my routes, I would just get this text and a 204 No Content, with no more 404 errors for the React js scripts whatsoever.
This leads me to two questions:
1. How does the order in which a file-serving service is registered on the App builder affect routing?
2. How can one serve a complete folder such that the browser can obtain all the relevant css and js files necessary to display the react app?
Currently, here's how I am dealing with it: I use the crates actix-web-static-files and static-files to embed the contents of the react-front folder into the final executable and then run it. This works, but I want to achieve it with just actix-files, without those crates, if possible.
Here's the working code for this:
HttpServer::new(move || {
    let generated = generate();
    App::new()
        .wrap(Cors::permissive())
        .route("/book", web::get().to(get_phonebook_handler))
        .route("/book/{id}", web::get().to(get_by_id))
        .route("/{name}", web::get().to(get_by_name))
        .route("/book/{id}", web::delete().to(delete_id))
        .route("/book", web::post().to(post_phonebook_handler))
        .route("/book/{id}", web::put().to(put_update))
        // registered last, so the explicit routes above keep priority
        .service(ResourceFiles::new("/", generated))
})
.listen(tcp)?
.run()
.await

Running NodeJS worker in Docker image

I have an application that looks like this one:
https://github.com/heroku-examples/node-workers-example
In short, I have 2 processes:
Server: it serves the UI, handles requests, and adds them to Redis
Worker: it pulls requests from Redis and works on them
Should I use only one Docker image for both processes, or should I have two Docker images (one for the server and a second for the worker)? What is the best practice?
I personally think it's better to have two images. In that case, can my project structure be like this one:
Project Folder
-node_modules
-utils
-server.js
-package.json
-Dockerfile
-docker-compose.yml
-/worker
-/worker/index.js
-/worker/Dockerfile
Is there any advice?
Thanks a lot.
Disclaimer: this is a very opinionated response, but the author is asking for an opinion.
You can do one or two, but it all depends on how you want to schedule them.
If you want to stay flexible in the number of processes of each kind, I would go with two Docker images; otherwise you'll have to spin up a fixed number of each every time, or tweak that setting via environment variables or other means... Hence one image for the frontend part and one for the background process.
As you'd have two different images, I usually prefer to separate them into two distinct projects, but that's a matter of taste. Also, because of how NodeJS manages dependencies (node_modules), it's easier to have two distinct folders when the dependencies are very different.
I would go with the following:
.
├── docker-compose.yml
├── front
│   ├── Dockerfile
│   ├── node_modules
│   ├── package.json
│   └── src
│       └── main.js
└── worker
    ├── Dockerfile
    ├── node_modules
    ├── package.json
    └── src
        └── main.js
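To tie the two images together with the Redis they share, here is a minimal docker-compose.yml sketch (service names, the app port, and the Redis tag are assumptions):
version: "3.8"
services:
  front:
    build: ./front        # uses front/Dockerfile
    ports:
      - "3000:3000"       # assumed app port
    depends_on:
      - redis
  worker:
    build: ./worker       # uses worker/Dockerfile
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
A separate worker image also lets you scale it independently of the server, e.g. docker compose up --scale worker=3.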

test or prevent some relative path imports / requires

I have a folder structure like so:
.
└── client
    ├── components
    └── routes
        ├── index.js
        ├── Login
        │   ├── index.js
        │   ├── assets
        │   ├── components
        │   ├── container
        │   └── modules
        └── UpdatePassword
            ├── index.js
            ├── assets
            ├── components
            ├── container
            └── modules
I would like to see if anyone is importing files from the UpdatePassword folder to the Login folder and vice versa.
Basically I'm following a fractal project structure, where I want components that are related to the UpdatePassword or Login route to exist only in their respective folders. Shared components would live in the client/components subdirectory. To maintain a structure like this, I would like to write a test that fails when an 'unacceptable' import or require is used, i.e. if a file in UpdatePassword imports from Login/components.
Is there a way to test or check whether an import is coming from specific folders?
Try madge. I usually run it as madge --image /path-to-folder/dependencies.png routes (there is also an exclude option if you need it).
You'll get a visual graph which shows you the dependencies between files.
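To turn that into a failing test, madge also has a programmatic API. A rough sketch (assumes madge is installed; the exact key format of the dependency object may vary):
// check-imports.js — exits non-zero when Login and UpdatePassword import each other
const madge = require('madge');
madge('client/routes').then((res) => {
  const deps = res.obj(); // e.g. { 'Login/index.js': ['Login/components/...'], ... }
  const offenders = [];
  for (const [file, imports] of Object.entries(deps)) {
    for (const imp of imports) {
      if ((file.startsWith('Login/') && imp.startsWith('UpdatePassword/')) ||
          (file.startsWith('UpdatePassword/') && imp.startsWith('Login/'))) {
        offenders.push(`${file} -> ${imp}`);
      }
    }
  }
  if (offenders.length > 0) {
    console.error('Forbidden cross-route imports:\n' + offenders.join('\n'));
    process.exit(1);
  }
});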
I don't know of a native way to do it, but you can wrap the require function:
function myRequire(fromPath, requiredPath) {
  // Decide whether fromPath may load requiredPath.
  // Illustrative rule: Login and UpdatePassword must not import from each other.
  var path = require('path');
  var resolved = path.resolve(path.dirname(fromPath), requiredPath);
  var can = !(
    (fromPath.indexOf('/Login/') !== -1 && resolved.indexOf('/UpdatePassword/') !== -1) ||
    (fromPath.indexOf('/UpdatePassword/') !== -1 && resolved.indexOf('/Login/') !== -1)
  );
  if (can) {
    return require(requiredPath);
  }
  else {
    throw new Error(`you can not load ${requiredPath} from ${fromPath}`);
  }
}
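Usage from a file inside Login (paths are illustrative):
var Button = myRequire(__filename, '../UpdatePassword/components/Button'); // throws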

Runtime error when sharing code between client/server in Node.js app

I have a Node.js project with this basic folder structure:
project
├── client
│   └── app
│       ├── assets
│       │   └── css
│       └── components
│           ├── about
│           ├── categories
│           ├── home
│           ├── navbar
│           └── posts
├── common
│   ├── model
│   └── util
└── server
    ├── api
    ├── model (deprecated for ../common/model)
    ├── conf
    └── db
server is an Express API, client is an Angular2 App using this API. I use Gulp to build a dist folder which is being deployed to AWS Elastic Beanstalk.
Important: In dist, Client lives in app folder, while Server is in the root.
Everything was working fine until I decided to share some code between Server and Client (that's the point of Node.js, I thought...). I've created a common folder in the project's root with, among other things, a post model (/project/common/model/post.ts).
In Server's /project/server/server.ts I replaced:
import {Post} from './model/post'
for:
import {Post} from '../common/model/post'
And it works.
But in Client's /project/client/app/components/posts/post-list.component.ts I replaced:
import {Post} from './post'; // Model was in the same folder here...
for:
import {Post} from '../../../../common/model/post';
and it compiles fine but then, when I try to reach Client in my browser, I get:
zone.js:101 GET http://127.0.0.1:3000/common/model/post 404 (Not Found)
(index):24 Error: Error: XHR error (404 Not Found) loading http://127.0.0.1:3000/common/model/post
I've checked and my Gulp's build task is correctly moving the compiled (transpiled) common files to dist/common.
Any hint on how to proceed to solve this? Thank you so much!
Presumably common has to be accessible from the browser, but is your webserver serving up files in the common folder?
In your express configuration you'll likely have something like this to serve up static content:
app.use(express.static(__dirname + '/public'));
But did you do the same for the new common folder you have?
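For example, something like this (a sketch; the exact path depends on where the compiled common files end up in dist relative to the server entry point):
app.use('/common', express.static(__dirname + '/common'));
That mounts the compiled common files at /common/..., which is where the browser is requesting them.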

How to properly decouple MongoDB

I'm rather new to web development altogether. I'm trying to get a node.js/express/mongo project off the ground. I'm coming from a C/C++ background, and the module organization paradigms seem a little strange to me.
Currently, in my server.js, I create, connect, and initialize my mongoDB object using mongoose, then pass this object around to query, insert, etc. on it. That doesn't seem like something I should be doing inside server.js.
Is the proper way to loosely couple Mongo from my project to create a separate module (e.g. database) where I do all the initialization, options, etc., and return a mongodb instance through this new module?
I've never undertaken a project of this size (or type) before, and just don't know how best to organize everything.
Any advice from those people much more experienced than me would be appreciated
There are many ways to do it. Here are some ideas.
Yes, you'll want to have a routing/handlers setup of some kind so that different modules have the ability to call different services and/or decouple.
Below is a fairly standard node.js / express structure:
├── server.js
├── config
│   ├── development
│   ├── production
│   └── staging
├── handlers
│   ├── customers.js
│   └── stores.js
├── node_modules
│   ├── assert
│   ├── ejs
│   ├── express
│   ├── forever
│   ├── mongodb
│   └── mongoskin
├── package.json
└── README.md
then in server.js, you can import your handlers like so:
// import route handlers
var customers = require('./handlers/customers'),
    stores = require('./handlers/stores');
and then inside your handlers, you'll be able to declare functions:
exports.addCustomer = function(req, res) {
  // ....
};
which in server.js you can use for routing:
app.post('/customers/add/:id', customers.addCustomer);
So then you've got a basic framework. Defining the database connection outside of the exports.XXX functions in the handler files is fine: those functions will have access to it, but nothing in server.js will, so you won't pollute your namespace.
// build the connection string: user:pass@host/database
var url = config.user + ":"
        + config.pass + "@"
        + config.host + "/"
        + config.database;
var mongo = require('mongoskin').db(url);
where you might load the config object from a JSON file.
hope that helps.
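For the mongoose setup the question describes, the same separation applies. Here is a minimal sketch of a standalone database module (the file name and connection URL are assumptions); since Node caches required modules, every file that requires it shares the one connection:
// db.js — owns the mongoose connection so server.js doesn't have to
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/myapp'); // URL is an assumption
module.exports = mongoose;
Then server.js just does require('./db') once at startup, and handlers can require it wherever they need the connection.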
