Is there a way to share framework code across multiple projects? - node.js

I have a set of build tools, UI components, boilerplate code, etc. that I want to share across projects. Within this framework, there's a src folder that's specific to each project. E.g.
Project 1:
  scripts/
  src/
  package.json
Project 2:
  scripts/      <- shared with project 1
  src/          <- specific to project 2
  package.json  <- shared with project 1
These 2 projects would be in separate repos. I thought Git submodules would be the solution, but it seems like this would make the project a dependency of the framework. I want them to either be independent or have the framework be a dependency of the project.
I put the project inside the framework instead of the other way around because there are certain files that are usually in the root directory, e.g. Node's package.json.
Is there a way to make this easy to work with?

The NPM way
The way I prefer to do this is to structure the framework as a module. Then in both project1 and project2 you can do:
npm install git://github.com/leojiang/framework.git
There are lots of advantages to doing it this way:
People familiar with node.js already know how this works so you don't need to teach new developers how to install the framework.
It makes maintaining the framework separate from maintaining projects. This not only leads to cleaner code but also allows all projects to pull in bug fixes by simply running npm install or npm update.
It moves all your framework files into node_modules, thus allowing you to use require() without specifying paths.
It moves all your framework files into node_modules, thus effectively hiding them from projects and allowing each project to focus only on project-specific code.
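As a rough sketch (the repository URL is the one above; the package name "framework" is whatever the framework's own package.json declares, so treat it as a placeholder), the project can also pin the framework as a Git dependency in its package.json and then require it like any other module:
// package.json (project1)
{
  "name": "project1",
  "dependencies": {
    "framework": "git://github.com/leojiang/framework.git"
  }
}
// anywhere in project code, after npm install
const framework = require('framework');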
The Git way
Sometimes what you have isn't really a collection of modules like Express or Nest.js. Sometimes it is a specific way to structure projects: for example, which modules are used, where the config file lives, where the templates go, etc. This is often called boilerplate (or project boilerplate to differentiate it from boilerplate code in a single file).
If you want to save a project structure, the best way is to create an example project (a kind of simple Hello World) and push that project to a git repo. Then both project1 and project2 can fork from the boilerplate repo. If you are using something like GitHub or GitLab you can use their built-in fork functionality. If you host your own remote repo with something like gitosis, you can clone the repo or pull it into a new, empty git project:
mkdir project1
cd project1
git init
git pull path/to/boilerplate/repo
Boilerplate repositories are very popular in the hackathon community as it allows teams to quickly start a project for the 24 or 48 hours of coding they have.
The Ruby-on-Rails way
I call this the RoR way but maybe other frameworks did this first. RoR certainly popularized the method. In JavaScript-land you can see it in generators like create-next-app (Next.js) and create-react-app (React).
The idea is simple. Write a script that creates the correct project structure. Then just execute that script. Let the script do all the work of creating folders and installing modules.
This is of course the most involved approach and takes the most work. However, it is also the most flexible.
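As a minimal sketch of the idea (the folder layout and the framework URL are placeholders carried over from the earlier example, not a prescribed structure), such a script might look like this:
// scaffold.js - creates a bare project structure and installs the shared framework
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

const projectName = process.argv[2] || 'my-project';
const root = path.resolve(projectName);

// create the folders the framework expects
for (const dir of ['src', 'scripts']) {
  fs.mkdirSync(path.join(root, dir), { recursive: true });
}

// write a minimal package.json
const pkg = { name: projectName, version: '0.1.0', private: true };
fs.writeFileSync(path.join(root, 'package.json'), JSON.stringify(pkg, null, 2));

// install the shared framework as a dependency (placeholder URL)
execSync('npm install git://github.com/leojiang/framework.git', { cwd: root, stdio: 'inherit' });

console.log('Created ' + projectName);
Running node scaffold.js project3 would then give every new project the same starting layout.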

Related

How can I deploy my application within a cloned repository on Google App Engine?

I'm using a node package to run a web server (among other benefits) for my project. The catch is, my project is only loaded on the server if it's within a directory of the node package. In other words, my directory structure looks like this:
<npm_pkg>/
  <npm_pkg_src>/
  clients/
    <my_project_name>/
      <my_project_src>
I would like to be able to use standard deployment processes for my project (e.g. gcloud app deploy, Travis continuous deployment, etc.), but I need to run my project from within a subdirectory of the larger package. Is there an easy way to force a git clone <pkg> during a build step and deploy my project in the target subdirectory?
I'm pretty new to CI/CD, but I tried to search around for similar examples and couldn't find any. Note: the parent project is not owned by me and thus I can't just use submodules without forking it (and I have no intention to alter it in any way). I also strictly want to be able to trigger deploys based on my actual project's repository, if possible, whereas submodules would involve maintaining two repositories and committing features twice (from what I understand).
Any help is much appreciated.
Edit: I forgot to mention that as part of this configuration I also need to run my server script from the root of the parent package. In other words, my package.json's start script will look like "start": "cd ../.. && npm start". Just in case it's relevant.
This might be what you’re looking for: CI/CD with App Engine
Clone the repo and deploy from the subdirectory your project is located in; Cloud Source Repositories can automate the whole process for you.
I would also suggest you keep services separate; this will make things clearer for you and for others who will or might be working on the project with you.
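For orientation only, here is a rough sketch of what a build step could script. The repository URL, folder names, and the assumption that an app.yaml sits at the parent package's root are all placeholders, not details taken from the question:
// deploy.js - clone the parent package, drop this project into its clients/ folder,
// then deploy from the parent root (all paths and URLs are hypothetical)
const { execSync } = require('child_process');
const run = (cmd, opts = {}) => execSync(cmd, { stdio: 'inherit', ...opts });

run('git clone --depth 1 https://github.com/owner/npm-pkg.git ../npm-pkg');
run('mkdir -p ../npm-pkg/clients');
run('cp -R . ../npm-pkg/clients/my-project');
run('gcloud app deploy --quiet', { cwd: '../npm-pkg' });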

What is the simplest way to share react components between projects?

How can I share components across multiple react projects without having to publish them on a public package manager like NPM?
Option 1: You can use npm private packages so they're not external facing. There are also private registries (such as Artifactory) and scoped packages, which usually represent company-wide projects and can be public or private. See https://docs.npmjs.com/private-modules/intro and https://docs.npmjs.com/misc/scope.
Option 2: Essentially, you can develop projects with a flattened structure. You can then import various projects and/or components into other projects or folders. This is entirely dependent on your codebase and configuration. With this model, though, publishing to npm often comes fairly naturally, since each folder may be its own project with its own package.json.
Updated:
Option 3: Bit focuses on the composability of components, from small things like a button up to an actual view or the app itself; each target is its own package. Overall, it's an opinionated yet customizable framework that can enable quicker development, managed dependencies, and organized code.
Option 4: RushJS is a monorepo manager built by Microsoft that allows for flexibility of different kinds of apps and services utilizing pnpm underneath (as opposed to yarn and npm), which alleviates problems that stem from dependency issues.
Check out Bit:
Bit is an open-source cli tool for collaborating on isolated components across projects and repositories. Use Bit to distribute discrete components from a design library or a project into a standalone reusable package and utilize it across applications.
You could also upload them to a private git repo such as GitHub and then pull them in from there.
Ryanve has a nice example over here: https://stackoverflow.com/a/28729646/1592783
You could create a repo of shared components and then have your Node.js start script call a shell script to do a git pull from that repo and then move the shared components from that directory into your project's directory. That way, every time you run 'npm start' you will have the latest version of the shared components loaded into your project.
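A rough sketch of that idea (the repository URL, folder names, and script contents are all placeholders) using npm's prestart hook in package.json:
// package.json
{
  "scripts": {
    "prestart": "sh ./scripts/update-shared.sh",
    "start": "node server.js"
  }
}
# scripts/update-shared.sh (illustrative)
git clone --depth 1 https://github.com/your-org/shared-components.git .shared 2>/dev/null || git -C .shared pull
mkdir -p src/shared
cp -R .shared/components/. src/shared/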

Can I turn code into an NPM module without extracting it from a project into its own repo?

Project A contains a few functions and data models I use in diff't repos, all tied to the same product. I'd like to turn them into an npm module, but without extracting the code from project A.
When I see other modules on npm, they generally tie to a github repo that contains all the source code, as well as a full stack to run/modify the module.
Does this mean I have to extract the code from project A into its own repository, build/configure a stack to allow it to run in isolation from project A, and then import it back into project A & other projects?
Or is it possible to just export the functions w/o a full stack, and without moving the code from my main project?
an attempt to pre-empt 'duplicate' comments:
this Q talks about working with an existing module, which doesn't answer my concern, as it has to do with worrying about pull requests being merged on time
npm link, discussed here, looks like it'd do the trick if I'd already extracted the code from the project, but I'd like to avoid that.
If you really want to share a snippet through npm but still use the code at the same place in your project, you could extract the code into its own repo, but still use it inside your project as a git submodule.
Create a submodule repository from a folder and keep its git commit history
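For illustration (the URL and target path are placeholders), pulling the extracted repo back into project A as a submodule would look like:
git submodule add https://github.com/you/shared-snippet.git lib/shared-snippet
git commit -m "Add shared-snippet as a submodule"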
Do you know if it's standard for npm modules in their own repos to include the full stack for running them?
Ideally that's there to test them and ease development, but it's totally optional. You could include only a JavaScript file and a package.json and it would work.
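For example, a minimal module really only needs two files (the names below are placeholders):
// index.js
module.exports = {
  greet: (name) => 'Hello, ' + name + '!'
};
// package.json
{
  "name": "my-shared-utils",
  "version": "1.0.0",
  "main": "index.js"
}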

Including local dependencies in deployment to lambda

I have a repo which consists of several "micro-services" which I upload to AWS's Lambda. In addition I have a few shared libraries that I'd like to package up when sending to AWS.
Therefore my directory structure looks like:
/micro-service-1
  /dist
  package.json
  index.js
/micro-service-2
  /dist
  package.json
  index.js
/shared-component-1
  /dist
  package.json
  component-name-1.js
/shared-component-2
  /dist
  package.json
  component-name-2.js
The basic deployment leverages the handy node-lambda npm module but when I reference a local shared component with a statement like:
var sharedService = require('../../shared-component-1/dist/index');
This works just fine with the node-lambda run command, but node-lambda deploy drops this local dependency. That probably makes sense because I'm reaching outside the "root" directory of the service, so I thought maybe I'd leverage gulp to make this work. I'm pretty darn new to gulp, though, so I may be doing something dumb. My strategy was to:
Have gulp deploy depend on a local-deps task
the local-deps task would:
npm build --production to a directory
then pipe this directory over to the micro-service under the /local directory
clean up the install in the shared component
I would then refer to all shared components like so:
var sharedService = require('local/component-name-1');
Hopefully this makes clear what I'm trying to achieve. Does this strategy make sense? Is there a simpler way I should be considering? Does anyone have any examples of anything like this in "gulp speak"?
I have an answer to this! :D
TL;DR - Use npm link to create a symbolic link between your common component and the dependent component.
So, I have a project with only two modules:
- main-module
- referenced-module
Each of these is a node module. If I cd into referenced-module and run npm link, then cd into main-module and npm link referenced-module, npm will 'install' my referenced-module into my main-module and store it in my node_modules folder. NOTE: When running the second npm link, the name of the project is the one you find in your package.json, not the name of the directory (see npm link documentation, previously linked).
Now, in my main-module all I need to do is var test = require('referenced-module') and I can use that to my hearts content. Be sure to module.exports your code from your referenced-module!
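In command form (the directory names follow the example above; the package name comes from referenced-module's package.json):
cd referenced-module
npm link                      # registers a global symlink for this package
cd ../main-module
npm link referenced-module    # symlinks it into main-module/node_modules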
Now, when you zip up main-module to deploy it to AWS Lambda, the links are resolved and the real modules are put in their place! I've tested this and it works, though not with node-lambda yet; I don't see why it should be a problem (unless it does something different with the package restores).
What's nice about this approach as well is that any changes I make to my referenced-module are automatically picked up by my main-module during development, so I don't have to run any gulp tasks or anything to sync them.
I find this is quite a nice, clean solution and I was able to get it working within a few minutes. If anything I've described above doesn't make any sense (as I've only just discovered this solution myself!), please leave a comment and I'll try and clarify for you.
UPDATE FEB 2016
Depending on your requirements and how large your application is, there may be an interesting alternative that solves this problem even more elegantly than using symlinking. Take a look at Serverless. It's quite a neat way of structuring serverless applications and includes useful features like being able to assign API Gateway endpoints that trigger the Lambda function you are writing. It even allows you to script CloudFormation configurations, so if you have other resources to deploy then you could do so here. Need a 'beta' or 'prod' stage? This can do it for you too. I've been using it for just over a week and while there is a bit of setup to do and things aren't always as clear as you'd like, it is quite flexible and the support community is good!
While using Serverless we faced a similar issue with the need to share code between AWS Lambdas. Initially we duplicated the code across each microservice, but, as always, that became difficult to manage.
Since development was done in a Windows environment, using symbolic links was not an option for us.
Then we came up with a solution: use a shared folder to keep the local dependencies, and a custom-written gulp task to copy these dependencies into each of the microservice endpoints so that each dependency can be required like an npm package.
One of the decisions we made was not to keep two places for defining a microservice's dependencies, so we used the same package.json to define the local shared dependencies as well; the gulp task parses this file, copies the shared dependencies accordingly, and installs the npm dependencies, all with a single command.
Later we made the code open source as npm modules serverless-dependency-install and gulp-dependency-install.
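For orientation, here is a heavily simplified sketch of what such a copy task can look like in gulp. This is not the published modules' code; the folder names are taken from the directory layout in the question above:
// gulpfile.js (simplified illustration)
const gulp = require('gulp');

// copy one shared component's build output into this micro-service's local/ folder
gulp.task('local-deps', () =>
  gulp.src('../shared-component-1/dist/**/*')
      .pipe(gulp.dest('./local/component-name-1'))
);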

Node.js - How do I use modules from another project without copying code?

To be completely specific:
I am writing a Node.js app that is intended to be a websocket bot for Slack.
A Node project exists that abstracts the majority of the Slack API. (It is NOT an npm module.)
I'm not overly familiar with grunt, etc., but I can get the dependencies to install and use all this code by placing my own mybot.js in the root folder of the git clone and running node mybot.js, with mybot.js based on the files in the example folder.
Committing to my own repository, I don't want to commit any of the aforementioned project code -- it's not mine! I do, however, want it as a dependency. Unfortunately, this code by Slack is not an npm module, which would make this easy. The project has a /bin folder and a /src folder full of CoffeeScript, etc. that grunt builds into .js files.
The Slack project code has its own dependencies. In my way of thinking, those are sub-dependencies for me, or cascading dependencies. My project only depends on whatever the Slack project depends on.
I would like to be able to update my project with updates (manually, or via build) from the git repo of the Slack project as needed.
It seems there must be a way for me to include this project as a dependency and, once built, properly reference its bin and src folder objects (bin/slack, src/message, client, channel, user, etc.) without committing it to my own repository. It would be especially great if it could live in a subfolder separate from my own model definitions. In a way, this seems no different to me than including jQuery in my website layout via a CDN: I'm only asking for the jQuery project, and depending on my link flavor, I can get a specific version or the latest version, etc.
So, it turns out the comment by Ben pointing me to the npmjs.com slack-client npm module was the help I really needed. I just didn't really know how to ask the right question, I think.
And while I hate to look a gift horse in the mouth, a little more than a link, Ben, would've saved me another three hours, probably. Perhaps: "It is an npm module, not just a project from GitHub." But thank you, even if it took me a while to decipher what you were saying.
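In other words, the dependency can be declared like any other npm package (the usage line below is only a placeholder; check the slack-client package documentation for its actual API):
npm install --save slack-client
// in mybot.js
var Slack = require('slack-client'); // see the package docs for how to construct and use the client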
