Share and manage rc files (or config files) between multiple projects - node.js

I'm working on different projects based on Node, and one thing I always have to do is create the same configuration files in every project, since they all share a lot of configuration. For example, in all projects I use commitlint, lint-staged, husky, eslint, nodemon, and TypeScript, among other settings.
How could I share all these settings across projects so that, if I update any of them, the update reaches every project?
The first thing that occurs to me is to create an npm package with all the configurations and several scripts that copy or update these configuration files in the root directory of the project the user is in, something like
> myscript update
> myscript init
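For what it's worth, a minimal sketch of what such a package's script could look like; the package layout, the templates folder, and the file list are assumptions for illustration, not an existing tool:

#!/usr/bin/env node
// bin/myscript.js - copies the config files bundled with this package
// into the project it is run from (init and update could share this step).
const fs = require('fs');
const path = require('path');

// hypothetical list of rc files shipped in the package's templates folder
const FILES = ['.eslintrc.json', '.commitlintrc.json', 'nodemon.json', 'tsconfig.json'];
const source = path.join(__dirname, '..', 'templates');

for (const file of FILES) {
  fs.copyFileSync(path.join(source, file), path.join(process.cwd(), file));
  console.log(`updated ${file}`);
}

Exposed through a "bin" entry in the package's package.json, the same script could back both commands.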
Another option would be to use the configurations programmatically, that is, instead of using a .rc file, use a .js file. But this would force me to manage the dependencies in each project and to create a .rc file that consumes the configuration from the js file that lives in the configuration package.
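As a sketch of this second option, assuming a (hypothetical) published package named @myorg/shared-config, each project's local config file would shrink to a re-export:

// .eslintrc.js in each project - pulls the shared rules from the package
module.exports = require('@myorg/shared-config/eslint');

// commitlint.config.js - the same idea for commitlint
module.exports = require('@myorg/shared-config/commitlint');

Updating the shared package and running npm update would then refresh the configuration everywhere, at the cost of the extra dependency in every project.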
Another option is to create a GitHub repository as a template, but if I update this repository, the projects I have already created using this template are not updated, right?
What do you think is the best way to do it?

Even though git submodules seem to be discouraged lately, I think they're the most reasonable choice (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a 'common' repository with the configuration, e.g. configuration, and two projects, projectA and projectB. Both of them would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Please note, however, that a submodule refers to a particular commit (not a branch or tag, but a commit). You always have to take great care to keep your dependencies synchronized correctly.
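For example, the day-to-day commands would look like this (repository paths are illustrative):

# clone a project that already contains the submodule
git clone <your_git_repo_server>/projectA
cd projectA
git submodule update --init

# later: move the submodule to the latest commit of its tracked branch
# and record that new commit in the parent project
git submodule update --remote configuration
git add configuration
git commit -m "Bump shared configuration"

Each project pins its own commit of the configuration repository, so the bump has to be repeated (and committed) in projectB as well.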

Related

Is it okay to use a single shared directory as Cargo's target directory for all projects?

Cargo has the --target-dir flag, which specifies a directory to store temporary or cached build artifacts. You can also set it user-wide in the ~/.cargo/config file. I'd like to set it to a single shared directory to make maintenance easier.
I saw that some artifact directories inside the target dir are suffixed with what look like unique hashes, which seems safe, but the final products are not suffixed with hashes, which doesn't seem safe against name clashes. I'm not sure about this, as I am not an expert on Cargo.
I tried setting ~/.cargo/config to
[build]
target-dir = "./.build"
My original intention was to use the project's local ./.build directory, but somehow Cargo places all build files into the ~/.build directory. I got curious what would happen if I put the build files from every project into a single shared build directory.
It has worked well with several different projects so far, but working for a few samples doesn't mean it's designed or guaranteed to work with every case.
In my case, I am using a single shared build directory for all projects of all workspaces of a user. Not only the projects in one workspace: literally every project in every workspace of a user. As far as I know, Cargo is designed to work with a local target directory, and if it is designed to work only with a local directory, a shared build directory is likely to cause some issues.
Rust/Cargo 1.38.0.
Yes, this is intended to be safe.
I agree with the comments that there are probably better methods of achieving your goal. Workspaces are a simple solution for a small group of crates, and sccache is a more principled caching mechanism.
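For reference, the workspace route only takes a root Cargo.toml (member names here are placeholders); all members then share one ./target directory:

# Cargo.toml at the workspace root
[workspace]
members = ["crate-a", "crate-b"]

And sccache, which caches compiled artifacts across unrelated projects, is enabled with an environment variable:

cargo install sccache
export RUSTC_WRAPPER=sccache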
See also:
Fix running Cargo concurrently (PR #2486)
Allow specifying a custom output directory (PR #1657)
Can I prevent cargo from rebuilding libraries with every new project?

Share one file across multiple git repos to be updated by multiple users

I am working on automating the markdown spell check for all the documents on my website, which involves multiple git repos. I have a .spelling file that contains all the words to be excluded from the documents. I would like to keep it as one file, updated across the entire website. I can get it to work for one repo. I looked into the npm package method. Is there a way to configure package.json to share this file with many repos? Or is there a better way to do it without npm? Thanks!
Make a separate spell-check repository with the .spelling file and script in it, then include it as a submodule in each of your docs repos. You can then reference it from each repository separately, and pull its latest updates into each one.
This could be cumbersome if you have a large number of docs repos, so another alternative is to centralize the spell-check script by making a separate repository for it and adding a configuration file that tells your script which GitHub repositories to spellcheck. This way, you can selectively apply the spell-check process to any number of repositories in your organization, as sketched below.
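A rough sketch of that centralized variant, with a plain repos.txt listing one repository URL per line (the file names and the mdspell invocation are assumptions, based on the markdown-spellcheck npm package, which is the common tool that reads .spelling files):

#!/bin/sh
# spellcheck-all.sh - run the shared .spelling list against every repo
while read -r repo; do
  dir=$(basename "$repo" .git)
  git clone --depth 1 "$repo" "$dir"
  cp .spelling "$dir/.spelling"   # distribute the shared exclusion list
  (cd "$dir" && npx mdspell --en-us --report '**/*.md')
done < repos.txt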

How to update a repository with the built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run '$ grunt build' locally, which builds the project and creates files in a 'dist' folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the 'dist' folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo inside a pipeline is a good idea. Version control wouldn't be as clear, and some people trigger their pipelines automatically when the repo is pushed to, which would set off a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, at exactly the version you require and with the desired /dist inside, so you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker, you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
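A rough sketch of that Docker route (the base images and paths are assumptions, not taken from your project):

# Dockerfile - build once, keep the generated dist inside the image
FROM node:8 AS build
WORKDIR /app
COPY . .
RUN npm install && npm install -g grunt-cli && grunt build

# serve the built files from a small web server image
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html

After docker build -t registry.example.com/myproject . and docker push registry.example.com/myproject, production pulls and runs the image instead of pulling the repository.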
But to actually answer your question:
There is a feature request that has been open for a very long time for the same problem you are asking about: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push changes back, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your gitlab-ci file.
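A hedged example of how that could look in .gitlab-ci.yml; PUSH_TOKEN here is a name I made up for an access token with write access, which you would have to create and store as a CI/CD variable yourself:

build:
  stage: build
  script:
    - npm install && grunt build
    - git config user.email "ci@example.com"
    - git config user.name "GitLab CI"
    - git add dist
    - git commit -m "Rebuild dist [skip ci]" || echo "nothing to commit"
    - git push "https://oauth2:${PUSH_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch

The [skip ci] marker in the commit message keeps the pushed commit from triggering another pipeline, which avoids exactly the loop warned about above.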
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error-prone and, if configured the wrong way, they might for example publish confidential information or trigger an infinite loop of pipelines).
I hope you found this useful.

maintaining different package.json and config files for dev and prod

I have a production branch and a development branch on git.
When merging changes from the development branch to production, I would like to ensure that my package.json and gulpfile.js do not get the changes merged in.
How can I prevent this from happening? For deployment's sake, I want production's package.json and gulpfile.js to remain untouched by any changes to development's package.json and gulpfile.js.
You can use Git attributes to tell Git to use different merge strategies for specific files in your project. One very useful option is to tell Git to not try to merge specific files when they have conflicts, but rather to use your side of the merge over someone else’s.
Full article : http://git-scm.com/book/en/v2/Customizing-Git-Git-Attributes#Merge-Strategies
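Applied to this case, the technique from that chapter is: mark the two files with a custom merge driver in a .gitattributes file on the production branch, then define that driver as a no-op that keeps the current branch's version:

# .gitattributes (committed on the production branch)
package.json merge=ours
gulpfile.js merge=ours

# one-time setup: an "ours" driver that always keeps our version
git config --global merge.ours.driver true

With that in place, merging development into production leaves production's package.json and gulpfile.js untouched.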
I believe the best practice would be to keep all variants of the files in both branches, then define in the deployment process which file should be copied to the root (that file will be the one used).
In my opinion it is confusing to maintain many files per branch (especially when working with git flow).
If you want it even "cleaner", in my opinion (and I think this covers most scenarios, though it might not fit your needs), you should have only one gulpfile.js and one package.json, plus an additional config file that manages the differences (this file can be copied on deployment, or you can choose which config to use via an environment variable).
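For instance, a minimal sketch of that single shared config file (the file name, keys, and values are made up for illustration):

// config.js - one file for both branches; the environment picks the variant
const configs = {
  development: { minify: false, apiUrl: 'http://localhost:3000' },
  production:  { minify: true,  apiUrl: 'https://api.example.com' },
};

// gulpfile.js (and the app) read this instead of branch-specific files
module.exports = configs[process.env.NODE_ENV || 'development'];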

Orchard CMS - Multiple Module Directories

Is it possible to configure multiple root Module directories in Orchard? My use case is that I want to keep my custom modules completely separate from the git clone of the Orchard repository, and to make it easier to pull down the latest Orchard changes without having my customizations in the mix.
One solution for this problem that I often use is to store the modules in separate repositories and create links in Orchard's Modules folder. For example, if you store your module's code in C:\Modules\MyModule and you want to use it with an Orchard enlistment in C:\Orchard, then you can create a link (using the mklink command in cmd.exe) in C:\Orchard\src\Orchard.Web\Modules which points to C:\Modules\MyModule. You can then use the module's code as if it were located directly in the Modules folder. You can even modify the code in the Modules folder and then commit the changes from C:\Modules\MyModule.
Here is an example of a script which creates such links:
https://github.com/Proligence/OrchardPs/blob/master/MapToOrchard.cmd
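For reference, the link itself is a single mklink call in an elevated cmd.exe; note that for directories mklink creates a junction (/J) or a directory symbolic link (/D) rather than a true hard link, which exists only for files (paths follow the answer's example):

:: expose the external module inside Orchard's Modules folder
mklink /J C:\Orchard\src\Orchard.Web\Modules\MyModule C:\Modules\MyModule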
This is currently not supported, but most probably will be in the next major version of Orchard, since there is an open PR for it: https://github.com/OrchardCMS/Orchard/pull/5973
