Explain package-lock.json and if mine will cause a problem - node.js

I just cloned our remote repo on my new system that I just installed Node/npm on. Then I ran npm install to get all the packages installed. Was this not the right command?
VSCode is showing me huge differences in the lock file. The lockfileVersion changed from 1 to 2, and there are many, perhaps hundreds, of changes in this huge file. Why would that happen, and what is the potential impact of checking this in?
It looks like the changes are mostly related to node modules. example:
"node_modules/css-declaration-sorter/node_modules/chalk/node_modules/supports-color": {}
Where that entry wasn't there in the existing repo.
Or am I making a big deal out of nothing?

Your package.json file specifies limits and ranges of acceptable versions, while the lock file specifies the exact versions you are using, taking into account all the dependency resolutions that were available the last time you ran install. The jump from lockfileVersion 1 to 2 simply means your new machine has npm 7 or later, which writes a newer (backward-compatible) lock file format; that alone accounts for most of the churn you're seeing.
In general, if your code builds and runs, you want to commit the lock file to your repository. This ensures the production build will use the exact versions you have built with.
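As a side note, on a fresh clone you can run `npm ci` instead of `npm install`: it installs exactly what package-lock.json specifies and fails rather than rewriting the lock file if it disagrees with package.json. A minimal sketch (using a throwaway demo project):

```shell
# Sketch: reproducible install from the lock file.
set -e
dir=$(mktemp -d) && cd "$dir"
printf '{"name":"demo","version":"1.0.0","private":true}\n' > package.json
npm install --no-audit --no-fund --silent   # creates package-lock.json
npm ci --no-audit --no-fund --silent        # strict install; never modifies the lock
node -p "require('./package-lock.json').lockfileVersion"
```

This makes `npm ci` a good default for CI pipelines and for checking out a colleague's work without perturbing the committed lock file.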

Related

Synchronise 2 repositories in git

I have my main project hosted in GitHub. Everything is good and works.
Now I'm trying to create a Solaris port. I made myself an OpenSolaris VM, installed Solaris Studio as the compiler/IDE, and built.
Everything works fine.
Now what I'm thinking is that since Solaris Studio is completely different IDE/compiler from MSVC/Anjuta/Xcode, I should create a different repository (NOT A FORK) and push Solaris stuff there.
The only problem is - code synchronization.
If I make the change in my main repository and push it to remote, I want my second repository to be updated as well with the changes to the *.cpp/.h files.
Does some kind of hook exist to do that?
Or maybe I'm better off with creating a fork? But then changes to the build system will be overwritten.
Please advise.
This is the current structure for the main project:
Project folder
├── main app folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)
└── library 1 folder (*.cpp, *.h, *.sln, Makefile.am/Makefile.in, xcodeproj folder)
Or maybe I'm better off with creating a fork? But then changes to the build system will be overwritten.
I wouldn't even bother with a fork.
I would just make sure the build system is isolated in its own folder, which allows you to have in one repository two build configuration folders:
one for a default environment
one dedicated to a Solaris environment
That way, you can go on contributing to the code, from either one of those environments, without having to deal with any synchronization issue.
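A possible layout under that approach (folder names here are purely illustrative):

```shell
# Sketch: one repository, shared sources, one build folder per toolchain
cd "$(mktemp -d)"
mkdir -p project/src             # shared *.cpp / *.h
mkdir -p project/build/default   # MSVC .sln, Makefile.am/Makefile.in, xcodeproj
mkdir -p project/build/solaris   # Solaris Studio project files
ls project/build
```

Each environment then opens only its own build folder, while commits to `project/src` are immediately visible to all of them.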

Share and manage rc files (or config files) between multiple projects

I'm working on several Node-based projects, and one thing I always have to do is create the same configuration files in every project, since they all share a lot of configuration. For example, in all projects I use commitlint, lint-staged, husky, eslint, nodemon, and typescript, among other settings.
How could I share all these settings across projects, so that if I update any of them, they are updated in all projects?
The first thing that occurs to me is to create an npm package with all the configurations, plus several scripts that copy/update these configuration files into the root directory of the project where the user is, something like
> myscript update
> myscript init
Another option would be to consume the configurations programmatically, that is, instead of a .rc file use a .js file; but this would force me to manage the dependencies in each project and create a .rc file that pulls in the configuration from the js file in the configuration package.
Another option is to create a github repository as a template, but if I update this repository, the projects I have created using this template are not updated, right?
What do you think is the best way to do it?
Even though git submodules seem to be discouraged lately, I think it's the most reasonable choice (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a 'common' repository holding the shared configuration, e.g. named configuration, and two projects, projectA and projectB. Both of them would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Please note, however, that a submodule refers to a particular commit (not a branch or tag - a commit). You always have to take care to keep your dependencies synchronized correctly.
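End to end, the workflow looks something like this (repository names and the `.eslintrc.json` symlink are illustrative; the example builds a local stand-in repo so it is self-contained):

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"
# Stand-in for the shared-config repo (normally on your git server)
git init -q configuration
echo '{ "root": true }' > configuration/.eslintrc.json
git -C configuration add -A
git -C configuration -c user.email=you@example.com -c user.name=you \
    commit -qm 'shared config'
# Consuming project: add the config repo as a submodule
git init -q projectA && cd projectA
git -c protocol.file.allow=always submodule add -q ../configuration configuration
# Point the project at the shared files, e.g. with symlinks
ln -s configuration/.eslintrc.json .eslintrc.json
```

Later, `git submodule update --remote configuration` in each project pulls the latest shared config, and you commit the updated submodule pointer like any other change.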

Monitor changes done by small program

I need to install small programs I do not fully trust.
Therefore I would like to monitor all files for changes - whether this script places some files it is not supposed to or edits others.
As I want to monitor all folders and files, I thought about using something similar to rsync - but is there an alternative that only watches for changes?
Does this way guarantee that I catch everything the software changes? Or are there some kind of "registry-entries" / changes in the configuration, I could miss?
Thanks a lot!
I would suggest you use some kind of sandbox (probably the most straightforward way nowadays is to use Docker).
You could use Git to track all the changes that are made into the sandbox/container:
Initialize a git repo in the root dir
Add all files and commit as the base version
Execute the install script you do not trust
Using git status is going to show you all the changes that were made during installation.
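The steps above can be sketched as follows (directory and file names are illustrative, and the untrusted installer is simulated by two echo commands):

```shell
set -e
# Record a baseline with git, run the untrusted installer, then diff.
# Best done inside a throwaway container (e.g. docker run --rm -it ubuntu).
dir=$(mktemp -d) && cd "$dir"       # stand-in for the tree you want to watch
echo original > existing.txt
git init -q
git add -A
git -c user.email=you@example.com -c user.name=you \
    commit -qm 'baseline before install'
# --- here you would run the installer you do not trust; simulated below ---
echo sneaky > dropped.txt
echo modified > existing.txt
git status --short                  # lists every added/modified/deleted file
```

Note this only catches filesystem changes under the tracked tree; changes outside it (or, on other platforms, registry-style configuration stores) are exactly why running inside an isolated container is the safer first line of defense.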

Delete files not needed anymore because of upgrade

When building a new version of my application, it's possible that files that were needed in a previous version are no longer needed. I would like these to be cleaned up during an upgrade. My ideas so far:
I considered using the [InstallDelete] section, but this would require the current build to know what files the previous build contained. The build process is automated, and I'd prefer that the build didn't have to check in anything. (It makes tagging and the like rather messy.)
I also considered running an uninstall, but this would mean that the upgrade could not be fully rolled back (since the application would have been uninstalled).
Is there a way to detect files that were present in the old install but not the new one during the install and to have Inno delete them in a way that could be rolled back (or that happens only if the install was successful)?

Clean a maven repository - Delete all files except the x newest per folder

I would like to clean my local maven repository, but keep the last y snapshot versions for each artifact.
I found this script but it works by date.
I think I could adapt it to count files instead, but I'd like to know if such a thing could be done directly with a Linux find command.
Any idea?
Thanks
If you have an in-house repository manager it's probably simpler to clear your local repository and let it populate itself again over time.
If you do not have a repository manager, get one, even if it is just for yourself. I use Nexus, but there are alternatives. Nexus lets you decide how many snapshots to keep and/or for how long. In the end, it's going to be easier to manage artifact lifetimes with Nexus than to devise complicated scripts.
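That said, if you do want a script, here's a minimal sketch. It assumes the standard <group>/<artifact>/<version>/ repository layout, that directory modification time reflects install order, and GNU find/ls (the function name is made up):

```shell
# Keep only the <keep> most recently modified *-SNAPSHOT version folders
# under each artifact directory; delete the rest.
prune_snapshots() {   # usage: prune_snapshots <repo-root> <keep>
  local repo=$1 keep=$2
  # %h prints the parent (artifact) directory of each snapshot folder
  find "$repo" -type d -name '*-SNAPSHOT' -printf '%h\n' | sort -u |
  while read -r artifact; do
    # newest first; everything after the first $keep entries goes
    ls -1dt "$artifact"/*-SNAPSHOT | tail -n +"$((keep + 1))" | xargs -r rm -rf
  done
}
```

For example, `prune_snapshots "$HOME/.m2/repository" 3` would keep the three newest snapshot folders per artifact. Test it on a copy first; unlike Nexus, a script like this has no undo.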
