Install JUST the `dist/` folder of the repo? - jspm

When I jspm install certain packages, what looks like the entire source of the dependency is downloaded. I'm all for reviewing the full source of something I incorporate, but the packages folder of a consuming app doesn't seem like the right place for it. Is there a way to specify a source subfolder (i.e. dist/) to pull from, rather than the repo's root folder? If this is documented, I'm having trouble teasing such search results apart from posts about overriding the destination folder.
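One approach worth trying, sketched here under the assumption of jspm 0.16-style overrides (some-package is a hypothetical name), is to override the package's served directory so that only the contents of dist/ end up under jspm_packages, via the consuming app's package.json:

{
  "jspm": {
    "overrides": {
      "npm:some-package@1.0.0": {
        "directories": { "lib": "dist" }
      }
    }
  }
}

If memory serves, a similar override can also be passed at install time with the -o flag, e.g. jspm install npm:some-package -o "{ directories: { lib: 'dist' } }", but treat both forms as a starting point rather than a guaranteed recipe for your jspm version.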

Related

Optimize retrieval of multiple git-sourced terraform modules from the same repo

We have terraform code in a git repo that references custom modules in another private repo:
module "myModule" {
source = "git::https://mygiturl//some/module"
bla bla bla...
}
When we reference multiple modules that live in the same git repo, terraform init will go and clone the same git repo repeatedly for every module reference. In the end, it takes minutes to do something that would take seconds if the same repo were not cloned repeatedly into different folders.
What options do we have for optimizing the module retrieval for speed?
The terraform init command does include an optimization: it tries to recognize when a module call has a package address matching a module that was already installed, and if so it copies the content already cached on local disk rather than retrieving it over the network a second time.
In order for that to work though, all of the modules must have the same package address. The "package address" is the part of the address which tells Terraform what "package" (repository, archive) it should download, as opposed to which directory inside that package it should look in to find the module's .tf files.
If you are specifying particular subdirectories inside a single repository then you are presumably already using the Modules in Package Sub-directories syntax where the package name is separated from the subdirectory path using a pair of slashes //, giving a source address like this:
module "example" {
source = "git::https://example.com/foo/bar.git//path/to/directory"
}
In the above, the package address is git::https://example.com/foo/bar.git and the subdirectory path is path/to/directory. It's the package address portion that needs to match across multiple module calls in order for Terraform to detect this opportunity for optimization.
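For example, the following two calls (the module names and subdirectory paths are just illustrative) share the package address git::https://example.com/foo/bar.git, so terraform init can clone the repository once and copy the cached content for the second call:

module "network" {
  source = "git::https://example.com/foo/bar.git//modules/network"
}

module "compute" {
  source = "git::https://example.com/foo/bar.git//modules/compute"
}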
Another option, if your goal is to have everything in a single repository anyway, is to use only relative paths starting with ../ and ./ in your module source addresses.
When you specify a local path, Terraform understands it as referring to another directory within the same module package as the caller, and so Terraform doesn't need to download anything else or create any local copies in order to create a unique directory for that call.
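A minimal sketch of such a local-path call (the path is illustrative):

module "network" {
  # Resolved inside the same module package as the caller; nothing extra is downloaded.
  source = "../modules/network"
}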
This approach does assume that you want to have everything in a single repository. If you have a hybrid approach where some modules are isolated into separate repositories but others are kept together in a large repository then that is a design pattern that Terraform's module installer is not designed to support well.
If the installer optimization isn't sufficient and you cannot use a single repository for everything then the only remaining option would be to split your modules across multiple smaller packages. A Git repository is one example of a "package", but you can also potentially add a level of indirection by adding a CI process to your repository which packages up the modules into separate packages and publishes those packages somewhere else that Terraform can install from, such as .zip files in an Amazon S3 bucket.
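As a rough illustration of that indirection (the bucket and object names are hypothetical), a CI job could zip each module and upload it, after which callers use Terraform's S3 source type:

module "network" {
  source = "s3::https://s3-eu-west-1.amazonaws.com/examplecorp-terraform-modules/network.zip"
}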
Terraform does not offer a way to share the same local directory between multiple module packages because modules are sometimes written in a way that causes them to modify their own source directory during execution (not a recommended pattern, but still possible) and in that case the module is likely to misbehave if multiple instances of it are trying to work in the same directory.

Share and manage rc files (or config files) between multiple projects

I'm working on different Node-based projects, and one thing I always have to do is create the configuration files in each project, since they all share a lot of configuration. For example, in all projects I use commitlint, lint-staged, husky, eslint, nodemon, and typescript, among other settings.
How could I share all these settings across all projects, so that if I update any of them the change reaches every project?
The first thing that occurs to me is to create an npm package with all the configurations, plus several scripts that copy or update these configuration files into the root directory of the project the user is in, something like the commands below (a sketch of such a package's manifest follows them):
> myscript update
> myscript init
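A minimal manifest for such a shared-config package might look like this (the package name, bin entry, and file layout are all hypothetical):

{
  "name": "@myorg/shared-config",
  "version": "1.0.0",
  "bin": {
    "myscript": "./bin/myscript.js"
  },
  "files": [
    "bin",
    "templates"
  ]
}

Consuming projects would then install it as a dev dependency and run the bin (for example via npx myscript init) to copy or refresh the config files in their root.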
Another option would be to use the configurations programmatically; that is, instead of a .rc file, use a .js file. But this would force me to manage the dependencies in each project and to create a .rc file that pulls its configuration from the js file living in the configuration package.
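As a sketch of that programmatic approach (the package and export names are hypothetical), the per-project eslint config shrinks to a one-line re-export:

// .eslintrc.js in the consuming project
module.exports = require('@myorg/shared-config/eslint');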
Another option is to create a GitHub repository as a template, but if I update that repository, the projects I have already created from the template are not updated, right?
What do you think is the best way to do it?
Even though git submodules seem to be discouraged lately, I think it's the most reasonable choice (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a 'common' repository holding the shared configuration, e.g. configuration, and two projects, projectA and projectB. Both projects would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Please note, however, that a submodule refers to a particular commit (not a branch or tag, but a specific commit). You always have to take great care to keep your dependencies synchronized correctly.
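A typical update cycle in each consuming project could look like this, assuming the submodule directory is named configuration:

git submodule update --init                   # after a fresh clone
git submodule update --remote configuration   # move the submodule to the latest commit of its tracked branch
git add configuration
git commit -m "Bump shared configuration"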

Share one file across multiple git repos, to be updated by multiple users

I am working on automating the markdown spell check for all the documents on my website, which spans multiple git repos. I have a .spelling file that contains all the words to be excluded from the check. I would like to keep it as a single file, updated across the entire website. I can get it to work for one repo. I looked into the npm package method. Is there a way to configure package.json to share this file with many repos? Or is there a better way to do it without npm? Thanks!
Make a separate spell-check repository with the .spelling file and script in it, then include it as a submodule in each of your docs repos. You can then reference it from each repository separately, and pull its latest updates into each one.
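A sketch of that layout (the repository URL and paths are hypothetical):

# in each docs repo
git submodule add https://example.com/myorg/spell-check.git spell-check
# point the spell checker at spell-check/.spelling, or symlink it into the root
ln -s spell-check/.spelling .spelling
# later, to pick up newly excluded words
git submodule update --remote spell-check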
This could be cumbersome if you have a large number of docs repos, so another alternative is to centralize the spell-check script: make a separate repository for it and add a configuration file that tells your script which GitHub repositories to spell check. This way, you can selectively apply the spell-check process to any number of repositories in your organization.

How to stage a repository with multiple folders

I have a multi-folder project that I want to push to a remote repository. It's a pretty standard-issue structure with a root folder that contains a client and a server. For some reason, when I try to stage this, server is the only folder acknowledged and committed. When it's pushed to GitHub, the repository shows both client and server folders, but server (which I built from scratch) is the only one that contains files and folders. The client folder (which was built with create-react-app) is empty.
Root
├── client (create-react-app client)
│   ├── .gitignore
│   └── package.json
└── server
    ├── .gitignore
    └── package.json
In other projects I've built everything from scratch and there's been no problem, so I figure it has something to do with the existence of two .gitignore files, two package.json files, or create-react-app itself. I've searched for answers here and in git's documentation, and I attempted to use submodules, but nothing I've found so far seems to work. Any help would be appreciated.

How do I properly deal with a symlink folder when using Subversion?

I want to add my project to a Subversion repository. The project folder contains a symlink to a folder containing thousands of txt files that I don't need to add to the svn repository. I DO want the symlink-folder to show up when I check out the code, however.
It looks like I can use svn propset svn:ignore symlinked-folder . to ignore the folder, but then I'll have to add that symlinked folder to every working copy I check out before everything will work.
Is there a proper way to do this?
Perhaps there is no way to deal with this, since a symlink is a filesystem artifact. Is there a better way to handle this situation?
CONCLUSION - EDIT
After all this investigation, I committed the symlink-folder by accident, and SVN added it to the repository without adding any of the files within it. Upon checkout, everything works fine: the symlink-folder is checked out and works.
I am using Assembla to manage my SVN repository, so that might have something to do with this success.
The answers above are right: your symlink won't work if you check out the repository on Windows.
But if you're aware of that and you don't care, you can add just the symlink without its contents:
svn add -N your-symlink
See svn help add for the details of -N.
I believe you are correct. Imagine a user checking out your repo under Windows: how would SVN create the symlink when the underlying OS doesn't support it?
Is the target folder that you are symlinking to under version control? If so, you can make use of the svn:externals property.
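If it is, the external could be wired up roughly like this (the URL and folder names are hypothetical):

svn propset svn:externals "symlinked-folder https://svn.example.com/repo/trunk/big-text-folder" .
svn commit -m "Pull the text folder in via svn:externals"
svn update    # populates symlinked-folder in the working copy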
You are right, it doesn't make sense to add a symlink to a repository. What would happen if someone checked out the source on a machine that didn't have access to the folder the symlink points to?
One way is to structure your repository so that you can check out the codebase without having to check out documents. E.g.:
Trunk
Tags
Branches
Documents
So you check out only the trunk or branch that you are working on, and check out the documents only when you require them.
Alternatively, use a project management tool like Redmine to store your docs. It can integrate with svn as well so you can view your repository and manage permissions through it.
