Orchard CMS - Multiple Module Directories

Is it possible to configure multiple root Module directories in Orchard? My use case is that I want to keep my custom modules completely separate from the Git clone of the Orchard repository, to make it easier to pull down the latest Orchard changes without having my customizations in the mix.

One solution to this problem that I often use is to store the modules in separate repositories and create links to them in Orchard's Modules folder. For example, if you store your module's code in C:\Modules\MyModule and you want to use it with an Orchard enlistment in C:\Orchard, you can create a directory link (using the mklink command in cmd.exe) in C:\Orchard\src\Orchard.Web\Modules which points to C:\Modules\MyModule. You can then use the module's code as if it were located directly in the Modules folder. You can even modify the code in the Modules folder and commit the changes from C:\Modules\MyModule.
Here is an example of a script which creates such links:
https://github.com/Proligence/OrchardPs/blob/master/MapToOrchard.cmd
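For illustration, a single link can also be created by hand. The paths below are just placeholders, and note that for a directory mklink creates a junction (or symlink) rather than a file hardlink:
:: run in cmd.exe; /J creates a directory junction pointing at the external module folder
mklink /J C:\Orchard\src\Orchard.Web\Modules\MyModule C:\Modules\MyModule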

This is currently not supported, but it most likely will be in the next major version of Orchard, since there is an open PR for it: https://github.com/OrchardCMS/Orchard/pull/5973

Related

Share and manage rc files (or config files) between multiple projects

I'm working on different Node-based projects, and one thing I always have to do is create the configuration files in each project, since they all share a lot of configuration; for example, in all projects I use commitlint, lint-staged, husky, eslint, nodemon, and TypeScript, among other settings.
How could I share all these settings across projects so that, if I update any of them, the update reaches all projects?
The first thing that occurs to me is to create an npm package with all the configurations, plus a few scripts that copy/update these configuration files into the root directory of whatever project the user is in, something like
> myscript update
> myscript init
Another option would be to use the configurations programmatically, that is, use a .js file instead of an .rc file; but this would force me to manage the dependencies in each project and to create an .rc file that pulls in the configuration from the .js file living in the configuration package.
Another option is to create a GitHub template repository, but if I update that repository, the projects I have created from the template are not updated, right?
What do you think is the best way to do it?
Even though git submodules seem to be discouraged lately, I think they're the most reasonable choice (assuming all of your projects are git-based): https://git-scm.com/book/en/v2/Git-Tools-Submodules
In your case, you'd have a 'common' repository holding the shared configuration, e.g. configuration, and two projects, projectA and projectB. Both projects would have the submodule added:
git submodule add <your_git_repo_server>/configuration
Please note, however, that a submodule refers to a particular commit (not a branch or tag, a specific commit). You always have to take care to keep your dependencies synchronized correctly.
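As a rough sketch of the day-to-day workflow under that setup (the submodule name configuration and the branch name main are assumptions):
# after a fresh clone of projectA, fetch the pinned commit of the shared config
git submodule update --init
# move the submodule to the latest shared configuration and record the new commit
git -C configuration pull origin main
git add configuration
git commit -m "Bump shared configuration"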

Is it okay to use a single shared directory as Cargo's target directory for all projects?

Cargo has the --target-dir flag, which specifies a directory in which to store temporary or cached build artifacts. You can also set it user-wide in the ~/.cargo/config file. I'd like to set it to a single shared directory to make maintenance easier.
I saw that some artifact directories inside the target dir are suffixed with what look like unique hashes, which seems safe, but the final products are not suffixed with hashes, which doesn't seem safe against name clashes. I'm not sure about this, as I am not an expert on Cargo.
I tried setting ~/.cargo/config to
[build]
target-dir = "./.build"
My original intention was to use the project's local ./.build directory, but somehow Cargo places all build files into the ~/.build directory. I got curious about what would happen if I put all build files from every project into a single shared build directory.
It has worked well with several different projects so far, but working for a few samples doesn't mean it's designed or guaranteed to work with every case.
In my case, I am using a single shared build directory for all projects of all workspaces of a user, not just the projects in one workspace; literally every project in every workspace of a user. As far as I know, Cargo is designed to work with a local target directory, and if it is designed to work only with a local directory, a shared build directory is likely to cause issues.
Rust/Cargo 1.38.0.
Yes, this is intended to be safe.
I agree with the comments that there are probably better methods of achieving your goal. Workspaces are a simple solution for a small group of crates, and sccache is a more principled caching mechanism.
See also:
Fix running Cargo concurrently (PR #2486)
Allow specifying a custom output directory (PR #1657)
Can I prevent cargo from rebuilding libraries with every new project?
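For reference, the shared-directory setup from the question can also be tried per shell session through the CARGO_TARGET_DIR environment variable instead of ~/.cargo/config (the directory name below is just an example):
# every cargo invocation in this shell writes its artifacts to one shared directory
export CARGO_TARGET_DIR="$HOME/.shared-cargo-target"
cargo build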

Share one file across multiple git repos to be updated by multiple users

I am working on automating the markdown spell check for all the documents on my website, which involves multiple git repos. I have a .spelling file that contains all the words to be excluded from the documents. I would like to keep it as one file that is updated across the entire website. I can get it to work for one repo. I looked into the npm package method. Is there a way to configure package.json to share this file with many repos? Or is there a better way to do it without npm? Thanks!
Make a separate spell-check repository with the .spelling file and script in it, then include it as a submodule in each of your docs repos. You can then reference it from each repository separately, and pull its latest updates into each one.
This could be cumbersome if you have a large number of docs repos, so another alternative is to centralize the spell check script by making a separate repository for it and adding a configuration file that tells your script which GitHub repositories to spellcheck. This way, you can selectively apply the spell check process to any number of repositories in your organization.
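A rough sketch of the first approach (the repository name spell-check is a placeholder, and the symlink is an assumption; adjust to however your spell checker locates .spelling):
# in each docs repository
git submodule add <your_git_repo_server>/spell-check
ln -s spell-check/.spelling .spelling
# later, pull the latest shared word list and record the new submodule commit
git submodule update --remote spell-check
git add spell-check
git commit -m "Update shared .spelling"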

Is it possible to have commands available in specific directories only?

Is it possible to have a directory-isolated bin folder, with all installed packages available only in that specific directory?
For example, I have a directory ~/projects and I would like to have the git command available only in that folder.
I think you may be interested in using one of these two tools:
https://github.com/kennethreitz/autoenv
https://github.com/direnv/direnv
The first tool (autoenv, mostly written in Bash) is simpler to install and use but is not maintained anymore, and the second tool (direnv, mostly written in Go) provides more features, including the ability to unset environment variables.
For more details on their respective features, you can take a look at this GitHub issue.
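As a rough illustration of the direnv approach (the ~/projects path and the local ./bin directory are just examples), you would drop an .envrc file into the directory:
# ~/projects/.envrc - direnv's PATH_add helper prepends ./bin to PATH while you are inside this tree
PATH_add bin
Then run direnv allow ~/projects once from the shell to approve the file.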

How do I properly deal with a symlink folder when using Subversion?

I want to add my project to a Subversion repository. The project folder contains a symlink to a folder containing thousands of txt files that I don't need to add to the svn repository. I DO want the symlink-folder to show up when I check out the code, however.
It looks like I can use svn propset svn:ignore symlinked-folder . to ignore the folder, but then I'll have to add that symlinked folder to every working copy I check out before everything will work.
Is there a proper way to do this?
Perhaps there is no way to deal with this, since a symlink is a filesystem artifact. Is there a better way to handle this situation?
CONCLUSION - EDIT
After all this investigation, I committed the symlink-folder by accident and SVN added it to the repository without adding any of the files within it. Upon checkout, everything works fine. The symlink-folder checked out and works.
I am using assembla to manage my SVN repository, so that might have something to do with this success.
The other answers are right: your symlink won't work if you check out the repository on Windows.
But if you're aware of that and you don't care, you can add just the symlink without its contents:
svn add -N your-symlink
See the svn add manual page for details on the -N (non-recursive) option.
I believe you are correct, imagine if a user checked out your repo under Windows - how would SVN create the symlink when the underlying OS doesn't support it?
Is the target folder that you are symlinking to under version control? If so, you can make use of the svn:externals property.
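For example (the URL and the folder name documents are placeholders), an externals definition on the project root could look like this:
svn propset svn:externals "documents http://svn.example.com/repos/docs/trunk" .
svn commit -m "Pull the documents folder in via svn:externals"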
You are right, it doesn't make sense to add a symlink to a repository. What would happen if someone checked out the source on a machine that didn't have access to the folder the symlink points to?
One way is to structure your repository so that you can check out the codebase without having to check out documents. E.g.:
Trunk
Tags
Branches
Documents
So you only check out the trunk or branch that you are working on, and when you require it you can check out the documents.
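For illustration (the repository URL is a placeholder), the checkouts would then look like:
# normal working copy without the documents
svn checkout http://svn.example.com/repos/project/trunk project
# fetch the documents separately, only when needed
svn checkout http://svn.example.com/repos/project/Documents documents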
Alternatively, use a project management tool like Redmine to store your docs. It can integrate with svn as well so you can view your repository and manage permissions through it.
