Which hook is suitable for ignoring some (.exe, .xml) files?

Which hook is suitable for preventing .exe, .dll, .pdf and other binary files from being committed in GitLab? I don't know which one should be used (pre-commit, post-commit, and so on). My final goal is that users should not be able to commit binary files (.exe, .dll, .pdf) to any git branch.

A pre-receive or pre-commit hook is suitable for this kind of operation. A pre-commit hook runs on each developer's machine and can be bypassed or simply not installed, while a pre-receive hook runs on the GitLab server and is enforced for every push.
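For instance, a minimal pre-receive sketch (an assumption, not official GitLab tooling; the hook location and extension policy are placeholders):

#!/bin/sh
# Hypothetical pre-receive hook: reject pushes that touch .exe, .dll or .pdf
# files. Install it, executable, as custom_hooks/pre-receive in the bare
# repository on the GitLab server.
zero="0000000000000000000000000000000000000000"
while read oldrev newrev refname; do
    if [ "$newrev" = "$zero" ]; then
        continue                      # branch deletion, nothing to scan
    fi
    if [ "$oldrev" = "$zero" ]; then
        # New branch: inspect every file reachable from the pushed tip.
        files=$(git ls-tree -r --name-only "$newrev")
    else
        files=$(git diff --name-only "$oldrev" "$newrev")
    fi
    if printf '%s\n' "$files" | grep -qiE '\.(exe|dll|pdf)$'; then
        echo "Push rejected: .exe/.dll/.pdf files are not allowed." >&2
        exit 1
    fi
done
exit 0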

How to patch Linux kernel source code in Yocto

The part I'm working on is kernel-devsrc, which lives under the recipes-kernel directory.
I want to change one of the source .c files in drivers/usb/serial in kernel-devsrc. From some of the online materials, I need to:
Have my own layer
In the layer, create a directory structure with the same name as recipes-kernel (and furthermore, recipes-kernel/linux)
Add the .bbappend file and patch file.
The problem is: to create a patch file I need to know the two git SHAs from before and after the change, but I don't have access to the third-party recipes-kernel, so how do I get the SHAs?
OR, if that is the wrong way to do this, could you point out the right way to do it? Thanks!
NOTE: This problem is not like this one: How patching works in Yocto, where the author has access to the source code (.c and .h files). I DON'T have access to the source code; the Yocto kernel I'm working on is from a public git repo, and I am not able to git commit to get the SHA, which is necessary to create the patch file.
So, the way I do it is to use Quilt; follow the steps there and you're good to go:
https://www.yoctoproject.org/docs/1.8/dev-manual/dev-manual.html#using-a-quilt-workflow
I don't need to know the SHA (though I still don't know why others in my organization end up writing SHAs in the patch files, or how they knew the SHAs).
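A rough sketch of that workflow (the recipe and file names are assumptions; the source path is read from bitbake's environment dump):

# Hedged sketch of the Quilt workflow from the manual above.
cd "$(bitbake -e virtual/kernel | grep '^S=' | cut -d'"' -f2)"   # locate the kernel source tree
quilt new my-usb-serial-fix.patch        # start a new patch
quilt add drivers/usb/serial/option.c    # register the file before editing it
# ... edit drivers/usb/serial/option.c ...
quilt refresh                            # write the patch into the patches/ directory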
The power of Yocto is precisely that it makes it relatively straightforward to patch any existing recipes, without requiring write access to the upstream project source code or Yocto layer.
As a prerequisite, the project needs to have its own layer to track the patches. Then, the easiest way is to use devtool. The general idea is to:
Create a local sandbox to patch the project: devtool modify RECIPE_NAME (use the name of the target recipe here). This command will create a temporary workspace and print the path to this workspace.
Move to the temporary workspace, apply the needed patches and commit them one by one.
Once all the desired patches have been applied, use devtool finish RECIPE_NAME CUSTOM_LAYER_NAME to save the changes as clean patch files in a .bbappend in the custom layer.
Under the hood, devtool modify initializes a (writable) git repository in the sandbox. When devtool finish is invoked, devtool checks the list of extra-commits and saves them as patch files in a .bbappend in the target layer.
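A minimal sketch of that flow (the recipe and layer names here are placeholders):

devtool modify linux-yocto                  # prints the path of the sandbox workspace
cd workspace/sources/linux-yocto            # the path reported by the previous command
# edit drivers/usb/serial/option.c, then commit each logical change:
git add drivers/usb/serial/option.c
git commit -m "usb: serial: describe the fix here"
devtool finish linux-yocto meta-my-layer    # writes the patches and .bbappend into the layer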

Checking out another commit with git while reading a file

I am creating a server that handles version control of files on the server and lets clients view them at a specific commit if they want.
The way I implement this is that when user clicks a specific commit, I call checkout [hash of commit] to revert the file back to what it was and then read from that file.
The problem is that two people may be trying to read different commits of the same repository at the same time, meaning the state of file may change while reading from the file.
I tried checking out another commit while reading from a file and it seems to work okay, but I cannot be sure it will hold up when the system is scaled.
I am using Node.js and Express for my server. When Node.js starts reading a file, will it still see the file's state from the point when it started reading, or will the content change underneath it if I check out another commit while the file is being read?
Instead of using checkout, consider show:
git show <commit id>:<filename>
This will print the contents of the file at that commit. If you absolutely need it in a file, generate a unique temporary filename and redirect the output:
git show <commit id>:<filename> > tmpfile_uniquesuffix
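One way to generate that unique name, assuming mktemp is available on the server:

tmpfile=$(mktemp /tmp/fileview.XXXXXX)       # unique per request, so concurrent readers never share state
git show <commit id>:<filename> > "$tmpfile"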

Sync files between two VOBs (by clearfsimport) without checking in the updated files

I am using the following command to sync the B VOB's files from the A VOB:
clearfsimport -master -follow -nsetevent -comment $2 /vobs/A/xxx/*.h /vobs/B/xxx/
It works fine, but it checks in all the changes automatically. Is there a way to do the same task but leave the updated files in a checked-out state?
I want to update the files in B from A, build my program, and then recover the branch. If the updated files were left in a checked-out state, I could do an unco (uncheckout) later. But with the command above, everything is checked in, and I can't recover my branch.
Thanks.
As VonC said, it's impossible to prevent clearfsimport from doing the checkins, and he suggested using a label to recover afterwards.
For me, the branch where I did clearfsimport is branched from a label; let's call it LABEL_01. So I guess I can use that label for recovery. Is there an easy way (one command) to recover the files under /vobs/B/xxx/ to label LABEL_01? I want to do it in my bash script, so the shorter and easier the command is, the better.
Thanks.
After having a look at the man page for clearfsimport, no, it isn't possible to prevent the checkins.
I would set a label before the clearfsimport, and modify the config spec so that the new versions are created in a dedicated branch (similar to this config spec).
That way, recovering the initial branch would be easy: none of the new versions would have been created in it.
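A hedged sketch of that approach (the label type, branch type and paths are placeholders):

cleartool mklbtype -nc BEFORE_IMPORT                 # create the label type
cleartool mklabel -recurse BEFORE_IMPORT /vobs/B/xxx
# Config spec steering all new versions onto a dedicated import branch:
#   element * CHECKEDOUT
#   element * .../import_branch/LATEST
#   element * BEFORE_IMPORT -mkbranch import_branch
#   element * /main/LATEST -mkbranch import_branch
clearfsimport -master -follow -nsetevent -comment "sync" /vobs/A/xxx/*.h /vobs/B/xxx/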

Artefact folder structure does not contain empty directories

I'm trying to store the whole output of my build, which includes some empty folders. These aren't included by the artefact mechanism in TeamCity.
What doesn't work:
OAR => OAR.zip
OAR->OAR.zip
OAR
Inside of OAR I have a folder structure that needs to be stored. I know I could put a placeholder file in each folder, but that is not the answer I'm after. Otherwise I'll have to zip it myself?
Unfortunately TeamCity, by design, searches for files and uploads them as artifacts which means that empty folders are never included. Given the open and very old issue in the TeamCity tracker I doubt they are going to fix it any time soon.
I would recommend zipping the folder yourself; that is the approach we have taken. How you implement that depends on the build technology you are using. For example, if you are building using NAnt you could add the zip task to your build; there are similar options for MSBuild and Ant.
If you don't want to rely on the build performing the zip, I would recommend installing 7zip on your build agents and using the command line to perform the zip. Just remember, if you want 7zip to include empty directories, use * as the wildcard rather than *.*, like so:
7z a -r OAR.zip *
Technically you could use PowerShell to do the zipping, which would be better than having to install something on your agents. I haven't tried this option myself.
Apologies for not linking all my references above. Apparently, and understandably so, I need at least 10 reputation to post more than 2 links.

Layering projects on top of each other with git

Let there be:
different repositories repoA, repoB and repoC, each respecting the same directory layout principles, which are to be merged into a third repoM's working directory (the "master" project).
repoM has an atypical setup (--work-dir and --git-dir are separate). repo[A-C] are cloned as bare, and they are set as core.bare = false and core.worktree=<--work-dir-of-repoM>.
The requirements:
I need to always have an overview of the history of all files in repoM's work-dir which could have stemmed from repo[A-C]. With this approach, I lose all that information.
Alternative:
I've been thinking about using git-subtree instead (git version 1.7.11.2, so it's already built-in), leaving repo[A-C] bare, and then
git pull -s subtree, or
git subtree ...
With the subtree pull strategy, I lose the history on a merge conflict (git blame says so).
I've never used subtree before, but from my understanding it's not possible to merge files from repo[A-C] directly into the root of repoM's work-dir; those files must be put into a subdirectory of repoM. This is definitely not what I need. Why? Because of the following ...
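(For reference, a hedged illustration of that limitation: git subtree's --prefix argument is mandatory, so the imported files always land under a subdirectory; the URL and branch here are placeholders.)

git subtree add --prefix=vendor/repoA https://example.com/repoA.git master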
Problem statement:
You have different git repositories, each containing different sets of files, usually configuration files and some shell scripts. You want to put everything from all those repositories into the $HOME directory (which is <--work-dir-of-repoM>). You should be able to see at all times where each file comes from, and edit, commit and push changes to each one's origin. You've guessed it, it's something like vundle, but generalized for any kind of configuration of any program, not just vim bundles. If a conflict occurs, one should be able to track down which two authors of the same file need to get in touch with each other and make a deal (if one needs to be made).
This is for an open-source project I'm trying to get a prototype working for, so any help is highly appreciated. Ideas about already existing projects which do this in a similar manner are also highly appreciated.
Note: the "master directory" does not necessarily have to be $HOME, I've used it as a possible hint on the kind of problem this could solve.
Why not simply use Git Submodules in your "master project"?
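A minimal sketch of the submodule approach (URLs and paths are placeholders, and the symlink step is an assumption about how files would surface in $HOME):

cd /path/to/repoM-worktree
git submodule add https://example.com/repoA.git modules/repoA
git submodule add https://example.com/repoB.git modules/repoB
git submodule add https://example.com/repoC.git modules/repoC
git commit -m "Track repoA, repoB and repoC as submodules"
# Each file keeps its own history and origin; expose individual files in $HOME via symlinks:
ln -s "$PWD/modules/repoA/.bashrc" "$HOME/.bashrc"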
