Here's my pre-commit hook:
#!/bin/sh
exec node build.js
That code works fine when I rename pre-commit to pre-commit.sh and run it by hand, and it also executes correctly when I just run exec node build.js in the terminal. The build file itself works fine.
Here's build.js:
var fs = require("fs")
var through2 = require('through2');
var markdownPdf = require("markdown-pdf")
var removeMarkdown = require("remove-markdown")
var resume = fs.createReadStream("README.md")
var pdf = fs.createWriteStream("Resume - Desmond Weindorf.pdf")
var txt = fs.createWriteStream("Resume - Desmond Weindorf.txt")
var md = fs.createWriteStream("Resume - Desmond Weindorf.md")
process.stdout.write('Building other file types...\n')
// pdf
resume.pipe(markdownPdf({ paperBorder: "1.4cm" })).pipe(pdf)
// txt
resume.pipe(through2(function(line, _, next) {
  this.push(removeMarkdown(line.toString()) + '\n');
  next()
})).pipe(txt)
// md
resume.pipe(md)
I thought it might be ending prematurely before the new files are written to (and it probably is), but even in that case the terminal should still display the initial write output.
This was my output on a commit (the pdf was changed beforehand to test if it was overwriting with new changes):
Desmonds-MacBook-Pro:resume desmond$ git commit -am "updated resume"
[master 7faab35] updated resume
4 files changed, 36 insertions(+), 34 deletions(-)
rewrite Resume - Desmond Weindorf.pdf (81%)
What am I doing wrong here?
I want to use a Node.js script to clone a repo and run some other operations against it. However, whenever I do shell.cd(path) as seen below, it crashes with:
"No directory name could be guessed"
Here's the script
const nodeCron = require("node-cron");
const shell = require('shelljs');
const path = './';
require('dotenv').config();
const start = Date.now();
async function GitOps() {
  console.log("Running scheduled job", start);
  shell.cd(path);
  shell.exec('git clone -b dev https://', process.env.USERNAME, ':', process.env.PASSWORD, '@github.com:Jamesmosley/xyz-git-ops.git');
  return console.log("Job finished");
}
const job = nodeCron.schedule("* * * * *", GitOps);
I mean to clone right into my working directory. I tried a few things, like setting the path constant to 'pwd' and appending the target folder to the end of the clone command, to no avail:
shell.exec('git clone -b dev https://', process.env.USERNAME, ':', process.env.PASSWORD, '@github.com:Jamesmosley/xyz-git-ops.git' ./);
After all, the only thing missing was literally pasting my absolute path into the path variable:
const path = '/Users/Jamesmosley/Documents/Git Ops Cron/repos';
then doing shell.cd(path)
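For what it's worth, there is a second problem in the exec call above (my reading, not part of the fix described here): shelljs's exec takes a single command string, so the extra comma-separated arguments after it are silently dropped. A hedged sketch of building the command as one string first (the fallback credentials here are stand-ins):

```javascript
// Stand-in credentials; in the real script these come from dotenv
var user = process.env.USERNAME || 'user';
var pass = process.env.PASSWORD || 'secret';

// Build the whole command as ONE string before handing it to exec;
// shell.exec('a', 'b', 'c') does not concatenate its arguments.
var cmd = 'git clone -b dev https://' + user + ':' + pass +
    '@github.com/Jamesmosley/xyz-git-ops.git';

// shell.exec(cmd);  // left commented: needs shelljs and network access
console.log(cmd);
```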
I want to update the package version automatically when I create a pull request from the release branch to master. After that, whenever I merge it, I want the pre-merge git hook to run and launch another script.
pre-merge-commit:
#!/bin/sh
cd my_app
node ./hooks/post-commit-version
RETVAL=$?
if [ $RETVAL -ne 0 ]; then
  exit 1
fi
hooks/post-commit-version:
#!/usr/bin/env node
const exec = require('child_process').exec;
const path = require('path');
const moment = require('moment');
const fs = require('fs');
function getBranch() {
  return new Promise((resolve, reject) => {
    exec(
      "git branch | grep '*'",
      function (err, stdout, stderr) {
        if (err) return reject(err);
        const name = stdout.replace('* ', '').replace('\n', '');
        resolve(name);
      }
    );
  });
}
getBranch()
  .then((branch) => {
    if (branch === 'release') {
      const currentDate = moment().format('YY.MM.DD');
      var pathToFile = path.join(__dirname, "../package.json");
      if (fs.existsSync(pathToFile)) {
        const data = fs.readFileSync(pathToFile, 'utf-8');
        const content = JSON.parse(data);
        content.version = currentDate;
        fs.writeFileSync(pathToFile, JSON.stringify(content, null, 2), 'utf8');
        exec(`git add ${pathToFile}`, (err, stdout, stderr) => {
          if (err) console.log(err);
          console.log(stdout);
        });
      } else {
        console.log("Cannot find file : " + pathToFile);
        return;
      }
    }
    return;
  })
  .catch(error => {
    console.log(error);
  });
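The branch-detection step in hooks/post-commit-version can be exercised in isolation. A small sketch of just the parsing half (my own helper, not the poster's code), fed the kind of output `git branch` produces, where the current branch line is starred:

```javascript
// Parse `git branch` output: the current branch line starts with "* ".
// This is only the string handling; the exec() call stays as it was.
function currentBranch(stdout) {
  var line = stdout.split('\n').filter(function (l) {
    return l.indexOf('*') === 0;
  })[0];
  return line ? line.replace('* ', '').trim() : null;
}

console.log(currentBranch('  dev\n* release\n  master\n')); // → release
```

A simpler alternative worth knowing: `git rev-parse --abbrev-ref HEAD` prints only the current branch name, with nothing to parse.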
When I try this locally with a pre-commit hook and run the git commands manually, it works successfully and updates the repository on GitHub the way I want. But I'm not sure that git hooks are executed on the GitHub server when I click the merge request button.
The short answer is no.
Hooks are tied to one specific repository and are not transferred by Git operations.¹ Any hooks you set up in your repository are not set up in some other repository. So the hook you have in your repository acts in and on your repository, but if you have a second clone elsewhere, it does not act in and on that second clone.
Besides this, GitHub uses a different mechanism ("GitHub Actions") and simply doesn't let you put any hooks into its repositories in the first place.
¹ If your OS provides symbolic links, you can (manually, once per clone) install a symlink as a Git hook, with the symlink pointing to a file in the work-tree for your repository. In this way, you can get a hook that is affected by various operations: since the hook's actual executable code lives in your work-tree, things that affect that file in your work-tree affect the hook.
Similarly, on OSes that don't provide symbolic links, you can (manually, once per clone) install a hook script-or-binary that works by running a script-or-binary out of your work-tree. That is, rather than relying on the OS's symbolic link mechanism to run the file directly from your work-tree, you write a hook whose "run" operation consists of "run file from work-tree and use its exit status as the hook's exit status".
I am having this command I use to get the contents inside a git repository:
git archive --remote=ssh://git@git/repository.git HEAD filename.txt | tar xvOf -
The first part of the command returns the contents of filename.txt inside the repository to the standard output.
The second part is there in order to remove the pax_global_header that git automatically adds to the contents.
I want to implement this command as a spawned child_process on node.js v0.10.36
Here is what I've tried:
var git = spawn("git", ["archive", "--remote=" + repositoryUrl, "HEAD", filename]);
var tar = spawn("tar", ["xvOf", "-"]);
var out = [];
tar.stdout.pipe(git.stdout).on('data', function(data) {
  var string = data.toString();
  if (string) {
    out.push(string);
  }
});
but when I run this, the strings I get inside the data handler look as if the git output never gets piped through the tar process.
What am I doing wrong?
I figured out the answer 10 seconds after posting this...
What I should have done was pipe the output of git to the input of tar and work with the streams output by tar.
So my code looks somewhat like this:
var git = spawn("git", ["archive", "--remote=" + repositoryUrl, "HEAD", filename]);
var tar = spawn("tar", ["xvOf", "-"]);
var out = [];
git.stdout.pipe(tar.stdin);
tar.stdout.on('data', function(data) {
  var string = data.toString();
  if (string) {
    out.push(string);
  }
});
....
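One refinement of the collection step above (a suggestion of mine, not part of the original answer): calling data.toString() per chunk can split a multi-byte character across chunk boundaries. Accumulating raw Buffers and decoding once at the end avoids that:

```javascript
// Collect raw Buffer chunks; decode a single time when the stream ends.
var chunks = [];

function onData(chunk) {
  chunks.push(chunk);
}

function result() {
  return Buffer.concat(chunks).toString('utf8');
}

// In the real code: tar.stdout.on('data', onData);
//                   tar.stdout.on('end', function () { use(result()); });
onData(Buffer.from('hel'));
onData(Buffer.from('lo'));
console.log(result()); // → hello
```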
I want to filter just the files which include the directory ui-router somewhere in the middle of the path.
I have the following code:
var gulp = require('gulp');
var mainBowerFiles = require('main-bower-files');
var debug = require('gulp-debug');
var gulpFilter = require('gulp-filter');
gulp.task('default', function() {
  var bower_files = mainBowerFiles();
  var js_filter = gulpFilter(['**/*.js']);
  gulp.src(bower_files)
    .pipe(js_filter)
    .pipe(debug({title: 'unicorn:'}))
  var js_filter = gulpFilter(['**/ui-router/**']);
  gulp.src(bower_files)
    .pipe(js_filter)
    .pipe(debug({title: 'unicorn1:'}))
});
The output is:
[12:10:53] unicorn: bower_components\ngstorage\ngStorage.js
[12:10:53] unicorn: bower_components\ui-router\release\angular-ui-router.js
[12:10:53] unicorn: bower_components\x2js\xml2json.min.js
[12:10:53] unicorn1: 0 items
Meaning that ['**/*.js'] works to select all the js files.
But ['**/ui-router/**'] does not work. What is wrong with this pattern?
I read the node-glob documentation (https://github.com/isaacs/node-glob) and I don't see why it shouldn't work.
You can filter the result of main-bower-files:
var files = mainBowerFiles(/\/ui\-router\//);
After hacking at this for a long time I found the issue.
gulp-filter matches against the vinyl file.relative property (see the comment from Sindre Sorhus). In our case the files are listed without globs (as I understand it), so file.relative is just the file name, without the directory.
The solution is to write gulp.src(bower_files, {base: __dirname}) instead of gulp.src(bower_files).
This tells gulp where each file's relative path should start.
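The failure mode is easy to reproduce without gulp at all. This illustration uses a plain regex as a stand-in for the real glob engine (an approximation, not how minimatch actually works): a pattern that requires a directory component can never match a bare file name:

```javascript
// Stand-in for '**/ui-router/**': requires a 'ui-router/' path segment.
var pattern = /(^|\/)ui-router\//;

// What gulp-filter saw with the default base: only the file name.
var baseOnly = 'angular-ui-router.js';

// What it sees once base makes file.relative keep the directories.
var withDirs = 'bower_components/ui-router/release/angular-ui-router.js';

console.log(pattern.test(baseOnly));  // → false
console.log(pattern.test(withDirs)); // → true
```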
I'm trying to write a Gulp task that will bump my project version and commit the changes to the relevant files, but only if the version has not already been bumped.
var fs = require('fs'),
    gulp = require('gulp'),
    $ = require('gulp-load-plugins')();
/**
 * Plugins being loaded are:
 * gulp-bump
 * gulp-git
 * gulp-if
 */
// Wrapping module.exports = function... omitted
function getVersionFromFile() {
  return JSON.parse(fs.readFileSync('./package.json')).version;
}
// This task is run as part of the build process
gulp.task('bump-version-if-needed', function(done) {
  // Only bump the version if the current version is the same
  // as the most recent git tag
  $.git.exec({args: 'describe --abbrev=0'}, function(err, stdout) {
    var lastTag = stdout.match(/v(\d*\.\d*\.\d*)/)[1],
        currentVersion = getVersionFromFile();
    gulp.src(['./package.json', './bower.json'])
      .pipe($.if(lastTag === currentVersion, $.bump()))
      .pipe(gulp.dest('./'))
      .pipe($.if(lastTag === currentVersion, $.git.add()))
      .pipe($.if(lastTag === currentVersion,
        $.git.commit('Version is now ' + getVersionFromFile())))
      .on('end', done);
  });
});
This nearly works, except that the second call to getVersionFromFile(), which fetches the version number for the commit message, returns the old, un-bumped version, as it is called before the stream executes. I can't pass a callback to $.bump() to retrieve the new version; it only takes an options object.
I had previously split the git add and git commit part of this task into their own task; getVersionFromFile() then returned the correct value, but I lost the ability to run the git commands conditionally, and an error would be thrown if there were no changes to commit (which would happen when the version hadn't been changed by bump-version-if-needed).
Is there a way to change how I'm calling getVersionFromFile() so that it's only run after Gulp has written the changes to my file?
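One way out (a sketch of the idea, not a tested gulp task): compute the bumped version up front, pass it to gulp-bump via its version option, and reuse the same string in the commit message, so nothing depends on reading the file mid-stream. The helper below only handles the patch component and is my own stand-in for a proper semver bump:

```javascript
// Compute the next patch version from a semver-ish string,
// e.g. '1.2.3' -> '1.2.4'. Stand-in for semver.inc(version, 'patch').
function bumpPatch(version) {
  var parts = version.split('.').map(Number);
  parts[2] += 1;
  return parts.join('.');
}

var nextVersion = bumpPatch('1.2.3');
console.log(nextVersion); // → 1.2.4

// In the task, the idea would look like this (untested):
//   var next = bumpPatch(getVersionFromFile());
//   .pipe($.if(shouldBump, $.bump({version: next})))
//   .pipe($.if(shouldBump, $.git.commit('Version is now ' + next)))
```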