Keeping .git/HEAD - node.js

I want to use the git-branch npm module to detect my branch in the code, but Heroku doesn't ship the .git folder, so git-branch throws an error.
Is there a way to make that file accessible?
If not, is there a workaround, like copying the file and restoring it later?
I'm auto-deploying a Node.js app from GitHub.

No, there is no way. Heroku will not send the .git folder to the build.
Even if that folder were kept, you would always see the master branch, as that is always what you push to.
However, you can still retrieve the pushed Git commit (not the branch) using the dyno metadata feature, which sets environment variables with various information about your app. HEROKU_SLUG_COMMIT will be the commit you deployed.
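A minimal sketch, assuming the Heroku CLI is installed and your-app stands in for your app's name:
heroku labs:enable runtime-dyno-metadata -a your-app
# the variables appear on the next deploy; then, inside the dyno:
node -e "console.log(process.env.HEROKU_SLUG_COMMIT)"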

Related

git: only saving certain files I want and protecting some other files from pushing

I made a small Snake clone, and it uses a small JSON database through Node Express. I deployed the game to Heroku using git. However, whenever I change index.js or index.html in my public path, commit them and git push origin master, it pushes every single file I have in my project, including all node modules and my JSON database. There were people's saved scores in my online JSON file, but after pushing my index.js, the online JSON's content was wiped.
So is there any way to push only the committed files? I tried other people's solutions, like making a new branch and checking out, but they don't work for me. When I created a new branch and checked out my paths, it told me I had a bunch of files that needed to be committed.
Also, every push takes a long time to upload because it uploads all the node modules every time.
What you are looking for is Git Ignore. It's a way to exclude certain files from being committed. As you pointed out, in Node.js it's good practice to exclude node_modules.
There's a project named gitignore.io, which helps you craft proper .gitignore files, with common patterns.
Since you have already committed your node_modules, you will have to remove the committed files from tracking with:
git rm -r --cached node_modules
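A minimal sketch of the whole fix; db.json stands in for whatever your JSON database file is called (the name is hypothetical):
# .gitignore at the project root
node_modules/
db.json
Then untrack what was already committed (the files stay on disk) and push:
git rm -r --cached node_modules db.json
git commit -m "Stop tracking node_modules and the database"
git push origin master
Note that an ignored file is no longer deployed at all, so on Heroku the scores would need to live somewhere other than a tracked JSON file.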
git push only pushes committed files. What you want is to use an ignore file so that unnecessary files never end up in a commit in the first place.
Check git ignore for how to use an ignore file.

How to update repository with built project?

I’m trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run grunt build locally, which builds the project and creates files in a 'dist' folder in the root of the project
commit changes
changes pulled onto production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I get the changed files in the download. However, I would prefer the files in the 'dist' folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo from inside a pipeline is a good idea. Version history wouldn't be as clear, and some people trigger their pipeline automatically on every push, which would set off an infinite loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it lets you save the freshly built project into a registry and reuse it whenever needed, at exactly the version you require and with the desired /dist inside, so you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done, as sketched below.
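A minimal sketch of such a Dockerfile, assuming grunt is a devDependency and server.js is the entry point (both names are assumptions for illustration):
# build dist inside the image so it ships with the image itself
FROM node:14
WORKDIR /app
COPY . .
RUN npm install && npx grunt build
CMD ["node", "server.js"]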
But to actually answer your question:
There is a feature request that has been open for a very long time for the same problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab team members state. However, you can push back changes, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your gitlab-ci file.
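A minimal sketch of what that could look like in .gitlab-ci.yml; the CI_PUSH_TOKEN variable (holding a token with write access) is an assumption you would have to configure yourself, and [skip ci] in the commit message keeps the push from retriggering the pipeline:
build:
  script:
    - npm install
    - npx grunt build
    - git config user.email "ci@example.com"
    - git config user.name "GitLab CI"
    - git add dist
    - git commit -m "Rebuild dist [skip ci]"
    - git push "https://oauth2:${CI_PUSH_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch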
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them: such pipelines are more error-prone, and if configured the wrong way they might, for example, publish confidential information or trigger an infinite loop of pipelines.
I hope you found this useful.

Update Gruntfile.js and Package.json across multiple projects

I am new to the world of grunt but I feel like there must be a way to do this. Hopefully I can explain my issue in a way that makes sense so you can be of assistance.
Essentially, I have a git project, including a gruntfile, that I use to start all new websites. I clone the project, delete the .git folder and set up a new project in Bitbucket for it. Over time I have had to make some modifications to the gruntfile, and it is annoying to go back to an old project that doesn't have those modifications. Is there a recommended way to ensure that my template is up to date in all of my projects?
Things to note:
1) I am familiar with grunt scaffolding but have never used it; is this the use case for it?
2) My projects live in Bitbucket and are private. My initial solution to this problem was to use grunt curl to pull the latest gruntfile and overwrite the previous one.
3) The issue with #2 is that I would need to put my username/password in the path, and I can't figure out how to prompt the user. Even if I do, and they enter the login incorrectly, Bitbucket still returns something (a bad login page), and that would overwrite my gruntfile.
Thanks in advance! I appreciate anyone's input.
I assume you are using git with Bitbucket. If that is the case, you can pull from a master repo that contains your template grunt file into each of your project repositories for the desired effect.
See this answer for how to pull from a remote repo.
remote repo q
Since you only care about merging in changes to Gruntfile.js, you can pull it specifically from the remote template repo. I'd suggest following this pattern, assuming you add the remote reference to your template repo when necessary:
From your project repo, create a new branch
Pull the Gruntfile.js from the template repo
Resolve any merge conflicts
Merge with master
See the last answer on this question for how to pull a single file:
fetch a single file
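A minimal sketch of those four steps, with the remote name and repository URL as placeholders:
# one-time: register the template repo as a remote called "template"
git remote add template git@bitbucket.org:you/site-template.git
# 1. from your project repo, create a new branch
git checkout -b update-gruntfile
# 2. pull only the Gruntfile.js from the template repo
git fetch template
git checkout template/master -- Gruntfile.js
# 3. review the result, then commit
git commit -m "Update Gruntfile.js from template"
# 4. merge with master
git checkout master
git merge update-gruntfile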

How do I package my node project files and deploy them to my server?

I have a non open source Node.js web application project that I want to deploy to a production server and/or a staging server. Is there a standard way to do this, or some tool that does it? I want to package all the files that are needed and exclude those that aren't, like the .git folder, the tests, and other files like Gruntfile, package.json and so on.
I could of course manually package the files in a tar.gz file and send them to the correct server, but I was hoping to find a more complete and configurable tool that can do it for me.
It might not be exactly what you're asking for, but I like git for automated deployment.
You could have branches like staging and production, which are checked out on the remote server.
You can set up a git hook like post-receive to update those remotely every time you merge changes into those branches.
Here's a tutorial: http://wekeroad.com/2011/09/17/deploying-a-site-with-git-hooks
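A minimal sketch of such a hook, assuming a bare repository on the server and /var/www/production as the deployment path (both are placeholders):
#!/bin/sh
# hooks/post-receive in the bare repo on the server; make it executable
GIT_WORK_TREE=/var/www/production git checkout -f production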

Syncing website files between local and live servers using GIT?

Say I have two web servers, one local development and one live.
Under SVN I would checkout the website files to my local webserver's public_html directory and also to the live webserver's public_html directory. I would then work on the files directly on the local server and commit any changes to the central repository.
When I'm ready for those changes to go live on the live server, I would SSH in and perform an SVN update.
Essentially I have two working copies, one on the live server and one locally, though other users may also have working copies on their local machines. But there will only ever be one working copy on the live server. The reason for this is so that we can just perform an SVN update on the live server every time we want changes to be published.
How can a similar workflow be accomplished using Git?
To model your current workflow almost exactly (concrete commands are sketched after this list):
Set up a git repo.
Clone the repo on the server and locally.
Work locally
git push to the git repo
ssh to server
git pull.
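A minimal sketch of those steps, with the server address and paths as placeholders:
# one-time setup
git init --bare /srv/git/site.git                     # on the server
git clone ssh://you@yourserver/srv/git/site.git site  # locally, and into public_html on the server
# each publish
git commit -am "describe your change"
git push origin master
ssh you@yourserver 'cd /path/to/public_html && git pull'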
Another way to do it would be to set up a "production" branch in git, have a cron job on the server that regularly pulls this branch, and then just merge into and push to the "production" branch any time you want to publish your changes. It sounds like you need a more concrete branching strategy.
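For example, a crontab entry on the server like this (the schedule and path are placeholders) keeps the working copy at most five minutes behind:
*/5 * * * * cd /var/www/site && git pull origin production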
See: Git flow branching model && git flow cli tool
Good luck! This is a very solvable problem with git.
You might find this useful: http://joemaller.com/990/a-web-focused-git-workflow/
In your local working copy:
git push ssh://you@yourserver/path/to/your/wc
will push the committed changes in your local version to yourserver.
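Note that by default git refuses a push into the currently checked-out branch of a non-bare repository. Assuming Git 2.3 or newer on the server, one way to allow it, and have the working copy updated on push, is to run this in the repository on the server:
git config receive.denyCurrentBranch updateInstead
With updateInstead, git still refuses the push if the server's working copy has uncommitted changes, which is usually what you want on a live site.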
A setup that automatically triggers a pull, as leonbloy and codemac suggested, may seem like a good idea at first, but it tends to be very fragile. I suggest a different alternative:
http://toroid.org/ams/git-website-howto
