How secure is it to use scp with GitHub Actions?

Right now I use scp-action to copy some files to a server.
How secure is this method?
What are the alternatives to using GitHub Actions? I was thinking of writing a custom scp action and setting up a self-hosted runner on my side.

One other alternative is to configure a bare repository on your server and add it as a second remote on your local repository.
Every time you want to deploy code to your server, you push to this remote. You then create a post-receive hook on your server that fires after each push and automatically executes a script that, for example, restarts a service.
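A rough sketch of that setup (user, host, and paths below are placeholders):

# on the server: create a bare repository to receive pushes
ssh user@yourserver 'git init --bare /home/user/site.git'

# locally: add it as a second remote
git remote add production user@yourserver:/home/user/site.git

# deploy: push, and let the server's post-receive hook do the rest
git push production master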
For me, I am having a hard time choosing between these two alternatives because I have some unanswered questions:
For GitHub Actions: how secure is the SSH key being used from a GitHub runner? And given that my codebase is huge, isn't it a bit overkill to scp all my files after a hotfix commit where I changed only one or two files?
For the bare git repo: would the size of the git folder be a problem? And how do I secure my server so it would not serve the .git folder?
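On the last question: if a working copy (with its .git folder) does end up under your web root, one common approach is to block it at the web-server level. A minimal sketch, assuming nginx (Apache has equivalent directives):

# refuse to serve anything under .git
location ~ /\.git {
    deny all;
}

Deploying from a bare repository with a detached work tree avoids the problem entirely, since no .git folder lands under the document root.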

Related

Good practices for pulling from git repo into production server

I have a DigitalOcean VPS with Ubuntu and a few Laravel projects. For a project's initial setup I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop); what I do is merge develop into my local master, then push master to my online repository.
Now, back on my production server: when I want to bring the changes into production I do a git pull from origin. So far this has resulted in git telling me to stash my changes. Why is this?
What would be the best approach to pull changes into the production server? Bear in mind that my production server has no working directory per se; all I do on my VPS is either clone or pull upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free-to-use plan for small teams.
You can create a pipeline with a manual deploy step (you have to press a button after the code is merged to the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp, etc.).
And the biggest benefit is that you can have multiple intermediate steps (even for the working branches) where you run unit tests, which prevents you from uploading failing builds whenever you merge non-working code.
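As a rough sketch, a minimal .gitlab-ci.yml for such a pipeline might look like the following (the test command, server name, and paths are placeholder assumptions, and the SSH key setup for the runner is omitted):

stages:
  - test
  - deploy

test:
  stage: test
  script:
    - ./run-tests.sh             # whatever your test suite is

deploy:
  stage: deploy
  when: manual                   # the "press a button" step
  only:
    - master
  script:
    - scp -r ./ user@yourserver:/var/www/app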
For the first problem, do a git status on production to see which files git sees as changed or added, and consider adding them to your .gitignore file (which itself should be part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them in the process of upgrading Laravel.
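For reference, a stock Laravel .gitignore covers roughly the following (check the defaults shipped with your Laravel version rather than taking this list as authoritative):

/node_modules
/public/hot
/public/storage
/storage/*.key
/vendor
.env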
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertible. For this, I would recommend choosing a deployment utility. These usually all do pretty much the same thing:
You define deployment parameters in code, which you can commit as part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSHes into your target server and pulls the latest code from the remote git repo (authorized via an SSH key forwarded into the server) into a 'release' folder.
The script does any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.).
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link back to the previous release folder and your site still works. (A rough shell sketch of this flow follows below.)
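Stripped to its essentials, the flow these utilities automate looks roughly like this (every path, host, and URL is a placeholder; real tools add locking, shared-directory handling, and pruning of old releases):

# run on the server (or via ssh from your machine)
RELEASE=/var/www/app/releases/$(date +%Y%m%d%H%M%S)
git clone --depth 1 git@github.com:you/app.git "$RELEASE"
cd "$RELEASE"
composer install --no-dev                         # your build tasks
ln -sfn /var/www/app/shared/.env "$RELEASE/.env"  # link shared files
# atomically repoint the document root at the new release
ln -sfn "$RELEASE" /var/www/app/current
sudo systemctl restart php-fpm
# rollback: repoint 'current' at the previous release folder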
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: A (free) tried-and-tested, popular Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. Easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require Ruby.
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.

Push files to GitHub from NodeJS server

I have a NodeJS server which generates server-side JS code into a separate folder on the server and then serves it to the user as a .zip file. I would like to be able to take this code and push it to a GitHub repository the user specifies (or even better, create a new branch and push it to that branch). I was checking the GitHub API but I could not find an endpoint that covers this situation. I also checked one node module, but the same story: no information about whether this is possible (and how) or not.
My question is: is it possible to take a folder on a server and push it to a GitHub repo (if all the credentials and keys are known) programmatically, and if so, can anyone please direct me to some resources? I tried to find something but nothing was relevant.
Thank you,
T.
I've just found these two libraries that can help you push your files to git from Node.js:
simple-git (git-js) is a lightweight wrapper around the installed git binary
nodegit is a standalone Git client implementation for Node.js (bindings to libgit2, so it doesn't shell out to a git binary)
Just use the provided API to commit and push files programmatically. You can create a separate branch if you want and tag your commits.
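Under the hood, either library runs the ordinary git steps; as a plain-shell sketch (the repo URL, branch name, and token-in-URL authentication are placeholder assumptions):

cd /path/to/generated/folder
git init
git checkout -b generated-code         # the branch for this user
git add -A
git commit -m "Add generated server-side code"
# one auth option: a personal access token in the remote URL (SSH keys also work)
git remote add origin https://<token>@github.com/user/repo.git
git push -u origin generated-code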
Just install Git Bash and do it through that. I don't use anything but the terminal for my git repos. Good luck.

Remote deploy scripts for Node.js?

I am looking for a way to easily deploy a Node.js app via a command-line script.
I found one solution:
https://github.com/Skookum/nimbus
I also heard that the whole thing can be done with git and post-commit hooks.
What would people recommend?
Edit: I am deploying it to my own box where I have root.
You have two options on a self-hosted setup.
Do it all yourself
This entails git post-receive hooks. In short, you set up your production box to host a copy of your repository, and on your local machine you set up a remote; let's call the remote production.
Now when you run git push production master on your local machine, the updates are sent and your server executes the post-receive hook, which runs whatever you wish.
Actions you may want are: checking out/writing the data in the repo to files/folders (the git repo on the server is stored as a bare repo); restarting your webserver; notifying you that there's been a deployment; etc.
I'd suggest reading up on it at http://git-scm.com/book/en/Customizing-Git-Git-Hooks and taking a look at a few tutorials; this one (http://ryanflorence.com/deploying-websites-with-a-tiny-git-hook/) looks pretty legit.
Use a service to manage it for you
http://www.deployhq.com/ is the only one that springs to mind, but I'm sure there are others.
Good Luck and Happy Hacking :)
There is a tool called shipit.js (https://github.com/shipitjs/shipit) which allows you to perform different deployment tasks, such as:
moving code from the repo to the server
restarting server
installing node_modules
etc.
You create a config file and then run npx shipit deploy, and all the tasks you specify are performed. In case of failure, it has a rollback mechanism.
There is a nice screencast about it: https://youtu.be/8PpBySjkWEM.

git post-receive hook which connects to remote via ssh and git pulls

Just trying to write my first git hook. I have a remote origin, and when I push to it I would like a post-receive hook to fire which will ssh into my live server and run git pull. Can this be done, and is it a good approach?
OK, I've got the hook firing and the live server is doing the git pull, but it's saying it's already up to date. Any ideas?
Yes, that can be done, and yes, in my opinion, it is good practice. Here is our use case:
I work in a small development group that maintains a few sites for different groups of people.
Each site has several environments: beta, which is a sandbox for all developers; staging, where we showcase changes to the content owners before going live; training, which our training dude uses to train new content managers; and live, where everyone goes to consume content.
We control deployment to all these environments via post-receive hooks based on branch names. We may have a 'hot fix' branch that doesn't get deployed anywhere, but when we merge it with, say, the 'beta' branch, it then gets auto-deployed to the beta server so we can test how our code interacts with the code of the other developers.
There are many ways you can do this. What we do is set up the SSH keys so that the git server can ssh into your web server and do a git pull. This means you have to add the public key for git@gitserver to your git@webserver authorized_keys file and vice versa; then, in the post-receive hook, you parse out the branch and write a case statement for it, like so:
#!/bin/sh
# post-receive: git feeds "<old-rev> <new-rev> <ref-name>" on stdin for each pushed ref
read line
# forward the line to the stock notification-email hook
echo "$line" | . /usr/share/doc/git-core/contrib/hooks/post-receive-email
# strip everything up to the last "/" to get the bare branch name
BRANCH=$(echo "$line" | sed 's/.*\///g')
case $BRANCH in
  "beta" )
    ssh git@beta "cd /var/www/your-web-folder-here; git pull"
    ;;
esac
Hope that helps.
That can certainly be done, but it isn't the best approach. When you use git pull, git will fetch and then merge into your working copy. If that merge can be guaranteed to always be a fast-forward, that may be fine, but otherwise you might end up with conflicts to resolve in the deployed code on your live server, which most likely will break it. Another problem with deploying with pull is that you can't simply move back to an earlier commit in the history, since the pull will just tell you that your branch is already up-to-date.
Furthermore, if you're pulling into a non-bare repository on your live server, you'll need to take steps to prevent data in your .git directory from being publicly accessible.
A similar but safer approach is to deploy via a hook in a bare repository on the live server, one that uses git checkout -f with the work tree set to the directory your code should be deployed to. You can find a step-by-step guide to setting up such a system here.
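For concreteness, here is a minimal sketch of such a hook (assuming the bare repo lives at /home/git/site.git on the live server and the site is served from /var/www/site; both paths are placeholders):

#!/bin/sh
# /home/git/site.git/hooks/post-receive -- remember to chmod +x this file
# check the pushed code out into the web root
GIT_WORK_TREE=/var/www/site git checkout -f master
# optional follow-ups: restart services, clear caches, etc.

Because the deployed tree lives outside the repository, there is no .git directory under the web root to protect, and checkout -f sidesteps the merge conflicts that git pull can run into.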

Syncing website files between local and live servers using Git?

Say I have two web servers, one local development and one live.
Under SVN, I would check out the website files to my local webserver's public_html directory and also to the live webserver's public_html directory. I would then work on the files directly on the local server and commit any changes to the central repository.
When I'm ready for those changes to go live on the live server, I would SSH in and perform an SVN update.
Essentially I have two working copies, one on live and one locally, though other users may also have working copies on their local machines. But there will only ever be one working copy on the live server. The reason for this is so that we can just perform an SVN update on the live server every time we want changes to be published.
How can a similar workflow be accomplished using Git?
To model your current workflow almost exactly, do the following (spelled out as commands after the list):
Set up a git repo.
Clone the repo on the server and locally.
Work locally.
git push to the git repo.
ssh to the server.
git pull.
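Spelled out as commands (host and paths are placeholders):

# one-time setup: a bare 'hub' repo plus two clones
ssh you@yourserver 'git init --bare /srv/git/site.git'
git clone you@yourserver:/srv/git/site.git           # local working copy
# on the server: clone the hub repo into the web root
git clone /srv/git/site.git /var/www/public_html

# every publish
git push origin master                               # from the local copy
ssh you@yourserver 'cd /var/www/public_html && git pull'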
Another way to do it would be to set up a "production" branch in git, have a cron job that continually pulls this branch on the server, and then just merge and push to the "production" branch any time you want to publish your changes. Sounds like you need a more concrete branching strategy.
See: the git-flow branching model and the git-flow CLI tool.
Good luck! This is a very solvable problem with git.
You might find this useful: http://joemaller.com/990/a-web-focused-git-workflow/
In your local working copy:
git push ssh://you@yourserver/path/to/your/wc
will push the committed changes in your local version to yourserver.
Having a setup that triggers automatic pulling, like leonbloy and codemac suggested, may seem like a good idea at first, but it tends to be very fragile. I suggest a different alternative:
http://toroid.org/ams/git-website-howto
