GitHub post hook to DEVELOPMENT VM - Linux

Overview
I have 2 servers: one is local and the other is hosted off-site.
1 - Production
1 - Development
The production server is hosted externally and has a public IP, so it can be reached by anyone on the web.
The development server can only be accessed internally (no public IP).
Both use the same URL, www.blah.com, and our developers switch between the two sites by editing their Windows hosts file to point to the correct server.
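For reference, the switch is just a hosts entry mapping the shared hostname to whichever server a developer wants to hit (the addresses below are made up):

    # C:\Windows\System32\drivers\etc\hosts
    10.0.0.20      www.blah.com    # internal development server
    # 203.0.113.5  www.blah.com    # public production server (swap the comment to switch)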
The Problem
How would I update the development server through GitHub on a push with a hook, given that there is no internet-facing URL? I suppose I could create a cron job, but I would love to use a hook somehow so it only updates when a push happens. Production has a URL, so I can use GitHub to do a post hook to update it.

If I understand correctly:
When PROD is pushed to, you want to update DEV
PROD cannot access DEV
In this case PROD cannot update DEV directly. If there is another site MIDDLE that PROD can access, and MIDDLE can access DEV, then you could set up a chain of triggers from PROD -> MIDDLE -> DEV. Otherwise the only way is a cron job on DEV, polling periodically.
If PROD could access DEV directly, then you could set up a webhook which would trigger a script on DEV, which would perform a pull from PROD. This is a common practice.
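A minimal sketch of the polling approach, assuming DEV can reach GitHub over HTTPS and the site lives in /var/www/blah (paths and branch name are placeholders):

    #!/bin/sh
    # /usr/local/bin/update-dev.sh - fetch, and fast-forward only when the remote has moved
    cd /var/www/blah || exit 1
    git fetch origin
    if [ "$(git rev-parse HEAD)" != "$(git rev-parse origin/master)" ]; then
        git merge --ff-only origin/master
    fi

    # crontab entry on DEV: poll every 5 minutes
    */5 * * * * /usr/local/bin/update-dev.sh

The same script is what a webhook (or a PROD -> MIDDLE -> DEV trigger chain) would end up running; the cron line is only needed for the polling variant.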

Related

nodeJS/GitlabCI: How to serve decentralized productive application

I'm developing a Node.js application using Next.js and Express, and I'm using my own GitLab instance to manage the Git repository.
But the application should not end up deployed to a web server; instead I need to create a decentralized production application. To make it a bit clearer:
Developing the application locally
Push application to my remote server
My customers should be able to get the production app code from my remote server
Customers will run the application in their local environment - they should be able to pull new versions from the remote server
So the application itself won't run on my remote server, but on the local server of the customers.
Normally I would use my CI to test and build the application (which is done by npm run build). Then I would build a Docker image which I use to run the application on my server. But all of that normally happens on the same server.
In this case I need to build the application and serve it to the customers / the customers should be able to pull the production code. How can this be done?
Maybe I can't see the wood for the trees... and that's why I'm asking for help/hints.
There are a number of ways you can do this and a number of tools you can use as well. You probably want a pipeline similar to the following.
Code is developed locally, committed, and pushed to the self-hosted gitlab.
GitLab CI (or any other CI you have configured) will then run CI on your code.
The final step of the CI is to create a "bundle" of your application. This is probably a .zip or similar and this will be pushed to a remote storage location. It is also possible to ensure that this is done only when pushing to specific branches (such as master).
You can use a number of things as your remote storage location, such as some sort of AWS S3 bucket, or something more complex such as Nexus (there are many free alternatives).
You would then want to give your customers access to either this storage location (if you're using something like S3, or Digital Ocean Block Storage, etc), or access to your distribution repository (such as Nexus).
You should be able to generate some sort of SSH key that you can put on your GitLab CI server and use to publish to these places. It should then be a simple case of making an HTTP call to upload a file to the relevant destination. This would often be done only when everything has been successful, and only for specific branches. For example, if all your tests pass and you're on the master branch, zip up all your code and make an HTTP call to push the new zip file to AWS S3, which your customers have access to.
For further ideas, you could make your storage / distribution location an FTP server if you wanted to, or a local network drive, depending on what your needs are for distribution. If you're just dealing with Docker for your customers, then I'd suggest building a Docker image and self-hosting a Docker registry. Push to that registry after you've built the image, and that would be the end of your CI run.
As a side note, if your customers are using Docker you could create a Docker image and either push it to a registry or export it as a .tar and upload it to a file storage location (S3 for example). This would make things simple for your customers and ensure you control the image creation step (if that's something you want to manage).
The GitLab CI docs might help you with the specifics of uploading artifacts to various locations.
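As a rough sketch, that final CI step might be little more than a few shell commands like these (the bucket name, paths, and use of the AWS CLI are assumptions for illustration, not part of the answer above):

    # run only when tests have passed, e.g. on the master branch
    npm run build
    # bundle the build output; adjust the paths to your project layout
    zip -r app-bundle.zip dist/ package.json package-lock.json
    # upload to a location your customers can reach, here an S3 bucket
    aws s3 cp app-bundle.zip "s3://my-release-bucket/app-bundle-$CI_COMMIT_SHORT_SHA.zip"

If you go the Docker route instead, the equivalent final step would be a docker build and docker push to your self-hosted registry, or a docker save to produce an image archive (.tar) that you upload the same way.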

Deploy two VSTS repositories to one Azure web app

Let's say I have an Azure App Service web app at foo.azurewebsites.net. The code for the web app (a simple Node.js server and React frontend) is hosted on VSTS, and a custom deployment script is configured to build and deploy the web app every time code is pushed to the repository's master branch. In other words, the standard web app configuration.
Now, all of my API code (just a Node.js server) is in another repository on VSTS. I'd like to be able to do the following:
Have all requests to foo.azurewebsites.net/api be handled by the API server (an implication of this, which I would nonetheless like to state explicitly, is that the server can ask the browser to set cookies that the web app can then read, and vice versa).
Set up similar continuous deployment for the API server, such that it gets redeployed whenever there are code changes in the API repo.
Be able to maintain the web app and API repositories completely separately.
This seems like a fairly standard scenario... is there an accepted solution? I came across this, but it seems like a pretty hacky way to do it, not to mention the fact that I have no idea what the correct URL is for the webhook for VSTS and can't seem to find any information on it. Also, that example doesn't cover how to deal with point (1) above.
EDIT: Additional clarification
Note that the accepted answer on this question is not what I'm looking for. It describes how to pull from a second repository at deployment time, but not how to have that second repository trigger deployments, or how to handle the fact that the second repository is its own server. Additionally, it introduces a dependency between the two repositories, since the deploy.cmd is presumably under source control in the first repository.
EDIT: Virtual Directories
Thanks to @CtrlDot for pointing out that Virtual Directories are the way to solve (1). Still seeking guidance on (2) and (3).
I think the concept you are referring to is called Virtual Directories
I'm not sure which VSTS task you are using to deploy, but based on the article provided, you should be able to configure it to target only the virtual directory you want to deploy to.
EDIT
Sorry for not being more clear. The AzureRmWebAppDeployment task has a parameter for the virtual application name. You would simply set that in your deployment pipeline for the API project (/api) and leave it blank for the main project.

Simple node.js app deployment on DigitalOcean from GitHub

I have a Node.js application on my GitHub. Right now I am using Heroku for hosting it, but I want to give DigitalOcean a try (the $5/month is more affordable).
I am used to Heroku, where I just create an app > connect it to my GitHub account > deploy from the master branch > boom, app deployed.
When I signed up for DO and started exploring, it seemed like far too many steps to get my app deployed. I researched around to find a simpler way (similar to the one I follow on Heroku), but all the blogs and YouTube videos go through the same tedious process.
I know I am being lazy, but I just want app deployment in a few clicks. Does anyone know a better (smarter) way to deploy my app on DO from GitHub?
It will not be as easy as with Heroku. It is always tempting to use cheaper services like DigitalOcean or Vultr and pay only a fraction of the price (especially using coupon links that can make it free for months - DigitalOcean, Vultr), but having your own VPS means that you need to manage it yourself. Simplifying that process is what you pay for when you're using Heroku. But it doesn't have to be that bad.
Here is a good tutorial on how to do it:
https://www.distelli.com/docs/tutorials/build-and-deploy-nodejs-to-digitalocean/
And see this list of tutorials - search for those with "deploy" in the title:
https://www.digitalocean.com/community/tags/node-js?type=tutorials
Basically you have a few options that I would consider here:
1. A semi-manual deploy with git - you can install a Git server on your VPS and push to it whenever you want to deploy a new version (see the sketch after this list)
2. Automatic deploy with git - you can add a deployment step to your CI scripts that does what you do manually in (1), but only after all tests pass
3. You can trigger a pull from git on the server with SSH or a custom API
4. You can do (3) in your CI scripts
5. You can add a custom webhook in GitHub to notify your server about a new version, and your server can then pull the code and restart
6. You can add a custom webhook in your CI and do the same as in (5)
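For option 1, a minimal sketch of the git-server approach (the paths, deploy user, and use of pm2 are assumptions): keep a bare repository on the droplet and let a post-receive hook check the pushed code out into the web root and restart the app.

    # one-time setup on the droplet
    git init --bare /home/deploy/myapp.git

    # /home/deploy/myapp.git/hooks/post-receive  (make it executable)
    #!/bin/sh
    GIT_WORK_TREE=/var/www/myapp git checkout -f master
    cd /var/www/myapp && npm install --production && pm2 restart myapp

Locally you would add the droplet as a remote (git remote add production deploy@your-droplet:/home/deploy/myapp.git) and deploy with git push production master.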

How can I solve the deployment/updating of dockerized app on my VPS?

It's not easy to make a good title for this question, so if someone has a better idea please edit.
That's what I have:
1. VPS (KVM)
2. Docker
3. nginx-proxy, so all Docker containers that are supposed to be exposed are automatically exposed on the appropriate domain.
4. Some apps, like WordPress, just use a container with attached volumes that are accessible by FTP, so managing/updating them is not an issue.
5. I have a Sails.js app (Node.js) which I have to dockerize. It will be updated quite often.
6. I will have some apps written in C# (ASP.NET) / Java (Spring) with a similar scenario as in point 5.
The source code for both 5 and 6 is stored on Bitbucket, but that can change if a self-hosted Git server would solve these issues better.
What I am looking for is an automated process that builds the Docker image when I commit and makes sure Docker pulls the new image and restarts the container with the new content. I do not want to use Docker Hub, as it only includes 1 private repository, so it will not work long term.
I thought I could do it with Jenkins somehow, but I have no idea how...
You can set up a private GitLab server.
It provides the three necessary things: a Git repository (which you administer yourself), a completely private Docker registry (so you can privately store your own Docker images), and its own CI - complete and sufficient to do what you request, integrated seamlessly and working with the former two.
You would set up a GitLab Runner so that when you commit, the image is rebuilt and pushed to the component-specific registry; there are also hooks and environments that let you set up the connection back to your server.
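As a sketch, the commands the runner would execute might look like this (the registry variables are GitLab's predefined CI variables; the SSH step, host name, and container name are assumptions):

    # build and push the image to the project's private GitLab registry
    docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    docker build -t "$CI_REGISTRY_IMAGE:latest" .
    docker push "$CI_REGISTRY_IMAGE:latest"

    # tell the VPS to pull the new image and recreate the container
    ssh deploy@my-vps "docker pull $CI_REGISTRY_IMAGE:latest && docker rm -f myapp && docker run -d --name myapp --restart always $CI_REGISTRY_IMAGE:latest"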

Development server with Heroku?

I'm building a Node.js application and I'm using Heroku as my host. I want to be able to maybe have a development subdomain for my app where I can test changes before I push it into production. It'd be great if I could do something like git push heroku dev, and access my prerelease code at dev.myapp.heroku.com or something similar. Is something like that possible without having to set up an entirely separate app? If not, how would I configure the toolbelt to push to 2 different Heroku apps from different branches of the same repository?
I usually just create another app for staging
then add the remote for the new app like
git remote add staging https://git.heroku.com/staging.git
then you can push your development branch to the staging master branch like
git push staging development:master
I think it's actually kind of nice to have 2 different apps, so you can try out new settings or add-ons without messing with your production instance.
You can always add dev.yourapp.com as a domain for this app, then point your DNS at it. Here is a link to a detailed way to do that:
https://devcenter.heroku.com/articles/custom-domains#add-a-custom-domain-with-a-subdomain
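For example (the app and domain names are placeholders):

    heroku domains:add dev.myapp.com -a my-staging-app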
Very easy. Use Heroku Pipelines.
You put your Dev app and your Production app in a pipeline (you might want to also add a Staging app in the pipeline - up to you).
Then you push to your dev app from your development environment.
Test everything in the dev app, and when you're satisfied, you simply "Promote" the compiled slug to the next phase in the pipeline.
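If you prefer the CLI over the dashboard, the wiring is roughly as follows (app and pipeline names are placeholders):

    heroku pipelines:create my-pipeline -a my-app-dev --stage development
    heroku pipelines:add my-pipeline -a my-app-prod --stage production
    # once the dev app looks good, promote its slug to the next stage
    heroku pipelines:promote -a my-app-dev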

Resources