Which directory should I check out our Java project files into for a team build? (Linux)

We use SVN (Subversion) for our source repository. On the same box, we build our project and also deploy it onto an app server. All the team members (fewer than 10) will log in to the Linux (Ubuntu Server) box and run the build script.
Question: which directory is typically used as the home for the Subversion checkout and the build? And what permissions should I give so that team members can go into that directory, update the source code (svn update), and run the build script (Ant)?
P.S.: I'm also interested in any relevant best practices.
Thank you,
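For the directory and permissions part of the question, a hedged sketch of one common pattern: a group-owned, setgid checkout directory that every team member belongs to. The group name devteam and the path /srv/build are assumptions, not a convention you must follow:
sudo groupadd devteam
sudo usermod -a -G devteam alice                  # repeat for each team member (alice is a placeholder)
sudo mkdir -p /srv/build
sudo chgrp devteam /srv/build
sudo chmod 2775 /srv/build                        # setgid bit keeps new files group-owned by devteam
svn checkout http://svn.example.com/myproject/trunk /srv/build/myproject   # hypothetical repository URL
Team members may also want umask 0002 so that files they create stay group-writable.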

Sounds like you need a Continuous Integration server. Install Hudson on the server and use that instead.
Hudson will automatically check out changes from Subversion and build them when something is checked in. You can also make it deploy to an app server after a successful build. And you can trigger builds manually if you want, for example for a release.
You'll find it very easy to get started with.
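A hedged way to try it: Hudson ships as a self-contained WAR, so on the Ubuntu box you can run it standalone while you evaluate it (8080 is just the usual default port; a native package or a servlet container is the more permanent setup):
java -jar hudson.war --httpPort=8080    # then browse to http://<server>:8080 and add a job with the SVN URL and an Ant build step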

Related

Concurrently JS application pipeline install and build hangs (Express js for server, Create-React-App for Client)

Problem: I have a project with a server (an Express server that handles file uploading and deleting) and a client (a front-end Create-React-App). The project structure looks as follows:
Root Folder With Server
Client Folder
Each folder has its own package.json: one for the server and one for the client.
I'm trying to build and deploy onto Azure; however, the pipeline hangs on "npm install and build".
It seems like the build succeeds, but this phase just hangs. Here are my server.js file (routes not included) and YAML file, just in case.
I'd appreciate any kind of help. Thank you!
Troubleshooting suggestions:
First make sure the code in GitHub is consistent with your local code; if an exception still occurs, it is recommended to switch to the Linux platform and redeploy.
It is also recommended to recreate the repository as described below, and then check the Action status in GitHub.
Summary:
In general, Linux is more appropriate than Windows on Azure for this. For example, Linux supports npx, and may also support other packages and commands.
When the local code runs normally, there is generally no problem when deploying through GitHub, unless changes were introduced that we have overlooked. So make sure the code is consistent.
General correct deployment steps:
First, in the portal, make sure to create a Web App (not a Static Web App) and select the Node environment.
Make sure that the server program runs normally locally (see the sanity-check sketch after these steps), then create a new repository in GitHub:
git init
git add .
git commit -m "init"
git remote add origin https://github.com/{your name}/newAppname.git
git push -u origin master
Connect the repository in the Portal's Deployment Center.
Then check the status of the Action in GitHub.
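As the sanity check mentioned above, a minimal sketch assuming the client lives in the Client folder and the server entry point is server.js (folder and file names are taken from the question):
cd Client && npm install && npm run build && cd ..   # the client build must finish on its own, without hanging
npm install                                          # server dependencies from the root package.json
node server.js                                       # the Express server should start and stay up locally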

Good practices for pulling from git repo into production server

I have a DigitalOcean VPS running Ubuntu with a few Laravel projects. For each project's initial setup, I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop). I merge develop into my local master, then push master to my online repository.
Now, back on my production server, when I want to bring the changes into production I do a git pull from origin. So far this has resulted in git telling me to stash my changes. Why is this?
What would be the best approach to pull changes into the production server? Keep in mind that my production server has no working directory per se; all I do on the VPS is either clone or push upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free plan for small teams.
You can create a pipeline with a manual deploy step (you press a button after the code is merged to the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp, etc.).
The biggest benefit is that you can have multiple intermediate steps (even for working branches) where you run unit tests, which prevents you from uploading failing builds whenever you merge non-working code.
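For illustration, a hedged sketch of what such a manual deploy step might run, here using rsync (the host, user, paths, and excludes are assumptions that depend on your project):
rsync -az --delete --exclude='.env' --exclude='storage/' ./ deploy@prod.example.com:/var/www/app/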
For the first problem, do a git status on production to see which files git considers changed or added, and consider adding them to your .gitignore file (which itself should be a part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them in the process of upgrading Laravel.
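A minimal sketch of that check; the ignore entries shown are common Laravel defaults and are assumptions about this particular repo:
git status --short                    # see what git considers modified or untracked on production
echo "/vendor" >> .gitignore          # typical Laravel ignores; confirm against your project
echo "/node_modules" >> .gitignore
echo ".env" >> .gitignore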
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertible. For this, I would recommend choosing a deployment utility. These usually do pretty much the same thing (sketched after the list below):
You define deployment parameters in code, which you can commit as a part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSH's into your target server and pulls the latest code from the remote git repo (authorized via SSH key forwarded into the server) into a 'release' folder.
The script runs any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.).
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link to the previous release folder and your site still works.
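For illustration, a hedged sketch of that release/soft-link pattern; all paths and the repository URL are assumptions, and real utilities add error handling and logging around each step:
RELEASE=/var/www/app/releases/$(date +%Y%m%d%H%M%S)
git clone --depth 1 git@github.com:you/app.git "$RELEASE"      # hypothetical repository
cd "$RELEASE" && composer install --no-dev && npm install && npm run prod
ln -s /var/www/app/shared/.env "$RELEASE/.env"                 # shared files survive releases
ln -sfn "$RELEASE" /var/www/app/current                        # document root points at "current"; re-point it to roll back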
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: A free, tried-and-tested, popular Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. It's easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require Ruby.
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.

How to deploy Go program from windows to CentOS server

I have a Go package running on Windows and it is working fine, but now I'm at the stage where I would like to test it on a production CentOS 6.5 server.
What is the best practice to deploy this from Windows to CentOS?
Would I have to use my Git repo to get the code onto the Linux operating system, compile it there, and then deploy the binary to the server?
Also, I have multiple files, so I would imagine go build *.go would suffice, or are there better options for compilation?
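For the build step itself, a hedged sketch of cross-compiling for Linux (assumes Go 1.5 or later, where setting GOOS/GOARCH is enough; on Windows cmd, set the variables with set first; the binary name and server path are assumptions):
GOOS=linux GOARCH=amd64 go build -o myapp .    # build the main package in the current directory
scp myapp user@centos-box:/opt/myapp/          # copy the static binary to the server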
As far as best practices go, I would recommend using continuous integration. You can set up Jenkins, or there are some cloud options out there: codeship.io, travis-ci.org, drone.io, wercker.com, ... Some of them have free plans available.
Basically, you'd commit your code to git and push it out to GitHub (or Bitbucket if you want free private repos). The continuous integration server will be notified whenever you push changes, and will build, test, and create a release tar archive of your project. You can then take the resulting tar and download it to your CentOS box. On 6.5 you'll need to create an init.d script to keep your program up and running. You can see an example here (the System V script).
CentOS 7 uses systemd now, which would be slightly easier to set up.
Taking this one step further, it's also possible to set up continuous deployment, in which the download, extraction, and installation are also automated. Depending on your project, it may or may not make sense (auto-pushing to production might be a little too automatic). You can find an example for wercker here.
Although there is an up-front cost to setting up continuous integration, if this is a project that other people will contribute to, or one that you intend to work on long-term, the cost will definitely be worth it. (Future you will be grateful when you come back to this project six months from now, change one line of code, and don't have to remember all the manual steps it took to deploy.)
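A hedged sketch of that last-mile step on the CentOS box; the archive URL, install path, and service name are assumptions, and the init.d script itself comes from the linked example:
curl -L -o myapp.tar.gz https://ci.example.com/builds/myapp-latest.tar.gz   # hypothetical CI artifact URL
sudo mkdir -p /opt/myapp && sudo tar -xzf myapp.tar.gz -C /opt/myapp
sudo service myapp restart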

How to run the publish/clickonce build step in Jenkins (VS2012)

We have a simple C# solution (VS 2012) that has a publish step / ClickOnce wizard that uses FTP.
I've set up a Jenkins build project to build this on an SVN trigger (via MSBuild).
I have NOT been able to get it to build the publish/ClickOnce installer (via MSBuild) and upload it to my server. I have looked around and searched, but I see no way to do this. It seems silly that this would be a manual step.
Hopefully this is something simple that I am overlooking.
Any command-line app would be suitable, or if there are scripts that can do the same thing that VS 2012 does in the wizard, that is fine.
I guess you need this then:
msbuild /target:publish
See more here: Building ClickOnce Applications from the Command Line.
This will create a "publish" folder, which you then copy to your server or network share (whatever you are using to distribute your app).
Another thing you have to take care of is increasing the version before the build; eventually you need to update the .csproj file.
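As a hedged example, the version can also be passed on the command line instead of editing the file by hand, assuming the Jenkins Subversion plugin exposes SVN_REVISION and the project uses the standard ClickOnce ApplicationVersion property (solution name and version scheme are placeholders):
msbuild MyApp.sln /target:publish /p:Configuration=Release /p:ApplicationVersion=1.0.0.%SVN_REVISION%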

Maven - copying properties file from SVN to Linux based App Server machines

We are using Maven and Jenkins for our automated build and deployment needs. Our build engineer has left, and it is now up to me (the Java architect) to implement the few remaining pieces. I have tried a lot of things to resolve the issue we are having. The problem statement is:
We have made a separate project in Eclipse to store properties files. The developers check the properties files into SVN whenever they change them. Now we want Maven, when triggered to do a deploy, to do the following:
1. Take the latest properties files from SVN, from the project used to store them.
2. Copy them onto the Linux-based JBoss app server's /conf/ folder.
3. Carry on with its deployment task.
We would like a solution to points 1 and 2 above.
I don't know the exact answer, but it is quite doable. A quick Google search did not turn up any SVN-related plugin to retrieve properties, but you can always write your own Maven plugin for that task. For example, if you want to retrieve a properties file from an SVN location to the local file system, just write a simple Maven plugin [1] using SVNKit [2].
For point 2, you can use the wagon-maven-plugin [3] to transfer any artifact to a destination. Given that it supports SCP, I would go with that (just like doing an scp to a remote Linux machine).
HTH.
[1] http://maven.apache.org/guides/plugin/guide-java-plugin-development.html
[2] http://svnkit.com/
[3] http://mojo.codehaus.org/wagon-maven-plugin/usage.html
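If a plain Jenkins shell step is acceptable instead of a Maven plugin, points 1 and 2 can also be scripted directly; a hedged sketch with assumed repository URL and server paths:
svn export --force http://svn.example.com/config/trunk/app.properties app.properties   # hypothetical SVN location
scp app.properties jboss@appserver.example.com:/opt/jboss/server/default/conf/         # hypothetical app server path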
