How to set up git and maven to work together? - linux

I'm new to both of these tools, and I'm also very new to Linux system administration, so I apologize ahead of time for what may seem like a total n00b question.
Basically, I'm starting a whole new project from scratch. Yaaay! Exciting! However, I'm a little lost on how to set up the project. I've installed both git and maven on my dev machine and run through some tutorials. I've also set up git on my server, and have successfully pushed code to it and pulled code from it.
So, first question: Is it even a good idea to use git and maven together? Git seems like the best source control system, and Maven seems like the best build system. Are they known to work well together? Or am I needlessly creating trouble for myself at this early (and precarious) stage of the project? I've used ant enough to know that I don't want to use it, and I'm not really a fan of svn, although I'll use it if I have to.
Second question: Given that these two tools work well together, what's the Best Practices way of setting them up? I know that git is "peer-to-peer", although I suppose nothing is stopping you from setting up a single repository for the git user and having all the devs sync up with that repo when it's time to do a build. Is that the right way to go? How about Maven? Maven seems kinda single-user oriented. Like, everybody sets up Maven on their own machine and has their own Maven repo, right? Or wrong? Would it make sense to create a "Maven user" on my server, and have that user do all my builds from the "main" git repo?
Apologies if I'm totally mistaken on how to use these tools. As I said, I'm pretty new to these things. Any help you have is appreciated.
(also, I'm working on Linux, doing Java dev work in Eclipse, using Spring for the framework, MySQL for the data store, and Hibernate as an ORM. Don't know if any of that matters.)
Thanks!

Q1: Yes, git will work well with any build system. Your VCS is usually well abstracted away from any modern build system. Make sure you set up your .gitignore file so that you are not tracking any build artifacts.
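For a typical Maven project (with Eclipse in the mix, as in your case), a minimal .gitignore might look something like this; the entries below are just the usual suspects, so adjust for your own setup:

    # Maven build output
    target/

    # Eclipse metadata
    .classpath
    .project
    .settings/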
Q2: The best practice is to have an integration branch to build from. While developing, use topic or feature branches; when a feature is ready, merge it into the integration branch and push that up to the central repository that Maven builds from. Google "git-flow" for more ideas. If you are working on a team, you generally want a central build server so that everyone is building on the same machine; this matters less if you are working alone or with just one other developer.
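As a rough sketch of that workflow (the branch names here are just examples):

    # start a topic branch off the integration branch
    git checkout -b feature/login develop

    # ...work, commit...

    # when the feature is ready, merge it back and push
    git checkout develop
    git merge --no-ff feature/login
    git push origin develop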
Hope this helps.

Related

I have a self-hosted GitLab instance running, but how does the frontend work?

I set up a self-hosted GitLab instance and it's working fine; my problem right now is that I don't really understand how the frontend works, mostly because I've been focusing on the backend and because I couldn't find documentation about it either. I want to understand how I can comment out things I don't want to show to the user, change aspects of the design and text, and generally have control over the frontend.
I'm running on Debian 9; the setup was done with Bitnami on a Google Cloud VM. As far as I understand, I have to manually change the files I want, but I really don't understand the structure of this type of frontend.
What language do I need to know here, where should I find the documentation, how do I find the correct directories and files, etc.?
While GitLab doesn't officially support any type of "custom frontend", what you can do is:
Fork GitLab
Use the GitLab Development Kit to implement your changes
Run a Source Install of your fork
The frontend is mostly written in HAML (for the server-side bits) and Vue.js (for the client-side bits).
Note: Even an Omnibus install copies raw Ruby and JavaScript files somewhere, and since they're physically on the system, they can be manually manipulated and hot-patched, but that's not really a sustainable way of introducing changes to the codebase.
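If you go the GDK route, the documented bootstrap is roughly the following (check the GDK README for the current steps, and point it at your fork rather than the upstream repository):

    gem install gitlab-development-kit
    gdk install        # clones and sets up the GitLab components
    gdk start          # runs the development instance locally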

Sharing Meteor APIs/schemas across multiple projects (git repos)

I am working on a project that has a few different code-bases (mostly Meteor), but they all use some of the same API code (schemas, publications and methods).
It doesn't seem very intuitive to have one repository and use sub-trees/sub-modules in my case. Maybe I'm wrong and somebody could help clear it up.
An example structure of my project:
project-landing-page (meteor app for collecting leads and offering basic account management)
project-app (an angular/ionic/cordova/meteor app that is distributed via the app store)
project-worker (a set of cron-like scripts that are executed in the background to manage the data in the mongo instance)
They all share the same schemas, and the two meteor apps use the same methods and publications. It seems a bit cluttered to have one repo for all of this code. Making a branch for the app would also branch the code for the worker scripts. That just seems messy.
Would it be okay to have another repo called "project-apis" that provides the shared code and could be cloned into the other projects? What are the drawbacks? Other than having to run git pull when the "project-apis" repo is updated, I can't really see any.
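Concretely, what I'm picturing is something like this (the URL and paths are made up):

    # clone the shared code into each consuming project
    cd project-app
    git clone git@example.com:me/project-apis.git api

    # later, when project-apis has been updated
    cd api && git pull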
Would any git-wizards be able to chime in?
Thanks!

Can I add two independent Maven projects to one project in GitLab?

I have a GitLab project, and I want to store multiple (logically related) Maven projects in it. Would that be okay in a single GitLab project?
A GitLab project is just a Git repository with some (very nice!) bells and whistles attached. There's no hard requirement for the entire project to produce one artifact, have a single build process, or even have a build process at all.
The recommended best practice is indeed to have a single Maven project per GitLab project in order to better utilize GitLab's CI tools, but that is not a requirement.
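For example, a single .gitlab-ci.yml can still build each Maven project as its own CI job; the project names and paths below are purely illustrative:

    # .gitlab-ci.yml (sketch)
    stages:
      - build

    build-project-a:
      stage: build
      image: maven:3-jdk-8
      script:
        - mvn -f project-a/pom.xml clean verify

    build-project-b:
      stage: build
      image: maven:3-jdk-8
      script:
        - mvn -f project-b/pom.xml clean verify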

Do I need to be careful which Maven Repositories I hook into?

Generally speaking, should one only add the central Maven repository to a pom.xml, plus optionally any local Maven repositories? In theory (I think?) anybody can set up a repository - is there a 'Maven Repository<->Maven Repository' circle of trust or something?
How do I know for instance that I'm really downloading (say) the log4j compiled JARs and not some bastardized / evil version ?
A few things you can do to feel comfortable:
Use a local repository manager like Nexus or JFrog, and proxy any repositories that you want to use. There are a few benefits to this:
A local manager can keep track of the SHA hashes to make sure that a jar didn't change under your feet.
You can limit the repositories that your developers can access.
Stick with Maven Central when you can - so many people use it that if someone switched out the log4j version with something untrustworthy, everyone would know very quickly (because the hashes wouldn't line up). Generally this argument also holds true for any other repositories that host popular libraries (e.g. SourceForge, Google Code, Codehaus, etc.).
Things are only likely to get risky if you're using the personal repo of some dude who wrote a library that's not very popular out in the wild. In practice this rarely happens. In those cases, maybe you can just build the code yourself to be sure.
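For example, pointing Maven at a repository manager is typically done with a mirror entry in settings.xml; the URL below is a placeholder for your own Nexus/Artifactory instance:

    <!-- ~/.m2/settings.xml (sketch) -->
    <settings>
      <mirrors>
        <mirror>
          <id>internal-repo</id>
          <name>Internal repository manager</name>
          <url>https://nexus.example.com/repository/maven-public/</url>
          <mirrorOf>*</mirrorOf>
        </mirror>
      </mirrors>
    </settings>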
Best practice is not to add any repositories to the pom.xml at all. Configure them in settings.xml instead, or better still, use a repository manager. If you don't have a repository manager, the best thing is to work with Maven Central, and for that you don't need to configure anything, because Maven Central is the default within Maven itself. Maven Central is administered by the people at Sonatype, and it is not that simple to get something into it. What you can do to secure the transport a little bit more is to turn on checksum checking, which is controlled by a configuration in settings.xml.
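Besides the settings.xml option mentioned above, checksum checking can also be forced per build from the command line:

    # fail the build if a downloaded artifact's checksum doesn't match
    mvn --strict-checksums clean install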

Deploy repository code to multiple machines at once

My question is: how do you guys deploy the same code from whatever [D]VCS you use to multiple machines? Do you have an automated deployment system, and if so, what is it? Is it built in-house? Are there any tools out there that can do this automatically? I am asking because I am pretty tired of updating up to 20 machines every time I make some modifications.
P.S.: This probably belongs on ServerFault, but I am asking here because I am thinking of writing my own custom-made deployment system.
Roll your own rpm/deb/whatever for your package, set up your own repo, and have your machines pull on a regular basis. It's really not that hard to do, and it's already built into your system, well tested, and loaded with features. You could use something like Func if you needed to push instead.
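A rough sketch of that approach, using fpm as one convenient way to build the package (the package name, version, and schedule are all made up):

    # build a deb from a directory of files
    fpm -s dir -t deb -n myapp -v 1.0.0 -C ./build .

    # on each target machine, a cron entry pulls the latest version periodically
    */15 * * * *  apt-get update -qq && apt-get install -y myapp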
Depending on your situation, deploying straight from the versioning system might not always be the best idea. You can only do so much by just updating files, and mixing deployment and development will probably make development use of the versioning system less free.
I see two alternatives that might be interesting.
Deploy from your continuous integration server. (Add a task that runs after every successful build, copies files over, and executes some remote commands. I'm using this to deploy to a test server, but would find it too tricky to upgrade production in such a way; see the sketch after this list.)
Deploy using an existing package manager. You can set up your own apt (or equivalent) repository and package the updates for it. Have your continuous build system build the apt packages, but let an admin decide whether they should be pushed to the update server. I think this is the only safe solution for production machines.
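A sketch of the first alternative, as a post-build step on the CI server (the host name, paths, and service name are made up):

    # copy the build output to the test server and restart the service
    rsync -az --delete build/ deploy@testserver:/var/www/myapp/
    ssh deploy@testserver 'sudo service myapp restart'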
We use Capistrano for deployment & Puppet for maintaining the servers and avoiding the inevitable 'configuration drift' when many developers/engineers tinker with the package lists and configuration files.
Both of these programs are written in Ruby, but we use them for our PHP codebase stored in a git repository.
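A minimal Capistrano (version 2 style) config for that looks roughly like this; the application name, repository URL, and hosts are made up:

    # config/deploy.rb (sketch)
    set :application, "myapp"
    set :scm,         :git
    set :repository,  "git@example.com:me/myapp.git"

    role :web, "web1.example.com", "web2.example.com"

    # then deploy with:  cap deploy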
I use a combination of deb packages and Puppet to deploy code and configure a bunch of machines.
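On the Puppet side, keeping a machine on the newest packaged version can be as small as this (the package name is made up):

    # keep the app at the newest version available from the apt repo
    package { 'myapp':
      ensure => latest,
    }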
In most projects I have been involved with, the final stage has always been a scripted rsync deployment to live, so the multiple targets are built into that process.
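A bare-bones version of such a script is little more than a loop over the targets (the host list and paths are made up):

    #!/bin/sh
    # push the exported release to every live host
    for host in web1 web2 web3; do
        rsync -az --delete ./release/ deploy@"$host":/srv/myapp/
    done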
