Maven Security Concerns - security

Are there security concerns with using Maven? I use Ant today for my main project, but I do use Maven for my "samples" project where I write program spikes. I do like some parts of Maven, but have a concern with downloading my jars through the tool. Is this an unfounded concern? How secure is "http://repo1.maven.org/maven2/"? Is there a more secure way of using the tool?
Thanks.

It's pretty secure and standard. If the security of http://repo1.maven.org/maven2/ were ever compromised, the repercussions would be felt across the entire Java community. I have never heard of that site being hacked.
That said, you are not bound to the default repository. You can set up your own repository manager with Nexus or Artifactory and install vetted artifacts into it manually; you can also block remote repositories through the Nexus/Artifactory settings. I have never needed to do this myself, but have a look here; it appears to be possible.
Please note that you will have to explicitly block "repo1" in that setup as well, otherwise Maven will keep downloading artifacts from it by default.
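To make the "install vetted artifacts manually" part concrete, here is a minimal sketch using the standard deploy:deploy-file goal; the URL, repository id and artifact coordinates are placeholders for whatever your own Nexus/Artifactory instance uses.

    # Publish a jar you have verified yourself to an internal repository manager.
    # The credentials for "internal-releases" belong in a matching <server> entry
    # in settings.xml. All values below are placeholders.
    mvn deploy:deploy-file \
        -Dfile=log4j-1.2.17.jar \
        -DgroupId=log4j \
        -DartifactId=log4j \
        -Dversion=1.2.17 \
        -Dpackaging=jar \
        -Durl=https://nexus.example.com/repository/internal-releases/ \
        -DrepositoryId=internal-releases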

Related

Docker security scan detects vulnerability in gradle 7.4.1

Creating a Docker image with Gradle 7.4.1 triggers the security scan, which shows vulnerability CVE-2020-36518. How can this particular jar file within the Gradle package be updated?
I would just reject the security issue, explaining that it is not possible to exploit the vulnerability: the Gradle build runs in isolation on controlled input and is not accessible to any potential attackers.
(Assuming this is the case, of course, and you don't have a custom Gradle plugin that reads untrusted JSON documents using Jackson from the Gradle classpath. But even then, all you are risking is a denial of service on the build.)
Fiddling around with jar files inside external tools could easily lead to problems that are hard to debug later. But if you like, you could open an issue with the Gradle project, asking if they could bump the Jackson version to avoid unnecessary noise from security scans like this. There is an example of that here.

Use SVN in opencms to keep versioning of site content

I am working with OpenCms 10.5.4. I would like to use SVN to keep my site contents under version control so that editors can commit their changes. I have read a lot about mounting the VFS and so on, but with that approach everyone could access and commit other people's changes.
I would appreciate any further information about using versioning not only for module development but also for site contents.
I know this is probably not exactly what you are looking for, but you might take a look at the OpenCms Maven plugin.
The main focus of this plugin is to ease the development process of OpenCms projects, but with a good setup and Maven configuration you could use it to put the contents of your production site under version control. If you are interested, you'll find more information on the plugin's official website.

Do I need to be careful which Maven Repositories I hook into?

Generally speaking, should one only add the central Maven repository to a pom.xml, plus optionally any local Maven repositories? In theory (I think?) anybody can set up a repository - is there a 'Maven Repository<->Maven Repository' circle of trust or something?
How do I know, for instance, that I'm really downloading (say) the genuine compiled log4j JARs and not some bastardized / evil version?
A few things you can do to feel comfortable:
Use a local repository manager like Nexus or JFrog, and proxy any repositories that you want to use. There are a few benefits to this:
A local manager can keep track of the SHA hashes to make sure that a jar didn't change under your feet.
You can limit the repositories that your developers can access.
Stick with Maven Central when you can - so many people use it that if someone switched out the log4j version for something untrustworthy, everyone would know very quickly (because the hashes wouldn't line up). Generally this argument also holds true for any other repository that hosts popular libraries (e.g. SourceForge, Google Code, Codehaus, etc.).
Things are only likely to get risky if you're using the personal repo of some dude who wrote a library that's not very popular out in the wild. In practice this rarely happens, and in those cases you can always build the code yourself to be sure.
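If you ever want to spot-check that "the hashes line up" by hand, Central publishes a .sha1 file next to every artifact, so a comparison like the following works (the artifact, version and local path are just an example):

    # SHA-1 of the jar Maven actually downloaded
    sha1sum ~/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar
    # SHA-1 published alongside the artifact on Central
    curl -s https://repo1.maven.org/maven2/log4j/log4j/1.2.17/log4j-1.2.17.jar.sha1
    # the two values should be identical; if not, the jar was corrupted or tampered with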
Best practice is not to add any repositories to the pom.xml at all. Configure them in settings.xml instead, or better still, use a repository manager. If you don't have a repository manager, the best thing is to work with Maven Central, and for that you don't need to configure anything, because Maven Central is the default repository built into Maven itself. Maven Central is administered by the people at Sonatype, and it is not that simple to get something into it. What you can do to secure the transport a little more is to turn on checksum checking, which is controlled by a setting in settings.xml (see the sketch below).
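As a rough illustration, the quick command-line version of that check is Maven's strict-checksums flag; the permanent form is a <checksumPolicy>fail</checksumPolicy> element on the repository entries in settings.xml.

    # Fail the build if any downloaded artifact's checksum does not match
    mvn --strict-checksums clean install
    # short form
    mvn -C clean install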

How to set up git and maven to work together?

I'm new to both of these tools, and I'm also very new to Linux system administration, so I apologize ahead of time for what may seem like a total n00b question.
Basically, I'm starting a whole new project from scratch. Yaaay! Exciting! However, I'm a little lost on how to set up the project. I've installed both git and maven on my dev machine and run through some tutorials. I've also set up git on my server, and have successfully pushed code to it and pulled code from it.
So, first question : Is it even a good idea to use git and maven together? Git seems like the best source control system, and Maven seems like the best build system. Are they known to work well together? Or am I needlessly creating trouble for myself at this early (and precarious) stage of the project? I've used ant enough to know that I don't want to use it, and I'm not really a fan of svn, although I'll use it if I have to.
Second question : Given that these two tools work well together, what's the Best Practices way of setting them up? I know that git is "peer-to-peer", although I suppose nothing is stopping you from setting up a single repository for the git user and having all the devs sync up with that repo when it's time to do a build. Is that the right way to go? How about Maven? Maven seems kinda single-user oriented. Like, everybody sets up Maven on their own machine and has their own Maven repo, right? Or wrong? Would it make sense to create a "Maven user" on my server, and have that user do all my builds from the "main" git repo?
Apologies if I'm totally mistaken on how to use these tools. As I said, I'm pretty new to these things. Any help you have is appreciated.
(also, I'm working on Linux, doing Java dev work in Eclipse, using Spring for the framework, MySQL for the data store, and Hibernate as an ORM. Don't know if any of that matters)
Thanks!
Q1: Yes, git will work well with any build system. Any modern build system is well abstracted from your VCS. Ensure that you set up your .gitignore file so that you are not tracking any build artifacts.
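As a minimal sketch, a Maven-oriented .gitignore usually boils down to something like this; the IDE entries are just common examples, so adjust them to whatever your team uses.

    # ignore Maven build output, compiled classes and typical IDE metadata
    printf '%s\n' 'target/' '*.class' '.classpath' '.project' '.settings/' '.idea/' '*.iml' >> .gitignore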
Q2: The best practice is to have an integration branch to build from. While developing, use topic or feature branches; when ready, merge into the integration branch and push that up to the central repository that Maven builds from. Google "git-flow" for more ideas. If you are working on a team, you generally want a central build server to ensure everyone's code is built on the same machine; this matters less if you are the only developer.
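A rough sketch of that flow, assuming an integration branch literally named "integration" and a purely illustrative feature branch:

    # start a topic branch off the integration branch
    git checkout -b feature/login integration
    # ...commit work on the topic branch...
    # merge it back and publish it so the build machine can pick it up
    git checkout integration
    git merge --no-ff feature/login
    git push origin integration
    # the build server then builds this branch, e.g. with: mvn clean install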
Hope this helps.

Deploy repository code to multiple machines at once

My question is: how do you guys deploy the same code from whatever [D]VCS you use to multiple machines? Do you have an automated deployment system, and if so, what is it? Is it built in-house? Are there any tools out there that can do this automatically? I am asking because I am pretty bored of updating up to 20 machines every time I make some modifications.
P.S.: This probably belongs on ServerFault, but I am asking here because I am thinking of writing my own custom-made deployment system.
Roll your own rpm/deb/whatever for your package, set up your own repo, and have your machines pull from it on a regular basis. It's really not that hard to do, it's already built into your system, it's well tested, and it's loaded with features. You could use something like Func if you needed to push instead.
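For the deb route, a bare-bones package can be rolled with nothing more than dpkg-deb; the package name, paths and maintainer below are placeholders.

    # stage the files the way they should land on the target machines
    mkdir -p myapp_1.0-1/DEBIAN myapp_1.0-1/opt/myapp
    cp -r build/output/* myapp_1.0-1/opt/myapp/
    # minimal control file describing the package
    printf '%s\n' \
        'Package: myapp' \
        'Version: 1.0-1' \
        'Architecture: all' \
        'Maintainer: Ops Team <ops@example.com>' \
        'Description: In-house app deployed via the internal apt repository' \
        > myapp_1.0-1/DEBIAN/control
    dpkg-deb --build myapp_1.0-1
    # publish myapp_1.0-1.deb to your repo; the machines pick it up on their
    # next scheduled "apt-get update && apt-get upgrade"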
Depending on your situation, deploying straight from the version control system might not always be the best idea. You can only do so much by just updating files, and mixing deployment and development will probably make the development use of the version control system less free.
I see two alternatives that might be interesting.
Deploy from your continuous integration server. (Add a task that runs after every successful build, copies over the files and executes some remote commands. I'm using this to deploy to a test server, but I would find it too tricky to upgrade production that way.)
Deploy using an existing package manager. You can set up your own apt (or equivalent) repository and package the updates for it. Have your continuous build system build the packages, but let an admin decide whether they should be pushed to the update server. I think this is the only safe solution for production machines.
We use Capistrano for deployment & Puppet for maintaining the servers and avoiding the inevitable 'configuration drift' when many developers/engineers tinker with the package lists and configuration files.
Both of these programs are written in Ruby, but we use them for our PHP codebase stored in a git repository.
I use a combination of deb packages and Puppet to deploy code to and configure a bunch of machines.
In most projects I have been involved with, the final stage has always been a scripted rsync deployment to the live environment, so the multiple targets are built into this process.
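For what it's worth, such a script can be as small as a loop over the target hosts; the hostnames, user and path here are placeholders.

    # mirror the checked-out tree onto every target host
    for host in web01 web02 web03; do
        rsync -az --delete --exclude '.git' ./ "deploy@${host}:/var/www/myapp/"
    done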
