I have a pipeline where I check out code from git, build artifacts, and publish the artifacts to Nexus.
Now, before I deploy the artifacts, I want to scan them for vulnerabilities. How can I achieve this? Are there tools available for it?
What you describe is essentially the use case for the Nexus Lifecycle and Nexus Firewall tools. They scan artifacts as they enter your repository manager.
I suggest reading more about them here:
https://www.sonatype.com/product-nexus-lifecycle
https://www.sonatype.com/product-nexus-firewall
Note that neither of them is free; they both require a license.
Nexus is too late. You should scan the artifacts in Jenkins, while you have both the sources and the binaries at your disposal. There are tutorials on how to set up something like a Jenkins-SonarQube integration.
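As an illustration, a minimal sketch of such a scan as a shell build step in Jenkins, assuming a SonarQube server at sonar.example.com and an auth token exposed to the job as SONAR_TOKEN (both are placeholders):

```sh
# Run the SonarQube analysis right after the build, while both the
# sources and the compiled binaries are available in the workspace.
mvn clean verify sonar:sonar \
    -Dsonar.host.url=https://sonar.example.com \
    -Dsonar.login="$SONAR_TOKEN"
```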
Imagine the following: you have scanned an artifact and you see vulnerabilities. What do you do then? Are you willing to remove it from the repository? Imagine the problems you would generate that way. And the dev team will say that they never saw the results of your scan, so they could not act upon them, and now this artifact is about to be deployed...
After some R&D, I found a possible solution. The answer provided by @joedragons is also useful. I guess JFrog Xray is also a good solution.
https://jfrog.com/xray/
JFrog Xray is a continuous security and universal artifact analysis tool, providing multilayer analysis of containers and software artifacts for vulnerabilities, license compliance, and quality assurance. Deep recursive scanning provides insight into your components graph and shows the impact that any issue has on all your software artifacts.
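As a rough sketch, once a build has been published to Artifactory together with its build-info, the JFrog CLI can ask Xray to scan it (the build name and number below are placeholders):

```sh
# Trigger an Xray scan of build "my-build", number 42; the command can
# fail the pipeline when policy violations are found, making it usable
# as a deployment gate.
jfrog rt build-scan my-build 42
```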
I've requested to my Team Lead that we start integrating a CI/CD pipeline into most, if not all, of our projects. Our newest project relies heavily on our own external class library, which is referenced in the solution; it sits under "Dependencies" as a project reference.
The project runs fine when I build it in my machine using Visual Studio 2019, and before we needed to integrate an external library, it would build and release fine using our Azure DevOps pipelines.
However, with the addition of an external class library, when I try to run a build through Azure DevOps, I get the following error:
The project file ....csproj was not found.
I fully understand why it can't find it - because I need to pull in the external class library and build that first! There doesn't seem to be much online material (not that I could find, anyway!) that describes solutions to this other than "use NuGet"; unfortunately, my Team Lead requires that we not go down that route - which has led to a long couple of days!
With this in mind, I can't find another way to do this in Azure DevOps. I have looked into some sort of PowerShell command, but to no avail so far.
Has anyone run into this issue before with external class libraries in DevOps and can give me advice on the best way to approach it?
Generally speaking, in 99.99% of cases keeping a direct reference to the project is not a good idea. You can end up with really unmaintainable CI/CD logic and/or DLL version mismatches during deployments. I am actually an architect on a project where I fixed that issue by migrating all dependencies to a NuGet server.
Azure Artifacts
You mentioned that you are using Azure DevOps as your main CI/CD tool, so this is a great opportunity to introduce Azure Artifacts, an internal NuGet server that is part of Azure DevOps. It is free for the first 2 GB; here you have the pricing details.
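A minimal sketch of the publishing side in pipeline YAML - the feed name "internal-packages" and the project path are only examples:

```yaml
steps:
- task: DotNetCoreCLI@2
  displayName: Pack the shared class library
  inputs:
    command: pack
    packagesToPack: 'src/SharedLibrary/SharedLibrary.csproj'
- task: NuGetCommand@2
  displayName: Push the package to Azure Artifacts
  inputs:
    command: push
    packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
    nuGetFeedType: internal
    publishVstsFeed: 'internal-packages'
```

The consuming project then restores the library from the feed like any other NuGet package instead of referencing the .csproj directly.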
Alternatives
If for some reason you can't use Azure Artifacts, I recommend some alternatives:
MyGet
ProGet
Your own NuGet server
You can find more information about these alternatives in this article.
I am looking for a solution to implement security-scanning of the application code-base at the time of a build. The idea is to capture a list of security vulnerabilities early in the software development life cycle.
I have a simple java project which uses a maven build. The java project specifies a number of .jar dependencies and comes up with a .war file as a build output.
I came across (and was able to configure) the dependency-check maven plugin (http://jeremylong.github.io/DependencyCheck/dependency-check-maven/index.html). However, though it scans the dependency jars and comes up with a vulnerability report, it doesn't seem to scan the final artifact - which in my case is the .war file.
How do I ensure that the .war is scanned as well? Is the dependency-check plugin the right tool for this?
dependency-check isn't the right tool for checking your own code. It uses a list of known vulnerability reports to determine whether any of your dependencies have known flaws; it does not do an active scan of the code. See the plugin wiki.
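For reference, a typical way to wire the plugin into a Maven build looks roughly like this (the version number and the CVSS threshold are only examples):

```xml
<!-- pom.xml: fail the build when a dependency has a known CVE
     with a CVSS score of 7 or higher. -->
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <version>8.4.0</version>
  <configuration>
    <failBuildOnCVSS>7</failBuildOnCVSS>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```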
For checking your own code, HP's Fortify is a decent commercial solution, but if you are working in more of a DIY software setting, I would recommend Sonar. There are certainly many static code analysis tools out there. All have advantages and disadvantages.
I'm new to both of these tools, and I'm also very new to Linux system administration, so I apologize ahead of time for what may seem like a total n00b question.
Basically, I'm starting a whole new project from scratch. Yaaay! Exciting! However, I'm a little lost on how to set up the project. I've installed both git and maven on my dev machine and run through some tutorials. I've also set up git on my server, and have successfully pushed code to it and pulled code from it.
So, first question: Is it even a good idea to use git and maven together? Git seems like the best source control system, and Maven seems like the best build system. Are they known to work well together? Or am I needlessly creating trouble for myself at this early (and precarious) stage of the project? I've used ant enough to know that I don't want to use it, and I'm not really a fan of svn, although I'll use it if I have to.
Second question: Given that these two tools work well together, what's the Best Practices way of setting them up? I know that git is "peer-to-peer", although I suppose nothing is stopping you from setting up a single repository for the git user and having all the devs sync up with that repo when it's time to do a build. Is that the right way to go? How about Maven? Maven seems kinda single-user oriented. Like, everybody sets up Maven on their own machine and has their own Maven repo, right? Or wrong? Would it make sense to create a "Maven user" on my server, and have that user do all my builds from the "main" git repo?
Apologies if I'm totally mistaken on how to use these tools. As I said, I'm pretty new to these things. Any help you have is appreciated.
(also, I'm working on Linux, doing Java dev work in Eclipse, using Spring for the framework, MySQL for the data store, and Hibernate as an ORM. Don't know if any of that matters)
Thanks!
Q1: Yes, git will work well with any build system. Your VCS is usually well abstracted away from any modern build system. Ensure that you set up your .gitignore file so that you are not tracking any build artifacts.
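For a Maven project developed in Eclipse, a minimal sketch might look like this:

```
# .gitignore: keep Maven build output and IDE noise out of version control
target/
*.class
*.war
.settings/
.classpath
.project
```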
Q2: The best practice is to have an integration branch to build from. While developing, use topic or feature branches; when ready, merge into the integration branch and push that up to the central repository that Maven can build from. Google "git-flow" for more ideas. If you are working on a team, you generally want a central build server to ensure everyone is building on the same machine; that matters less if you are the only developer.
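A minimal sketch of that flow on the command line (the branch names are just examples):

```sh
# Work on a feature branch, then merge it into the integration branch.
git checkout -b feature/login develop
# ...commit work...
git checkout develop
git merge --no-ff feature/login
git push origin develop   # the build server builds from 'develop'
```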
Hope this helps.
Are there security concerns with using Maven? I use Ant today for my main project, but I do use Maven for my "samples" project where I write program spikes. I do like some parts of Maven, but have a concern with downloading my jars through the tool. Is this an unfounded concern? How secure is "http://repo1.maven.org/maven2/"? Is there a more secure way of using the tool?
Thanks.
It's pretty secure and standard. If the security of http://repo1.maven.org/maven2/ were ever compromised, there would be big repercussions across the Java world. I have never heard of this site being hacked.
That said, you are not bound to the default repository. You can configure your own repository using Nexus or Artifactory and manually install vetted artifacts into it. You can also block remote repositories via the Nexus/Artifactory settings, although I have never needed to do this myself.
Please note that you will also have to stop your local setup from falling back to "repo1", because by default Maven downloads artifacts from there; a mirror entry in settings.xml enforces this.
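A minimal sketch of such a mirror in ~/.m2/settings.xml, assuming an internal Nexus at nexus.example.com (a placeholder):

```xml
<!-- Route every repository request through the internal mirror,
     so nothing is fetched from repo1 directly. -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-mirror</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
```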
My question is: how do you deploy the same code from whatever [D]VCS you use onto multiple machines? Do you have an automated deployment system, and if so, what is it? Is it built in-house? Are there any tools out there that can do this automatically? I am asking because I am pretty bored of updating up to 20 machines every time I make a modification.
P.S.: This probably belongs on ServerFault, but I am asking here because I am thinking of writing my own custom-made deployment system.
Roll your own rpm/deb/whatever for your package, set up your own repo, and have your machines pull on a regular basis. It's really not that hard to do: the machinery is already built into your system, is well tested, and is loaded with features. You could use something like Func if you need to push instead.
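A bare-bones sketch of rolling a .deb by hand - the package name, version, and paths are all placeholders:

```sh
# Stage the files, write a minimal control file, then build the package.
mkdir -p pkg/DEBIAN pkg/opt/myapp
cp -r build/output/. pkg/opt/myapp/
cat > pkg/DEBIAN/control <<'EOF'
Package: myapp
Version: 1.0.0
Architecture: all
Maintainer: ops@example.com
Description: In-house application payload
EOF
dpkg-deb --build pkg myapp_1.0.0_all.deb
```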
Depending on your situation, deploying straight from the version control system might not always be the best idea. You can only do so much by just updating files, and mixing deployment and development will probably make the development use of the version control system less free.
I see two alternatives that might be interesting.
Deploy from your continuous integration server: add a task that runs after every successful build, copies the files over, and executes some remote commands. I'm using this to deploy to a test server, but I would find it too tricky to upgrade production this way.
Deploy using an existing package manager: you can set up your own apt (or equivalent) repository and package the updates as apt packages. Have your continuous build system build the packages, but let an admin decide whether they should be pushed to the update server; see the sketch below. I think this is the only safe solution for production machines.
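As a sketch of that second alternative, assuming the reprepro tool manages the repository (the paths and names are placeholders):

```sh
# Add the CI-built package to the self-hosted apt repository
# (an admin runs this once the package is approved).
reprepro -b /srv/apt includedeb stable myapp_1.0.0_all.deb

# Each target machine then pulls the update like any other package:
#   echo 'deb http://apt.example.com stable main' \
#       > /etc/apt/sources.list.d/internal.list
#   apt-get update && apt-get install --only-upgrade myapp
```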
We use Capistrano for deployment & Puppet for maintaining the servers and avoiding the inevitable 'configuration drift' when many developers/engineers tinker with the package lists and configuration files.
Both of these programs are written in Ruby, but we use them for our PHP codebase stored in a git repository.
I use a combination of deb packages and Puppet to deploy code and configure a bunch of machines.
In most projects I have been involved with, the final stage has always been a scripted rsync deployment to live, so the multiple targets are built into this process.
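A stripped-down sketch of such a final stage, with the host names, paths, and service command as placeholders:

```sh
# Push the build output to every target, then reload the app on each.
for host in web1 web2 web3; do
    rsync -az --delete build/ "deploy@${host}:/var/www/myapp/"
    ssh "deploy@${host}" 'sudo /etc/init.d/myapp reload'
done
```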