What is the best way to test server builds? [closed] - linux

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
I've got to build about a half-dozen instances of a very large & complex linux database server.
I can't set up the machines myself - instead I've got to request the config to be built. Additionally, I can't mandate an automation tool such as Chef for the configuration. I've provided the sysadmins with extremely clear spreadsheets that describe the config - but it's very error-prone to build, time-consuming to test and they make a wreck of it each time.
So, I'd like to have a tool that will test the following:
system info: installed software & versions, default configs, security policies, sudo list, etc
user info: userids, group membership, primary group, umask, ulimits, home directories, home directory privs, etc
storage info: raid configurations & extent sizes, volumes, file systems, file system types, mount points, ownership & privs, etc
database info: installed software & versions, installation locations, database config, tablespaces, bufferpools, and database objects
I don't need a single tool to do it all, I'd be happy enough to find a few tools that together could pull this together. And I'll write at least parts of it if I have to - but would be bummed if I spent a week writing something when there was something better already available.
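One low-tech starting point for the checks above is a shell script that dumps each category into a plain-text report, so two hosts can be compared with ordinary diff. This is a hypothetical sketch, assuming Debian-style tooling (swap dpkg-query for rpm -qa on RHEL); the script name and report name are illustrative.

```shell
#!/bin/sh
# build-audit.sh - hypothetical sketch: collect system, user and storage
# facts into one text report so server builds can be compared with diff.
report="audit-$(hostname).txt"
{
  echo "## kernel";   uname -r
  echo "## packages"; dpkg-query -W -f='${Package} ${Version}\n' 2>/dev/null | sort
  echo "## users";    getent passwd | sort
  echo "## groups";   getent group | sort
  echo "## mounts";   mount | sort
  echo "## limits";   ulimit -a
} > "$report"
echo "wrote $report"
```

Run it on the reference box and on each new build, then `diff audit-hostA.txt audit-hostB.txt` shows exactly where a build deviates. Database-level checks (tablespaces, bufferpools) would still need queries against the database itself.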

You can create Puppet recipes to configure your own independent servers on VMs.
Then copy those Puppet manifests onto the actual servers and run Puppet in "noop" mode, referencing the local manifest files. That should tell you what's missing.
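The noop run might look like this; the manifest path is illustrative. Both flags are standard `puppet apply` options: --noop reports what Puppet *would* change without changing anything, and --detailed-exitcodes makes the result scriptable.

```shell
# Dry-run the manifest against the live host; nothing is modified.
puppet apply --noop --detailed-exitcodes /root/manifests/site.pp
rc=$?
# rc 0: host already matches the manifest; rc 2: changes would be applied
[ $rc -eq 0 ] && echo "host matches manifest" || echo "drift detected (rc=$rc)"
```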

Not sure if this is the same as what #bash suggested.
In the VMware world, you can first create a server, then convert it into a template.
So, you first create a server and make sure you have all the required settings done on it - i.e. validate it once to make sure it is 100% as you wanted.
Convert this server to a template. Now you can stamp out as many clones of this template as you wish - and all of them are guaranteed to look identical.

Can I make a DIY Cloud server for windows? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 3 years ago.
My old parents have been hacked/virus-ed for the nth time.
I have an old HP server.
I thought of rebuilding it with VMware (free version) or Oracle VirtualBox and having them use Windows in a controlled environment. I would back it up and patch it, etc. Maybe they would RDC to my server.
I assume I would need a Windows Server license to allow multiple connections. (I could also use it for myself to host a Plex media server.)
At a 10,000 foot level, is this possible or just a technology quagmire?
Super User SE might be a better place for this.
Anyway: are they using it for anything Windows-specific? My parents used to use my Linux-based computer for web browsing; now they use an Android tablet for the same. Running a virtualised Windows on top of the former could have been an alternative. Also, backing up and rolling back is easier if you use virtualisation - just use something else for permanent data storage, perhaps remote storage with backup and rollback (against ransomware), either on your own infrastructure or in the cloud (like Syncthing, ownCloud, etc.).
I'm assuming here they don't have trade secrets or plans for a home-built nuclear plant or anything of that kind.

How to get recursively dependencies for package with versions? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
I need to install a package on a system without internet access (the package contains a driver for a network card).
System A has an internet connection and runs Ubuntu 14; System B has no internet connection and runs Ubuntu 16.
How can I download all dependencies recursively, with the correct versions, on System A, so that they can then be installed on System B?
I would suggest that you run a Docker container (or some other type of virtualization) with Ubuntu 16.04 on System A. After that, you can update the package index (apt update) and then install the desired packages on that system. Finally, you may copy the package index from /var/lib/apt, and the packages themselves from /var/cache/apt/archives, to System B.
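A sketch of that container step, under stated assumptions: the image tag matches System B's release, and "r8168-dkms" is a placeholder for the actual driver package. `apt-get install --download-only` fetches the package plus every not-yet-installed dependency into /var/cache/apt/archives without installing anything.

```shell
# On System A (has internet). Run the fetch inside a fresh 16.04
# container so the computed dependency set matches System B's release,
# not System A's 14.04 install.
docker run --name pkgfetch ubuntu:16.04 bash -c \
  'apt-get update && apt-get install --download-only -y r8168-dkms'
# copy the fetched .debs out of the container
docker cp pkgfetch:/var/cache/apt/archives ./debs
# move ./debs to System B (USB stick, etc.), then on System B:
#   sudo dpkg -i debs/*.deb
```

The fresh container matters because --download-only skips dependencies that are already installed; a pristine 16.04 image guarantees the full closure is downloaded.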
It's good practice to restrict hosts from internet access. However, as a patch-management solution, you should set up a local mirror - this will centralize your patching needs for the entire organization. It's not limited to Ubuntu; you could host multiple Linux distro mirrors. The only thing you really need is a large-capacity disk - maybe mirror it for some non-critical resiliency. This will also cut back on bandwidth use in a multi-server environment, limiting it to a single host pulling updates to its mirror once. Just make sure you have a process or script that regularly checks for updates. That way your hosts are ready for patching when you need it, assuming you stay on top of emerging threats and vulnerability management for the various *nix platforms.
I'm not a huge fan of reinventing the wheel, so here are a couple of how-to references.
How-to: Set up a local Ubuntu aptitude repo (can be set up to mirror Ubuntu 14, 15, and 16 to support all your Linux hosts)
How-to: Set up a local CentOS YUM repo (just in case you have some RH-based servers)
What you'll have to do afterwards is change your /etc/apt/sources.list to point to your new internal repository. You can just copy the lines existing there and change the server domain name to your local server. Then none of your Linux hosts need to communicate with hosts outside of your network - only the one server pulling from the mirrors does. It will definitely help you refine your security posture.
For RHEL-based systems using yum, this is configured in /etc/yum.repos.d/{reponame}.repo
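The sources.list edit can be scripted; "mirror.internal.example" is a made-up hostname standing in for your local mirror. Rewriting a copy first lets you review the change before it goes live.

```shell
# Rewrite a copy, review the diff, then install it over the original.
cp /etc/apt/sources.list /tmp/sources.list.new
sed -i 's|http://[a-z.]*archive\.ubuntu\.com|http://mirror.internal.example|g' /tmp/sources.list.new
diff -u /etc/apt/sources.list /tmp/sources.list.new
# if the diff looks right:
#   sudo install -m 644 /tmp/sources.list.new /etc/apt/sources.list && sudo apt update
```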

How do I install to my Linux laptop a software project hosted on git? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 7 years ago.
How do I install, to my Linux laptop, a software project hosted on GitHub? I am not attempting to set up a project/repository myself. My intention is to download a project in order to use/execute it. What is the name of this process? How is it done? If it helps, I am trying to install the command-line interface for LastPass. Their instructions seem to only assist me with downloading the dependencies. Downloading the zip file just provides a folder filled with random files. I am used to installers, so perhaps I am missing some steps that would aid my understanding.
This is off-topic for Stack Overflow, as it's not about programming. It also has nothing to do with Git: you just got the files from GitHub, but Git has nothing to do with installing them.
The first thing you should do is check if the software is available already from your Linux distribution, using the system's software management tools. Installing it from there will be simpler and will mean the software gets updated automatically in future.
If it isn't packaged by your distro, and the project doesn't provide binaries, then you may need to build the software yourself. Among the files you cloned from GitHub should be a README or INSTALL file with instructions on building and installing it.
Typically this involves running a configure script, or maybe cmake, to set up the build process correctly for your computer, then running make and make install.
Depending whether you want to install it just for your user or system-wide you might need to run the configuration step with an option specifying where to install to.
Depending on the programming language used you might need to install some additional tools, such as a compiler and linker. These will definitely be available from your distro's software management tool.
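For a typical autotools-based project, the per-user variant of those steps looks like this. The prefix value is one common choice, not something this particular project necessarily uses - always check its README first.

```shell
# Generic from-source build for an autotools project, installed into
# the user's home directory so no root access is needed.
./configure --prefix="$HOME/.local"
make
make install
# make the freshly installed binaries findable:
export PATH="$HOME/.local/bin:$PATH"
```

Omitting --prefix usually installs to /usr/local instead, in which case the `make install` step needs sudo.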

Linux/Debian based application won't compile [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
My question is related to my graduation project. It's about mobile video transmission using the DVB-H link layer. It's a comprehensive project in itself, and there are separate parts. I mainly construct the system in MATLAB Simulink, but there is one part, responsible for encapsulation/decapsulation of the stream packets, which was designed in a Linux environment.
I didn't want to install a new OS just for one application, so I run Linux on VMware Player as a virtual machine. So here is my problem - the applications (for encapsulation and decapsulation) won't compile (install) completely. I see mainly missing-library problems. I tried to install the necessary libraries, but the application still couldn't see some of them. I feel like I'm missing something small but obvious to a more experienced Linux user. Here is the link to the programs:
http://sp.cs.tut.fi/mobile3dtv/download/
"DECAPS - DVB-H Decapsulator Software" is the one (and FATCAPS link is there).
I couldn't find an alternative encapsulator/decapsulator for the Windows environment. It's my last and only choice. Could some of the Linux users please try to run them in a Linux environment - or maybe it's because I'm using a virtual machine? It's also noted that the application was designed for Debian-based systems, but I also installed Debian as a virtual machine and the application won't even configure. Please help, guys, I'm really stuck here.
You'll probably need to install the development versions of the libraries -- under Debian, they'll be named the same as the original package, but with a "-dev" suffix.
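As a concrete illustration: if the build stops with a missing header such as png.h, the usual fix on Debian is the matching -dev package. libpng-dev here is just an example - match it to whichever library your error messages name.

```shell
# error: png.h: No such file or directory  ->  the runtime library may
# be installed, but its headers are not; install the -dev package:
sudo apt-get install libpng-dev

# if you don't know which package ships a missing header, apt-file
# can search (first: sudo apt-get install apt-file && sudo apt-file update)
apt-file search include/png.h
```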

Is there a standard way to keep modified Linux configuration files separate and identifiable? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
Often I need to edit many configuration files under /etc; however, I don't want these changes to get lost when I perform my next system upgrade.
Right now, I've housed all configuration files, as well as some of my maintenance scripts, in /opt/admin, and symlinked the /etc targets there, but that doesn't seem right according to the standards I've seen. Another option I thought of is housing these in /usr/local. The aforementioned document says it is for use by the system administrator when installing software locally. That's the closest I've got. However, /usr/local also gets clobbered when you install new non-packaged software.
Is there a standard/largely followed best-practice as to how to maintain these?
Since this is not a discussion page, answers should be definite and with an article or two to support them.
EDIT
Since this was rightfully put on hold I thought I'd redirect you to a similar question in Server Fault.
There seem to be two general solutions - either use symlinks or maintain a list of files (see ptman's answer in linked question).
For now, I'm using a hybrid solution: maintain a list of configuration files (/opt/admin/config/FILES). A script (/opt/admin/scripts/link-config-files.sh) creates a symlinked hierarchy (under /opt/admin/config). Files are now easily accessible from one place, it's clear where they're really located, there's a central list of files that is easily maintained (no need to manually link), and a simple backup of /opt/admin includes all configuration. Here's my script.
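A minimal sketch of what such a script might look like - the paths follow the layout described above, but the loop body is my assumption, not the author's actual script:

```shell
#!/bin/sh
# link-config-files.sh - hypothetical sketch: mirror every absolute path
# listed in FILES as a symlink under the config tree.
LIST=/opt/admin/config/FILES
DEST=/opt/admin/config
while IFS= read -r path; do
    case "$path" in ''|\#*) continue ;; esac   # skip blanks and comments
    mkdir -p "$DEST$(dirname "$path")"         # recreate the parent dirs
    ln -sfn "$path" "$DEST$path"               # refresh the symlink
done < "$LIST"
```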
Thanks Jim, for your answer.
I keep a parallel directory tree under my user directory:

/home/myuserid/config/hostname/
    var/
        ....
    etc/
        sysconfig/
            network
and so on. I have a deploy shell script that will install files to the correct place. I back up everything to a private, externally hosted Git account.
I do all editing on copies in the above parallel tree and deploy the modified files.
If you have more than a few systems to maintain this way, tools like Puppet or Chef seem to be possible options, though they come with their own headaches. Disciplined, repeatable system administration is not trivial.
