Manage one remote server using my Mac as master with Puppet

I have just started studying Puppet. I understand that puppet can work in solo mode and in the master/agents configuration.
Ideal Use Case
I have a server on DigitalOcean, and I would like to use the Puppet GUI to manage the remote server from my Mac (if possible without running a VM). Is this possible, or am I obliged to rent another server that runs as a master (if I want to use the Puppet GUI)?

Related

How to run Puppet Forge modules on a Linux Ubuntu machine

I'm new to Puppet. I want to install packages and software on my new Linux machine, which runs Ubuntu. I have gone through the Puppet Forge modules on their portal.
There are plenty of modules available, but I don't understand how to run them.
It looks like all Puppet Forge modules use the Puppet language. I guess we need to install Puppet on the Linux machine first.
I came to know that there is a server and a client: the puppet master and the puppet agent. Do we need to install both on my Linux machine to run Puppet Forge scripts?
How do I install Puppet on a Linux Ubuntu machine, and where do Puppet Forge module scripts run: on the master or the agent?
Do we need two Linux machines, one each for the Puppet server and client?
Puppet is targeted at managing multi-computer installations. It can be used on an isolated machine (you would install both the master and the agent on that machine), but you are likely to make more work for yourself that way, not less, especially given that you have no prior experience with Puppet.
It looks like all Puppet Forge modules use the Puppet language. I guess we need to install Puppet on the Linux machine first.
Pedantically, the Puppet language is not a scripting language. But yes, Puppet modules are written primarily in Puppet's domain-specific language. You need Puppet to use them.
I came to know that there is a server and a client: the puppet master and the puppet agent. Do we need to install both on my Linux machine to run Puppet Forge scripts?
Unless you want to set up a second machine for the master to run on, yes, you would need to install both the master and the agent on your machine. Puppet also has a standalone mode (puppet apply) in which manifests are applied directly on the machine without a master, but the client-server setup is what most of the documentation and tooling assumes.
How do I install Puppet on a Linux Ubuntu machine, and where do Puppet Forge module scripts run: on the master or the agent?
Puppet has extensive online documentation. The section on installing Puppet is here: https://puppet.com/docs/puppet/latest/installing_and_upgrading.html.
Note also that installing the software is not all you would need to do. Puppet modules are not programs. They are somewhat like subroutines. You would also need at least to write some Puppet code of your own to specify just how (using the modules of your choice) you want Puppet to configure your machine.
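For a concrete but rough sketch, assuming a single Ubuntu machine hosting both the master and the agent (the release package name, the puppetlabs-apache module, and the paths are examples based on the official puppet-agent/puppetserver packages; check the documentation for your Ubuntu and Puppet versions), the steps might look roughly like this:
# Add Puppet's apt repository (example for Ubuntu 22.04 "jammy" and Puppet 8)
wget https://apt.puppet.com/puppet8-release-jammy.deb
sudo dpkg -i puppet8-release-jammy.deb
sudo apt-get update
# Install both the server and the agent on the same machine, and start the server
sudo apt-get install puppetserver puppet-agent
sudo systemctl start puppetserver
# Point the agent at the local master
sudo /opt/puppetlabs/bin/puppet config set server $(hostname -f) --section main
# Install a Forge module, e.g. puppetlabs-apache
sudo /opt/puppetlabs/bin/puppet module install puppetlabs-apache
# Write a minimal site manifest that uses the module
echo 'node default { include apache }' | sudo tee /etc/puppetlabs/code/environments/production/manifests/site.pp
# Run the agent against the local master
sudo /opt/puppetlabs/bin/puppet agent --test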
Do we need two Linux machines, one each for the Puppet server and client?
No. You can run the agent on the machine that hosts the master. Many sites do that, in fact, but it is rare for that to be the only place where the agent runs.
Generally speaking, you need to have several machines under Puppet management to achieve a net win relative to managing your machines directly. It really doesn't sound to me like Puppet would be a good fit for you.
For your use case, it seems like using Puppet Bolt is the better option.
As stated by John Bollinger, Puppet has very good online documentation on their products, and it's no different with Bolt:
Installing Bolt on Ubuntu
Once Bolt is installed, you can use its built-in package task to manage packages on your machine, e.g. Apache, by running:
bolt task run package action=install name=apache2 --targets localhost
(you can find more examples here)
But if you intend to use the Puppet Forge Apache module with Bolt, you can start by installing the module. This is a more advanced use case, though, as you'd probably have to write a plan or manifest to actually use the module's full potential, and you'd still have to deal with some limitations.
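As a rough sketch of that route (the module subcommands vary between Bolt versions, and the target hostname and user are placeholders), you could pull the module into a Bolt project and apply a tiny manifest with bolt apply:
# Add the Forge apache module to your Bolt project
# (older Bolt versions used a Puppetfile and 'bolt puppetfile install')
bolt module add puppetlabs-apache
# A one-line manifest that just includes the module's main class
echo 'include apache' > site.pp
# Apply it to the remote machine over SSH
bolt apply site.pp --targets droplet.example.com --user root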
As you're new to Puppet and Bolt, I'd recommend you start simple and also take this hands-on lab provided by PuppetLabs.
I hope that gets you going!

How do I get a Jenkins server to push bash code to a different server?

I have Jenkins installed on a Linux server. It can run builds on itself. I want to create either a Freestyle Project or an External Job that transfers a bash script and runs it on two separate linux servers. Where in the GUI do I configure the destination server when I create a build? I have added "nodes" in the GUI. I can see the free space of the servers in the Jenkins GUI, so I know the credentials work. But when I create a build, I see no field that would tell Jenkins to push the bash scripts and run them on certain servers.
Are Jenkins nodes just servers that lend computing power to the master server? Or are they the targets of Jenkins builds? I believe that Jenkins "slaves" provide computing power to the Jenkins master server.
Normally Jenkins is used to integrate code. What do you call the servers that Jenkins pushes code into? They would be called Chef clients or Puppet agents if I was using Chef or Puppet for integrating code. I've been doing my own research, but I don't seem to know the specific vocabulary.
I've been working with such tools for several years, and as far as I know there isn't a Ubiquitous Language for this.
The nodes you can configure in Jenkins itself to add 'computing power' are indeed called build slaves.
Usually, external machines you will copy to, deploy to, or otherwise use in jobs are called "target machines", as they are the targets of an action in your job.
Nodes can be used in several forms. You can use agents, which require a small installation on the node machine; this creates a running agent service with which Jenkins can communicate.
Another way is to simply allow Jenkins to connect to a machine via SSH and let it execute commands there. Both are called nodes and could be called build slaves, but the first are usually dedicated nodes, while the second can be any kind of machine as long as the SSH user can execute the build.
I also have not found any different terms for these two types.
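That said, for the original use case (pushing a bash script to two servers and running it there), a plain "Execute shell" build step on the Jenkins machine can copy and run the script over SSH, without treating the targets as build nodes at all. A rough sketch, assuming key-based SSH access; the hostnames, user, and script name are placeholders:
# Contents of an "Execute shell" build step; deploy.sh is the script checked out by the job
for host in web1.example.com web2.example.com; do
  scp deploy.sh "deploy@${host}:/tmp/deploy.sh"
  ssh "deploy@${host}" 'bash /tmp/deploy.sh'
done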
It's probably not a real answer to your questions, but I do hope it helped.

Connect and use Grunt on a remote server

I've installed node.js, the grunt.js plugin, and a LESS plugin on my own computer, and now I'm about to connect to a remote server and use them there. I have a couple of newbie (perhaps some of them dumb) questions regarding this:
Do I have to install node.js, grunt.js, etc. on the remote server also? Or is it enough that they're installed on my own computer?
I use Windows 8.1 (if that is relevant) and I wish to connect to an FTP server or an SSH server (I've been provided with the server name, the FTP port number, and the SSH port). Which type of port should I use?
Should the actual script files that are going to be executed by npm also be placed in a folder on that remote server?
Your local development environment is indeed completely separate from your remote server environment.
Yes, in order to run node scripts or grunt tasks, it is required that your remote server environment also has node and grunt installed.
FTP is commonly port 21, while SFTP (which runs over SSH) is commonly port 22. However, you should probably be using an IDE or FTP client that supports your development transactions, i.e. uploading new code to a remote server. You should also consider version control and/or continuous integration / code deployment routines.
Yes, you should essentially mimic your local environment on your remote server environment. While they may not always be one-for-one, especially depending on what operating system your remote server runs, ultimately they are completely separate. You might be interested in VMware, VirtualBox, Vagrant, or Docker.
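As a rough sketch of what that mimicking might look like, assuming an Ubuntu remote server, SSH access on port 22, a Gruntfile with a "less" task, and a local environment that provides ssh and rsync (Cygwin, for example); the hostname, user, project folder, and task name are placeholders:
# Install node and grunt-cli on the remote server
ssh user@example.com 'sudo apt-get update && sudo apt-get install -y nodejs npm && sudo npm install -g grunt-cli'
# Copy the project (Gruntfile.js, package.json, sources) to the server
rsync -avz --exclude node_modules ./ user@example.com:~/myproject/
# Install dependencies and run the task remotely
ssh user@example.com 'cd ~/myproject && npm install && grunt less'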

Are there any Linux scripts for uploading a nodejs app to my own Linux server? Like AppFog or Heroku

Are there any Linux scripts for uploading a nodejs app to my own Linux server?
Like AppFog or Heroku. I have a dedicated Linux server and I'm working on Linux too.
I want to upload my nodejs application to the server and restart nodejs with one shell command.
I can write a script, but maybe I don't need to reinvent the wheel?
Popular choices using SSH:
rsync
fabric
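A minimal rsync-over-SSH deploy script might look like this (the host, paths, and the use of pm2 as the process manager are assumptions; adapt them to your setup):
#!/usr/bin/env bash
set -euo pipefail

HOST="user@example.com"
APP_DIR="/var/www/myapp"

# Sync the application code, excluding local node_modules
rsync -avz --delete --exclude node_modules ./ "${HOST}:${APP_DIR}/"

# Install dependencies and restart the node process on the server
ssh "${HOST}" "cd ${APP_DIR} && npm install --production && (pm2 restart myapp || pm2 start server.js --name myapp)"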
For serious stuff you really should look at configuration management and server provisioning applications like (in no particular order):
Chef
Puppet
Ansible (+1 for the name, "Ender's Game" is one of my favorite books)
Most revision control systems allow "after/before-commit" hooks; sometimes I use these hooks to run tests before commits and to automatically deploy to the acceptance environment after commits.
See also Jenkins CI (Continuous Integration is a hot topic).
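For the commit-hook route with git, a bare repository on the server plus a post-receive hook is a common pattern. A minimal sketch (the paths and the pm2 process manager are placeholders):
# ~/myapp.git/hooks/post-receive on the server (make it executable)
#!/usr/bin/env bash
GIT_WORK_TREE=/var/www/myapp git checkout -f
cd /var/www/myapp && npm install --production && (pm2 restart myapp || pm2 start server.js --name myapp)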
I use fleet from substack to manage deployment. Fleet is a git-based tool that allows you to deploy code and manage your node processes running on remote servers.
Adding in seaport and either bouncy or node-http-proxy is a great way to build an application that is made up of lots of small components that work together.

automated deployment on production with puppet

I would like to know how automated deployment to production works with puppet.
Do I need a puppet-slave on my production server? If that's the case, is that insecure, and what rights does Puppet get with that?
A use-case could be to get a package from a repository manager and then to deploy it to the production server. What are the main steps on this way with puppet?
Puppet can run in solo mode, where you apply a set of configurations from a manifest file on the host on which you run it, as long as Puppet (the client/agent) is already installed there.
You can also run Puppet in client-server mode, where an agent runs on your production server and obtains configuration details from a puppet server (or puppet master).
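In solo (masterless) mode that boils down to running puppet apply on the host itself, for example (the binary path assumes the official puppet-agent package, and the manifest path is a placeholder):
sudo /opt/puppetlabs/bin/puppet apply /etc/puppetlabs/code/environments/production/manifests/site.pp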
If you run in client-server mode, how do you ensure security?
Well, in client-server mode, you pre-register a client/agent with a server you nominate, and they exchange SSL certificates before any actions can be applied on that agent. Again, you would have to (on your puppet server or master) associate a set of actions or manifests with the production server running the agent. I suppose that provides sufficient security, assuming you already took care of standard OS security for both systems in the first instance.
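For reference, that certificate exchange looks roughly like this on the command line (the commands assume Puppet 6 or later; the hostnames are placeholders):
# On the production server (agent): request a certificate from the master and wait for it to be signed
sudo /opt/puppetlabs/bin/puppet agent --test --server puppet.example.com --waitforcert 60
# On the puppet master: list pending requests and sign the agent's certificate
sudo /opt/puppetlabs/bin/puppetserver ca list
sudo /opt/puppetlabs/bin/puppetserver ca sign --certname prod01.example.com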
Also, additional security can be provided by the puppet file server, as mentioned in the link suggested by bagheera. If you are even more paranoid than that, then you would need to consider using puppet librarian with a Puppetfile that is assembled and used at run time.
In either case, the bigger challenge for you is ensuring that the set of instructions (or manifests) applied has undergone testing (on a test or staging server) before being applied to a production system.
So, you need to be sure what you are doing when you start trying to apply puppet manifests to production servers. I would not recommend just downloading puppet modules and using them without a decent insight into what you are doing and a clear understanding of what each module you intend to use does.
Puppetlabs have great introductory documentation for using Puppet, and that would be an excellent place to start learning more about it. A good book would also be useful.
