Upload and install to AWS server - node.js

I have a test server that checks out my source from GitHub and deploys it locally.
Now that it is running and tested, I want to upload that working directory to an AWS server. How do I do this?
I have access to AWS via PuTTY.
Then, when I have it all on the AWS server, can I install it as I would on any other Ubuntu server?

There are many ways to do this:
Make a tar ball, scp it to your server, untar and install (a sketch follows this list)
Use Ansible to check out the code on your server
Best option: use AWS CodeDeploy; see Using AWS CodeDeploy to Deploy an Application from GitHub
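For the tar ball route, a rough sketch looks like this (the paths, key file, and host name are placeholders for your own setup):
# On the test server: pack the working directory (node_modules left out, reinstalled on the target)
tar czf myapp.tar.gz --exclude=node_modules -C /path/to/working-dir .
scp -i ~/.ssh/aws-key.pem myapp.tar.gz ubuntu@your-aws-host:/home/ubuntu/
# On the AWS instance: unpack and install as on any other Ubuntu server
mkdir -p ~/myapp && tar xzf myapp.tar.gz -C ~/myapp
cd ~/myapp && npm install --production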

Another option, which worked best for me, is to create an AWS instance and log into it.
Set it up as you would any other server.
Then you can save the image and use it as a base in the future.
I used Ubuntu, but there are numerous AWS instance images available.
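If you go that route, saving the image for later reuse can be done with the AWS CLI once the instance is set up (the instance ID, name, and description below are placeholders):
aws ec2 create-image --instance-id i-0123456789abcdef0 --name "my-node-base" --description "Ubuntu base with app dependencies installed"
You can then launch new instances from that AMI instead of repeating the setup by hand.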

Related

How to refresh an application deployed on a Linux machine

I have deployed an Angular application on a sandbox Linux machine. When I replace data in the assets folder, those changes are not reflected on the website, even though I am using the sudo service restart httpd command.
I am using the PuTTY command prompt and connecting to the server via SSH.
How can I get the changes to show up, or recompile the code/application using commands?
It's going to depend on how your build/deploy toolchain for Angular works.
Basically, httpd reads the files on the filesystem. When you update the files, you don't need to restart the httpd service; it will serve up whatever is there.
Angular is another story, though. You're probably on the right track in that you need to recompile your Angular application, but with what you've provided I don't think we can answer that for you, other than to say:
Here are the docs about deploying Angular apps: https://angular.io/guide/deployment
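As a rough sketch, assuming a standard Angular CLI project and that httpd serves files from /var/www/html (the project path, project name, and web root are assumptions you would adjust; on newer CLI versions use ng build --configuration production instead of --prod):
cd /path/to/angular-project
npm install
ng build --prod
# Copy the compiled bundle to the directory httpd serves; no httpd restart needed
sudo cp -r dist/your-project-name/* /var/www/html/
After the copy, a plain browser refresh (possibly with the cache cleared) should show the new build.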

Is it possible to use Git extension or Git GUI to manage the files on a remote linux ftp server like a local repository

I have my site stored on a remote Linux server and I use FTP to download, edit, and re-upload the files. I am currently able to use Git Bash to connect and upload the files (like a local repository) to Bitbucket (my remote repository), but I was wondering if it's possible to use one of the graphical Git programs to connect to the server and push/pull/manage the changes, just to make my life simpler.
If you have a local clone of that Linux server repo, you can use it in combination with any GUI (SourceTree, GitKraken, Tower, ...) installed locally.
That is easier than trying to install a GUI in an environment where you only have headless (i.e. shell or FTP) access.
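For example, assuming the repository on the server lives at /var/www/site and you do have SSH access (user, host, and path are placeholders), creating that local clone is a one-liner:
git clone ssh://user@your-server/var/www/site my-site-local
You can then open my-site-local in SourceTree, GitKraken, etc. and push/pull from there; the repo on the server stays where it is.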

Pulling a git repo from a startup script on Google Cloud Compute Engine

To show my team how the app I am building is progressing, I created a small dev server on Google Cloud Compute Engine. This server is usually switched off to save cost and is only switched on when we are working together. I develop and push to a git repo while the server is off. When I start the server, the latest changes should be pulled, the node packages installed, and the node server started. To do this I created the following startup script:
#! /bin/bash
cd /to/my/server/folder
git pull
sudo npm install --no-progress
nohup node src/ &
I have created an SSH key and added it as a read-only deploy key for this particular repo in my GitLab account. The script has been tested on the server and works totally fine. Now the fun part.
When the script is run as a startup script (https://cloud.google.com/compute/docs/startupscript), it doesn't work. The error:
permission denied (public key)
fatal: could not read from repo make sure it exists.
I tried these fixes:
Getting permission denied (public key) on gitlab (the problem there, though, is that they cannot pull git repos at all). In my case it works fine from the command line and from a shell script; it just doesn't work from the startup script.
I also tried a whole bunch of other stuff, on the whole spectrum from 'could be it' to 'a wild guess'. Clearly there is something I am missing here. Could anyone help me out?
Finally found the answer here: https://superuser.com/a/868699/852795. Apparently something goes wrong with the SSH keys that are used in a Google startup script. The solution is to explicitly tell git what key to use, like this: GIT_SSH_COMMAND="ssh -i ~/.ssh/id_rsa" git pull.
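Applied to the startup script above, the pull step becomes something like this (the key path is whatever key you added as the deploy key; adjust it if the script runs as a different user):
#! /bin/bash
cd /to/my/server/folder
# Tell git explicitly which key to use, since the startup script does not run as your normal login user
GIT_SSH_COMMAND="ssh -i ~/.ssh/id_rsa" git pull
sudo npm install --no-progress
nohup node src/ &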

How do I install Nexus 3 in a Docker container with no internet connection?

I want to install Nexus 3 in a Docker container on CentOS, but my CentOS server, which has Docker installed, has no access to the internet. I want to use this command:
docker pull sonatype/nexus3
Is there a standalone, offline file or group of files that would give me what I need?
I only have Windows machines, with no Docker installed, that can access the internet.
You could try to set up your own Docker registry server on your Windows machine and then have your CentOS server talk to that server to get the files it needs. This seems like overkill, though.
Here is the link to set that up: https://docs.docker.com/registry/deploying/
You could also use something like VirtualBox to create a CentOS VM on the Windows machine and set up Docker inside it. That would give you CentOS + Docker + internet access.
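For reference, the registry route boils down to something like this on a machine that has internet access and Docker (the host name and port are illustrative, and the CentOS side may need the host configured as an insecure registry):
docker run -d -p 5000:5000 --name registry registry:2
docker pull sonatype/nexus3
docker tag sonatype/nexus3 your-windows-host:5000/sonatype/nexus3
docker push your-windows-host:5000/sonatype/nexus3
# Then on the CentOS server, which needs network access to that machine:
docker pull your-windows-host:5000/sonatype/nexus3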
Yes, you can save the image to a file and then load it on the server:
Download the image to your workstation with docker pull sonatype/nexus3
Save the image to a tar file with docker save sonatype/nexus3 > nexus3.tar (Docker Save docs)
Transfer the image to the server via USB/LAN/etc.
Import the image on the CentOS server with docker load --input nexus3.tar (Docker Load docs)
Docker Save
Produces a tarred repository to the standard output stream. Contains all parent layers, and all tags + versions, or specified repo:tag, for each argument provided.
Docker Load
Loads a tarred repository from a file or the standard input stream. Restores both images and tags.
You will now have the image loaded on your machine. There are probably other ways, but this is the simplest I can think of, and it involves no third-party tools. You can also gzip the file, per the documentation.
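Put together with the gzip step, the whole transfer looks roughly like this (file names are just examples):
# On the machine with internet access and Docker
docker pull sonatype/nexus3
docker save sonatype/nexus3 | gzip > nexus3.tar.gz
# Copy nexus3.tar.gz to the CentOS server via USB/LAN/scp, then on that server:
gunzip -c nexus3.tar.gz | docker load
docker images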

How do I compile Redis so that I can upload and run it on shared hosting?

I need to run Redis on my shared hosting account, but I am unable to compile it on the server because of the nature of shared hosting. I have SSH access, but my hosting provider told me that I would need to compile Redis first and then upload it to the server.
I'm not sure how to go about this, and the only other person who asked this question here never got a response.
So: how do I compile Redis so that I can upload it to and run it on my shared hosting account?
In my opinion, the safest bet is to statically compile Redis.
I just did something similar for a CentOS 5 server. To be 100% sure, I created a minimal CentOS 5 VM on my workstation and then followed these steps (everything below was done on the CentOS 5 VM):
download redis and tcl 8.5
wget http://download.redis.io/releases/redis-3.0.2.tar.gz
wget http://prdownloads.sourceforge.net/tcl/tcl8.5.18-src.tar.gz
install tcl 8.5
tar xfz tcl8.5.18-src.tar.gz
cd tcl8.5.18/unix
./configure
make
make test
make install
compile redis
tar xfz redis-3.0.2.tar.gz
cd redis-3.0.2
make CFLAGS="-static" EXEEXT="-static" LDFLAGS="-I/usr/local/include/"
test redis
cd src
./redis-server
copy the resulting binaries to the target server. The binaries can be found under the src folder.
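Uploading the binaries is then a normal copy over SSH, for example (user, host, and destination directory are placeholders; if the EXEEXT setting above appended a suffix to the binary names, adjust them to match what the build actually produced):
scp src/redis-server src/redis-cli user@your-shared-host:~/bin/
On the shared host you can then start it over SSH with ~/bin/redis-server --port 6379, picking a port your provider allows.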
