How do I connect a somata client to a remote registry? - node.js

I'm using somata as my microservices platform for the web apps I'm building. I have successfully set up multiple clients on one machine with the somata registry running on the same machine. Now I want to have a client on one machine connect to a registry on another machine. How do I connect a client to a remote registry?

The simplest way is to use the environment variables SOMATA_REGISTRY_HOST (default "127.0.0.1") and SOMATA_REGISTRY_PORT (default 8420) when running your script:
SOMATA_REGISTRY_HOST=55.44.33.21 node test.js
The somata Client constructor also lets you connect to specific registries with the options registry_host and registry_port:
var somata = require('somata')

var client = new somata.Client({
    registry_host: '55.44.33.21',
    registry_port: 5858
})
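Once connected, calling a remote service works the same as it did locally. A hedged sketch, assuming the client.remote(service, method, args..., callback) call pattern from somata's README; the service name 'hello', method 'sayHello', and argument are illustrative, not part of the original question:
client.remote('hello', 'sayHello', 'world', function (err, response) {
    // response is whatever the remote service's sayHello method returned
    console.log('response: ' + response)
})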
Note: To allow connections from remote hosts, somata-registry has to be run with its bind host set to "0.0.0.0" instead of the default "127.0.0.1", which can be accomplished with the -h flag or the SOMATA_REGISTRY_BIND_HOST environment variable when starting the registry. The -p flag and SOMATA_REGISTRY_BIND_PORT are also available for listening on a custom port.
somata-registry -h 0.0.0.0
or
SOMATA_REGISTRY_BIND_HOST=0.0.0.0 somata-registry
And of course the registry's host and port will have to be reachable from the remote machine (i.e. not blocked by a firewall or security group).
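A quick way to verify reachability from the client's machine is a plain TCP check; the IP below and the default port 8420 are just examples:
nc -zv 55.44.33.21 8420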

Related

Running taurus command from master node on azure containers which is unable to reach slave node due to error in method java.rmi.MarshalException

Error at master node trying to connect to remote JMeter slave node in same network
You need to ensure that at least port 1099 is open; check out the How to open ports to a virtual machine with the Azure portal article for more details.
Apart from port 1099 you need to open:
The port you specify as the server.rmi.localport on slaves
The port you specify as the client.rmi.localport on master
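In JMeter property terms, those settings look like the following; the two local port numbers are illustrative, not required values:
# jmeter.properties on each slave
server_port=1099
server.rmi.localport=50000

# jmeter.properties on the master
client.rmi.localport=60000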
More information:
Remote hosts and RMI configuration
JMeter Distributed Testing with Docker
JMeter Remote Testing: Using a different port

"The connection was reset" after starting my server [duplicate]

I'm running a webpack-dev-server application inside a Docker container (node:4.2.1). If I try to connect to the server port from within the container, it works fine. However, trying to connect to it from the host computer results in a reset connection (the port is published, of course). How can I fix it?
This issue is not a Docker problem.
Add --host=0.0.0.0 to your webpack-dev-server command.
You need to connect to your page like this:
http://host:port/webpack-dev-server/index.html
Have a look at webpack-dev-server's iframe mode.
You need to make sure:
your docker container has mapped the EXPOSE'd port to a host port:
docker run -p x:y
your VM (if you are using docker machine with a VM) has forwarded that mapped port to the actual host (the host of the VM).
See "How to access tomcat running in docker container from browser?"
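Putting the two fixes together, a container run could look roughly like this; the image name, ports, and entry point are illustrative assumptions:
docker run -p 8080:8080 my-webpack-image \
    ./node_modules/.bin/webpack-dev-server --host 0.0.0.0 --port 8080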

Exposing node Server running on docker doesn't work

I am running an Angular app on a node server, and in server.js I have specified app.listen(8084, 'localhost'). So when I run npm start in the docker container and pass -p 8084:8084 to docker run, I am not able to get anything, even though running curl localhost:8084 inside my container gives me the right result.
So I changed it to app.listen(8084) and the -p 8084:8084 mapping started working. I am not sure why?
When you open a socket, you need to bind it to some interface on your system. There are predefined values:
0.0.0.0 - all interfaces; your service will be available from any interface
localhost, 127.0.0.1 - bind locally. That means the service is NOT available from outside -- this is your case.
You can also specify a particular interface's IP address to bind to it.
When you start your container, Docker by default attaches it to the default bridge network, so your container is put into a separate network; to reach a service in it, connections have to come in from outside the container.
You bound your service to localhost inside the container, so no communication is possible from outside. localhost for your node server is not the same as localhost on your host machine.
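A minimal sketch of the fix, assuming an Express app (the framework, like the port, is inferred from the app.listen call in the question):
var express = require('express')
var app = express()

// Binding to 0.0.0.0 (or omitting the host entirely) accepts connections from
// any interface, including the container's bridge network; binding to
// 'localhost' only accepts connections originating inside the container.
app.listen(8084, '0.0.0.0', function () {
    console.log('Listening on 0.0.0.0:8084')
})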

Remote debugging NodeJS Container on AWS

Running a NodeJS Docker Container on an EC2 instance, I'm trying to remote debug it, but keep getting "connection refused" from the instance.
What I've tried -
Opening ports in EC2 security groups
Exposing ports in Dockerfile, both the port the app is listening on and the debug port
Forwarding the port within the Docker run command using the -p flag
Making sure the app is accessible directly through the port it's configured to listen to
After trying all of these, the debug port is still inaccessible to the remote debugger, or even to telnet.
Any ideas what could cause this?
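For reference, the combination of steps described above amounts to something like the following; the app port 3000, debug port 9229, and image name are illustrative, and the --inspect rebinding is an assumption based on the localhost-vs-0.0.0.0 theme of the answers above (node's inspector listens only on 127.0.0.1 by default):
docker run -p 3000:3000 -p 9229:9229 my-node-image \
    node --inspect=0.0.0.0:9229 server.js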

How should my local server communicate with an EC2 server?

I have a node.js server running on ec2. I'd like for that server to automatically push data to another node.js server that is running on my laptop.
What is the best way to do something like this?
You could use a service like showoff.io to create an entry point to access your local laptop, or you could just create an SSH tunnel by running this command on your laptop:
ssh -R ec2_port:localhost:laptop_port ec2-host
That will let connections to ec2_port on the loopback interface of your EC2 server reach laptop_port on your laptop.
Then just modify your code to connect to the node.js program running on your laptop via the IP 127.0.0.1 and the port ec2_port.
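With the tunnel open, the EC2-side push is an ordinary HTTP request to the loopback address; everything here (port 8000 standing in for ec2_port, the /data path, the payload) is illustrative:
var http = require('http')

var payload = JSON.stringify({ reading: 42 })

// 127.0.0.1:8000 on the EC2 server is the tunnel entrance; ssh forwards the
// connection to laptop_port on the laptop.
var req = http.request({
    host: '127.0.0.1',
    port: 8000,
    path: '/data',
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }
}, function (res) {
    console.log('laptop responded with ' + res.statusCode)
})
req.end(payload)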
Alternatively, you could have the EC2 node.js process call a function exposed by the local node.js process and pass the data as arguments.
