Unable to reach service using DNS - node.js

I am using the request package from NPM to handle some internal communication between services. I have also set the DNS server to the correct one (using Hashicorp Consul as my SD and DNS).
I can do a dig from my local machine (where the services are running) against the Consul DNS server and get back the correct response (an IP and port number).
How I set up DNS in my app.js file:
const dns = require('dns'); // Node's built-in resolver module
dns.setServers([ `${config.consul.host}:8600` ]);
Set in a different file from app.js:
const options = {
  baseUrl: `http://auth.service.consul`,
  json: { '': '' },
  headers: { authorization: '' }
};
In the same file as the options above:
request.post(req.path, options, (error, response, body) => {
  console.log(error);
  if (error) throw error;
  res.status(response.statusCode).json(body);
});
Error message:
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:56:26)
errno: 'ENOTFOUND',
code: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'auth.service.consul',
host: 'auth.service.consul',
port: 80 }
I want to be able to make requests to the 'auth' service using Consul as my DNS server. I currently have a very hacky way of doing this, but I would really like to use DNS.
I did find this, but it pertains to the axios package, not the request package I am trying to use; even though it produces the same error, the solution there didn't help.
Consul service discovery with DNS on Nodejs

You might have a chicken and egg problem. Try resolving ${config.consul.host} into an IP address and then call dns.setServers with that IP address.
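A minimal sketch of that idea, using Node's built-in dns module (it assumes config.consul.host is resolvable by the system resolver and that Consul's DNS port is 8600, as in the question):

const dns = require('dns');

// Resolve the Consul agent's hostname with the system resolver first,
// then point Node's resolver at that IP on Consul's DNS port.
dns.lookup(config.consul.host, (err, address) => {
  if (err) throw err;
  dns.setServers([ `${address}:8600` ]);
});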

Related

Nodemailer cannot send email from within Docker container

I've been searching, reading, and trying everything I could find on the Internet for about 3 weeks before posting here ...
Context:
developing a little website app
technologies:
Next.js (ReactJS, HTML, CSS) for both frontend and backend (Node)
Linux as host (Ubuntu 20.04 LTS)
Docker container to encapsulate the app (based on the node:alpine image) (Docker version 20.10.6)
the Nodemailer Node module to send email
This is the code using Nodemailer to send the e-mail message:
import type { NextApiRequest, NextApiResponse } from "next";
import * as nodemailer from "nodemailer";

export default async (req: NextApiRequest, res: NextApiResponse) => {
  res.statusCode = 200;

  let transporter = nodemailer.createTransport({
    host: process.env.NM_HOST,
    port: parseInt(process.env.NM_PORT),
    secure: true,
    auth: {
      user: process.env.NM_USER,
      pass: process.env.NM_PASS,
    },
    tls: {
      rejectUnauthorized: false,
    },
  });

  // console.log("User:");
  // console.log(process.env.NM_USER);

  let info = await transporter.sendMail({
    from: "Website <xxx#xxx.com>",
    to: "Website <xxx#xxx.com>",
    subject: "New contact",
    text: "NAME:\n" + req.body.data.name + "\n----------\nEMAIL:\n" + req.body.data.email + "\n----------\nBODY:\n" + req.body.data.body,
  }, function (err, info) {
    if (err) {
      console.log(err)
    } else {
      console.log(info);
    }
  });

  console.log("Message sent: %s", info);

  res.json({
    a: req.body.data.name,
    b: req.body.data.email,
    c: req.body.data.body,
  });
};
Issue:
when I try to send e-mail using Nodemailer, launching my app from the Linux host with "npm run start" or "npm run dev", mails get delivered
when I try to send e-mail using Nodemailer, launching my app from the Docker container, I get the following error (from the app's own output)
Error: connect ECONNREFUSED 127.0.0.1:465
at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1133:16) {
errno: -111,
code: 'ESOCKET',
syscall: 'connect',
address: '127.0.0.1',
port: 465,
command: 'CONN'
}
What I already tried and what I observed:
ping google.com (and many others) works from within the container (using the docker exec -ti container-name sh command)
starting the container with docker run --dns 8.8.8.8 ... -> same result (error above)
the container's and host's /etc/resolv.conf are different (but I think this might not be the point, as the ping command resolves correctly; feel free to tell me if I'm wrong)
I am not a sysadmin (I am a developer), so I don't know if iptables or ufw (firewall) may be involved here (btw, it's difficult to install packages that aren't pre-installed on node:alpine)
the email server authentication is correct (username, hostname, and password), as it works when I launch my app with npm run start or npm run dev
switching the container's network between bridge (default), bridge (custom, with docker-compose), and host ... same issue (error above)
Anyone willing to help is really appreciated.
Found out what wasn't working: I was using docker-compose WITHOUT the --env-file option.
That way all the environment variables (e.g. PORT, HOST, PSWD, USR) I was trying to access within my app were left undefined (those variables weren't baked into the image at build time, by design, but are read at runtime with process.env).
SOLUTION (change the .env file part to suit your situation):
docker-compose --env-file ./.env.production
Useful official resource (docker-compose)
Docker-compose using --env-file option
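Note that when those variables are undefined, Nodemailer falls back to its default host of localhost, which matches the ECONNREFUSED 127.0.0.1:465 in the question. A simple way to catch this earlier is to fail fast at startup; a rough sketch, reusing the variable names from the question:

// Fail fast if the SMTP settings were not passed into the container.
const required = ["NM_HOST", "NM_PORT", "NM_USER", "NM_PASS"];
const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  throw new Error(`Missing SMTP environment variables: ${missing.join(", ")}`);
}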

Right way to connect to Google Cloud SQL from Node.JS

I followed the example on how to set up Node.JS to work with Cloud SQL, and generally got it to work, but with some workarounds on how to connect to the SQL server. I am unable to connect in the proper way, passing the INSTANCE_CONNECTION_NAME to the socketPath option of the options variable for the createConnection() method. Instead, as a temporary workaround, I currently specify the server's IP address and put my VM IP address into the server's firewall settings to let it through.
This all works, but I'm now trying to put it together properly before publishing to AppEngine.
How can I get it to work?
The following code works fine:
function getConnection ()
{
  const options =
  {
    host: "111.11.11.11", // IP address of my Cloud SQL Server
    user: 'root',
    password: 'somePassword',
    database: 'DatabaseName'
  };
  return mysql.createConnection(options);
}
But the following code, which I am combining from the Tutorial and from the GitHub page it refers to, gives errors:
function getConnection ()
{
  const options =
  {
    user: 'root',
    password: 'somePassword',
    database: 'DatabaseName',
    socketPath: '/cloudsql/project-name-123456:europe-west1:sql-instance-name'
  };
  return mysql.createConnection(options);
}
Here's the error that I'm getting:
{ [Error: connect ENOENT /cloudsql/project-name-123456:europe-west1:sql-instance-name]
code: 'ENOENT',
errno: 'ENOENT',
syscall: 'connect',
address: 'cloudsql/project-name-123456:europe-west1:sql-instance-name',
fatal: true }
What am I doing wrong? I am concerned that if I publish the app to AppEngine with the IP address, I won't be able to allow the incoming traffic into the SQL server.
I met a similar error while testing Cloud SQL.
error message: Error: connect ENOENT /cloudsql/xxx-proj:us-central1:xxx-instance
solution:
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy
sudo mkdir /cloudsql; sudo chmod 777 /cloudsql
./cloud_sql_proxy -dir=/cloudsql &
=> now the Node.js server can connect to MySQL
Refer to the guide: https://cloud.google.com/appengine/docs/flexible/nodejs/using-cloud-sql
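With the proxy running, the socketPath-based connection from the question should then work locally as well as on App Engine. A minimal sketch (it assumes the mysql package and that the instance connection name is supplied via an environment variable, which is only an illustrative choice):

const mysql = require('mysql');

function getConnection() {
  const options = {
    user: 'root',
    password: 'somePassword',
    database: 'DatabaseName'
  };
  if (process.env.INSTANCE_CONNECTION_NAME) {
    // On App Engine, or locally while ./cloud_sql_proxy -dir=/cloudsql is running
    options.socketPath = `/cloudsql/${process.env.INSTANCE_CONNECTION_NAME}`;
  } else {
    // Plain TCP fallback against the IP-whitelisted instance
    options.host = '111.11.11.11';
  }
  return mysql.createConnection(options);
}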
Are you deploying your AppEngine app to the same region as the SQL database? (europe-west1)
The documentation at https://cloud.google.com/sql/docs/mysql/connect-app-engine states "Your application must be in the same region as your Cloud SQL instance."

Node.js Net.Socket() to connect to net.tcp web service

So I have a web service running on port 7001 over TCP on:
net.tcp://www.myurl.com:7001/my/webservice
and I want to connect to it using net.Socket:
client.connect(7001, 'myurl.mine.com/my/webservice', function () {
  console.log('Connected');
  client.write(msg);
});
When I do the above, it gives an exception:
Error: getaddrinfo ENOTFOUND <<my url>>
at GetAddrInfoReqWrap.onLookup [as oncomplete]
When I try to connect to it without the /my/webservice, it connects fine and doesn't give an error at the client.connect() stage, but obviously it can't find that endpoint, so it gives another error when I try to do the client.write().
Does anyone have any idea how to use net.Socket against a web service whose URL isn't just myurl.com:7001 but actually contains a route like myurl.com:7001/my/webservice?
According to the Node.js docs, the version of the method you are using expects only a hostname as its second argument. You are providing hostname + path.
In your case I would recommend using a more generalized version of net.connect which accepts an object with parameters. So your code will look something like this:
client.connect({
    host: 'myurl.mine.com',
    port: 7001,
    path: '/my/webservice'
  },
  function () {
    console.log('Connected');
    client.write(msg);
  }
);

Bluemix NodeJS ENOTFOUND

I have a Bluemix Node.js application which communicates with a server. I have a test and a production environment. In the development environment we communicate with the test server, and I get a Node.js error.
When I change the server URL to the production server, everything is OK.
When I run the app on localhost and connect to the test server, everything is OK too.
So my problem occurs only in the Bluemix environment when communicating with my company's test server. The error is:
{
"code": "ENOTFOUND",
"errno": "ENOTFOUND",
"syscall": "getaddrinfo",
"hostname": "www.xxxxxxxxx.cz"
}
The hostname in the error is masked.
From the exception, I think the failing code is doing a DNS lookup. I wrote this sample code and found that the error is similar or the same.
var dns = require('dns');
dns.lookup('non-existent server', function(e, a) {
  console.log(e);
});
And the output is:
bash-4.1$ node h.js
{ [Error: getaddrinfo ENOTFOUND non-existent server]
code: 'ENOTFOUND',
errno: 'ENOTFOUND',
syscall: 'getaddrinfo',
hostname: 'non-existent server' }
bash-4.1$
Problem determination steps would be:
ping your target server from a machine which has outbound access, to make sure the server is present. If not, resolve that problem.
Log on to the Bluemix debug console.
ping your target server from there. If it does not respond, there is a wall between Bluemix and the target. If it responds, try this test case. If that too works, we will have to debug further; I can be of further help.
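If ping is not available in the Bluemix shell, a rough Node-only equivalent of the same check could look like this (a sketch; the port is a placeholder and the hostname is the masked one from the question):

const dns = require('dns');
const net = require('net');

const host = 'www.xxxxxxxxx.cz'; // the masked test-server hostname
const port = 443;                // placeholder port

// Step 1: does the name resolve at all? (this is the getaddrinfo call that fails)
dns.lookup(host, (err, address) => {
  if (err) return console.log('DNS lookup failed:', err.code);
  console.log('Resolved %s to %s', host, address);
  // Step 2: can a TCP connection be opened to it?
  const socket = net.connect({ host: address, port: port }, () => {
    console.log('TCP connection succeeded');
    socket.end();
  });
  socket.on('error', (e) => console.log('TCP connection failed:', e.code));
});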
The Bluemix debug console is obtained as follows:
Export an environment variable "BLUEMIX_APP_MGMT_ENABLE" with value "shell"
Restage the app.
Log in to the web shell in the browser at https://your-app-url/bluemix-debug/shell/ using your Bluemix user credentials.
Hope this helps.

REST call from Node fails

I am building a Node.js/Express app which makes remote calls to another internal web application (ASP.NET Web API) to consume JSON from it. We are in a corporate network. Here is the strange issue.
On my Mac OS X (Mavericks) machine, I can curl from the shell and get JSON from http://our_internal_host:9991/connections. I can also type this URL and see the JSON response in the browser.
When I run this Express app locally and request the route which makes the remote call, I see this error on the console. The route handler logs the message below and the browser hangs.
{ [Error: connect ECONNREFUSED]
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect' }
The Node process cannot make a connection to that address. I also have a Windows 7 machine at work, and when I run the same Node app there, I don't have any issues.
I am not sure how to troubleshoot the issue...
PS: A colleague of mine who has the same setup doesn't have this issue. We compared DNS configs, but our setups look to be the same.
Any pointers to troubleshoot this issue are much appreciated. I know this is an environment-specific issue, but I'm not sure where to start.
Thanks
EDIT #1
The route handler making the remote call, which logs the error above:
var http = require('http');

var options = {
  host: 'our_internal_host',
  port: 9991,
  path: '/analytics'
};

http.get(options, function(res) {
  // removed response code
  console.log(res);
}).on('error', function(e) {
  console.log(e);
});
It turns out my Mac's IP was on a different subnet than other machines.
