NodeJS APN module cannot find certificate file on Ubuntu - node.js

I am using the apn module (https://github.com/argon/node-apn) to send push notifications to iPhones from NodeJS.
My code works fine on my development machine (Mac OSX) and is successfully pushing notifications through the Apple sandbox gateway (gateway.sandbox.push.apple.com), but when I move it to the staging server (which is running Ubuntu) pushing notifications fails with the message:
Error: ENOENT, no such file or directory 'apns-dev-cert.pem'
I am setting up the NodeJS apn object as such:
var options = {
  cert: "apns-dev-cert.pem",
  key: "apns-key.pem",
  passphrase: null,
  gateway: "gateway.sandbox.push.apple.com",
  port: 2195,
  enhanced: true,
  errorCallback: undefined,
  cacheLength: 5
};
On my development Mac OSX machine, the cert is installed in the Keychain. From my limited understanding of Ubuntu, the equivalent would be to copy the cert file to /etc/ssl/certs. I tried doing this, and also changing the path to "/etc/ssl/certs/apn-dev-cert.pem" in my NodeJS code, but the same error message shows up.
Any ideas?

I struggled a lot with this issue until I realized how the fs module resolves paths: relative paths are resolved against the current working directory of the node process, not against the script's location. So the paths to your .pem files should be relative to wherever you start node.
You might also want to check out __dirname, which can make it easier to specify your paths.

Try setting it up as below, using the pfx option instead:
var options = {
  //cert: "apns-dev-cert.pem",
  //key: "apns-key.pem",
  pfx: '<path>/apns-key.pem',
  passphrase: null,
  gateway: "gateway.sandbox.push.apple.com",
  port: 2195,
  enhanced: true,
  errorCallback: undefined,
  cacheLength: 5
};

Related

Server runs locally but crashes on Heroku

I deployed my server on Heroku but when I make any requests it returns a "500 Internal Server" error. It runs fine locally though. Could anyone help figure out what's going on?
When I check my logs this is what I'm getting.
2021-06-08T18:43:09.715406+00:00 app[web.1]: error: no pg_hba.conf entry for host "3.90.138.215", user "detmvsbueicsez", database "da9nlve42hcp91", SSL off
Repo Link: https://github.com/zcason/Restaurant-Review-Server
Live App: https://restaurant-review-phi.vercel.app/
As mentioned here on Heroku help, this indicates that there was a failed authentication attempt to the database, so the connection couldn't be established. This can happen for different reasons.
In your case I suspect it's something related to not using SSL.
After taking a look at the code provided in the GitHub repo, I noticed you are using knex and getting the connection string from .env.
Try this:
Append ?ssl=true to the end of DATABASE_URL in your .env file.
Edit your server.js (I didn't take a close look at the code, so you need to add ssl: { rejectUnauthorized: false } to your connection config):
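For example, the entry in .env would look something like this (host, credentials, and database name are placeholders, not your actual values):

```
DATABASE_URL=postgres://user:password@host.example.com:5432/dbname?ssl=true
```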
const db = knex({
  client: 'pg',
  connection: {
    connectionString: DATABASE_URL,
    ssl: { rejectUnauthorized: false }
  }
});
Also make sure you're using the right user, password, database name, etc.
OR alternatively:
Run heroku config:set PGSSLMODE=no-verify in the terminal to set PGSSLMODE to no-verify and omit the SSL configuration object.
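A small helper can express the two alternatives in one place. This is only a sketch, not from the original answer: it assumes the PGSSLMODE variable from the command above, and that with no-verify set, node-postgres relaxes certificate verification on its own, so the explicit ssl object can be dropped.

```javascript
// Sketch: pick the knex `ssl` setting based on PGSSLMODE.
// With PGSSLMODE=no-verify, node-postgres itself relaxes certificate
// verification, so the explicit ssl object can be omitted.
function sslConfig(pgSslMode) {
  if (pgSslMode === 'no-verify') {
    return undefined; // let node-postgres handle SSL via the env var
  }
  return { rejectUnauthorized: false };
}

const connection = {
  connectionString: process.env.DATABASE_URL,
  ssl: sslConfig(process.env.PGSSLMODE)
};
```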

Is there a way to expose debug lines during scp2 upload operation in nodejs?

I am using this package in a nodejs app: https://www.npmjs.com/package/scp2 and calling upload(src, dest) against a Unix server. It is not connecting to the server, and the only error I see is Exit code 255 while establishing SFTP session. I suspect the failure was caused by a change to the Unix server's OpenSSL package, but even after that change was rolled back, it is still not working as it was before.
Is there a way to display debug output for scp2.upload()? I am stepping through the scp2 package in vscode debug mode but not finding anything.
I added this in the scp2 configs: debug: console.log as such:
this.sshCreds = {
  host: this.host,
  username: this.username,
  password: this.password,
  debug: console.log,
};
this.sshClient = new SshClient(this.sshCreds);

How do I securely store a .pem file when working with git-tracked heroku project?

I've got a git-tracked repo and am setting it up to work with APN for iOS push notifications. I'm looking at implementing the npm module node-apn in a similar way as PushNotificationSample.
In this code, there is
var options = {
  gateway: 'gateway.sandbox.push.apple.com', // this URL is different for Apple's Production Servers and changes when you go to production
  errorCallback: callback,
  cert: 'your-cert.pem', // ** NEED TO SET TO YOURS - see this tutorial - http://www.raywenderlich.com/32960/apple-push-notification-services-in-ios-6-tutorial-part-1
  key: 'your-key.pem', // ** NEED TO SET TO YOURS
  passphrase: 'your-pw', // ** NEED TO SET TO YOURS
  port: 2195,
  enhanced: true,
  cacheLength: 100
}
However, how am I meant to reference my .pem files without committing them to Github?
At the moment, I'm deploying to Heroku.
Do this via Heroku's (environment) config variables.
If you're using node-apn or something similar, you should be able to pass in the certificate and key content directly instead of a path. Use environment variables to supply that content, as recommended by Heroku.
cert: process.env.APN_CERT,
key: process.env.APN_KEY,
passphrase: process.env.APN_PASSPHRASE,
Since you can't set multi-line values for app config in the web interface, you'll have to use the command line to set APN_CERT and APN_KEY:
$ heroku config:set APN_CERT="-----BEGIN CERTIFICATE-----
> MIIDOjCCAiICCQCZTWzQNz6sqTANBgkqhkiG9w0BAQsFADBfMQswCQYDVQQGEwJB
> VTETMBEGA1UECAwKU29tZS1TdGF0ZTEhMB8GA1UECgwYSW50ZXJuZXQgV2lkZ2l0
...
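In the app, the options can then be built from the environment. A minimal sketch (APN_CERT, APN_KEY, and APN_PASSPHRASE are the variable names assumed above):

```javascript
// Build node-apn options from environment variables so no .pem files
// need to be committed. The env vars hold the PEM *content*, not paths.
function apnOptionsFromEnv(env) {
  return {
    cert: env.APN_CERT,             // full certificate PEM text
    key: env.APN_KEY,               // full private-key PEM text
    passphrase: env.APN_PASSPHRASE,
    gateway: 'gateway.sandbox.push.apple.com',
    port: 2195
  };
}

var options = apnOptionsFromEnv(process.env);
```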

node application give proxy error when in production mode

I have a mean.js app which I've deployed to my production server.
It worked well when I had it in development mode, but since I switched to production mode I'm getting a 502 proxy error.
The same happens whether I run it with node server.js or pm2.
I'm running on linux/debian with apache2.
I'm a newbie in this environment; how do I find the problem?
It turned out to be an SSL issue.
In my production.js file I commented out the following:
module.exports = {
  //secure: {
  //  ssl: false,
  //  privateKey: './config/sslcerts/key.pem',
  //  certificate: './config/sslcerts/cert.pem'
  //},
  ...
This is of course a temporary solution until the application really goes live.

gulp-sftp with AWS

I am using gulp for my node.js project. I have an AWS Ubuntu server where I want to copy some files using gulp.
I am using the following code in gulp
const sftp = require('gulp-sftp');

gulp.task('deploy', () => {
  return gulp.src('deploy/bundle.zip')
    .pipe(sftp({
      host: 'ec2-x-x-x-x.us-x.compute.amazonaws.com',
      key: {
        location: '~/mykey.pem'
      }
    }));
});
However, I am getting the following error when I run gulp deploy:
[18:07:29] Using gulpfile ~/src/gulpfile.js
[18:07:29] Starting 'deploy'...
[18:07:29] Authenticating with private key.
[18:07:33] 'deploy' errored after 3.45 s
[18:07:33] Error in plugin 'gulp-sftp'
Message:
Authentication failure. Available authentication methods: publickey
Details:
level: authentication
partial: false
[18:07:33] gulp-sftp SFTP abrupt closure
[18:07:33] Connection :: close
I don't understand how to troubleshoot this further. Please guide.
Looks like you're missing user in your options. It should be root or ubuntu, depending on whether your instance is a generic Linux AMI or an Ubuntu AMI.
Also, for gulp-sftp, "Authentication failure. Available authentication methods: publickey" is a catch-all error that also appears when your key location is invalid (in my case, it was), so make sure your path is correct as well.
My code:
.pipe(sftp({
  host: 'serverurl.com',
  user: 'ubuntu',
  key: 'D:/path/to/key.pem'
}))