How to call Hapi plugin function from another - node.js

In Hapi (v17, if it makes any difference), what is the correct way to call a function in one plugin from another?
Let's say I've started writing a wrapper plugin around Nodemailer:
'use strict';

const Nodemailer = require('nodemailer');

exports.plugin = {
    name: 'mailerWrapperPlugin',
    version: '0.0.1',
    register: async function (server, options) {
    }
};
What would be the correct way to expose plugin functions elsewhere in Hapi (i.e. to the Hapi instance itself, but perhaps more importantly, to other plugins loaded by Hapi)?
I'm finding the Hapi documentation a bit sparse, especially in relation to plugins.
So, for example, if my Nodemailer wrapper had a sendMail() function, how would I make that available in another plugin I've written?
P.S. I'm a bit of a Node.js/Hapi newbie, so treat me gently! I'm testing this out because I'm thinking of migrating from PHP to Hapi for future backend applications.

You can use the server.methods object. The docs say:
Server methods are functions registered with the server and used
throughout the application as a common utility. Their advantage is in
the ability to configure them to use the built-in cache and share
across multiple request handlers without having to create a common
module.
Now this is your first plugin:
const Nodemailer = require('nodemailer');

exports.plugin = {
    name: 'mailerWrapperPlugin',
    version: '0.0.1',
    register: async function (server, options) {
        server.method('sendMail', (subject, to, body) => {
            // compose and send mail here
        });
    }
};
and this is your second plugin; make sure it is registered after the first one:
exports.plugin = {
    name: 'anotherPlugin',
    version: '0.0.1',
    register: async function (server, options) {
        server.methods.sendMail("Hello", "test@test.com", "Sup?");
    }
};
That's it.
You can also use server.decorate. Its doc says:
Extends various framework interfaces with custom methods where:
server.decorate can add methods to several other objects, such as the server, request, response, and route objects.
If we go back to your plugin:
exports.plugin = {
    name: 'mailerWrapperPlugin',
    version: '0.0.1',
    register: async function (server, options) {
        server.decorate('server', 'sendMail', (subject, to, body) => {
            // compose and send mail here
        });
    }
};
and your second plugin, again registered after the first one:
exports.plugin = {
    name: 'anotherPlugin',
    version: '0.0.1',
    register: async function (server, options) {
        server.sendMail("Hello", "test@test.com", "Sup?");
    }
};
The difference between them: server.method adds functions under the server.methods object, while server.decorate extends the server or request objects directly. Use whichever is clearer to you; I generally prefer server.decorate.

Related

Is it possible to create two identical routes with a different param name using Fastify?

In my application, I would like to have two routes that match the following paths
fastify.get('/thing/:id', async () => {})
fastify.get('/thing/:name', async () => {})
I tried giving the ':id' path a regex, but Fastify still recognized them as duplicate routes. Is it possible to accomplish this aside from running a regex in the handler to determine if it is an id or a name?
EDIT: I failed to mention that I am familiar with the Fastify documentation. I was hoping to discover something that is not currently documented.
A nice alternative would be if it were possible to declare alternate params at the same path level and have the schema determine which param applies. For example:
'/thing/:id|:name', with the params schema as the determining factor, or separate regexes in the route, like '/thing/:id()|:name()'.
Anyway, it would be cool.
That is not possible; each route has to differ from the others somehow. But what you can do is route prefixing.
Like this:
// server.js
const fastify = require('fastify')()

fastify.register(require('./routes/v1/things'), { prefix: '/v1' })
fastify.register(require('./routes/v2/things'), { prefix: '/v2' })

fastify.listen({ port: 3000 })

// routes/v1/things.js
module.exports = function (fastify, opts, done) {
    fastify.get('/thing/:id', handler_v1)
    done()
}

// routes/v2/things.js
module.exports = function (fastify, opts, done) {
    fastify.get('/thing/:name', handler_v2)
    done()
}
Fastify will not complain that you are using the same path for two different routes, because at compile time it handles the prefix automatically (this also means performance is not affected at all!).
Now your clients will have access to the following routes:
/v1/thing/:id
/v2/thing/:name
You can do this as many times as you want; it also works for nested registers, and route parameters are supported as well.
You can read more about fastify routes here.

How do I add parameters to long-form return requests in module.exports routes?

I'm coding for an API connection area, that's predominately graphql but needs to have some REST connections for certain things, and have equivalent to the following code:
foo.js
module.exports = {
    routes: () => {
        return [
            {
                method: 'GET',
                path: '/existing_endpoint',
                handler: module.exports.existing_endpoint
            },
            {
                method: 'POST',
                path: '/new_endpoint',
                handler: module.exports.new_endpoint // <--- this not passing variables
            }
        ]
    },
    existing_endpoint: async () => {
        /* endpoint that isn't the concern of this */
    },
    new_endpoint: async (req, res) => {
        console.log({req, res})
        return 1
    }
}
The existing GET endpoint works fine, but my POST endpoint always errors out, logging {} where {req, res} should have been passed in by the router; I suspect the POST isn't receiving them. I've tried changing the POST declaration in the routes to module.exports.new_endpoint(req, res), but it tells me the variables aren't found. The lead-in server.js does include the file (it looks more like the code below), and doing something similar in server.js gives similar results, implying that's probably wrong too. Also, we have a really strict eslint setup, so I can't really change the format of the call.
Every example I've seen online using these libraries is some short form, or includes the function in the routes call, and isn't some long form like this. How do I do a POST in this format?
/* hapi, environment variables, apollo server, log engine, etc. */
/* preceding library inclusions */
const foo = require('./routes/foo')
const other_route = require('./routes/other_route')

const startServer = async () => {
    const server = Hapi.server({port, host})
    server.route(other_route.routes())
    server.route(foo.routes())
}
This is a bug with Hapi in node v16. I just opened an issue.
Your current solutions are either:
Upgrade to Hapi v20
Use n or another method to downgrade to node v14.16 for this project. I can confirm that POST requests do not hang in this version.

How to integrate OIDC Provider in Node jS

I tried to integrate oidc-provider into Node.js using the sample code below. When I run it, it throws an error: unrecognized route or not allowed method (GET on /api/v1/.well-known/openid-configuration). With the issuer set to https://localhost:3000 it works fine, but when I change the issuer to https://localhost:3000/api/v1/ it does not. How do I fix this? I'm also facing another issue when implementing oidc-provider in Node.js: the routes get overridden. How do I fix that?
Sample.js
const { Provider } = require('oidc-provider');

const configuration = {
    // ... see available options /docs
    clients: [{
        client_id: 'foo',
        client_secret: 'bar',
        redirect_uris: ['http://localhost:3000/api/v1/'],
        true_provider: "pcc"
        // + other client properties
    }],
};

const oidc = new Provider('http://localhost:3000/api/v1/', configuration);

// express/nodejs style application callback (req, res, next) for use with express apps, see /examples/express.js
oidc.callback()

// or just expose a server standalone, see /examples/standalone.js
const server = oidc.listen(3000, () => {
    console.log('oidc-provider listening on port 3000, check http://localhost:3000/api/v1/.well-known/openid-configuration');
});
Error
Defining an Issuer Identifier with a path component does not affect anything route-wise.
You have two options: either mount the provider to a path (see the docs), or define the actual paths you want each endpoint to be prefixed with (see the docs).
I think you're looking for a way to mount, so the first one.
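To illustrate the mount option, here is a minimal sketch assuming Express is used as the host app (the configuration object is the one from Sample.js): the issuer keeps its path component, and the provider's callback is mounted under the same prefix so its routes line up with the issuer.

```javascript
const express = require('express');
const { Provider } = require('oidc-provider');

const configuration = { /* same configuration as in Sample.js */ };
const app = express();

// the issuer keeps the /api/v1 path component...
const oidc = new Provider('http://localhost:3000/api/v1', configuration);

// ...and the provider is mounted under the matching prefix, so
// GET /api/v1/.well-known/openid-configuration reaches the provider
app.use('/api/v1', oidc.callback());

app.listen(3000);
```

With this in place the discovery document is served from /api/v1/.well-known/openid-configuration instead of the root.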

Is declaring a Node.js redis client as a const in multiple helpers a safe way to use it?

This is a little hard articulate so I hope my title isn't too terrible.
I have a frontend/backend React/Node.js (REST API) web app that I want to add Redis support to, for storing and retrieving app-global settings and per-user settings (like language preference, last login, etc.; simple stuff). So I was considering adding a /settings branch to my backend REST API to push/pull this information from a Redis instance.
This is where my Node.js inexperience comes through. I'm looking at using the ioredis client and it seems too easy. If I have a couple of helpers (more than one .js file that will call upon Redis), is constructing the client as a const in each safe to do? Or is reusing a single instance the recommended approach?
Here's a sample of what I'm thinking of doing. Imagine I had 3 helper modules that require access to the Redis client. Should I declare a client as a const in each? Or centralize it in a single helper module and get the client from there? Is there a disadvantage to doing either?
const config = require('config.json');
const redis_url = config.redis_url;

// redis setup
const Redis = require('ioredis');
const redis = new Redis(redis_url);

module.exports = {
    test
};

async function test(id) {
    redis.get(id, function (err, result) {
        if (err) {
            console.error(err);
            throw(err);
        } else {
            return result;
        }
    });
}
Thank you.
If no redis conflicts...
If the different "helper" modules you are referring to have no conflicts when interacting with redis, such as overwriting / using the same redis keys, then I can't see any reason not to use the same redis instance (as outlined by garlicman) and export this to the different modules in which it is used.
Otherwise use separate redis databases...
If you do require separate redis database connections, redis ships with 16 databases, so you can specify which to connect to when creating a new instance - see below:
const redis = new Redis({ // set up config for connection to redis
    port: 6379,         // Redis port
    host: '127.0.0.1',  // Redis host
    family: 4,          // 4 (IPv4) or 6 (IPv6)
    db: 10,             // Redis database to connect to
});
Normally what I would do (in Java, say) is implement an explicit class with singleton access to hold the connection and any connection error/reconnect handling.
All modules in Node.js are already singletons, I believe, but what I will probably go with is a client class to hold it and my own access-related methods. Something like:
const config = require('config.json');
const Redis = require('ioredis');

// named RedisClient so it doesn't shadow the ioredis Redis class
class RedisClient {
    constructor() {
        this.client = new Redis(config.redis_url);
    }

    get(key) {
        return this.client.get(key);
    }

    set(key, value, ttl) {
        let rp;
        if (ttl === 0) {
            rp = this.client.set(key, value);
        } else {
            rp = this.client.set(key, value)
                .then((res) => {
                    // arrow function keeps `this` bound to the instance
                    this.client.expire(key, ttl);
                });
        }
        return rp;
    }
}

module.exports = new RedisClient();
I'll probably include a data_init() method to check and preload an initial key/value structure on first connect.

Examples and documentation for couchnode

I am trying to integrate Couchbase into my Node.js application with the couchnode module. It seems to lack documentation: I see a lot of methods with parameters in the source code, but I can't find much information about how they work. Could you please share some code examples? Or should I read the documentation for these methods from other languages' SDKs, as there's a chance they are the same?
To make development easier, I wrote a little helper (lib/couchbase.js):
var cb = require('couchbase'),
    config;

if (process.env.NODE_ENV === 'production') {
    config = require('../lib/config');
} else {
    config = require('../lib/localconfig');
}

module.exports = function(bucket, callback) {
    config.couchbase.bucket = bucket;
    cb.connect(config.couchbase, callback);
};
Here's some example code for a view and async/each get operation. Instead of 'default' you can use different buckets.
var async = require('async'); // needed for async.each below
var couchbase = require('../lib/couchbase');

couchbase('default', function(error, cb) {
    cb.view('doc', 'view', {
        stale: false
    }, function(error, docs) {
        async.each(docs, function(doc, fn) {
            cb.get(doc.id, function(error, info) {
                // do something
                fn();
            });
        }, function(errors) {
            // do something
        });
    });
});
I adapted an AngularJS and Node.js web application that another developer wrote for querying and editing Microsoft Azure DocumentDB documents to let it work with Couchbase:
https://github.com/rrutt/cb-bread
Here is the specific Node.js module that performs all the calls to the Couchbase Node SDK version 2.0.x:
https://github.com/rrutt/cb-bread/blob/dev/api/lib/couchbaseWrapper.js
Hopefully this provides some help in understanding how to configure arguments for many of the Couchbase API methods.
