FeathersJS mount service at root - node.js

I am building a microservices architecture with FeathersJS, and I only need one service per application, so I want to mount that service at the root (/) of each app.
I have tried to do this by using / as the path when generating the (Mongoose) service and deleting the app.use('/', express.static(app.get('public'))); line from app.js. It works as expected when I access the base path (it lists all entries), but when I try /421jsadi23o1sj to find a single entry, it returns a 404.
I suspect the ID is being treated as a service path, which is then looked up and not found.
This is my businesses.service.js file:
const createService = require('feathers-mongoose');
const createModel = require('../../models/businesses.model');
const hooks = require('./businesses.hooks');

module.exports = function (app) {
  const Model = createModel(app);
  const paginate = app.get('paginate');

  const options = {
    name: 'businesses',
    Model,
    paginate
  };

  // Initialize our service with any options it requires
  app.use('/', createService(options));

  // Get our initialized service so that we can register hooks and filters
  const service = app.service('/');

  service.hooks(hooks);

  app.publish(() => {
    // Here you can add event publishers to channels set up in `channels.js`
    // To publish only for a specific event use `app.publish(eventname, () => {})`
    // e.g. to publish all service events to all authenticated users use
    // return app.channel('authenticated');
  });
};
Has anyone come across this issue? Any ideas about how it can be solved?
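For reference, these are the REST routes the Feathers transport normally registers for a service mounted at a named path; the expectation is that the same verbs would map to find/get/etc. when the mount point is /. (Shown here for a hypothetical /businesses mount, not the asker's root mount.)

// With app.use('/businesses', createService(options)) Feathers exposes:
// GET    /businesses       -> service.find(params)
// GET    /businesses/:id   -> service.get(id, params)
// POST   /businesses       -> service.create(data, params)
// PUT    /businesses/:id   -> service.update(id, data, params)
// PATCH  /businesses/:id   -> service.patch(id, data, params)
// DELETE /businesses/:id   -> service.remove(id, params)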

Related

Data goes null after a few successful requests - Node service app pool caching

I have a service built using Node and Express, with MongoDB as the database. The service is hosted on IIS.
The application has a side filter panel. Since the filters' master information does not change often (the data size is in KBs), I use a basic Node caching technique (no npm package) to avoid hitting the database on each page-load request. Below is the sample Node code:
// main index.js file
// SetFiltersList() is called when the Node service is first initialized on IIS,
// or when the app pool recycles.
(async () => {
  await init.SetFiltersList();
})();

// init.js (utility file)
let filtersList = null; // keeps the list of filters as a cached object

const SetFiltersList = async () => {
  // This is a MongoDB database call
  const result = await defaultState.DEFAULT_STATE.GET("FiltersList");
  filtersList = result.filters;
};

// Get filters call
const getFiltersList = () => filtersList;

module.exports = {
  SetFiltersList, // exported so index.js can call it at startup
  FiltersList: getFiltersList
};

// Controller.js
const GETFILTERLIST = async (req, res, next) => {
  res.send(init.FiltersList());
};

// Controller route
approuter.route('/GetFilterList/')
  .get(Controller.GETFILTERLIST);
Problem
After a few calls, the filters start returning null. Strangely, when I recycle the application pool, the filters come back for some time, and then the cycle repeats.
Any thoughts on what's going wrong here and how I can overcome it?
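One common hardening for this pattern, sketched below under the assumption that the MongoDB call can intermittently fail or return an empty result and clobber the cache: only overwrite the cached value on success, and lazily re-fetch when the cache is empty. The names match the question's code.

// init.js (defensive variant)
let filtersList = null;

const SetFiltersList = async () => {
  try {
    const result = await defaultState.DEFAULT_STATE.GET("FiltersList");
    // Only replace the cache when we actually got data back.
    if (result && result.filters) {
      filtersList = result.filters;
    }
  } catch (err) {
    // Keep the stale cache instead of clobbering it on a failed call.
    console.error("SetFiltersList failed, keeping cached value", err);
  }
};

const getFiltersList = async () => {
  // Lazily re-populate if the cache was never set (or was lost).
  if (filtersList === null) {
    await SetFiltersList();
  }
  return filtersList;
};

module.exports = { SetFiltersList, FiltersList: getFiltersList };

Note that FiltersList() is now async, so the controller would need res.send(await init.FiltersList()). Also bear in mind that if IIS is configured to run more than one node process per application, each process holds its own copy of this in-memory cache.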

Can Open Telemetry Instrument Two Express Services in the Same Node Program?

Let's say I have a NodeJS program that has two separate instances of an express server running.
const express = require('express')

const app1 = express()
app1.listen(3000, () => { /* ... */ })

// ...

const app2 = express()
app2.listen(3001, () => { /* ... */ })
I've been able to instrument a program like this via OpenTelemetry and have my spans sent/exported successfully to Zipkin. All I needed to do was add code like the following to the start of my program.
const { NodeTracerProvider } = require('@opentelemetry/node');
const { SimpleSpanProcessor } = require('@opentelemetry/tracing');
const { ZipkinExporter } = require('@opentelemetry/exporter-zipkin');

const provider = new NodeTracerProvider({
  plugins: {
    express: {
      enabled: true,
    },
    http: {
      requestHook: (span, request) => {
        span.setAttribute("custom request hook attribute", "request");
      },
    },
  }
});

const options = {
  url: 'http://localhost:9411/api/v2/spans',
  serviceName: 'service-main'
};

const zipkinExporter = new ZipkinExporter(options);
provider.addSpanProcessor(new SimpleSpanProcessor(zipkinExporter));
provider.register();
and make sure that the express and http plugins are installed:
npm install @opentelemetry/plugin-http @opentelemetry/plugin-express
This all works great, except for one thing: OpenTelemetry sees both of my express services as the same service-main service.
When I instrumented these services directly with Zipkin, I would add the Zipkin middleware to each running express server:
app1.use(zipkinMiddleware({tracer: tracer1}));
app2.use(zipkinMiddleware({tracer: tracer2}));
Each tracer could be instantiated with its own service name, which allowed each service to have its individual name and show up as a different service in Zipkin.
(/main, /hello, and /goodbye are each served by a different express service above.)
Is this sort of thing (instrumenting two services in one program) possible with OpenTelemetry? Or would I need to split these two services into separate programs in order for each service to have an individual name? This question is less about solving a particular problem and more about understanding the semantics of OpenTelemetry.
It is possible to create two separate tracer providers. Only one of them will be the global tracer provider, which the API will use if you call API methods. You can't use the plugins in this configuration, which means you will have to manually instrument your application. If this is a use case that is important to you, I suggest you create an issue on the GitHub repo.
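A minimal sketch of that approach, assuming the same 0.x-era @opentelemetry packages as the question (the service-one/service-two names and the /hello route are illustrative): each provider gets its own exporter and service name, only one is registered globally, and spans for the second service are created manually.

const express = require('express');
const { NodeTracerProvider } = require('@opentelemetry/node');
const { SimpleSpanProcessor } = require('@opentelemetry/tracing');
const { ZipkinExporter } = require('@opentelemetry/exporter-zipkin');

// One provider per logical service, each exporting under its own name.
const provider1 = new NodeTracerProvider();
provider1.addSpanProcessor(new SimpleSpanProcessor(
  new ZipkinExporter({ url: 'http://localhost:9411/api/v2/spans', serviceName: 'service-one' })
));
provider1.register(); // only this provider is global; plugins report to it

const provider2 = new NodeTracerProvider();
provider2.addSpanProcessor(new SimpleSpanProcessor(
  new ZipkinExporter({ url: 'http://localhost:9411/api/v2/spans', serviceName: 'service-two' })
));
const tracer2 = provider2.getTracer('app2'); // used directly, never registered

const app2 = express();

// Manual span for the second service, since automatic instrumentation
// only feeds the global provider:
app2.get('/hello', (req, res) => {
  const span = tracer2.startSpan('GET /hello');
  res.send('hello');
  span.end();
});

app2.listen(3001);

Spans created through the global API (or by the plugins) will all land under service-one; only spans started explicitly from tracer2 show up as service-two.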
Yes, you can have multiple express apps running in the same node process (that's also how clustering works in node), but you will need to have them running on different ports:
const express = require('express')

const app1 = express()
app1.listen(3001, () => { /* ... */ })

// ...

const app2 = express()
app2.listen(3002, () => { /* ... */ })

Get my Action’s server URL in (JavaScript) fulfilment code

I am using actionssdk, and I build my Action fulfilments in JavaScript with Node.js + Express.
I am looking for a way to get the URL (protocol + host name + port) of the server where the fulfilment is hosted.
Is there a simple way to do this, e.g. in the MAIN intent? Is there some conv property I can use? Can I get hold of a req parameter in the MAIN intent, from which I can deduce the hostname etc.?
const express = require('express');
const expressApp = express();
const { actionssdk, ... } = require('actions-on-google');

const app = actionssdk({
  ordersv3: true,
  clientId: ...
});

expressApp.post('/fulfilment', app);

app.intent('actions.intent.MAIN', (conv) => {
  let myUrl = ...; // ???????
  ...
});
(Background: obviously I know where I deployed my fulfilment code. But I have a reusable template for fulfilment code in which I want to refer to the host URL, and I do not want to type that in manually each time I develop a new Action.)
You can get access to the request object in a middleware via Framework Metadata, which is by default of type BuiltinFrameworkMetadata and contains the objects used by Express.
For example, you can use it like this, which will be run before each request:
app.middleware((conv, framework) => {
  console.log(framework.express.request.headers.host)
})
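Building on that, a sketch of how the full URL could be assembled and reused in the MAIN intent. It assumes framework.express.request is a standard Express request: req.protocol only reflects X-Forwarded-Proto if 'trust proxy' is enabled, and headers.host already includes a non-default port.

let serverUrl; // cached per process on the first request

app.middleware((conv, framework) => {
  const req = framework.express.request;
  serverUrl = `${req.protocol}://${req.headers.host}`;
});

app.intent('actions.intent.MAIN', (conv) => {
  // serverUrl is set by the middleware before the intent handler runs
  conv.ask(`This fulfilment is served from ${serverUrl}`);
});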

How to properly use dataloaders across multiple users?

In Caching per Request, the following example is given that shows how to use dataloaders in express.
var DataLoader = require('dataloader')
var express = require('express')

function createLoaders(authToken) {
  return {
    users: new DataLoader(ids => genUsers(authToken, ids)),
  }
}

var app = express()

app.get('/', function (req, res) {
  var authToken = authenticateUser(req)
  var loaders = createLoaders(authToken)
  res.send(renderPage(req, loaders))
})

app.listen()
I'm confused about passing authToken to the genUsers batch function. How should a batch function be composed so that it uses authToken and returns the results corresponding to each user?
What the example is saying is that genUsers should use the credentials of the current request's user (identified by their auth token) to ensure they can only fetch data that they're allowed to see. Essentially, the loader gets initialised at the start of the request, discarded at the end, and never recycled between requests.
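A sketch of what such a batch function might look like, assuming a hypothetical REST endpoint (api.example.com) that accepts the user's token. The essential DataLoader contract is returning exactly one result per requested id, in the same order as the ids array:

var fetch = require('node-fetch') // or any HTTP client

// Hypothetical endpoint: returns only the users this authToken may see.
async function genUsers(authToken, ids) {
  const response = await fetch(
    `https://api.example.com/users?ids=${ids.join(',')}`,
    { headers: { Authorization: `Bearer ${authToken}` } }
  )
  const users = await response.json()

  // DataLoader requires one result per id, in the order the ids were given.
  const byId = new Map(users.map(user => [user.id, user]))
  return ids.map(id => byId.get(id) || new Error(`No user ${id}`))
}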

NodeJS Express Dependency Injection and Database Connections

Coming from a non-Node background, my first instinct is to define my service like so:
MyService.js
module.exports = function (dbConnection) {
  // service uses the db
}
Now, I want one open db connection per request, so I define in middleware:
res.locals.db = openDbConnection();
And in some consuming Express API code:
api.js
var MyService = require('./services/MyService')
...
router.get('/foo/:id?', function (req, res) {
  var service = new MyService(res.locals.db);
});
Now, given that Node's preferred method of dependency injection is via the require(...) statement, it seems that I shouldn't be using the constructor of MyService to inject the db.
So let's say I want to have
var db = require('db');
at the top of MyService and then use it somehow, like db.current... But how would I tie the db to the current res.locals object, now that db is a module itself? What's a recommended way of handling this kind of thing in Node?
Updated Answer: 05/02/15
If you want to attach a DB connection to each request object and then use that connection in your service, the connection will have to be passed to MyService somehow. The example below shows one way of doing that. If we try to use db.current or something to that effect, we'll be storing state in our DB module; in my experience, that will lead to trouble.
Alternatively, I lay out the approach I've used (and still use) in this previous answer. What it means for this example is the following:
// api.js
var MyService = require('./services/MyService')
...
router.get('/foo/:id?', function (req, res) {
  MyService.performTask(req.params.id);
});
// MyService.js
var db = require('db');

module.exports = {
  performTask: function (id) {
    var connection = db.getOpenConnection();
    // Do whatever you want with the connection.
  }
};
With this approach, we've decoupled the DB module from the api/app/router modules and only the module that actually uses it will know it exists.
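The answer doesn't show the db module itself. A sketch of what getOpenConnection might sit on top of, using the mongodb driver in the 2015-era callback style of the answer (the connect/getOpenConnection names are illustrative):

// db.js
var MongoClient = require('mongodb').MongoClient;
var connection = null;

module.exports = {
  // Called once at application startup.
  connect: function (url, callback) {
    MongoClient.connect(url, function (err, db) {
      if (!err) connection = db;
      callback(err);
    });
  },
  // Hands out the single already-open connection; the driver pools internally.
  getOpenConnection: function () {
    return connection;
  }
};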
Previous Answer: 05/01/15
What you're talking about could be done using an express middleware. Here's what it might look like:
var db = require('db');

// Attach a DB connection to each request coming in
router.use(function (req, res, next) {
  res.locals.db = db.getOpenConnection();
  next();
});

// Later on..
router.get('/foo/:id?', function (req, res) {
  // We should now have something attached to res.locals.db!
  var service = new MyService(res.locals.db);
});
I personally have never seen something like new MyService in express applications before. That doesn't mean it can't be done, but you might consider an approach like this:
// router.js
var MyService = require('MyService');

router.get('/foo/:id?', function (req, res) {
  MyService.foo(res.locals.db);
});

// MyService.js
module.exports.foo = function (connection) {
  // I have a connection!
};
