How do you generate typings for protobuf files for use with GRPC? - node.js

I am trying to use gRPC with TypeScript, and I want to make sure I have all the types set (rather than just adding my own mapping or using any).
I've gotten as far as the code below, with the problems I am experiencing noted in the comments.
import { Server, loadPackageDefinition } from "grpc";
import { loadSync } from "@grpc/proto-loader";
const packageDefinition = loadSync(__dirname + "/protos/artifact.proto");
const artifacts = loadPackageDefinition(packageDefinition).artifacts;
// what are the types here?
function SignedUrlPutObject(call, callback) {
}
const server = new Server();
// There's no ArtifactUpload defined in artifacts
server.addService(artifacts.ArtifactUpload.service, { SignedUrlPutObject })
Another approach I tried was to use pbjs and pbts.
"protobuf": "pbjs -t static-module -w commonjs -o protos.js protos/artifact-upload.proto && pbts -o protos.d.ts protos.js",
This generated the typings file, but I can't get it to work with grpc. Here's a GitHub issue I found that may be related: https://github.com/protobufjs/protobuf.js/issues/1381

There are 3 main tools you can use:
ts-protoc-gen
@grpc/proto-loader
grpc_tools_node_protoc_ts
I recommend using proto-loader:
npm i @grpc/proto-loader
You can then generate the types like so:
./node_modules/.bin/proto-loader-gen-types --longs=String --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --outDir=proto/ proto/*.proto
Here's the proto file I use for this example:
syntax = "proto3";
package example_package;
message ServerMessage {
string server_message = 1;
}
message ClientMessage {
string client_message = 1;
}
service Example {
rpc unaryCall(ClientMessage) returns (ServerMessage) {}
rpc serverStreamingCall(ClientMessage) returns (stream ServerMessage) {}
rpc clientStreamingCall(stream ClientMessage) returns (ServerMessage) {}
rpc bidirectionalStreamingCall(stream ClientMessage) returns (stream ServerMessage) {}
}
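For orientation, the output of the generation command for this proto file is implied by the imports in the next snippet; it looks roughly like this (exact file layout may vary by proto-loader version):
proto/
├── example.ts              (contains ProtoGrpcType)
└── example_package/
    ├── ClientMessage.ts
    ├── Example.ts
    └── ServerMessage.ts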
Once the types are generated, you can consume them like so:
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import { ProtoGrpcType } from './proto/example';
import { ClientMessage } from './proto/example_package/ClientMessage';
import { ExampleHandlers } from './proto/example_package/Example';
import { ServerMessage } from './proto/example_package/ServerMessage';
const host = '0.0.0.0:9090';
const exampleServer: ExampleHandlers = {
unaryCall(
call: grpc.ServerUnaryCall<ClientMessage, ServerMessage>,
callback: grpc.sendUnaryData<ServerMessage>
) {
if (call.request) {
console.log(`(server) Got client message: ${call.request.clientMessage}`);
}
callback(null, {
serverMessage: 'Message from server',
});
},
serverStreamingCall(
call: grpc.ServerWritableStream<ClientMessage, ServerMessage>
) {
call.write({
serverMessage: 'Message from server',
});
},
clientStreamingCall(
call: grpc.ServerReadableStream<ClientMessage, ServerMessage>
) {
call.on('data', (clientMessage: ClientMessage) => {
console.log(
`(server) Got client message: ${clientMessage.clientMessage}`
);
});
},
bidirectionalStreamingCall(
call: grpc.ServerDuplexStream<ClientMessage, ServerMessage>
) {
call.write({
serverMessage: 'Message from server',
});
call.on('data', (clientMessage: ClientMessage) => {
console.log(
`(server) Got client message: ${clientMessage.clientMessage}`
);
});
},
};
function getServer(): grpc.Server {
const packageDefinition = protoLoader.loadSync('./proto/example.proto');
const proto = (grpc.loadPackageDefinition(
packageDefinition
) as unknown) as ProtoGrpcType;
const server = new grpc.Server();
server.addService(proto.example_package.Example.service, exampleServer);
return server;
}
if (require.main === module) {
const server = getServer();
server.bindAsync(
host,
grpc.ServerCredentials.createInsecure(),
(err: Error | null, port: number) => {
if (err) {
console.error(`Server error: ${err.message}`);
} else {
console.log(`Server bound on port: ${port}`);
server.start();
}
}
);
}
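For completeness, a minimal client sketch against the same generated types could look like the following. This is an assumption-laden example: it reuses the ProtoGrpcType, ClientMessage and ServerMessage types generated above and the unaryCall method of the Example service; adjust the paths and host to your setup.
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import { ProtoGrpcType } from './proto/example';
import { ServerMessage } from './proto/example_package/ServerMessage';

const packageDefinition = protoLoader.loadSync('./proto/example.proto');
const proto = grpc.loadPackageDefinition(
  packageDefinition
) as unknown as ProtoGrpcType;

// Create a client for the Example service defined in the proto file
const client = new proto.example_package.Example(
  'localhost:9090',
  grpc.credentials.createInsecure()
);

// Make a unary call and log the server's reply
client.unaryCall(
  { clientMessage: 'Message from client' },
  (err: grpc.ServiceError | null, serverMessage?: ServerMessage) => {
    if (err) {
      console.error(`Client error: ${err.message}`);
      return;
    }
    console.log(`(client) Got server message: ${serverMessage?.serverMessage}`);
  }
);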
I've created various examples of how to use gRPC with TypeScript here: https://github.com/badsyntax/grpc-js-typescript

Related

How to make kuzzle-device-manager plugin API actions work?

I successfully installed and loaded kuzzle-device-manager in the backend file:
import { Backend } from 'kuzzle';
import { DeviceManagerPlugin } from 'kuzzle-device-manager';
const app = new Backend('playground');
console.log(app.config);
const deviceManager = new DeviceManagerPlugin();
const mappings = {
updatedAt: { type: 'date' },
payloadUuid: { type: 'keyword' },
value: { type: 'float' }
}
deviceManager.devices.registerMeasure('humidity', mappings)
app.plugin.use(deviceManager)
app.start()
.then(async () => {
// Interact with Kuzzle API to create a new index if it does not already exist
console.log(' started!');
})
.catch(console.error);
But when I try to use controllers from that plugin, for example device-manager/device with the create action, I get an error output.
Here is my "client" code in js:
const { Kuzzle, WebSocket } = require("kuzzle-sdk")
const kuzzle = new Kuzzle(
new WebSocket('KUZZLE_IP')
)
kuzzle.on('networkError', error => {
console.error('Network Error: ', error);
})
const run = async () => {
try {
// Connects to the Kuzzle server
await kuzzle.connect();
// Creates an index
const result = await kuzzle.query({
index: "nyc-open-data",
controller: "device-manager/device",
action: "create",
body: {
model: "model-1234",
reference: "reference-1234"
}
}, {
queuable: false
})
console.log(result)
} catch (error) {
console.error(error.message);
} finally {
kuzzle.disconnect();
}
};
run();
And the result log:
API action "device-manager/device":"create" not found
Note: The nyc-open-data index exists and is empty.
We apologize for this mistake in the documentation; the device-manager/device:create method is not available because the plugin is using auto-provisioning until v2.
You should send a payload to your decoder, and the plugin will automatically provision the device if it does not exist: https://docs.kuzzle.io/official-plugins/device-manager/1/guides/decoders/#receive-payloads
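As an illustration only, sending a payload from the SDK follows the same kuzzle.query pattern as in the question. The controller and action names below ("device-manager/payload" and "my-decoder") and the payload fields are hypothetical placeholders; the actual action is whatever your registered decoder exposes, as described in the linked guide.
// Hypothetical example: replace controller/action/body with what your decoder expects
const payloadResult = await kuzzle.query({
  controller: 'device-manager/payload',
  action: 'my-decoder',
  body: {
    reference: 'reference-1234', // hypothetical payload fields
    humidity: 42.2
  }
});
console.log(payloadResult);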

Jest Test suite failed to run - AssertionError [ERR_ASSERTION]: opts.entryPoint required

We had a test running successfully for our code that uses the notifications-node-client to send emails, e.g.
import { NotifyClient } from 'notifications-node-client';
import notify from './notify';
import config from '../../../config';
jest.mock('fs');
jest.mock('notifications-node-client');
describe('notify', () => {
const EMAIL = 'user@test.com';
const REGION_GB = 'gb';
const REGION_NI = 'ni';
const feedbackOptions = {
personalisation: {
satisfactionLevel: 'satisfied',
improvements: 'improvements',
},
};
test('for original GB submission sendEmail should be called and true returned', async () => {
NotifyClient.prototype.sendEmail.mockReturnValue({
body: {
id: 1,
},
});
expect(await notify(EMAIL, 'submissionSuccess', REGION_GB)).toBeTruthy();
expect(NotifyClient.prototype.sendEmail).toHaveBeenCalledWith(
config.notify.templates.submissionSuccess.gb,
EMAIL,
undefined,
);
});
...
Having swapped out our Winston logging solution for @dwp/node-logger, the 'notify' tests no longer run, failing with:
● Test suite failed to run
AssertionError [ERR_ASSERTION]: opts.entryPoint required
at new Thrift (node_modules/thriftrw/thrift.js:64:5)
at Object.<anonymous> (node_modules/jaeger-client/src/thrift.js:21:20)
at Object.<anonymous> (node_modules/jaeger-client/src/reporters/remote_reporter.js:15:1)
All of the other test suites in the project still run successfully.
Could anyone point me in the right direction about what change to make?
The code that we're testing is
import { NotifyClient } from 'notifications-node-client';
import config from '../../../config/index';
import { RegionKeys } from '../../../config/types';
import logger from '../../../xxxx/lib/logger';
type TemplateId = 'submissionSuccess' | 'supportingEvidence' | 'feedbackTemplateId';
type FeedbackOptions = {
personalisation: {
satisfactionLevel: string,
improvements: string,
},
reference?: string,
}
const { apiKey, templates, proxy } = config.notify;
export default async function notify(
email: string,
templateId: TemplateId,
region: RegionKeys,
options?: FeedbackOptions,
): Promise<boolean> {
let notifyTemplate;
let notifyApiKey;
let notifyOptions;
try {
if (templateId === 'feedbackTemplateId' && !options) {
throw new Error(`Unable to send email - mismatch between template ID (${templateId}) and options supplied`);
} else if (options && templateId !== 'feedbackTemplateId') {
notifyOptions = {};
} else {
notifyOptions = options;
}
const notifyClient = new NotifyClient(apiKey[region]);
notifyClient.setProxy(proxy);
notifyTemplate = templateId === 'feedbackTemplateId' ? templates.feedbackTemplateId : templates[templateId][region];
logger.debug(`apiKey: ${notifyApiKey}`);
logger.debug(`notify template Id: ${notifyTemplate}`);
logger.debug(`proxy: ${JSON.stringify(proxy)}`);
const response = await notifyClient.sendEmail(notifyTemplate, email, notifyOptions);
if (response.body) {
logger.info(`confirmation email sent to ${email} and relates to message id ${response.body.id}`);
}
return true;
} catch (err: any) {
logger.error(`there was an error sending the message: ${err.message}`);
return false;
}
}
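One direction that may be worth trying (a sketch, assuming the assertion error comes from jaeger-client being loaded transitively by @dwp/node-logger when the logger module is imported): mock the logger module in the test file so Jest never loads that dependency chain. The path below mirrors the import in the code under test and assumes the test file sits next to notify.ts.
// In the notify test file, alongside the existing jest.mock calls
jest.mock('../../../xxxx/lib/logger', () => ({
  __esModule: true,
  default: {
    debug: jest.fn(),
    info: jest.fn(),
    error: jest.fn(),
  },
}));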

How do you implement a gRPC server in TypeScript?

I am trying to use @grpc/proto-loader to do dynamic code generation from the protobuf files and implement a simple server, but in TypeScript.
I've gotten as far as
import { Server, loadPackageDefinition, ServerCredentials } from "grpc";
import { loadSync } from "@grpc/proto-loader";
const packageDefinition = loadSync(__dirname + "/protos/ArtifactUpload.proto");
const protoDescriptor = loadPackageDefinition(packageDefinition);
const impl = {
};
const server = new Server();
server.addService(protoDescriptor.ArtifactUpload, impl);
server.bind('0.0.0.0:50051', ServerCredentials.createInsecure());
server.start();
So I have two problems:
In the JavaScript examples they use protoDescriptor.XXX.service; however, there is no service property on protoDescriptor.ArtifactUpload.
If I try to add implementation methods in impl, compilation also fails.
Since the JavaScript example works, I am thinking that questions along the lines of adding a new property to a TypeScript object might show how to add the necessary service type. However, I haven't had luck so far.
My Protobuf is
syntax = "proto3";
service ArtifactUpload {
rpc SignedUrlPutObject (UploadRequest) returns (SignedUrlPutObjectResponse) {}
}
message UploadRequest {
string message = 1;
}
message SignedUrlPutObjectResponse {
string reply = 1;
}
[Updated on 14 May 2021]: TypeScript generation via @grpc/proto-loader is now released with version 0.6.0! I've updated my example here to reflect this. You can now install the latest version of proto-loader with npm i @grpc/proto-loader, which contains the TS generation script. The instructions below are still valid.
You can use the proto-loader to generate types.
First, install the proto-loader:
npm i @grpc/proto-loader
You can then generate the types like so:
./node_modules/.bin/proto-loader-gen-types --longs=String --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --outDir=proto/ proto/*.proto
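If you regenerate types regularly, it can help to wrap that command in an npm script, for example (a sketch, assuming your proto files live under proto/; inside npm scripts, node_modules/.bin is already on the PATH):
{
  "scripts": {
    "generate-proto-types": "proto-loader-gen-types --longs=String --enums=String --defaults --oneofs --grpcLib=@grpc/grpc-js --outDir=proto/ proto/*.proto"
  }
}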
Here's the proto file I use for this example:
syntax = "proto3";
package example_package;
message ServerMessage {
string server_message = 1;
}
message ClientMessage {
string client_message = 1;
}
service Example {
rpc unaryCall(ClientMessage) returns (ServerMessage) {}
rpc serverStreamingCall(ClientMessage) returns (stream ServerMessage) {}
rpc clientStreamingCall(stream ClientMessage) returns (ServerMessage) {}
rpc bidirectionalStreamingCall(stream ClientMessage) returns (stream ServerMessage) {}
}
Once the types are generated, you can consume them like so:
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
import { ProtoGrpcType } from './proto/example';
import { ClientMessage } from './proto/example_package/ClientMessage';
import { ExampleHandlers } from './proto/example_package/Example';
import { ServerMessage } from './proto/example_package/ServerMessage';
const host = '0.0.0.0:9090';
const exampleServer: ExampleHandlers = {
unaryCall(
call: grpc.ServerUnaryCall<ClientMessage, ServerMessage>,
callback: grpc.sendUnaryData<ServerMessage>
) {
if (call.request) {
console.log(`(server) Got client message: ${call.request.clientMessage}`);
}
callback(null, {
serverMessage: 'Message from server',
});
},
serverStreamingCall(
call: grpc.ServerWritableStream<ClientMessage, ServerMessage>
) {
call.write({
serverMessage: 'Message from server',
});
},
clientStreamingCall(
call: grpc.ServerReadableStream<ClientMessage, ServerMessage>
) {
call.on('data', (clientMessage: ClientMessage) => {
console.log(
`(server) Got client message: ${clientMessage.clientMessage}`
);
});
},
bidirectionalStreamingCall(
call: grpc.ServerDuplexStream<ClientMessage, ServerMessage>
) {
call.write({
serverMessage: 'Message from server',
});
call.on('data', (clientMessage: ClientMessage) => {
console.log(
`(server) Got client message: ${clientMessage.clientMessage}`
);
});
},
};
function getServer(): grpc.Server {
const packageDefinition = protoLoader.loadSync('./proto/example.proto');
const proto = (grpc.loadPackageDefinition(
packageDefinition
) as unknown) as ProtoGrpcType;
const server = new grpc.Server();
server.addService(proto.example_package.Example.service, exampleServer);
return server;
}
if (require.main === module) {
const server = getServer();
server.bindAsync(
host,
grpc.ServerCredentials.createInsecure(),
(err: Error | null, port: number) => {
if (err) {
console.error(`Server error: ${err.message}`);
} else {
console.log(`Server bound on port: ${port}`);
server.start();
}
}
);
}
I've created various examples of how to use gRPC with TypeScript here: https://github.com/badsyntax/grpc-js-typescript
I got it working in the end as follows:
In package.json I had the following:
{
...
"scripts": {
"start": "node index.js",
"build": "pbjs -t static-module -w commonjs -o protos.js protos/*.proto && pbts -o protos.d.ts protos.js && tsc",
},
"dependencies": {
"#grpc/proto-loader": "^0.5.5",
"google-protobuf": "^3.13.0",
"grpc": "^1.24.4",
"typescript": "^4.0.5"
},
"devDependencies": {
"#types/node": "^14.14.7",
"protobufjs": "^6.10.1"
}
}
import { Server, loadPackageDefinition, ServerCredentials, GrpcObject, ServiceDefinition, handleUnaryCall } from "grpc";
import { ISignedUrlPutObjectResponse, IUploadRequest, SignedUrlPutObjectResponse } from "./protos";
import { loadSync } from "@grpc/proto-loader";
const packageDefinition = loadSync(__dirname + "/protos/ArtifactUpload.proto");
interface IArtifactUpload {
signedUrlPutObject: handleUnaryCall<IUploadRequest, ISignedUrlPutObjectResponse>;
}
interface ServerDefinition extends GrpcObject {
service: any
}
interface ServerPackage extends GrpcObject {
[name: string]: ServerDefinition
}
const protoDescriptor = loadPackageDefinition(packageDefinition) as ServerPackage;
const server = new Server();
server.addService<IArtifactUpload>(protoDescriptor.ArtifactUpload.service, {
signedUrlPutObject(call, callback) {
console.log(call.request.message);
console.log(callback);
callback(null, SignedUrlPutObjectResponse.create({ reply: "hello " + call.request.message }));
}
});
server.bind('0.0.0.0:50051', ServerCredentials.createInsecure());
server.start();
I use protobufjs to build some of the typings, though they are mostly unused as it is not fully compatible with gRPC. However, it does save time with the request and response typings.
I still needed to create the server typings and apply them to the protoDescriptor. Repeating it here for emphasis:
interface IArtifactUpload {
signedUrlPutObject(call: ServerUnaryCall<IUploadRequest>, callback: ArtifactUpload.SignedUrlPutObjectCallback): void;
}
interface ServerDefinition extends GrpcObject {
service: any;
}
interface ServerPackage extends GrpcObject {
[name: string]: ServerDefinition
}
I used any for the service as it was the only type that allowed me to avoid putting in anything specific to IArtifactUpload. Ideally the typing for GrpcObject, which at present is
export interface GrpcObject {
[name: string]: GrpcObject | typeof Client | ProtobufMessage;
}
should try to provide an object that represents the server.
I linked my solution to https://github.com/protobufjs/protobuf.js/issues/1017#issuecomment-725064230 in case there's a better way that I am missing.

I have to clone the Node modesl module to make it work in a class or function

I'm writing a Node.js class to play with modesl (the FreeSWITCH event socket module):
'use strict';
const esl = require('modesl');
class eslClass {
connect() {
this.fswcon = new esl.Connection(config.fswEslHost, config.fswEslPort, 'ClueCon', () => {
this._listen();
});
}
}
// connect() will fail with 'this.once is undefined' (.once comes from EventEmitter2, I believe)
I have to 'clone' the object to make the error go away:
const eslClone = esl.Connection;
class eslClass {
connect() {
this.fswcon = new eslClone (config.fswEslHost, config.fswEslPort, 'ClueCon', () => {
this._listen();
});
}
}
// the error goes away!!

Socket.IO on Sails.js as API and Node + React as frontend

I have an API built using Sails.js and a React/Redux app attached to a Node.js backend, and I am trying to implement Socket.IO for real-time communication. How does this work?
Is it:
1. a Socket.IO client on the React side that connects to a Socket.IO server on its Node.js backend, which connects to a Socket.IO server on the API, or
2. a Socket.IO client on the React side and on its Node.js backend that connects to a Socket.IO server on the API?
I have tried looking around for some answers, but none seems to meet my requirements.
To try things out, I put the hello endpoint on my API, following the Sails.js realtime documentation, but when I do a sails lift I get this error: Could not fetch session, since connecting socket has no cookie (is this a cross-origin socket?). I figure that I need to pass an auth code inside the request headers' Authorization property.
Assuming I went for option #1, and using redux-socket.io:
In my Redux middleware I created a socketMiddleware:
import createSocketIoMiddleware from 'redux-socket.io'
import io from 'socket.io-client'
import config from '../../../config'
const socket = io(config.host)
export default function socketMiddleware() {
return createSocketIoMiddleware(
socket,
() => next => (action) => {
const { nextAction, shuttle, ...rest } = action
if (!shuttle) {
return next(action)
}
const { socket_url: shuttleUrl = '' } = config
const apiParams = {
data: shuttle,
shuttleUrl,
}
const nextParams = {
...rest,
promise: api => api.post(apiParams),
nextAction,
}
return next(nextParams)
},
)
}
And in my Redux store:
import { createStore, applyMiddleware, compose } from 'redux'
import createSocketIoMiddleware from 'redux-socket.io'
...
import rootReducers from '../reducer'
import socketMiddleware from '../middleware/socketMiddleware'
import promiseMiddleware from '../middleware/promiseMiddleware'
...
import config from '../../../config'
export default function configStore(initialState) {
const socket = socketMiddleware()
...
const promise = promiseMiddleware(new ApiCall())
const middleware = [
applyMiddleware(socket),
...
applyMiddleware(promise),
]
if (config.env !== 'production') {
middleware.push(DevTools.instrument())
}
const createStoreWithMiddleware = compose(...middleware)
const store = createStoreWithMiddleware(createStore)(rootReducers, initialState)
...
return store
}
In my promiseMiddleware:
export default function promiseMiddleware(api) {
return () => next => (action) => {
const { nextAction, promise, type, ...rest } = action
if (!promise) {
return next(action)
}
const [REQUEST, SUCCESS, FAILURE] = type
next({ ...rest, type: REQUEST })
function success(res) {
next({ ...rest, payload: res, type: SUCCESS })
if (nextAction) {
nextAction(res)
}
}
function error(err) {
next({ ...rest, payload: err, type: FAILURE })
if (nextAction) {
nextAction({}, err)
}
}
return promise(api)
.then(success, error)
.catch((err) => {
console.error('ERROR ON THE MIDDLEWARE: ', REQUEST, err) // eslint-disable-line no-console
next({ ...rest, payload: err, type: FAILURE })
})
}
}
My ApiCall:
/* eslint-disable camelcase */
import superagent from 'superagent'
...
const methods = ['get', 'post', 'put', 'patch', 'del']
export default class ApiCall {
constructor() {
methods.forEach(method =>
this[method] = ({ params, data, shuttleUrl, savePath, mediaType, files } = {}) =>
new Promise((resolve, reject) => {
const request = superagent[method](shuttleUrl)
if (params) {
request.query(params)
}
...
if (data) {
request.send(data)
}
request.end((err, { body } = {}) => err ? reject(body || err) : resolve(body))
},
))
}
}
All this interaction between the middlewares and the store works well for regular HTTP API calls. My question is, am I on the right path? If I am, then what should I write on the React server part to communicate with the API socket? Should I also use socket.io-client?
You need to add sails.io.js on your Node server. Sails' socket behaviour is quite tricky, since it does not use the on method to listen for events.
Create a Sails endpoint which handles the socket request. The documentation is here. The documentation is such a pain in the ass, but please bear with it.
On your Node server, you can use it like this:
import socketIOClient from 'socket.io-client'
import sailsIOClient from 'sails.io.js'
const ioClient = sailsIOClient(socketIOClient)
ioClient.sails.url = "YOUR SOCKET SERVER URL"
ioClient.socket.get("SAILS ENDPOINT WHICH HANDLE SOCKET", function(data) {
console.log('Socket Data', data);
})
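For reference, the Sails endpoint that handles the socket request might look roughly like this (a sketch based on the Sails realtime docs; the controller name, action name, and room name are placeholders):
// api/controllers/HelloController.js (hypothetical controller)
module.exports = {
  hello: function (req, res) {
    // Only socket requests can be subscribed to rooms
    if (!req.isSocket) {
      return res.badRequest('This endpoint only accepts socket requests.');
    }
    // Subscribe the requesting socket to a room, then acknowledge
    sails.sockets.join(req, 'funSockets', (err) => {
      if (err) {
        return res.serverError(err);
      }
      return res.json({ message: 'Subscribed to a fun room!' });
    });
  }
};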
