Cannot import google's proto with @grpc/proto-loader - node.js

I have the following proto:
syntax = "proto3";
import "google/rpc/status.proto";
message Response {
  google.rpc.Status status = 1;
}
message Request {
  Type name = 1;
}
service Service {
  rpc SomeMethod (Request) returns (Response);
}
And I am writing a client in node:
const path = require('path');
const grpc = require('grpc');
const protoLoader = require('@grpc/proto-loader');
const protoFiles = require('google-proto-files');
const PROTO_PATH = path.join(__dirname, '/proto/myproto.proto');
const packageDefinition = protoLoader.loadSync(
  PROTO_PATH,
  {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
    includeDirs: [protoFiles('rpc')],
  },
);
const proto = grpc.loadPackageDefinition(packageDefinition);
const client = new proto.Service('localhost:1111', grpc.credentials.createInsecure());
When I run the client, I get the following error: TypeError: proto.Service is not a constructor. I found it's related to the import of status.proto. What is the right way of importing google protos using proto-loader? The server is in Java.

Olga, you cannot use an absolute path for PROTO_PATH if you are using includeDirs. You need to put both paths, i.e. the path to myproto.proto AND the path to google-proto-files, into includeDirs, and use just the file name as PROTO_PATH; then it works just fine. See here:
https://github.com/grpc/grpc-node/issues/470
Here is the modified code that works. Please note that I also had to replace "Type" with "int32" in myproto.proto.
const path = require('path');
const grpc = require('grpc');
const protoLoader = require('@grpc/proto-loader');
const protoFiles = require('google-proto-files');
const PROTO_PATH = 'myproto.proto';
const packageDefinition = protoLoader.loadSync(
  PROTO_PATH,
  {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
    includeDirs: ['node_modules/google-proto-files', 'proto']
  },
);
const protoDescriptor = grpc.loadPackageDefinition(packageDefinition);
const client = new protoDescriptor.Service('localhost:1111', grpc.credentials.createInsecure());
Hope it helps. :)

The problem here is that the path that protoFiles('rpc') returns doesn't work with the import line in your .proto file. That import line means that @grpc/proto-loader is looking for an include directory that contains google/rpc/status.proto, but protoFiles('rpc') returns a directory that directly contains status.proto. So, you have to change one or both of those things so that the relative directories match up properly.
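For example, a minimal sketch of the first option (assuming, as in the answer above, that the google-proto-files package root is the directory that contains google/rpc/status.proto):

// Sketch: point includeDirs at a directory that actually contains
// google/rpc/status.proto, so the import in myproto.proto resolves as written.
const packageDefinition = protoLoader.loadSync('myproto.proto', {
  keepCase: true,
  includeDirs: [
    'proto',                           // directory containing myproto.proto
    'node_modules/google-proto-files', // contains google/rpc/status.proto
  ],
});

Alternatively, keep includeDirs: [protoFiles('rpc')] and change the import in the .proto file to import "status.proto";.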

TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received an instance of Object

I want to take data fetched from the internet in Node.js, save it as a JSON file, and convert it to a txt file, but I get an error.
const interactionCreate = require("../events/interactionCreate");
const Parser = require('rss-parser');
const { SlashCommandBuilder } = require("discord.js");
const parser = new Parser()
const { Routes } = require('discord-api-types/v9')
const { request } = require('undici')
const { EmbedBuilder } = require('discord.js');
const fs = require("fs")
const jsonToTxt = require("json-to-txt");
module.exports = {
  name: "adsoyad",
  description: "ad soyad bilgi ",
  options: [
    {
      name: "adı",
      description: "sorgulanacak kişi ismi",
      type: 3,
      required: true,
    },
    {
      name: "soyadı",
      description: "sorgulanacak kişi soyadı",
      type: 3,
      required: true,
    }
  ],
  run: async (client, interaction) => {
    //const ad = interaction.options.getString("ad");
    // const soyad = interaction.options.getString("soyad");
    //const feed = await parser.parseURL(`http://141.11.127.168/ucretsizapi/adsoyad.php?ad=&soyad=sezer&auth=propenthia`)
    const ad = interaction.options.getString('adı');
    const soyad = interaction.options.getString('soyadı');
    const query = new URLSearchParams({ ad });
    const query2 = new URLSearchParams({ soyad });
    const dictResult = await request(`http://141.11.127.168/ucretsizapi/adsoyad.php?${query}&${query2}&auth=propenthia`);
    //const list = await dictResult.body.text();
    const asa = await dictResult.body.json();
    const dataInString = jsonToTxt({ filePath: asa });
    await interaction.reply("sonuclar")
    //const ms = await JSON.stringify(list)
    //var mars = JSON.parse(ms)
  }
}
When I enter the command in Discord, I want the result to be sent as a txt file.
The error I get when I try to convert it:
node:internal/errors:490
ErrorCaptureStackTrace(err);
^
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received an instance of Object
at Object.open (node:fs:561:10)
at Object.writeFile (node:fs:2185:6)
at Object.run (C:\Users\trwor\OneDrive\Masaüstü\%64 bot Project\commands\adsoyad.js:53:12)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
TypeError [ERR_INVALID_ARG_TYPE]: The "path" argument must be of type string or an instance of Buffer or URL. Received an instance of Object
at Object.openSync (node:fs:592:10)
at Object.readFileSync (node:fs:468:35)
at getData (C:\Users\trwor\OneDrive\Masaüstü\%64 bot Project\node_modules\json-to-txt\src\lib.js:63:26)
at main (C:\Users\trwor\OneDrive\Masaüstü\%64 bot Project\node_modules\json-to-txt\json_to_txt.js:5:16)
at Object.run (C:\Users\trwor\OneDrive\Masaüstü\%64 bot Project\commands\adsoyad.js:53:30)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
code: 'ERR_INVALID_ARG_TYPE'
}
I want to solve this problem quickly. Thank you in advance to those who are interested.
What I actually want is to take the JSON, convert it to txt, and be able to specify the txt file I produce.
Based on the error message, the argument passed to jsonToTxt is an object, but it is expected to be a file path string or a buffer. To solve the issue, you need to provide a valid file path string to jsonToTxt instead of the JSON data itself.
Write the JSON data to a file first, then convert that file to a text file:
const interactionCreate = require("../events/interactionCreate");
const Parser = require('rss-parser');
const { SlashCommandBuilder } = require("discord.js");
const parser = new Parser()
const { Routes } = require('discord-api-types/v9')
const { request } = require('undici')
const { EmbedBuilder } = require('discord.js');
const fs = require("fs")
const jsonToTxt = require("json-to-txt");
module.exports = {
  name: "adsoyad",
  description: "ad soyad bilgi ",
  options: [
    {
      name: "adı",
      description: "sorgulanacak kişi ismi",
      type: 3,
      required: true,
    },
    {
      name: "soyadı",
      description: "sorgulanacak kişi soyadı",
      type: 3,
      required: true,
    }
  ],
  run: async (client, interaction) => {
    //const ad = interaction.options.getString("ad");
    // const soyad = interaction.options.getString("soyad");
    //const feed = await parser.parseURL(`http://141.11.127.168/ucretsizapi/adsoyad.php?ad=&soyad=sezer&auth=propenthia`)
    const ad = interaction.options.getString('adı');
    const soyad = interaction.options.getString('soyadı');
    const query = new URLSearchParams({ ad });
    const query2 = new URLSearchParams({ soyad });
    const dictResult = await request(`http://141.11.127.168/ucretsizapi/adsoyad.php?${query}&${query2}&auth=propenthia`);
    //const list = await dictResult.body.text();
    const asa = await dictResult.body.json();
    // Write the JSON data to a file
    fs.writeFileSync("data.json", JSON.stringify(asa));
    // Convert the JSON data to a text file
    jsonToTxt("data.json", "data.txt");
    await interaction.reply("sonuclar")
  }
}
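Since the question also asks for the result to be sent as a txt file in Discord, here is a follow-up sketch (assuming discord.js v14, which the EmbedBuilder import suggests) that attaches the generated file to the reply instead of replying with plain text:

    // Sketch: attach the generated text file to the interaction reply.
    // Assumes data.txt was written by the code above.
    await interaction.reply({
      content: "sonuclar",
      files: ["data.txt"], // a file path or an AttachmentBuilder can be passed here
    });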

How to run Node.js module 'windows-1252' along with require statements

I'm trying to correctly write a .json file from data.response.stream of a POST request using Node.js and Newman on a Windows 10 AWS EC2 instance. The default encoding is cp1252, but the response encoding is utf-8, and after attempts using iconv, iconv-lite, and futzing with Buffer, I can't seem to arrive at a satisfactory result.
Here's the code I'm using:
const newman = require('newman'); // require Newman in the project
const fs = require('fs'); // require the Node.js module 'File System'
var url = require('url');
const https = require('https');
var path = require('path');
const { Iconv } = require('iconv').Iconv;
const iconvlite = require('iconv-lite');
const utf8 = require('utf8');
const windows1252 = require('windows-1252');
var requirejs = require('requirejs');
//import {encode, decode, labels} from 'windows-1252';
//import * as windows1252 from 'windows-1252';
//const collectionURL = 'https://www.getpostman.com/collections/mycollection';
let pageNumber = 16287;
// call newman.run to pass `options` object and wait for callback
newman.run({
  collection: require('./postman-collection.json'),
  iterationData: './iteration-data.csv',
  color: 'on',
  verbose: 'on',
  exportCollection: './/after_pmRuns',
  delayRequest: 500,
  environment: require('./postman-environment.json'),
  reporters: 'cli',
}).on('request', (error, data) => {
  if (error) {
    console.log(error);
    return;
  }
  console.log('Request name: ' + data.item.name);
  console.log(data.response.stream.toString());
  var currentPageNumber = pageNumber++;
  const requestName = data.item.name.replace(/[^a-z0-9]/gi, '-');
  const randomString = Math.random().toString(36).substring(7);
  const fileName = `./results/_00${currentPageNumber}-response-${requestName}-${randomString}.json`;
  const encodedData = windows1252.encode(data.response.stream.toString(), {
    mode: 'fatal'
  });
  const decodedData = iconvlite.decode(encodedData, 'utf-8');
  //var iconv = new Iconv('windows-1252', 'UTF-8//TRANSLIT//IGNORE');
  //var content = data.response.stream;
  //var buffer = iconv.convert(content);
  //var new_content = buffer.toString('utf8')
  //const win = iconvlite.encode(data.response.stream, "windows1252");
  //const utfStr = iconvlite.decode(win, "utf-8");
  //const requestContent = data.response.stream.toString();
  //return str.toString();
  fs.writeFileSync(fileName, decodedData, function(error) {
    if (error) {
      console.error(error);
    }
  });
});
I keep getting ASCII-encoded .json files when opening them in Notepad, and occasionally I'm getting replacement characters like \ufffd or variations of that.
When I try to adjust the package.json, Newman throws an error since it's loaded via a require statement, but when I try to import windows-1252 it says it's undefined.
Any ideas on how I can work around this?
I don't think you need to encode or decode the response data; you can simply parse the buffered response data into JSON:
JSON.parse(responseData);
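Building on that idea, here is a minimal sketch (not tested against your collection) that skips the windows-1252 round trip entirely and writes the parsed response back out with an explicit UTF-8 encoding:

    // Sketch: parse the response buffer and write it out as UTF-8 JSON.
    const body = JSON.parse(data.response.stream.toString('utf8'));
    fs.writeFileSync(fileName, JSON.stringify(body, null, 2), { encoding: 'utf8' });

Note that fs.writeFileSync is synchronous and does not take a callback; errors should be handled with try/catch instead.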

Facing an issue with a gRPC client in Node.js

I am facing an issue when trying to create a gRPC client call using Node.js. When I use import "google/api/annotations.proto" in the proto file I get the error below. If I remove it, everything works fine. May I know what I am missing from my client.js?
Error: unresolvable extensions: 'extend google.protobuf.MethodOptions' in .google.api
at Root.resolveAll (src/github.com/workspace/explorer/node_modules/protobufjs/src/root.js:255:15)
at Object.loadSync (/src/github.com/workspace/explorer/node_modules/@grpc/proto-loader/build/src/index.js:224:16)
at Object.<anonymous> (/src/github.com/workspace/explorer/server/grpc/client.js:3:37)
syntax = 'proto3';
import "google/api/annotations.proto";
import "google/protobuf/timestamp.proto";
package chain;
service chain {
  rpc GetHeight(HeightRequest) returns (HeightResponse) {
    option (google.api.http).get = "/api/height/{height}";
  }
}
message HeightRequest {
  string hash = 1;
}
message HeightResponse {
  int64 height = 1;
}
client.js
var PROTO_PATH = __dirname + '/proto/chain.proto';
var parseArgs = require('minimist');
var grpc = require('@grpc/grpc-js');
var protoLoader = require('@grpc/proto-loader');
var packageDefinition = protoLoader.loadSync(
  PROTO_PATH,
  {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
  });
var chain_proto = grpc.loadPackageDefinition(packageDefinition).chain;

function main() {
  var argv = parseArgs(process.argv.slice(2), {
    string: 'target'
  });
  var target;
  if (argv.target) {
    target = argv.target;
  } else {
    target = 'localhost:9040';
  }
  var client = new chain_proto.chain(target,
    grpc.credentials.createInsecure());
  client.GetHeight(function (err, response) {
    console.log('height:', response);
  });
}
main();
I found the solution to the above error: you need to create a folder inside the project directory, googleapis -> google -> api, and add the annotations.proto file from the grpc-gateway GitHub repository, as mentioned in this link:
grpc-gateway
Then you need to add that path to includeDirs as shown below.
var packageDefinition = protoLoader.loadSync(
  PROTO_PATH,
  {
    keepCase: true,
    longs: String,
    enums: String,
    defaults: true,
    oneofs: true,
    includeDirs: [
      __dirname + '/googleapis',
    ]
  });
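For reference, the resulting directory layout looks roughly like this (a sketch; note that annotations.proto itself imports google/api/http.proto, so that file usually needs to sit next to it):

    project/
      googleapis/
        google/
          api/
            annotations.proto
            http.proto
      proto/
        chain.proto
      client.js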

pino-pretty, how to add file name to log line

I need to add the file name to the pino-pretty line output.
Right now I'm using:
const pino = require('pino');
const logger = pino({
  prettyPrint: {
    colorize: true,
    translateTime: 'yyyy-mm-dd HH:MM:ss',
    ignore: 'pid,hostname'
  }
})
and I get this output:
[2020-05-14 16:25:45] INFO : Network is private
but I want something like this:
[2020-05-14 16:25:45] INFO myFile.js: Network is private
i.e. I want to see the name of the file that emitted the log line. I tried playing with the customPrettifiers option but couldn't get the result I hoped for. For example, I tried this:
const pino = require('pino');
const path = require('path');
const logger = pino({
  prettyPrint: {
    colorize: true,
    translateTime: 'yyyy-mm-dd HH:MM:ss',
    ignore: 'pid,hostname',
    customPrettifiers: {
      filename: path.basename(__filename)
    }
  }
})
I think the closest you can get is as follows:
const path = require('path');
const pino = require('pino');
const logger = pino({
  prettyPrint: {
    // Adds the filename property to the message
    messageFormat: '{filename}: {msg}',
    // need to ignore 'filename' otherwise it appears beneath each log
    ignore: 'pid,hostname,filename',
  },
}).child({ filename: path.basename(__filename) });
Note that you can't style the filename differently from the message, but hopefully that's good enough.
It's probably also better to have a separate logger.js file where the default pino options are passed, e.g.:
// logger.js
const logger = require('pino')({
  prettyPrint: {
    messageFormat: '{filename}: {msg}',
    ignore: 'pid,hostname,filename',
  },
});
module.exports = logger;

// file_with_logging.js
const path = require('path');
const parentLogger = require('./logger.js');
const logger = parentLogger.child({ filename: path.basename(__filename) });
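With that child logger in place, a call like the following (a hypothetical usage example) produces output roughly matching the desired format:

    // file_with_logging.js (continued)
    logger.info('Network is private');
    // => [2020-05-14 16:25:45] INFO file_with_logging.js: Network is private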

Can't use @cypher in GraphQL schema when using Apollo Server

I want to query a field on a node using the @cypher directive in my GraphQL schema.
However, when I query the field I get Resolve function for "Link.x" returned undefined.
My schema, with the directive on x of Link, is the following:
scalar URI

interface IDisplayable {
  "Minimal data necessary for the object to appear on screen"
  id: ID!
  label: String
  story: URI
}

interface ILink {
  """
  A link must know to what nodes it is connected to
  """
  x: Node! @cypher(statement: "MATCH (this)-[:X_NODE]->(n:Node) RETURN n")
  y: Node!
  """
  if optional=true then sequence MAY be used to define a set of options
  """
  optional: Boolean
}

interface INode {
  synchronous: Boolean
  unreliable: Boolean
}

type Node implements INode & IDisplayable {
  id: ID!
  label: String!
  story: URI
  synchronous: Boolean
  unreliable: Boolean
}

type Link implements ILink & IDisplayable {
  id: ID!
  label: String!
  x: Node! @cypher(statement: "MATCH (this)-[:X_NODE]->(n:Node) RETURN n")
  y: Node!
  story: URI
  optional: Boolean
}
When querying for a link and its x property I get undefined. With the custom resolver that I wrote for y, however, it works. Of course I could keep the hand-written resolvers, but that's a lot of code that shouldn't be necessary.
This is index.js:
require( 'dotenv' ).config();
const express = require( 'express' );
const { ApolloServer } = require( 'apollo-server-express' );
const neo4j = require( 'neo4j-driver' );
const cors = require( 'cors' );
const { makeAugmentedSchema } = require( 'neo4j-graphql-js' );
const typeDefs = require( './graphql-schema' );
const resolvers = require( './resolvers' );

const app = express();
app.use( cors() );

const URI = `bolt://${ process.env.DB_HOST }:${ process.env.DB_PORT }`;
const driver = neo4j.driver(
  URI,
  neo4j.auth.basic( process.env.DB_USER, process.env.DB_PW ),
);

const schema = makeAugmentedSchema( { typeDefs, resolvers } );
const server = new ApolloServer( {
  context: { driver },
  schema,
  formatError: ( err ) => {
    return {
      message: err.message,
      code: err.extensions.code,
      success: false,
      stack: err.path,
    };
  },
} );

const port = process.env.PORT;
const path = process.env.ENDPOINT;
server.applyMiddleware( { app, path } );
app.listen( { port, path }, () => {
  console.log( `Server listening at http://localhost:${ port }${ path }` );
} );
With "graphql-schema.js" being
const fs = require( 'fs' );
const path = require( 'path' );
const schema = './schemas/schema.graphql';
const encoding = 'utf-8';
let typeDefs = '';
typeDefs += fs.readFileSync( path.join( __dirname, schema ) )
  .toString( encoding );
module.exports = typeDefs;
Thanks for any tips
I found out that if I write a custom resolver for a field, the @cypher directive on that field is not applied.
However, I realized that I can let makeAugmentedSchema generate the resolvers I needed, so I just deleted my custom implementations, which works for me.
So in my resolvers I had to remove the implementations for fields annotated with a @cypher statement; after that I could put the directive into my schema and it worked fine.
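A minimal sketch of what that resolver map might look like (this assumes resolvers.js exports a plain resolver object that is passed to makeAugmentedSchema, as in index.js above):

    // Sketch: resolvers.js – only keep resolvers for fields that need hand-written logic.
    module.exports = {
      Link: {
        // x: no resolver here, so the @cypher directive in the schema is used
        y: async (link, args, context, info) => {
          // hand-written logic for y, as before
        },
      },
    };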
