Node: Sending Protobuf message to Kafka Error - node.js

I am trying to use the HDFS Kafka connector to send protobuf messages from Kafka to HDFS. My connector config looks like the following:
{
  "name": "hdfs3-connector-test",
  "config": {
    "connector.class": "io.confluent.connect.hdfs3.Hdfs3SinkConnector",
    "tasks.max": "1",
    "topics": "test-topic",
    "hdfs.url": "hdfs://10.8.0.1:9000",
    "flush.size": "3",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.protobuf.ProtobufConverter",
    "value.converter.schema.registry.url": "http://10.8.0.1:8081",
    "confluent.topic.bootstrap.servers": "10.8.0.1:9092",
    "confluent.topic.replication.factor": "1"
  }
}
To test this, I am sending protobuf-serialized messages from a small Node application. Here are my files:
// data.proto
syntax = "proto3";
package awesomepackage;

message SearchRequest {
  string query = 1;
  int32 page = 2;
}
and my Node app:
const { Kafka } = require('kafkajs')
const protobuf = require('protobufjs')

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['10.8.0.1:9092']
})
const producer = kafka.producer()

const run = async () => {
  await producer.connect()
  protobuf.load('data.proto', async (err, root) => {
    console.log("TESTING")
    console.log(err)
    let SearchRequest = root.lookupType('awesomepackage.SearchRequest')
    let payload = { query: "test", page: 2 }
    var errMsg = SearchRequest.verify(payload);
    console.log(errMsg)
    let msg = SearchRequest.create(payload)
    var buffer = SearchRequest.encode(msg).finish();
    console.log(buffer)
    await producer.send({
      topic: 'test-topic',
      messages: [
        { key: 'key1', value: buffer }
      ]
    })
  })
}

run()
However, when I run this I get the following errors:
Failed to deserialize data for topic test-topic to Protobuf
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Protobuf message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
How do I fix this? My guess is that my protobuf schema is not registered in the Kafka Schema Registry, but I am unsure. If that is the case, is there a way to send the schema to the registry from Node?

io.confluent.connect.protobuf.ProtobufConverter requires the Schema Registry wire format, not a plain serialized Protobuf message. In other words, your Node code is missing the Schema Registry step (or the manual construction of the "wrapped" bytes: magic byte, schema id, then the message).
Refer to the Confluent Wire Format documentation.
If you would rather not use the Schema Registry, you can use the BlueApron Protobuf converter, but since you already seem to be running a registry, it is best to go with the Confluent converter.
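One way to do the registration and framing from Node is the @kafkajs/confluent-schema-registry package, which registers the schema and prepends the magic byte and schema id when encoding. Below is a minimal sketch, not a drop-in implementation: it assumes that package is installed, reuses the registry URL from the connector config above, and uses the subject name test-topic-value (the default TopicNameStrategy for the topic's value).
const { Kafka } = require('kafkajs')
const { SchemaRegistry, SchemaType } = require('@kafkajs/confluent-schema-registry')

const kafka = new Kafka({ clientId: 'my-app', brokers: ['10.8.0.1:9092'] })
const registry = new SchemaRegistry({ host: 'http://10.8.0.1:8081' })

const schema = `
  syntax = "proto3";
  package awesomepackage;
  message SearchRequest {
    string query = 1;
    int32 page = 2;
  }`

const run = async () => {
  // Register the schema and get its registry id (subject name is an assumption).
  const { id } = await registry.register(
    { type: SchemaType.PROTOBUF, schema },
    { subject: 'test-topic-value' }
  )

  // encode() produces the Confluent wire format: magic byte + schema id + payload.
  const value = await registry.encode(id, { query: 'test', page: 2 })

  const producer = kafka.producer()
  await producer.connect()
  await producer.send({
    topic: 'test-topic',
    messages: [{ key: 'key1', value }]
  })
  await producer.disconnect()
}

run().catch(console.error)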

Related

Datadog APM Resource column is not giving correct values

I am facing an issue where the Datadog RESOURCE column is not showing the correct value, as shown in the image. I really need some help here.
My assumption is that this is happening because the http tags are not set correctly. I think Datadog itself adds the http tags and their values.
The http.path_group and http.route tags should have the value "/api-pim2/v1/attribute/search", but for some reason they do not.
I am using the dd-trace library on the backend. The tracer options I provided are these:
{"logInjection":true,"logLevel":"debug","runtimeMetrics":true,"analytics":true,"debug":true,"startupLogs":true,"tags":{"env":"dev02","region":"us-east-1","service":"fabric-gateway-pim-ecs"}}
The initialising code, which runs at the start of my app, looks like this:
app/lib/tracer.js:
const config = require('config')
const tracerOptions = config.get('dataDog.tracer.options')
const logger = require('app/lib/logger')

const tracer = require('dd-trace').init({
  ...tracerOptions,
  enabled: true,
  logger
})

module.exports = tracer
I also tried to set the http.path_group and http.route tags manually, but it still does not update the values. However, I can add new tags like http.test with the same value I was trying to write into http.path_group and http.route:
const addTagsToRootSpan = tags => {
  const span = tracer.scope().active()
  if (span) {
    const root = span.context()._trace.started[0]
    for (const tag of tags) {
      root.setTag(tag.key, tag.value)
    }
    log.debug('Tags added')
  } else {
    log.debug('Trace span could not be found')
  }
}
...
const tags = [
  { key: 'http.path_group', value: request.originalUrl },
  { key: 'http.route', value: request.originalUrl },
  { key: 'http.test', value: request.originalUrl }
]
addTagsToRootSpan(tags)
...
I was requiring the tracer.js file at the start of my app, where the server starts listening:
require('app/lib/tracer')
app.listen(port, err => {
  if (err) {
    log.error(err)
    return err
  }
  log.info(`Your server is ready for ${config.get('stage')} on PORT ${port}!`)
})
By enabling the debug option in the Datadog tracer init function, I can see the tracer logs and the values the library passes for http.route and the resource.
I was confused by this line; according to the Datadog tracer docs, you should init before importing any instrumented module:
// This line must come before importing any instrumented module.
const tracer = require('dd-trace').init();
But for me, http.route and the resource only get the correct value if I initialise the tracer in my routing file. They then give me the complete route "/api-pim2/v1/attribute/search" instead of only "/api-pim2".
routes/index.js:
const router = require('express').Router()
require('app/lib/tracer')
const attributeRouter = require('app/routes/v1/attribute')
router.use('/v1/attribute', attributeRouter)
module.exports = router
I am not accepting this answer yet because I am still confused about where to initialise the tracer. Maybe someone can explain it better. I am just writing this answer so that someone else facing the issue can try this; it might resolve their problem.
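For reference, the pattern the dd-trace docs describe is to initialise the tracer in the entry file before requiring express or any other instrumented module. Here is a minimal sketch of that ordering; the require paths come from the question, while the port and mount path are illustrative:
// index.js (entry point)
require('app/lib/tracer')            // runs require('dd-trace').init(...) first ...
const express = require('express')   // ... before any instrumented module is required

const app = express()
app.use('/api-pim2', require('app/routes'))

app.listen(3000, () => console.log('server is ready'))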

REST API with node-rtsp-stream NPM

I am trying to stream RTSP into HTML5 pages using the node-rtsp-stream NPM package. I can see the live stream in the HTML page, but when I try to wrap it in a REST API it throws TypeError: stream is not a constructor. When I call my POST method the first time it works properly; when I call it again it throws the error.
Here is my API:
RTSPRouter.post('/getPreview', (req, res) => {
  // stream.mpeg1Muxer.kill();
  stream = new stream({
    name: 'name',
    streamUrl: req.body.RTSPURL,
    wsPort: 9999,
    ffmpegOptions: {
      '-r': 30
    }
  })
  res.send(stream)
})
API for kill:
RTSPRouter.get('/killPreview', (req, res) => {
  process.kill(req.body.pid1)
  stream.prototype.stop() // this method also not working
})
Even when I kill the stream directly using the PID, it throws the same error.
Kindly help me fix this problem. Thanks in advance!
You're getting a TypeError because you haven't imported Stream.
Your code should look like this:
Stream = require("node-rtsp-stream");

stream = new Stream({
  name: "name",
  streamUrl: "rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov",
  wsPort: 9999,
  ffmpegOptions: {
    // options ffmpeg flags
    "-stats": "", // an option with no necessary value uses a blank string
    "-r": 30, // options with required values specify the value after the key
  },
});

res.send(stream);
Also don't forget to run npm install node-rtsp-stream.
After a few days I found the answer to this question.
You should not create the stream like this:
var stream = require('node-rtsp-stream')

stream = new stream({
  name: 'name',
  streamUrl: req.body.RTSPURL,
  wsPort: 9999,
  ffmpegOptions: {
    '-r': 30
  }
})
Instead, try the code below:
var stream = require('node-rtsp-stream')
var streamObj;

streamObj = new stream({
  name: 'name',
  streamUrl: req.body.RTSP,
  wsPort: 9999,
  ffmpegOptions: { // options ffmpeg flags
    '-stats': '', // an option with no necessary value uses a blank string
    '-r': 30 // options with required values specify the value after the key
  }
})
The actual error was that I had imported the module under the name stream and then tried to instantiate the stream with the same variable name stream. You have to use a different variable name to initialize the stream; in my case I used streamObj.
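Putting the two answers together, one way to structure the routes is to require the module once at module scope, keep a single instance, and stop it before starting a new one or in the kill route. The following is a sketch under those assumptions; stop() is the instance method node-rtsp-stream exposes for shutting a stream down, and the route and option names follow the question:
const Stream = require('node-rtsp-stream')

let activeStream = null // single shared instance for this router

RTSPRouter.post('/getPreview', (req, res) => {
  if (activeStream) {
    activeStream.stop() // stop the previous stream before starting a new one
  }
  activeStream = new Stream({
    name: 'preview',
    streamUrl: req.body.RTSPURL,
    wsPort: 9999,
    ffmpegOptions: { '-r': 30 }
  })
  res.send({ wsPort: 9999 })
})

RTSPRouter.get('/killPreview', (req, res) => {
  if (activeStream) {
    activeStream.stop()
    activeStream = null
  }
  res.sendStatus(200)
})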

Xcode 13 Warning - [NSKeyedUnarchiver validateAllowedClass:forKey:]

I am using a file storage system for saving some data models conforming to the Codable protocol.
My save function is as below:
func save<T: Encodable>(value: T, for key: String, on path: URL) throws {
    let url = path.appendingPathComponent(key, isDirectory: false)
    do {
        try ANFileManager.createDirectoryAtPath(path: url.deletingLastPathComponent())
        let archiver = NSKeyedArchiver(requiringSecureCoding: true)
        archiver.outputFormat = .binary
        try archiver.encodeEncodable(value, forKey: NSKeyedArchiveRootObjectKey)
        archiver.finishEncoding()
        // then you can use encoded data
        try archiver.encodedData.write(to: url)
    } catch {
        throw StorageError.cantWrite(error)
    }
}
My fetch function is as below:
func fetchValue<T: Decodable>(for key: String, from path: URL) throws -> T {
    let url = path.appendingPathComponent(key)
    let data = try Data(contentsOf: url)
    let unarchiver = try NSKeyedUnarchiver(forReadingFrom: data)
    unarchiver.decodingFailurePolicy = .setErrorAndReturn
    guard let decoded = unarchiver.decodeDecodable(T.self, forKey: NSKeyedArchiveRootObjectKey) else {
        throw StorageError.notFound
    }
    unarchiver.finishDecoding()
    if let error = unarchiver.error {
        throw StorageError.cantRead(error)
    } else {
        return decoded
    }
}
Save and fetch are working fine, but at runtime I see the warning below in the Xcode console:
*** -[NSKeyedUnarchiver validateAllowedClass:forKey:] allowed unarchiving safe plist type ''NSString' (0x7fff863014d0) [/Applications/Xcode_13.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/Foundation.framework]' for key 'NS.keys', even though it was not explicitly included in the client allowed classes set: '{(
"'NSDictionary' (0x7fff862db9a0) [/Applications/Xcode_13.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]",
"'NSDate' (0x7fff862db798) [/Applications/Xcode_13.app/Contents/Developer/Platforms/iPhoneOS.platform/Library/Developer/CoreSimulator/Profiles/Runtimes/iOS.simruntime/Contents/Resources/RuntimeRoot/System/Library/Frameworks/CoreFoundation.framework]"
)}'. This will be disallowed in the future.
What should be done to suppress the warning?
The problem is the failure to require secure coding on the unarchiver:
https://developer.apple.com/documentation/foundation/nskeyedunarchiver/1410824-requiressecurecoding
But more broadly, it is very odd to pass data through a keyed archiver when Codable values can already be saved directly, for example by writing the output of JSONEncoder or PropertyListEncoder straight to the file URL.

How to unpack an google.protobuf.Any type in gRPC nodejs client?

My protobuf file is like this:
syntax = "proto3";

import "google/protobuf/any.proto";

service RoomService {
  rpc getTestAny (Hotelid) returns (google.protobuf.Any);
}

message Hotelid {
  string hotelid = 1;
}

message HtlInDate {
  Hotelid hotelid = 1;
  string date = 2;
}
My Java gRPC server code is like this:
@Override
public void getTestAny(Roomservice.Hotelid request, StreamObserver<Any> responseObserver) {
    Roomservice.Hotelid hotelid = Roomservice.Hotelid.newBuilder()
            .setHotelid("This is Hotelid")
            .build();
    Roomservice.HtlInDate htlDate = Roomservice.HtlInDate.newBuilder()
            .setHotelid(hotelid)
            .setDate("This is Data")
            .build();
    responseObserver.onNext(Any.pack(htlDate));
    responseObserver.onCompleted();
}
And I make a request from a Node.js gRPC client, whose code is like this:
function () {
  var client = new services.RoomServiceClient('localhost:6565',
    grpc.credentials.createInsecure());
  var request = new messages.Hotelid();
  var hotelid = "ignore";
  request.setHotelid(hotelid);
  var call = client.getTestAny(request, function (err, response) {
    var obj = response.toObject();
    console.log(obj);
  });
}
The response in the Node.js gRPC client is of type Any, and it contains a data array:
array: ["type.googleapis.com/HtlInDate", Uint8Array[10,17,10...]]
I try to use response.toObject() to get the HtlInDate instance, but I just get this:
obj:{
typeUrl:"type.googleapis.com/HtlInDate",
value:"ChEKD1RoaXMgaXMgSG90ZWxpZBIMVGhpcyBpcyBEYXRh"
}
So how can I unpack the Any type response and get the HtlInDate instance exactly? Thanks a lot if you have any idea about this!
Currently, the google.protobuf.Any type is not supported in Node.js, either in Protobuf.js, which gRPC uses by default, or in google-protobuf, which is the official first-party protobuf implementation.
From documentation:
https://developers.google.com/protocol-buffers/docs/reference/javascript-generated#message
// Storing an arbitrary message type in Any.
const status = new proto.foo.ErrorStatus();
const any = new Any();
const binarySerialized = ...;
any.pack(binarySerialized, 'foo.Bar');
console.log(any.getTypeName()); // foo.Bar
// Reading an arbitrary message from Any.
const bar = any.unpack(proto.foo.Bar.deserializeBinary, 'foo.Bar');
Please note that for browser support you need to use webpack (probably with babel-loader) or browserify.
As found in the google-protobuf tests, Any comes with pack and unpack functions.
Your code could unpack the response like this:
function () {
  var client = new services.RoomServiceClient('localhost:6565',
    grpc.credentials.createInsecure());
  var request = new messages.Hotelid();
  var hotelid = "ignore";
  request.setHotelid(hotelid);
  var call = client.getTestAny(request, function (err, response) {
    var obj = response.toObject();
    console.log('Any content', obj);
    var date = response.unpack(messages.HtlInDate.deserializeBinary, response.getTypeName());
    console.log('HtlInDate', date.toObject());
  });
}
This will deserialize the bytes received in the Any object.
You could also build an Any using the pack function, wrapping a type URL and value:
var someAny = new Any();
someAny.pack(date.serializeBinary(), 'HtlInDate')

How to decompress in node data that was compressed in ruby with gzip?

We have a Ruby instance that sends a message to a Node instance via RabbitMQ (bunny and amqplib) like below:
{ :type => data, :data => msg }.to_bson.to_s
This seems to be going pretty well, but the msgs are sometimes long and we are sending them across data centers, so zlib would help a lot.
Doing something like this in the Ruby sender:
encoded_data = Zlib::Deflate.deflate(msg).force_encoding(msg.encoding)
and then reading it inside Node:
data = zlib.inflateSync(encoded_data)
returns
"\x9C" from ASCII-8BIT to UTF-8
Is what I'm trying to do possible?
I am not a Ruby dev, so I will write the Ruby part in more or less pseudocode.
Ruby code (run online at https://repl.it/BoRD/0)
require 'json'
require 'zlib'
car = {:make => "bmw", :year => "2003"}
car_str = car.to_json
puts "car_str", car_str
car_byte = Zlib::Deflate.deflate(car_str)
# If you try to `puts car_byte`, it will crash with the following error:
# "\x9C" from ASCII-8BIT to UTF-8
#(repl):14:in `puts'
#(repl):14:in `puts'
#(repl):14:in `initialize'
car_str_dec = Zlib::Inflate.inflate(car_byte)
puts "car_str_dec", car_str_dec
# You can check that the decoded message is the same as the source.
# somehow send `car_byte`, the encoded bytes to RabbitMQ.
Node code
var zlib = require('zlib');

// somehow get the message from RabbitMQ.
var data = '...';

zlib.inflate(data, function (err, buffer) {
  if (err) {
    // Handle the error.
  } else {
    // If source didn't have any encoding,
    // no need to specify the encoding.
    console.log(buffer.toString());
  }
});
I also suggest you stick with the async functions in Node instead of their sync alternatives.
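On the receiving side, the main thing to get right is that the compressed payload reaches zlib as a Buffer rather than a string. Here is a sketch of the consumer, assuming amqplib with an illustrative connection URL and queue name; amqplib delivers the message body as msg.content, which is already a Buffer:
const amqp = require('amqplib')
const zlib = require('zlib')

async function consume () {
  // Connection URL and queue name are illustrative.
  const conn = await amqp.connect('amqp://localhost')
  const channel = await conn.createChannel()
  await channel.assertQueue('my-queue')

  channel.consume('my-queue', (msg) => {
    // msg.content is a Buffer, which zlib.inflate accepts directly.
    zlib.inflate(msg.content, (err, buffer) => {
      if (err) {
        console.error('inflate failed:', err)
        channel.nack(msg)
        return
      }
      console.log(buffer.toString()) // the JSON string produced on the Ruby side
      channel.ack(msg)
    })
  })
}

consume().catch(console.error)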
