Serialize Python class using AvroProducer confluent-kafka - python-3.x

I am pretty new to confluent-kafka and Python. I would like to know if there is a way in Python to serialize a Python class to a Kafka message using an Avro schema.
I am currently using the AvroProducer provided by confluent-kafka; however, I am only able to serialize a JSON string. The downstream application is written in Java and wants to deserialize the message back into a Java object.
producer = AvroProducer(self.producer_config, default_key_schema=key_schema,
                        default_value_schema=value_schema)
value = ast.literal_eval(str(message))
try:
    producer.produce(topic=topic, partition=partition, key=str(message['Id']),
                     value=value)
I believe this is the part that is causing the issue:
value = ast.literal_eval(str(message))
The message is originally a dictionary. I have to convert it to a JSON string in order to serialize it. It just looks really weird.
Could you advise how to serialize a Python class, instead of a JSON string, using AvroProducer? Thanks!
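AvroProducer serializes plain Python dicts against the registered value schema, so the usual pattern is to convert the class instance to a dict first (for example with dataclasses.asdict or vars()) rather than round-tripping through a string. A minimal sketch follows; the User class, schema, topic, and broker/registry addresses are illustrative assumptions, not taken from the question:
from dataclasses import dataclass, asdict
from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer

# Hypothetical value schema matching the fields of the class below.
value_schema = avro.loads("""
{
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "Id", "type": "string"},
    {"name": "Name", "type": "string"}
  ]
}
""")
key_schema = avro.loads('{"type": "string"}')

@dataclass
class User:
    Id: str
    Name: str

producer = AvroProducer(
    {"bootstrap.servers": "localhost:9092",
     "schema.registry.url": "http://localhost:8081"},
    default_key_schema=key_schema,
    default_value_schema=value_schema,
)

user = User(Id="42", Name="Alice")
# asdict() yields the dict AvroProducer expects; no string round-trip needed.
producer.produce(topic="users", key=user.Id, value=asdict(user))
producer.flush()
Because the record is written against a schema registered in the Schema Registry, the Java consumer can deserialize it straight into the Avro-generated class for that schema.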

Related

How do I convert a struct of type Data to an object of type CNGroup?

I'm having trouble converting a CNGroup object to a Data object and back to a CNGroup object. I decided to start rethinking the problem again. Somewhere along the way I decided that I should use the Data class to save a CNGroup object to CloudKit. I also learned that the field type to use in my CKRecord object would be of the type bytes.
Am I correct so far?
I am able to convert a CNGroup object to a Data object and back again unless I store the Data object in CloudKit and then retrieve it before I convert the Data object back to a CNGroup object, in which case I get an error when I try to access the pointee property of the typed pointer. That would be an UnsafeBufferPointer, an UnsafeMutableBufferPointer, or an UnsafePointer.
I've tried a lot of different code in a lot of different ways. It is impractical to put so much code in my post. I have used the copyBytes method and the withUnsafeBytes method of the Data object.
There is one simple piece of code, from when I converted the CNGroup object to a Data object:
func convertCNGroupToData(fromCNGroup group: inout CNGroup) -> Data {
    return Data(bytes: &group, count: MemoryLayout.size(ofValue: group))
}
I am looking for a simple way to do what I want. I am taking another look at Apple's documentation for Data and NSData.
I am not able to be more specific with this question. I appreciate any effort to help me with this.

No attribute error passing broadcast variable from PySpark to Java function

I have a Java class registered in PySpark, and I'm trying to pass a Broadcast variable from PySpark to a method in this class, like so:
from py4j.java_gateway import java_import

java_import(spark.sparkContext._jvm, "net.a.b.c.MyClass")
myPythonGateway = spark.sparkContext._jvm.MyClass()

with open("tests/fixtures/file.txt", "rb") as binary_file:
    data = spark.sparkContext.broadcast(binary_file.read())

myPythonGateway.setData(data)
But this is throwing:
AttributeError: 'Broadcast' object has no attribute '_get_object_id'
However, if I pass the byte[] directly, without wrapping it in broadcast(), it works fine. But I need this variable to be broadcast, as it will be used repeatedly.
According to the py4j docs, the above error will be thrown if you try to pass a Python collection to a method that expects a Java collection. The docs give the following solution:
You can explicitly convert Python collections using one of the following converters located in the py4j.java_collections module: SetConverter, MapConverter, ListConverter.
An example is provided there also.
Presumably, this error occurs when py4j tries to convert the value attribute of the Broadcast object, so converting it explicitly may fix the problem, e.g.:
from py4j.java_collections import ListConverter

converted_data = ListConverter().convert(binary_file.read(),
                                         spark.sparkContext._jvm._gateway_client)
broadcast_data = spark.sparkContext.broadcast(converted_data)
myPythonGateway.setData(broadcast_data)
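Alternatively, since the question notes that a raw byte[] goes through fine, a simpler sketch (assuming setData only needs the bytes once, on the driver) is to keep the Broadcast handle on the Python side and pass only its value attribute to Java:
with open("tests/fixtures/file.txt", "rb") as binary_file:
    data = spark.sparkContext.broadcast(binary_file.read())

# py4j converts Python bytes/bytearray to a Java byte[] by itself,
# so unwrap the Broadcast here and send only its payload to the JVM.
myPythonGateway.setData(data.value)
The Broadcast handle itself stays available for reuse inside Spark tasks on the Python side.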

How do I convert a custom object to the bytes data type and back again?

I am trying to follow the websocket tutorial that can be found in the README of this Python GitHub repo:
https://github.com/aaugustin/websockets
For my use case, I want the client to pass not a string but an object to the websocket server. When I tried replacing the generic "Hello World!" parameter that the client sends to the server, though, I got the following error:
TypeError: data must be bytes or str
Ok, makes sense. Obviously websockets require a string or a bytes object to be passed from client to server. My question is: how do I easily convert a generic object of some custom class I've created to the bytes/string type, following best practice? Obviously, I would also like to be able to convert the object back from the bytes class to the original class type I have declared.
When searching, I couldn't find anything about how to do this (only how to do it for strings), and I tried hard-casting by passing my object into the bytes() constructor, but this threw an error.
Thoughts?
I am an idiot. Converting the object to JSON works.
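For the record, a minimal sketch of that JSON round-trip; the Message class and its fields are made-up placeholders:
import json

class Message:
    def __init__(self, user, text):
        self.user = user
        self.text = text

msg = Message("alice", "hello")
# Object -> dict -> JSON string; websockets accepts str payloads.
payload = json.dumps(vars(msg))

# JSON string -> dict -> object, via keyword unpacking.
received = Message(**json.loads(payload))
This works as long as the attributes are themselves JSON-serializable; nested custom objects need a custom encoder.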

How to transform a LinkedHashMap payload to an Object payload in Mule?

I want to transform a LinkedHashMap payload to an Object payload in Mule. I used the Byte Array to Object transformer, but it doesn't work for me. Any ideas, guys?
You can use DataWeave to transform a payload of a generic type (java.util.Map) to a specific type (foo.bar.Type in the example):
%dw 1.0
%output application/java
---
payload as :object {
    class: "foo.bar.Type"
}
You seem to mention the Object type specifically. A LinkedHashMap is already an instance of Object: every Java instance inherits from the root class Object.
If you want to transform your HashMap into a specific format such as JSON, or a custom object such as com.mycompany.CompData, you have several possibilities depending on your use case:
use DataWeave as mentioned in other answers (requires EE)
use a built-in transformer such as Object-to-JSON
implement your own Transformer by extending AbstractTransformer
See the docs for details: https://docs.mulesoft.com/mule-user-guide/v/3.8/using-transformers
If you can be more specific about your use case, I'll gladly refine my answer ;)
You can use either DataWeave or the JSON to Object transformer.

Spring Integration Object To Map Transformer

I am using SI 4.0 and trying to use object-to-map-transformer as below:
<integration:object-to-map-transformer input-channel="inputChannel"
                                       output-channel="outChannel">
</integration:object-to-map-transformer>
I am sending an object of a Person-like class on the inputChannel, but the moment I run my test it fails with the following error:
Caused by: java.lang.IllegalStateException: Neither jackson-databind.jar,
nor jackson-mapper-asl.jar aren't presented in the classpath. at
org.springframework.integration.support.json.JacksonJsonUtils.<clinit>(JacksonJsonUtils.java:41)
I don't understand why it needs Jackson. I looked at the SI code and can see that it needs a Jackson class, but why is this needed when I simply want to map a simple object to a Map?
Thanks
The code that converts the object to a map looks like:
Map<String,Object> result = this.jsonObjectMapper.fromJson(this.jsonObjectMapper.toJson(payload), Map.class);
Since the out-of-the-box implementation of the JsonObjectMapper is Jackson, it requires the latter to be present on the classpath.
We decided to use JSON notation for the Map representation, since any object in JSON has a map-based structure.
If you have another algorithm to do the same, contributions are welcome!
Or you can simply implement your own Transformer with that logic and use it from a generic <transformer>.
