I want to know how to subscribe to a topic and print the published payload on another device.
I tried sending the payload data I used to publish by converting it to a byte array and passing it as the parameter of the receivecallback function.
I am attempting to route device telemetry data for a device connected to Azure IoTHub.
I have defined the custom endpoint in message routing to a storage account with the Encoding format set to JSON and routing query set to true.
This has successfully sent the data to the storage account, but the telemetry data in the message body is Base64-encoded, as shown below:
{"EnqueuedTimeUtc":"2022-07-13T13:03:28.4770000Z","Properties":{},"SystemProperties":{"connectionDeviceId":"SensorTile","connectionAuthMethod":"{\"scope\":\"device\",\"type\":\"sas\",\"issuer\":\"iothub\",\"acceptingIpFilterRule\":null}","connectionDeviceGenerationId":"6*********971","enqueuedTime":"2022-07-13T13:03:28.4770000Z"},"Body":"eyJBY2NlbGVyb21ldGVyIjp7IlkiOi0xNSwiWCI6MTAsIloiOjEwMzZ9LCJ0cyI6IjIwMjItMDctMTNUMTU6MDM6MjguNDAwKzAyMDAiLCJpZCI6IlNlbnNvclRpbGUifQ=="}
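To confirm the routed record is intact, the Body field can be Base64-decoded back into the original telemetry JSON. A minimal Python sketch using the sample Body value from the record above:

```python
import base64
import json

# The Base64 "Body" value from the routed record above
body = ("eyJBY2NlbGVyb21ldGVyIjp7IlkiOi0xNSwiWCI6MTAsIloiOjEwMzZ9LCJ0cyI6IjIw"
        "MjItMDctMTNUMTU6MDM6MjguNDAwKzAyMDAiLCJpZCI6IlNlbnNvclRpbGUifQ==")

# Decode Base64, then parse the resulting JSON text
telemetry = json.loads(base64.b64decode(body))
print(telemetry["Accelerometer"])
```

This only works as a post-hoc fix; setting the content type and encoding on the device (as discussed below) avoids the Base64 step entirely.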
Reading the documentation
"When using JSON encoding, you must set the contentType to application/json and contentEncoding to UTF-8 in the message system properties. Both of these values are case-insensitive. If the content encoding is not set, then IoT Hub will write the messages in base 64 encoded format."
I understand it is possible to have the data written in UTF-8 by setting the system properties contentType to application/json and contentEncoding to UTF-8, but I am unsure where and how to actually do this. Or can I use another service such as Stream Analytics/Functions/Event Hubs to achieve this?
Also, is it possible to filter messages via the routing query so that only telemetry data is routed and the rest is ignored?
Any help is greatly appreciated
The content type and content encoding need to be set when sending the message. That means the device is in charge of setting these properties. Here are a few examples in different languages.
C#
using var eventMessage = new Message(Encoding.UTF8.GetBytes(messagePayload))
{
    // IoT Hub expects the literal value "utf-8" here
    ContentEncoding = "utf-8",
    ContentType = "application/json",
};
C
IOTHUB_MESSAGE_HANDLE message_handle = IoTHubMessage_CreateFromString(message);
// The content type is URL-encoded ("application%2fjson") because it travels in the MQTT topic
(void)IoTHubMessage_SetContentTypeSystemProperty(message_handle, "application%2fjson");
(void)IoTHubMessage_SetContentEncodingSystemProperty(message_handle, "utf-8");
Java
Message msg = new Message(msgStr);
msg.setContentType("application/json");
msg.setContentEncoding("utf-8");
msg.setProperty("temperatureAlert", temperature > 28 ? "true" : "false");
Setting content type and content encoding is the responsibility of the IoT device that is sending messages to the IoT hub. Setting these values depends on the language-specific device SDK used by the IoT device.
Without these settings, message routing queries based on the message body won't work, as mentioned in this link. Also, to filter messages based on telemetry data, you don't need filtering queries: you can create a route with 'Data source' set to 'Device Telemetry Messages' so that only device telemetry data will be routed.
I am able to send events to Event Hubs as a batch using the NuGet package Azure.Messaging.EventHubs, but I want to use the REST API.
I am able to send single events to Event Hubs via the REST API using Postman. But per the documentation, to send batch events via the REST API I need to add the header Content-Type: application/vnd.microsoft.servicebus.json, and each message must be wrapped in a "Body" property,
like [{"Body":"Message1"},{"Body":"Message2"},{"Body":"Message3"}]
So if I need to send JSON as the event payload, should I create a JSON string and send it?
Sample:
[
    {
        "Body": "{\"ID\":\"1\",\"Name\":\"abc1\"}"
    },
    {
        "Body": "{\"ID\":\"2\",\"Name\":\"def1\"}"
    },
    {
        "Body": "{\"ID\":\"3\",\"Name\":\"xyz1\"}"
    }
]
Or is there any other option to send events as a batch to Event Hubs using the REST API?
If "Body" is string content, then yes, you must escape your JSON content first. Your sample looks OK to me.
Yes, you could create an events list -
List<Event> events = new List<Event>()
{
    new Event(1, "abc1"),
    new Event(2, "def1"),
    new Event(3, "xyz1"),
};
and send it via POST.
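As a sketch of what the REST call could look like, here is a minimal Python example that builds the batch body (each inner event JSON-stringified so it lands inside "Body" as an escaped string) and prepares the POST request. The namespace, hub name, and SAS token are placeholders you would supply yourself:

```python
import json
import urllib.request

events = [{"ID": "1", "Name": "abc1"},
          {"ID": "2", "Name": "def1"},
          {"ID": "3", "Name": "xyz1"}]

# json.dumps on each event produces the escaped inner string for "Body"
batch = json.dumps([{"Body": json.dumps(e)} for e in events])

req = urllib.request.Request(
    url="https://mynamespace.servicebus.windows.net/myhub/messages",  # placeholder namespace/hub
    data=batch.encode("utf-8"),
    headers={
        "Content-Type": "application/vnd.microsoft.servicebus.json",
        "Authorization": "SharedAccessSignature sr=...",  # placeholder SAS token
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the batch
```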
I have set up an HTTP Receive (request-response) adapter, and the message appears to be getting to the MessageBox. When I create an orchestration using a direct-bound logical port, I get the message, but everything I have tried to read the message body has failed (passthrough pipeline, XML pipeline with allow unrecognized files = true), and I get exceptions any time I try to use the incoming message (message assignments, sending the message to a custom module to try to read the parts).
Rather than go into details on the exceptions, can anyone point me to instructions on the proper way to access the body of an HTTP GET message within an orchestration? To explain what I am trying to do: I want to take the query string (body) and send it verbatim to another orchestration for processing, so I simply want to extract the body (query string) from the message.
For a GET request without a body, you need to use the WCF-WebHttp adapter rather than the deprecated BTSHTTPReceive.dll.
With the WCF-WebHttp you can use the Variable Mapping to populate message context properties with the URI parameters.
So the answer was to NOT use the HTTP adapter for GET requests. I did not realize the HTTP adapter has effectively been deprecated. For basic GET requests I had to switch to the WCF-WebHttp adapter, make sure to include the property in the property schema, and then set the schema in the variable mapping to the property schema, not the message type schema of the incoming message. I wish the Microsoft documentation were clearer that the HTTP adapter cannot be used for very basic GET requests in which no body is provided.
I'm using PubNub encrypted messages (https://www.pubnub.com/docs/javascript/api-reference-sdk-v4#init) via a cipher key between two clients. Now I've started intercepting those messages with PubNub BLOCKS, but I can't find a way to decrypt them: I receive a long Base64 string, and there is no tool to decrypt it via either the provided crypto module or the provided pubnub module. The block is super simple:
export default (request) => {
    // Log the request envelope. When tested with a sample payload the message is
    // shown in clear text; when a real message goes through, it is a Base64
    // string of the encrypted message.
    console.log(request);
    return request.ok();
}
Publishing Unencrypted Meta Data with an Encrypted Message Payload
Currently, if you are using AES encryption, the assumption is that you want end-to-end encryption and do not want the message to be decryptable in flight by a BLOCK.
However, if there is information you want to act on, you can pass it using the meta argument. This meta information sits outside the message payload and is not encrypted, and is therefore accessible by a BLOCK.
The PubNub PM team would love to hear more about your use case and why you would want to decrypt this message in flight to see if this is something we need to add to the roadmap. Please send a message to PubNub Support with more details.
Here is some sample code you can use to see the meta data in action with a block:
Publishing a message with meta data
The meta data portion of your message is never encrypted and is meant for data that you would use for filtering messages (and other use cases). If you are using a cipher key when you init PubNub, the message portion of the payload is encrypted end to end (not decrypted within PubNub, since we do not know your cipher key). But the meta portion remains clear text so that you can perform conditional logic in a block based on these keys/values, or for stream filtering on a per-client basis.
pubnub.publish(
    {
        channel: "chmeta",
        message: { "text": "hello" },
        meta: {
            "cool": "beans"
        }
    },
    function(status, response) {
        console.log(status, response);
    }
);
Accessing meta data with PubNub BLOCKS
In your block code (Before or After Publish event handlers), you can access the meta key as follows:
export default (request) => {
    console.log(JSON.parse(request.params.meta));
    return request.ok();
}
The output of the entire request parameter is quite verbose, and I encourage you to review it, as it has lots of gems you might want to take advantage of; but homing in on the meta key (request.params.meta) will give you access to the meta data you provided in the publish. The JSON.parse is needed because the data will be stringified (escaped), {\"cool\": \"beans\"}, and this transforms it back into a real live JSON object, {"cool": "beans"}.
Okay, I did it myself. The code is ugly and I am open to any help with refactoring, but it works: it lets you decrypt messages in PubNub Functions (blocks).
Here's the Gist - https://gist.github.com/DataGreed/f0007e7b5b8dcfadd8a44a5d3514b6dc
Don't forget to change the encryption key in getKey function.
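For reference, here is a sketch of the legacy PubNub cipher-key derivation that approaches like the linked Gist rely on. This is my understanding of the legacy scheme, not official PubNub code: the AES-256-CBC key is the first 32 hex characters of SHA-256(cipher key), used as literal bytes, with a static IV. The actual CBC decryption of the Base64 payload needs a crypto library and is omitted here.

```python
import hashlib

def pubnub_legacy_key(cipher_key: str) -> bytes:
    # First 32 hex characters of SHA-256(cipher_key), used as the
    # literal 32-byte AES key (assumption based on the legacy scheme).
    return hashlib.sha256(cipher_key.encode("utf-8")).hexdigest()[:32].encode("utf-8")

STATIC_IV = b"0123456789012345"  # fixed IV used by the legacy scheme

key = pubnub_legacy_key("my-cipher-key")  # hypothetical cipher key
```

With key and IV in hand, the message is base64-decoded and decrypted with AES-256-CBC, then JSON-parsed.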
PubNub Functions (formerly called BLOCKS) now has Crypto Module
I believe at the time you asked this question, the crypto module was not included with BLOCKS (we rebranded this as PubNub Functions). Now it is:
PubNub Functions Crypto Module Docs
I'm using a Logic App to send a message to a Service Bus on Azure. The Logic App starts with an HTTP Request trigger which contains a JSON payload in the body. The 'Body' of the request is set as the Content of the Send Message action. Since the payload is JSON, when posting I set the Content-Type to application/json. This generates an error on the Send Message action:
{"code":"InvalidTemplate","message":"Unable to process template language expressions in action 'Send_message.' inputs at line '1' and column '1221': 'The template language function 'encodeBase64' expects its parameter to be a string. The provided value is of type 'Object'. Please see https://aka.ms/logicexpressions#encodeBase64 for usage details.'."}
So I tried changing the Content-Type to text/plain and it works? Is this a bug, or should I convert the JSON to a text value somehow before using it in the Send Message action?
Sending a message to Service Bus requires the message content to be Base64 encoded. Since your content is JSON, you need to stringify it explicitly prior to encoding, i.e. use @encodeBase64(string(jsonContent))
Changing the content type to text/plain has the same effect, since in that case the content is treated as a string to begin with.
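The failure mode can be illustrated outside Logic Apps: Base64 encoding operates on strings/bytes, not objects, so a JSON object has to be serialized first, which is what string() does in the expression. A small Python analogy (payload values are made up for illustration):

```python
import base64
import json

payload = {"ID": "1", "Name": "abc1"}  # stand-in for the JSON body from the HTTP trigger

# base64.b64encode(payload) would fail: it needs bytes, not a dict,
# just as encodeBase64 needs a string, not an object.
encoded = base64.b64encode(json.dumps(payload).encode("utf-8")).decode("ascii")

# The receiver reverses both steps to get the object back.
decoded = json.loads(base64.b64decode(encoded))
```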