bad request error when persisting TableEntity - azure

I have a TableEntity like this:
public class MyTableEntity : TableEntity
{
    public string SomeXml { get; set; }
}
which contains an XML string called SomeXml. Most of these entities persist fine, but for some I get:
{"The remote server returned an error: (400) Bad Request."}
The XML string of one of the TableEntities producing the exception contains 33933 characters. Is there a limit? Not sure how else to establish the cause of the exception. One sample XML causing the exception can be found here.

The reason you're getting this error is that the data you're trying to insert exceeds the maximum size allowed for an entity property. The maximum size of an entity property is 64 KB; however, because strings in Azure Tables are UTF-16 encoded, the maximum size of a String property works out to roughly 32K characters.
Because your XML is more than 32K characters, you're getting this error.
When I tried to insert the sample data you shared in a table in my storage account I got the following error back:
{
    "odata.error": {
        "code": "PropertyValueTooLarge",
        "message": {
            "lang": "en-US",
            "value": "The property value exceeds the maximum allowed size (64KB). If the property value is a string, it is UTF-16 encoded and the maximum number of characters should be 32K or less.\nRequestId:693f46ec-0002-0012-3a5a-cbcb16000000\nTime:2016-06-21T01:14:00.4544620Z"
        }
    }
}
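If an entity really does need to carry a string longer than that, one common workaround is to split it across several properties, each under the 32K-character cap (or to store the XML in blob storage and keep only a reference in the table). The sketch below illustrates the chunking idea; the XmlChunker class and the SomeXml0, SomeXml1, ... property names are illustrative, not part of the storage SDK.
using System;
using System.Collections.Generic;

public static class XmlChunker
{
    // Each chunk stays under the ~32K-character per-property limit.
    private const int MaxChars = 32000;

    // Splits an oversized XML string into (propertyName, chunk) pairs,
    // e.g. SomeXml0, SomeXml1, ... which can be written as separate properties.
    public static IEnumerable<KeyValuePair<string, string>> Split(string xml)
    {
        for (int i = 0, part = 0; i < xml.Length; i += MaxChars, part++)
        {
            int length = Math.Min(MaxChars, xml.Length - i);
            yield return new KeyValuePair<string, string>("SomeXml" + part, xml.Substring(i, length));
        }
    }
}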

Related

How to fix fabric chaincode returning "Additional property records is not allowed"

I get an error when writing chaincode with "github.com/hyperledger/fabric-contract-api-go/contractapi":
type PaginatedQueryResult struct {
    Records             []asset `json:"records"`
    FetchedRecordsCount int32   `json:"fetchedRecordsCount"`
    Bookmark            string  `json:"bookmark"`
    Completed           bool    `json:"completed"`
}
When Records is nil, it reports the error: "asset_transfer_ledger chaincode Value did not match schema:\n 1. return.records: Invalid type. Expected: array, given: null", so I updated the PaginatedQueryResult struct like this:
type PaginatedQueryResult struct {
    Records             []asset `json:"records,omitempty" metadata:",optional"`
    FetchedRecordsCount int32   `json:"fetchedRecordsCount"`
    Bookmark            string  `json:"bookmark"`
    Completed           bool    `json:"completed"`
}
If Records is nil this is OK, but when Records is not nil I get the error: "Additional property records is not allowed".
Thanks for posting this; you have led me to find an error in the code. The issue is that the code assumes the json tag contains the name only and doesn't expect ,omitempty, so the metadata schema ends up with a property called records,omitempty. When a value for records is supplied, it is therefore not found in the schema as a valid property. Since the metadata tag overrides any json value, the workaround until the core code is fixed is to add the name to your metadata tag as well as the JSON tag. Your struct would therefore become:
type PaginatedQueryResult struct {
    Records             []asset `json:"records,omitempty" metadata:"records,optional"`
    FetchedRecordsCount int32   `json:"fetchedRecordsCount"`
    Bookmark            string  `json:"bookmark"`
    Completed           bool    `json:"completed"`
}
Note that records now appears in both the JSON tag (for marshalling purposes) and the metadata tag.
I have opened a JIRA for this issue here: https://jira.hyperledger.org/browse/FABCAG-31

Invalid wire type and index out of range errors when consuming a protobuf message from .net with protobufjs on nodejs

I am trying to consume a protobuf message from RMQ on Node.js.
The protobuf message was created with protobuf-net in C#/.NET.
So, for example, the C# object looks like this:
[ProtoContract]
public class PositionOpenNotification : MessageBase
{
    [ProtoMember(1)]
    public int PositionID { get; set; }

    [ProtoMember(2)]
    public int InstrumentID { get; set; }

    // ...down to tag 30
}
Then it is added to RMQ, and we have .NET listeners with the same object on the other side to decode it.
But now we want to read the message from Node.js.
For this I am using amqplib and protobufjs on the Node.js side.
I was trying to decode the message using an object with decorators like this:
import { Message, Type, Field } from "protobufjs/light";

@Type.d("PositionOpenNotification")
export class PositionOpenNotification extends Message<PositionOpenNotification> {
    @Field.d(1, "int32", "required")
    public PositionID: number;
}
And decoding like this:
ch.consume(q.queue, function(msg, res) {
    try {
        if (msg.content) {
            let decoded = PositionOpenNotification.decode(msg.content);
            console.log(" Received %s", decoded, q.queue);
        }
    } catch (e) {
        console.log("Error %s ", e.message);
    }
});
where ch is the amqplib RMQ channel.
But I always get one of these errors:
invalid wire type 7 at offset 2
invalid wire type 4 at offset 2
invalid wire type 6 at offset 2
index out of range: 237 + 10 > 237
etc
What am I doing wrong?
EDIT:
It looks like I did not take into account the fact that MessageBase (the abstract class which PositionOpenNotification inherits from) is also a ProtoContract, and that the data was serialized with a length prefix.
So in the end this is what worked:
Add a MessageBase object with the PositionOpenNotification object in it:
#Type.d("MessageBase")
export class MessageBase extends Message<MessageBase> {
#Field.d(108, PositionOpenNotification)
public positionOpenNotification: PositionOpenNotification;
}
And then when deserializing it (Reader is also exported from "protobufjs/light"):
if (msg.content) {
    var reader = Reader.create(msg.content);
    let decoded = MessageBase.decodeDelimited(reader);
}
Wire type 7 doesn't exist, so: the error is correct, at least.
This type of error is usually an indicator that the payload has been corrupted in transit. The most common cause is treating it as text, and/or (something seen very, very frequently) running it through an encoding backwards to transmit the binary data over a text protocol. Check that you're not doing this. Basically, you need to get the exact same bytes at both ends; until you have that, nothing else will work. In particular, if you need to transmit binary over a text protocol, base-64 is your friend.
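For instance, if some hop in the pipeline only accepts text, one way to do that wrapping (a minimal sketch, assuming protobuf-net on the producing side; the Transport class name is just illustrative) is to base-64 encode the serialized bytes and decode them back to the identical byte array before handing them to protobufjs:
using System;
using System.IO;
using ProtoBuf;

public static class Transport
{
    // Serialize with protobuf-net, then wrap the raw bytes in base-64 so they
    // survive a text-only channel unmodified.
    public static string Encode<T>(T message)
    {
        using (var ms = new MemoryStream())
        {
            Serializer.Serialize(ms, message);
            return Convert.ToBase64String(ms.ToArray());
        }
    }

    // The receiver must end up with exactly the same bytes before decoding.
    public static byte[] Decode(string payload)
    {
        return Convert.FromBase64String(payload);
    }
}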
As a side note: protobuf-net has methods to export the .proto schema for your object model, to make x-plat more convenient. Look for Serializer.GetProto<T>.
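As a rough sketch of how that can be used (assuming the [ProtoContract] types above are available at compile time):
using System;
using ProtoBuf;

public static class SchemaExport
{
    public static void Main()
    {
        // Emits a .proto definition generated from the [ProtoContract] model,
        // which can then be loaded by protobufjs instead of hand-writing decorators.
        string schema = Serializer.GetProto<PositionOpenNotification>();
        Console.WriteLine(schema);
    }
}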
If you have a payload that you are not sure about, you can use https://protogen.marcgravell.com/decode to validate and inspect the binary data.

RestAssured Assertion failing

Feature File Snippet:
Then The value of messages.type should be ERROR
Actual Service Response:
"messages": [
{
"type": "ERROR"
}]
Console Log:
JSON path messages.type doesn't match.
Expected: a string containing "ERROR"
Actual: [ERROR]
I have tried removing the double quotes from the ERROR parameter in the feature file, but it doesn't work.
Since you have not provided the code you used, this may be because you didn't convert the JSON response to a String. Please try the code below, which shows how to convert the JSON response to a String:
public void jsonPathExample() {
    Response response = given().contentType(ContentType.JSON).get("http://localhost:3000/posts");
    // we need to convert the response to a String
    JsonPath jsonPath = new JsonPath(response.asString());
    String actualType = jsonPath.getString("type");
    // then do your assertion
}

Hamcrest closeTo not working in RestAssured.body()

I have a test whose syntax I cannot get right:
@Test
void statsTest() {
    given().queryParam("param", "ball")
        .when().get()
        .then().body("total", is(closeTo(10.0, 0.1 * 10.0)));
}
However, the test keeps failing even though the condition is met:
java.lang.AssertionError: 1 expectation failed.
JSON path total doesn't match.
Expected: is a numeric value within <1.0> of <10.0>
Actual: 10
I've never had a problem with types before in this setup of RestAssured and Hamcrest. For example, a test of the sort: body("total", greaterThan(9)) works fine, which means that there is some type casting under the hood.
I've looked through the docs and cannot find a way to cast the value of body("total") to a numeric value, so I suspect that this is a bug or that I'm not understanding something here.
Here's the JSON response. I had to clip it to make it short. Hope this works.
{
    "stats": {
        "totalHits": 1,
        "searchEngineTimeInMillis": 83,
        "searchEngineRoundTripTimeInMillis": 87,
        "searchProcessingTimeInMillis": 101
    },
    "products": {
        "id": "total",
        "displayName": "Documents",
        "ball": 10
    }
}
The value corresponding to the key "total" in your response seems to be of integer type, so it needs to be checked against integer bounds (1, 10). Instead of using the closeTo matcher, you can use the following matcher:
allOf(greaterThanOrEqualTo(1), lessThanOrEqualTo(10))
I've put together an approach that solves the problem in a slightly different way. Much thanks to those who populate the web with their code samples. The following assumes you have already set the base URI and path. You can go deeper into the response by using get("/path..."). This answer assumes a JSON-type response.
private static Response getResponse(String paramName, String paramValue) {
    return given().queryParam(paramName, paramValue)
            .when().get();
}

public static String getJsonValue(String jsonPath, String paramName, String paramValue) {
    Response response = getResponse(paramName, paramValue);
    // response.getBody().prettyPrint();
    JsonPath jsonPathEvaluator = response.jsonPath();
    return jsonPathEvaluator.get(jsonPath).toString();
}
You can simply print the return value and cast it to the type you need.
The test then looks like this:
public static void checkIfNumberCloseToValue(String jsonPath,
                                             String paramName,
                                             String paramValue,
                                             Double error,
                                             Double expected) {
    Double value = Double.valueOf(Utils.getJsonValue(jsonPath, paramName, paramValue));
    double range = expected * error;
    assertThat(value, closeTo(expected, range));
}

Configuring notification tag for Azure Function

I'm using an Azure function to pick up messages from an event hub and send out notifications via the Azure notification hub. Works great! Now I wanted to see whether I could add tags to those messages in order to allow user targeting via those tags.
The output binding for the notification hub has a "tag expression" parameter which you can configure, but this seems to be static text. I need to set these tags dynamically based on the message received from the event hub instead. I'm not sure whether you can somehow put dynamic content in there?
I also found that the constructor of the GcmNotification object that I'm using has an overload which accepts a tag string. But when I try that I get a compile-time warning saying it is deprecated, and when the function fires it shows an error because the Tag property should be empty.
So I'm not clear on a) whether this is at all possible and b) how to do it when it is. Any ideas?
Update: as suggested I tried creating a POCO object to map to my input string. The string is as follows:
[{"deviceid":"repsaj-neptune-win10pi","readingtype":"temperature1","reading":22.031614503139451,"threshold":23.0,"time":"2016-06-22T09:38:54.1900000Z"}]
The POCO object:
public class RuleMessage
{
    public string deviceid;
    public string readingtype;
    public object reading;
    public double threshold;
    public DateTime time;
}
For the function I now tried both RuleMessage[] and List<RuleMessage> as parameter types, but the function complains it cannot convert the input:
2016-06-24T18:25:16.830 Exception while executing function: Functions.submerged-function-ruleout. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'inputMessage'. Microsoft.Azure.WebJobs.Host: Binding parameters to complex objects (such as 'RuleMessage') uses Json.NET serialization.
1. Bind the parameter type as 'string' instead of 'RuleMessage' to get the raw values and avoid JSON deserialization, or
2. Change the queue payload to be valid json. The JSON parser failed: Cannot deserialize the current JSON array (e.g. [1,2,3]) into type 'Submission#0+RuleMessage' because the type requires a JSON object (e.g. {"name":"value"}) to deserialize correctly.
To fix this error either change the JSON to a JSON object (e.g. {"name":"value"}) or change the deserialized type to an array or a type that implements a collection interface (e.g. ICollection, IList) like List that can be deserialized from a JSON array. JsonArrayAttribute can also be added to the type to force it to deserialize from a JSON array.
Function code:
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Microsoft.Azure.NotificationHubs;
public static void Run(List<RuleMessage> inputEventMessage, string inputBlob, out Notification notification, out string outputBlob, TraceWriter log)
{
    if (inputEventMessage == null || inputEventMessage.Count != 1)
    {
        log.Info($"The inputEventMessage array was null or didn't contain exactly one item.");
        notification = null;
        outputBlob = inputBlob;
        return;
    }

    log.Info($"C# Event Hub trigger function processed a message: {inputEventMessage[0]}");

    if (String.IsNullOrEmpty(inputBlob))
        inputBlob = DateTime.MinValue.ToString();

    DateTime lastEvent = DateTime.Parse(inputBlob);
    TimeSpan duration = DateTime.Now - lastEvent;

    if (duration.TotalMinutes >= 0)
    {
        notification = GetGcmMessage(inputEventMessage[0]);
        log.Info($"Sending notification message: {notification.Body}");
        outputBlob = DateTime.Now.ToString();
    }
    else
    {
        log.Info($"Not sending notification message because of timer ({(int)duration.TotalMinutes} minutes ago).");
        notification = null;
        outputBlob = inputBlob;
    }
}

private static Notification GetGcmMessage(RuleMessage input)
{
    string message;

    if (input.readingtype == "leakage")
        message = String.Format("[FUNCTION GCM] Leakage detected! Sensor {0} has detected a possible leak.", input.reading);
    else
        message = String.Format("[FUNCTION GCM] Sensor {0} is reading {1:0.0}, threshold is {2:0.0}.", input.readingtype, input.reading, input.threshold);

    message = "{\"data\":{\"message\":\"" + message + "\"}}";

    return new GcmNotification(message);
}

public class RuleMessage
{
    public string deviceid;
    public string readingtype;
    public object reading;
    public double threshold;
    public DateTime time;
}
Update 28-6-2016: I've now managed to get it working by switching the ASA output to line-separated so that it doesn't generate a JSON array any more. This is a temporary fix, because the Function binding now fails as soon as there is more than one line in the output (which can happen).
Anyway, I then proceeded to set the tagExpression; as per the instructions I changed it to:
{
    "type": "notificationHub",
    "name": "notification",
    "hubName": "repsaj-neptune-notifications",
    "connection": "repsaj-neptune-notifications_NOTIFICATIONHUB",
    "direction": "out",
    "tagExpression": "deviceId:{deviceid}"
}
Where {deviceid} equals the deviceid property on my RuleMessage POCO. Unfortunately this doesn't work; when I call the function it outputs:
Exception while executing function: Functions.submerged-function-ruleout. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'notification'. Microsoft.Azure.WebJobs.Host: No value for named parameter 'deviceid'.
Which is not true; I know for sure the property has been set, as I've logged it to the output window. I also tried something like {inputEventMessage.deviceid}, but that doesn't work either (I don't see how the runtime would map {deviceid} to the correct input object when there's more than one).
The tagExpression binding property supports binding parameters coming from trigger input properties. For example, assume your incoming Event Hub event has properties A and B. You can use these properties in your tagExpression using the curly-brace syntax, e.g. My Tag {A}-{B}.
In general, most of the properties across all the binding types support binding parameters in this way.
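As a sketch, a function.json output binding using those two hypothetical trigger properties A and B (the hub name and connection setting are placeholders) might look like this:
{
    "type": "notificationHub",
    "name": "notification",
    "hubName": "<your-hub-name>",
    "connection": "<your-notification-hub-connection>",
    "direction": "out",
    "tagExpression": "My Tag {A}-{B}"
}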
