We're experiencing random exceptions when deserializing commands. We have a send-only API that puts messages in a queue; the backend tries to consume the messages but fails. The failures occur randomly, regardless of the message type, and sometimes the backend can deserialize a given message just fine.
We set up NServiceBus to use Newtonsoft.Json (the API uses 12.0.3 and the backend uses 12.0.1) with default settings. We don't use unobtrusive mode, since we declare our commands with ICommand. Azure Service Bus is our transport.
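Roughly, the endpoint setup looks like the following (a minimal sketch, not our exact bootstrap code; the connection string is a placeholder):

// Minimal sketch of the endpoint configuration, assuming NServiceBus 7.x with the
// NServiceBus.Newtonsoft.Json serializer and the Azure Service Bus transport.
var endpointConfiguration = new EndpointConfiguration("AwarenessCenter.Api");

// Newtonsoft.Json with default settings; commands are declared via ICommand,
// so no unobtrusive-mode conventions are configured.
endpointConfiguration.UseSerialization<NewtonsoftSerializer>();

var transport = endpointConfiguration.UseTransport<AzureServiceBusTransport>();
transport.ConnectionString("<azure-service-bus-connection-string>");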
The most disturbing thing is that we have an almost identical infrastructure in our staging environment (BETA), and there everything works fine. When we reproduce the same message in both environments (BETA and PRODUCTION), it can be deserialized on BETA with the same data, whereas on PRODUCTION it can't. All headers and body data are equal, even the enclosed message types. Both environments run byte-for-byte identical binaries on nearly identically configured Azure App Services (.NET Core 3.1, Windows 64-bit). Here is the failed message as it appears in the error queue:
{
"AmqpMessage": null,
"Body": "{\"Email\":\"vofil69829#bbsaili.com\",\"AccountantId\":{\"Value\":\"75f61737-b9e8-40a3-a1c4-23d7bd61c527\"}}",
"MessageId": "d829e8ce-261f-4033-a294-f279c0390851",
"PartitionKey": null,
"TransactionPartitionKey": null,
"SessionId": null,
"ReplyToSessionId": null,
"TimeToLive": "10675199.02:48:05.4775807",
"CorrelationId": "2ca22dd7-396b-4950-acb6-ad4800ac9d65",
"Subject": null,
"To": null,
"ContentType": "application/json",
"ReplyTo": null,
"ScheduledEnqueueTime": "0001-01-01T00:00:00+00:00",
"ApplicationProperties": {
"NServiceBus.Transport.Encoding": "application/octect-stream",
"NServiceBus.MessageId": "2ca22dd7-396b-4950-acb6-ad4800ac9d65",
"NServiceBus.MessageIntent": "Send",
"NServiceBus.ConversationId": "97743453-c3ac-4e08-a1df-ad4800ac9d65",
"NServiceBus.CorrelationId": "2ca22dd7-396b-4950-acb6-ad4800ac9d65",
"NServiceBus.OriginatingMachine": "dw1sdwk00002R",
"NServiceBus.OriginatingEndpoint": "AwarenessCenter.Api",
"$.diagnostics.originating.hostid": "a5a720d14ccabd7a8ba708f5bc79b2a3",
"NServiceBus.ContentType": "application/json",
"NServiceBus.EnclosedMessageTypes": "AwarenessCenter.Domain.Users.Registration.InviteUserCommand, AwarenessCenter.Domain, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null",
"NServiceBus.Version": "7.4.4",
"NServiceBus.TimeSent": "2021-06-15 10:28:28:288468 Z",
"Diagnostic-Id": "00-1d6433cf14ae654a84e290f66d5faf25-3b4885ae29c8fc4f-00",
"NServiceBus.ExceptionInfo.ExceptionType": "NServiceBus.MessageDeserializationException",
"NServiceBus.ExceptionInfo.InnerExceptionType": "Newtonsoft.Json.JsonReaderException",
"NServiceBus.ExceptionInfo.HelpLink": null,
"NServiceBus.ExceptionInfo.Message": "An error occurred while attempting to extract logical messages from incoming physical message 2ca22dd7-396b-4950-acb6-ad4800ac9d65",
"NServiceBus.ExceptionInfo.Source": "NServiceBus.Core",
"NServiceBus.ExceptionInfo.StackTrace": "NServiceBus.MessageDeserializationException: An error occurred while attempting to extract logical messages from incoming physical message 2ca22dd7-396b-4950-acb6-ad4800ac9d65\r\n ---> Newtonsoft.Json.JsonReaderException: Unexpected character encountered while parsing value: {. Path 'AccountantId', line 1, position 50.\r\n at Newtonsoft.Json.JsonTextReader.ReadStringValue(ReadType readType)\r\n at Newtonsoft.Json.JsonTextReader.ReadAsString()\r\n at Newtonsoft.Json.JsonReader.ReadForType(JsonContract contract, Boolean hasConverter)\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.PopulateObject(Object newObject, JsonReader reader, JsonObjectContract contract, JsonProperty member, String id)\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)\r\n at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)\r\n at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)\r\n at NServiceBus.Newtonsoft.Json.JsonMessageSerializer.ReadObject(Stream stream, Boolean isArrayStream, Type type)\r\n at NServiceBus.Newtonsoft.Json.JsonMessageSerializer.DeserializeMultipleMessageTypes(Stream stream, IList`1 messageTypes, Boolean isArrayStream)\r\n at NServiceBus.Newtonsoft.Json.JsonMessageSerializer.Deserialize(Stream stream, IList`1 messageTypes)\r\n at NServiceBus.DeserializeMessageConnector.Extract(IncomingMessage physicalMessage) in /_/src/NServiceBus.Core/Pipeline/Incoming/DeserializeMessageConnector.cs:line 119\r\n at NServiceBus.DeserializeMessageConnector.ExtractWithExceptionHandling(IncomingMessage message) in /_/src/NServiceBus.Core/Pipeline/Incoming/DeserializeMessageConnector.cs:line 53\r\n --- End of inner exception stack trace ---\r\n at NServiceBus.DeserializeMessageConnector.ExtractWithExceptionHandling(IncomingMessage message) in /_/src/NServiceBus.Core/Pipeline/Incoming/DeserializeMessageConnector.cs:line 53\r\n at NServiceBus.DeserializeMessageConnector.Invoke(IIncomingPhysicalMessageContext context, Func`2 stage) in /_/src/NServiceBus.Core/Pipeline/Incoming/DeserializeMessageConnector.cs:line 29\r\n at NServiceBus.InvokeAuditPipelineBehavior.Invoke(IIncomingPhysicalMessageContext context, Func`2 next) in /_/src/NServiceBus.Core/Audit/InvokeAuditPipelineBehavior.cs:line 27\r\n at NServiceBus.ProcessingStatisticsBehavior.Invoke(IIncomingPhysicalMessageContext context, Func`2 next) in /_/src/NServiceBus.Core/Performance/Statistics/ProcessingStatisticsBehavior.cs:line 32\r\n at NServiceBus.TransportReceiveToPhysicalMessageConnector.Invoke(ITransportReceiveContext context, Func`2 next) in /_/src/NServiceBus.Core/Pipeline/Incoming/TransportReceiveToPhysicalMessageConnector.cs:line 61\r\n at NServiceBus.MainPipelineExecutor.Invoke(MessageContext messageContext) in /_/src/NServiceBus.Core/Pipeline/MainPipelineExecutor.cs:line 50\r\n at NServiceBus.Transport.AzureServiceBus.MessagePump.InnerProcessMessage(Task`1 receiveTask)",
"NServiceBus.TimeOfFailure": "2021-06-15 10:28:28:346039 Z",
"NServiceBus.ExceptionInfo.Data.Message ID": "2ca22dd7-396b-4950-acb6-ad4800ac9d65",
"NServiceBus.ExceptionInfo.Data.Transport message ID": "b6c3a3cf-84bf-4235-8551-35a1daf4bc5d",
"NServiceBus.FailedQ": "AwarenessCenter.Backend",
"NServiceBus.ProcessingMachine": "dw1sdwk00002R",
"NServiceBus.ProcessingEndpoint": "AwarenessCenter.Backend",
"$.diagnostics.hostid": "f0377aabcaf4cc26d4afb3e60b704550",
"$.diagnostics.hostdisplayname": "dw1sdwk00002R"
},
"LockToken": "00000000-0000-0000-0000-000000000000",
"DeliveryCount": 0,
"LockedUntil": "0001-01-01T00:00:00+00:00",
"SequenceNumber": 541,
"DeadLetterSource": null,
"PartitionId": 0,
"EnqueuedSequenceNumber": 0,
"EnqueuedTime": "2021-06-15T10:28:28.396+00:00",
"LockTokenGuid": "00000000-0000-0000-0000-000000000000",
"ExpiresAt": "9999-12-31T23:59:59.9999999+00:00",
"DeadLetterReason": null,
"DeadLetterErrorDescription": null
}
I suspect that the message is not serialized properly, or that the types used for serialization and deserialization are not identical. That is what the exception thrown by NServiceBus hints at (see the NServiceBus.ExceptionInfo.StackTrace header for details). Specifically:
An error occurred while attempting to extract logical messages from incoming physical message 2ca22dd7-396b-4950-acb6-ad4800ac9d65 ---> Newtonsoft.Json.JsonReaderException: Unexpected character encountered while parsing value: {. Path 'AccountantId', line 1, position 50
Based on the serialized payload
"{"Email":"vofil69829#bbsaili.com","AccountantId":{"Value":"75f61737-b9e8-40a3-a1c4-23d7bd61c527"}}"
it looks like AccountantId is not a simple type but rather a nested type, since the JSON expands to
{
"Email":"vofil69829#bbsaili.com",
"AccountantId":
{
"Value":"75f61737-b9e8-40a3-a1c4-23d7bd61c527"
}
}
whereas on the receiving side I suspect the expectation is a message of the following type:
class SomeContract
{
public string Email { get; set; }
public Guid AccountantId { get; set; }
}
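For comparison, a contract shaped like the following would match the nested payload; this is only a sketch inferred from the JSON body and the enclosed message type header, not the actual source:

// Hypothetical types inferred from the payload; the real definitions may differ.
public class AccountantId
{
    public Guid Value { get; set; }
}

public class InviteUserCommand : ICommand
{
    public string Email { get; set; }
    public AccountantId AccountantId { get; set; }
}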
After we deployed a third time with the exact same codebase, the problem resolved itself. We still have no idea what caused it.
Related
I'm running chaincode-java from fabric-samples.
@Transaction(intent = Transaction.TYPE.EVALUATE)
public ArrayList<Asset> GetAllAssets(final Context ctx) {
    ChaincodeStub stub = ctx.getStub();
    ArrayList<Asset> queryResults = new ArrayList<Asset>();

    // To retrieve all assets from the ledger, use getStateByRange with an empty startKey and endKey.
    // An empty startKey and endKey is interpreted as all keys from beginning to end.
    // As another example, with startKey = 'asset0' and endKey = 'asset9',
    // getStateByRange retrieves assets with keys between asset0 (inclusive) and asset9 (exclusive) in lexical order.
    QueryResultsIterator<KeyValue> results = stub.getStateByRange("", "");

    for (KeyValue result : results) {
        Asset asset = genson.deserialize(result.getStringValue(), Asset.class);
        System.out.println(asset);
        queryResults.add(asset);
    }

    // final String response = genson.serialize(queryResults);
    return queryResults;
}
The GetAllAssets() method was returning a String, but I changed it to return an ArrayList. As a result, GetAllAssets throws an error when invoked.
$ peer chaincode query -C mychannel -n basic -c '{"Args":["GetAllAssets"]}'
Error: endorsement failure during query. response: status:500 message:"Unexpected error"
The log says
Thread[fabric-txinvoke:2,5,main] 11:15:01:224 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got invoke routing request
Thread[fabric-txinvoke:2,5,main] 11:15:01:226 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got the invoke request for:GetAllAssets []
Thread[fabric-txinvoke:2,5,main] 11:15:01:234 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got routing:GetAllAssets:org.hyperledger.fabric.samples.assettransfer.AssetTransfer
Thread[fabric-txinvoke:2,5,main] 11:15:01:274 SEVERE org.hyperledger.fabric.Logger error nulljava.lang.NullPointerException
at org.hyperledger.fabric.contract.execution.JSONTransactionSerializer.toBuffer(JSONTransactionSerializer.java:84)
at org.hyperledger.fabric.contract.execution.impl.ContractExecutionService.convertReturn(ContractExecutionService.java:89)
at org.hyperledger.fabric.contract.execution.impl.ContractExecutionService.executeRequest(ContractExecutionService.java:67)
at org.hyperledger.fabric.contract.ContractRouter.processRequest(ContractRouter.java:123)
at org.hyperledger.fabric.contract.ContractRouter.invoke(ContractRouter.java:134)
at org.hyperledger.fabric.shim.impl.ChaincodeInvocationTask.call(ChaincodeInvocationTask.java:106)
at org.hyperledger.fabric.shim.impl.InvocationTaskManager.lambda$newTask$17(InvocationTaskManager.java:265)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1736)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Thread[fabric-txinvoke:2,5,main] 11:15:01:276 SEVERE org.hyperledger.fabric.shim.impl.ChaincodeInvocationTask call [13733a23] Invoke failed with error code 500. Sending ERROR
Can I return a List from a transaction? Besides String, what other types can I return? Is there any documentation I can take a look at?
A bit of background first: the ContractAPI that is available in Java, Go and TypeScript is used to generate a 'model' of the overall contract, including the data types that can be passed to and returned from transaction functions. (JavaScript supports a limited subset, to the extent possible given its typing.)
In order to support this, there has to be a 'serializer' of some sort to process the data. The underlying chaincode API of just 'invoke(byte[]): byte[]' gives the developer the power to serialize however they wish, though not everyone needs that power.
There is a default 'serializer' in the ContractAPI; this can be swapped out if needed.
To answer the question specifically, the return types can be:
strings,
numbers (for Java, any of the primitive number types),
booleans,
other types that have been annotated,
arrays of the above.
For the 'other types', there are annotations that can be used to define types that can also be passed to and from the transaction functions.
You might see something like this:
@DataType()
public final class Complex {

    @Property()
    private final String id;

    @Property()
    private final Description description;

    @Property()
    private final int value;

    public Complex(final String id, final Description description, final int value) {
        this.id = id;
        this.description = description;
        this.value = value;
    }

    public String getID() {
        return id;
    }

    public int getValue() {
        return value;
    }

    public Description getDescription() {
        return description;
    }
}
Description, used there, is itself a class annotated in a similar manner. This would produce Contract Metadata that looks like:
"Description": {
"$id": "Description",
"type": "object",
"properties": {
"colour": {
"type": "string"
},
"owners": {
"type": "array",
"items": {
"type": "string"
}
}
}
},
"Complex": {
"$id": "Complex",
"type": "object",
"properties": {
"id": {
"type": "string"
},
"value": {
"type": "number"
},
"description": {
"$ref": "Description"
}
}
}
On the Contract Model, or Contract Metadata
There is a JSON schema for this at
https://github.com/hyperledger/fabric-chaincode-node/blob/main/apis/fabric-contract-api/schema/contract-schema.json
Isn't this restrictive? What about Lists?
It's a fair comment; from a Java perspective, something like an ArrayList or a Map would be a reasonable thing to return. However, the challenge is that contracts can be implemented in different languages. Also, once deployed, a contract will be running for some time, so the metadata provides a strong 'API definition' between the smart contract and the client application.
The transaction functions (also described in the metadata) are likewise clearly defined.
Summary
I would like to provide some more examples (and docs!) but wanted to get this written up first. There are extensions and changes we could make, and would like to make, but there are only so many hours!
As a maintainer of these repos, I'd love to have people come on board if this is an area of interest.
I'm having a hard time tracking down a casting error in Azure Stream Analytics. The input data is coming from an Azure IoT Hub. Here's my query code:
-- Create average data from raw data
WITH
AverageSensorData AS
(
SELECT
[Node],
[SensorType],
udf.round(AVG([Value]), 2) AS [Value],
MAX(TRY_CAST([Timestamp] AS DateTime)) AS [Timestamp]
FROM [SensorData]
WHERE TRY_CAST([Timestamp] AS DateTime) IS NOT NULL
GROUP BY
[Node],
[SensorType],
TumblingWindow(minute, 2)
)
-- Insert average data into PowerBI-Output
SELECT
[Node],
[SensorType],
[Value],
[Timestamp],
DATETIMEFROMPARTS(
DATEPART(yy, [Timestamp]),
DATEPART(mm, [Timestamp]),
DATEPART(dd, [Timestamp]),
0, 0, 0, 0) AS [Date]
INTO [SensorDataAveragePowerBI]
FROM [AverageSensorData]
While this runs fine most of the time (at least for a few hundred or thousand input events), it eventually fails. After turning on diagnostic logs I found the following error message in the corresponding execution log (it was actually in JSON format; I cleaned it up a little for readability):
Message: Runtime exception occurred while processing events, - Specified cast is not valid. OutputSourceAlias: averagesensordata; Type: SqlRuntimeError, Correlation ID: 49938626-c1a3-4f19-b18d-ee2c5a5332f9
And here's some JSON input that probably caused the error:
[
{
"Date": "2017-04-27T00:00:00.0000000",
"EventEnqueuedUtcTime": "2017-04-27T07:53:52.3900000Z",
"EventProcessedUtcTime": "2017-04-27T07:53:50.6877268Z",
"IoTHub": {
/* Some more data that is not being used */
},
"Node": "IoT_Lab",
"PartitionId": 0,
"SensorType": "temperature",
"Timestamp": "2017-04-27T09:53:50.0000000",
"Value": 21.06
},
{
"Date": "2017-04-27T00:00:00.0000000",
"EventEnqueuedUtcTime": "2017-04-27T07:53:53.6300000Z",
"EventProcessedUtcTime": "2017-04-27T07:53:52.0157515Z",
"IoTHub": {
/* Some more data that is not being used */
},
"Node": "IT_Services",
"PartitionId": 2,
"SensorType": "temperature",
"Timestamp": "2017-04-27T09:53:52.0000000",
"Value": 27.0
}
]
The first entry was the last one to go through, so the second one might have been the one breaking everything. I'm not sure, though, and don't see any suspicious values here. If I upload this as test data within the Azure portal, no errors are raised.
The query above explicitly uses casting for the [Timestamp] column. But since I'm using TRY_CAST, I wouldn't expect any casting errors:
Returns a value cast to the specified data type if the cast succeeds; otherwise, returns null.
As I said, the error only appears once in a while (sometimes after 20 minutes and sometimes after a couple of hours) and cannot be reproduced deliberately. Does anyone have an idea where the error originates, or whether there is a way to get more detailed error information?
Thanks a lot in advance.
UPDATE: Here's the source of the udf.round function:
function main(value, decimalPlaces) {
if (typeof(value) === 'number'){
var decimalMultiplier = 1;
if (decimalPlaces){
decimalMultiplier = Math.pow(10, decimalPlaces);
}
return Math.round(value * decimalMultiplier) / decimalMultiplier
}
return value;
}
Unfortunately, it's been a while since I wrote this, so I'm not a hundred percent sure why I wrote exactly this code. One thing I do remember, though, is that all the analyzed messages always contained a valid number in the respective field. I still think there's a good chance this function is responsible for my problem.
I have a TableEntity like this:
public class XmlTableEntity : TableEntity
{
    public string SomeXml { get; set; }
}
which contains an XML string called SomeXml. Most entities persist fine, but for some I get:
{"The remote server returned an error: (400) Bad Request."}
The XML string of one of the entities producing the exception contains 33,933 characters. Is there a limit? I'm not sure how else to establish the cause of the exception. One sample XML causing the exception can be found here.
The reason you're getting this error is that the data you're trying to insert exceeds the maximum size allowed for an entity property. The maximum size of an entity property is 64 KB; however, because strings in Azure Tables are UTF-16 encoded, the maximum size of a string property is 32 K characters.
Because your XML is longer than 32 K characters, you're getting this error.
When I tried to insert the sample data you shared in a table in my storage account I got the following error back:
{
"odata.error": {
"code": "PropertyValueTooLarge",
"message": {
"lang": "en-US",
"value": "The property value exceeds the maximum allowed size (64KB). If the property value is a string, it is UTF-16 encoded and the maximum number of characters should be 32K or less.\nRequestId:693f46ec-0002-0012-3a5a-cbcb16000000\nTime:2016-06-21T01:14:00.4544620Z"
}
}
}
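As a rough illustration (the helper and the suggested mitigations below are my own, not part of the question's code), you could guard against oversized strings before inserting:

static void EnsureFitsInTableProperty(XmlTableEntity entity)
{
    // Azure Table string properties are limited to 64 KB, i.e. 32 K UTF-16 characters.
    const int MaxStringPropertyChars = 32 * 1024;

    if (entity.SomeXml != null && entity.SomeXml.Length > MaxStringPropertyChars)
    {
        // Possible mitigations: compress the XML, split it across several
        // properties, or store it in Blob Storage and keep only a reference in the table.
        throw new InvalidOperationException(
            $"SomeXml is {entity.SomeXml.Length} characters, which exceeds the 32K-character limit for a single string property.");
    }
}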
I'm using an Azure Function to pick up messages from an Event Hub and send out notifications via Azure Notification Hubs. Works great! Now I wanted to see whether I could add tags to those messages in order to allow user targeting via those tags.
The output binding for the notification hub has a "tag expression" parameter which you can configure, but this seems to be static text. I need to set these tags dynamically based on the message received from the event hub instead. I'm not sure whether you can somehow put dynamic content in there?
I also found that the constructor of the GcmNotification object I'm using has an overload which accepts a tag string. But when I try that I get a compile-time warning stating it is deprecated, and when the function fires it shows an error because the Tag property should be empty.
So I'm not clear on a) whether this is at all possible and b) how to do it when it is. Any ideas?
Update: as suggested I tried creating a POCO object to map to my input string. The string is as follows:
[{"deviceid":"repsaj-neptune-win10pi","readingtype":"temperature1","reading":22.031614503139451,"threshold":23.0,"time":"2016-06-22T09:38:54.1900000Z"}]
The POCO object:
public class RuleMessage
{
public string deviceid;
public string readingtype;
public object reading;
public double threshold;
public DateTime time;
}
For the function I now tried both RuleMessage[] and List<RuleMessage> as parameter types, but the function complains it cannot convert the input:
2016-06-24T18:25:16.830 Exception while executing function: Functions.submerged-function-ruleout. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'inputMessage'. Microsoft.Azure.WebJobs.Host: Binding parameters to complex objects (such as 'RuleMessage') uses Json.NET serialization.
1. Bind the parameter type as 'string' instead of 'RuleMessage' to get the raw values and avoid JSON deserialization, or
2. Change the queue payload to be valid json. The JSON parser failed: Cannot deserialize the current JSON array (e.g. [1,2,3]) into type 'Submission#0+RuleMessage' because the type requires a JSON object (e.g. {"name":"value"}) to deserialize correctly.
To fix this error either change the JSON to a JSON object (e.g. {"name":"value"}) or change the deserialized type to an array or a type that implements a collection interface (e.g. ICollection, IList) like List that can be deserialized from a JSON array. JsonArrayAttribute can also be added to the type to force it to deserialize from a JSON array.
Function code:
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Microsoft.Azure.NotificationHubs;
public static void Run(List<RuleMessage> inputEventMessage, string inputBlob, out Notification notification, out string outputBlob, TraceWriter log)
{
if (inputEventMessage == null || inputEventMessage.Count != 1)
{
log.Info($"The inputEventMessage array was null or didn't contain exactly one item.");
notification = null;
outputBlob = inputBlob;
return;
}
log.Info($"C# Event Hub trigger function processed a message: {inputEventMessage[0]}");
if (String.IsNullOrEmpty(inputBlob))
inputBlob = DateTime.MinValue.ToString();
DateTime lastEvent = DateTime.Parse(inputBlob);
TimeSpan duration = DateTime.Now - lastEvent;
if (duration.TotalMinutes >= 0) {
notification = GetGcmMessage(inputEventMessage[0]);
log.Info($"Sending notification message: {notification.Body}");
outputBlob = DateTime.Now.ToString();
}
else {
log.Info($"Not sending notification message because of timer ({(int)duration.TotalMinutes} minutes ago).");
notification = null;
outputBlob = inputBlob;
}
}
private static Notification GetGcmMessage(RuleMessage input)
{
string message;
if (input.readingtype == "leakage")
message = String.Format("[FUNCTION GCM] Leakage detected! Sensor {0} has detected a possible leak.", input.reading);
else
message = String.Format("[FUNCTION GCM] Sensor {0} is reading {1:0.0}, threshold is {2:0.0}.", input.readingtype, input.reading, input.threshold);
message = "{\"data\":{\"message\":\""+message+"\"}}";
return new GcmNotification(message);
}
public class RuleMessage
{
public string deviceid;
public string readingtype;
public object reading;
public double threshold;
public DateTime time;
}
Update 28-6-2016: I've now managed to get it working by switching the ASA output to line-separated so that it doesn't generate a JSON array any more. This is a temporary fix, though, because the function binding now fails as soon as there is more than one line in the output (which can happen).
Anyway, I then proceeded to set the tagExpression; as instructed, I changed it to:
{
"type": "notificationHub",
"name": "notification",
"hubName": "repsaj-neptune-notifications",
"connection": "repsaj-neptune-notifications_NOTIFICATIONHUB",
"direction": "out",
"tagExpression": "deviceId:{deviceid}"
}
Here {deviceid} equals the deviceid property on my RuleMessage POCO. Unfortunately this doesn't work; when I call the function it outputs:
Exception while executing function: Functions.submerged-function-ruleout. Microsoft.Azure.WebJobs.Host: Exception binding parameter 'notification'. Microsoft.Azure.WebJobs.Host: No value for named parameter 'deviceid'.
This is not true; I know for sure the property has been set, as I've logged it to the output window. I also tried something like {inputEventMessage.deviceid}, but that doesn't work either (and I don't see how the runtime would map {deviceid} to the correct input object when there's more than one).
The tagExpression binding property supports binding parameters coming from trigger input properties. For example, assume your incoming Event Hub event has properties A and B. You can use these properties in your tagExpression using the curly-brace syntax, e.g. My Tag {A}-{B}.
In general, most of the properties across all the binding types support binding parameters in this way.
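For illustration (the hub and connection names are placeholders, and A/B stand for properties on the trigger payload as above), the output binding could look like:
{
  "type": "notificationHub",
  "name": "notification",
  "hubName": "<your-hub-name>",
  "connection": "<your-notification-hub-connection>",
  "direction": "out",
  "tagExpression": "My Tag {A}-{B}"
}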
I'm getting an exception when I run a unit test on a controller in a web project (ASP.NET Web API). The exception is thrown when the controller's LogManager.GetCurrentClassLogger() call is executed:
System.MethodAccessException: Attempt by method 'Castle.Proxies.ClaimsPrincipalProxy.GetObjectData(System.Runtime.Serialization.SerializationInfo, System.Runtime.Serialization.StreamingContext)' to access method 'Castle.DynamicProxy.Internal.TypeUtil.Sort(System.Reflection.MemberInfo[])' failed
It results in a TypeInitializationException in LogManager.GetCurrentClassLogger().
Here is the call stack:
at Castle.Proxies.ClaimsPrincipalProxy.GetObjectData(SerializationInfo, StreamingContext)
at System.Runtime.Serialization.ObjectCloneHelper.GetObjectData(Object serObj, String& typeName, String& assemName, String[]& fieldNames, Object[]& fieldValues)
at System.AppDomain.get_Evidence()
at System.AppDomain.get_Evidence()
at System.Configuration.ClientConfigPaths.GetEvidenceInfo(AppDomain appDomain, String exePath, ref String typeName)
at System.Configuration.ClientConfigPaths.GetTypeAndHashSuffix(AppDomain appDomain, String exePath)
at System.Configuration.ClientConfigPaths..ctor(String exePath, Boolean includeUserConfig)
at System.Configuration.ClientConfigPaths.GetPaths(String exePath, Boolean includeUserConfig)
at System.Configuration.ClientConfigurationHost.RequireCompleteInit(IInternalConfigRecord record)
at System.Configuration.BaseConfigurationRecord.GetSectionRecursive(String configKey, Boolean getLkg, Boolean checkPermission, Boolean getRuntimeObject, Boolean requestIsHere, ref Object result, ref Object resultRuntimeObject)
at System.Configuration.BaseConfigurationRecord.GetSection(String configKey)
at System.Configuration.ConfigurationManager.GetSection(String sectionName)
at NLog.Config.XmlLoggingConfiguration.get_AppConfig()
at NLog.LogFactory.get_Configuration()
at NLog.LogFactory.GetLogger(LoggerCacheKey cacheKey)
at NLog.LogFactory.GetLogger(String name)
at NLog.LogManager.GetCurrentClassLogger()
Update:
The issue happens when the unit test project references NSubstitute. So it seems there is some dangerous combination of Web API, NSubstitute and NLog.
Update 2:
I found what upsets NLog in the controller.
Before calling the controller method, I set a mocked principal as Thread.CurrentPrincipal:
var principal = Substitute.For<ClaimsPrincipal>();
principal.Identity.Returns(...);
Thread.CurrentPrincipal = principal;
How could this be fixed?
The Castle assembly may be marked with the AllowPartiallyTrustedCallersAttribute, and that uses the level 2 security transparency model.
Level 2 transparency causes all methods in AllowPartiallyTrustedCallers assemblies to become security transparent by default, which may be the cause of this exception.
Can you try annotating your unit test (and test class) with the following attribute: [SecuritySafeCritical]?
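For example (a sketch only; the test framework, names, and assertion here are illustrative, with the attribute coming from System.Security):

using System.Security;
using System.Security.Claims;
using System.Threading;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using NLog;
using NSubstitute;

[TestClass]
[SecuritySafeCritical]
public class ControllerLoggingTests
{
    [TestMethod]
    [SecuritySafeCritical]
    public void GetCurrentClassLogger_WithSubstitutedPrincipal_DoesNotThrow()
    {
        // Same arrangement as in the question: a substituted ClaimsPrincipal
        // assigned to Thread.CurrentPrincipal before the logger is created.
        var principal = Substitute.For<ClaimsPrincipal>();
        Thread.CurrentPrincipal = principal;

        var logger = LogManager.GetCurrentClassLogger();

        Assert.IsNotNull(logger);
    }
}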