I'm having a hard time tracking down a casting error in Azure Stream Analytics. The input data is coming from an Azure IoT Hub. Here's my query code:
-- Create average data from raw data
WITH
AverageSensorData AS
(
SELECT
[Node],
[SensorType],
udf.round(AVG([Value]), 2) AS [Value],
MAX(TRY_CAST([Timestamp] AS DateTime)) AS [Timestamp]
FROM [SensorData]
WHERE TRY_CAST([Timestamp] AS DateTime) IS NOT NULL
GROUP BY
[Node],
[SensorType],
TumblingWindow(minute, 2)
)
-- Insert average data into PowerBI-Output
SELECT
[Node],
[SensorType],
[Value],
[Timestamp],
DATETIMEFROMPARTS(
DATEPART(yy, [Timestamp]),
DATEPART(mm, [Timestamp]),
DATEPART(dd, [Timestamp]),
0, 0, 0, 0) AS [Date]
INTO [SensorDataAveragePowerBI]
FROM [AverageSensorData]
While this runs fine most of the time (at least for a few hundred or thousand input entities), it will eventually fail. After turning on diagnostic logs I was able to find the following error message in the corresponding execution log (in reality it was in JSON format; I cleaned it up a little for readability):
Message: Runtime exception occurred while processing events, - Specified cast is not valid. OutputSourceAlias: averagesensordata; Type: SqlRuntimeError, Correlation ID: 49938626-c1a3-4f19-b18d-ee2c5a5332f9
And here's some JSON input that probably caused the error:
[
{
"Date": "2017-04-27T00:00:00.0000000",
"EventEnqueuedUtcTime": "2017-04-27T07:53:52.3900000Z",
"EventProcessedUtcTime": "2017-04-27T07:53:50.6877268Z",
"IoTHub": {
/* Some more data that is not being used */
},
"Node": "IoT_Lab",
"PartitionId": 0,
"SensorType": "temperature",
"Timestamp": "2017-04-27T09:53:50.0000000",
"Value": 21.06
},
{
"Date": "2017-04-27T00:00:00.0000000",
"EventEnqueuedUtcTime": "2017-04-27T07:53:53.6300000Z",
"EventProcessedUtcTime": "2017-04-27T07:53:52.0157515Z",
"IoTHub": {
/* Some more data that is not being used */
},
"Node": "IT_Services",
"PartitionId": 2,
"SensorType": "temperature",
"Timestamp": "2017-04-27T09:53:52.0000000",
"Value": 27.0
}
]
The first entry was the last one to go through, so the second one might have been the one breaking everything. I'm not sure, though, and do not see any suspicious values here. If I upload this as test data within the Azure portal, no errors are raised.
The query above explicitly casts the [Timestamp] column. But since I'm using TRY_CAST I wouldn't expect any casting errors:
Returns a value cast to the specified data type if the cast succeeds; otherwise, returns null.
As I said, the error only appears once in a while (sometimes after 20 minutes and sometimes after a couple of hours) and cannot be reproduced reliably. Does anyone have an idea where the error originates, or whether there is a chance of getting more detailed error information?
Thanks a lot in advance.
UPDATE: Here's the source of the udf.round function:
function main(value, decimalPlaces) {
    // Only round genuine numbers; anything else is passed through unchanged.
    if (typeof value === 'number') {
        var decimalMultiplier = 1;
        if (decimalPlaces) {
            decimalMultiplier = Math.pow(10, decimalPlaces);
        }
        return Math.round(value * decimalMultiplier) / decimalMultiplier;
    }
    return value;
}
Unfortunately, it's been a while since I wrote this, so I'm not a hundred percent sure why I wrote exactly this code. One thing I do remember, though, is that all the analyzed messages always contained a valid number in the respective field. I still think there's a good chance this function is responsible for my problem.
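If it is, a defensive variant I could try (just a sketch, assuming [Value] occasionally arrives serialized as a string) would coerce strings to numbers and return null for anything unparseable, so a WHERE filter could drop those rows:
function main(value, decimalPlaces) {
    // Hypothetical defensive rounding UDF: accept numeric strings too.
    var num = typeof value === 'number' ? value : parseFloat(value);
    if (isNaN(num)) {
        return null; // unparseable values can be filtered out downstream
    }
    var decimalMultiplier = Math.pow(10, decimalPlaces || 0);
    return Math.round(num * decimalMultiplier) / decimalMultiplier;
}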
Related
I am trying out completions using insert mode.
It seems that I am supposed to use a parameter called suffix to tell the API where the end of the inserted text goes.
The payload to the endpoint: POST /v1/completions
{
"model": "code-davinci-002",
"prompt": "Write a JSON document for a person with first name, last name, email and phone number\n\n{\n",
"suffix": "\n}",
"temperature": 0,
"max_tokens": 256,
"top_p": 1,
"frequency_penalty": 0,
"presence_penalty": 0
}
I tried doing this from a Ruby implementation of the GPT-3 API.
parameters
=> {
:model=>"code-davinci-001",
:prompt=>"generate some JSON for a person with first and last name {",
:max_tokens=>250,
:temperature=>0,
:top_p=>1,
:frequency_penalty=>0,
:presence_penalty=>0,
:suffix=>"\n}"}
post(url: "/v1/completions", parameters: parameters)
I get an unrecognized request argument error for suffix:
{"error"=>{"message"=>"Unrecognized request argument supplied: suffix", "type"=>"invalid_request_error", "param"=>nil, "code"=>nil}}
I looked at the payload from OpenAI vs the payload from the Ruby library and saw the issue.
My Ruby library was setting the model to code-davinci-001 while OpenAI was using code-davinci-002.
As soon as I manually altered the model: attribute while debugging, the completion started working correctly:
{
"id"=>"cmpl-5yJ8b01Cw26W6ZIHoRSOb71Dc4QvH",
"object"=>"text_completion",
"created"=>1665054929,
"model"=>"code-davinci-002",
"choices"=>
[{"text"=>"\n \"firstName\": \"John\",\n \"lastName\": \"Smith\"",
"index"=>0,
"logprobs"=>nil,
"finish_reason"=>"stop"}],
"usage"=>{"prompt_tokens"=>14, "completion_tokens"=>19,
"total_tokens"=>33}
}
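For reference, here is a minimal sketch of the same working request in TypeScript (the endpoint, model and parameters come from the payloads above; OPENAI_API_KEY is an assumed environment variable):
// Insert-mode completion against the legacy /v1/completions endpoint.
// The suffix parameter was only recognized for code-davinci-002, not -001.
const response = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        "Authorization": `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
        model: "code-davinci-002",
        prompt: "Write a JSON document for a person with first name, last name, email and phone number\n\n{\n",
        suffix: "\n}",
        temperature: 0,
        max_tokens: 256,
    }),
});
const completion = await response.json();
console.log(completion.choices[0].text);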
I'm running chaincode-java from fabric-samples.
@Transaction(intent = Transaction.TYPE.EVALUATE)
public ArrayList<Asset> GetAllAssets(final Context ctx) {
ChaincodeStub stub = ctx.getStub();
ArrayList<Asset> queryResults = new ArrayList<Asset>();
// To retrieve all assets from the ledger use getStateByRange with empty startKey & endKey.
// Giving empty startKey & endKey is interpreted as all the keys from beginning to end.
// As another example, if you use startKey = 'asset0', endKey = 'asset9' ,
// then getStateByRange will retrieve asset with keys between asset0 (inclusive) and asset9 (exclusive) in lexical order.
QueryResultsIterator<KeyValue> results = stub.getStateByRange("", "");
for (KeyValue result: results) {
Asset asset = genson.deserialize(result.getStringValue(), Asset.class);
System.out.println(asset);
queryResults.add(asset);
}
// final String response = genson.serialize(queryResults);
return queryResults;
}
The GetAllAssets() method was returning String, but I changed it to ArrayList.
As a result, GetAllAssets throws an error when invoked.
$ peer chaincode query -C mychannel -n basic -c '{"Args":["GetAllAssets"]}'
Error: endorsement failure during query. response: status:500 message:"Unexpected error"
The log says
Thread[fabric-txinvoke:2,5,main] 11:15:01:224 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got invoke routing request
Thread[fabric-txinvoke:2,5,main] 11:15:01:226 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got the invoke request for:GetAllAssets []
Thread[fabric-txinvoke:2,5,main] 11:15:01:234 INFO org.hyperledger.fabric.contract.ContractRouter processRequest Got routing:GetAllAssets:org.hyperledger.fabric.samples.assettransfer.AssetTransfer
Thread[fabric-txinvoke:2,5,main] 11:15:01:274 SEVERE org.hyperledger.fabric.Logger error nulljava.lang.NullPointerException
at org.hyperledger.fabric.contract.execution.JSONTransactionSerializer.toBuffer(JSONTransactionSerializer.java:84)
at org.hyperledger.fabric.contract.execution.impl.ContractExecutionService.convertReturn(ContractExecutionService.java:89)
at org.hyperledger.fabric.contract.execution.impl.ContractExecutionService.executeRequest(ContractExecutionService.java:67)
at org.hyperledger.fabric.contract.ContractRouter.processRequest(ContractRouter.java:123)
at org.hyperledger.fabric.contract.ContractRouter.invoke(ContractRouter.java:134)
at org.hyperledger.fabric.shim.impl.ChaincodeInvocationTask.call(ChaincodeInvocationTask.java:106)
at org.hyperledger.fabric.shim.impl.InvocationTaskManager.lambda$newTask$17(InvocationTaskManager.java:265)
at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1736)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Thread[fabric-txinvoke:2,5,main] 11:15:01:276 SEVERE org.hyperledger.fabric.shim.impl.ChaincodeInvocationTask call [13733a23] Invoke failed with error code 500. Sending ERROR
Can I return a List from a transaction? Besides String, what other types can I return? Is there any documentation I can take a look at?
A bit of background first: the ContractAPI that is available in Java, Go and TypeScript is used to generate a 'model' of the overall contract, including the data types that can be passed to and returned from transaction functions. (JavaScript supports a limited subset, to the extent possible given its typing.)
In order to support this there has to be a 'serializer' of some sort to process the data. The underlying chaincode API of just 'invoke(byte[]): byte[]' gives the developer the power to serialize however they wish, though not all of us need to use that power.
There is a default 'serializer' in the ContractAPI; this can be swapped out if needed.
To specifically answer the question, the return types can be:
strings,
numbers (for Java this is any of the primitive 'number' types),
booleans,
other types that have been annotated,
arrays of the above.
For the 'other types', there are annotations that can be used to define types that can also be passed to and from the transaction functions.
You might see something like this:
@DataType()
public final class Complex {
    @Property()
    private final String id;
    @Property()
    private final Description description;
    @Property()
    private final int value;

    // Constructor added for completeness; the final fields must be set here.
    public Complex(final String id, final Description description, final int value) {
        this.id = id;
        this.description = description;
        this.value = value;
    }

    public String getID() {
        return id;
    }

    public int getValue() {
        return value;
    }

    public Description getDescription() {
        return description;
    }
}
Description there is also a class annotated in a similar manner.
This would produce Contract Metadata that would look like this:
"Description": {
"$id": "Description",
"type": "object",
"properties": {
"colour": {
"type": "string"
},
"owners": {
"type": "array",
"items": {
"type": "string"
}
}
}
},
"Complex": {
"$id": "Complex",
"type": "object",
"properties": {
"id": {
"type": "string"
},
"value": {
"type": "number"
},
"description": {
"$ref": "Description"
}
}
}
On the Contract Model, or Contract Metadata
There is a JSON schema for this at
https://github.com/hyperledger/fabric-chaincode-node/blob/main/apis/fabric-contract-api/schema/contract-schema.json
Isn't this restrictive? What about Lists?
It's a fair comment; from a Java perspective, something like an ArrayList or a Map would be a reasonable thing to return. However, the challenge is that contracts can be implemented in different languages, and once deployed, a Contract will be running for some time. The metadata therefore provides a strong 'API definition' between the Smart Contract and the client application.
The transaction functions (also in the metadata) will be clearly defined.
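To bring this back to the failing GetAllAssets above: returning an array of an annotated type, rather than an ArrayList, keeps the return type inside what the metadata can describe (in Java that would mean declaring the return type as Asset[]). As a sketch, here is roughly what that looks like with the TypeScript flavour of the ContractAPI; the Asset shape here is an assumption, not taken from the samples:
import { Context, Contract, Object, Property, Returns, Transaction } from 'fabric-contract-api';

@Object()
export class Asset {
    @Property()
    public ID: string = '';

    @Property()
    public Owner: string = '';
}

export class AssetTransfer extends Contract {
    // An array of an annotated type is one of the supported return types,
    // so the default serializer can describe and convert it.
    @Transaction(false)
    @Returns('Asset[]')
    public async GetAllAssets(ctx: Context): Promise<Asset[]> {
        const assets: Asset[] = [];
        const iterator = await ctx.stub.getStateByRange('', '');
        let result = await iterator.next();
        while (!result.done) {
            assets.push(JSON.parse(Buffer.from(result.value.value).toString('utf8')) as Asset);
            result = await iterator.next();
        }
        await iterator.close();
        return assets;
    }
}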
Summary
I would like to provide some more examples (and docs!) but wanted to get this written up first. There are extensions and changes we could and would like to make, but there are only so many hours!
As a maintainer of these repos, we'd love to have people come on board if this is an area of interest.
I am building an Azure durable function which has many activities/Azure Functions to be called as part of a job execution.
I have a requirement to view the total execution time taken to complete an orchestration instance. Is there any way to read the total execution time it took to run an instance of a durable function orchestration?
You can calculate the total execution time for an orchestration by calling the statusQueryGetUri returned when the durable function is first created. The call URI should look like this:
http://<functionappname>.azurewebsites.net/runtime/webhooks/durabletask/instances/<instanceId>?taskHub=<taskHub>&connection=<connectionName>&code=<systemKey>&showHistory=true
The response should look like this:
{
"createdTime": "2018-02-28T05:18:49Z",
"historyEvents": [
...
],
"input": null,
"customStatus": { "nextActions": ["A", "B", "C"], "foo": 2 },
"lastUpdatedTime": "2018-02-28T05:18:54Z",
"output": [
...
],
"runtimeStatus": "Completed"
}
The duration can be determined by polling the status URI until the runtimeStatus reaches any of the terminal states (Failed, Cancelled, Terminated, or Completed), and then subtracting createdTime from lastUpdatedTime.
The following TypeScript snippet shows how the above JSON response (parsed into the status variable) could be processed to show the duration as hh:mm:ss:
formatDuration(status: DurableFunctionJob): string {
    // Difference between last update and creation, in milliseconds.
    const diff: number = new Date(status.lastUpdatedTime).getTime()
        - new Date(status.createdTime).getTime();
    // Render the millisecond difference as hh:mm:ss.
    return new Date(diff).toISOString().substr(11, 8);
}
For the example response, this function outputs 00:00:05.
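If you need to wait for the orchestration to finish first, a minimal polling sketch could look like this (assuming the statusQueryGetUri from above and a global fetch, e.g. Node 18+; the terminal states are the ones listed earlier):
// Poll the status endpoint until a terminal state is reached,
// then return the parsed status payload for duration calculation.
const TERMINAL_STATES = ["Completed", "Failed", "Cancelled", "Terminated"];

async function waitForCompletion(statusQueryGetUri: string): Promise<any> {
    while (true) {
        const response = await fetch(statusQueryGetUri);
        const status = await response.json();
        if (TERMINAL_STATES.includes(status.runtimeStatus)) {
            return status;
        }
        // Wait five seconds between polls to avoid hammering the endpoint.
        await new Promise((resolve) => setTimeout(resolve, 5000));
    }
}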
I am triggering a step function through an Express route in a Node app. This step function interacts with the Ethereum blockchain, and is thus highly asynchronous. There is also a possibility of transactions failing if a multiple attempts are made at once.
As such, I want to queue up executions of this step function, but oddly there doesn't seem to be a straightforward way to do so.
What is the best way to do this?
You can go with Map in Step Functions.
Step Functions provides an inbuilt way to execute items in parallel, or one execution at a time, for a given list of items. Below is an example:
"Validate-All": {
"Type": "Map",
"InputPath": "$.detail",
"ItemsPath": "$.shipped",
"MaxConcurrency": 0,
"Iterator": {
"StartAt": "Validate",
"States": {
"Validate": {
"Type": "Task",
"Resource": "arn:aws:lambda:us-east-1:123456789012:function:ship-val",
"End": true
}
}
},
"ResultPath": "$.detail.shipped",
"End": true
}
You need to change the value of MaxConcurrency to 1 so that only one execution will happen at a time, and the Map state will keep going until the items under ItemsPath are exhausted. That path is supposed to point to a list, so you can queue up your items there and start the state machine.
You can read more here.
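As a sketch of the queuing side, starting the state machine from the Express app could look roughly like this (assuming the AWS SDK for JavaScript v3; the state machine ARN is a placeholder and the input shape matches the InputPath/ItemsPath in the example above):
import { SFNClient, StartExecutionCommand } from "@aws-sdk/client-sfn";

const sfn = new SFNClient({ region: "us-east-1" });

// Queue every pending Ethereum transaction into one Map-state execution;
// with MaxConcurrency set to 1 they are processed strictly one at a time.
async function enqueueTransactions(shipped: unknown[]): Promise<string | undefined> {
    const command = new StartExecutionCommand({
        stateMachineArn: "arn:aws:states:us-east-1:123456789012:stateMachine:tx-queue", // placeholder ARN
        input: JSON.stringify({ detail: { shipped } }),
    });
    const { executionArn } = await sfn.send(command);
    return executionArn;
}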
I'm attempting to use an output table binding with an Azure Function V2 (node).
I have added the table binding to function.json, as described in the documentation.
{
"tableName": "Person",
"connection": "MyStorageConnectionAppSetting",
"name": "tableBinding",
"type": "table",
"direction": "out"
}
And then I am attempting to insert some content in to that table, again using the example as described in the documentation.
for (var i = 1; i < 10; i++) {
context.bindings.tableBinding.push({
PartitionKey: "Test",
RowKey: i.toString(),
Name: "Name " + i
});
}
To confirm - I have also added a setting called MyStorageConnectionAppSetting to local.settings.json, with a valid Storage Account connection string as its value.
Sadly though, this is failing and I'm seeing the following error -
System.Private.CoreLib: Exception while executing function: Functions.config. System.Private.CoreLib: Result: Failure
Exception: TypeError: Cannot read property 'push' of undefined
It seems that the binding object has not been created as expected, but I have no idea why.
The package Microsoft.Azure.WebJobs.Extensions.Storage is included in extensions.csproj, and the Function App starts just fine when I call func start.
Although I believe that no connection to the Storage Account is taking place, I did try to run my function both when the Table existed, and when it didn't.
Make sure the parameter has been initialized before usage. The output binding is undefined unless it is initialized or assigned a value:
context.bindings.tableBinding = [];
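Put together, the loop from the question would then look like this (a sketch reusing the names from the question's function.json):
module.exports = async function (context) {
    // The output binding starts out undefined; assign an array to it
    // before pushing rows destined for the Person table.
    context.bindings.tableBinding = [];

    for (let i = 1; i < 10; i++) {
        context.bindings.tableBinding.push({
            PartitionKey: "Test",
            RowKey: i.toString(),
            Name: "Name " + i
        });
    }
};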