So I tried following this guide and deployed the model using the Docker TensorFlow Serving image. Let's say there are 4 features: feat1, feat2, feat3 and feat4. I tried to hit the prediction endpoint {url}/predict with this JSON body:
{
  "instances": [
    {
      "feat1": 26,
      "feat2": 16,
      "feat3": 20.2,
      "feat4": 48.8
    }
  ]
}
I got a 400 response code:
{
"error": "Failed to process element: 0 key: feat1 of 'instances' list. Error: Invalid argument: JSON object: does not have named input: feat"
}
This is the signature passed to model.save():
signatures = {
    'serving_default':
        _get_serve_tf_examples_fn(model,
                                  tf_transform_output).get_concrete_function(
                                      tf.TensorSpec(
                                          shape=[None],
                                          dtype=tf.string,
                                          name='examples')),
}
I understand from this signature that in every instances element the only field accepted is "examples", but when I tried to pass only that field with an empty string:
{
"instances":
[
{
"examples": ""
}
]
}
I also got a bad request: {"error": "Name: <unknown>, Feature: feat1 (data type: int64) is required but could not be found.\n\t [[{{node ParseExample/ParseExampleV2}}]]"}
I couldn't find anything in the guide about how to build the JSON request body the right way. It would be really helpful if anyone could point this out or give references regarding this matter.
In that example, the serving function expects a serialized tf.train.Example proto as input. This page explains how binary data can be passed to a deployed model as a string (explaining why the signature expects a tensor of strings). So what you need to do is build an Example proto containing your features and send that over. It could look something like this:
import base64
import tensorflow as tf

features = {'feat1': 26, 'feat2': 16, 'feat3': 20.2, 'feat4': 48.8}

# Create an Example proto from your feature dict. Per the parsing error above,
# feat1 is expected as int64, so integer features go into an Int64List and the
# float features into a FloatList.
feature_spec = {
    k: (tf.train.Feature(int64_list=tf.train.Int64List(value=[v]))
        if isinstance(v, int)
        else tf.train.Feature(float_list=tf.train.FloatList(value=[float(v)])))
    for k, v in features.items()
}
example = tf.train.Example(
    features=tf.train.Features(feature=feature_spec)).SerializeToString()

# Encode your serialized Example using base64 so it can be added into your
# JSON payload.
b64_example = base64.b64encode(example).decode()
result = [{'examples': {'b64': b64_example}}]
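To actually send it, that list becomes the "instances" field of the request body. A minimal sketch using requests (the host, port and model name below are placeholders, not part of the original setup):

import requests

# Placeholder serving URL; substitute the host, port and model name of your deployment.
url = 'http://localhost:8501/v1/models/my_model:predict'

response = requests.post(url, json={'instances': result})
print(response.json())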
What is the output of saved_model_cli show --dir /path/to/model --all? You should follow the output to serialize your request.
I tried to solve this problem by changing the serving input signature, but it raised another exception. This problem is already solved, check it out here.
I am trying out completions using insertions.
It seems that I am supposed to use a parameter called suffix: to indicate where the end of the insertion goes.
The payload to the endpoint: POST /v1/completions
{
"model": "code-davinci-002",
"prompt": "Write a JSON document for a person with first name, last name, email and phone number\n\n{\n",
"suffix": "\n}",
"temperature": 0,
"max_tokens": 256,
"top_p": 1,
"frequency_penalty": 0,
"presence_penalty": 0
}
I tried doing this from a Ruby implementation of GPT-3.
parameters
=> {
:model=>"code-davinci-001",
:prompt=>"generate some JSON for a person with first and last name {",
:max_tokens=>250,
:temperature=>0,
:top_p=>1,
:frequency_penalty=>0,
:presence_penalty=>0,
:suffix=>"\n}"}
post(url: "/v1/completions", parameters: parameters)
I get an invalid argument error for suffix
{"error"=>{"message"=>"Unrecognized request argument supplied: suffix", "type"=>"invalid_request_error", "param"=>nil, "code"=>nil}}
I looked at the payload from OpenAI vs the payload from the Ruby library and saw the issue.
My Ruby library was setting the model to code-davinci-001 while OpenAI was using code-davinci-002.
As soon as I manually altered the model: attribute in the debugger, the completion started working correctly.
{
"id"=>"cmpl-5yJ8b01Cw26W6ZIHoRSOb71Dc4QvH",
"object"=>"text_completion",
"created"=>1665054929,
"model"=>"code-davinci-002",
"choices"=>
[{"text"=>"\n \"firstName\": \"John\",\n \"lastName\": \"Smith\"",
"index"=>0,
"logprobs"=>nil,
"finish_reason"=>"stop"}],
"usage"=>{"prompt_tokens"=>14, "completion_tokens"=>19,
"total_tokens"=>33}
}
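For anyone hitting the endpoint without the Ruby gem, here is a rough sketch of the equivalent working request in Python (the requests call and the environment-variable API key handling are my assumptions for illustration, not part of the original post):

import os
import requests

response = requests.post(
    "https://api.openai.com/v1/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "code-davinci-002",  # per the answer above, -001 rejects the suffix argument
        "prompt": "Write a JSON document for a person with first name, last name, email and phone number\n\n{\n",
        "suffix": "\n}",
        "temperature": 0,
        "max_tokens": 256,
    },
)
print(response.json()["choices"][0]["text"])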
How can I reference a specific attribute/row in the results returned by client.service? I'd like to return just the value of the 'OverallStatus' shown below. Here is my code snippet:
from zeep import Client
url='https://someframework.wsdl'
client = Client(wsdl=url)
results = client.service.GetSystemStatus
Here is the 'results()' type:
print(type(results()))
<class 'zeep.objects.SystemStatusResponseMsg'>
And here is what's contained in 'results()':
print(results())
{
    'Result': [
        {
            'StatusCode': 'OK',
            'StatusMessage': 'System Status Retrieved',
            'SystemStatus': 'OK',
            'Outages': None
        }
    ],
    'OverallStatus': 'OK'
}
I'd like to call something like:
print(results['OverallStatus'])
or
print(results['SystemStatus'])
and just see its value of 'OK' printed on screen. I'm a bit of a Python newbie; I've reviewed casting the object into dicts, lists, tuples, etc., but feel like I'm missing something and have started going in circles.
Found this SO answer that helped point me in the right direction: Python Accessing Nested JSON Data
Without needing to do any manipulation, this snippet got me the output I was seeking:
>>> print(results()['Result'][0]['SystemStatus'])
OK
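As a side note, if you ever want the whole response as plain Python dicts (for example to dump it as JSON), zeep ships a helper for that. A small sketch using the same results object as above:

from zeep.helpers import serialize_object

# Convert the zeep response object into ordinary dicts/lists.
data = serialize_object(results())
print(data['OverallStatus'])              # OK
print(data['Result'][0]['SystemStatus'])  # OK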
We have REST APIs with JSON body arrays/objects. To control the data that goes into the JSON file at runtime, we use a testdata.properties file where the data is defined and called as below. We are using Cucumber with Serenity.
Testdata.properties file:
Value = 123
StepDefinition file:
@Given("^Set the \"([^\"]*)\" with \"([^\"]*)\"$")
public void set_data_parameterization(String fieldName, String Value) {
    if (fieldName.contains("Test")) {
        jsonObjectNew.getAsJsonObject("TestInfo").add("Value",
                gson.toJsonTree(Value));
    }
    System.err.println("Test value fetched from the Scenario outline");
}
JSON File:
{
"TestInfo": {
"Test123": 3,
"Value": 50 // this value to be replaced
}
}
.feature file:
Scenario Outline: Testing data parameterization
  Given Set the URL for "Test" as "base"
  And Set the "Test" with "Value"
  Examples:
    | Value |
    | 700   |
    | 710   |
Calling the variable data from the .properties file works fine; however, I want different sets of data to be executed for the same scenario. How can that be achieved? I tried Examples in the feature file, but when I run it as a Cucumber test I get the actual payload value, which is 50. It is not being replaced with 700/710.
Please guide.
Able to get the values as expected now. The issue was that I was passing the placeholder as a plain quoted string (e.g. "Value"); once the step in the .feature file references it as <Value> (the rest of the code stays the same), the values from the given Examples are iterated as expected.
Feature File Snippet:
Then The value of messages.type should be ERROR
Actual Service Response:
"messages": [
{
"type": "ERROR"
}]
Console Log:
JSON path messages.type doesn't match.
Expected: a string containing "ERROR"
Actual: [ERROR]
I have tried removing the double quotes from the ERROR parameter mentioned in the feature file, but it doesn't work.
Since you have not provided the code you used, this may be because you didn't convert the JSON response to a String. Please try the code below, since it shows how to convert the JSON response to a String.
public void jsonPathExample() {
    Response response = given().contentType(ContentType.JSON).get("http://localhost:3000/posts");
    // We need to convert the response to a String first.
    JsonPath jsonPath = new JsonPath(response.asString());
    String actualType = jsonPath.getString("type");
    // For an array like the "messages" list above, a path such as "messages[0].type"
    // returns a single string instead of a list.
    // Then do your assertion against actualType.
}
Trying to work out if this is possible or not. I've trawled the Terraform docs to no avail (not much surprise there).
Take the extremely slimline example below.
[
{
"cpu": "${var.master_container_cpu}",
}
]
The file is referenced by the following tf parameter when invoking the aws_ecs_task_definition resource:
container_definitions = "${file("task-definitions/example.json")}"
This results in the following error:
Error: aws_ecs_task_definition.example-task: ECS Task Definition container_definitions is invalid: Error decoding JSON: json: cannot unmarshal string into Go struct field ContainerDefinition.Cpu of type int64
any help more than welcome :)
It looks like you should use a template to compile the JSON before using it in the definition:
data "template_file" "task" {
  template = "${file("task-definitions/example.json")}"

  vars {
    cpu = "${var.master_container_cpu}"
  }
}
In the JSON file you can then reference the var as ${cpu}, e.g. "cpu": ${cpu} (left unquoted so the rendered value is a number, which is what the int64 Cpu field expects).
Then you are able to use the rendered output as your definition:
container_definitions = "${data.template_file.task.rendered}"