I have the following Katalon code to check that the count in an API response is correct, but I am getting an error, so I need help to see what is missing in my code.
No signature of method: Script1568233794882.assertThat() is applicable for argument types: (java.lang.Integer) values: [29]
response text:
{
"Error": {
"A": {
"dependency": [
],
"duplicateRows": [
],
"requiredFieldRows": [
]
}
},
"Good": {
"A": {
"count": 29
},
"B": {
"count": 35
},
"C": {
"count": 37
}
},
"type": "Test"
}
I have tried this:
import groovy.json.JsonSlurper
import groovy.json.JsonOutput

def response = WS.sendRequest(requestObject)
def responseList = new JsonSlurper().parseText(response.getResponseText())
println('response text: \n' + JsonOutput.prettyPrint(JsonOutput.toJson(responseList)))
assertThat(responseList.Good.A.count).isEqualTo("29")
I also tried using [0], but that does not work either; it fails with java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Integer
assertThat(responseList.Good[0].A).isEqualTo("29")
Try using the plain Groovy assert. The error means Groovy cannot find an assertThat method in your script's scope: assertThat is not built into Groovy, so it does not exist unless you statically import it from an assertion library such as AssertJ. Note also that JsonSlurper parses the JSON value 29 as an Integer, so compare against the number 29 rather than the string "29". Change your assertion line to the following:
assert responseList.Good.A.count == 29
I want to create a group of attributes a, b, and c, such that any combination can be supplied as long as at least one is given. I have managed to implement the validation itself, but it is not reflected in the schema, which is the part I can't manage to do.
from pydantic import BaseModel, root_validator

class Foo(BaseModel):
    a: str | None = None
    b: str | None = None
    c: str | None = None

    @root_validator
    def check_at_least_one_given(cls, values):
        if not any((values.get('a'), values.get('b'), values.get('c'))):
            raise ValueError("At least one of a, b, or c must be given")
        return values

# Doesn't have required fields
print(Foo.schema_json(indent=2))
{
"title": "Foo",
"type": "object",
"properties": {
"a": {
"title": "A",
"type": "string"
},
"b": {
"title": "B",
"type": "string"
},
"c": {
"title": "C",
"type": "string"
}
}
}
# No error
print(Foo(a="1"))
>>> a='1' b=None c=None
print(Foo(b="2"))
>>> a=None b='2' c=None
print(Foo(c="3"))
>>> a=None b=None c='3'
print(Foo(a="1", b="2"))
>>> a='1' b='2' c=None
print(Foo(a="1", c="3"))
>>> a='1' b=None c='3'
print(Foo(b="2", c="3"))
>>> a=None b='2' c='3'
print(Foo(a="1", b="2", c="3"))
>>> a='1' b='2' c='3'
# Invalid
Foo()
>>> Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pydantic\main.py", line 342, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for Foo
__root__
At least one of a, b, or c must be given (type=value_error)
I want the schema to output something like
{
"title": "Foo",
"type": "object",
"properties": {
"a": {
"title": "A",
"type": "string"
},
"b": {
"title": "B",
"type": "string"
},
"c": {
"title": "C",
"type": "string"
}
},
"required": [
["a", "b", "c"]
]
}
or something else that (probably) more clearly expresses the intent that at least one of these is required.
Is this possible and if so how is it done?
As far as I can tell, Pydantic does not have a built-in mechanism for this, and the validation logic you provide in a custom validator will never find its way into the JSON schema. You could search their issue tracker and, if unsuccessful, post a feature request for something like this.
The JSON schema core specification defines the anyOf keyword, which takes subschemas to validate against. This allows specifying the required keyword once for each of your fields in its own subschema. See this answer for details and an example.
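To make the semantics concrete, here is a small, self-contained illustration; it uses the third-party jsonschema package (an assumption on my part, not something Pydantic requires) purely to show how anyOf behaves:

from jsonschema import validate, ValidationError

schema = {
    "type": "object",
    "properties": {
        "a": {"type": "string"},
        "b": {"type": "string"},
        "c": {"type": "string"},
    },
    # one subschema per field; the instance must satisfy at least one of them
    "anyOf": [{"required": ["a"]}, {"required": ["b"]}, {"required": ["c"]}],
}

validate(instance={"b": "2"}, schema=schema)  # passes: "b" is present
try:
    validate(instance={}, schema=schema)  # fails: none of a, b, c is given
except ValidationError as exc:
    print(exc.message)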
In the Pydantic Config you can utilize schema_extra to extend the auto-generated schema. Here is an example of how you can write a corresponding workaround:
from typing import Any
from pydantic import BaseModel, root_validator

class Foo(BaseModel):
    a: str | None = None
    b: str | None = None
    c: str | None = None

    @root_validator
    def check_at_least_one_given(cls, values: dict[str, Any]) -> dict[str, Any]:
        if all(
            v is None for v in (values.get("a"), values.get("b"), values.get("c"))
        ):
            raise ValueError("Any one of `a`, `b`, or `c` must be given")
        return values

    class Config:
        @staticmethod
        def schema_extra(schema: dict[str, Any]) -> None:
            assert "anyOf" not in schema, "Oops! What now?"
            schema["anyOf"] = [
                {"required": ["a"]},
                {"required": ["b"]},
                {"required": ["c"]},
            ]
            for prop in schema.get("properties", {}).values():
                prop.pop("title", None)

if __name__ == "__main__":
    print(Foo.schema_json(indent=2))
Output:
{
"title": "Foo",
"type": "object",
"properties": {
"a": {
"type": "string"
},
"b": {
"type": "string"
},
"c": {
"type": "string"
}
},
"anyOf": [
{
"required": [
"a"
]
},
{
"required": [
"b"
]
},
{
"required": [
"c"
]
}
]
}
This conforms to the specs and expresses your custom validation.
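As a quick sanity check that the validator half still behaves (a sketch, assuming the Foo class defined above):

from pydantic import ValidationError

print(Foo(b="2"))  # at least one field is given, so no error
try:
    Foo()  # no field given
except ValidationError as exc:
    print(exc)  # reports: Any one of `a`, `b`, or `c` must be given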
But note that I put in that assert to indicate that I have no strong basis to assume that the automatic schema will not provide its own anyOf key at some point, which would greatly complicate things. Consider this an unstable solution.
Side note:
Be careful with the any check in your validator. An empty string is "falsy" just like None, which might lead to unexpected results, depending on whether you want to consider empty strings to be valid values in this context or not. any(v for v in ("", 0, False, None)) is False.
I adjusted your validator in my code to explicitly check against None for this reason.
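A minimal snippet showing how the two checks treat an empty string differently (purely illustrative):

values = {"a": "", "b": None, "c": None}
print(any(values.values()))  # False: "" is falsy, so the original validator would reject this
print(all(v is None for v in values.values()))  # False: "" is not None, so the adjusted validator accepts it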
I have a list example_list containing two dict objects; it looks like this:
[
{
"Meta": {
"ID": "1234567",
"XXX": "XXX"
},
"bbb": {
"ccc": {
"ddd": {
"eee": {
"fff": {
"xxxxxx": "xxxxx"
},
"www": [
{
"categories": {
"ppp": [
{
"content": {
"name": "apple",
"price": "0.111"
},
"xxx: "xxx"
}
]
},
"date": "A2020-01-01"
}
]
}
}
}
}
},
{
"Meta": {
"ID": "78945612",
"XXX": "XXX"
},
"bbb": {
"ccc": {
"ddd": {
"eee": {
"fff": {
"xxxxxx": "xxxxx"
},
"www": [
{
"categories": {
"ppp": [
{
"content": {
"name": "banana",
"price": "12.599"
},
"xxx: "xxx"
}
]
},
"date": "A2020-01-01"
}
]
}
}
}
}
}
]
now I want to filter the items and keep only the "ID" and its corresponding "price" value; the expected result can be something similar to:
[{"ID": "1234567", "price": "0.111"}, {"ID": "78945612", "price": "12.599"}]
or something like {"1234567":"0.111", "78945612":"12.599" }
Here's what I've tried:
map_list=[]
map_dict={}
for item in example_list:
    # get 'ID' for each item in 'Meta'
    map_dict['ID'] = item['Meta']['ID']
    # get 'price'
    data_list = item['bbb']['ccc']['ddd']['eee']['www']
    for data in data_list:
        for dataitem in data['categories']['ppp']:
            map_dict['price'] = dataitem["content"]["price"]
    map_list.append(map_dict)
print(map_list)
The result doesn't look right; it feels like the items aren't iterating properly. It gives me this result:
[{"ID": "78945612", "price": "12.599"}, {"ID": "78945612", "price": "12.599"}]
It gave me a duplicated result for the second ID, but where is the first ID?
Can someone take a look for me please? Thanks.
Update:
From some comments on another question, I understand that the reason the output keeps being overwritten is that the key names in the dict are always the same, but I'm not sure how to fix this because the key and value need to be extracted from different levels of the for loops. Any help would be appreciated, thanks.
As @Scott Hunter has mentioned, you need to create a new map_dict every time. Here is a quick fix to your solution (I am sadly not able to test it right now, but it seems right to me):
map_list = []
for item in example_list:
    # get the price entries for this item
    data_list = item['bbb']['ccc']['ddd']['eee']['www']
    for data in data_list:
        for dataitem in data['categories']['ppp']:
            map_dict = {}  # create a fresh dict on every iteration
            map_dict['ID'] = item['Meta']['ID']
            map_dict['price'] = dataitem["content"]["price"]
            map_list.append(map_dict)
print(map_list)
But what you are doing here is basically just "forcing" your way through ... I recommend taking a break and checking out some kind of tutorial, which will help you understand how this really works behind the scenes. This is how I would have written it:
list_dicts = []
for example in example_list:
    for www in example['bbb']['ccc']['ddd']['eee']['www']:
        for ppp_item in www['categories']['ppp']:
            list_dicts.append({
                'ID': example['Meta']['ID'],
                'price': ppp_item["content"]["price"]
            })
Good luck with this problem and hope it helps :)
You need to create a new dictionary for map_dict for each ID.
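For illustration, the same fix can be written as a single comprehension; this sketch assumes the exact nesting shown in the sample data (Meta, then bbb.ccc.ddd.eee.www, then categories.ppp):

map_list = [
    {"ID": item["Meta"]["ID"], "price": ppp["content"]["price"]}
    for item in example_list
    for www in item["bbb"]["ccc"]["ddd"]["eee"]["www"]
    for ppp in www["categories"]["ppp"]
]
print(map_list)
# [{'ID': '1234567', 'price': '0.111'}, {'ID': '78945612', 'price': '12.599'}]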
I'm writing a test script in Katalon Studio to verify the response body of an API. My response body has the following format:
{
"status": "Success",
"correlationCode": "1234-5678",
"type": {
"id": 51247,
"name": "Student",
},
"data": {
"name": "Sara Nieves",
"gender": "Female",
"dob": "1995-08-06",
"libraryCard": {
"id": "11178",
"type": "Seniors"
},
"qualifications": [
{
"id": "45650986546",
"name": "Graduate Certificate in Environmental Engineering Management"
}
]
}
}
I want to verify that none of the elements returns a null value. Since the elements in the API response are not static (meaning name, gender, etc. might not be returned every time), I can't use something like data.name to check whether it is null. So I want a generic way to loop through every attribute returned and check whether its value is null.
Any help will be much appreciated. Thanks!
You have the error message:
groovy.lang.MissingMethodException: No signature of method: WSVerification1569811424284$_run_closure1.doCall() is applicable for argument types: (com.kms.katalon.core.testobject.ResponseObject) values: [200 1 KB] 22572.groovy:21)
I assume your response object is of type com.kms.katalon.core.testobject.ResponseObject.
The code to parse the response as JSON and validate it:
import groovy.json.JsonSlurper
/**
* the recursive method to validate that json object does not have null values
* @param obj - the parsed json object (sequence of maps and lists)
* @param path - a variable to know where the error occurred in json data.
*/
void assertNoNullValue(Object obj, String path='ROOT'){
//the main assertion
    assert obj != null : "value in json must not be null: $path"
if(obj instanceof Map){
//iterate each key-value in map and validate the value recursively
obj.each{k,v-> assertNoNullValue(v,path+".$k") }
} else if(obj instanceof List){
//iterate each value in list and validate the value recursively
obj.eachWithIndex{v,i-> assertNoNullValue(v,path+"[$i]") }
}
}
def response = ...
assert response.isJsonContentType()
def responseText = response.getResponseText()
//parse body
def data = new JsonSlurper().parseText(responseText)
assertNoNullValue(data)
This solution is not as precise as the one suggested by @dagget, but it is a quick check (note that it will also trip on any legitimate string value that merely contains the text "null"):
def response = '''
{
"status": "Success",
"correlationCode": "1234-5678",
"type": {
"id": 51247,
"name": "Student",
},
"data": {
"name": "Sara Nieves",
"gender": "femmina",
"dob": "1995-08-06",
"libraryCard": {
"id": "11178",
"type": "Seniors"
},
"qualifications": [
{
"id": "45650986546",
"name": "Graduate Certificate in Environmental Engineering Management"
}
]
}
}
'''
assert !response.contains("null")
I have the following anonymous JSON response body, and I need to parse the nested arrays dynamically to retrieve a key's value based on a condition, using find or findAll in Groovy closures:
[
  {
    "children": [
      {
        "attr": {
          "reportId": "1",
          "reportShortName": "ABC",
          "description": "test"
        }
      },
      {
        "attr": {
          "reportId": "2",
          "reportShortName": "XYZ",
          "description": "test"
        }
      }
    ]
  }
]
I've tried the following ways, but had no luck retrieving the reportId key's value from the JSON response:
package com.src.test.api;
import static io.restassured.RestAssured.given;
import io.restassured.path.json.JsonPath;
import io.restassured.response.Response;
public class GetReportId {
public void getReportId(String reportName) throws Exception {
String searchReports = "http://localhost:8080/reports";
Response resp=given().request().when().get(searchReports).then().extract().response();
JsonPath jsonPath = new JsonPath(resp.asString());
String reportId1 =jsonPath.get("$.find{it.children.contains(restAssuredJsonRootObject.$.children.find{it.attr.reportShortName == 'ABC'})}.attr.reportId");
String reportId2 = jsonPath.get("$.find{it.children.attr.reportShortName.contains(restAssuredJsonRootObject.$.children.find{it.attr.reportShortName.equals('XYZ')}.attr.reportShortName)}.attr.reportId");
System.out.println("ReportId: " + reportId1);
}
}
There could be multiple JSON objects in the parent anonymous array, and I need to use find or findAll within the Groovy closures to get the reportId.
I need to get the reportId, but it seems that something is wrong. Any help would be appreciated.
Assuming you want all the reportIds
List<String> reportIds = jsonPath.get("children.flatten().attr.reportId");
will give you what you want, even if the parent anonymous array has multiple entries.
I tested with the following JSON
[
{
"children": [
{
"attr": {
"reportId": "1",
"reportShortName": "ABC",
"description": "test"
}
},
{
"attr": {
"reportId": "2",
"reportShortName": "XYZ",
"description": "test"
}
}
]
},
{
"children": [
{
"attr": {
"reportId": "3",
"reportShortName": "DEF",
"description": "test"
}
},
{
"attr": {
"reportId": "4",
"reportShortName": "IJK",
"description": "test"
}
}
]
}
]
and it gives me ["1", "2", "3", "4"], i.e. the reportIds from all the children.
If you know the index of the reportId you're looking for then you can use it like so:
String reportId = jsonPath.get("children.flatten().attr.reportId[0]");
If you're looking for the reportId of a particular report you can do that too:
String reportId = jsonPath.get("children.flatten().attr.find{it.reportShortName == 'ABC'}.reportId")
will give you "1".
Note: The type of the variable you assign the result to is important for type inference and casting. For example, you CANNOT do:
String [] reportIds = jsonPath.get("children.flatten().attr.reportId");
or
int reportId = jsonPath.get("children.flatten().attr.reportId[0]");
Both those things will throw a ClassCastException.
I want to query a Mongo collection for documents where a specific field is either missing or has a value that would evaluate to false in Python. This includes atomic values null, 0, '' (the empty string), false, []. However, arrays containing such values (such as ['foo', ''] or just ['']) are not falsey and must not be matched. Can I do this with Mongo’s structured queries (without resorting to JavaScript)?
$type doesn’t seem to help:
> db.foo.insert({bar: ['baz', '', 'qux']});
> db.foo.find({$and: [{bar: ''}, {bar: {$type: 2}}]});
{ "_id" : ObjectId("50599937da5254d6fd731816"), "bar" : [ "baz", "", "qux" ] }
This should work:
db.test.find({$or:[{a:{$size:0}},{"a.0":{$exists:true}}]})
Just make sure the a field doesn't have an object inside with the 0 key.
e.g.
> db.test.find()
{ "_id": ObjectId("5059ac3ab1cee080a7168fff"), "bar": [ "baz", "", "qux" ] }
{ "_id": ObjectId("5059ac48b1cee080a7169000"), "hello": 1, "bar": false, "world": 34 }
{ "_id": ObjectId("5059ac53b1cee080a7169001"), "hello": 1, "world": 42 }
{ "_id": ObjectId("5059ac60b1cee080a7169002"), "hello": 13, "bar": null, "world": 34 }
{ "_id": ObjectId("5059ac6bb1cee080a7169003"), "hello": 133, "bar": [ ], "world": 334 }
{ "_id": ObjectId("5059b36cb1cee080a7169004"), "hello": 133, "bar": [ "" ], "world": 334 }
{ "_id": ObjectId("5059b3e3b1cee080a7169005"), "hello": 133, "bar": "foo", "world": 334 }
{ "_id": ObjectId("5059b3f8b1cee080a7169006"), "hello": 1333, "bar": "", "world": 334 }
{ "_id": ObjectId("5059b424b1cee080a7169007"), "hello": 1333, "bar": { "0": "foo" }, "world": 334 }
> db.test.find({$or: [{bar: {$size: 0}}, {"bar.0": {$exists: true}}]})
{ "_id": ObjectId("5059ac3ab1cee080a7168fff"), "bar": [ "baz", "", "qux" ] }
{ "_id": ObjectId("5059ac6bb1cee080a7169003"), "hello": 133, "bar": [ ], "world": 334 }
{ "_id": ObjectId("5059b36cb1cee080a7169004"), "hello": 133, "bar": [ "" ], "world": 334 }
{ "_id": ObjectId("5059b424b1cee080a7169007"), "hello": 1333, "bar": { "0": "foo" }, "world": 334 }
I found this: https://jira.mongodb.org/browse/SERVER-1854?page=com.atlassian.jira.plugin.system.issuetabpanels:changehistory-tabpanel from back in the day.
I can replicate it on my MongoDB 2.0.1 (I also played around a bit to see if it picked it up as something else):
> db.g.find()
{ "_id" : ObjectId("50599eb65395c82c7a47d124"), "bar" : [ "baz", "", "qux" ] }
{ "_id" : ObjectId("5059a0005395c82c7a47d125"), "a" : 3, "b" : { "a" : 1, "b" : 2 } }
> db.g.find({bar: {$type: 4}});
> db.g.find({a: {$type: 2}});
> db.g.find({a: {$type: 16}});
> db.g.find({bar: {$type: 16}});
> db.g.find({bar: {$type: 2}});
{ "_id" : ObjectId("50599eb65395c82c7a47d124"), "bar" : [ "baz", "", "qux" ] }
> db.g.find({bar: {$type: 3}});
> db.g.find({b: {$type: 3}});
{ "_id" : ObjectId("5059a0005395c82c7a47d125"), "a" : 3, "b" : { "a" : 1, "b" : 2 } }
And when I use $type 4, I cannot get the document with the array in it. As you can see, $type 3 works fine (which could be related to: https://jira.mongodb.org/browse/SERVER-1475), but the array does not seem to be picked up.
It is possible that you are seeing a bug. If you file a JIRA on MongoDB's site (jira.mongodb.org), it could help to get the problem solved.
However, the $type op might not solve your problem anyway. This might be better handled via a client-side method, like unsetting the field entirely if it has no elements; that way you can simply query for existence. This standardises your querying patterns and makes them easier to integrate in general.
So my personal recommendation here is to standardise "falsey" values into one single coherent value, as sketched below.
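A minimal sketch of that normalisation pass with pymongo (the database, collection, and field names are assumptions, and this is illustrative, not a vetted migration script):

from pymongo import MongoClient

coll = MongoClient()["mydb"]["test"]  # assumed names
# unset "bar" wherever its value is falsey in the Python sense; non-empty
# arrays (even ones containing falsey elements) are truthy and stay untouched
for doc in coll.find({"bar": {"$exists": True}}):
    if not doc["bar"]:
        coll.update_one({"_id": doc["_id"]}, {"$unset": {"bar": ""}})

Afterwards, "missing or falsey" collapses into a plain {"bar": {"$exists": False}} query.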
Edit
I noticed they have marked the original bug as a duplicate (that's why it is closed), however I am not sure how it is a duplicate. These arrays are not being picked up as objects but rather as strings, most likely because $type acts on every element within that field rather than on the field itself (or something like that).
I would still open a JIRA and stress that the array cannot be picked up at all.