Error: "Illegal query syntax. Segment before '/' is not an entity or complex type." - nested

I am trying to use a $filter on a nested ($expand) entity.
For example, I am using the query below.
webservice/Results?$expand=FoodDescriptions&$filter=substringof('Vod', FoodDescriptions/Description)&$format=json
While executing this, I get the error below:
{
"error": {
"code": "",
"message": {
"lang": "en-US",
"value": "Illegal query syntax. Segment before '/' is not an entity or complex type."
}
}
}
I got this query from the below public service example.
https://services.odata.org/V2/Northwind/Northwind.svc/Orders?$expand=Customer&$filter=substringof('Henriette', Customer/ContactName)&$format=json
In that case I get a proper response, and anyone can test it.
Can someone tell me what could be the reason for this particular error in my code?

Your problem most likely lies in your ODataConfig - you don't have FoodDescriptions registered as an entity set.
If you go over here [https://services.odata.org/V2/Northwind/Northwind.svc/$metadata], you can find this line: <EntitySet Name="Customers" EntityType="NorthwindModel.Customer"/>
In your case, you probably just forgot to register FoodDescriptions as an entity set.
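For reference, if the service is built with ASP.NET Web API OData and the convention model builder (an assumption, since the configuration isn't shown), the registration would look roughly like the sketch below; Result and FoodDescription are placeholder class names:
// ODataConfig/WebApiConfig sketch - assumes ASP.NET Web API OData and
// placeholder model classes Result and FoodDescription.
ODataConventionModelBuilder builder = new ODataConventionModelBuilder();
builder.EntitySet<Result>("Results");
// Without this registration FoodDescriptions is not an entity set, so the
// parser rejects FoodDescriptions/Description as the segment before '/'.
builder.EntitySet<FoodDescription>("FoodDescriptions");
IEdmModel model = builder.GetEdmModel();
Once both sets are registered, the expanded navigation property can be used on the left-hand side of the '/' in the $filter expression.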

Related

Groovy script assertion that validates the presence of some values in a JSON response

So I'm using a testing tool called ReadyAPI, and for scripting it uses the Groovy language. I'm not familiar with the language, and the fact that it's based on Java somehow makes it even worse.
Now I'm trying to validate a REST response in JSON with an assertion that checks that certain elements exist in the response.
This is the code that I have now:
import groovy.json.*
def response = context.expand( 'RequestResponseHere' )
def object = new JsonSlurper().parseText(response)
assert response.contains("CODE1")
assert response.contains("CODE2")
assert response.contains("CODE3")
assert response.contains("CODE4")
The assertions seem to work, but I was wondering if there is maybe a simpler way to do it than having to write so many lines, making it less 'bulky'?
Any help is greatly appreciated.
Thank you!
Added an example of the JSON data that I have to parse:
What I need is to check that the value of "code" is always part of a list of acceptable values, e.g. CODE1, CODE2, etc.
{
"_embedded": {
"taskList": [
{
"code": "CODE1",
"key": 123
},
{
"code": "CODE2",
"key": "234"
},
{
"code": "CODE3",
"key": "2323"
},
{
"code": "CODE4",
"key": "7829"
},
{
"code": "CODE5",
"key": "8992"
}
]
}
}
If you want to check that certain things are there, you can DRY that
code up with:
["CODE1", "CODE2"].each { assert response.contains(it) }
And as stated in the comments, if you want to make sure that "all are there,
but I don't care about the order", extracting the values and comparing them
as sets can shorten this:
assert object._embedded.taskList*.code.toSet() == ["CODE1", ...].toSet()
Note the use of *., which is the "spread operator". It is basically
the same as ...collect{ it.code }. Then comparing the string sets
does the rest (this is fine if you are not comparing too many items, since
the power assert will print all the items along with the failed assertion; if
you have to find the proverbial needle in the haystack, you are better
off writing something smarter to check the two sets).
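To match the actual requirement in the question (the value of "code" must always be one of a list of acceptable values), a minimal sketch building on the parsed response could look like this; the allowed list here is just an example:
import groovy.json.JsonSlurper

def response = context.expand( 'RequestResponseHere' )   // as in the question
def object = new JsonSlurper().parseText(response)
def allowed = ["CODE1", "CODE2", "CODE3", "CODE4", "CODE5"] as Set

// fails as soon as one "code" value is not in the allowed set
assert object._embedded.taskList*.code.every { it in allowed }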
The assertions seem to work, but I was wondering if there is maybe a
simpler way to do it than having to write so many lines, making it
less 'bulky'?
It's unclear what exactly is bulky, but if what you want is to lessen the number of statements, one thing you could do is use a single assert.
Instead of this:
assert response.contains("CODE1")
assert response.contains("CODE2")
assert response.contains("CODE3")
assert response.contains("CODE4")
You could have this:
assert response.contains("CODE1") &&
response.contains("CODE2") &&
response.contains("CODE3") &&
response.contains("CODE4") &&

Reading nested attribute in data source (of a Cloud Run service that might not exist) in Terraform

I'm using Terraform v0.14.4 with GCP. I have a Cloud Run service that won't be managed with Terraform (it might exist or not), and I want to read its url.
If the service exists this works ok:
data "google_cloud_run_service" "myservice" {
name = "myservice"
location = "us-central1"
}
output "myservice" {
value = data.google_cloud_run_service.myservice.status[0].url
}
But if it doesn't exist, I can't get it to work. What I've tried:
data.google_cloud_run_service.myservice.*.status[*].url
status is null
length(data.google_cloud_run_service.myservice) > 0 ? data.google_cloud_run_service.myservice.*.status[0].url : ""
Tried with join("", data.google_cloud_run_service.myservice.*.status)
I get this error: data.google_cloud_run_service.myservice is object with 9 attributes
coalescelist(data.google_cloud_run_service.myservice.*.status, <...>)
It just returns [null], and using compact over the result gets me an Invalid value for "list" parameter: element 0: string required.
Any ideas?
It seems like you are working against the design of this data source a little here. Based on the error messages you've shown, the behavior seems to be that status is null when the requested object doesn't exist and is a list when it does, so you'll need to write an expression that can deal with both situations.
Here's my attempt, based only on the documentation of the resource along with some assumptions I'm making based on the error message you included:
output "myservice" {
value = (
data.google_cloud_run_service.myservice.status != null ?
data.google_cloud_run_service.myservice.status[0].url :
null
)
}
There is another, potentially shorter way to write that, relying on the try function's ability to catch dynamic errors and return a fallback value. This does go against the recommendations in the documentation, though, because it forces an unfamiliar future reader to do a bit more guesswork to understand in which situations the expression might succeed and in which it might return the fallback:
output "myservice" {
value = try(data.google_cloud_run_service.myservice.status[0].url, null)
}

JSON schema definitions validation for internal reference

I am creating thousands of definitions and making them available remotely so that anyone can reuse the defined schemas via a remote $ref. At definition-creation time, I want something that can check each $ref and throw an error if its target is not available.
{
"definitions":{
"description100Type":{
"$ref":"#/definitions/additinalType"
}
},
"$schema":"http://json-schema.org/draft-07/schema#",
"$id":"http://example.com/root.json"
}
In the above example, description100Type refers to additinalType, which is not defined.
How can I validate this case? I am using ajv for validation.
Note: ajv throws a proper error if description100Type is referenced inside properties:
{
"properties":{
"checked":{
"$id":"#/properties/checked",
"$ref":"#/definitions/description100Type"
}
}
}
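One way to surface unresolved refs inside definitions (a sketch, assuming ajv v6 with draft-07; compileDefinitions is a hypothetical helper, not part of ajv) is to force each definition to be compiled by pointing a tiny $ref schema at it, so ajv has to resolve whatever that definition references and throws its usual MissingRefError when the target does not exist:
const Ajv = require("ajv");

// Hypothetical helper: compile every definition so ajv must resolve its $refs.
function compileDefinitions(schema) {
  const ajv = new Ajv();
  ajv.addSchema(schema); // registered under its $id (http://example.com/root.json)
  Object.keys(schema.definitions || {}).forEach(function (name) {
    // A $ref pointing at the definition makes ajv resolve what the definition
    // itself references; a missing target such as #/definitions/additinalType
    // throws "can't resolve reference ...".
    ajv.compile({ $ref: schema.$id + "#/definitions/" + name });
  });
}

compileDefinitions(require("./root.json")); // the schema from the question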

bad request error when persisting TableEntity

I have a TableEntity like this:
public class TableEntity : TableEntity
{
public string SomeXml { get; set; }
}
which contains an XML string called SomeXml. Most TableEntities persist fine, but for some I get:
{"The remote server returned an error: (400) Bad Request."}
The XML string of one of the TableEntities producing the exception contains 33933 characters. Is there a limit? Not sure how else to establish the cause of the exception. One sample XML causing the exception can be found here.
The reason you're getting this error is that the data you're trying to insert exceeds the maximum size allowed for an entity property. The maximum size of an entity property is 64 KB; however, because strings in Azure Table storage are UTF-16 encoded, a String property can hold at most 32K characters.
Because your XML is longer than 32K characters, you're getting this error.
When I tried to insert the sample data you shared in a table in my storage account I got the following error back:
{
"odata.error": {
"code": "PropertyValueTooLarge",
"message": {
"lang": "en-US",
"value": "The property value exceeds the maximum allowed size (64KB). If the property value is a string, it is UTF-16 encoded and the maximum number of characters should be 32K or less.\nRequestId:693f46ec-0002-0012-3a5a-cbcb16000000\nTime:2016-06-21T01:14:00.4544620Z"
}
}
}
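If you want to catch this before the insert fails, here is a minimal sketch of a pre-insert guard based on the limits above ("entity" is assumed to be an instance of your TableEntity class with the SomeXml property):
// Azure Table storage: a string property is UTF-16 encoded and may hold
// at most 32K (32,768) characters, i.e. 64 KB.
const int MaxStringPropertyChars = 32 * 1024;

if (entity.SomeXml != null && entity.SomeXml.Length > MaxStringPropertyChars)
{
    // Options: compress the XML, split it across several properties,
    // or store it in blob storage and keep only a reference in the table.
    throw new InvalidOperationException(
        "SomeXml is " + entity.SomeXml.Length + " characters; the Table storage limit is " +
        MaxStringPropertyChars + " characters per string property.");
}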

Firebase Security: read and write access from data values

I'm building a simple data structure and I'm hoping the Firebase security rules can accommodate it.
Right now I'm getting PERMISSION_DENIED for read privileges.
I know you're usually supposed to design your data structure around the security rules, but there are very specific reasons for this data structure, so I'd like to try to make the security rules work around it.
Here's the json export of my data:
{
"form" : {
"form" : {
"data" : "Form",
"owner" : "116296988270749049875",
"public" : true
}
},
"users" : {
"116296988270749049875" : {
"data" : "Daniel Murawsky"
}
}
}
And here's what I've got for the security rules so far:
{
"rules": {
"$form":{
"$dataId":{
".read": "data.child('public').val() == true",
".write": "data.child('owner').val() == auth.uid"
}
}
}
}
I've never seen a use case for having two $location variables, one after another, so I can imagine that being the problem. Any input is welcome.
Thanks!
UPDATE
Thanks to Frank's recommendation to use the security simulator (https://.firebaseio.com/?page=Simulator), I quickly figured out the problem with a little bit of play. Thanks Frank!
I didn't understand (even though I read it a dozen times) the concept that Rules Cascade. Once I got it, it was easy.
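For anyone hitting the same thing, the rules below are essentially the ones from the question (a sketch, not necessarily the final version used). Because .read and .write cascade downward only, a read of /form/form succeeds when public is true, but reading or querying the parent /form node as a whole is still denied, since no rule at or above that level grants it:
{
  "rules": {
    "$form": {
      "$dataId": {
        // grants read on this node and everything below it when public == true
        ".read": "data.child('public').val() == true",
        // only the owner may write here or below
        ".write": "data.child('owner').val() == auth.uid"
      }
    }
  }
}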
