Sort does not work on Text query with parse-server - node.js

The parse-server documentation is a bit outdated: http://docs.parseplatform.org/rest/guide/
Try this query:
curl -X GET \
-H "X-Parse-Application-Id: ${APPLICATION_ID}" \
-H "X-Parse-REST-API-Key: ${REST_API_KEY}" \
-G \
--data-urlencode 'where={"name":{"$text":{"$search":{"$term":"Milk"}}}}' \
--data-urlencode 'order="$score"' \
--data-urlencode 'key="$score"' \
https://localhost:1337/parse/classes/Groceries
And it will return this error:
{
  "code": 102,
  "error": "Invalid parameter for query: key"
}
Question: how do you sort by "score" when doing full-text search with Parse if the query parameters do not work?
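One thing worth trying, a sketch based on the Parse REST guide's full-text search example (the plural keys spelling is an assumption, not verified against this parse-server version), is using keys rather than key:
curl -X GET \
-H "X-Parse-Application-Id: ${APPLICATION_ID}" \
-H "X-Parse-REST-API-Key: ${REST_API_KEY}" \
-G \
--data-urlencode 'where={"name":{"$text":{"$search":{"$term":"Milk"}}}}' \
--data-urlencode 'order="$score"' \
--data-urlencode 'keys="$score"' \
https://localhost:1337/parse/classes/Groceries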

Related

Hudi delta streamer job via Apache Livy

Please help with how to pass the --props file and --source-class file to the Livy API POST.
spark-submit --packages org.apache.hudi:hudi-utilities-bundle_2.11:0.5.3,org.apache.spark:spark-avro_2.11:2.4.4 \
--master yarn \
--deploy-mode cluster \
--conf spark.sql.shuffle.partitions=100 \
--driver-class-path $HADOOP_CONF_DIR \
--class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
--table-type MERGE_ON_READ \
--source-class org.apache.hudi.utilities.sources.JsonKafkaSource \
--source-ordering-field tst \
--target-base-path /user/hive/warehouse/stock_ticks_mor \
--target-table test \
--props /var/demo/config/kafka-source.properties \
--schemaprovider-class org.apache.hudi.utilities.schema.FilebasedSchemaProvider \
--continuous
I have converted the configs you are using into a JSON file to be passed to the Livy API:
{
  "className": "org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer",
  "proxyUser": "root",
  "driverCores": 1,
  "executorCores": 2,
  "executorMemory": "1G",
  "numExecutors": 4,
  "queue": "default",
  "name": "stock_ticks_mor",
  "file": "hdfs://tmp/hudi-utilities-bundle_2.12-0.8.0.jar",
  "conf": {
    "spark.sql.shuffle.partitions": "100",
    "spark.jars.packages": "org.apache.hudi:hudi-spark-bundle_2.12:0.8.0,org.apache.spark:spark-avro_2.12:3.0.2",
    "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
    "spark.task.cpus": "1",
    "spark.executor.cores": "1"
  },
  "args": [
    "--props", "/var/demo/config/kafka-source.properties",
    "--table-type", "MERGE_ON_READ",
    "--source-class", "org.apache.hudi.utilities.sources.JsonKafkaSource",
    "--target-base-path", "/user/hive/warehouse/stock_ticks_mor",
    "--target-table", "test",
    "--schemaprovider-class", "org.apache.hudi.utilities.schema.FilebasedSchemaProvider",
    "--continuous"
  ]
}
You can submit this JSON to the Livy endpoint like this:
curl -H "X-Requested-By: admin" -H "Content-Type: application/json" -X POST -d @config.json http://localhost:8999/batches
For reference: https://community.cloudera.com/t5/Community-Articles/How-to-Submit-Spark-Application-through-Livy-REST-API/ta-p/247502
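To check on the submitted job afterwards, Livy's batch endpoints can be polled; a minimal sketch, assuming the same host and port as above (the batch id 0 is a placeholder for the id returned by the POST):
# List all batches and their state (starting, running, dead, success)
curl http://localhost:8999/batches
# Inspect one batch and its driver log; replace 0 with the id from the POST response
curl http://localhost:8999/batches/0
curl http://localhost:8999/batches/0/log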

Error in Xray REST API call for importing Test Execution Result

I know this question has been answered in many posts, but those have not helped me. I did my research and tried them, but I am still facing an issue making an API call to import a test execution result.
Approach I took:
Created Test (Test Details: Cucumber), Test Precondition, Test Set, Test Plan and Test Execution
Exported the Test using the "Xray - Export to Cucumber" option
Added this to my BDD-Cucumber framework, executed it, and it generated a cucumber.json file after execution
Tried the API call using Postman:
/api/v1/import/execution/cucumber
curl --location --request POST 'https://xray.cloud.xpand-it.com/api/v1/import/execution/cucumber' \
--header 'Authorization: Bearer $token' \
--header 'Content-Type: application/json' \
--data-binary '@/Users/aranjan/Downloads/cucumber.json'
Error:
{ "error": "Error creating Test Execution - Team is required."}
Now, this means it is trying to create a new Test Execution instead of updating the existing one.
Then I used:
/api/v1/import/execution/cucumber/multipart
curl --location --request POST 'https://xray.cloud.xpand-it.com/api/v1/import/execution/cucumber/multipart' \
--header 'Authorization: Bearer $token' \
--form 'info=@/Users/aranjan/Downloads/xrayresultimport.json' \
--form 'result=@/Users/aranjan/Downloads/cucumber.json'
Error:
{ "error": "Unexpected field (result)"}
xrayresultimport.json
{
  "fields": {
    "project": {
      "key": "HYP"
    },
    "customfield_10962": [
      "Team", "TeamQAAuto"
    ],
    "issuetype": {
      "id": "10722"
    }
  }
}
/api/v1/import/execution
curl --location --request POST 'https://xray.cloud.xpand-it.com/api/v1/import/execution' \
--header 'Authorization: Bearer $token' \
--header 'Content-Type: application/json' \
--data-raw '{
  "testExecutionKey": "HYP-3313",
  "info": {
    "startDate": "2020-09-25T11:47:35+01:00",
    "finishDate": "2020-09-25T11:53:00+01:00",
    "testPlanKey": "HYP-3341"
  },
  "tests": [
    {
      "testKey": "HYP-3330",
      "start": "2020-09-25T11:47:35+01:00",
      "finish": "2020-09-25T11:50:56+01:00",
      "comment": "Successful execution",
      "status": "PASSED"
    }
  ]
}'
{ "error": "Error updating Test Execution - Issue update failed!"}
Agenda:
I want to import the execution result into my existing Test Execution.
I request you to guide me here.
Thanks in advance.
Currently, if you use the multipart endpoint it will always create new Test Executions.
The multipart request has a minor typo: you should have "results" instead of "result". An example would be something like this:
curl -H "Content-Type: multipart/form-data" -X POST -F info=#xrayresultimport.json -F results=#cucumber.json -H "Authorization: Bearer $token" https://xray.cloud.xpand-it.com/api/v2/import/execution/cucumber/multipart
That should make it work :)
Note: concerning the last example you gave, there could be several causes for it, including restrictions on the Jira side. That would need to be analyzed by the Xray support team.
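As a side note, the $token used in these requests comes from Xray Cloud's authenticate endpoint; a minimal sketch, assuming an API key created in Xray (the client_id and client_secret values below are placeholders):
# The response body is the JWT as a quoted JSON string; tr strips the surrounding quotes
token=$(curl -s -H "Content-Type: application/json" -X POST \
--data '{"client_id": "YOUR_CLIENT_ID", "client_secret": "YOUR_CLIENT_SECRET"}' \
https://xray.cloud.xpand-it.com/api/v2/authenticate | tr -d '"')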

parse error: Invalid numeric literal at line 1, column 8

I am trying to get a token from a site using curl. It looks like the request is sent correctly, because I have to wait a bit for the response, but something goes wrong during deserialization: I always get the error parse error: Invalid numeric literal at line 1, column 8.
This is what the script looks like:
TOKEN=$(curl --request POST \
--url 'https://${DOMAIN_NAME}/getmy/token' \
--header 'content-type: application/json' \
--data '{"grant_type":"password", "username":"${USER_EMAIL}",
"password":"${USER_PASSWORD}",
"audience":"https://localhost:8443/my-composite-service", "scope":"openid
email test:read test:write", "client_id": "${CLIENT_ID}",
"client_secret": "${CLIENT_SECRET}"}' -s | jq -r .access_token)
Is it because of jq?
What is more, I am sure the env variables are there; even with hard-coded values the same error is thrown.
Thank you in advance
Some hints:
Do not put everything on one line; make it readable instead.
Structure your code with functions.
Do error handling.
Use Bash's debugging functionality.
Do not build JSON with string concatenation; use jq instead, because only jq quotes JSON data correctly. A password may contain quoting characters.
Also note that in your snippet the URL and the JSON body are wrapped in single quotes, so ${DOMAIN_NAME} and the other variables are never expanded; the server most likely answers with an HTML error page, which is exactly the kind of input that makes jq report "Invalid numeric literal".
An example:
set -eu
set -x

USER_EMAIL="user@domain.org"
USER_PASSWORD="password"
CLIENT_ID="id"
CLIENT_SECRET="secret"
DOMAIN_NAME="domain.org"

# Build the JSON request body with jq so all values are quoted safely.
data()
{
  local template='
  {
    "grant_type": "password",
    "username": $username,
    "password": $password,
    "audience": "https://localhost:8443/my-composite-service",
    "scope": "openid email test:read test:write",
    "client_id": $client_id,
    "client_secret": $client_secret
  }'
  if jq <<<null -c \
    --arg username "${USER_EMAIL}" \
    --arg password "${USER_PASSWORD}" \
    --arg client_id "${CLIENT_ID}" \
    --arg client_secret "${CLIENT_SECRET}" \
    "$template"
  then
    return
  else
    printf "ERROR: Can not format request data.\n" >&2
    exit 1
  fi
}

# Send the token request; the URL must be in double quotes so ${DOMAIN_NAME} expands.
post()
{
  if curl --request POST \
    --url "https://${DOMAIN_NAME}/getmy/token" \
    --header 'content-type: application/json' \
    --data "$1" \
    -s
  then
    return
  else
    printf "ERROR: Can not send POST request.\n" >&2
    exit 1
  fi
}

# Extract the access token from the JSON response.
token()
{
  if jq -r .access_token
  then
    return
  else
    printf "ERROR: Can not parse JSON response.\n" >&2
    exit 1
  fi
}

TOKEN="$(post "$(data)" | token)"

How to use the simple-oauth2 library to make an application/x-www-form-urlencoded request?

I have some code which works perfectly through the Mac terminal and gives me a token from the website:
curl -XPOST "https://link.com/oauth/access_token" \
-H "Content-Type: application/x-www-form-urlencoded" \
-H "Accept: 1.0" \
--data-urlencode "grant_type=client_credentials" \
--data-urlencode "client_id=myawesomeapp" \
--data-urlencode "client_secret=abc123" \
--data-urlencode "scope=read write"
I want to make the request through Node.js without curl. The website links to the npm library simple-oauth2, but my code does not work.
Here is my non-working version:
const credentials = {
  client: {
    id: 'myawesomeapp',
    secret: 'abc123'
  },
  auth: {
    tokenHost: 'https://link.com',
    tokenPath: '/oauth/access_token'
  },
  http: {
    'headers.authorization': 'headers.Accept = application/x-www-form-urlencoded'
  }
};
oauth2 = oauth2.create(credentials);
oauth2.accessToken.create()
If it's an x-www-form-urlencoded content type, you'll probably also need to update the authorisation method in your options to body (its default is header), i.e.
const credentials = {
  /*
  your existing config
  */
  options: {
    authorizationMethod: 'body'
  }
}
Hopefully that should do the trick...

Request body is not properly defined in loadtest

I was using loadtest for load testing of my Node app. I had an issue sending POST requests via loadtest. The request is of the form:
loadtest http://localhost:7000/eth/checkEthBalance -T "application/x-www-form-urlencoded" -H "application/x-www-form-urlencoded" -m "POST" --data '{"accountAddress":"0x62720366ef403c9891e2bfbd5358ee3c8a57b113"}' -n 1
But in req.body, I am getting:
{ '{"accountAddress":"0x62720366ef403c9891e2bfbd5358ee3c8a57b113"}': '' }
instead of:
{"accountAddress":"0x62720366ef403c9891e2bfbd5358ee3c8a57b113"}
However, a curl request works fine:
curl -X POST \
http://localhost:7000/eth/checkEthBalance \
-H 'cache-control: no-cache' \
-H 'content-type: application/x-www-form-urlencoded' \
-H 'postman-token: 0bf637f7-2037-c4ca-29a7-cc2310786317' \
-d accountAddress=0x62720366ef403c9891e2bfbd5358ee3c8a57b113
Don't know what's wrong with loadtest. Any help?
Here is how I did it; you need to pass -T 'application/json'.
Sample: loadtest -P '{"name" : "ashok dey", "email" : "ashokdey@gmail.com", "password" : "password123"}' -c 100 --rps 2000 http://localhost:3434/register -T 'application/json'
You have to pass the same flag when you are passing a file containing the POST data.
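Adapted to the endpoint from the question, that would be something like the sketch below; this assumes the Express app parses JSON bodies (for example with express.json() or body-parser's JSON middleware), since the payload is now JSON rather than urlencoded:
loadtest http://localhost:7000/eth/checkEthBalance \
-m "POST" \
-T 'application/json' \
-P '{"accountAddress":"0x62720366ef403c9891e2bfbd5358ee3c8a57b113"}' \
-n 1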
