Reindex Solr in Crafter delivery version 3 - crafter-cms

I am trying to reindex Solr on one of our Crafter delivery nodes with this curl command:
curl "http://{hostname}:{port}/api/1/target/deploy/{environment}/{siteName}" -X POST -H "Content-Type: text/json" -d '{ "reprocess_all_files": true }'
I changed the URL in the curl command to match our configuration, but I am getting the error {"message":"Content type 'text/json;charset=UTF-8' not supported"}
Any suggestions would be much appreciated.

The content type seems to be wrong: it should be application/json, not text/json. The documentation appears to contain an error.
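For example, the same call from the question with the corrected header (substitute your own hostname, port, environment, and site name) would be:
curl "http://{hostname}:{port}/api/1/target/deploy/{environment}/{siteName}" -X POST -H "Content-Type: application/json" -d '{ "reprocess_all_files": true }'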

Related

POST CSV/text file using cURL

How can I send a POST request with a CSV or text file to a server running on localhost using cURL?
I have tried curl -X POST -d @file.csv http://localhost:5000/upload but I get
{
"message": "The browser (or proxy) sent a request that this server could not understand."
}
My server is a flask_restful API. Thanks a lot in advance.
There are many alternative ways to accomplish this. One way I have used is the following:
curl -F 'data=@<file_location>' <URL>
E.g. curl -F data=@data.csv localhost:5000/h
Your command can also be changed slightly, like this:
curl -X POST -H 'Content-Type: text/csv' -d @file.csv http://localhost:5000/upload
The above is one of many ways. The file can be sent as part of a form, as data, as multipart, etc. For more detail, you can refer to the Medium post on the topic.
Curl's default Content-Type is application/x-www-form-urlencoded, so your problem is probably that the data you are POSTing is not actually form data. It might work if you set the Content-Type header properly:
-H "Content-Type: text/csv"
Though it does depend on the server.
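If the endpoint expects the raw file contents rather than form fields, an alternative worth trying is --data-binary, which, unlike -d, does not strip newlines when reading from a file:
curl -X POST -H 'Content-Type: text/csv' --data-binary @file.csv http://localhost:5000/upload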

URL encoding with SharePoint URL and cURL

I am new to using cURL. I am trying to connect to a SharePoint URL and pull the data in the form of JSON. My URL looks like this:
.../_api/web/lists/GetByTitle('titles list')/items
If I give the URL as it is, without any encoding, it fails with an HTTP/1.1 400 Bad Request error.
I tried using -G and --data-urlencode like below:
curl -v -G -L --ntlm --user user:password -H 'Accept: application/json;odata=verbose' ".../_api/web/lists/GetByTitle" --data-urlencode "('titles list')" -d "/items"
Doing this converts my URL to .../_api/web/lists/GetByTitle?%28%27titles%20list%27%29&/items
But it fails with HTTP/1.1 404 Not Found, because -G appends the data to the URL with ? and &, and that altered URL no longer matches the endpoint, hence the 404.
I have no issues accessing other endpoints like ../_api/web/lists, I guess because there is nothing to encode there.
How do I properly encode my URL and get the data without any errors?
Any help would be appreciated. Thank you.
I was able to solve the issue by giving the URL with the encoded characters directly to the cURL command, like this:
curl -v -L --ntlm --user user:password -H 'Accept: application/json;odata=verbose' ".../_api/web/lists/GetByTitle%28%27titles%20List%27%29/items"
I hope this helps someone. I know I am no expert with Linux or cURL, and better answers are welcome.
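If you would rather not hand-encode the characters, a minimal sketch (assuming python3 is available on the machine) is to generate the percent-encoded path segment first and splice it into the URL:
SEGMENT=$(python3 -c "import urllib.parse; print(urllib.parse.quote(\"GetByTitle('titles list')\"))")
curl -v -L --ntlm --user user:password -H 'Accept: application/json;odata=verbose' ".../_api/web/lists/${SEGMENT}/items"
urllib.parse.quote leaves letters, digits, and a few safe characters alone and percent-encodes the rest, including the parentheses, quotes, and space in this path segment.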

How to call cURL without using server-side cache?

Is there a way to tell the curl command not to use the server-side cache?
e.g; I have this curl command:
curl -v www.example.com
How can I ask curl to send a fresh request that does not use the cache?
Note: I am looking for an executable command in the terminal.
I know this is an older question, but I wanted to post an answer for users with the same question:
curl -H 'Cache-Control: no-cache' http://www.example.com
This curl command asks the web server, via its request header, to return non-cached data.
The -H 'Cache-Control: no-cache' argument is not guaranteed to work because the remote server or any proxy layers in between can ignore it. If it doesn't work, you can do it the old-fashioned way, by adding a unique querystring parameter. Usually, the servers/proxies will think it's a unique URL and not use the cache.
curl "http://www.example.com?foo123"
You have to use a different querystring value every time, though. Otherwise, the server/proxies will match the cache again. To automatically generate a different querystring parameter every time, you can use date +%s, which will return the seconds since epoch.
curl "http://www.example.com?$(date +%s)"
Neither -H 'Pragma: no-cache' nor -H 'Cache-Control: no-cache' helped me. In the browser, a full reload (Cmd+Shift+R) showed a newer version than the output of curl in the terminal.
How to debug for yourself
To get the exact same result, I went to the browser > F12 (Dev Tools) > Network > right-clicked the request > "Copy as cURL", which gave me the cURL command equivalent to the browser's call.
Then I pasted that into the terminal and started removing the parameters one by one, until I found that, surprisingly, --compressed was what made the difference in my case (calling AWS CloudFront).
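For context, --compressed tells curl to advertise compression support via an Accept-Encoding request header (its exact value depends on how curl was built) and to decompress the response body automatically; you can inspect what it actually sends with verbose mode:
curl -v --compressed https://www.example.com -o /dev/null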
You could try the following ways to force curl not to use cached responses.
Note: The server may or may not be configured to respect the Cache-Control header. Therefore, whether this method will work is dependent on the server or website we’re sending the HTTP request to.
curl command with the Cache-Control header
$ curl -H 'Cache-Control: no-cache, no-store' http://www.example.com
Adding the Pragma HTTP Header
$ curl -H 'Pragma: no-cache' http://www.example.com
Finally, the most common way: bypass the cache by changing the URL
curl -H 'Cache-Control: no-cache, no-store' "http://www.example.com?$(date +%s)"
My problem was that I needed to quote a URL containing a query string; zsh otherwise treats the unquoted ? as a glob pattern.
My code
if (event.queryStringParameters && event.queryStringParameters['Name']) {
  responseMessage = 'Hello, ' + event.queryStringParameters['Name'] + '!';
}
My requests
curl https://cssrq1srud.execute-api.us-east-1.amazonaws.com/serverless_lambda_stage/hello?Name=Terraform
returns zsh: no matches found
curl 'https://cssrq1srud.execute-api.us-east-1.amazonaws.com/serverless_lambda_stage/hello?Name=Terraform'
returns {"message":"Hello, Terraform!"}
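Escaping just the question mark also works in zsh, since the bare ? is what triggers filename globbing:
curl https://cssrq1srud.execute-api.us-east-1.amazonaws.com/serverless_lambda_stage/hello\?Name=Terraform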
This didn't work for me:
curl -H 'Cache-Control: no-cache' http://www.example.com
but this ended up doing the trick:
curl -H 'Cache-Control: no-cache' "http://www.example.com?someFakeParam=$RANDOM"
The ?someFakeParam=$RANDOM part makes the URL unique each time and bypasses caching. (Note the quotes: unquoted, the shell would interpret the ? itself, and $RANDOM is a bash/zsh feature that expands to a new number on each call.)

GET request with curl. No errors, but Content-Length equals 0

A simple GET request with curl returns an empty body (Content-Length: 0):
curl -v https://www.flyorientthai.com/booking/en/index.php
On the other hand, wget handles that URL just fine:
wget https://www.flyorientthai.com/booking/en/index.php
What's wrong with curl?
It turned out the 'Connection: Keep-Alive' header is required. wget adds it to the request by default, but curl does not.
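So, presumably, adding that header explicitly should make the curl call behave like the wget one:
curl -v -H 'Connection: Keep-Alive' https://www.flyorientthai.com/booking/en/index.php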
For anyone using "Copy as cURL" in the Chrome developer tools Network tab to generate a curl command: one of the lines in the generated command is a header that looks something like...
-H 'If-None-Match: W/"18f6a6-6p8AL7X/p71IhN/WztZm60Aue4k"'
That causes the server to return an empty 304 response if nothing has changed. Just remove that header.
Let's try "Content-Type: application/x-www-form-urlencoded"

How can I replicate a CouchDB database?

I would like to replicate the CouchDB database https://github.com/kirel/detexify-data
Sadly, I get an error:
$ curl -X POST -H "Content-Type:application/json" -d '{"source":"https://kirelabs.cloudant.com/detexify","target":"detexify"}' http://localhost:5984/_replicate
{"error":"checkpoint_commit_failure","reason":"Failure on source commit: {error,<<\"unauthorized\">>}"}
This seems to be the following error: https://issues.apache.org/jira/browse/COUCHDB-1524
Can anybody please tell me if there is a work-around? How can I get the data?
Failed attempts
$ curl -X POST -H "Content-Type:application/json" \
-d '{"source":"https://kirelabs.cloudant.com/detexify",
"target":"detexify",
"use_checkpoints":false}'
http://localhost:5984/_replicate
{"error":"checkpoint_commit_failure",
"reason":"Failure on source commit: {error,<<\"unauthorized\">>}"}
Well, a quick and easy solution is to not use checkpoints:
{"source":"https://kirelabs.cloudant.com/detexify","target":"detexify","use_checkpoints":false}
I tried it with this request and managed to replicate your data.
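Spelled out as a full request against the same endpoint as in the question, that is:
curl -X POST -H "Content-Type:application/json" -d '{"source":"https://kirelabs.cloudant.com/detexify","target":"detexify","use_checkpoints":false}' http://localhost:5984/_replicate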
Update
Well, I tried it again and replication seems to be working fine for me. I replicated 22.7 MB of data before cancelling the replication. I have attached a screenshot.
curl -X POST -H "Content-Type:application/json" -d '{"source":"https://kirelabs.cloudant.com/detexify",
"target":"detexify",
"use_checkpoints":false,"create_target":true}' http://ABBA:dancing-queen#localhost:5984/_replicate
The replication command was copied verbatim from your updated example except for the "create_target" option and admin basic authentication on my local database.
