In my application, all user utterances from Bixby are routed to an action that makes an API call, and the API's response is returned to Bixby for the user's utterance. Recently I observed the following exchange:
User: Add xyz to my cart
API response: Would you like 2 of those?
User: Yes
API response: Would you like to checkout?
User: Yes
At this point the request should reach my API and the user should be shown the checkout result page. Instead, Bixby serves a cached response and the user sees the same prompt again:
API response: Would you like to checkout?
and the loop continues indefinitely. Is this behaviour expected? Is there a way to skip the caching so the request is always sent to the API endpoint to respond?
Yes, requests are cached on the server. You can disable the cache if you wish.
For example,
let options = {
  cacheTime: 0
};
let response = http.getUrl('https://my-capsule.com/api/search/', options);
See https://bixbydevelopers.com/dev/docs/reference/JavaScriptAPI/http#http-options for more options. No pun intended. :)
In addition to the cacheTime provided by the client, the server can sometimes provide additional directives (max-age and no-store or no-cache) in the Cache-Control header. When this occurs, Bixby does the following:
no-cache or no-store: Bixby will not cache anything. This overrides the cacheTime value provided by the client.
time of response + max-age < current time: remove the response from the cache (even if the client requested a longer cacheTime).
time of response + cacheTime < current time: remove the response from the cache (even if the response provided a longer max-age).
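In other words, the effective lifetime of a cached response is the shorter of the client's cacheTime and the server's max-age, and no-cache/no-store wins outright. A rough sketch of that decision logic (illustrative Python, not Bixby's actual implementation):

import time

def cache_entry_is_valid(stored_at, cache_time, cache_control):
    """Illustrative only: mirrors the eviction rules described above.

    stored_at     -- epoch time the response was cached
    cache_time    -- cacheTime requested by the client, in seconds
    cache_control -- parsed Cache-Control directives, e.g. {"max-age": 60}
    """
    # no-cache / no-store from the server overrides any client cacheTime
    if "no-cache" in cache_control or "no-store" in cache_control:
        return False

    now = time.time()
    max_age = cache_control.get("max-age")

    # time of response + max-age < current time -> evict
    if max_age is not None and stored_at + max_age < now:
        return False

    # time of response + cacheTime < current time -> evict
    if stored_at + cache_time < now:
        return False

    return True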
My goal is to get consistent response data from JIRA.
Sometimes the request succeeds with Basic Authentication. Other times I get a 401 response telling me to authenticate using OAuth, even on resources where I have permission.
OAuth doesn't make sense for my application.
The API docs say Basic Authentication is acceptable.
https://docs.atlassian.com/software/jira/docs/api/REST/8.5.13/
Note that this is not JIRA cloud.
Set winHttp = CreateObject("WinHttp.WinHttpRequest.5.1")
winHttp.Open "GET", targetURL, False 'False means a blocking (synchronous) request
'targetURL = https://jira.{myOrg}.dev/rest/api/2/filter/{myFilterId}/columns
winHttp.SetRequestHeader "Authorization", "Basic " & encodedCredentials
winHttp.Send
Debug.Print winHttp.GetAllResponseHeaders 'WinHttpRequest exposes response headers, not request headers
Debug.Print winHttp.ResponseText
Debug.Print winHttp.Status
Edit: I get this HTTP Response Header
X-Seraph-LoginReason: AUTHENTICATED_FAILED
The meaning of this header is here: https://docs.atlassian.com/atlassian-seraph/2.6.1-m1/apidocs/com/atlassian/seraph/auth/LoginReason.html
It says I could not be authenticated, i.e. JIRA doesn't recognize my credentials as a valid pair.
It looks like CAPTCHA was triggered; see https://developer.atlassian.com/cloud/jira/platform/basic-auth-for-rest-apis/ (last paragraph):
A CAPTCHA is 'triggered' after several consecutive failed log in attempts, and requires the user to interpret a distorted picture of a word and type that word into a text field with each subsequent log in attempt. If CAPTCHA has been triggered, you cannot use Jira's REST API to authenticate with the Jira site.
You can check this in the error response from Jira. If there is an X-Seraph-LoginReason header with a value of AUTHENTICATION_DENIED, the application rejected the login without even checking the password. This is the most common indication that Jira's CAPTCHA feature has been triggered.
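If you want to check that header programmatically rather than eyeballing the raw response, something along these lines works (sketched here with Python's requests against a placeholder Jira URL and credentials; adapt it to whatever HTTP client you are using):

import requests

# Placeholder URL and credentials; same endpoint shape as the VBScript example above
url = "https://jira.example.dev/rest/api/2/filter/12345/columns"
resp = requests.get(url, auth=("my-username", "my-password"))

reason = resp.headers.get("X-Seraph-LoginReason", "")
if "AUTHENTICATION_DENIED" in reason:
    print("CAPTCHA has probably been triggered; log in via the web UI to clear it")
elif "AUTHENTICATED_FAILED" in reason:
    print("Jira rejected the username/password pair")
else:
    print(resp.status_code, reason)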
I'm calling the Shipment/ConfirmShipment endpoint, and it's returning success in both .NET and Postman, but using Postman the shipment actually gets confirmed, while in .NET it doesn't.
In Postman, I'm doing a POST with the body containing JSON for the shipment number:
{"entity":{"ShipmentNbr":{"value":"022025"}}}
In .NET, I'm doing an HttpClient.PostAsync() to the same URL with the same JSON as above. Both return success with a 202 Accepted response. However, as I mentioned, the Postman call confirms the shipment (Confirm = 1, Status = F), but the .NET POST doesn't actually confirm it. Any ideas what might be preventing it?
API v17.200.001
The status 202 Accepted might be a bit confusing, but it does not mean that the action has completed successfully.
It only means that the execution request is valid and has been accepted by the system.
If you want to monitor the status of the action itself, you will need to use the address given in the Location header of the 202 Accepted response.
So in my example that follows, you can see that my action has been accepted, and when I request the status of the operation I then get 204 No Content, which is the success response.
Here I have requested execution of the Confirm Shipment action; it has been accepted, and the response shows the URL where I can fetch the status of the action.
Here I have requested the status of the action and can see the successful result.
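In code, the flow looks roughly like this (a Python sketch with a placeholder instance URL and an already-authenticated session; the same idea applies to HttpClient in .NET): POST the action, read the Location header from the 202, then GET that URL until it stops returning 202.

import time
import requests
from urllib.parse import urljoin

# Placeholder base URL; assume the session has already been logged in
base = "https://myinstance.acumatica.com/entity/Default/17.200.001"
session = requests.Session()

# 1. Request execution of the Confirm Shipment action
resp = session.post(f"{base}/Shipment/ConfirmShipment",
                    json={"entity": {"ShipmentNbr": {"value": "022025"}}})
assert resp.status_code == 202  # accepted, not necessarily finished

# The Location header may come back as a relative path
status_url = urljoin(resp.url, resp.headers["Location"])

# 2. Poll the Location URL until the action completes
while True:
    status = session.get(status_url)
    if status.status_code == 204:   # No Content = success
        break
    if status.status_code != 202:   # anything else = failure
        raise RuntimeError(f"Action failed: {status.status_code} {status.text}")
    time.sleep(1)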
You can find more information here about the execution of an action through the REST API.
I recommend taking a look at the response section.
https://help-2017r2.acumatica.com/(W(1))/Main?ScreenId=ShowWiki&pageid=91bf9106-062a-47a8-be1f-b48517a54324
Here is more information about the 202 and 204 HTTP responses:
https://httpstatuses.com/202
https://httpstatuses.com/204
I found out that editing a full_description of a DockerHub repository can be done via a JavaScript API, and figured this would be a fun excuse to learn the requests package for python. The JavaScript API definitely works, e.g. using this simple docker image.
The JS API basically does
Send a POST request to https://hub.docker.com/v2/users/login with the username and password. The server responds with a token.
Send a PATCH request to the specific repository at https://hub.docker.com/v2/repositories/{user or org}/{repo}, making sure the request has an Authorization: JWT {token} header, in this case with a body of {"full_description":"...value..."}.
What is troubling is that the PATCH request on the Python side gets a 200 response back from the server (if you intentionally set a bad auth token, you get denied as expected). But its response actually contains the current information (not the patched info).
The only "discoveries" I've made:
If you turn on debug logging, there's a 301. But it's the same URL the JavaScript side uses, so it shouldn't matter?
send: b'{"full_description": "TEST"}'
reply: 'HTTP/1.1 301 MOVED PERMANENTLY\r\n'
The token received by doing a POST in requests is the same as if I GET to auth.docker.io as described in the Getting a Bearer Token section here. Notably, I didn't specify a password (just did curl -X GET ...). This is not true. They are different; I don't know how I thought they were the same.
This second one makes me feel like I'm missing a step. Like I need to decode the token or something? I don't know what else to make of this, especially the 200 response from the PATCH despite no changes.
The code:
import json
from textwrap import indent
import requests
if __name__ == "__main__":
    username = "<< SET THIS VALUE >>"
    password = "<< SET THIS VALUE >>"
    repo = "<< SET THIS VALUE >>"

    base_url = "https://hub.docker.com/v2"
    login_url = f"{base_url}/users/login"
    repo_url = f"{base_url}/repositories/{username}/{repo}"

    # NOTE: if I use a `with requests.Session()`, then I'll get
    #   CSRF Failed: CSRF token missing or incorrect
    # Because I think that csrftoken is only valid for login page (?)

    # Get login token and create authorization header
    print("==> Logging into DockerHub")
    tok_req = requests.post(login_url, json={"username": username, "password": password})
    token = tok_req.json()["token"]
    headers = {"Authorization": f"JWT {token}"}

    print(f"==> Sending PATCH request to {repo_url}")
    payload = {"full_description": "TEST"}
    patch_req = requests.patch(repo_url, headers=headers, json=payload)

    print(f" Response (status code: {patch_req.status_code}):")
    print(indent(json.dumps(patch_req.json(), indent=2), " "))
Additional information related to your CSRF problem when using requests.Session():
It seems that Docker Hub does not accept the CSRF token when it is only sent under its default csrftoken cookie name in this case.
Instead, when the token is sent in an X-CSRFToken header on the following requests, the CSRF check passes.
The reason is probably the cookie-to-header token pattern.
Once the session headers are updated with the cookie from the login response:
s.headers.update({"X-CSRFToken": s.cookies.get("csrftoken")})
there is no need to set the JWT token manually for further requests; the token already works as a cookie.
Sorry, not enough privileges to just comment, but I think this is relevant enough.
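For completeness, a sketch of what the Session-based variant might look like (this just combines the pattern above with the original script; note the trailing slash discussed in the accepted answer below):

import requests

username = "<< SET THIS VALUE >>"
password = "<< SET THIS VALUE >>"
repo = "<< SET THIS VALUE >>"
base_url = "https://hub.docker.com/v2"

with requests.Session() as s:
    # Log in; the JWT and csrftoken come back as cookies on the session
    s.post(f"{base_url}/users/login",
           json={"username": username, "password": password})

    # Copy the csrftoken cookie into the header Docker Hub expects
    s.headers.update({"X-CSRFToken": s.cookies.get("csrftoken")})

    # Trailing slash matters (see the accepted answer below)
    patch_req = s.patch(f"{base_url}/repositories/{username}/{repo}/",
                        json={"full_description": "TEST"})
    print(patch_req.status_code, patch_req.json().get("full_description"))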
As it turns out, the JWT {token} auth was valid the entire time. Apparently, you need a / at the end of the URL; without it, nothing happens. LOL!
# ----------------------------------------------------V
repo_url = f"{base_url}/repositories/{username}/{repo}/"
As expected, the PATCH then responds with the updated description, not the old description. WOOOOT!
Important note: this is working for me as of January 15th 2020, but in my quest I came across this dockerhub issue that seems to indicate that if you have 2FA enabled on your account, you can no longer edit the description using a PATCH request. I don't have 2FA on my account, so I can (apparently). It's unclear what the future of that will be.
Related note: the JWT token has remained the same the entire time, so for any web novices like myself, don't share those ;)
I am using Node.js and Express, with the "paypal-ec" node package, to do an in-context PayPal integration.
This is invoked with the following piece of code:
<script src='https://www.paypalobjects.com/js/external/dg.js' type='text/javascript'></script>
<script>
  var dg = new PAYPAL.apps.DGFlow({
    trigger: 'paypal_submit',
    expType: 'instant'
  });
</script>
What I have achieved with this
I am able to make a payment in the PayPal sandbox environment, but it shows the older payment screen where the user needs to fill in details such as address (I am not able to attach a screenshot because of credits).
What I want to achieve
I am trying to make the payment with the screens where the user doesn't need to pre-fill any data, which also give a better UI.
Something like the experience provided in this plnkr link: http://plnkr.co/edit/3vfNSVRyq86pDR5mH4HH?p=preview
The problem with the code in that plunk is that it doesn't show what is in the action method, or how I can pass the amount (or any other details) to it.
Any kind of help is appreciated.
I get it, but I don't claim to be a Node dev - yet :) so this is "conceptual":
At the end of the day, the server-side call (SetExpressCheckout), where you send your transaction details (items, price, return/cancel URLs, etc.) to PayPal and obtain a token, is unchanged (with the documented limitations and ignored params, that is).
The change is in the front end, where two things are in play:
the in-context JS script: <script async src="//www.paypalobjects.com/api/checkout.js"></script>
the new redirect URL: https://www.paypal.com/checkoutnow?token=[the token you obtained]
The linked sample's server-side SetExpressCheckout call is:
http://166.78.8.98/cgi-bin/aries.cgi?sandbox=1&direct=1&returnurl=http://166.78.8.98/cgi-bin/return.htm&cancelurl=http://166.78.8.98/cgi-bin/cancel.htm
You can see the returnurl and cancelurl being set (though that could have been done server-side as well). This call obtains the token needed for the subsequent steps.
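For illustration, a server-side SetExpressCheckout call against PayPal's classic NVP sandbox endpoint looks roughly like this (sketched in Python just to show the shape of the request; the credentials, amount and URLs are placeholders, and in Express you would issue the same POST with your HTTP client or the paypal-ec package):

import requests
from urllib.parse import parse_qs

# Placeholder sandbox API credentials and transaction details
params = {
    "METHOD": "SetExpressCheckout",
    "VERSION": "124.0",
    "USER": "api-username",
    "PWD": "api-password",
    "SIGNATURE": "api-signature",
    "PAYMENTREQUEST_0_AMT": "10.00",
    "PAYMENTREQUEST_0_CURRENCYCODE": "USD",
    "PAYMENTREQUEST_0_PAYMENTACTION": "Sale",
    "RETURNURL": "https://myapp.example.com/return",
    "CANCELURL": "https://myapp.example.com/cancel",
}
resp = requests.post("https://api-3t.sandbox.paypal.com/nvp", data=params)
nvp = parse_qs(resp.text)

token = nvp["TOKEN"][0]  # e.g. "EC-..."
# This is the URL the front end redirects to (in-context)
redirect_url = f"https://www.sandbox.paypal.com/checkoutnow?token={token}"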
If you inspect the traffic, you'll see the response containing the redirect that is "caught" by the front end and displayed "in-context":
HTTP/1.1 302 Found
Date: Sun, 05 Jul 2015 16:00:48 GMT
Server: Apache/2.4.7 (Ubuntu)
Access-Control-Allow-Origin: *
Location: https://www.sandbox.paypal.com/checkoutnow?useraction=commit&token=EC-94X58918K2362702E&ul=0
This sample is probably more detailed and "less magical" (shows more of what's going on) and is what helped me implement:
http://plnkr.co/edit/UhNka4VaaRRGY1TK32LE?p=preview
Hth.
Using a browser REST client to POST to the activity stream at e.g.
https://connectionsww.demos.ibm.com/connections/opensocial/basic/rest/activitystreams/#me/#all
...with the settings prescribed in IBM Connections OpenSocial API > POSTing new events
...results in the following response:
<error xmlns="http://www.ibm.com/xmlns/prod/sn">
<code>403</code>
<message>You are not authorized to perform the requested action.</message>
<trace></trace>
</error>
What am I missing?
This same approach works nicely on IBM Connections 4.0.
Which setting needs 'switching on'?
Try a URL like this... https://sbtdev.swg.usma.ibm.com:444/connections/opensocial/basic/rest/activitystreams/#me/#all
I added the Basic/Rest component, and it worked for me.
1 - Added URL https://sbtdev.swg.usma.ibm.com:444/connections/opensocial/basic/rest/activitystreams/#me/#all
2 - Changed Method to Post
3 - Added Content-Type: application/json
4 - Authentication -> Basic
5 - Logged IN
6 - Posted
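The same steps from a script look roughly like this (a sketch with Python's requests; the host, credentials and entry fields are placeholders, and the selectors are spelled @me/@all since a # would be treated as a URL fragment by most HTTP clients; see the POSTing new events documentation referenced in the question for the full entry format):

import requests

# Placeholder host and credentials
url = ("https://sbtdev.swg.usma.ibm.com:444/connections/opensocial/basic"
       "/rest/activitystreams/@me/@all")

# Minimal illustrative ActivityStreams entry; adjust fields per the IBM docs
entry = {
    "actor": {"id": "@me"},
    "verb": "post",
    "title": "Test event posted via the API",
    "object": {
        "objectType": "note",
        "id": "test-entry-1",
        "displayName": "Test event",
    },
}

resp = requests.post(url, json=entry,                      # step 3: JSON body
                     auth=("my-username", "my-password"))  # step 4: Basic authentication
print(resp.status_code, resp.text)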
Same thing here: 403 when I make an AJAX call to an IBM Connections 6.0 REST API URL. Same error in Chrome, Firefox and IE11. When I open the same URL in a separate browser tab, everything works fine.
Comparing the HTTP headers of both calls, and fiddling with Postman, the difference is the presence and value of the Origin header.
It seems that Connections allows calls coming from its own server, for example when Origin: connections.mycompany.com.
It also allows calls when Origin is not defined, which happens when the URL is opened in a separate browser tab.
There is a doc at IBM's Support site that confirms this - http://www-01.ibm.com/support/docview.wss?uid=swg21999210. It also suggests a workaround that did the job for me: unsetting the Origin attribute in the IBM HTTP Server that sits in front of your Connections instance. Add the lines below to the httpd.conf file:
Header unset Origin
RequestHeader unset Origin
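You can reproduce the difference from a script by sending the same request with and without an Origin header (a quick check sketched with Python's requests; the host, path and credentials are placeholders):

import requests

url = ("https://connections.mycompany.com/connections/opensocial/basic"
       "/rest/activitystreams/@me/@all")
auth = ("my-username", "my-password")

# No Origin header: behaves like opening the URL in a separate browser tab
ok = requests.get(url, auth=auth)

# Foreign Origin header: reproduces the 403 seen from the AJAX call
denied = requests.get(url, auth=auth,
                      headers={"Origin": "https://someotherhost.example.com"})

print(ok.status_code, denied.status_code)  # e.g. 200 vs. 403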