Why is the Azure Logic app HTTP module modifying the response payload? - azure

I'm trying to fetch data from a ticketing system with a Logic App, using the built-in HTTP action.
When testing with Postman, I get the following response:
GET: https://ticketsystem/api/ticket/{{number}}
{
    "tickets": [
        {
            "links": {
                "data1": {
                    "id": 4
                },
                "data2": {
                    "id": 3
                },
                "data3": {
                    "id": 969
                }
                ...
            },
            "data1Id": 4,
            "data2Id": 3,
            "data3Id": 969,
            "att1": 1,
            "att1": 2,
            "att1": 3,
            "att1": 4
            ....
        }
    ]
}
But when calling the same endpoint through the Logic App HTTP action, this is the response:
{
    "data1Id": 4,
    "data2Id": 3,
    "data3Id": 969,
    "att1": 1,
    "att1": 2,
    "att1": 3,
    "att1": 4
    ...
}
Everything else is the same; I have even tried a new Logic App and a completely different Azure account, with the same result.
I've looked through the HTTP response headers, and there are some differences.
Postman:
Cache-Control: no-cache
Pragma: no-cache
Content-Type: application/vnd.api+json; charset=utf-8
Content-Encoding: gzip
Expires: -1
Vary: Accept-Encoding
Server: Microsoft-IIS/10.0
X-PS-ActionTime: 00:00:00.0022451
X-Frame-Options: deny
X-XSS-Protection: 1; mode=block
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
Date: Wed, 16 Jun 2021 09:41:50 GMT
Content-Length: 819
Azure HTTP:
"Pragma": "no-cache",
"Vary": "Accept-Encoding",
"X-PS-ActionTime": "00:00:00.0022021",
"X-Frame-Options": "deny",
"X-XSS-Protection": "1; mode=block",
"Strict-Transport-Security": "max-age=31536000; includeSubDomains; preload",
"Cache-Control": "no-cache",
"Date": "Wed, 16 Jun 2021 09:43:27 GMT",
"Server": "Microsoft-IIS/10.0",
"Content-Type": "application/json; charset=utf-8",
"Expires": "-1",
"Content-Length": "1733"
It looks like "Content-Encoding: gzip" is missing from the Logic App response, but I do not understand why that would affect the overall response structure, or how to fix it.
I have tried to enable "Allow chunking", without any luck.
I understand that I could create an Azure Function to work around this, but I'm trying to avoid that for now.
Any advice?
EDIT
I tested with PowerShell's Invoke-WebRequest, and it behaves the same as the Logic App HTTP action.
From PowerShell, the headers are also the same (missing Content-Encoding: gzip), and Content-Type is application/json; charset=utf-8.
But when testing with Python (3.9) and the requests module, it returns the same data as Postman:
Content-Type: application/vnd.api+json; charset=utf-8
Content-Encoding: gzip
I am really trying to understand the difference at the header level, as this is the only difference between the responses, and also what application/vnd.api+json and Content-Encoding: gzip do here.
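For reference, here is a small sketch (against a placeholder URL) of how to compare clients at the header level with Python requests: print the request headers the client actually sent next to the Content-Type and Content-Encoding the server returned. By default requests sends Accept: */* and Accept-Encoding: gzip, deflate, which are the client-side headers most likely to drive such a difference.
import requests

# Placeholder URL standing in for https://ticketsystem/api/ticket/{{number}}
url = "https://ticketsystem/api/ticket/123"

resp = requests.get(url)

# What the client actually sent (requests adds Accept: */* and
# Accept-Encoding: gzip, deflate by default):
print(resp.request.headers)

# What the server negotiated in return:
print(resp.headers.get("Content-Type"))
print(resp.headers.get("Content-Encoding"))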

Did you check this:
https://learn.microsoft.com/en-us/azure/connectors/connectors-native-http#omitted-http-headers
I see a feedback item for this too: https://feedback.azure.com/forums/287593-logic-apps/suggestions/42578674-enable-support-for-content-type-header-in-http-get

I've solved it.
I simply put this as a header on the HTTP Action:
"Accept": "application/vnd.api+json; charset=utf-8"
And the response message was the same as in Postman.
This still does not answer why it behaves differently, since none of the request headers had this value in any of the methods I tried.
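As a sanity check, here is a minimal sketch (placeholder URL again) of the same fix from Python: explicitly requesting the JSON:API representation via the Accept header, mirroring the header added to the Logic App HTTP action, should return the fuller payload seen in Postman.
import requests

# Placeholder URL standing in for https://ticketsystem/api/ticket/{{number}}
url = "https://ticketsystem/api/ticket/123"

# Ask the API explicitly for its JSON:API representation.
resp = requests.get(url, headers={"Accept": "application/vnd.api+json; charset=utf-8"})

print(resp.headers.get("Content-Type"))  # expected: application/vnd.api+json; charset=utf-8
print(resp.json())                       # expected: the full "tickets" payload, as in Postman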

Related

Python requests too slow compared to Postman or cURL

I am trying to make a single API call using different approaches in Python 3, but they are all incredibly slow compared with the same request in Postman and/or cURL.
This is what I am trying:
headers = {
    "Authorization": "Bearer " + self.access_token,
    "Content-Type": "application/json",
    "Accept-Encoding": "gzip, deflate, br",
    "Connection": "keep-alive",
    "Accept": "*/*",
    "User-Agent": "PostmanRuntime/7.28.2",
    "Cache-Control": "no-cache"
}
session = requests.Session()
api_res = session.get(self.api_endpoint, headers=headers, timeout=self.req_timeout)
When running this call, it gets stuck for a few minutes until I receive a response. If I use Postman, for example, I get the result in one second or less.
I also tried using http.client and urllib3, but I still see a huge delay in the call.
I also tried debugging the call:
import http.client
import logging

http.client.HTTPConnection.debuglevel = 1
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
Debug response:
[DEBUG] Starting new HTTPS connection (1): ...:443
header: Content-Type: application/json; charset=utf-8
header: Content-Length: 61
header: Connection: keep-alive
header: Date: ...
header: Cache-Control: no-cache
header: Pragma: no-cache
header: Expires: -1
header: WWW-Authenticate: Bearer
header: Access-Control-Allow-Origin: *
header: P3P: CP="IDC DSP COR ADM DEVi TAIi PSA PSD IVAi IVDi CONi HIS OUR IND CNT"
header: Access-Control-Allow-Methods: GET, POST, OPTIONS, HEAD
header: X-XSS-Protection: 1; mode=block
header: Access-Control-Allow-Headers: accept, authorization, content-type, Cache-Control, P3P, GE-IGNORE-CACHE, Signature, fromMobile, ssoToken, fromAdmin, fromGameapp, fromGtv, fromGameapp, ManagerOrgUnitUserName, ManagerOrgUnitId, g-s-x, g-s-x-t, g-s-i-i, User-Agent, Referer, Origin, Access-Control-Allow-Headers
header: X-Content-Type-Options: nosniff
header: X-XSS-Protection: 1; mode=block
header: Referrer-Policy: same-origin
header: Strict-Transport-Security: max-age=31536000; includeSubDomains
header: X-Cache: Error from cloudfront
header: Via: 1.1 ....cloudfront.net (CloudFront)
header: X-Amz-Cf-Pop: ORD52-C1
header: X-Amz-Cf-Id: ...
Any ideas about why it is so slow? Why doesn't this happen when I make the same request with Postman, for example?
Even a request to Google takes a long time:
requests.get("https://www.google.com")
I also realized that it works over IPv4 and NOT over IPv6.
Thanks!
I found out that IPv6 was not working but IPv4 was. I had to force calls to IPv4 like this:
import socket
import requests.packages.urllib3.util.connection as urllib3_cn
urllib3_cn.allowed_gai_family = lambda: socket.AF_INET
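For completeness, here is the workaround in context as a minimal sketch. Note that allowed_gai_family is an internal urllib3 attribute, so this may break across urllib3 versions, and the endpoint URL below is a placeholder.
import socket
import requests
import requests.packages.urllib3.util.connection as urllib3_cn

# Monkey-patch urllib3's address-family selection so getaddrinfo()
# only returns IPv4 addresses, avoiding stalled IPv6 connection attempts.
urllib3_cn.allowed_gai_family = lambda: socket.AF_INET

# The patch must run before any request is made; hypothetical endpoint.
resp = requests.get("https://api.example.com/data", timeout=10)
print(resp.status_code, resp.elapsed)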

{"message": "Cannot send an empty message", "code": 50006} gitlab

I want to integrate a bot via a webhook between GitLab and Discord, so I configured the bot first, copied its URL, put it into the GitLab webhook configuration, and set it to send push updates to the Discord server.
With a real push test, I get the following request and response:
Request headers:
Content-Type: application/json
X-Gitlab-Event: Push Hook
and as response
Response headers:
Date: Tue, 26 May 2020 18:46:48 GMT
Content-Type: application/json
Content-Length: 58
Connection: close
Set-Cookie: __cfduid=d374998c2f84e3e20b75bbdec88fb63d91590518808; expires=Thu, 25-Jun-20 18:46:48 GMT; path=/; domain=.discordapp.com; HttpOnly; SameSite=Lax, __cfruid=418f7199379a53d23012d37b15f2ac5a3aac36b6-1590518808; path=/; domain=.discordapp.com; HttpOnly; Secure; SameSite=None
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Ratelimit-Bucket: 3cd1f278bd0ecaf11e0d2391374c011d
X-Ratelimit-Limit: 5
X-Ratelimit-Remaining: 4
X-Ratelimit-Reset: 1590518811
X-Ratelimit-Reset-After: 2
X-Envoy-Upstream-Service-Time: 12
Via: 1.1 google
Cf-Cache-Status: DYNAMIC
Cf-Request-Id: 02f3e816a1000004823d920200000001
Expect-Ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Server: cloudflare
Cf-Ray: 5999a9376a790482-CDG
but got the error:
Response body:
{"message": "Cannot send an empty message", "code": 50006}
or also
Hook executed successfully but returned HTTP 400 {"message": "Cannot send an empty message", "code": 50006}
Thanks for the help.
You need to use the "Integrations" feature for "Discord notifications" instead of a regular webhook.
See the documentation here.
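For illustration, here is a rough sketch (with a placeholder webhook URL) of why the error appears: Discord's webhook endpoint expects a JSON body with at least a content or embeds field, and a raw GitLab push payload provides neither in that shape, so Discord reports an "empty message".
import requests

# Placeholder; a real Discord webhook URL looks like
# https://discord.com/api/webhooks/<id>/<token>
WEBHOOK_URL = "https://discord.com/api/webhooks/<id>/<token>"

# This succeeds: Discord requires at least a "content" or "embeds" field.
requests.post(WEBHOOK_URL, json={"content": "New push to the repository"})

# Posting the GitLab push payload verbatim fails with
# {"message": "Cannot send an empty message", "code": 50006},
# because Discord finds neither "content" nor "embeds" in it.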

How do I pass the username/password to authenticate a POST request to a Django rest-framework API from python code?

This is the request I want to emulate in Python code, made here with httpie:
$ http --auth mucho:pass POST http://3333333.ngrok.io/sms/ msg="love conquers all" to="255123456"
HTTP/1.1 201 Created
Allow: GET, POST, HEAD, OPTIONS
Content-Length: 67
Content-Type: application/json
Date: Mon, 30 Dec 2019 20:31:33 GMT
Server: WSGIServer/0.2 CPython/3.7.5
Vary: Accept, Cookie
X-Content-Type-Options: nosniff
X-Frame-Options: DENY
{
    "id": 70,
    "msg": "love conquers all",
    "owner": "mucho",
    "to": "255123456"
}
I have tried this
from requests import Request, Session

url = "http://3333333.ngrok.io/sms/"
data = {
    "to": "255123456",
    "msg": "love conquers all",
}
s = Session()
req = Request('POST', url, data=data)
preped = req.prepare()
preped.prepare_auth(("mucho", "pass"), url)
resp = s.send(preped)
It works, but I am looking for a neater/simpler way, possibly using headers. Thanks for any help.
If you are looking for a way to test your API endpoints, you can use Insomnia or Postman.
They make it easy to set the headers.
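As a sketch of the simpler call the question actually asks for (assuming the endpoint uses HTTP Basic auth, which is what httpie's --auth flag sends by default), requests can pass the credentials directly instead of preparing the request by hand:
import requests

url = "http://3333333.ngrok.io/sms/"
data = {"to": "255123456", "msg": "love conquers all"}

# auth=(user, password) adds an HTTP Basic Authorization header,
# equivalent to httpie's --auth mucho:pass
resp = requests.post(url, json=data, auth=("mucho", "pass"))
print(resp.status_code, resp.json())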

github api v3 update reference returns a 422 "Object does not exist"

For context, I'm trying to update a file through the GitHub API.
Everything was fine until I tried to update the reference.
Following the docs, below are the requests I made and their responses.
If anyone has an idea, I have found nothing that makes it work.
$ curl -i -XPATCH -d '{"sha": "69d0a253406585d8faf616ce3ae0ff2453b346d7"}' -H "Authorization: token AUTH-TOKEN" https://api.github.com/repos/Trax-air/TraxIT/git/refs/heads/ci-migrate-quay
HTTP/1.1 422 Unprocessable Entity
Server: GitHub.com
Date: Wed, 18 Nov 2015 14:08:49 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 128
Status: 422 Unprocessable Entity
X-RateLimit-Limit: 5000
X-RateLimit-Remaining: 4948
X-RateLimit-Reset: 1447856141
X-OAuth-Scopes: gist, read:repo_hook, repo, user
X-Accepted-OAuth-Scopes:
X-GitHub-Media-Type: github.v3
X-XSS-Protection: 1; mode=block
X-Frame-Options: deny
Content-Security-Policy: default-src 'none'
Access-Control-Allow-Credentials: true
Access-Control-Expose-Headers: ETag, Link, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval
Access-Control-Allow-Origin: *
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
X-Content-Type-Options: nosniff
X-GitHub-Request-Id: 4EC2914C:94AC:15486DB6:564C8671
{
    "message": "Object does not exist",
    "documentation_url": "https://developer.github.com/v3/git/refs/#update-a-reference"
}
I tried the reference update by itself (with a different sha), and it worked:
$ curl -i -XPATCH -d '{"sha": "694973310d80edfe9ca08bd2fd5a06a6407b08ad"}' -H "Authorization: token AUTH-TOKEN" https://api.github.com/repos/Trax-air/TraxIT/git/refs/heads/ci-migrate-quay
HTTP/1.1 200 OK
Server: GitHub.com
Date: Wed, 18 Nov 2015 14:10:20 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 337
Status: 200 OK
X-RateLimit-Limit: 5000
X-RateLimit-Remaining: 4947
X-RateLimit-Reset: 1447856141
Cache-Control: private, max-age=60, s-maxage=60
ETag: "25641a46e3d517196995aec80669dcd2"
X-OAuth-Scopes: gist, read:repo_hook, repo, user
X-Accepted-OAuth-Scopes:
Vary: Accept, Authorization, Cookie, X-GitHub-OTP
X-GitHub-Media-Type: github.v3
X-XSS-Protection: 1; mode=block
X-Frame-Options: deny
Content-Security-Policy: default-src 'none'
Access-Control-Allow-Credentials: true
Access-Control-Expose-Headers: ETag, Link, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval
Access-Control-Allow-Origin: *
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
X-Content-Type-Options: nosniff
Vary: Accept-Encoding
X-Served-By: c6c65e5196703428e7641f7d1e9bc353
X-GitHub-Request-Id: 4EC2914C:94AB:F33F280:564C86CC
{
    "ref": "refs/heads/ci-migrate-quay",
    "url": "https://api.github.com/repos/Trax-air/TraxIT/git/refs/heads/ci-migrate-quay",
    "object": {
        "sha": "694973310d80edfe9ca08bd2fd5a06a6407b08ad",
        "type": "commit",
        "url": "https://api.github.com/repos/Trax-air/TraxIT/git/commits/694973310d80edfe9ca08bd2fd5a06a6407b08ad"
    }
}
I then tried to confirm that my commit exists:
$curl -i -XGET -H "Authorization: token AUTH-TOKEN" https://api.github.com/repos/Trax-air/TraxIT/git/commits/69d0a253406585d8faf616ce3ae0ff2453b346d7
HTTP/1.1 200 OK
Server: GitHub.com
Date: Wed, 18 Nov 2015 14:03:29 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 1028
Status: 200 OK
X-RateLimit-Limit: 5000
X-RateLimit-Remaining: 4950
X-RateLimit-Reset: 1447856141
Cache-Control: private, max-age=60, s-maxage=60
Last-Modified: Wed, 18 Nov 2015 11:58:58 GMT
ETag: "4823502d472e3b3fe873841fcd60d3c6"
X-OAuth-Scopes: gist, read:repo_hook, repo, user
X-Accepted-OAuth-Scopes:
Vary: Accept, Authorization, Cookie, X-GitHub-OTP
X-GitHub-Media-Type: github.v3
X-XSS-Protection: 1; mode=block
X-Frame-Options: deny
Content-Security-Policy: default-src 'none'
Access-Control-Allow-Credentials: true
Access-Control-Expose-Headers: ETag, Link, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval
Access-Control-Allow-Origin: *
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
X-Content-Type-Options: nosniff
Vary: Accept-Encoding
X-Served-By: 8a5c38021a5cd7cef7b8f49a296fee40
X-GitHub-Request-Id: 4EC2914C:94AA:AE467E1:564C8530
{
    "sha": "69d0a253406585d8faf616ce3ae0ff2453b346d7",
    "url": "https://api.github.com/repos/Trax-air/TraxIT/git/commits/69d0a253406585d8faf616ce3ae0ff2453b346d7",
    "html_url": "https://github.com/Trax-air/TraxIT/commit/69d0a253406585d8faf616ce3ae0ff2453b346d7",
    "author": {
        "name": "traxbot",
        "email": "traxbot#trax-air.com",
        "date": "2015-11-18T11:58:58Z"
    },
    "committer": {
        "name": "traxbot",
        "email": "traxbot#trax-air.com",
        "date": "2015-11-18T11:58:58Z"
    },
    "tree": {
        "sha": "ca47cb13f520913e643b15e6d0776f38ba577091",
        "url": "https://api.github.com/repos/Trax-air/TraxIT/git/trees/ca47cb13f520913e643b15e6d0776f38ba577091"
    },
    "message": "Updated api_gateway to 0.15",
    "parents": [
        {
            "sha": "694973310d80edfe9ca08bd2fd5a06a6407b08ad",
            "url": "https://api.github.com/repos/Trax-air/TraxIT/git/commits/694973310d80edfe9ca08bd2fd5a06a6407b08ad",
            "html_url": "https://github.com/Trax-air/TraxIT/commit/694973310d80edfe9ca08bd2fd5a06a6407b08ad"
        }
    ]
}
This may be due to caching.
I asked GitHub support and here is their answer:
Thanks for reaching out. The commit in question
(69d0a253406585d8faf616ce3ae0ff2453b346d7) doesn't exist in that repository,
so you're not allowed to update the branch to point to it.
As far as I can tell, it did exist in the repository at some point, but was pruned
because it was no longer reachable. I think the API was telling you that it still exists
in the repository due to caching.
I just cleared our caches and I think you should see that it's no longer available
if you try to fetch that commit. I'm sorry for the confusion about that --
I'll ask the team to investigate why this caching problem happened.
This solved it for me:
'{"sha": "new_sha", "force": true }'

Invalid ETAG error with BreezeJS and IE9

I am using BreezeJS with the SharePoint adapter and get a consistent invalid ETag error on the second and subsequent POST requests from the client side. Strangely, this does not happen in IE11 or the latest Google Chrome browser.
Here are some details on what is happening:
An Agenda item (id 3) is pulled from the server and has the etag value W/"4"
An update of the Agenda item is posted to the server with etag value W/"4"
Another update of the same item is posted to the server with etag value W/"4"
I get the error message: The etag value '4' specified in one of the request headers is not valid. Please make sure only one etag value is specified and is valid
The error message makes a lot of sense, as we must not use the same ETag value in two different requests. In Google Chrome, the second (and subsequent) POST requests carry the correctly incremented ETag value, so in that case it would send W/"5" and everything would be fine.
I have noticed that the first POST request returns the new ETag (W/"5"), but it is not applied to the second POST request. I am using the following versions:
BreezeJS: 1.4.13
Rest adapter: 0.2.3
SharePoint adapter: 0.2.3
For reference I have included the requests and responses for the three operations from Fiddler.
Get agenda item
GET http://intranet/test/_vti_bin/listdata.svc/Agenda?$filter=Id%20eq%203 HTTP/1.1
DataServiceVersion: 2.0
Accept: application/json;odata=verbose
X-Requested-With: XMLHttpRequest
Referer: http://intranet/test/SitePages/test.aspx#inmeeting/2
Accept-Language: da
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; InfoPath.3)
Host: intranet
Connection: Keep-Alive
HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: application/json;charset=utf-8
Vary: Accept-Encoding
Server: Microsoft-IIS/7.5
SPRequestGuid: e4a8e96a-d476-41e8-8445-ecd50fe8f78e
X-SharePointHealthScore: 0
DataServiceVersion: 2.0;
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7015
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 15 Aug 2014 10:28:59 GMT
Content-Length: 1491
{
"d" : {
"results": [
{
"__metadata": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)", "etag": "W/\"4\"", "type": "Microsoft.SharePoint.DataService.AgendaItem"
}, "ContentTypeID": "0x0100EF440AFE5EDF49AD87D3B9A9484C2C0300ACA340FAC0DE1D49B8514C10085EC342", "Title": "Welcome", "Meeting": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/Meeting"
}
}, "MeetingId": 2, "Documents": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/Documents"
}
}, "Links": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/Links"
}
}, "Responsible": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/Responsible"
}
}, "ResponsibleId": 1, "StartTime": "\/Date(1408622400000)\/", "EndTime": "\/Date(1408623120000)\/", "DurationInMinutes": 12, "Done": true, "Comments": null, "Sort": 0, "Id": 3, "ContentType": "Meeting Agenda", "Modified": "\/Date(1408105622000)\/", "Created": "\/Date(1408010077000)\/", "CreatedBy": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/CreatedBy"
}
}, "CreatedById": 1, "ModifiedBy": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/ModifiedBy"
}
}, "ModifiedById": 1, "Owshiddenversion": 4, "Version": "4.0", "Attachments": {
"__deferred": {
"uri": "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)/Attachments"
}
}, "Path": "/test/Lists/Agenda"
}
]
}
}
First update of agenda item
POST http://intranet/test/_vti_bin/listdata.svc/Agenda(3) HTTP/1.1
Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose
DataServiceVersion: 2.0
X-RequestDigest: 0x5B15EE86ACA321A71DA9A2939E8FE1E2A29D3F6A60A6424C4F497DFFCD4D509836B6FB85A127CBBC947547D8AB7AE0E91CE6C72E7C359D6CF83351C024858D84,15 Aug 2014 10:26:46 -0000
X-HTTP-Method: MERGE
If-Match: W/"4"
X-Requested-With: XMLHttpRequest
Referer: http://intranet/test/SitePages/test.aspx#inmeeting/2
Accept-Language: da
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; InfoPath.3)
Host: intranet
Content-Length: 82
Connection: Keep-Alive
Pragma: no-cache
{"Done":false,"__metadata":{"type":"Microsoft.SharePoint.DataService.AgendaItem"}}
HTTP/1.1 204 No Content
Cache-Control: no-cache
ETag: W/"5"
Server: Microsoft-IIS/7.5
SPRequestGuid: dc4896d6-91b9-4894-a169-c70889ad0747
X-SharePointHealthScore: 0
DataServiceVersion: 1.0;
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7015
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 15 Aug 2014 10:29:02 GMT
Second update of Agenda item
POST http://intranet/test/_vti_bin/listdata.svc/Agenda(3) HTTP/1.1
Accept: application/json;odata=verbose
Content-Type: application/json;odata=verbose
DataServiceVersion: 2.0
X-RequestDigest: 0x5B15EE86ACA321A71DA9A2939E8FE1E2A29D3F6A60A6424C4F497DFFCD4D509836B6FB85A127CBBC947547D8AB7AE0E91CE6C72E7C359D6CF83351C024858D84,15 Aug 2014 10:26:46 -0000
X-HTTP-Method: MERGE
If-Match: W/"4"
X-Requested-With: XMLHttpRequest
Referer: http://intranet/test/SitePages/test.aspx#inmeeting/2
Accept-Language: da
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C; .NET4.0E; InfoPath.3)
Host: intranet
Content-Length: 81
Connection: Keep-Alive
Pragma: no-cache
{"Done":true,"__metadata":{"type":"Microsoft.SharePoint.DataService.AgendaItem"}}
HTTP/1.1 412 Precondition Failed
Cache-Control: private
Content-Type: application/json
Server: Microsoft-IIS/7.5
SPRequestGuid: 965fbf16-8911-426b-9a90-8b21b4a78008
X-SharePointHealthScore: 0
DataServiceVersion: 1.0;
X-AspNet-Version: 2.0.50727
X-Powered-By: ASP.NET
MicrosoftSharePointTeamServices: 14.0.0.7015
X-MS-InvokeApp: 1; RequireReadOnly
Date: Fri, 15 Aug 2014 10:29:02 GMT
Content-Length: 214
{
    "error": {
        "code": "",
        "message": {
            "lang": "en-US",
            "value": "The etag value '4' specified in one of the request headers is not valid. Please make sure only one etag value is specified and is valid."
        }
    }
}
UPDATE 1
I have done some debugging, starting at the _processSavedEntity method of the SPAdapter. In Chrome it receives the new ETag value from the response; in IE9 it gets null. Tracing this back, I ended up in breeze.debug.js (1.4.13) at line 15156, where the response enters the system. Calling the getAllResponseHeaders() method on the jqXHR returns an empty string in IE9, while Fiddler shows the headers are present (same as above). So that's a bit of a mystery. BTW, I'm using jQuery 1.9.1.
The problem is that IE9 and earlier, when receiving a 204 No Content response, throw away all the response headers.
It is discussed here: jQuery.ajax with POST or PUT has no response headers for IE8 and IE9
If there are no concurrency problems, the driver could sniff IE9 and earlier and issue a HEAD (I don't know if SharePoint supports it) or a GET for the same entity.
But both hacks give concurrency problems. Which one is worse depends on the context.
In any case, the whole purpose of having ETags for optimistic concurrency is defeated.
I am afraid that there is no way to fully support optimistic concurrency for OData in IE9 and previous.
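At the HTTP level, the suggested fallback amounts to re-reading the entity after the save and taking the fresh ETag from the JSON body rather than from the (lost) response headers. A rough sketch of that pattern with Python requests, using the listdata.svc URL from the capture above (authentication and the X-RequestDigest header are omitted):
import requests

URL = "http://intranet/test/_vti_bin/listdata.svc/Agenda(3)"

# After a save whose 204 response headers were discarded (the IE9 case),
# re-read the entity and take the current ETag from the response body.
resp = requests.get(URL, headers={"Accept": "application/json;odata=verbose"})
etag = resp.json()["d"]["__metadata"]["etag"]   # e.g. 'W/"5"'

# Send this value as If-Match on the next MERGE request.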
