I need to do all of this within the Postman application. I have found examples that use Blob and Table storage, but they didn't seem to fit Queue storage. I think the biggest problem is that I am unable to create the signature part properly.
I believe I managed to execute Get Message in Postman, but I get a "Server failed to authenticate the request." error on the Post request. The message I'm trying to post is in XML format and uses the same pre-request script as the Get.
Pre-request script of the Post and Get requests (copied from online sources):
const storageAccount = pm.variables.get('azure_storage_account');
const accountKey = pm.variables.get('azure_storage_key');
pm.variables.set("header_date", new Date().toUTCString());
// Get hash of all header-name:value
const headers = pm.request.getHeaders({ ignoreCase: true, enabled: true });
// Construct Signature value for Authorization header
var signatureParts = [
pm.request.method.toUpperCase(),
headers["content-encoding"] || "",
headers["content-language"] || "",
headers["content-length"] || "",
// pm.request.body ? pm.request.body.toString().length || "" : "",
headers["content-md5"] || "",
headers["content-type"] || "",
headers["x-ms-date"] ? "" : (pm.variables.get("header_date") || ""),
headers["if-modified-since"] || "",
headers["if-match"] || "",
headers["if-none-match"] || "",
headers["if-unmodified-since"] || "",
headers["range"] || ""
];
// Construct CanonicalizedHeaders
const canonicalHeaderNames = [];
Object.keys(headers).forEach(key => {
if (key.startsWith("x-ms-")) {
canonicalHeaderNames.push(key);
}
});
// Sort headers lexographically by name
canonicalHeaderNames.sort();
const canonicalHeaderParts = [];
canonicalHeaderNames.forEach(key => {
let value = pm.request.getHeaders({ ignoreCase: true, enabled: true })[key];
// Populate variables
value = pm.variables.replaceIn(value);
// Replace whitespace in value but not if its within quotes
if (!value.startsWith("\"")) {
value = value.replace(/\s+/, " ");
}
canonicalHeaderParts.push(`${key}:${value}`);
});
// Add headers to signature
signatureParts.push.apply(signatureParts, canonicalHeaderParts);
// Construct CanonicalizedResource
const canonicalResourceParts = [
`/${pm.variables.get("azure_storage_account")}${pm.request.url.getPath()}`
];
const canonicalQueryNames = [];
pm.request.url.query.each(query => {
canonicalQueryNames.push(query.key.toLowerCase());
});
canonicalQueryNames.sort();
canonicalQueryNames.forEach(queryName => {
const value = pm.request.url.query.get(queryName);
// NOTE: This does not properly explode multiple same query params' values
// and turn them into comma-separated list
canonicalResourceParts.push(`${queryName}:${value}`);
});
// Add resource to signature
signatureParts.push.apply(signatureParts, canonicalResourceParts);
console.log("Signature Parts", signatureParts);
// Now, construct signature raw string
const signatureRaw = signatureParts.join("\n");
console.log("Signature String", JSON.stringify(signatureRaw));
// Hash it using HMAC-SHA256 and then encode using base64
const storageKey = pm.variables.get("azure_storage_key");
const signatureBytes = CryptoJS.HmacSHA256(signatureRaw, CryptoJS.enc.Base64.parse(storageKey));
const signatureEncoded = signatureBytes.toString(CryptoJS.enc.Base64);
console.log("Storage Account", pm.variables.get("azure_storage_account"));
console.log("Storage Key", storageKey);
// Finally, make it available for headers
pm.variables.set("header_authorization",
`SharedKey ${pm.variables.get("azure_storage_account")}:${signatureEncoded}`);
Headers in Postman:
(Image: headers and URI of the request)
Does anyone have working examples or know where to go from here?
Edit: adding console and error images:
(Image: console output, censored for safety)
(Image: error output, censored for safety)
As you said, it is working well for Get. If you are using SAS, it may be a permission issue; make sure you have the Storage Queue Data Contributor role on the storage queue.
There can be many reasons for this error if you are using SAS:
1. Check whether you have insufficient SAS permissions: make sure the read, write, and list permissions are selected, or try generating a new SAS key.
2. Make sure the request does not include any empty headers when it is sent programmatically. If the value of a particular header is empty (or null), the header should be excluded from the request.
3. If the above are not the reasons, try upgrading the versions of the SDKs you use.
4. Check for proper encoding, e.g. UTF-8.
See Azure Storage Queue via REST API c# using Shared Key Authentication - Stack Overflow
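For the Shared Key path used in the question, one Postman-specific thing worth checking is the Content-Length line of the string-to-sign: Postman only fills in the real Content-Length header when the request is sent, so the pre-request script cannot read it from pm.request.getHeaders(). A minimal sketch of the idea behind the commented-out line in the script, assuming an x-ms-version of 2015-02-21 or later (for which a zero length must be an empty string rather than "0"):
// In the Post request's pre-request script, derive Content-Length from the body
// rather than from the not-yet-populated header.
const body = pm.request.body ? pm.request.body.toString() : "";
const contentLength = body.length > 0 ? String(body.length) : ""; // "" instead of "0"
// ...then use contentLength in place of headers["content-length"] || ""
// when building signatureParts.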
References:
c# - The MAC signature found in the HTTP request ' ' is not the same as any computed signature. Server used following string to sign: 'PUT - Stack Overflow
Authorize with Shared Key (REST API) - Azure Storage | Microsoft Docs
rest - The MAC signature found in the HTTP request '...' is not the same as any computed signature - Stack Overflow
Using Hyperledger Fabric 2.0, I'm trying to evaluate a transaction result like so:
const network = await gateway.getNetwork('evm');
// Get the contract from the network.
const contract = network.getContract('cc');
const result = await contract.evaluateTransaction('getEVMAddress');
And the getEVMAddress function is defined as follows:
async getEVMAddress(stub) {
console.info('============= START : getEVMAddress ===========');
const evmAsBytes = await stub.getState('EVMADDRESS');
if (!evmAsBytes || evmAsBytes.length === 0) {
throw new Error(`EVMADDRESS does not exist`);
}
var evmAddress = JSON.parse(evmAsBytes);
console.info('============= END : getEVMAddress ===========');
return JSON.stringify(evmAddress);
}
It is storing a single simple string. I don't know what's happening; I already tried decoding with BlockDecoder, but I seem unable to query results outside the ledger. Do note that there are no errors if we print the string from inside the chaincode, where its results are just fine, but outside the chaincode I only receive a Buffer with extraneous data in it that I'm unable to parse correctly.
Thanks in advance.
While retrieving data from the ledger you will, by default, receive buffered data, which you must convert to a string.
Try converting your output to a string using the toString() method:
var s = result.toString(); // converts the buffer to a string
After that, use var object = JSON.parse(s); // converts the string to a JSON object
Finally, you can view a specific column of your EVM record using:
var c = object['column_name']; // column_name is the name of the column in your record that you want from your EVM table.
The solution was to send the string back from the chaincode wrapped in a Buffer, using Buffer.from(string).
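For reference, a minimal sketch of what that fix looks like in the getEVMAddress chaincode function from the question (everything else unchanged):
async getEVMAddress(stub) {
    const evmAsBytes = await stub.getState('EVMADDRESS');
    if (!evmAsBytes || evmAsBytes.length === 0) {
        throw new Error('EVMADDRESS does not exist');
    }
    const evmAddress = JSON.parse(evmAsBytes.toString());
    // Return a Buffer rather than a bare string so the SDK hands back
    // clean bytes that result.toString() can decode on the client side.
    return Buffer.from(JSON.stringify(evmAddress));
}
On the client, result.toString() then yields the plain JSON string to parse.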
I've been trying to crack this for a while with no success.
The server-side decryption uses RSACryptoServiceProvider with RSA-OAEP. I can't change this:
public void SetEncryptedPassword(string password) {
using (RSACryptoServiceProvider decrypter = new RSACryptoServiceProvider()) {
decrypter.FromXmlString(Resources.PrivateKey);
var decryptedBytes = decrypter.Decrypt(Convert.FromBase64String(password), true);
_password = Encoding.UTF8.GetString(decryptedBytes).ToSecureString();
}
}
I am trying to implement a web client that can access this service, but I can't get the encryption right. I have tried loads of libraries but found the most help with SubtleCrypto, which at least can accept the public key provided by the server. I had to add the kty, alg, and ext properties and encode the key as base64url, but it appears to import fine. Encryption does come back with something, so I guess it's working?
const encrypt = async (msg)=>{
let msgBytes = stringToBytes(msg);
let publicKey2 = await window.crypto.subtle.importKey("jwk",publicKey, {name:"RSA-OAEP", hash:"SHA-256"}, true, ["encrypt"]).catch((issue)=>console.log(issue));
var result = await window.crypto.subtle.encrypt({name: "RSA-OAEP"}, publicKey2, msgBytes );
var toBase64 = _arrayBufferToBase64(result);
return toBase64;
}
I had a few issues getting a valid base64 string, so now I'm using this:
function _arrayBufferToBase64( buffer ) {
var binary = '';
var bytes = new Uint8Array( buffer );
var len = bytes.byteLength;
for (var i = 0; i < len; i++) {
binary += String.fromCharCode( bytes[ i ] );
}
return window.btoa( binary );
}
The result looks a little shorter than the outputs produced by the iPad and .net services, but I have no idea if that means anything.
The decryption always fails with the error "Error occurred while decoding OAEP padding.", which tells me that it fails at the first step.
Am I doing something wrong? Any advice would be helpful. I'll be watching comments and replies for most of the day so I can supply extra information if you ask for it.
Thanks in advance
CodeSandbox.io demo
The problem arises because the C# code (implicitly) uses SHA-1 for OAEP, while the JavaScript/Node.js code uses SHA-256.
RSACryptoServiceProvider only supports PKCS#1 v1.5 padding and OAEP with SHA-1. Support for OAEP with SHA-2 is only implemented in the newer RSA implementation, RSACng (available since .NET 4.6), which belongs to the new Cryptography API: Next Generation (CNG).
Since you can't change the C# code, as you state, the only option is to change the hash in the JavaScript/Node.js code from SHA-256 to SHA-1.
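Concretely, the only change needed in the question's encrypt function is the hash named when importing the key; a sketch, otherwise identical to the original code:
// Import the server's public key for RSA-OAEP with SHA-1, matching
// what RSACryptoServiceProvider.Decrypt(..., true) expects.
const publicKey2 = await window.crypto.subtle.importKey(
    "jwk",
    publicKey,
    { name: "RSA-OAEP", hash: "SHA-1" },
    true,
    ["encrypt"]
);
const result = await window.crypto.subtle.encrypt({ name: "RSA-OAEP" }, publicKey2, msgBytes);
If you added an alg property to the JWK yourself, it may need to match as well ("RSA-OAEP" for SHA-1 rather than "RSA-OAEP-256").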
In my build process, I want to include a timestamp from an RFC-3161-compliant TSA. At run time, the code will verify this timestamp, preferably without the assistance of a third-party library. (This is a .NET application, so I have standard hash and asymmetric cryptography functionality readily at my disposal.)
RFC 3161, with its reliance on ASN.1 and X.690 and whatnot, is not simple to implement, so for now at least, I'm using Bouncy Castle to generate the TimeStampReq (request) and parse the TimeStampResp (response). I just can't quite figure out how to validate the response.
So far, I've figured out how to extract the signature itself, the public cert, the time the timestamp was created, and the message imprint digest and nonce that I sent (for build-time validation). What I can't figure out is how to put this data together to generate the data that was hashed and signed.
Here's a rough idea of what I'm doing and what I'm trying to do. This is test code, so I've taken some shortcuts. I'll have to clean a couple of things up and do them the right way once I get something that works.
Timestamp generation at build time:
// a lot of fully-qualified type names here to make sure it's clear what I'm using
static void WriteTimestampToBuild(){
var dataToTimestamp = Encoding.UTF8.GetBytes("The rain in Spain falls mainly on the plain");
var hashToTimestamp = new System.Security.Cryptography.SHA1Cng().ComputeHash(dataToTimestamp);
var nonce = GetRandomNonce();
var tsr = GetTimestamp(hashToTimestamp, nonce, "http://some.rfc3161-compliant.server");
var tst = tsr.TimeStampToken;
var tsi = tst.TimeStampInfo;
ValidateNonceAndHash(tsi, hashToTimestamp, nonce);
var cms = tst.ToCmsSignedData();
var signer =
cms.GetSignerInfos().GetSigners()
.Cast<Org.BouncyCastle.Cms.SignerInformation>().First();
// TODO: handle multiple signers?
var signature = signer.GetSignature();
var cert =
tst.GetCertificates("Collection").GetMatches(signer.SignerID)
.Cast<Org.BouncyCastle.X509.X509Certificate>().First();
// TODO: handle multiple certs (for one or multiple signers)?
ValidateCert(cert);
var timeString = tsi.TstInfo.GenTime.TimeString;
var time = tsi.GenTime; // not sure which is more useful
// TODO: Do I care about tsi.TstInfo.Accuracy or tsi.GenTimeAccuracy?
var serialNumber = tsi.SerialNumber.ToByteArray(); // do I care?
WriteToBuild(cert.GetEncoded(), signature, timeString/*or time*/, serialNumber);
// TODO: Do I need to store any more values?
}
static Org.BouncyCastle.Math.BigInteger GetRandomNonce(){
var rng = System.Security.Cryptography.RandomNumberGenerator.Create();
var bytes = new byte[10]; // TODO: make it a random length within a range
rng.GetBytes(bytes);
return new Org.BouncyCastle.Math.BigInteger(bytes);
}
static Org.BouncyCastle.Tsp.TimeStampResponse GetTimestamp(byte[] hash, Org.BouncyCastle.Math.BigInteger nonce, string url){
var reqgen = new Org.BouncyCastle.Tsp.TimeStampRequestGenerator();
reqgen.SetCertReq(true);
var tsrequest = reqgen.Generate(Org.BouncyCastle.Tsp.TspAlgorithms.Sha1, hash, nonce);
var data = tsrequest.GetEncoded();
var webreq = WebRequest.CreateHttp(url);
webreq.Method = "POST";
webreq.ContentType = "application/timestamp-query";
webreq.ContentLength = data.Length;
using(var reqStream = webreq.GetRequestStream())
reqStream.Write(data, 0, data.Length);
using(var respStream = webreq.GetResponse().GetResponseStream())
return new Org.BouncyCastle.Tsp.TimeStampResponse(respStream);
}
static void ValidateNonceAndHash(Org.BouncyCastle.Tsp.TimeStampTokenInfo tsi, byte[] hashToTimestamp, Org.BouncyCastle.Math.BigInteger nonce){
if(tsi.Nonce != nonce)
throw new Exception("Nonce doesn't match. Man-in-the-middle attack?");
var messageImprintDigest = tsi.GetMessageImprintDigest();
var hashMismatch =
messageImprintDigest.Length != hashToTimestamp.Length ||
Enumerable.Range(0, messageImprintDigest.Length).Any(i=>
messageImprintDigest[i] != hashToTimestamp[i]
);
if(hashMismatch)
throw new Exception("Message imprint doesn't match. Man-in-the-middle attack?");
}
static void ValidateCert(Org.BouncyCastle.X509.X509Certificate cert){
// not shown, but basic X509Chain validation; throw exception on failure
// TODO: Validate certificate subject and policy
}
static void WriteToBuild(byte[] cert, byte[] signature, string time/*or DateTime time*/, byte[] serialNumber){
// not shown
}
Timestamp verification at run time (client site):
// a lot of fully-qualified type names here to make sure it's clear what I'm using
static void VerifyTimestamp(){
var timestampedData = Encoding.UTF8.GetBytes("The rain in Spain falls mainly on the plain");
var timestampedHash = new System.Security.Cryptography.SHA1Cng().ComputeHash(timestampedData);
byte[] certContents;
byte[] signature;
string time; // or DateTime time
byte[] serialNumber;
GetDataStoredDuringBuild(out certContents, out signature, out time, out serialNumber);
var cert = new System.Security.Cryptography.X509Certificates.X509Certificate2(certContents);
ValidateCert(cert);
var signedData = MagicallyCombineThisStuff(timestampedHash, time, serialNumber);
// TODO: What other stuff do I need to magically combine?
VerifySignature(signedData, signature, cert);
// not shown: Use time from timestamp to validate cert for other signed data
}
static void GetDataStoredDuringBuild(out byte[] certContents, out byte[] signature, out string/*or DateTime*/ time, out byte[] serialNumber){
// not shown
}
static void ValidateCert(System.Security.Cryptography.X509Certificates.X509Certificate2 cert){
// not shown, but basic X509Chain validation; throw exception on failure
}
static byte[] MagicallyCombineThisStuff(byte[] timestampedhash, string/*or DateTime*/ time, byte[] serialNumber){
// HELP!
}
static void VerifySignature(byte[] signedData, byte[] signature, System.Security.Cryptography.X509Certificates.X509Certificate2 cert){
var key = (RSACryptoServiceProvider)cert.PublicKey.Key;
// TODO: Handle DSA keys, too
var okay = key.VerifyData(signedData, CryptoConfig.MapNameToOID("SHA1"), signature);
// TODO: Make sure to use the same hash algorithm as the TSA
if(!okay)
throw new Exception("Timestamp doesn't match! Don't trust this!");
}
As you might guess, where I think I'm stuck is the MagicallyCombineThisStuff function.
I finally figured it out myself. It should come as no surprise, but the answer is nauseatingly complex and indirect.
The missing pieces to the puzzle were in RFC 5652. I didn't really understand the TimeStampResp structure until I read (well, skimmed through) that document.
Let me describe in brief the TimeStampReq and TimeStampResp structures. The interesting fields of the request are:
a "message imprint", which is the hash of the data to be timestamped
the OID of the hash algorithm used to create the message imprint
an optional "nonce", which is a client-chosen identifier used to verify that the response is generated specifically for this request. This is effectively just a salt, used to avoid replay attacks and to detect errors.
The meat of the response is a CMS SignedData structure. Among the fields in this structure are:
the certificate(s) used to sign the response
an EncapsulatedContentInfo member containing a TSTInfo structure. This structure, importantly, contains:
the message imprint that was sent in the request
the nonce that was sent in the request
the time certified by the TSA
a set of SignerInfo structures, with typically just one structure in the set. For each SignerInfo, the interesting fields within the structure are:
a sequence of "signed attributes". The DER-encoded BLOB of this sequence is what is actually signed. Among these attributes are:
the time certified by the TSA (again)
a hash of the DER-encoded BLOB of the TSTInfo structure
an issuer and serial number or subject key identifier that identifies the signer's certificate from the set of certificates found in the SignedData structure
the signature itself
The basic process of validating the timestamp is as follows:
Read the data that was timestamped, and recompute the message imprint using the same hashing algorithm used in the timestamp request.
Read the nonce used in the timestamp request, which must be stored along with the timestamp for this purpose.
Read and parse the TimeStampResp structure.
Verify that the TSTInfo structure contains the correct message imprint and nonce.
From the TimeStampResp, read the certificate(s).
For each SignerInfo:
Find the certificate for that signer (there should be exactly one).
Verify the certificate.
Using that certificate, verify the signer's signature.
Verify that the signed attributes contain the correct hash of the TSTInfo structure
If everything is okay, then we know that all signed attributes are valid, since they're signed, and since those attributes contain a hash of the TSTInfo structure, then we know that's okay, too. We have therefore validated that the timestamped data is unchanged since the time given by the TSA.
Because the signed data is a DER-encoded BLOB (which contains a hash of the different DER-encoded BLOB containing the information the verifier actually cares about), there's no getting around having some sort of library on the client (verifier) that understands X.690 encoding and ASN.1 types. Therefore, I conceded to including Bouncy Castle in the client as well as in the build process, since there's no way I have time to implement those standards myself.
My code to add and verify timestamps is similar to the following:
Timestamp generation at build time:
// a lot of fully-qualified type names here to make sure it's clear what I'm using
static void WriteTimestampToBuild(){
var dataToTimestamp = ... // see OP
var hashToTimestamp = ... // see OP
var nonce = ... // see OP
var tsq = GetTimestampRequest(hashToTimestamp, nonce);
var tsr = GetTimestampResponse(tsq, "http://some.rfc3161-compliant.server");
ValidateTimestamp(tsq, tsr);
WriteToBuild("tsq-hashalg", Encoding.UTF8.GetBytes("SHA1"));
WriteToBuild("nonce", nonce.ToByteArray());
WriteToBuild("timestamp", tsr.GetEncoded());
}
static Org.BouncyCastle.Tsp.TimeStampRequest GetTimestampRequest(byte[] hash, Org.BouncyCastle.Math.BigInteger nonce){
var reqgen = new TimeStampRequestGenerator();
reqgen.SetCertReq(true);
return reqgen.Generate(TspAlgorithms.Sha1/*assumption*/, hash, nonce);
}
static void GetTimestampResponse(Org.BouncyCastle.Tsp.TimeStampRequest tsq, string url){
// similar to OP
}
static void ValidateTimestamp(Org.BouncyCastle.Tsp.TimeStampRequest tsq, Org.BouncyCastle.Tsp.TimeStampResponse tsr){
// same as client code, see below
}
static void WriteToBuild(string key, byte[] value){
// not shown
}
Timestamp verification at run time (client site):
/* Just like in the OP, I've used fully-qualified names here to avoid confusion.
* In my real code, I'm not doing that, for readability's sake.
*/
static DateTime GetTimestamp(){
var timestampedData = ReadFromBuild("timestamped-data");
var hashAlg = Encoding.UTF8.GetString(ReadFromBuild("tsq-hashalg"));
var timestampedHash = System.Security.Cryptography.HashAlgorithm.Create(hashAlg).ComputeHash(timestampedData);
var nonce = new Org.BouncyCastle.Math.BigInteger(ReadFromBuild("nonce"));
var tsq = new Org.BouncyCastle.Tsp.TimeStampRequestGenerator().Generate(System.Security.Cryptography.CryptoConfig.MapNameToOID(hashAlg), timestampedHash, nonce);
var tsr = new Org.BouncyCastle.Tsp.TimeStampResponse(ReadFromBuild("timestamp"));
ValidateTimestamp(tsq, tsr);
// if we got here, the timestamp is okay, so we can trust the time it alleges
return tsr.TimeStampToken.TimeStampInfo.GenTime;
}
static void ValidateTimestamp(Org.BouncyCastle.Tsp.TimeStampRequest tsq, Org.BouncyCastle.Tsp.TimeStampResponse tsr){
/* This compares the nonce and message imprint and whatnot in the TSTInfo.
* It throws an exception if they don't match. This doesn't validate the
* certs or signatures, though. We still have to do that in order to trust
* this data.
*/
tsr.Validate(tsq);
var tst = tsr.TimeStampToken;
var timestamp = tst.TimeStampInfo.GenTime;
var signers = tst.ToCmsSignedData().GetSignerInfos().GetSigners().Cast<Org.BouncyCastle.Cms.SignerInformation>();
var certs = tst.GetCertificates("Collection");
foreach(var signer in signers){
var signerCerts = certs.GetMatches(signer.SignerID).Cast<Org.BouncyCastle.X509.X509Certificate>().ToList();
if(signerCerts.Count != 1)
throw new Exception("Expected exactly one certificate for each signer in the timestamp");
if(!signerCerts[0].IsValid(timestamp)){
/* IsValid only checks whether the given time is within the certificate's
* validity period. It doesn't verify that it's a valid certificate or
* that it hasn't been revoked. It would probably be better to do that
* kind of thing, just like I'm doing for the signing certificate itself.
* What's more, I'm not sure it's a good idea to trust the timestamp given
* by the TSA to verify the validity of the TSA's certificate. If the
* TSA's certificate is compromised, then an unauthorized third party could
* generate a TimeStampResp with any timestamp they wanted. But this is a
* chicken-and-egg scenario that my brain is now too tired to keep thinking
* about.
*/
throw new Exception("The timestamp authority's certificate is expired or not yet valid.");
}
if(!signer.Verify(signerCerts[0])){ // might throw an exception, might not ... depends on what's wrong
/* I'm pretty sure that signer.Verify verifies the signature and that the
* signed attributes contains a hash of the TSTInfo. It also does some
* stuff that I didn't identify in my list above.
* Some verification errors cause it to throw an exception, some just
* cause it to return false. If it throws an exception, that's great,
* because that's what I'm counting on. If it returns false, let's
* throw an exception of our own.
*/
throw new Exception("Invalid signature");
}
}
}
static byte[] ReadFromBuild(string key){
// not shown
}
I am not sure I understand why you want to rebuild the data structure signed in the response. If you just want to extract the signed data from the time-stamp server response, you can do this:
var tsr = GetTimestamp(hashToTimestamp, nonce, "http://some.rfc3161-compliant.server");
var tst = tsr.TimeStampToken;
var tsi = tst.TimeStampInfo;
var signature = // Get the signature
var certificate = // Get the signer certificate
var signedData = tsi.GetEncoded(); // Similar to tsi.TstInfo.GetEncoded();
VerifySignature(signedData, signature, certificate)
If you want to rebuild the data structure, you need to create a new Org.BouncyCastle.Asn1.Tsp.TstInfo instance (tsi.TstInfo is an Org.BouncyCastle.Asn1.Tsp.TstInfo object) with all the elements contained in the response.
In RFC 3161 the signed data structure is defined as this ASN.1 sequence:
TSTInfo ::= SEQUENCE {
version INTEGER { v1(1) },
policy TSAPolicyId,
messageImprint MessageImprint,
-- MUST have the same value as the similar field in
-- TimeStampReq
serialNumber INTEGER,
-- Time-Stamping users MUST be ready to accommodate integers
-- up to 160 bits.
genTime GeneralizedTime,
accuracy Accuracy OPTIONAL,
ordering BOOLEAN DEFAULT FALSE,
nonce INTEGER OPTIONAL,
-- MUST be present if the similar field was present
-- in TimeStampReq. In that case it MUST have the same value.
tsa [0] GeneralName OPTIONAL,
extensions [1] IMPLICIT Extensions OPTIONAL }
Congratulations on getting that tricky protocol work done!
See also a Python client implementation at rfc3161ng 2.0.4.
Note that with the RFC 3161 TSP protocol, as discussed at Web Science and Digital Libraries Research Group: 2017-04-20: Trusted Timestamping of Mementos and other publications, you and your relying parties must trust that the Time-Stamping Authority (TSA) is operated properly and securely. It is of course very difficult, if not impossible, to really secure online servers like those run by most TSAs.
As also discussed in that paper, with comparisons to TSP, now that the world has a variety of public blockchains in which trust is distributed and (sometimes) carefully monitored, there are new trusted timestamping options (providing "proof of existence" for documents). For example see
OriginStamp - Trusted Timestamping with Bitcoin. The protocol is much much simpler, and they provide client code for a large variety of languages. While their online server could also be compromised, the client can check whether their hashes were properly embedded in the Bitcoin blockchain and thus bypass the need to trust the OriginStamp service itself.
One downside is that timestamps are only posted once a day, unless an extra payment is made. Bitcoin transactions have become rather expensive, so the service is looking at supporting other blockchains also to drive costs back down and make it cheaper to get more timely postings.
Update: check out Stellar and Keybase
For free, efficient, lightning-fast, widely-vetted timestamps, check out the Stellar blockchain protocol, and the STELLARAPI.IO service.
Below is the code I am using to post JSON data to a RESTful service:
var client = new JsonServiceClient(BaseUri);
Todo d = new Todo() { Content = "Google", Order = 1, Done = false };
var s = JsonSerializer.SerializeToString<Todo>(d);
// client.Post<string>("/Todos/", "[{\"content\":\"YouTube\"}]");
// string payload = "[{\"id\":2,\"content\":\"abcdef\",\"order\":1,\"done\":false}]";
// string payload = @"{""todo"":{ {""content"":""abcdef"",""order"":1}} }";
client.Post<string>("/todos/", s);
I tried passing plain JSON data; it kept failing with the message "Bad data". Then I tried serializing the entity, but that also didn't work.
You can use PostJsonToUrl, which is included in ServiceStack.Text.
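A minimal sketch of that approach (the URL is a placeholder; PostJsonToUrl serializes the DTO itself, so there is no need to call SerializeToString first):
using ServiceStack.Text; // HttpUtils extension methods; in newer versions the namespace is ServiceStack

var todo = new Todo { Content = "Google", Order = 1, Done = false };
// Posts the Todo as JSON and returns the response body as a string.
var response = "http://localhost:1337/todos".PostJsonToUrl(todo);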