I have a Swagger file where the "pattern" is ^(?:[A-Za-z0-9+/]{4})*(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?$ and the default string needs to be "Basic dXNlckBkb21haW4uY29tOnBhc3N3b3Jk" (a Base64-encoded string with the word 'Basic' in front of it).
I am getting the error "String does not match pattern." I am not sure how to add the 'Basic' in front of the Base64 string. Tips?
It looks like you're using HTTP basic authentication for your REST API. The pattern only allows bare Base64 characters, so a value starting with "Basic " can never match it. I would suggest you document this as a Security Scheme Object instead, e.g.:
{
"type": "basic"
}
Ref: https://github.com/OAI/OpenAPI-Specification/blob/master/versions/2.0.md#security-scheme-object
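For reference, a minimal sketch of how that could sit in a Swagger 2.0 document (the "basicAuth" name is just a placeholder):
"securityDefinitions": {
  "basicAuth": {
    "type": "basic"
  }
},
"security": [
  { "basicAuth": [] }
]
Clients then send the Authorization: Basic <base64> header themselves, so you no longer need a pattern-constrained default value in the spec at all.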
I have objects in an S3 bucket, and I do not have control over the names of the keys. Some of these keys have special characters, and the AWS SDK does not like them.
For example, one object key is folder/Johnson, Scott to JKL-Discovery.pdf. It might look fine at first glance, but if I URL-encode it, folder%2F%E2%80%8DJohnson%2C+Scott+to+JKL-Discovery.pdf, you can see that after folder/ (or folder%2F when encoded) there is a random sequence of characters, %E2%80%8D, before Johnson.
It is unclear where these characters come from; however, I need to be able to handle this case. When I try to make a copy of this object using the Node.js AWS SDK,
const copyParams = {
  Bucket,
  CopySource,
  Key: `folder/Johnson, Scott to JKL-Discovery.pdf`
};
let metadata = await s3.copyObject(copyParams).promise();
It fails and can't find the object; if I encodeURI() the key, it also fails.
How can I deal with this?
DO NOT SUGGEST I CHANGE THE ALLOWED CHARACTERS IN THE KEY NAME. I DO NOT HAVE CONTROL OVER THIS
I faced the same problem, but with PHP. The copyObject() method automatically encodes the destination parameters (Bucket and Key), but not the source parameter (CopySource), so it has to be encoded manually. In PHP it looks like this:
$s3->copyObject([
    'Bucket' => $targetBucket,
    'Key' => $targetFilePath,
    'CopySource' => $s3::encodeKey("{$sourceBucket}/{$sourceFilePath}"),
]);
I'm not familiar with Node.js, but a similar encodeKey() helper, or manually URL-encoding CopySource, should work there too.
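I'm not sure the JavaScript SDK exposes an equivalent public encodeKey() helper, but manually URL-encoding CopySource should have the same effect; a rough sketch for Node.js (the bucket names here are placeholders):
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const sourceBucket = 'my-source-bucket';                          // placeholder
const sourceKey = 'folder/Johnson, Scott to JKL-Discovery.pdf';   // copy the key exactly, invisible characters included

const copyParams = {
  Bucket: 'my-target-bucket',                                     // placeholder
  Key: sourceKey,                                                 // destination Key is encoded by the SDK
  // CopySource must be URL-encoded by the caller
  CopySource: encodeURIComponent(`${sourceBucket}/${sourceKey}`)
};

let metadata = await s3.copyObject(copyParams).promise();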
Trying your string, there's a tricky zero-width Unicode character hiding in it (the %E2%80%8D sequence decodes to U+200D, a zero-width joiner)...
http://www.ltg.ed.ac.uk/~richard/utf-8.cgi?input=342+200+213&mode=obytes
I would sanitize the string by removing non-ASCII characters and then proceed with URL encoding as suggested by the official docs:
encodeURI('folder/Johnson, Scott to JKL-Discovery.pdf'.replace(/[^\x00-\x7F]/g, ""))
I have created a Web API with a search option. To test the API I used the Postman tool, and in the search I provided symbols like "+" and "#". They are not recognized by the Web API Get method parameter.
From Postman, the GET request is:
http://localhost:60670/api/home?query=#
"query" is the search parameter; in it I pass a "+" or "#" keyword.
public IActionResult Get(string query)
But it is not recognized by the Get method, and the "query" parameter comes through as "null".
Help on this please!
A few ways to do it:
1 - Encode your parameters: + --> %2B and # --> %23 (and any other special character). See: http://www.degraeve.com/reference/specialcharacters.php
2 - Send via POST instead of GET (I prefer this).
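For option 1, the encoding can be done with encodeURIComponent before building the URL; a small sketch (the URL and port are taken from the question):
// '+' and '#' must be percent-encoded or the server never sees them
const query = '#';                       // or '+'
const url = `http://localhost:60670/api/home?query=${encodeURIComponent(query)}`;
// encodeURIComponent('#') === '%23', encodeURIComponent('+') === '%2B'
console.log(url);                        // http://localhost:60670/api/home?query=%23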
Encode your parameters:
original path = \Documents\FDS_D7U_C180175P+001005.pdf
encoded path = %5CDocuments%5CFDS_D7U_C180175P%2B001005.pdf
Enter the encoded path in Postman.
Click Send.
The Azure Logic Apps action "Get Blob Content" doesn't allow us to set the return content-type.
By default, it returns the blob as binary (octet-stream), which is useless in most cases. In general it would be more useful to get text back (e.g. JSON, XML, CSV).
I know the action is in beta. Is that on the short-term roadmap?
A workaround I found is to use the Logic Apps expression base64ToString.
For instance, create an action of type "Compose" (Data Operations group) with the following code:
"ComposeToString": {
"inputs": "#base64ToString(body('Get_blob_content').$content)",
"runAfter": {
"Get_blob_content": [
"Succeeded"
]
},
"type": "Compose"
}
The output will be the text representation of the blob.
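If the blob actually contains JSON, the decoded string can also be parsed in the same expression; a sketch of what a later action's input might look like (assuming the action is still named Get_blob_content):
@json(base64ToString(body('Get_blob_content')['$content']))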
So I had a blob sitting in Azure Storage with JSON in it.
Fetching the blob got me an octet-stream back that was pretty useless, as I was unable to parse it:
BadRequest. The property 'content' must be of type JSON in the
'ParseJson' action inputs, but was of type 'application/octet-stream'.
So I set up an "Initialize variable" action with type String, pointing to Get Blob Content -> File Content. The base64 conversion occurs under the hood, and I am now able to access my JSON via the variable.
No code required.
Enjoy! Healy in Tampa...
After much fiddling with Logic Apps, I finally understood what was going on.
The JSON output from the HTTP request is the JSON representation of an XML payload:
{
  "$content-type": "application/xml",
  "$content": "77u/PD94bWwgdm..."
}
So we can decode it, but the raw text is not very useful on its own: to Logic Apps this is an XML object, and we can apply XML functions to it, such as xpath().
You would need to know the content-type.
Use @{body('Get_blob_content')['$content']} to get the content part alone.
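For example, a sketch of pulling a value out with xpath (the element names here are invented; adjust them to your payload):
@xpath(xml(base64ToString(body('Get_blob_content')['$content'])), '/root/item/text()')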
It is enough to use "Initialize variable" and take the output of Get Blob Content as type "String". This will automatically parse the content.
I would like to format a JSON payload in a workflow activity. I use the new {Text.JavaScriptEncode} to enclose my properties in {}. I must be doing something wrong, because the tokens are not evaluated anymore. So if I use
{Text.JavaScriptEncode}{
"Courriel":{FormSubmission.Field:Courriel}
{Text.JavaScriptEncode}}
It ends with the following value:
{
"Courriel":{FormSubmission.Field:Courriel}
}
So the {FormSubmission.Field:Courriel} token is not evaluated. If I don't specify {Text.JavaScriptEncode} before the first {, nothing is rendered (an empty string).
I'm using Orchard 1.10.1.0
You might need to turn on the Tokenizer's HashMode.
I haven't tested your token, but I'm pretty sure the tokenizer tries to evaluate this as a token and fails:
{"Courriel":{FormSubmission.Field:Courriel}
With hashMode enabled your code will look like this:
#{Text.JavaScriptEncode}{
"Courriel":#{FormSubmission.Field:Courriel}
#{Text.JavaScriptEncode}}
I'm running a node.js EB container and trying to store JSON inside an Environment Variable. The JSON is stored correctly, but when retrieving it via process.env.MYVARIABLE it is returned with all the double quotes stripped.
E.g. MYVARIABLE looks like this:
{ "prop": "value" }
when I retrieve it via process.env.MYVARIABLE its value is actually { prop: value }, which isn't valid JSON. I've tried escaping the quotes with '\', i.e. { \"prop\": \"value\" }, but that just adds more weird behavior, where the string comes back as {\ \"prop\\":\ \"value\\" }. I've also tried wrapping the whole thing in single quotes, e.g. '{ "prop": "value" }', but it seems to strip those out too.
Anyone know how to store JSON in environment variables?
EDIT: Some more info. It would appear that certain characters are being doubly escaped when you set an environment variable. E.g. if I wrap the object in single quotes, the value, when I fetch it using the SDK, becomes:
\'{ "prop": "value"}\'
Also, if I leave the quotes out, the slashes get escaped, so if the object looks like {"url": "http://..."} the result when I query via the SDK is {"url": "http:\\/\\/..."}.
Not only is it mangling the text, it's also rearranging the JSON properties, so properties are appearing in a different order than what I set them to.
UPDATE
So I would say this seems to be a bug in AWS, given that it mangles the values that are submitted. This happens whether I use the Node.js SDK or the web console. As a workaround, I've taken to replacing double quotes with single quotes in the JSON object during deployment and swapping them back in the application, as sketched below.
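A rough sketch of that quote-swapping workaround (the variable names are invented, and it assumes the JSON values themselves contain no single or double quotes):
// At deployment time, before writing the environment variable:
const original = '{ "prop": "value" }';
const stored = original.replace(/"/g, "'");      // { 'prop': 'value' }

// In the application, after reading the environment variable:
const restored = process.env.MYVARIABLE.replace(/'/g, '"');
const settings = JSON.parse(restored);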
Use base64 encoding
An important string is being auto-magically mangled. We don't know the internals of EB, but we can guess it is parsing JSON. So don't store JSON, store the base64-encoded JSON:
a = `{ "public": { "s3path": "https://d2v4p3rms9rvi3.cloudfront.net" } }`
x = btoa(a) // store this as B_MYVAR
// "eyAicHVibGljIjogeyAiczNwYXRoIjogImh0dHBzOi8vZDJ2NHAzcm1zOXJ2aTMuY2xvdWRmcm9udC5uZXQiIH0gfQ=="
settings = JSON.parse(atob(process.env.B_MYVAR))
settings.public.s3path
// "https://d2v4p3rms9rvi3.cloudfront.net"
// Or even:
process.env.MYVAR = atob(process.env.B_MYVAR)
// Sets MYVAR at runtime, hopefully soon enough for your purposes
Since this is JS, there are caveats about UTF-8 and Node/browser support, but I think atob and btoa are common. Docs.
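In Node.js specifically, Buffer sidesteps the atob/btoa caveats; a small sketch of the same idea (reusing the B_MYVAR name from above):
// Encode once, when preparing the environment variable:
const encoded = Buffer.from('{ "prop": "value" }', 'utf8').toString('base64');

// Decode in the application:
const settings = JSON.parse(Buffer.from(process.env.B_MYVAR, 'base64').toString('utf8'));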