I have a JSON file whose contents I need to pass to a Terraform resource.
If I pass the file as-is, it comes with newlines and whitespace. I want to remove those and send compact JSON.
Is there a way to do that?
According to this comment, you can do it via a round-trip:
As of Terraform 0.12, you can produce minified JSON using a round-trip through jsondecode and jsonencode, because jsonencode always produces minimal JSON:
policy = jsonencode(jsondecode(templatefile("filename.tpl", {})))
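For example, wrapped in a resource (the aws_iam_policy resource here is only an illustration; the round-trip works the same for any argument that expects a JSON string):

resource "aws_iam_policy" "example" {
  name = "example"
  # The decode/encode round-trip strips all insignificant whitespace
  policy = jsonencode(jsondecode(templatefile("filename.tpl", {})))
}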
I'm going to have to create a set of pipelines based on a JSON input file. This file contains configuration such as paths, numbers, etc. Each JSON input looks exactly the same (contains the same keys).
Is it possible to read the JSON once and use it to define pipeline variables? Something like this:
read json -> returns dict
assign certain keys to pipeline variables like "VAR1 = dict.key1, VAR2 = dict.key2" etc.
My goal is to create a single ARM template and just exchange the JSON files.
OK guys, I have found the solution and I'm posting it here for future readers.
The "Lookup" activity does the job. In its output we can access the JSON body and reference its keys as described above.
I have the following YAML file, which is used by a third-party tool:
timezone: "Europe/Zurich"
_export:
py:
python: ${virtualenv_home}/bin/python3
log_level: INFO
poll_interval: 1
workflow_name: test_workflow
!include : 'params.yml'
+say_hello:
echo>: Hello world!
My goal is to load this YAML file with PyYAML and change a few things and then dump it into a file.
This would work just fine if that "!include : 'params.yml'" line weren't there.
How would I load this line so that when it gets dumped back into a file it looks the same way it does now, "!include : 'params.yml'"?
The actual including will be handled by the third-party tool.
I played around with the answer from the following post, PyYAML: load and dump yaml file and preserve tags ( !CustomTag ), but didn't get the correct results; I got this instead:
? !include ''
: params.yml
Thank you
The result you got is correct in the sense that it is equivalent to your input.
PyYAML, like YAML libraries in general, does not give you complete control over how your data is serialized. In some situations it simply makes decisions that you cannot influence. Let's check that with a minimal example:
import yaml, sys
node = yaml.compose("!include : 'params.yml'")
yaml.serialize(node, sys.stdout)
This code loads the input only up to the node level, where information about the style of the key and the value is preserved, and then dumps it again as YAML. The output is
? !include ''
: 'params.yml'
As you can see, even though PyYAML knows that !include '' was originally an implicit key whose content was an empty plain scalar, it still serializes it as an explicit key whose content is an empty single-quoted scalar. From a YAML perspective this is equivalent to the original input; it just uses a different YAML style.
This experiment shows that you cannot force PyYAML to output the key-value pair in the style you want, since even if it knows the original style, it still changes it. This means there is no way to achieve what you want unless you modify PyYAML itself.
If you have control over the final consumer of the YAML file, I would suggest changing the structure like this:
$include: 'params.yml'
The usage of !include in your original file already goes against the spec's intention, because tags are supposed to be applied to the value they are tagging. In your case, however, !include tags an empty scalar that happens to be a mapping key, but is apparently meant to apply to the value of that key. If you want a key that does something special with its value, you should rather make the key a scalar with a special value, not an empty scalar with a special tag. By using $include, it is far easier to do a modification while preserving the YAML style. IIRC OpenAPI uses a similar technique with $ref.
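On the consumer side, a key like that is easy to resolve. A minimal Python sketch, assuming the included file contains a mapping that should be merged into the top level (the function name and merge semantics are my own illustration, not part of any existing tool):

import yaml

def load_with_include(path):
    # Load the document, then splice in the file referenced by '$include'
    with open(path) as f:
        data = yaml.safe_load(f)
    include = data.pop("$include", None)
    if include is not None:
        with open(include) as f:
            data.update(yaml.safe_load(f))
    return data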
I have a huge JSON file called data.json filled with objects that all have the same properties.
A sample of an object is this:
{
  "directory": "directory_here",
  "posted": false,
  "date": null
}
In my script, I am reading the data.json file with fs.readFile and checking whether posted is false; if it is, I run some functions and then change posted to true and date from null to the current date. So I need to change the properties inside the data.json file. How can I do that?
Here arr is the array of objects parsed from data.json; after updating it, write it back so the changes land in the file:

const fs = require('fs');
const arr = JSON.parse(fs.readFileSync('data.json', 'utf8'));
arr.forEach((obj) => {
  if (!obj.posted) obj.posted = true;
  if (!obj.date) obj.date = new Date(); // serialized as an ISO string by JSON.stringify
});
fs.writeFileSync('data.json', JSON.stringify(arr, null, 2));
Except for a few pathological cases you cannot directly edit the contents of a JSON text file. Instead, you would need to either (a) read-modify-write the data, or (b) switch to a different storage format that does allow in-place editing.
If you (have to) stick with JSON, you have two main choices:
1. Read the file in chunks, parsing the JSON as you go. When you get a complete object, modify it as required, then write it to a new file. After the whole file has been read and all objects written, you can replace the original file with the newly created one.
2. Read and parse the whole file into an array of objects. Make the required changes to all objects and then write the entire array (in JSON format) over the top of the original file. (Or, for extra safety, you could write to a new file and replace the original if there were no problems.)
The second approach is probably easier to write, but will consume more memory.
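A minimal sketch of the second approach in Node.js, using the safer write-then-replace variant (the file names and the "run some functions" placeholder are assumptions taken from the question):

const fs = require('fs');

const items = JSON.parse(fs.readFileSync('data.json', 'utf8'));
for (const item of items) {
  if (!item.posted) {
    // ...run the required functions for this item here...
    item.posted = true;
    item.date = new Date().toISOString();
  }
}
// Write to a new file first, then replace the original in one step
fs.writeFileSync('data.json.tmp', JSON.stringify(items, null, 2));
fs.renameSync('data.json.tmp', 'data.json');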
A possible alternative (that I suspect won't be appropriate in your case) would be to switch to a storage format that does allow in-place editing. This could be a file of fixed record length records, or something like an SQLite database.
I am using NodeJS to generate Ed25519 keypairs. I need to convert the public key to a custom character encoding. However, there seems to be no way to convert the KeyObjects returned by crypto.generateKeyPair() to buffers.
Does the standard library offer a way to directly generate the keys as buffers instead of KeyObjects?
The KeyObject offers a .export() method that will give you a string or a buffer. It seems you can use that method to convert your KeyObjects and can then apply your custom encoding.
https://nodejs.org/api/crypto.html#crypto_keyobject_export_options
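A minimal sketch (the DER/SPKI export options are just one common choice):

const crypto = require('crypto');

// Generate the pair as KeyObjects, then export the public key as a Buffer
const { publicKey } = crypto.generateKeyPairSync('ed25519');
const pubBuf = publicKey.export({ type: 'spki', format: 'der' });
// pubBuf is a plain Buffer; apply the custom character encoding to it from here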
You can get it to generate the keys directly as a buffer/string only if you specify publicKeyEncoding and/or privateKeyEncoding. But if you're using a non-supported, custom encoding, then you can't get it to do that. You can, however, export to a Buffer/string and then apply your custom encoding to that.
From the doc for the API:
If a publicKeyEncoding or privateKeyEncoding was specified, this function behaves as if keyObject.export() had been called on its result. Otherwise, the respective part of the key is returned as a KeyObject.
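So, as a sketch, specifying DER encodings up front yields Buffers directly (the spki/pkcs8 choices are assumptions; use whatever layout your re-encoding expects):

const crypto = require('crypto');

// With encodings specified, both keys come back as Buffers rather than KeyObjects
const { publicKey, privateKey } = crypto.generateKeyPairSync('ed25519', {
  publicKeyEncoding: { type: 'spki', format: 'der' },
  privateKeyEncoding: { type: 'pkcs8', format: 'der' },
});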
In ADFv2 I'm looking up a date and passing it to an Azure Function. I can pass just the data like so:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
However if I embed this into a JSON string like this:
{"lastProcessDate":"#activity('GetLastDateProcessed').output.firstRow.LastDateProcessed"}
I get this {"lastProcessDate":"#activity('GetLastDateProcessed').output.firstRow.LastDateProcessed"} instead of {"lastProcessDate":"2019-11-13"} as input into function.
Lastly, I tried to use a parameter, also with no success:
@concat('{"lastProcessDate":"', string(pipeline().parameters.lastProcessDate), '"}')
The problem here is the parameter was not set. I set the parameter like this:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
However, this is a default value and is never dynamically updated. If I could update this string then the @concat method would work, but I haven't been able to figure out how to dynamically update a parameter for the pipeline.
Another option could be a pipeline variable, but I don't know how to reference the variable.
How do I concat strings together with dynamic content?
I think what you are missing is that when you use the at-sign '@' in the JSON string, you should follow it with a curly bracket '{'.
In your example it will look something like this:
{"lastProcessDate":"#{activity('GetLastDateProcessed').output.firstRow.LastDateProcessed}"}
here is the source (found it in the comments):
https://azure.microsoft.com/en-us/blog/azure-functions-now-supported-as-a-step-in-azure-data-factory-pipelines/
I was able to get this to work by creating a second pipeline. This is not optimal, but works for people running into this same issue. Hopefully someone finds a better solution than this!
From the first pipeline I set the second pipelines parameter with this:
@activity('GetLastDateProcessed').output.firstRow.LastDateProcessed
I named the parameter in the second pipeline lastProcessDate, so this worked:
#concat('{"lastProcessDate":"', string(pipeline().parameters.lastProcessDate), '"}')
This is not straightforward and can't be how Microsoft expects us to solve this!
I was able to achieve this with the following:
{
  "storedprocedure":"storedProcName",
  "params":"@{variables('currentDt')}"
}