Terraform: how to pass multiple integers to JSON

Say we have three variables in JSON:
{
name: "Someone",
age: 30,
code: 25
}
and I want to read this as a template in terraform and replace the integer values.
For one variable, I could solve this by modifying the JSON into:
{
name: "Someone",
age: "${age}",
code: 25
}
and then on my template, do this:
template = replace(file("file.json"), "\"$${age}\"", "$${age}")
but in this case I have another variable called code which I have to update as well. Is there any way to accomplish this? Thank you!
EDIT 1:
This is the JSON:
{
name: "Someone",
age: "${age}",
code: "${code}"
}
This is the terraform:
data "template_file" "part_1" {
template = file("file.json")
vars = {
age = var.age
code= var.code
}
}
This works fine, but I need 'age' and 'code' in the JSON as integers, not as strings. If I keep the JSON like this:
{
name: "Someone",
age: ${age},
code: ${code}
}
It gives me an invalid JSON format. That's why I added the replace to the template, so I can remove the double quotes and put an integer there, but I can only do this once!

Try this:
template = replace(replace(file("file.json"), "\"$${age}\"", "$${age}"), "\"$${code}\"", "$${code}")
Here I'm running replace again on the result of the first replacement. It works for me.
If you have more variables, you have to nest more replace calls.
Hope it will be helpful.
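A cleaner alternative (a sketch not from the original answer, assuming Terraform 0.12+): keep the placeholders unquoted in a template file and render it with the built-in templatefile() function, or skip templates entirely and build the document with jsonencode(), which emits integers as integers with no string surgery.

```hcl
# Option 1: file.json.tpl contains unquoted ${age} / ${code} placeholders.
# templatefile() substitutes the values before anything validates the JSON,
# so the numbers land in the output as bare integers.
locals {
  rendered = templatefile("${path.module}/file.json.tpl", {
    age  = var.age
    code = var.code
  })
}

# Option 2: let jsonencode() produce valid JSON directly.
# Numbers stay numbers; no quoting or replace() tricks needed.
locals {
  payload = jsonencode({
    name = "Someone"
    age  = var.age
    code = var.code
  })
}
```

Option 2 is generally preferred when you control the document shape, because the JSON it emits is guaranteed to be syntactically valid.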

Related

Terraform is it possible to parse nested map with different keys?

I have a map like this:
root = {
nat = {
pr = {
dev = [{
value: "10.10.10.10",
description: "test1"
},
],
},
pr2 = {
dev = [{
value: "10.10.10.11",
description: "test2"
}],
prod = [],
}
},
cc = {
pr = {
jej = [{
value: "10.10.10.10",
description: "test"
}]
}
},
smt = {
st = [{
value = "10.10.10.10",
description = "test"
}],
s2 = [{
value = "10.10.10.10",
description = "tt"
}]
}
}
This can be modified in the future by adding new nested maps. The map will live in a module. I will pass a path (key) like "root.nat" as input to the module and expect as output the array of objects, that is, all the arrays of objects under "root.nat". Sample output:
[{
value = "10.10.10.10",
description = "test1"
},
{
value = "10.10.10.11",
description = "test2"
},
]
The problem is that I cannot know how many nested maps there will be when passing the path (key). And I can't iterate using for, because I don't know the exact fields.
Is it actually possible?
Terraform isn't designed for this sort of general computation. This particular problem requires unbounded recursion and that in particular isn't available in the Terraform language: Terraform always expects to be dealing with fixed data structures whose shape is known statically as part of their type.
If possible I would suggest using something outside of Terraform to preprocess this data structure into a flat map from dot-separated string key to a list of objects:
{
"root.nat" = [
{
value = "10.10.10.11",
description = "test2"
},
# etc
]
# etc
}
If you cannot avoid doing this transformation inside Terraform -- for example, if the arbitrary data structure you showed is being loaded dynamically from some other service rather than fixed in your configuration, so that it's not possible to pre-generate the flattened equivalent -- then you could use my Terraform provider apparentlymart/javascript as an escape hatch to the JavaScript programming language, and implement your flattening algorithm in JavaScript rather than in the Terraform language.
Since JavaScript is a general-purpose language, you can write a recursive algorithm to re-arrange your data structure into a flat map of lists of objects with dot-separated keys, and then look up those keys in Terraform code using the index operator as normal.
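The preprocessing step suggested above could be sketched outside Terraform like this (a hypothetical helper, not part of the original answer): recursively walk the structure, map every dot-separated path to all list items found beneath it, and hand the result to Terraform as JSON (e.g. via jsondecode(file(...))).

```python
import json


def collect_leaves(node):
    """Gather every object found in any list beneath this node."""
    if isinstance(node, list):
        return list(node)
    leaves = []
    for value in node.values():
        leaves.extend(collect_leaves(value))
    return leaves


def flatten(node, prefix=""):
    """Map each dot-separated path to all objects beneath it."""
    flat = {}
    for key, value in node.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat[path] = collect_leaves(value)
            flat.update(flatten(value, path))
        elif isinstance(value, list):
            flat[path] = value
    return flat


# A cut-down version of the structure from the question.
root = {
    "nat": {
        "pr": {"dev": [{"value": "10.10.10.10", "description": "test1"}]},
        "pr2": {"dev": [{"value": "10.10.10.11", "description": "test2"}],
                "prod": []},
    }
}

flat = flatten({"root": root})
print(json.dumps(flat["root.nat"], indent=2))
```

Since the recursion happens outside Terraform, the unbounded depth is no longer a problem; Terraform only ever sees the flat map.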
Check my answer to this question.
Basically, you can nest as many for loops inside each other as you believe necessary.
As long as you check for null before proceeding to the next loop, the code won't fail.
So in cases like this:
pr2 = {
dev = [{
value: "10.10.10.11",
description: "test2"
}],
prod = [],
prod would not be part of the final array.
Additionally, you can always add try() to your null checks to find out whether you are at the right "level" of the loop.
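For a shape like the nat subtree above (a map of maps of lists), the nested-for-loop approach can be sketched like this; the local names are assumptions based on the question, and flatten() is a built-in Terraform function:

```hcl
locals {
  nat = {
    pr = {
      dev = [{ value = "10.10.10.10", description = "test1" }]
    }
    pr2 = {
      dev  = [{ value = "10.10.10.11", description = "test2" }]
      prod = []
    }
  }

  # One for loop per known nesting level. Empty lists such as prod
  # simply contribute nothing to the flattened result.
  all_objects = flatten([
    for group_key, group in local.nat : [
      for env_key, objects in group : objects
    ]
  ])
}
```

This only works because the number of nesting levels is fixed and known in advance, which is exactly the limitation discussed in the accepted answer.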

In Nodejs with querystring how to encode array with single item

I have an object like this:
const querystring = require('querystring');
let data = {
name: 'Bill',
hobbies: ['Painting', 'Running']
}
If I encode the object like this:
console.log(querystring.encode(data));
I get this:
name=Bill&hobbies=Painting&hobbies=Running
Then when I decode it:
console.log(querystring.decode(querystring.encode(data)));
It comes back perfectly like this:
{
name: 'Bill',
hobbies: ['Painting', 'Running']
}
However if I pass an object with array data that has only one item in it like this:
let data = {
name: 'Bill',
hobbies: ['Painting']
}
When I encode / decode it:
console.log(querystring.decode(querystring.encode(data)));
This comes back:
{
name: 'Bill',
hobbies: 'Painting'
}
Notice how it converts the array to a single string, rather than an array with a single item.
Is there any way around this limitation with querystring? Or is there another module that handles this better? I've been looking around and can't find anything.

Is there a way to create a dict object from user input() in python 3.7.1?

The purpose of this question is that I want to write some requests through pymongo.
For one criterion per field, using input() is not difficult:
find_mongo = {}
key = input("enter a field :")
value = input("enter a criterion :")
find_mongo[key] = value
db.collection.find(find_mongo)  # it works without problem
That's more problematic when I want more complicated criteria.
For instance if I want the value to be in a range:
{"field": {"$lte": 1, "$gte": 0.5, "$exists": true}}
Because for the user, who is myself, it would be much easier to write this in the shell:
enter a field : size
enter a criterion or criteria : {"$lte": 1, "$gte": 0.5, "$exists": true}
The current issue is that value returns a string object: '{"$lte": 1, "$gte": 0.5, "$exists": true}',
and as far as I know, mongo cannot run find() with a str expression, only with a dict.
I also want this for cases where I need to write even more complicated requests, with "$or" nested in an "$and" expression:
{"$and":[
{"size":{"$lte":100}},
{ $or: [ { quantity: { $lt: 20 } }, { price: 10 } ] }
]
}
and just write the following in an input:
enter your request: {"$and":[
{"size":{"$lte":100}},
{ $or: [ { quantity: { $lt: 20 } }, { price: 10 } ] }
]}
in order to be able to execute:
find_mongo = input("enter your request: ")
db.collection.find(find_mongo)
Sure, it's easier from the mongo shell, but the reason I don't query from the mongo shell is that I want to do transformations on the request in Python.
Besides, I am not searching for a way to build {"$lte": 1, "$gte": 0.5, "$exists": true} piece by piece; honestly, with some reflection, conditions and iterations, I think I could find a way. What I am really asking is whether it is possible to enter and get back an object, here a dict and not a str, through user input(). Just because it's easier for me if such a solution exists.
Notes: Ubuntu 18.04, Python 3.7.1
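For what it's worth, a minimal sketch of one common approach (an assumption, not from the question): parse the string returned by input() with json.loads, which yields a real dict with proper booleans and numbers, provided the user types strict JSON.

```python
import json

# Simulate what input("enter a criterion or criteria :") would return:
# a string, not a dict.
raw = '{"$lte": 1, "$gte": 0.5, "$exists": true}'

criteria = json.loads(raw)  # JSON text -> Python dict; true -> True
find_mongo = {"size": criteria}

# find_mongo can now be passed to db.collection.find(find_mongo)
print(find_mongo)
```

The caveat is that json.loads requires strict JSON: keys must be double-quoted, so shell-style input like { $or: [ ... ] } would fail until the keys are quoted. ast.literal_eval is an alternative for Python-literal syntax, but it rejects the bare true/false that mongo-shell syntax uses.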

How can I mask json using json-masker for fields with "-" in it?

My requirement is to mask certain fields of a JSON while logging them. I am working on Node.js and have used the json-masker library. When I pass the JSON path of an attribute whose name contains "-" in the "whitelist" parameter, I get a lexical error.
JSON
{
"attribute1":"value1",
"attribute2":"value2",
"attribute-name":"value3"
}
Code
const masker = require('json-masker');
const mask= masker({
whitelist: ['$.attribute1','$.attribute-name']
});
Error
Error Lexical error on line 1. Unrecognized text.
$.attribute-name
Also, is there a way to specify only the attributes that need to be masked, rather than specifying the ones that should not be masked (as with whitelist)?
Please suggest if there is a better approach to do this using any other function/library.
Please note that I am receiving this JSON, so I cannot change the key names.
The correct syntax is '$["attribute-name"]' instead of '$.attribute-name'
The $ paths are processed by jsonpath, a dependency of json-masker. This is discussed in one of their GitHub issues (#90), where this solution is presented.
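If pulling in a jsonpath workaround feels heavy, a masker that needs only plain JavaScript can sidestep the path syntax entirely (a hypothetical sketch, not part of json-masker): iterate over the keys directly, so names containing "-" need no special quoting. This variant also inverts the logic to take a list of fields to mask, as the question asks.

```javascript
// Hypothetical helper: mask the listed top-level fields of an object.
// Keys are compared as plain strings, so 'attribute-name' just works.
function maskFields(obj, fieldsToMask, maskWith = '*') {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    out[key] = fieldsToMask.includes(key)
      ? maskWith.repeat(String(value).length) // same length, all masked
      : value;
  }
  return out;
}

const input = {
  attribute1: 'value1',
  attribute2: 'value2',
  'attribute-name': 'value3',
};

const masked = maskFields(input, ['attribute2', 'attribute-name']);
console.log(masked);
// { attribute1: 'value1', attribute2: '******', 'attribute-name': '******' }
```

For nested objects you would need to recurse, at which point a maintained library (see below) is probably the better trade-off.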
Use the maskdata npm module: https://www.npmjs.com/package/maskdata
You can mask JSON fields containing '-' without any extra effort, and you can mask nested fields too.
Example:
const MaskData = require('maskdata');
const maskJSONOptions = {
// Character to mask the data. Default value is '*'
maskWith : "*",
// It should be an array
// Field names to mask. Can give multiple fields.
fields : ['level1.level2.level3.field3', 'level1.level2.field2', 'level1.field1', 'value1']
};
const nestedObject = {
level1: {
field1: "field1",
level2: {
field2: "field2",
level3: {
field3: "field3",
}
}
},
value1: "value"
};
const maskedObj = MaskData.maskJSONFields(nestedObject, maskJSONOptions);
//Output : {"level1":{"field1":"******","level2":{"field2":"******","level3":{"field3":"******"}}},"value1":"*****"}

What is the name of the code format with spaces before attributes to keep them starting from the same column?

We sometimes see a code format like this object:
{
name:   "Jason",
age:    28,
gender: "male"
}
Or in Ruby, we can define a factory like this:
FactoryBot.define do
factory :company do
email       { FFaker::Internet.email }
phone       { FFaker::PhoneNumber.short_phone_number }
status      { :processing }
name        { FFaker::Company.name }
identity_no { FFaker::Identification.ssn }
end
end
The spacing between key and value (or method and attribute) changes dynamically according to the longest key name (or method name).
What is this kind of formatting called? And how can I do it in Vim?
