Chef custom attributes

I'm working on a custom Chef cookbook and have defined a custom attribute, default["server"]["apikey"] = nil, in a separate attributes file within the cookbook that looks like this:
# Default Attributes
default["webhooks"]["get_response"] = ""
default["webhooks"]["put_response"] = ""
default["webhooks"]["post_response"] = ""
default["server"]["username"] = "user"
default["server"]["password"] = "123"
default["server"]["apikey"] = nil
Within my recipe I then do this:
webhooks_request "Request an API key from TPP" do
  uri "172.16.28.200/sdk/authorize/"
  post_data(
    { 'Username' => node["server"]["username"], 'Password' => node["server"]["password"] }
  )
  header_data(
    { 'content-type' => 'application/json' }
  )
  expected_response_codes [200, 201, 400]
  action :post
end
I then follow this with a ruby_block that updates the value of the `default["server"]["apikey"]` attribute with the API key, like this:
ruby_block "Extract the API Key" do
  block do
    jsonData = JSON.parse(node["webhooks"]["post_response"])
    jsonData.each do |k, v|
      if k == 'APIKey'
        node.override["server"]["apikey"] = v
      end
    end
  end
  action :run
end
I can then validate it using this:
ruby_block "Print API Key" do
  block do
    print "\nKey = : " + node["server"]["apikey"] + "\n"
  end
  action :run
end
However, if I then try to use the node["server"]["apikey"] attribute in a following block like this:
webhooks_request "Get data from TPP" do
  uri "127.0.0.1/vedsdk/certificates/retrieve?apikey=#{node["server"]["apikey"]}"
  post_data(
    { 'data' => "NsCVcQg4fd" }
  )
  header_data(
    { 'content-type' => 'application/json', 'auth' => node["server"]["username"] }
  )
  expected_response_codes [200, 201, 400, 401]
  action :post
end
The value of node["server"]["apikey"] is always empty. Interestingly, though, the value of the node["server"]["username"] attribute is available and works as expected.
Obviously, I'm missing something here but can't work out what :(

Writing this as a generic answer (it also avoids leaving the question unanswered in the list ;)).
When inside a resource, you may evaluate an attribute value at converge time with lazy attribute evaluation.
The correct usage is:
resource "name" do
  attribute lazy { "any value #{with interpolation} inside" }
end
The common error is to use lazy inside the interpolation, because we only want the one variable to be lazily evaluated; lazy has to wrap the whole attribute value instead.
By design, lazy is meant to evaluate the attribute value; it can contain Ruby code to compute the value from something done by a previous resource, too.
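A plain-Ruby sketch of why the deferral matters (a lambda stands in for Chef's lazy here, and node is just a hash, not the real Chef node object):

```ruby
# Simulated node attribute tree; the API key is unset at compile time.
node = { "server" => { "apikey" => nil } }

# Compile phase: the string is interpolated immediately, while apikey is still nil.
eager_uri = "127.0.0.1/vedsdk/certificates/retrieve?apikey=#{node["server"]["apikey"]}"

# lazy wraps the value in a block that Chef only calls at converge time;
# a lambda models the same deferral.
lazy_uri = -> { "127.0.0.1/vedsdk/certificates/retrieve?apikey=#{node["server"]["apikey"]}" }

# Converge phase: an earlier resource has now filled in the key.
node["server"]["apikey"] = "ABC123"

puts eager_uri     # ...retrieve?apikey=
puts lazy_uri.call # ...retrieve?apikey=ABC123
```

In the question's recipe, only the uri of the second webhooks_request needs this treatment, since everything else is known at compile time.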


Terraform test for null or false value of object key

Hopefully a relatively straightforward one. I have an object that uses the experimental optional attrs feature.
This means that one of the object attributes/keys does not need to be present. I then need to test for either a null or false value of said object key in an object composition with a for loop.
When using the module_variable_optional_attrs experiment, it seems that if you use lookup() to find an object key which does not exist, it always returns null, not the default, as you might expect it to.
I am now having to test with the conditional (lookup(connection, "auto_accept", false) == false || lookup(connection, "auto_accept", false) == null),
which doesn't seem very clean. I'm looking for suggestions on improvements.
EDIT
main.tf
terraform {
  # Optional attributes and the defaults function are
  # both experimental, so we must opt in to the experiment.
  experiments = [module_variable_optional_attrs]
}

variable "example_var" {
  type = list(object({
    name        = string
    auto_accept = optional(bool)
  }))
  description = "Some variable"
  default = [{
    name = "example-name"
  }]
}
The commands below are run from the terraform console:
> lookup(var.example_var[0], "auto_accept")
false
# now let's make the key undefined
> lookup(var.example_var[0], "auto_accept")
tobool(null)
> lookup(var.example_var[0], "auto_accept", false)
tobool(null)
> tobool(null)
null
I have been in a similar scenario, where I had the following input map:
payload = {
  name          = "SomeTestName"
  optional_attr = "SomeOptionalAttribute"
}
The above payload could have the following possible inputs:
payload = {
  name          = "SomeTestName"
  optional_attr = "SomeOptionalAttribute"
}
OR
payload = {
  name = "SomeTestName"
}
OR
payload = {
  name          = "SomeTestName"
  optional_attr = null
}
My use case is to look for optional_attr in the payload and take its value as a key to get the corresponding value from the following map. If optional_attr is null or not provided, I take the value of the mydefault key from the following map:
master_data = {
  optional_attr_value1 = "value1"
  optional_attr_value2 = "value2"
  mydefault            = "default_value"
}
The following combination of lookup with a null check worked for me:
value = lookup(
  master_data,
  can(coalesce(payload.optional_attr)) ? lookup(tomap(payload), "optional_attr", "mydefault") : "mydefault",
  "default_value"
)
What the above code does: can(coalesce(payload.optional_attr)) is false when optional_attr is not provided at all in the payload, or is provided with a value of null. In that case the ternary returns "mydefault", which the outer lookup uses to fetch the mydefault key's corresponding value from master_data. When optional_attr is provided with a value, the inner lookup returns that value, and the outer lookup uses it as the key into master_data. If that value does not match any key in master_data, the outer lookup falls back to its own default, "default_value".
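Under the same names (payload, master_data), the expression can arguably be flattened with coalesce alone, since lookup with a null default yields null both when the key is absent and when it is present but null; a sketch, not tested against every input shape:

```hcl
# coalesce() skips null (and empty string) values, so a missing or null
# optional_attr falls through to "mydefault" as the key into master_data.
value = lookup(
  master_data,
  coalesce(lookup(payload, "optional_attr", null), "mydefault"),
  "default_value"
)
```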

Using find { } on a map where the whole map is evaluated, not each element

I created some mixin methods. Code and example below:
import groovy.json.JsonSlurper

URL.metaClass.withCreds = { u, p ->
    delegate.openConnection().tap {
        setRequestProperty('Authorization', "Basic ${(u + ':' + p).bytes.encodeBase64()}")
    }
}

URLConnection.metaClass.fetchJson = {
    delegate.setRequestProperty('Accept', 'application/json')
    delegate.connect()
    def code = delegate.responseCode
    def result = new JsonSlurper().parse(code >= 400 ? delegate.errorStream : delegate.inputStream as InputStream)
    [
        ok  : code in (200..299),
        body: result,
        code: code
    ]
}
Example usage:
new URL("$baseUrl/projects/$name").withCreds(u, p).fetchJson().find {
    it.ok
}?.tap {
    it.repos = getRepos(it.key).collectEntries { [(it.slug): it] }
}
When I don't use find(), my object is, as expected, a map with those 3 elements. When I use find(), it is a Map.Entry with key ok and value true,
which produces this error:
groovy.lang.MissingPropertyException: No such property: ok for class: java.util.LinkedHashMap$Entry
Possible solutions: key
It occurred to me when I wrote this post that find() was treating the map as an iterable and thus looking at every entry, which I have subsequently verified. How do I run find on the whole map? I want it.ok because, if it's true, I need to carry it forward.
There is no such method in the Groovy SDK. Map.find() runs over the entry set of the map you call the method on. Based on the expectation you have defined, I'm guessing you are looking for a function that tests the map with a given predicate and returns the map if it matches the predicate. You may add a function that does this through Map.metaClass (since you already add methods to the URL and URLConnection classes). Consider the following example:
Map.metaClass.continueIf = { Closure<Boolean> predicate ->
    predicate(delegate) ? delegate : null
}

def map = [
    ok  : true,
    body: '{"message": "ok"}',
    code: 200
]

map.continueIf { it.ok }?.tap {
    it.repos = "something"
}

println map
In this example we introduce a new method, Map.continueIf(predicate), that tests whether the map matches the given predicate, and returns null otherwise. Running the above example produces the following output:
[ok:true, body:{"message": "ok"}, code:200, repos:something]
If predicate is not met, map does not get modified.
Alternatively, for a stricter design, you could make the fetchJson() method return an object with corresponding onSuccess() and onError() methods, so you can express more clearly that you add repos when you get a successful response and, optionally, create an error response otherwise.
I hope it helps.

Spock: check the query parameter count in URI

I have just started with Spock. I have a piece of functionality where a Java function makes an HTTP call. The URI used in the call must contain the "loc" parameter, and it should appear only once.
I am writing a Spock test case and have written the snippet below.
def "prepareURI"() {
    given: "Search Object"
    URI uri = new URI()

    when:
    uri = handler.prepareURI(properties) // it will return a URI like http://example.com?query=abc&loc=US

    then:
    with(uri) {
        def map = uri.getQuery().split('&').inject([:]) { map, kv ->
            def (key, value) = kv.split('=').toList()
            map[key] = value != null ? URLDecoder.decode(value) : null
            map
        }
        assert map.loc != null
    }
}
From the above snippet, my two assertions pass:
It should exist
It should not be null
I want to check the count of the "loc" query parameter. It should be passed exactly once. With the map above, if I pass the "loc" parameter twice, the map overrides the old value with the second one.
Does anyone know how to access the query parameters as a list, so that in the list I can count the strings which start with "loc"?
Thanks in advance.
Perhaps an example would be the best start:
def uri = new URI('http://example.com?query=abc&loc=US')
def parsed = uri.query.tokenize('&').collect { it.tokenize('=') }
println "parsed to list: $parsed"
println "count of 'loc' params: " + parsed.count { it.first() == 'loc' }
println "count of 'bob' params: " + parsed.count { it.first() == 'bob' }
println "count of params with value 'abc': " + parsed.count { it.last() == 'abc' }
prints:
$ groovy test.groovy
parsed to list: [[query, abc], [loc, US]]
count of 'loc' params: 1
count of 'bob' params: 0
count of params with value 'abc': 1
The problem, as you correctly noted, is that you cannot put your params into a map if your intent is to count the number of params with a certain name.
In the above, we parse the params into a list of lists, where the inner lists are key, value pairs. This way we can call it.first() to get the param names and it.last() to get the param values. The Groovy List.count { } method lets us count the occurrences of a certain item in the list of params.
As for your code, there is no need to call new URI() at the beginning of your test, as you set the value anyway a few lines down.
Also, the with(uri) call is unnecessary, as you don't use any of the uri methods without prefixing them with uri. anyway. I.e. you can either write:
def uri = new URI('http://example.com?query=abc&loc=US')
def parsed = uri.query.tokenize('&').collect { it.tokenize('=') }
or:
def uri = new URI('http://example.com?query=abc&loc=US')
uri.with {
def parsed = query.tokenize('&').collect { it.tokenize('=') }
}
(note that we are using query directly in the second example)
but there is not much point in using with if you are still prefixing with uri.
The resulting test case might look something like:
def "prepareURI"() {
    given: "Search Object"
    def uri = handler.prepareURI(properties) // it will return a URI like http://example.com?query=abc&loc=US

    when:
    def parsed = uri.query.tokenize('&').collect { it.tokenize('=') }

    then:
    assert parsed.count { it.first() == 'loc' } == 1
}

Plug.Conn.Unfetched does not implement the Access behaviour

From the code below, when I call conn.params["geo"], I get the following error:
test/plugs/geoip_test.exs:4
** (UndefinedFunctionError) function Plug.Conn.Unfetched.fetch/2 is undefined (Plug.Conn.Unfetched does not implement the Access behaviour)
stacktrace:
(plug) Plug.Conn.Unfetched.fetch(%{:__struct__ => Plug.Conn.Unfetched, :aspect => :params, "geo" => "Mountain View, US", "ip" => "8.8.8.8"}, "geo")
...
defmodule AgilePulse.Plugs.GeoIPTest do
  use AgilePulse.ConnCase

  test "returns Mountain View for 8.8.8.8" do
    conn = build_conn()
    params = Map.put(conn.params, "ip", "8.8.8.8")
    conn = Map.put(conn, :params, params) |> AgilePulse.Plugs.GeoIP.call(%{})
    assert conn.params["geo"] == "Mountain View, US"
  end
end

defmodule AgilePulse.Plugs.GeoIP do
  import Plug.Conn

  def init(opts), do: opts

  def call(%Plug.Conn{params: %{"ip" => ip}} = conn, _opts) do
    geo = set_geo(ip)
    params = Map.put(conn.params, "geo", geo)
    Map.put(conn, :params, params)
  end

  def call(conn, _opts), do: conn

  ...
end
Could someone enlighten me on why this is failing and what the appropriate solution is? TY!
Short answer: Change this:
params = Map.put(conn.params, "ip", "8.8.8.8")
To:
params = %{"ip" => "8.8.8.8"}
Explanation: Phoenix.ConnTest.build_conn/0 returns a Conn with params set to %Plug.Conn.Unfetched{}. By using Map.put on that, you don't reset the value of __struct__, but only add a new key:
%Plug.Conn{ ...,
  params: %{:__struct__ => Plug.Conn.Unfetched, :aspect => :params,
    "ip" => "8.8.8.8"}, ... }
When you call params["geo"] later, Elixir sees that params is a struct, and tries to call the fetch/2 function on the struct's module, which doesn't exist. To reset params to a normal map (so that Elixir calls Map.get when you use the square-bracket syntax), you can just do params = %{"ip" => "8.8.8.8"}. (Note the => syntax: %{"ip": "8.8.8.8"} would create an atom key, which would not match the "ip" string key the plug pattern-matches on.)

node.js - Is there any proper way to parse JSON with large numbers? (long, bigint, int64)

When I parse this little piece of JSON:
{ "value" : 9223372036854775807 }
This is what I get:
{ value: 9223372036854776000 }
Is there any way to parse it properly?
Not with built-in JSON.parse. You'll need to parse it manually and treat the values as strings (if you want to do arithmetic with them, there is bignumber.js). You can use Douglas Crockford's JSON-js library as a base for your parser.
EDIT 2 (7 years after the original answer): it might soon be possible to solve this using the standard JSON API. Have a look at this TC39 proposal to add access to the source string to a reviver function - https://github.com/tc39/proposal-json-parse-with-source
EDIT 1: I created a package for you :)
var JSONbig = require('json-bigint');
var json = '{ "value" : 9223372036854775807, "v2": 123 }';
console.log('Input:', json);
console.log('');
console.log('node.js built-in JSON:')
var r = JSON.parse(json);
console.log('JSON.parse(input).value : ', r.value.toString());
console.log('JSON.stringify(JSON.parse(input)):', JSON.stringify(r));
console.log('\n\nbig number JSON:');
var r1 = JSONbig.parse(json);
console.log('JSON.parse(input).value : ', r1.value.toString());
console.log('JSON.stringify(JSON.parse(input)):', JSONbig.stringify(r1));
Output:
Input: { "value" : 9223372036854775807, "v2": 123 }
node.js built-in JSON:
JSON.parse(input).value : 9223372036854776000
JSON.stringify(JSON.parse(input)): {"value":9223372036854776000,"v2":123}
big number JSON:
JSON.parse(input).value : 9223372036854775807
JSON.stringify(JSON.parse(input)): {"value":9223372036854775807,"v2":123}
After searching for something cleaner - and finding only libs like json-bigint - I just wrote my own solution. It's not the best, but it solves my problem. For those using Axios, you can use it in the transformResponse callback (this was my original problem - Axios parses the JSON, and all the BigInts come out wrong):
const jsonStr = `{"myBigInt":6028792033986383748, "someStr":"hello guys", "someNumber":123}`
const result = JSON.parse(jsonStr, (key, value) => {
  if (typeof value === 'number' && !Number.isSafeInteger(value)) {
    let strBig = jsonStr.match(new RegExp(`(?:"${key}":)(.*?)(?:,)`))[1] // get the original value using a regex
    return strBig // should be BigInt(strBig) - the BigInt function is not working in this snippet
  }
  return value
})
console.log({
  "original": JSON.parse(jsonStr),
  "handled": result
})
A regular expression is difficult to get right for all cases.
Here is my attempt, but all I'm giving you is some extra test cases, not the solution. Likely you will want to replace a very specific attribute. A more generic JSON parser (one that handles separating out the properties but leaves the numeric properties as strings) would let you wrap that specific long number in quotes before continuing to parse into a JavaScript object.
let str = '{ "value" : -9223372036854775807, "value1" : "100", "strWNum": "Hi world: 42 is the answer", "arrayOfStrWNum": [":42, again.", "SOIs#1"], "arrayOfNum": [100,100,-9223372036854775807, 100, 42, 0, -1, 0.003] }'
let data = JSON.parse(str.replace(/([:][\s]*)(-?\d{1,90})([\s]*[\r\n,\}])/g, '$1"$2"$3'));
console.log(BigInt(data.value).toString());
console.log(data);
You can use this code to change big numbers to strings, and later use BigInt(data.value):
let str = '{ "value" : -9223372036854775807, "value1" : "100" }'
let data = JSON.parse(str.replace(/([^"^\d])(-?\d{1,90})([^"^\d])/g, '$1"$2"$3'));
console.log(BigInt(data.value).toString());
console.log(data);
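The quoting approaches above can be combined with a reviver so the oversized values come back as BigInt rather than strings; a minimal sketch (the quoting regex is deliberately simple and, like the ones above, not robust for every JSON shape, e.g. big numbers inside strings):

```javascript
// Quote integer literals of 16+ digits so JSON.parse never sees them as numbers.
const quoteBigInts = (json) => json.replace(/:\s*(-?\d{16,})/g, ': "$1"');

const raw = '{ "value" : 9223372036854775807, "v2": 123 }';

// Revive the quoted big integers as BigInt; leave everything else untouched.
const parsed = JSON.parse(quoteBigInts(raw), (key, value) =>
  typeof value === 'string' && /^-?\d{16,}$/.test(value) ? BigInt(value) : value
);

console.log(parsed.value); // 9223372036854775807n
console.log(parsed.v2);    // 123
```

Note that JSON.stringify will throw on BigInt values, so serializing the result back still needs special handling (e.g. json-bigint as shown above).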
