SVG: convert units to use in SVGMatrix

I am about to write a routine that expands <use> elements so that they are replaced by the full DOM tree they reference, as described here.
The specs say:
An additional transformation translate(x,y) is appended to the end (i.e., right-side) of the ‘transform’ attribute on the generated ‘g’, where x and y represent the values of the ‘x’ and ‘y’ attributes on the ‘use’ element
But those values for x and y may be given as percentages, e.g. 10%, so I have created a tiny helper routine like this:
createLength (value) {
    const
        l = root.createSVGLength(),
        parsed = parseFloat(value),
        parsedUnit = value
            .replace(parsed.toString(), '')
            .replace(/^\s*/m, '')
            .replace(/\s*$/m, ''),
        // UNITS is a map like:
        // {px: SVGLength.SVG_LENGTHTYPE_PX, … }
        unit = UNITS.hasOwnProperty(parsedUnit) ?
            UNITS[parsedUnit] : UNITS.number
    ;
    l.newValueSpecifiedUnits(unit, parsed);
    return l;
}
which successfully creates instances of SVGLength. Later in the code I want to create the aforementioned SVGTransform which »is appended to the end […]«, like this:
const [x, y, width, height] =
    ['x', 'y', 'width', 'height']
        .map(attr => node.getAttribute(attr))
        .map(val => this.createLength(val))
;
expanded.transform.baseVal
    .appendItem(
        root.createSVGTransformFromMatrix(
            this.matrix(1, 0, 0, 1,
                x.value,
                y.value)));
This throws the following error in Chromium if the given value for x or y is a percentage:
DOMException: Failed to read the 'value' property from 'SVGLength': Could not resolve relative length.
expanded is a reference to the node that is meant to replace the <use>; it is not attached to a parent node at the time the value is assigned.
What is the mistake here?

Thanks to the comment by @RobertLongson I could finally make it work. Instead of creating a new instance of SVGLength based on the attribute's string value (obtained by node.getAttribute('x')), I just used node.x.baseVal.value for the matrix.
As the docs say for <svg>.createSVGLength():
Creates an SVGLength object outside of any document trees.
Since percentages are relative to values of the parent nodes, it is not possible to convert them to plain numbers: there is no scale to resolve them against.
node.x.baseVal also yields an SVGLength, but that one lives inside the document tree, so the relative value can be resolved to a number in user units (pixels).
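For reference, a minimal sketch of the working variant, assuming node is the <use> element and root, expanded and this.matrix are the same references as in the question:

// node.x / node.y are SVGAnimatedLength objects living inside the
// document tree, so relative (percentage) values can be resolved.
const x = node.x.baseVal.value;
const y = node.y.baseVal.value;

expanded.transform.baseVal
    .appendItem(
        root.createSVGTransformFromMatrix(
            this.matrix(1, 0, 0, 1, x, y)));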

Related

How to combine and sort key-value pairs in Terraform

Since the last update of the LogicMonitor provider in Terraform we're struggling with a sorting issue.
In LogicMonitor the properties of a device are name-value pairs, and they are presented alphabetically by name. The result of API requests is alphabetical as well. So far nothing fancy.
But... we build our cloud devices using a module. When calling the module we provide some LogicMonitor properties specific to the device, and a lot more are provided in the module itself.
In the module this looks like this:
custom_properties = concat(
  [
    {
      name  = "host_fqdn"
      value = "${var.name}.${var.dns_domain}"
    },
    {
      name  = "ocid"
      value = oci_core_instance.server.id
    },
    {
      name  = "private_ip"
      value = oci_core_instance.server.private_ip
    },
    {
      name  = "snmp.version"
      value = "v2c"
    }
  ],
  var.logicmonitor_properties
)
The first four properties come from the module and are combined with anything that is in var.logicmonitor_properties. On creation of the device in LogicMonitor all properties are set in the order they are given, and there is no problem.
The issue arises when there is any update to a Terraform file in this environment. Because the properties are presented in alphabetical order, Terraform shows a lot of changes it finds (which are in fact just a reshuffle due to sorting).
The big question is: how can I sort the complete list of properties based on the "name"?
I tried to work with maps, sort and several other functions and examples, but got nothing working on key-value pairs. Merging single keys works fine in a map, but how to deal with name/value pairs?
I think you were on the right track with maps and sorting. Terraform maps do not preserve any explicit ordering themselves, and so whenever Terraform needs to iterate over the elements of a map in some explicit sequence it always does so by sorting the keys lexically (by Unicode codepoints) first.
Therefore one answer is to project this into a map and then project it back into a list of objects again. The projection back into list of objects will implicitly sort the map elements by their keys, which I think will get the effect you wanted.
variable "logicmonitor_properties" {
  type = list(object({
    name  = string
    value = string
  }))
}

locals {
  base_properties = tomap({
    host_fqdn      = "${var.name}.${var.dns_domain}"
    ocid           = oci_core_instance.server.id
    private_ip     = oci_core_instance.server.private_ip
    "snmp.version" = "v2c"
  })
  extra_properties = tomap({
    for prop in var.logicmonitor_properties : prop.name => prop.value
  })
  final_properties = merge(local.base_properties, local.extra_properties)

  # This final step will implicitly sort the final_properties
  # map elements by their keys.
  final_properties_list = tolist([
    for k, v in local.final_properties : {
      name  = k
      value = v
    }
  ])
}
With all of the above, local.final_properties_list should be similar to the custom_properties structure you showed in your question except that the elements of the list will be sorted by their names.
This solution assumes that the property names will be unique across both base_properties and extra_properties. If there are any colliding keys between both of those maps then the merge function will prefer the value from extra_properties, overriding the element of the same key from base_properties.
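For example, in terraform console (illustrative values; the exact output formatting varies by Terraform version), the later argument to merge wins on a colliding key:

$ terraform console
> merge({ "snmp.version" = "v2c" }, { "snmp.version" = "v3" })
{
  "snmp.version" = "v3"
}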
First, use the sort() function to sort the keys in alphabetical order:
sorted_keys = sort(keys(var.my_map))
Next, use a for expression to create a new map with the sorted keys and their corresponding values:
sorted_map = { for key in sorted_keys : key => var.my_map[key] }
Finally, you can use the jsonencode() function to print the sorted map in JSON format:
jsonencode(sorted_map)
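Put together in a locals block (the variable and local names here are only illustrative), that might look like:

locals {
  sorted_keys = sort(keys(var.my_map))
  sorted_map  = { for key in local.sorted_keys : key => var.my_map[key] }
  sorted_json = jsonencode(local.sorted_map)
}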

Any way to conditionalize variable in jsonencoded data?

Say I have the following simplified snippet to create a task definition as JSON.
...
task_container_definitions = jsonencode([{
  name : var.name,
  image : "${var.image}:${var.tag}",
  cpu : var.cpu,
  memory : var.memory,
}])
...
Say I want to add a variable to optionally create an additional definition so it looks something like this:
variable "another_definition" {
  type    = any
  default = {}
}
...
task_container_definitions = jsonencode([
  {
    name : var.name,
    image : "${var.image}:${var.tag}",
    cpu : var.cpu,
    memory : var.memory,
  },
  var.another_definition
])
And define it as follows:
another_definition = {
  name      = "another_container"
  image     = "another_container"
  cpu       = 10
  memory    = 512
  essential = true
}
I am able to get this to output as expected as long as the variable is defined.
...
  + {
      + cpu       = 10
      + essential = true
      + image     = "another_container"
      + memory    = 512
      + name      = "another_container"
    },
But if the variable is not defined, I see an empty {} added to the output when I do a terraform plan, which is not what I expect. I have also tried using null as the default, but then I get an error.
...
+ {},
Is there a way to toggle this variable off so that if it is not defined then it doesn't show up in the outputted json definition? Is there a better approach than what I am attempting?
I was a little confused at first as to what you were asking, thinking that you were asking for the functionality of the merge function; I mention that only in case I was right the first time. But I think I now understand your problem: you want task_container_definitions to have either one or two elements, depending on whether var.another_definition is set.
There's no single function for that particular situation, but I think we can combine some language features together to get that result.
First, let's decide that the variable being set means that it has a non-null value, and thus its default value should be null to represent the "unset" case:
variable "another_definition" {
  type    = any
  default = null

  validation {
    # The type constraint above is looser than we really
    # want, so this validation rule also enforces that
    # the caller can't set this to something inappropriate,
    # like a single string or a list.
    condition = (
      var.another_definition != null ?
      can(keys(var.another_definition)) :
      true
    )
    error_message = "Additional task container definition must be an object."
  }
}
In Terraform it's a pretty common situation to need to convert between a value that might be null and a list that might have zero or one elements, or vice-versa, and so Terraform has some language features to help with that. In this case we can use a splat expression to concisely represent that. Let's see how that looks in terraform console first just to give a sense of what we're achieving with this:
$ terraform console
> null[*]
[]
> "hello"[*]
[
  "hello",
]
> { object = "example" }[*]
[
  {
    "object" = "example"
  },
]
Notice that when I applied the [*] operator to null it returned an empty list, but when I applied it to these other values it converted them to a single-element list. This is how the [*] operator behaves when you apply it to something that isn't a list; see the splat operator docs if you want to learn about the different behavior for lists, which isn't really relevant here because of the validation rule I added above which prevents the var.another_definition value from being a list.
Another tool we have in our Terraform toolbox here is the concat function, which takes one or more lists and returns a single list with the input elements all concatenated together in the given order. We can use this to combine your predefined list that's populated from var.name, var.cpu, etc. with the zero-or-one-element list created by [*], in order to create a list with one or two elements:
locals {
  task_container_definitions = concat(
    [
      {
        name   = var.name
        image  = "${var.image}:${var.tag}"
        cpu    = var.cpu
        memory = var.memory
      },
    ],
    var.another_definition[*],
  )

  task_container_definitions_json = jsonencode(local.task_container_definitions)
}
If any of the arguments to concat are empty lists then they are effectively ignored altogether, because they contribute no elements to the result, and so this achieves (what I hope is) the desired result, by making the "other definition" appear in the result only when it's set to something other than null.
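If this JSON is feeding an ECS task definition resource (an assumption based on the attribute names; adjust to your actual resource), usage might look roughly like this:

resource "aws_ecs_task_definition" "example" {
  family                = var.name
  container_definitions = local.task_container_definitions_json

  # ... cpu, memory, execution role, etc. as required by your setup ...
}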

Filter leading to different results when typed manually and when defined as a variable

I'm trying to create a calculated column indicating if a team has a prize or not from the table below:
To do that I need to count within the group if there's a player whose "Prize" field is not empty. Here's the 1st attempt:
DAX formula:
=
Var Player_Same_Team = filter(Table4,Table4[Team]=earlier(Table4[Team]))
Var Has_Prize = len(Table4[Prize])>0
Return
calculate(countrows(filter(Table4,len(Table4[Prize])>0)),Player_Same_Team)>0
Looks like it's doing what I intend it to do. However, when I swap the filter content for a pre-defined variable, it gives me results that don't make sense:
DAX formula:
=
Var Player_Same_Team = filter(Table4,Table4[Team]=earlier(Table4[Team]))
Var Has_Prize = len(Table4[Prize])>0
Return
calculate(countrows(filter(Table4,Has_Prize)),Player_Same_Team)>0
The typed content len(Table4[Prize])>0 is the same as that in the variable, so what may be causing the difference? Thanks for your help.
As soon as you assign it to a variable, the value of the variable remains constant. The LEN comparison is therefore evaluated to a scalar value, which you are then passing as a filter.
The first example works because CALCULATE accepts a table as a parameter, and Player_Same_Team is evaluated to a table by the FILTER expression.
In order for what you are trying to do to work it would have to be something like this:
= Var Player_Same_Team = filter(Table4,Table4[Team]=earlier(Table4[Team]))
Var Has_Prize = filter(Table4,len(Table4[Prize])>0)
Return calculate(countrows(Has_Prize),Player_Same_Team)>0
You can also write the measure in a slightly different way:
= CALCULATE (
    COUNT ( Table4[Team] ),
    ALLEXCEPT ( Table4, Table4[Team] ),
    LEN ( Table4[Prize] ) > 0
) > 0
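As a side note, here is a sketch of an equivalent calculated column that avoids CALCULATE and context transition altogether (assuming the same Table4 columns as above):

=
COUNTROWS (
    FILTER (
        Table4,
        Table4[Team] = EARLIER ( Table4[Team] )
            && LEN ( Table4[Prize] ) > 0
    )
) > 0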

Declaring an instance of a composite type without initializing all the fields

So I am trying to create an instance of a structure:
struct keypoint
    x
    y
    scale
    angle
    Vector{Any}(VecLength)
end
Now I know the values of all the fields except the last one. I need to initialize the instance of the structure with the known values, but for the last field I have to call another function where the data is generated and then stored in the last field of the instance. Is there a way to get this done in Julia?
I am referring to the tutorials here and here, but I guess in both places all the fields of the instance are initialized in one go.
Thanks!
mutable struct keypoint
    x
    y
    scale
    angle
    keypoint(x, y, scale) = new(x, y, scale)
end
a = keypoint(1,1.0,2.0) # keypoint(1, 1.0, 2.0, #undef)
Notice that if you then try to access a.angle you get
ERROR: UndefRefError: access to undefined reference
Stacktrace:
[1] getproperty(::Any, ::Symbol) at .\sysimg.jl:18
so by leaving it out of the call to new you get an #undef in there that errors upon access. But since the struct is mutable, you can set the field later.
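Building on that, a small sketch of the intended flow; compute_angle here is just an illustrative placeholder for whatever function produces the missing data:

# assumes the mutable struct `keypoint` from above is already defined
compute_angle(kp) = atan(kp.y, kp.x)   # illustrative placeholder

a = keypoint(1, 1.0, 2.0)     # a.angle is #undef at this point
a.angle = compute_angle(a)    # fill the field once the data is available
a.angle                       # 0.7853981633974483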

Why does this Context.Sync not work?

Why does this code snippet not write the values back to Excel unless I un-comment the range.values=range.values line?
$('#run').click(function () {
    invokeRun()
        .catch(OfficeHelpers.logError);
});

function invokeRun() {
    return Excel.run(function (context) {
        var range = context.workbook.worksheets.getItem("Sheet1").getRange("A1:B3");
        range.load('values');
        return context.sync()
            .then(function () {
                range.values[1][1] = 99;
                console.log(JSON.stringify(range.values));
                //range.values = range.values
                return context.sync();
            });
    });
}
Array properties are special. I have added a page on my website to describe the topic: Reading and writing array properties.
Summarizing from there, the way that the proxy-object model works, whenever you set a property on an object, the Office.js runtime has a hook into the setter and getter, which is used to intercept the call and add the command to the queue.
Let's take an example of a regular property first. Per the above, whenever you set something like range.format.fill.color = "red", the setter for the color property intercepts the request and internally adds a command into the queue to set the range fill color to red (to be dispatched with the next context.sync)
On the other hand, if all you had was var color = range.format.fill.color
(after a load and a sync, of course), the getter would fire instead of the setter, and the color variable would get the range's current fill color.
Now, that was regular properties. Whenever you set an element of the values array (as in range.values[1][1] = 99 above), you are effectively going through the getter of the array property. From a runtime perspective, that line is no different from a slightly more verbose version:
var array = range.values;
array[r][c] = '-';
Because the getter for range.values returns a perfectly plain JS array object, accessing it and then setting its value does nothing to propagate it back to the original Range object.
If you want the values to get reflected back, the best thing is to get a reference to the array right after the sync (i.e., var array = range.values, just as above), then set the values on the array as needed, and then finally set it back to the object: range.values = array.
This means you could also modify the values array in place, and then assign the values property back to itself at the completion of the loop (range.values = range.values). However, this looks awkward, as if it were a no-op, whereas in reality it is not. So personally, I prefer to retrieve the array at the beginning and assign it to its own variable, then do any necessary modifications, and finally set the full array back.
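Applied to the snippet from the question, that pattern would look roughly like this (a sketch, not tested against your workbook):

function invokeRun() {
    return Excel.run(function (context) {
        var range = context.workbook.worksheets.getItem("Sheet1").getRange("A1:B3");
        range.load('values');
        return context.sync()
            .then(function () {
                var values = range.values;   // plain JS array from the getter
                values[1][1] = 99;           // modify the local copy
                range.values = values;       // setter queues the write-back
                return context.sync();       // dispatch the queued write
            });
    });
}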
UPDATE to clarify the above:
To be very clear, the arrays returned by accessing the .values, .formulas, etc., ARE pure vanilla JS arrays. That's actually the crux of the problem: that in order for Office.js to return pure objects, it means that those pure objects can't be "spiked" with the ability to reflect changes.
For what it's worth, we actually have an upcoming feature that should be rolling out in a month or two, where we will be introducing an object.set syntax, as in:
range.set({
    values: [[1, 2], [3, 4]],
    format: {
        fill: {
            color: "purple"
        }
    }
});
This will make it more convenient to set multiple properties on the same object, but it might also make the array properties easier to deal with.
