How to add a map in Azure App Configuration

In Azure App Configuration, we store values as key-value pairs. Generally, we store string values, like:
"key" : "red"
But I want to store a map as the value, like:
"key" : {
1: {1,2,3},
2: {1,4}
}
In my Spring Boot application, I will read the value as Map<Integer, List>.

You can leverage content-type to store JSON key-values in App Configuration.
Data is stored in App Configuration as key-values, where values are treated as the string type by default. However, you can specify a custom type by leveraging the content-type property associated with each key-value, so that you can preserve the original type of your data or have your application behave differently based on content-type.
Valid JSON values
When a key-value has JSON content-type, its value must be in valid JSON format for clients to process it correctly. Otherwise, clients may fail or fall back and treat it as string format. Some examples of valid JSON values are:
"John Doe"
723
false
null
"2020-01-01T12:34:56.789Z"
[1, 2, 3, 4]
{"ObjectSetting":{"Targeting":{"Default":true,"Level":"Information"}}}

Related

Handling nullable JSON response when using RESTful Technical Profile

I am using the REST technical profile to retrieve data from a claims provider.
The JSON payload that is returned includes values that are optional for some users.
When the value is not set, it is returned as the null literal.
When the value is set, it is returned as a string containing an ISO 8601 formatted date time.
User Group 1 (well formed ISO 8601 string)
{
"OptionalIso8601DateTime" : "2022-12-11T12:34:56Z"
}
User Group 2 (null literal)
{
"OptionalIso8601DateTime" : null
}
AAD B2C appears to have two distinct, non-configurable behaviours when processing the JSON payload of a REST endpoint, which makes it impossible to define a claims mapping that does not produce a server_error response for one group of users.
Behaviours:
All strings that are determined to contain valid dateTime string literals are automatically cast to dateTime; as a result, a valid dateTime string literal can only be mapped to claims of type dateTime.
The null literal value is automatically cast to a string and populated with the empty string value; as a result, the null literal can only be mapped to claims of type string.
I have attempted to define mappings using the string and dateTime data types; however, neither works for both user groups.
Observations:
AAD B2C Claim DataType | Payload Value          | Result      | Comment
string                 | null                   | ""          | null literal is converted to empty string
string                 | "2022-12-11T12:34:56Z" | AADB2C99072 | dateTime cannot be converted to string
dateTime               | null                   | AADB2C99072 | string cannot be converted to dateTime
dateTime               | "2022-12-11T12:34:56Z" | 1670762096  | value emitted as a number (unix epoch) as per the default OpenId JWT issuance policy
I have no control over the format of the payload returned by the external system. Ideally, they would omit the claim when there is no value; however, returning the null literal is valid according to the JSON specification.
I am looking for a mechanism to map the data (specifically the case where the value is not null) as a valid dateTime so that it can be passed to downstream applications, without authentication failing with error AADB2C99072 for one set of users.
Ideally, there would be some way to configure AAD B2C either to ignore null values (treat them as absent from the payload) or to preserve the null so that no error occurs when the empty string would otherwise be set as the value of the dateTime claim.

Updating Firestore document nested data overwrites it

I'm trying to set some new fields in a nested dict within a Firestore document, which results in the data being overwritten.
Here's where I write the first part of the info I need:
upd = {
    "idOffer": {
        <offerId>: {
            "ref": <ref>,
            "value": <value>
        }
    }
}
<documentRef>.update(upd)
So output here is something like:
<documentid>:{idOffer:{<offerId>:{ref:<ref>, value:<value>}}}
Then I use this code to add some fields to the current <offerId> nested data:
approval = {
    "isApproved": <bool>,
    "dateApproved": <date>,
    "fullApproval": <bool>
}
<documentRef>.update({
    "idOffer.<offerId>": approval
})
From which I expect to get:
<documentid>:{idOffer:{<offerId>:{ref:<ref>, value:<value>, isApproved:<bool>,dateApproved:<date>,fullApproval:<bool>}}}
But I end up with:
<documentid>:{idOffer:{<offerId>:{isApproved:<bool>,dateApproved:<date>,fullApproval:<bool>}}}
Note: I use <> to refer to dynamic data, like document Ids or References.
When you call update with a dictionary (or map, or object, or whatever key/value pair structure is used in other languages), the entire set of data behind each given top-level key is replaced. So, if you call update with a key of idOffer.<offerId>, then everything under that key is replaced, while every other child key at the idOffer level remains unchanged.
If you don't want to replace the entire object behind the key, then be more specific about which children you'd like to update. In your example, instead of updating a single idOffer.<offerId> key, specify three keys for the nested children:
idOffer.<offerId>.isApproved
idOffer.<offerId>.dateApproved
idOffer.<offerId>.fullApproval
That is to say, the dictionary you pass should have three keyed entries like this at the top level, rather than a single key of idOffer.<offerId>.
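For illustration, here is a minimal sketch of that second update using the Python Admin SDK; the collection name, document ID, and offer ID below are hypothetical placeholders, not values from the question:
from google.cloud import firestore  # firebase_admin.firestore exposes the same client

db = firestore.Client()
offer_id = "offer123"                                  # hypothetical offer ID
doc_ref = db.collection("offers").document("doc123")   # hypothetical document reference

# Dotted field paths update only these three nested children;
# sibling fields such as ref and value are left untouched.
doc_ref.update({
    f"idOffer.{offer_id}.isApproved": True,
    f"idOffer.{offer_id}.dateApproved": firestore.SERVER_TIMESTAMP,
    f"idOffer.{offer_id}.fullApproval": False,
})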

Can I loop over keys and values of an object in OPA to validate if they adhere to a certain format (CamelCase)

We are using conftest to validate whether our Terraform changeset adheres to certain rules and compliance requirements. One thing we want to validate is whether our AWS resources are tagged according to the AWS tagging convention, which specifies certain tags to use (e.g. Owner, ApplicationRole, Project) and requires that all tag keys and values are in CamelCase.
In Terraform, the changeset is represented by the following (simplified) JSON output:
{
  "resource_changes": {
    "provider_name": "aws",
    "change": {
      "before": {},
      "after": {
        "tags": {
          "ApplicationRole": "SomeValue",
          "Owner": "SomeValue",
          "Project": "SomeValue"
        }
      }
    }
  }
}
What I am now trying to do is to validate the following:
Check whether tags are set.
Validate whether the keys and values are all CamelCase.
Check that the keys include, at a minimum, the set (ApplicationRole, Owner, Project).
However, I am having trouble defining that in Rego (I am quite new to OPA).
Is there a way to "loop" over the keys and values of an object, and validate if they are formatted correctly?
in pseudo code:
for key, value in tags {
    re_match(`([A-Z][a-z0-9]+)+`, key)
    re_match(`([A-Z][a-z0-9]+)+`, value)
}
I have tried the following:
tags_camel_case(tags) {
    some key
    val := tags[key]
    re_match(`^([A-Z][a-z0-9]+)+`, key) # why is key not evaluated?
    re_match(`^([A-Z][a-z0-9]+)+`, val)
}
However, when evaluating against the following test JSON:
{
  "AppRole": "SomeValue",
  "appRole": "SomeValue"
}
the rule returns true, even though I am checking both key and value against the regex.
The tags_camel_case(tags) function returns true for the input with two keys because (by default) variables in Rego are existentially quantified. This means rule bodies are satisfied if for some set of variable bindings, the statements in the rule body are true. In the example above, the rule body would be satisfied by {key=AppRole, val=SomeValue}.
To express "for all" you can use a simple trick. First, write a rule that checks whether any of the tags are NOT camel case. Second, write a rule that checks that the first rule is not satisfied.
For example:
# checks if all tags are camel case
tags_camel_case(tags) {
    not any_tags_not_camel_case(tags)
}

# checks if any tags are NOT camel case
any_tags_not_camel_case(tags) {
    some key
    val := tags[key]
    not is_camel_case(key, val)
}

# checks if a and b are both camel case
is_camel_case(a, b) {
    re_match(`^([A-Z][a-z0-9]+)+`, a)
    re_match(`^([A-Z][a-z0-9]+)+`, b)
}
For more info on expressing "for all" in Rego, see https://www.openpolicyagent.org/docs/latest/how-do-i-write-policies/#universal-quantification-for-all

Property selection is not supported on values of type 'Integer'

I would like to send this dynamic content:
content:#concat(formatDateTime(adddays(utcnow(),-1),'mm'),formatDateTime(adddays(utcnow(),-1),'dd'))
from a Web activity in Azure Data Factory to Logic Apps.
On the Logic Apps side I have defined a request body schema, and in the second step I would like to extract the value, but after running this step I get this error:
InvalidTemplate. Unable to process template language expressions in action 'Extract' inputs at line '1' and column '1292': 'The template language expression 'triggerBody()?['ID']' cannot be evaluated because property 'ID' cannot be selected. Property selection is not supported on values of type 'Integer'. Please see https://aka.ms/logicexpressions for usage details.'.
How can I solve this problem?
Add Content-Type = application/json to the request header.
From my test and your error message, the ID in your content must currently look like this:
{
  "ID": 222223
}
In this way, the ID is an Integer. So you either need to change your ID into String type, like this:
{
  "ID": "222223"
}
or change your JSON schema's "ID" type to Integer and the variable type to Integer. Then the Logic App will work.
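If you go the Integer route, a minimal sketch of what the request trigger's JSON schema could look like (the original schema isn't shown in the question, so this shape is an assumption):
{
  "type": "object",
  "properties": {
    "ID": {
      "type": "integer"
    }
  }
}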

Firebase Database reference toJSON() returns numeric keys

I am trying to fetch numbers from a table in my Firebase Database called /numbers using the toJSON() method of the reference object. This is done with the Node.js admin SDK. The keys in the table are in E.164 format, so they are numbers like +15555555555. This is the structure:
numbers: {
    "+18392998683": 'some_user_id',
    "+18589392928": 'another_user_id',
    ...
}
I am expecting the keys to be the ones I supplied, but instead I receive numeric array subscripts for each entry: 0, 1, 2, 3, etc. when doing
for (key in numbersObj)
Any ideas?
It turns out calling toJSON() on the reference does not directly return the data at that location, but rather a representation of the reference itself, in this case the https URL of the reference. Since that value is a string, iterating it with for (key in numbersObj) yields the character indices 0, 1, 2, 3, etc., which is where the numeric keys come from.
https://firebase.google.com/docs/reference/admin/node/admin.database.Reference#toJSON
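The question uses the Node.js Admin SDK; purely as a sketch of the underlying idea (read the data at the reference instead of serializing the reference), here is the equivalent with the Python Admin SDK, where the databaseURL is a hypothetical placeholder:
import firebase_admin
from firebase_admin import db

# Hypothetical initialization; credentials are taken from the environment here
firebase_admin.initialize_app(options={"databaseURL": "https://example-project.firebaseio.com"})

# get() returns the data at /numbers as a dict keyed by the E.164 strings
numbers = db.reference("numbers").get() or {}
for phone, user_id in numbers.items():
    print(phone, user_id)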
