Compare two text files using Groovy on a Jenkins slave machine. Sample text files are as follows.
Sample1.txt:
team_a, added
team_b, removed
team_c, added
Sample2.txt:
team_d, added
team_e, added
team_c, removed
I need to identify the change, and the output should be that team_c has been removed.
I'm asking this question as a newbie to Groovy.
You haven't posted your code, but you say you are a newbie, so I'll talk you through an example:
// For simplicity, I won't use files, but a "heredoc"
Sample1txt = '''
team_a, added
team_b, removed
team_c, added
'''
Sample2txt = '''
team_d, added
team_e, added
team_c, removed
'''
//get the data in a map each
Map a = [:]
Sample1txt.splitEachLine(",") { line -> a[line[0]] = line[1]}
Map b = [:]
Sample2txt.splitEachLine(",") { line -> b[line[0]] = line[1]}
//for each key value pair in the second map...
b.each { k, v ->
    //if the key exists (you didn't say what you want to do with new keys)
    if (a.containsKey(k)) {
        //and if the key's value is not equal to the same key's value in the first map
        if (!b[k].equals(a[k])) {
            //print it
            println "${k} : ${v}"
        }
    }
}
This is not a robust solution, but it will point you in the right direction.
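Once that works, the same comparison can read the real files on the Jenkins slave. A minimal sketch, assuming Sample1.txt and Sample2.txt sit in the job's workspace and follow the comma-separated format above:
// read the two files instead of heredocs (paths are assumptions; adjust to your workspace layout)
def sample1 = new File('Sample1.txt').text
def sample2 = new File('Sample2.txt').text
Map a = [:]
sample1.splitEachLine(",") { line -> if (line.size() > 1) a[line[0].trim()] = line[1].trim() }
Map b = [:]
sample2.splitEachLine(",") { line -> if (line.size() > 1) b[line[0].trim()] = line[1].trim() }
// report keys whose status changed between the two files
b.each { k, v ->
    if (a.containsKey(k) && a[k] != v) {
        println "${k} has been ${v}"   // e.g. "team_c has been removed"
    }
}
The trim() calls and the size guard just make the comparison tolerant of blank lines and the space after each comma.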
Since the last update of the LogicMonitor provider in Terraform, we've been struggling with a sorting issue.
In LogicMonitor the properties of a device are name-value pairs, and they are presented alphabetically by name. In API requests the result is also alphabetical. So far nothing fancy.
But... we build our cloud devices using a module. When calling the module we provide some LogicMonitor properties specifically for that device, and a lot more are provided in the module itself.
In the module this looks like this:
custom_properties = concat([
  {
    name  = "host_fqdn"
    value = "${var.name}.${var.dns_domain}"
  },
  {
    name  = "ocid"
    value = oci_core_instance.server.id
  },
  {
    name  = "private_ip"
    value = oci_core_instance.server.private_ip
  },
  {
    name  = "snmp.version"
    value = "v2c"
  }
],
var.logicmonitor_properties)
The first 4 properties are from the module and are combined with anything that is in var.logicmonitor_properties. When the device is created in LogicMonitor, all properties are set in the order they are given, and there is no problem.
The issue arises when there is any update to a Terraform file in this environment. Because the properties are presented in alphabetical order, Terraform shows a lot of changes it finds (which are in fact just a reshuffle due to sorting).
The big question is: how can I sort the complete list of properties based on the "name"?
I tried to work with maps, sort and several other functions and examples, but got nothing working on key-value pairs. Merging single keys works fine in a map, but how do I deal with name/value pairs?
I think you were on the right track with maps and sorting. Terraform maps do not preserve any explicit ordering themselves, so whenever Terraform needs to iterate over the elements of a map in some explicit sequence it always does so by sorting the keys lexically (by Unicode codepoints) first.
Therefore one answer is to project this into a map and then project it back into a list of objects again. The projection back into list of objects will implicitly sort the map elements by their keys, which I think will get the effect you wanted.
variable "logicmonitor_properties" {
type = list(object({
name = string
value = string
}))
}
locals {
base_properties = tomap({
host_fqdn = "${var.name}.${var.dns_domain}"
ocid = oci_core_instance.server.id
private_ip = oci_core_instance.server.private_ip
"snmp.version" = "v2c"
})
extra_properties = tomap({
for prop in var.logicmonitor_properties : prop.name => prop.value
})
final_properties = merge(local.base_properties, local.extra_properties)
# This final step will implicitly sort the final_properties
# map elements by their keys.
final_properties_list = tolist([
for k, v in local.final_properties : {
name = k
value = v
}
])
}
With all of the above, local.final_properties_list should be similar to the custom_properties structure you showed in your question except that the elements of the list will be sorted by their names.
This solution assumes that the property names will be unique across both base_properties and extra_properties. If there are any colliding keys between both of those maps then the merge function will prefer the value from extra_properties, overriding the element of the same key from base_properties.
First, use the sort() function to sort the keys in alphabetical order:
sorted_keys = sort(keys(var.my_map))
Next, build a new map from the sorted keys and their corresponding values with a for expression (the legacy map() function cannot do this kind of transformation):
sorted_map = { for key in sorted_keys : key => var.my_map[key] }
Finally, you can use the jsonencode() function to print the sorted map in JSON format:
jsonencode(sorted_map)
Below is the list of files:
abc_2019_01_30_5816789.bak, abc_2019_01_31_2992794.bak,
xyz_2019_01_26_4690992.bak, xyz_2019_01_27_8319704.bak,
pqr_2019_01_30_5986789.bak, pqr_2019_01_31_3142809.bak,
test_2019_01_30_6076789.bak, test_2019_01_31_3232818.bak,
testing_2019_01_30_6026789.bak, testing_2019_01_31_3192814.bak,
repair_2019_01_30_6116789.bak, repair_2019_01_31_3282823.bak,
factory_2019_01_30_5646789.bak, factory_2019_01_31_2802775.bak
I have this list in "parsedList", so when I sort them and pick the latest 7, I see a couple of duplicate files. My requirement is to have 7 unique files which are the latest, and write them to a text file. I have tried the code below:
List<String> sortedList = parsedList.sort(false).reverse()
println sortedList.take(7)
String filename = "D:\\latest.txt"
new File(filename).write(sortedList.take(7).join(","))
You are just sorting the list by the full filename, which gives you all the "xyz" files first, then "testing" and so on...
One way would be to use groupBy to first group the files by prefix, then sort each group and finally pick the last item from each group.
println parsedList
    .groupBy{it[0..-24]} // group by prefix (remove timestamp), results in a map like [abc:['abc_2019_...', 'abc_2019_...'], xyz:[...], ...]
    .values()            // collect the values (i.e. just the lists with the filenames) --> [['abc_2019_...', 'abc_...'], ['xyz_...','xyz_...'], ...]
    *.sort()             // sort each of the lists
    *.getAt(-1)          // from each list take the last item
Groovy web console
An alternative for fun (as there are many ways to do things in Groovy)
println parsedList.groupBy { it.split(/_\d{4}_/).head() }
.collect { k, v -> v.sort().last() }
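Either way, to meet the original requirement (the latest file per prefix, then the newest seven of those written to a text file), here is a hedged sketch that combines the grouping above with the take(7)/write step from the question; the D:\latest.txt path and the fixed-width _YYYY_MM_DD_nnnnnnn.bak suffix are assumptions carried over from the question:
// group by prefix (name without the fixed-width timestamp suffix), keep the latest file per group
def grouped = parsedList.groupBy { it[0..-24] }
def latestPerPrefix = grouped.collect { prefix, files -> files.sort().last() }
// newest 7 of those, sorted descending, written out as a comma-separated line
def newestSeven = latestPerPrefix.sort(false).reverse().take(7)
new File('D:\\latest.txt').text = newestSeven.join(',')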
I have a map1 which holds the information as
[40256942,6] [60246792,5]
Now I want to prepare a map2 that holds information such as
itemNo, 40256942
qty, 6
itemNo, 60246792
qty, 5
to prepare the final information as JSON:
"partialArticlesInfo": [{itemNo: "40256942", availQty: "6"}, {itemNo: "60246792", availQty: "5"}]
I am trying to iterate map1 to retrieve the values and set them against the keys, but I am getting only one entry, which is the last one. Is there any way I can get a new map with entries such as those mentioned above?
Map<String, String> partialArticlesInfo = new HashMap<String,String>();
Map<String, String> partialArticlesTempMap = null;
for (Map.Entry<String, String> entry : partialStockArticlesQtyMap.entrySet()) {
    partialArticlesTempMap = new HashMap<String, String>();
    partialArticlesTempMap.put("itemNo", entry.getKey());
    partialArticlesTempMap.put("availQty", entry.getValue());
    partialArticlesInfo.putAll(partialArticlesTempMap);
}
In Java (I'm assuming you're using Java, in the future it would be helpful to specify that) and every other language I know of, a map holds mappings between keys and values. Only one mapping is allowed per key. In your "map2", the keys are "itemNo" and "availQty". So what is happening is that your for loop sets the values for the first entry, and then is overwriting them with the data from the second entry, which is why that is the only one you see. Look at Java - Map and Map - Java 8 for more info.
I don't understand why you are trying to put the data into a map; you could just put it straight into JSON with something like this:
JSONArray partialArticlesInfo = new JSONArray();
for (Map.Entry<String,String> entry : partialStockArticlesQtyMap.entrySet()) {
    JSONObject stockEntry = new JSONObject();
    stockEntry.put("itemNo", entry.getKey());
    stockEntry.put("availQty", entry.getValue());
    partialArticlesInfo.put(stockEntry);
}
JSONObject root = new JSONObject();
root.put("partialArticlesInfo",partialArticlesInfo);
This will take "map1" (partialStockArticlesQtyMap in your code) and create a JSON object exactly like your example - no need to have map2 as an intermediate step. It loops over each entry in map1, creates a JSON object representing it and adds it to a JSON array, which is finally added to a root JSON object as "partialArticlesInfo".
The exact code may be slightly different depending on which JSON library you are using - check the docs for the specifics.
I agree with Brendan. Another solution would be to store the data in a Set or List of objects like the following:
class Item {
    Long itemNo;
    int quantity;

    @Override
    public int hashCode() {
        return Long.hashCode(itemNo) + Integer.hashCode(quantity);
    }

    @Override
    public boolean equals(Object other) {
        return other instanceof Item
                && ((Item) other).itemNo.equals(this.itemNo)
                && ((Item) other).quantity == this.quantity;
    }
}
Then you can use the JSONArray approach described by him to get the JSON string as output.
This means that adding new variables to the object won't require any more effort to generate the JSON.
How Can I Make A Deep Copy of a Groovy ConfigObject? I see that I can make a shallow copy of the object with .clone(), but I want to make a full deep copy.
The problem with #HappyCoder86's answer is that it assumes all config keys/values are strings. If some of your config values are objects/closures (common in Grails), that solution won't work.
The solution below may be slow, but it works if you have values of a type other than String:
static def deepcopy(ConfigObject orig) {
    ConfigObject copy = new ConfigObject()
    orig.keySet().each { key ->
        def value = orig.get(key)
        if (value instanceof ConfigObject) {
            value = deepcopy(value)
        }
        copy.put(key, value)
    }
    return copy
}
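A quick way to sanity-check the recursive copy (a hedged sketch; the config keys and the closure value are made up for illustration, and deepcopy is the method defined above):
def original = new ConfigSlurper().parse('''
    app {
        name = 'demo'
        onStart = { println 'starting' }   // a non-String value, copied by reference
    }
''')
def copied = deepcopy(original)
copied.app.name = 'changed'
assert original.app.name == 'demo'            // nested ConfigObjects are independent copies
assert copied.app.onStart instanceof Closure  // closure values are still usable after the copy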
ConfigObject config = new ConfigSlurper().parse(originalConf.toProperties())
ConfigObject clonedConfigObject = new ConfigSlurper().parse("${originalConfigObject.prettyPrint()}")
This won't work with entries having Class or Closure value types, but it will work with Lists and Maps.
I'm getting a text which contains ${somethingElse} inside, but it's just a normal String.
I've got a class:
class Whatever {
    def somethingElse = 5
    void action(String sth) {
        def test = []
        test.testing = sth
        assert test.testing == 5
    }
}
Is it possible with Groovy?
EDIT:
My scenario is: I load an XML file which contains nodes with values pointing to some other values in my application. So let's say I've got shell.setVariable("current", myClass). Now, in my XML, I want to be able to put ${current.someField} as a value.
The trouble is that the value from the XML is a String, and I can't evaluate it easily.
I can't predict how these "values" will be created by the user; I just give them the ability to use a few classes.
I cannot convert it when the XML file is loaded; it has to be "on demand", since I use it in specific cases and I want users to be able to use the values at that moment in time, not when the XML file is loaded.
Any tips?
One thing you could do is:
class Whatever {
    def somethingElse = 5

    void action( String sth ) {
        def result = new groovy.text.GStringTemplateEngine().with {
            createTemplate( sth ).make( this.properties ).toString()
        }
        assert result == "Number 5"
    }
}

// Pass it a String
new Whatever().action( 'Number ${somethingElse}' )
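If you would rather not rely on this.properties, the same engine can be fed an explicit binding map instead; a small hedged variation on the code above:
// bind an explicit map instead of the object's properties
def engine = new groovy.text.GStringTemplateEngine()
def template = engine.createTemplate('Number ${somethingElse}')
assert template.make([somethingElse: 5]).toString() == 'Number 5'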
At first, what we did was use this format in the XML:
normalText#codeToExecuteOrEvaluate#normalText
and used a replace closure to regexp-match and groovyShell.evaluate() the code.
Insane. It took a lot of time and a lot of memory.
In the end we changed the format to the original one and created scripts for each string we wanted to be able to evaluate:
Script script = shell.parse("\""+stringToParse+"\"")
where
stringToParse = "Hello world # ${System.currentTimeMillis()}!!"
Then we were able to call script.run() as many times as we wanted and everything performed well.
It actually still does.
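For completeness, a hedged sketch of that parse-once/run-many approach for the scenario in the question; the current variable and its someField entry are stand-ins for whatever objects you actually expose to the shell:
def binding = new Binding()
binding.setVariable('current', [someField: 42])        // hypothetical object exposed to the XML values
def shell = new GroovyShell(binding)
// this is just a plain String, e.g. read from the XML file
def stringFromXml = 'Field is ${current.someField}'
// wrap it in quotes so parsing turns it into a GString expression
Script script = shell.parse('"' + stringFromXml + '"')
assert script.run() == 'Field is 42'
// the parsed script can be re-run later and will pick up updated binding values
binding.setVariable('current', [someField: 'updated'])
assert script.run() == 'Field is updated'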