Getting Empty Array Using Groovy Script in Nifi - groovy

I have a requirement where I need to parse the data into the required format.
Input:
{
"Message" : "\nRecord 1:\nRequired data is missing. \n\nRecord 2:\nprocessing failed\n"
}
Here the content and delimiters are not fixed. The only fixed part is the \nRecord keyword, on which I am basing the script, but I am not getting the desired output using Groovy.
Desired output:
[
{
"Record 1": "Required data is missing"
},
{
"Record 2": "processing failed"
}
]
I have written the following Groovy script, but I am getting an empty array.
import org.apache.commons.io.IOUtils
import java.nio.charset.StandardCharsets
import groovy.json.*

def flowFile = session.get()
if (!flowFile) return
try {
    flowFile = session.write(flowFile, { inputStream, outputStream ->
        def text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        def splitted = text.split('\nRecord')
        int j = splitted.size()
        def final1 = []
        for (int i = 0; i < j - 1; i++) {
            def k = "Record " + splitted[i + 1]
            def valid = k.replaceAll("\\n|\"|}", "")
            final1.add("{\"" + valid.replaceFirst(":", '":"') + "\"}")
        }
        def json = JsonOutput.toJson(final1)
        outputStream.write(JsonOutput.prettyPrint(json).getBytes(StandardCharsets.UTF_8))
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (Exception e) {
    log.error('Error during JSON operations', e)
    flowFile = session.putAttribute(flowFile, "error", e.getMessage())
    session.transfer(flowFile, REL_FAILURE)
}
Can you please help me with this?
Thank you.

I would use a regex with a simple trick:
import groovy.json.*
def json = new JsonSlurper().parseText '{ "Message" : "\nRecord 1:\nRequired data is missing. \n\nRecord 2:\nprocessing failed\nRecord 3:\nprocessing failed badly\n" }'
String msg = json.Message.replaceAll( /\n+(Record \d+:)/, '=$1' ) // THE trick!
List res = ( msg =~ /(?m)(Record \d+):([^=]+)*/ ).collect{ _, rec, text -> [ (rec):text.trim() ] }
assert JsonOutput.toJson( res ) == '[{"Record 1":"Required data is missing."},{"Record 2":"processing failed"},{"Record 3":"processing failed badly"}]'
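For readers who want to see the mechanics of the trick outside Groovy, here is a standalone sketch in plain Java (class and method names are mine, purely for illustration): the delimiter run \n+Record N: is first rewritten to use a character (=) that cannot occur in a record body, so a simple [^=]+ capture can then grab each record's text.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RecordSplit {
    // Splits "\nRecord 1:\n...\n\nRecord 2:\n..." into {label, text} pairs.
    static List<String[]> parse(String msg) {
        // Step 1: collapse every "\n...Record N:" boundary into "=Record N:".
        String collapsed = msg.replaceAll("\\n+(Record \\d+:)", "=$1");
        // Step 2: capture each label and everything up to the next '='.
        Matcher m = Pattern.compile("(Record \\d+):([^=]+)").matcher(collapsed);
        List<String[]> out = new ArrayList<>();
        while (m.find()) {
            out.add(new String[]{ m.group(1), m.group(2).trim() });
        }
        return out;
    }
}
```

From these pairs it is straightforward to build the [{"Record 1": "..."}, ...] array shown in the desired output.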

Related

NiFi ExecuteScript Groovy: java.lang.reflect.UndeclaredThrowableException in XmlSlurper().parseText()

The content is valid XML text, but XmlSlurper().parseText(content) throws a java.lang.reflect.UndeclaredThrowableException.
Questions:
Is there a way to validate the XML and find out what the problem is?
How can I handle this exception in the catch block, or how can I get the "original" exception type?
import org.apache.commons.io.IOUtils
import org.codehaus.groovy.runtime.StackTraceUtils
import java.nio.charset.StandardCharsets
import groovy.json.*

def flowFile = session.get()
if (!flowFile) return
try {
    flowFile = session.write(flowFile, { inputStream, outputStream ->
        def content = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
        def MyCatalog = new XmlSlurper().parseText(content)
    } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (Exception e) {
    def err = "Error in parseText() ${e}"
    log.error(err, e)
    session.transfer(flowFile, REL_FAILURE)
}
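UndeclaredThrowableException is only a reflection-layer wrapper: the real parser error travels along the cause chain, so getCause() (or StackTraceUtils.extractRootCause(e), which the script already imports) will surface it. A minimal plain-Java sketch of walking to the root cause (the parse error message below is invented for illustration):

```java
import java.lang.reflect.UndeclaredThrowableException;

public class RootCause {
    // Follows getCause() links until the innermost throwable is reached.
    static Throwable rootCause(Throwable t) {
        while (t.getCause() != null && t.getCause() != t) {
            t = t.getCause();
        }
        return t;
    }

    public static void main(String[] args) {
        // Simulate what happens in the script engine: the real parse error
        // gets wrapped because the callback does not declare it.
        Throwable original = new org.xml.sax.SAXParseException("example parse error", null);
        Throwable wrapped = new UndeclaredThrowableException(original);
        System.out.println(rootCause(wrapped).getClass().getSimpleName());
    }
}
```

In the Groovy catch block, logging StackTraceUtils.extractRootCause(e) instead of e shows the original exception type; to validate the XML itself, parsing the same content with XmlSlurper in a plain Groovy shell outside NiFi usually reveals the exact line and column.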

json to json conversion by ExecuteScript processor

I have the following JSON data and I want to convert it to the expected result with the ExecuteScript NiFi processor:
{
"time": "2017-01-01T01:14:55+00:00",
"any": {
"nested": "data"
}
}
Expected result:
{
"time": 1483233295,
"any": {
"nested": "data"
}
}
I am using the following code, but I am getting an error. Please help me find the solution.
var flowFile = session.get();
if (flowFile !== null) {
    var StreamCallback = Java.type("org.apache.nifi.processor.io.StreamCallback");
    var IOUtils = Java.type("org.apache.commons.io.IOUtils");
    var StandardCharsets = Java.type("java.nio.charset.StandardCharsets");
    flowFile = session.write(flowFile, new StreamCallback(function (inputStream, outputStream) {
        var inputJSON = IOUtils.toString(inputStream, StandardCharsets.UTF_8);
        var contentObj = JSON.parse(inputJSON);
        contentObj.time = flowFile.getAttribute("timestamp");
        outputStream.write(JSON.stringify(contentObj).getBytes(StandardCharsets.UTF_8));
    }));
    session.transfer(flowFile, REL_SUCCESS);
}
I am getting this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/home/jdoodle.groovy: 8: unable to resolve class StreamCallback
@ line 8, column 36.
flowFile = session.write(flowFile, new
^
1 error
Use the ExecuteGroovyScript processor (it's optimized for Groovy) with this kind of code:
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder

def flowFile = session.get()
if (!flowFile) return
flowFile.write{ rawIn, rawOut ->
    def json = rawIn.withReader("UTF-8"){ r -> new JsonSlurper().parse(r) }
    json.time = Date.parse("yyyy-MM-dd'T'HH:mm:ssX", json.time).getTime() / 1000
    rawOut.withWriter("UTF-8"){ w -> new JsonBuilder(json).writeTo(w) }
}
REL_SUCCESS << flowFile
This code converts the time field from a date format to Unix epoch time inside the JSON content of the flow file.
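The conversion itself can be checked standalone; here is a plain-Java sketch using java.time, which parses the +00:00 style offset directly (the class and method names are mine):

```java
import java.time.OffsetDateTime;

public class TimeToEpoch {
    // Converts an ISO-8601 timestamp with offset to Unix epoch seconds.
    static long toEpochSeconds(String iso) {
        return OffsetDateTime.parse(iso).toEpochSecond();
    }
}
```

For the sample input, toEpochSeconds("2017-01-01T01:14:55+00:00") yields 1483233295, matching the expected result.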

NiFi processor to search and replace multiple words in a txt file

I have a txt file which I am reading using FetchSFTP in NiFi. I also have keys and values in JSON format, shown below, received after a REST call and JoltTransformJSON:
[{
"Key": "k2s2e2",
"Value": "Ottawa"
}, {
"Key": "60601",
"Value": "Chicago"
}, {
"Key": "",
"Value": "London"
}]
How can I replace all occurrences of each matching key from above with its value in the txt file?
Example: abc.txt
000 apple stocks at k2s2e2 888
9000 samsung stocks at 60601 9990377
88 nokia devices at 78889 790888071 hgj 7
Output:
000 apple stocks at Ottawa 888
9000 samsung stocks at Chicago 9990377
88 nokia devices at 78889 790888071 hgj 7
My attempt using ExecuteGroovyScript:
import static groovy.json.JsonOutput.toJson
import java.nio.charset.StandardCharsets
import groovy.json.JsonBuilder
import groovy.json.JsonSlurper

class KeyValue {
    String key
    String value
}

private findAndReplace(KeyValueList) {
    def response
    KeyValueList.each {
        def srcExp = it.key
        def replaceText = it.value
        def inputFilepath = "C:\\Project\\abc.txt"
        def outputFilepath = "C:\\Project\\abc_output.txt"
        new File(outputFilepath).withWriter { w ->
            new File(inputFilepath).eachLine { line ->
                w << line.replaceAll(srcExp, replaceText) << '\n'
            }
            response = w
        }
        new File(inputFilepath).text = new File(outputFilepath).text
    }
    return response
}

def flowFile = session.get()
if (!flowFile) return
def KeyValueList = []
//try {
def is = flowFile.read().withReader("UTF-8") { new JsonSlurper().parse(it) }
is.each {
    if (it.Key != "") {
        KeyValue keyValue = new KeyValue(key: it.Key, value: it.Value)
        KeyValueList.add(keyValue)
    }
}
def retval = findAndReplace(KeyValueList)
flowFile = session.write(flowFile, { outputStream ->
    outputStream.write(retval.toString().getBytes(StandardCharsets.UTF_8))
} as OutputStreamCallback)
session.transfer(flowFile, REL_SUCCESS)
//} catch (Exception e) {
//    log.info(e.getMessage())
//    REL_FAILURE << flowFile
//}
This is not a full answer to your question, just an attempt to fix your code.
If I understand correctly, you are trying to:
read JSON from the flow file
read text from some file path
write the text, with replacements applied, back to the flow file
Code for the ExecuteGroovyScript processor:
import groovy.json.JsonSlurper

def ff = session.get()
if (!ff) return
ff.write{ rawIn, rawOut ->
    def keyValueList = rawIn.withReader("UTF-8"){ new JsonSlurper().parse(it) }
    new File('c:/Project/abc.txt').withReader("UTF-8"){ reader ->
        rawOut.withWriter("UTF-8"){ writer ->
            reader.eachLine{ line ->
                keyValueList.each{ if (it.Key) line = line.replaceAll(it.Key, it.Value) }
                writer << line << '\n'
            }
        }
    }
}
REL_SUCCESS << ff
I don't have time to test it, though...
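One caveat with the scripts above: String.replaceAll treats the key as a regular expression, so a key containing a metacharacter (., $, (, and so on) would misbehave. If keys should match literally, quote them; a plain-Java sketch of the idea (class and method names are mine):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ReplaceKeys {
    // Replaces every literal occurrence of each non-empty key with its value.
    // Pattern.quote / Matcher.quoteReplacement stop regex metacharacters in
    // keys or values from being misinterpreted.
    static String replaceAllKeys(String line, Map<String, String> keyValues) {
        for (Map.Entry<String, String> e : keyValues.entrySet()) {
            if (e.getKey().isEmpty()) continue; // skip blank keys like "" -> "London"
            line = line.replaceAll(Pattern.quote(e.getKey()),
                                   Matcher.quoteReplacement(e.getValue()));
        }
        return line;
    }
}
```

The equivalent change in the Groovy answer would be line.replaceAll(Pattern.quote(it.Key), Matcher.quoteReplacement(it.Value)).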

Download a zip file using Groovy

I need to download a zip file from a url using groovy.
Test url: https://gist.github.com/daicham/5ac8461b8b49385244aa0977638c3420/archive/17a929502e6dda24d0ecfd5bb816c78a2bd5a088.zip
What I've done so far:
def static downloadArtifacts(url, filename) {
    new URL(url).openConnection().with { conn ->
        conn.setRequestProperty("PRIVATE-TOKEN", "xxxx")
        url = conn.getHeaderField("Location")
        if (!url) {
            new File((String) filename).withOutputStream { out ->
                conn.inputStream.with { inp ->
                    out << inp
                    inp.close()
                }
            }
        }
    }
}
But while opening the downloaded zip file I get an error "An error occurred while loading the archive".
Any help is appreciated.
URL url2download = new URL(url)
File file = new File(filename)
file.bytes = url2download.bytes
You can do it with HttpBuilder-NG:
// https://http-builder-ng.github.io/http-builder-ng/
@Grab('io.github.http-builder-ng:http-builder-ng-core:1.0.3')
import groovyx.net.http.HttpBuilder
import groovyx.net.http.optional.Download
def target = 'https://gist.github.com/daicham/5ac8461b8b49385244aa0977638c3420/archive/17a929502e6dda24d0ecfd5bb816c78a2bd5a088.zip'
File file = HttpBuilder.configure {
    request.uri = target
}.get {
    Download.toFile(delegate, new File('a.zip'))
}
You can also generate and serve a zip yourself (here from a Grails controller):
import java.util.zip.ZipEntry
import java.util.zip.ZipOutputStream

class SampleZipController {
    def index() { }

    def downloadSampleZip() {
        response.setContentType('APPLICATION/OCTET-STREAM')
        response.setHeader('Content-Disposition', 'Attachment;Filename="example.zip"')
        ZipOutputStream zip = new ZipOutputStream(response.outputStream)
        def file1Entry = new ZipEntry('first_file.txt')
        zip.putNextEntry(file1Entry)
        zip.write("This is the content of the first file".bytes)
        def file2Entry = new ZipEntry('second_file.txt')
        zip.putNextEntry(file2Entry)
        zip.write("This is the content of the second file".bytes)
        zip.close()
    }
}
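Back to the original download problem: "An error occurred while loading the archive" usually means the saved bytes are not a zip at all, for example an HTML redirect page captured instead of the archive. A quick sanity check, sketched in plain Java with only JDK classes (the sample zip is built in memory so the check can be demonstrated offline):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ZipCheck {
    // True if the bytes open as a zip stream with at least one entry.
    static boolean looksLikeZip(byte[] data) {
        try (ZipInputStream zin = new ZipInputStream(new ByteArrayInputStream(data))) {
            return zin.getNextEntry() != null;
        } catch (IOException e) {
            return false; // bad signature or truncated stream
        }
    }

    // Builds a tiny valid zip in memory for demonstration purposes.
    static byte[] sampleZip() {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zip = new ZipOutputStream(bos)) {
            zip.putNextEntry(new ZipEntry("first_file.txt"));
            zip.write("hello".getBytes());
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return bos.toByteArray();
    }
}
```

Running the downloaded bytes through a check like this before writing the file makes the "silently saved an error page" failure mode visible immediately.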

Nifi: how to convert multiple log file in one file?

I have many internal logs, which I write myself inside the NiFi environment. I want to merge all this data into one log flow file, but this code throws a NullPointerException. What should I change?
import java.nio.charset.StandardCharsets
import org.apache.commons.io.IOUtils
import org.apache.nifi.processor.FlowFileFilter
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder

def flowFile = session.get()
def n = 0
if (!flowFile) return
def size = flowFile.getAttribute('fileSize')
int value = size as Integer
log.error("g")
if ((value / 338) > 1) {
    def ffList = session.get(new FlowFileFilter() {
        public FlowFileFilterResult filter(FlowFile ff) {
            if (size == ff.getAttribute('fileSize')) {
                n++
                return FlowFileFilterResult.ACCEPT_AND_CONTINUE
            } else {
                return FlowFileFilterResult.REJECT_AND_CONTINUE
            }
        }
    })
    session.transfer(ffList[n - 1], REL_SUCCESS)
    session.remove(ffList[0..-2])
    session.remove(flowFile)
} else {
    session.transfer(flowFile, REL_SUCCESS)
}
