The content is valid XML text, but XmlSlurper().parseText(content) produced a java.lang.reflect.UndeclaredThrowableException.
Questions:
Is there a way to validate the XML and find out what the problem is?
How can I handle this exception in the catch block, or how can I get the "original" exception type?
import org.apache.commons.io.IOUtils
import org.codehaus.groovy.runtime.StackTraceUtils
import java.nio.charset.*
import groovy.json.*
import groovy.util.*
def flowFile = session.get()
if (!flowFile) return
try {
    flowFile = session.write(flowFile,
        { inputStream, outputStream ->
            def content = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
            def MyCatalog = new XmlSlurper().parseText(content)
        } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (Exception e) {
    def err = "Error in parseText() ${e}"
    log.error(err, e)
    session.transfer(flowFile, REL_FAILURE)
}
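Regarding the second question, one possible approach (a sketch only, reusing the StackTraceUtils import that is already in the script) is to unwrap the exception in the catch block to reach the original cause:
catch (Exception e) {
    // The closure coerced to StreamCallback throws a checked exception (e.g. a SAX parse error)
    // that the interface method does not declare, so the dynamic proxy wraps it in
    // UndeclaredThrowableException; walking the cause chain exposes the original exception,
    // whose message typically includes the line/column of the offending XML.
    def root = StackTraceUtils.extractRootCause(e)
    log.error("Error in parseText(): ${root.getClass().name}: ${root.message}", root)
    session.transfer(flowFile, REL_FAILURE)
}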
I have a requirement where I need to parse the data into the required format.
Input:
{
"Message" : "\nRecord 1:\nRequired data is missing. \n\nRecord 2:\nprocessing failed\n"
}
Here the content and delimiters are not fixed. The only fixed part is the \nRecord keyword, on which I am basing the script. But I am not getting the desired output using Groovy.
Desired output:
[
{
"Record 1": "nRequired data is missing"
},
{
"Record 2": "processing failed"
}
]
I have written a Groovy script for this, but I am getting an empty array.
import org.apache.commons.io.IOUtils
import groovy.json.*
import java.util.ArrayList
import java.nio.charset.*
import java.nio.charset.StandardCharsets
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder
def flowFile = session.get()
if(!flowFile) return
try {
    flowFile = session.write(flowFile,
        { inputStream, outputStream ->
            def text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
            splitted = text.split('\nRecord')
            int j = splitted.size()
            final1 = []
            for (int i = 0; i < j - 1; i++) {
                k = "Record " + splitted[i + 1]
                valid = k.replaceAll("\\n|\"|\\n|}", "")
                final1.add("{\"" + valid.replaceFirst(":", '":"') + "\"}")
            }
            def json = JsonOutput.toJson(final1)
            outputStream.write(JsonOutput.prettyPrint(json).getBytes(StandardCharsets.UTF_8))
        } as StreamCallback)
    session.transfer(flowFile, REL_SUCCESS)
} catch (Exception e) {
    log.error('Error during JSON operations', e)
    flowFile = session.putAttribute(flowFile, "error", e.getMessage())
    session.transfer(flowFile, REL_FAILURE)
}
Can you please help me with this? Thank you.
I would use a regex with a simple trick:
import groovy.json.*
def json = new JsonSlurper().parseText '{ "Message" : "\nRecord 1:\nRequired data is missing. \n\nRecord 2:\nprocessing failed\nRecord 3:\nprocessing failed badly\n" }'
String msg = json.Message.replaceAll( /\n+(Record \d+:)/, '=$1' ) // THE trick!
List res = ( msg =~ /(?m)(Record \d+):([^=]+)*/ ).collect{ _, rec, text -> [ (rec):text.trim() ] }
assert JsonOutput.toJson( res ) == '[{"Record 1":"Required data is missing."},{"Record 2":"processing failed"},{"Record 3":"processing failed badly"}]'
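To run this inside the NiFi script, a sketch of the write callback (reusing the imports and session handling from the question, not tested against a live flow) could look like this:
flowFile = session.write(flowFile, { inputStream, outputStream ->
    def text = IOUtils.toString(inputStream, StandardCharsets.UTF_8)
    def json = new JsonSlurper().parseText(text)
    // the same trick: turn the newline runs before each "Record N:" into a '=' delimiter
    String msg = json.Message.replaceAll(/\n+(Record \d+:)/, '=$1')
    List res = (msg =~ /(?m)(Record \d+):([^=]+)*/).collect { _, rec, txt -> [(rec): txt.trim()] }
    outputStream.write(JsonOutput.prettyPrint(JsonOutput.toJson(res)).getBytes(StandardCharsets.UTF_8))
} as StreamCallback)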
I have the following JSON data and I want to convert it to the expected result with the ExecuteScript NiFi processor:
{
"time": "2017-01-01T01:14:55+00:00",
"any": {
"nested": "data"
}
}
Expected result:
{
"time": 1483233295,
"any": {
"nested": "data"
}
}
I am using the following Groovy code but I am getting an error. Please help me find the solution.
var flowFile = session.get();
if (flowFile !== null) {
var StreamCallback = Java.type("org.apache.nifi.processor.io.StreamCallback");
var IOUtils = Java.type("org.apache.commons.io.IOUtils");
var StandardCharsets = Java.type("java.nio.charset.StandardCharsets");
flowFile = session.write(flowFile, new
StreamCallback(function(inputStream, outputStream) {
var inputJSON = IOUtils.toString(inputStream,StandardCharsets.UTF_8);
var contentObj = JSON.parse(inputJSON);
contentObj.time = flowFile.getAttribute("timestamp");
outputStream.write(JSON.stringify(contentObj).getBytes(StandardCharsets.UTF_8));
}));
session.transfer(flowFile, REL_SUCCESS);
}
I am getting this error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/home/jdoodle.groovy: 8: unable to resolve class StreamCallback
@ line 8, column 36.
flowFile = session.write(flowFile, new
^
1 error
Use the ExecuteGroovyScript processor (it's optimized for Groovy) with this kind of code:
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder
def flowFile = session.get()
if (!flowFile) return
flowFile.write { rawIn, rawOut ->
    def json = rawIn.withReader("UTF-8") { r -> new JsonSlurper().parse(r) }
    json.time = Date.parse("yyyy-MM-dd'T'HH:mm:ssX", json.time).getTime() / 1000
    rawOut.withWriter("UTF-8") { w -> new JsonBuilder(json).writeTo(w) }
}
REL_SUCCESS << flowFile
This code converts the time field from its date format to Unix epoch time inside the JSON content of the flowfile.
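A quick sanity check of the conversion logic (a standalone sketch using the sample values from the question):
// 2017-01-01T01:14:55+00:00 should become 1483233295 (seconds since the epoch)
assert Date.parse("yyyy-MM-dd'T'HH:mm:ssX", "2017-01-01T01:14:55+00:00").getTime().intdiv(1000) == 1483233295L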
I have a GitBlit instance on a Windows server, and I want to set a hook on the post-receive callback to start a GitLab CI pipeline on another server.
I have already set up a GitLab CI trigger that works well, but my hook doesn't. Here is the build-gitlab-ci.groovy file:
import com.gitblit.GitBlit
import com.gitblit.Keys
import com.gitblit.models.RepositoryModel
import com.gitblit.models.UserModel
import com.gitblit.utils.JGitUtils
import org.eclipse.jgit.lib.Repository
import org.eclipse.jgit.revwalk.RevCommit
import org.eclipse.jgit.transport.ReceiveCommand
import org.eclipse.jgit.transport.ReceiveCommand.Result
import org.slf4j.Logger
logger.info("Gitlab-CI hook triggered by ${user.username} for ${repository.name}")
// POST :
def sendPostRequest(urlString, paramString) {
    def url = new URL(urlString)
    def conn = url.openConnection()
    conn.setDoOutput(true)
    def writer = new OutputStreamWriter(conn.getOutputStream())
    writer.write(paramString)
    writer.flush()
    String line
    def reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))
    while ((line = reader.readLine()) != null) {
        println line
    }
    writer.close()
    reader.close()
}
sendPostRequest("https://xxxxx/api/v4/projects/1/trigger/pipeline", "token=xxxxxxxx&ref=master")
The project configuration (screenshot not included):
Moreover, I don't know where logger.info writes the log, so I don't know whether my script executed correctly. Thanks for any help.
I found my problem: it was a self-signed SSL certificate issue. I added this code to ignore it:
import com.gitblit.GitBlit
import com.gitblit.Keys
import com.gitblit.models.RepositoryModel
import com.gitblit.models.UserModel
import com.gitblit.utils.JGitUtils
import org.eclipse.jgit.lib.Repository
import org.eclipse.jgit.revwalk.RevCommit
import org.eclipse.jgit.transport.ReceiveCommand
import org.eclipse.jgit.transport.ReceiveCommand.Result
import org.slf4j.Logger
logger.info("Gitlab-CI hook triggered by ${user.username} for ${repository.name}")
def nullTrustManager = [
checkClientTrusted: { chain, authType -> },
checkServerTrusted: { chain, authType -> },
getAcceptedIssuers: { null }
]
def nullHostnameVerifier = [
verify: { hostname, session -> hostname.startsWith('yuml.me')}
]
javax.net.ssl.SSLContext sc = javax.net.ssl.SSLContext.getInstance("SSL")
sc.init(null, [nullTrustManager as javax.net.ssl.X509TrustManager] as javax.net.ssl.X509TrustManager[], null)
javax.net.ssl.HttpsURLConnection.setDefaultSSLSocketFactory(sc.getSocketFactory())
javax.net.ssl.HttpsURLConnection.setDefaultHostnameVerifier(nullHostnameVerifier as javax.net.ssl.HostnameVerifier)
def url = new URL("https://xxxx/api/v4/projects/{idProject}/trigger/pipeline")
def conn = url.openConnection()
conn.setDoOutput(true)
def writer = new OutputStreamWriter(conn.getOutputStream())
writer.write("token={token}&ref={branch}")
writer.flush()
String line
def reader = new BufferedReader(new InputStreamReader(conn.getInputStream()))
while ((line = reader.readLine()) != null) {
    println line
}
writer.close()
reader.close()
And I identified the error by checking the logs in E:\gitblit-1.7.1\logs\gitblit-stdout.{date}.log.
NB: the stdout file's date can be quite old. Gitblit doesn't create a new file per day; mine had a name dated four months earlier.
I have many internal logs, which I write myself inside the NiFi environment, and I want to put all of this data into one log flowFile, but this code throws a NullPointerException. What should I change?
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;
import org.apache.nifi.processor.FlowFileFilter;
import groovy.json.JsonSlurper;
import groovy.json.JsonBuilder;
def flowFile = session.get();
def n = 0;
if (!flowFile) return
def size = flowFile.getAttribute('fileSize');
int value = size as Integer;
log.error("g");
if ((value / 338) > 1) {
    def ffList = session.get(new FlowFileFilter() {
        public FlowFileFilterResult filter(FlowFile ff) {
            if (size == ff.getAttribute('fileSize')) {
                n++;
                return FlowFileFilterResult.ACCEPT_AND_CONTINUE;
            } else {
                return FlowFileFilterResult.REJECT_AND_CONTINUE
            }
        }
    })
    session.transfer(ffList[n-1], REL_SUCCESS);
    session.remove(ffList[0..-2])
    session.remove(flowFile);
} else {
    session.transfer(flowFile, REL_SUCCESS);
}
I want to get all flowfiles from the queue whose fileSize is greater than 831, put them into a list, then take the last flowfile from the list and transfer it to the success relationship, and finally remove all the other flowfiles. Here is my code, which throws an exception saying the transfer relationship is not specified. What should I change in this case?
import org.apache.nifi.processor.FlowFileFilter;
import groovy.json.JsonSlurper
import groovy.json.JsonBuilder
import java.nio.charset.StandardCharsets
import org.apache.commons.io.IOUtils
def flowFile = session.get()
def n = 0;
if (!flowFile) return
def size = flowFile.getAttribute('fileSize');
log.error(size.toString())
int value = size as Integer;
if ((value / 831) > 1) {
    def ffList = session.get(new FlowFileFilter() {
        public FlowFileFilterResult filter(FlowFile ff) {
            if( size == ff.getAttribute('fileSize') ) n++; return FlowFileFilterResult.ACCEPT_AND_CONTINUE
            return FlowFileFilterResult.REJECT_AND_CONTINUE
        }
    })
    session.transfer(ffList.get(n-1), REL_SUCCESS)
    //session.remove(ffList);
}
session.remove(flowFile);
If you get a flowfile from the queue, you have to do something with it (transfer or remove it).
This code returns a list of flowfiles:
def ffList = session.get(new FlowFileFilter(){...})
If you just want to remove all of them except the last one, put this code after transferring the last one:
session.remove( ffList[0..-2] )
And I think there is a mistake in this line:
if( size == ff.getAttribute('fileSize') ) n++; return FlowFileFilterResult.ACCEPT_AND_CONTINUE
The statement return FlowFileFilterResult.ACCEPT_AND_CONTINUE is executed in every case because it is not inside the if block.
I think it should be like this:
if( size == ff.getAttribute('fileSize') ){
    n++;
    return FlowFileFilterResult.ACCEPT_AND_CONTINUE
}
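Putting the pieces together, a sketch of the whole script with these fixes applied (kept close to the code from the question; not tested against a live NiFi instance) might look like this:
import org.apache.nifi.flowfile.FlowFile
import org.apache.nifi.processor.FlowFileFilter
import org.apache.nifi.processor.FlowFileFilter.FlowFileFilterResult

def flowFile = session.get()
if (!flowFile) return

def size = flowFile.getAttribute('fileSize')
int value = size as Integer

if ((value / 831) > 1) {
    // collect every queued flowfile whose fileSize attribute matches the current one
    def ffList = session.get(new FlowFileFilter() {
        public FlowFileFilterResult filter(FlowFile ff) {
            if (size == ff.getAttribute('fileSize')) {
                return FlowFileFilterResult.ACCEPT_AND_CONTINUE
            }
            return FlowFileFilterResult.REJECT_AND_CONTINUE
        }
    })
    if (ffList) {
        session.transfer(ffList[-1], REL_SUCCESS)              // keep only the last one
        if (ffList.size() > 1) session.remove(ffList[0..-2])   // drop the rest
    }
}
session.remove(flowFile)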