Smooks GroovyContentHandlerFactory Exception when upgrading from 1.4 to 1.5.1?

I have recently upgraded my Smooks application from 1.4 to 1.5.1, but I keep getting the exception below:
Error when processing EDI file org.milyn.cdr.SmooksConfigurationException: Error
invoking @Initialize method 'initialize' on class 'org.milyn.smooks.scripting.groovy.GroovyContentHandlerFactory'.
I am pretty new to Smooks and Groovy, but here is an extract of my code, which was working in version 1.4.
I also have all the 1.5.1 classes on my classpath, including the 1.5 EDI definitions I am trying to load.
Smooks smooks = null;
try {
    smooks = new Smooks();
    smooks.setReaderConfig(new UNEdifactReaderConfigurator("urn:org.milyn.edi.unedifact:d01b-mapping:*"));
    // Create an exec context - no profiles...
    ExecutionContext executionContext = smooks.createExecutionContext();
    DOMResult domResult = new DOMResult();
    // Configure the execution context to generate a report...
    executionContext.setEventListener(new HtmlReportGenerator("EDI/reports/report.html"));
    // Pass the execution context so the report listener is actually used;
    // bufferedinputstream is an InputStream opened on the EDI file.
    smooks.filterSource(executionContext, new StreamSource(bufferedinputstream), domResult);
} catch (Exception exception) {
    System.out.println("Error " + exception);
}
Extract from GroovyContentHandlerFactory:
@Initialize
public void initialize() throws IOException {
    String templateText = StreamUtils.readStreamAsString(getClass().getResourceAsStream("ScriptedGroovy.ftl"));
    classTemplate = new FreeMarkerTemplate(templateText);
}
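For reference, here is a quick classpath check I have been using to rule out a packaging problem (a minimal sketch, not a fix; the freemarker.template.Configuration class name is my assumption about the scripting module's FreeMarker dependency):
// Diagnostic sketch: both lookups should be non-null when the
// milyn-smooks-scripting jar and FreeMarker are on the runtime classpath.
System.out.println("ScriptedGroovy.ftl: " + GroovyContentHandlerFactory.class
        .getResource("ScriptedGroovy.ftl"));
System.out.println("FreeMarker: " + GroovyContentHandlerFactory.class
        .getClassLoader().getResource("freemarker/template/Configuration.class"));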
Any help or ideas would be much appreciated, as I have spent hours trying to figure this one out.
Cheers, Matt

Related

Register Java Class in Flink Cluster

I am running my fat JAR in a Flink cluster; it reads from Kafka and saves to Cassandra. The code is:
final Properties prop = getProperties();
final FlinkKafkaConsumer<String> flinkConsumer =
        new FlinkKafkaConsumer<>(kafkaTopicName, new SimpleStringSchema(), prop);
flinkConsumer.setStartFromEarliest();
final DataStream<String> stream = env.addSource(flinkConsumer);
DataStream<Person> sensorStreaming = stream.flatMap(new FlatMapFunction<String, Person>() {
    @Override
    public void flatMap(String value, Collector<Person> out) throws Exception {
        try {
            out.collect(objectMapper.readValue(value, Person.class));
        } catch (JsonProcessingException e) {
            logger.error("Json Processing Exception", e);
        }
    }
});
savePersonDetails(sensorStreaming);
env.execute();
and the Person POJO contains:
@Column(name = "event_time")
private Instant eventTime;
A codec is required on the Cassandra side to store Instant, registered as below:
final Cluster cluster = ClusterManager.getCluster(cassandraIpAddress);
cluster.getConfiguration().getCodecRegistry().register(InstantCodec.instance);
When I run it standalone it works fine, but when I run it on a local cluster it throws the error below:
Caused by: com.datastax.driver.core.exceptions.CodecNotFoundException: Codec not found for requested operation: [timestamp <-> java.time.Instant]
at com.datastax.driver.core.CodecRegistry.notFound(CodecRegistry.java:679)
at com.datastax.driver.core.CodecRegistry.createCodec(CodecRegistry.java:526)
at com.datastax.driver.core.CodecRegistry.findCodec(CodecRegistry.java:506)
at com.datastax.driver.core.CodecRegistry.access$200(CodecRegistry.java:140)
at com.datastax.driver.core.CodecRegistry$TypeCodecCacheLoader.load(CodecRegistry.java:211)
at com.datastax.driver.core.CodecRegistry$TypeCodecCacheLoader.load(CodecRegistry.java:208)
I read the document below about registering custom serializers:
https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/custom_serializers.html
but InstantCodec is a third-party one. How can I register it?
I solved the problem. The source was emitting LocalDateTime, and converting with that same type produced the error above. I changed the field type to java.util.Date and then it worked.
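For reference, a minimal sketch of the changed field (same @Column mapping as in the question); java.util.Date maps to Cassandra's timestamp type out of the box, so no codec registration is needed:
// Sketch of the fix described above: Date <-> timestamp needs no InstantCodec.
@Column(name = "event_time")
private java.util.Date eventTime;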

I am trying to convert PPT into PDF using Apache POI but getting the following error. Please help me out with this

The following code is used:
public static void main(String[] args) throws IOException {
    FileInputStream is = new FileInputStream("C:/Users/hp/Downloads/sampPPT.ppt");
    HSLFSlideShow ppt = new HSLFSlideShow(is);
    is.close();
    Dimension pgsize = ppt.getPageSize();
    int idx = 1;
    for (HSLFSlide slide : ppt.getSlides()) {
        BufferedImage img = new BufferedImage(pgsize.width, pgsize.height, BufferedImage.TYPE_INT_RGB);
        Graphics2D graphics = img.createGraphics();
        // clear the drawing area
        graphics.setPaint(Color.white);
        graphics.fill(new Rectangle2D.Float(0, 0, pgsize.width, pgsize.height));
        // render
        slide.draw(graphics);
        // save the output
        FileOutputStream out = new FileOutputStream("C:/Users/hp/Downloads/slide-" + idx + ".png");
        javax.imageio.ImageIO.write(img, "png", out);
        out.close();
        idx++;
    }
}
This throws the following exception:
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.poi.hslf.usermodel.HSLFSlideShowImpl tried to access private field org.apache.poi.POIDocument.directory (org.apache.poi.hslf.usermodel.HSLFSlideShowImpl and org.apache.poi.POIDocument are in unnamed module of loader 'app')
at org.apache.poi.hslf.usermodel.HSLFSlideShowImpl.readCurrentUserStream(HSLFSlideShowImpl.java:340)
at org.apache.poi.hslf.usermodel.HSLFSlideShowImpl.<init>(HSLFSlideShowImpl.java:154)
at org.apache.poi.hslf.usermodel.HSLFSlideShowImpl.<init>(HSLFSlideShowImpl.java:127)
at org.apache.poi.hslf.usermodel.HSLFSlideShowImpl.<init>(HSLFSlideShowImpl.java:116)
at org.apache.poi.hslf.usermodel.HSLFSlideShow.<init>(HSLFSlideShow.java:138)
at PPTConv.PPTConv.main(PPTConv.java:27)
To make an answer out of why such exceptions occur, as it may be helpful for others too:
This kind of exception occurs if you mix Apache POI jars from different versions. This is not supported. See the FAQ.
In this particular case there are probably poi-*.jar and poi-scratchpad-*.jar files from different versions on the classpath. The class org.apache.poi.hslf.usermodel.HSLFSlideShowImpl, which extends org.apache.poi.POIDocument, is contained in poi-scratchpad-*.jar, while org.apache.poi.POIDocument is contained in poi-*.jar. If those jars are from different versions, then the following can occur:
The org.apache.poi.hslf.usermodel.HSLFSlideShowImpl of poi-scratchpad-3.15.jar calls currentUser = new CurrentUserAtom(directory); in code line 340. This is possible because it extends org.apache.poi.POIDocument, which has the field protected DirectoryNode directory; in version 3.15 (poi-3.15.jar).
But the same class org.apache.poi.POIDocument of version 3.16 (poi-3.16.jar) has the field private DirectoryNode directory;. So if org.apache.poi.hslf.usermodel.HSLFSlideShowImpl of version 3.15 calls currentUser = new CurrentUserAtom(directory); in code line 340, but org.apache.poi.POIDocument is from version 3.16, then java.lang.IllegalAccessError: class org.apache.poi.hslf.usermodel.HSLFSlideShowImpl tried to access private field org.apache.poi.POIDocument.directory is thrown, because it now really does try to access a private field.
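A quick way to confirm the diagnosis (a sketch; it assumes both classes are reachable from the application classpath) is to print where each class was loaded from. Two different jar paths mean mixed versions:
// Diagnostic sketch: prints the jar each class was loaded from.
System.out.println(org.apache.poi.POIDocument.class
        .getProtectionDomain().getCodeSource().getLocation());
System.out.println(org.apache.poi.hslf.usermodel.HSLFSlideShowImpl.class
        .getProtectionDomain().getCodeSource().getLocation());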

dynamic template generation and formatting using FreeMarker

My goal is to format a collection of Java maps to a string (basically a CSV) using FreeMarker or anything else that would do this smartly. I want to generate the template from configuration data stored in a database and managed from an admin application.
The configuration will tell me at what position a given piece of data (a key in the hash map) needs to go, and also whether any script needs to run on that data before placing it at the given position. Several positions may be blank if the data is not in the map.
I am thinking of using FreeMarker to build this generic tool and would appreciate it if you could share how I should go about this.
I would also like to know if there is any built-in support in Spring Integration for building such a process, as the application is an SI application.
I am no FreeMarker expert, but a quick look at their quick start docs led me here...
public class FreemarkerTransformerPojo {
    private final Configuration configuration;
    private final Template template;

    public FreemarkerTransformerPojo(String ftl) throws Exception {
        this.configuration = new Configuration(Configuration.VERSION_2_3_23);
        this.configuration.setDirectoryForTemplateLoading(new File("/"));
        this.configuration.setDefaultEncoding("UTF-8");
        this.template = this.configuration.getTemplate(ftl);
    }

    public String transform(Map<?, ?> map) throws Exception {
        StringWriter writer = new StringWriter();
        this.template.process(map, writer);
        return writer.toString();
    }
}
and
public class FreemarkerTransformerPojoTests {
    @Test
    public void test() throws Exception {
        String template = System.getProperty("user.home") + "/Development/tmp/test.ftl";
        OutputStream os = new FileOutputStream(new File(template));
        os.write("foo=${foo}, bar=${bar}".getBytes());
        os.close();
        FreemarkerTransformerPojo transformer = new FreemarkerTransformerPojo(template);
        Map<String, String> map = new HashMap<String, String>();
        map.put("foo", "baz");
        map.put("bar", "qux");
        String result = transformer.transform(map);
        assertEquals("foo=baz, bar=qux", result);
    }
}
From a Spring Integration flow, send a message with a Map payload to
<int:transformer ... ref="fmTransformer" method="transform" />
Or you could do it with a Groovy script (or another supported scripting language) using Spring Integration's existing scripting support, without writing any code (except the script).
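On the blank-position requirement in the question: FreeMarker's ! default operator emits an empty field when a key is missing from the map, so blank CSV positions fall out naturally. A minimal sketch (the key names are hypothetical):
// Each missing key collapses to an empty CSV field instead of failing the render.
String csvRowTemplate = "${custName!\"\"},${custId!\"\"},${region!\"\"}";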

I am unable to fetch Excel data into Selenium code on Ubuntu OS

public class ReadAndWrite {
    public static void main(String[] args) throws InterruptedException, BiffException, IOException {
        System.out.println("hello");
        ReadAndWrite.login();
    }

    public static void login() throws BiffException, IOException, InterruptedException {
        WebDriver driver = new FirefoxDriver();
        driver.get("URL");
        System.out.println("hello");
        FileInputStream fi = new FileInputStream("/home/sagarpatra/Desktop/Xpath.ods");
        System.out.println("hiiiiiii");
        Workbook w = Workbook.getWorkbook(fi);
        Sheet sh = w.getSheet(1);
        // or w.getSheet(sheetNumber)
        // String variable1 = s.getCell(column, row).getContents();
        // note: rows are 0-based, so iterate while row < getRows()
        for (int row = 1; row < sh.getRows(); row++) {
            String username = sh.getCell(0, row).getContents();
            System.out.println("Username " + username);
            driver.get("URL");
            driver.findElement(By.name("Email")).sendKeys(username);
            String password = sh.getCell(1, row).getContents();
            System.out.println("Password " + password);
            driver.findElement(By.name("Passwd")).sendKeys(password);
            Thread.sleep(10000);
            driver.findElement(By.name("Login")).click();
            System.out.println("Waiting for page to load fully...");
            Thread.sleep(30000);
        }
        driver.quit();
    }
}
I don't know what is wrong with my code, or how to fix it. It outputs the following error:
Exception in thread "main" jxl.read.biff.BiffException: Unable to recognize OLE stream
at jxl.read.biff.CompoundFile.<init>(CompoundFile.java:116)
at jxl.read.biff.File.<init>(File.java:127)
at jxl.Workbook.getWorkbook(Workbook.java:221)
at jxl.Workbook.getWorkbook(Workbook.java:198)
at test.ReadTest.main(ReadTest.java:19)
I would try using Apache MetaModel instead. I have had better luck with that than with JXL. Here is an example project I wrote that reads from a .XLSX file. I use this library to run tests on a Linux Jenkins server from .XLS files generated on MS Windows.
It should also be noted that this library is perfect for making a parameterized DataProvider that queries a database over JDBC.
With JXL, you limit yourself to one data type, either .XLS or .CSV. I believe MetaModel actually uses JXL under the hood and wraps it to make it easier to use, so it would also support OpenOffice documents in the same fashion and suffer the same file compatibility issues.
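For illustration, here is a minimal MetaModel read loop (a sketch, not the example project mentioned above; it assumes the MetaModel-excel module is on the classpath, and the file, sheet, and column names are hypothetical):
import java.io.File;
import org.apache.metamodel.DataContext;
import org.apache.metamodel.DataContextFactory;
import org.apache.metamodel.data.DataSet;
import org.apache.metamodel.data.Row;

public class ExcelReadSketch {
    public static void main(String[] args) {
        DataContext ctx = DataContextFactory.createExcelDataContext(new File("Xpath.xlsx"));
        // Query the sheet like a table; the header row supplies the column names.
        DataSet rows = ctx.query().from("Sheet1").select("Email", "Passwd").execute();
        for (Row row : rows) {
            String username = (String) row.getValue(0);
            String password = (String) row.getValue(1);
            System.out.println(username + " / " + password);
        }
    }
}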

apache-poi-3.9 + creating Dropdown

I am trying to create a dropdown list in XLS using apache-poi-3.9.
Following is the code I have written:
public class TestMacroTemplate {
    /**
     * @param args
     * @throws IOException
     */
    public static void main(String args[]) throws FileNotFoundException {
        HSSFWorkbook workbook = new HSSFWorkbook();
        HSSFSheet sheet = workbook.createSheet("Data Validation");
        CellRangeAddressList addressList = new CellRangeAddressList(0, 0, 0, 0);
        DVConstraint dvConstraint = DVConstraint
                .createExplicitListConstraint(new String[] { "10", "20", "30" });
        DataValidation dataValidation = new HSSFDataValidation(addressList,
                dvConstraint);
        dataValidation.setSuppressDropDownArrow(false);
        sheet.addValidationData(dataValidation);
        FileOutputStream fileOut = new FileOutputStream("XLCellDropDown.xls");
        try {
            workbook.write(fileOut);
            fileOut.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
But it gives the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.poi.hssf.usermodel.HSSFSheet.addValidationData(Lorg/apache/poi/ss/usermodel/DataValidation;)V
at ejb.TestMacroTemplate.main(TestMacroTemplate.java:31)
And the same code works with apache-poi-3.2.
Please help me.
Thanks,
Nirav
Apache POI has a FAQ entry on this very problem. I'll quote from there, as it'll solve your problem:
My code uses some new feature, compiles fine but fails when live with a "MethodNotFoundException", "NoSuchMethodError" or "IncompatibleClassChangeError"
You almost certainly have an older version of POI on your classpath. Quite a few runtimes and other packages will ship an older version of POI, so this is an easy problem to hit without your realising.
The best way to identify the offending earlier jar file is with a few lines of java. These will load one of the core POI classes, and report where it came from.
ClassLoader classloader =
org.apache.poi.poifs.filesystem.POIFSFileSystem.class.getClassLoader();
URL res = classloader.getResource(
"org/apache/poi/poifs/filesystem/POIFSFileSystem.class");
String path = res.getPath();
System.out.println("Core POI came from " + path);
It works fine in Apache POI 3.9 and I have tested it. Just include these jars:
poi-scratchpad-3.9-20121203.jar
poi-3.9-20121203.jar
poi-examples-3.9-20121203.jar
poi-excelant-3.9-20121203.jar
poi-ooxml-3.9-20121203.jar
poi-ooxml-schemas-3.9-20121203.jar
