PrimeFaces is not immediately closing the stream of DefaultStreamedContent after read

I have the following problem:
I am displaying an image in my web app using a <p:graphicImage> from PrimeFaces.
The image is delivered by a bean as a DefaultStreamedContent. In my application I sometimes delete images displayed this way at runtime.
It always takes a little while before I can delete the image. After some debugging I used Files.delete from Java 7 and got the following exception:
The process cannot access the file because it is being used by another process.
I therefore suspect that PrimeFaces is not immediately closing the stream behind the DefaultStreamedContent after displaying it, so I cannot delete the file whenever I want.
Is there any way to tell the DefaultStreamedContent to close itself immediately after it has been read? (I already looked into the documentation and didn't find any fitting method on DefaultStreamedContent, but maybe one can tell the stream, or something like that?)

OK, I finally found out what is happening, using the Unlocker tool
(it can be downloaded here: http://www.emptyloop.com/unlocker/#download).
I saw that java.exe keeps the file locked once the image has been displayed. Therefore the stream behind the StreamedContent is NOT immediately closed after reading.
My solution was as follows:
I made a subclass of DefaultStreamedContent that reads the given input stream and "feeds" the bytes into a new InputStream. After that it closes the given stream so that the resource behind it is released again.
The class looks something like this:
import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

import org.apache.commons.io.IOUtils;
import org.primefaces.model.DefaultStreamedContent;

public class PersonalStreamedContent extends DefaultStreamedContent {

    /**
     * Copies the given input stream and closes it afterwards.
     */
    public PersonalStreamedContent(FileInputStream stream, String contentType) {
        super(copyInputStream(stream), contentType);
    }

    public static InputStream copyInputStream(InputStream stream) {
        if (stream != null) {
            try {
                // Read the whole stream into memory, then close it so the file handle is released.
                byte[] bytes = IOUtils.toByteArray(stream);
                stream.close();
                return new ByteArrayInputStream(bytes);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else {
            System.out.println("inputStream was null");
        }
        return new ByteArrayInputStream(new byte[] {});
    }
}
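For completeness, a minimal sketch of how a bean getter might use this class; the field name pathToImage and the content type are hypothetical, only for illustration:

// Hypothetical bean getter using the class above (pathToImage is a placeholder field).
public StreamedContent getImage() {
    try {
        FileInputStream in = new FileInputStream(pathToImage);
        // The constructor copies the bytes and closes the FileInputStream,
        // so the file is no longer locked and can be deleted afterwards.
        return new PersonalStreamedContent(in, "image/png");
    } catch (IOException e) {
        return new DefaultStreamedContent();
    }
}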
I am quite sure the image is retrieved two times by PrimeFaces, but the stream was only being closed the FIRST time it was loaded. I didn't realize this in the beginning.
I hope this can help some other people too :)

Related

Append huge volume of text to JavaFX TextArea from another thread (i.e. JavaFX Task)

I'm creating a simple text editor application, like MS Notepad, using JavaFX. I want it to handle large files up to a maximum of 10 MB. To do this I have created a task using the JavaFX concurrent package. The task reads the file with a BufferedReader and appends it to the text area.
My problem is that when I run the task with small files of around 8 KB to 10 KB it works perfectly, but when I increase the file size the UI starts freezing, and after reading a few lines it stops working and I have to force stop the program.
Here is the code of the task I have created:
public class ReadFile extends Task<String> {

    private TextArea writingPad;
    private File source;

    public ReadFile(TextArea writingPad, File source) {
        this.writingPad = writingPad;
        this.source = source;
    }

    @Override
    protected String call() throws Exception {
        if (source != null && Files.exists(source.toPath())) {
            if (source.isFile()) {
                if (source.canRead()) {
                    if ((source.length() / (1024 * 1024)) <= 10) {
                        try (BufferedReader reader = new BufferedReader(new FileReader(source.getAbsolutePath()))) {
                            writingPad.clear();
                            updateTitle("Reading " + source.getName() + "...");
                            int workDone = 0;
                            char[] buffer = new char[8192];
                            int read;
                            while ((read = reader.read(buffer, 0, 8192)) >= 0) {
                                writingPad.appendText(String.valueOf(buffer, 0, 8192));
                                workDone += read;
                                updateProgress(workDone, source.length());
                            }
                        } catch (IOException ignored) {
                        }
                    } else {
                        System.out.println("File is too large.");
                    }
                } else {
                    System.out.println("Can't read file.");
                }
            } else {
                System.out.println("Is a directory.");
            }
        } else {
            System.out.println("Is null");
        }
        return null;
    }
}
The above code throws exceptions like NullPointerException and IndexOutOfBoundsException. To get around them I used Platform.runLater(() -> writingPad.appendText(String.valueOf(buffer, 0, finalread))); followed by Thread.sleep(100);, but that doesn't really get me to my goal either: it solves the exceptions, but it takes far too long to read even small files, and for large files the problem is still the same. I have searched the internet but didn't find any solution to my problem.
So here is what I want:
An effective and efficient way to read a text file (max size 10 MB) and display its content in a text area.
The whole process of reading and writing should take place on another thread (i.e. a JavaFX Task), so the UI does not freeze during the process.
As my application is a text editor, no other UI component like a ListView is going to help here.
Please suggest a simple and easy solution, as I'm new to JavaFX and multithreading.
Thanks
This is an excerpt from the javadoc of the Task class:
"An implementation of Task must override the call() method. This method is invoked on the background thread. Any state which is used in this method must be safe to read and write from a background thread. For example, manipulating a live scene graph from this method is unsafe and will result in runtime exceptions."
This should make clear why your code is wrong: call() clears and appends to the TextArea, a live scene graph node, directly from the background thread.
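A minimal sketch of one safe pattern, assuming the whole file fits comfortably in memory (which a 10 MB limit allows): do all the reading inside call(), return the result, and only touch the TextArea on the FX Application Thread, for example in an onSucceeded handler. The names source and writingPad reuse those from the question; standard java.io and javafx.concurrent imports are assumed.

// Sketch: read the file on the background thread, update the TextArea only on the FX thread.
Task<String> readTask = new Task<String>() {
    @Override
    protected String call() throws Exception {
        StringBuilder sb = new StringBuilder();
        char[] buffer = new char[8192];
        int read;
        try (BufferedReader reader = new BufferedReader(new FileReader(source))) {
            while ((read = reader.read(buffer, 0, buffer.length)) >= 0) {
                sb.append(buffer, 0, read);
                // Progress in characters vs. file length in bytes; close enough for a progress bar.
                updateProgress(sb.length(), source.length());
            }
        }
        return sb.toString();
    }
};
// Runs on the FX Application Thread once call() has finished.
readTask.setOnSucceeded(e -> writingPad.setText(readTask.getValue()));
new Thread(readTask, "file-reader").start();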

Uploading file in JSF (Need correct file pathway) [duplicate]

This question already has an answer here:
How to upload file using JSF 2.2 <h:inputFile>? Where is the saved File?
(1 answer)
Closed 5 years ago.
I am trying to get my JSF site to upload a picture to the server, but am having a hard time of it. I've found 4 ways to do it, but I'd like to use h:inputFile as it seems the most direct.
It would seem I just need to supply the upload path correctly.
After adding @MultipartConfig I no longer get an exception, but I can't verify that the file is uploaded or see any error.
public void AddPicture()
{
    ConnInfo HitIt = new ConnInfo();
    try
    {
        HitIt.save(fileCelebrityToAdd);
    }
    catch (Exception ex)
    {
        //?
    }
}

@MultipartConfig(location = "C:\\local\\pathway\\Netbeans\\project\\web\\Pictures\\items\\")
public class ConnInfo
{
    private String uploadLocation;

    public ConnInfo()
    {
        //uploadLocation = ".\\Pictures\\items\\";
        uploadLocation = "C:\\local\\pathway\\Netbeans\\project\\web\\Pictures\\items\\";
    }

    public boolean TryOut(Part file) throws IOException
    {
        String monkey = uploadLocation + getFilename(file);
        try
        {
            file.write(monkey);
        }
        catch (Exception ex)
        {
            return false;
        }
        return true;
    }
}
Hopefully I've copied the necessary information correctly.
After going back and rereading all the articles I had bookmarked, it was actually from the one Tam had suggested that I was able to strip out some information.
I didn't need the AJAX or the @MultipartConfig, and my previous attempt was somehow incorrect, but the following method allowed me to successfully upload a picture where I wanted it:
public boolean SaveHer(Part file)
{
    String monkey = getFilename(file);
    try (InputStream input = file.getInputStream())
    {
        Files.copy(input, new File(uploadLocation, monkey).toPath());
    }
    catch (IOException e)
    {
        // Show faces message?
        return false;
    }
    return true;
}
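For context, a minimal sketch of the kind of backing bean such a method is usually called from; the bean name and properties here are hypothetical and only illustrate the JSF 2.2 h:inputFile-to-Part binding described in the linked duplicate:

// Hypothetical backing bean: the Part is populated by <h:inputFile value="#{uploadBean.file}"/>
// inside an <h:form enctype="multipart/form-data">.
@Named
@RequestScoped
public class UploadBean {

    private Part file;                           // bound to <h:inputFile>
    private final ConnInfo connInfo = new ConnInfo();

    public void upload() {
        if (file != null) {
            connInfo.SaveHer(file);              // delegates to the method shown above
        }
    }

    public Part getFile() { return file; }
    public void setFile(Part file) { this.file = file; }
}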

PrimeFaces graphicImage stream not closed, file locked [duplicate]

This question already has answers here:
Display dynamic image from database or remote source with p:graphicImage and StreamedContent
(4 answers)
Closed 6 years ago.
I'm using PrimeFaces to upload an image, crop it, and then display the final image in a graphicImage.
The process works fine, but the problem is that when I retrieve the final image to display in the graphicImage, the stream is not closed and the file is held open by java.exe, so I have problems deleting the files/directory, for example when the user logs out, because it's just a temp directory.
This is the getter of my StreamedContent:
public StreamedContent getGraphicCropped() {
    try {
        if (newImageName != null) {
            File file2 = new File(pathCroppedImage);
            InputStream input = new FileInputStream(file2);
            graphicCropped = new DefaultStreamedContent(input);
            showImageFinal = true;
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return graphicCropped;
}
If I call input.close(), then I'm able to delete the file, but the image is not displayed, because this getter is called more than once during the lifecycle.
I've solved it by using the getter suggested in the linked answer for a StreamedContent:
public StreamedContent getGraphicCropped() throws FileNotFoundException {
    FacesContext context = FacesContext.getCurrentInstance();
    if (context.getCurrentPhaseId() == PhaseId.RENDER_RESPONSE) {
        // So, we're rendering the HTML. Return a stub StreamedContent so that it will generate the right URL.
        return new DefaultStreamedContent();
    }
    else {
        // So, the browser is requesting the image. Return a real StreamedContent with the image bytes.
        File file2 = new File(pathCroppedImage);
        InputStream input = new FileInputStream(file2);
        showImageFinal = true;
        return new DefaultStreamedContent(input);
    }
}
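If the underlying goal is to be able to delete the temp file right away, this pattern can be combined with the byte-copying idea from the first question above: in the browser-request branch, buffer the file into memory so the handle is released immediately. A rough sketch, assuming the cropped image is small enough to hold in memory and that Apache Commons IO is available:

// Sketch: buffer the bytes so no file handle stays open and the temp file can be deleted at once.
public StreamedContent getGraphicCropped() throws IOException {
    FacesContext context = FacesContext.getCurrentInstance();
    if (context.getCurrentPhaseId() == PhaseId.RENDER_RESPONSE) {
        return new DefaultStreamedContent();
    }
    File file2 = new File(pathCroppedImage);
    byte[] bytes;
    try (InputStream input = new FileInputStream(file2)) {
        bytes = IOUtils.toByteArray(input);   // Apache Commons IO, as in the first answer above
    }
    showImageFinal = true;
    return new DefaultStreamedContent(new ByteArrayInputStream(bytes));
}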

Saxon XSLT transform to PDF slow on server but fast locally

I am facing a performance issue using Saxon and Apache FOP to transform XML to PDF. The PDF we use to test has 85 pages and is around 320 KB. On the server it spends almost 2 minutes in the transform method call, while locally it takes less than 5 seconds.
We monitored CPU usage and GC during that method call and found that on the server CPU usage stays steady at 5%, and we do not have any CPU limitation on the server side. GC happens every 1 to 2 seconds, but these are all minor GCs and each one takes only 10 to 50 ms. We also monitored I/O wait during the test and it stayed very low.
The libraries we are using are Saxon 9.1 and Apache FOP 2.1 (we tested with different Saxon and FOP versions but the issue remains).
The XML and XSL files are too large to post, but below is the sample code for the transformation:
public static TransformerFactory transformerFactory;
public static Transformer xlsProcessor;
public static FopFactory fopFactory; // referenced below; assumed to be a static field as well

public static byte[] generatePDF(InputStream xmlData, String xslFile)
        throws TransformerException, IOException {
    byte[] fileArray = null;
    InputStream xsltfile = null;
    ByteArrayOutputStream outStream = null;
    try {
        xsltfile = XmlToPdfGenerator.class.getClassLoader().getResourceAsStream(xslFile);
        StreamSource source = new StreamSource(xmlData);
        StreamSource transformSource = new StreamSource(xsltfile);
        if (null == fopFactory) {
            File xconf = new File(XmlToPdfGenerator.class.getClassLoader().getResource("a xconf file").getFile());
            fopFactory = FopFactory.newInstance(xconf);
        }
        FOUserAgent foUserAgent = fopFactory.newFOUserAgent();
        outStream = new ByteArrayOutputStream();
        Transformer xslfoTransformer = getTransformer(transformSource);
        if (xslfoTransformer != null) {
            Fop fop;
            try {
                fop = fopFactory.newFop(MimeConstants.MIME_PDF, foUserAgent, outStream);
                Result res = new SAXResult(fop.getDefaultHandler());
                try {
                    xslfoTransformer.transform(source, res);
                    fileArray = outStream.toByteArray();
                } catch (TransformerException e) {
                    // some error handling logic omitted
                } catch (Exception e) {
                    // some error handling logic omitted
                }
            } catch (FOPException e) {
                // some error handling logic omitted
            }
        }
    } catch (TransformerFactoryConfigurationError e) {
        // some error handling logic omitted
    } catch (Exception e) {
        // some error handling logic omitted
    } finally {
        if (null != xsltfile) {
            xsltfile.close();
        }
        if (null != outStream) {
            outStream.close();
        }
    }
    return fileArray;
}

private static Transformer getTransformer(StreamSource streamSource) {
    if (null == transformerFactory) {
        transformerFactory = new net.sf.saxon.TransformerFactoryImpl();
    }
    try {
        if (xlsProcessor == null) {
            xlsProcessor = transformerFactory.newTransformer(streamSource);
        }
        return xlsProcessor;
    } catch (TransformerConfigurationException e) {
        // some error handling logic
    }
    return null;
}
I doubt there is any code issue causing this, as it works normally locally.
Any thoughts on this would be greatly appreciated!
Clearly you haven't provided enough information to diagnose the problem, so all we can do is offer advice on how to drill down further and get some diagnostic data. It's going to be a lot easier to help if you move to the current version (9.7), and it might even solve the problem.
Check whether the transformation is making any HTTP requests to the W3C server (or elsewhere), for example to fetch common DTDs. W3C deliberately throttles these requests. Recent releases of Saxon intercept them and use a local copy of the file shipped within the Saxon software, but you are using a very old version. There are various tools you can use to monitor HTTP traffic.
Run the transformation on its own, without any Apache FOP processing, and see how the figures compare. You need to determine whether the problem is during XSLT processing or XSL-FO processing, and the best way to do that is to run one without the other (see the sketch below).
Check whether you get the same performance issues when you run the transformation on its own from the command line.
Check the Saxon execution profile obtained using -TP:profile.html, and see how the results compare on the two machines.
Check the Java profiling data (e.g. using hprof), and see how it compares on the two machines. Any major differences provide a clue for further investigation.
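To separate the two phases as suggested above, one rough sketch (same imports as the code in the question; the method name and output handling are placeholders) is to run the stylesheet to a plain XSL-FO result and time it without FOP involved:

// Sketch: time the XSLT phase alone by writing the XSL-FO output to a byte array.
// The FOP phase can then be timed separately on the resulting FO if needed.
public static void timeXsltOnly(InputStream xmlData, String xslFile) throws Exception {
    StreamSource xml = new StreamSource(xmlData);
    StreamSource xslt = new StreamSource(
            XmlToPdfGenerator.class.getClassLoader().getResourceAsStream(xslFile));

    Transformer transformer = new net.sf.saxon.TransformerFactoryImpl().newTransformer(xslt);

    ByteArrayOutputStream foOut = new ByteArrayOutputStream();
    long start = System.currentTimeMillis();
    transformer.transform(xml, new StreamResult(foOut));   // XSLT only, no FOP
    long elapsed = System.currentTimeMillis() - start;

    System.out.println("XSLT phase took " + elapsed + " ms, produced " + foOut.size() + " bytes of FO");
}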

Problem with J2ME RecordStore update, delete operation

I create a List showing data from a RecordStore. I tried to update a record and then re-display the list (re-opening the same RecordStore), but the updated item doesn't change (it still contains the old data).
I also tried to delete an item, and the deleted item is still displayed in the list.
I run the program using the emulator from NetBeans 7.0 with Java ME SDK 3.0.
This is the code for updating the record:
public void updateClient(Client cl) throws Exception {
    RecordStore rs = RecordStore.openRecordStore(String.valueOf(clientsStoreKey), true);
    int recNum = rs.getNumRecords();
    if (recNum > 0) {
        RecordEnumeration renum = rs.enumerateRecords(null, null, false);
        while (renum.hasNextElement()) {
            int id = renum.nextRecordId();
            byte[] buff = rs.getRecord(id);
            Client temp = Client.createFrom(buff);
            if (temp.clientId.compareTo(cl.clientId) == 0) {
                temp.firstName = cl.firstName;
                temp.lastName = cl.lastName;
                temp.city = cl.city;
                temp.state = cl.state;
                temp.company = cl.company;
                temp.phone = cl.phone;
                ByteArrayOutputStream bos = new ByteArrayOutputStream();
                DataOutputStream dos = new DataOutputStream(bos);
                temp.writeTo(dos);
                byte[] sData = bos.toByteArray();
                rs.setRecord(id, sData, 0, sData.length);
                dos.close();
                bos.close();
                break;
            }
        }
        renum.destroy();
    }
    rs.closeRecordStore();
}
And this is the code to get the records:
public Vector getClients() throws Exception {
    RecordStore rs = RecordStore.openRecordStore(String.valueOf(clientsStoreKey), true);
    int recNum = rs.getNumRecords();
    Vector cls = new Vector();
    if (recNum > 0) {
        RecordEnumeration renum = rs.enumerateRecords(null, null, false);
        while (renum.hasNextElement()) {
            byte[] buff = renum.nextRecord();
            Client cl = Client.createFrom(buff);
            cls.addElement(cl);
        }
        renum.destroy();
    }
    rs.closeRecordStore();
    return cls;
}
Interesting - your code dealing with the record store looks rather OK to me. Is there a chance of some glitch in the UI, like using an old or incorrectly updated screen object?
How do you debug your application? Since you mention the emulator, System.out.println looks like a natural choice, doesn't it? I'd use it to output the content of the record right after setting it in updateClient and right after getting it in getClients.
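As a concrete illustration of that suggestion, a small hypothetical debug helper you could call right after rs.setRecord(...) in updateClient and adapt for getClients; it only assumes the Client.createFrom and firstName members shown in the code above:

// Hypothetical debug helper: re-reads a record and prints one field of it,
// so the emulator console shows whether the store really changed.
private void dumpRecord(RecordStore rs, int id) {
    try {
        Client stored = Client.createFrom(rs.getRecord(id));
        System.out.println("record " + id + " now holds firstName=" + stored.firstName);
    } catch (Exception e) {
        System.out.println("could not read record " + id + ": " + e);
    }
}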
