Codename One sound Media.play() memory leak?

My app uses some short sounds for user feedback. I use the following code:
private void playSound(String fileName) {
    try {
        FileSystemStorage fss = FileSystemStorage.getInstance();
        String sep = fss.getFileSystemSeparator() + "";
        String soundDir; // sounds must be in a directory
        if (fss.getAppHomePath().endsWith(sep)) {
            soundDir = fss.getAppHomePath() + "sounds"; // device
        } else {
            soundDir = fss.getAppHomePath() + sep + "sounds"; // simulator/windows
        }
        if (!fss.exists(soundDir)) {
            // first time a sound is played: create directory
            fss.mkdir(soundDir);
        }
        String filePath = soundDir + sep + fileName;
        if (!fss.exists(filePath)) {
            // first time this sound is played: copy from resources (place file in <project>/src)
            InputStream is = Display.getInstance().getResourceAsStream(getClass(), "/" + fileName);
            OutputStream os = fss.openOutputStream(filePath);
            com.codename1.io.Util.copy(is, os);
        }
        Media media = MediaManager.createMedia(filePath, false);
        //media.setVolume(100);
        media.play();
    } catch (IOException ex) {
        log("Error playing " + fileName + " " + ex.getMessage());
    }
}
Example call:
playSound("error.mp3");
This works fine on devices and in the simulator. However, if I run a long automated test in the simulator (on Windows), playing a sound about every second, it eats up all the RAM until Windows crashes. The Windows Task Manager, however, shows no exceptional memory usage for NetBeans or the Java process.
So my questions are: Is my code correct? Can this happen on devices too? Otherwise, is there a way to prevent this in the simulator/on Windows?
P.S.
I also tried the code from How to bundle sounds with Codename One?. That has the same problem, and some sounds also get lost (are not played).
I also tried the simple code from Codename One - Play a sound but that doesn't work.

We generally recommend keeping the Media instance for this sort of use case.
But if you can't, just make sure to call cleanup() when you're done:
MediaManager.addCompletionHandler(media, () -> media.cleanup());
media.play();
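For reference, a minimal sketch of the first suggestion (keeping the Media instance) could look like the following. It assumes the sound file was already copied to the file system as in the question's playSound(), and resolvePath() is a hypothetical helper standing in for that path-building logic:
private final java.util.Map<String, Media> soundCache = new java.util.HashMap<>();

private void playSoundCached(String fileName) {
    try {
        Media media = soundCache.get(fileName);
        if (media == null) {
            // resolvePath() is assumed to build the same filePath as in the question;
            // the Media is created once per file and reused afterwards
            media = MediaManager.createMedia(resolvePath(fileName), false);
            soundCache.put(fileName, media);
        }
        media.setTime(0); // rewind so repeated calls replay from the start
        media.play();
    } catch (IOException ex) {
        log("Error playing " + fileName + " " + ex.getMessage());
    }
}
Because each native player is allocated once and reused, nothing should accumulate between calls, which is what the leak in the question points to.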

Related

Azure Durable Function removes files from local storage after they are downloaded

I am struggling a lot with this task. I have to download files from SFTP and then parse them. I am using Durable Functions like this:
[FunctionName("MainOrch")]
public async Task<List<string>> RunOrchestrator(
[OrchestrationTrigger] IDurableOrchestrationContext context, ILogger log)
{
try
{
var filesDownloaded = new List<string>();
var filesUploaded = new List<string>();
var files = await context.CallActivityAsync<List<string>>("SFTPGetListOfFiles", null);
log.LogInformation("!!!!FilesFound*******!!!!!" + files.Count);
if (files.Count > 0)
{
foreach (var fileName in files)
{
filesDownloaded.Add(await context.CallActivityAsync<string>("SFTPDownload", fileName));
}
var parsingTasks = new List<Task<string>>(filesDownloaded.Count);
foreach (var downlaoded in filesDownloaded)
{
var parsingTask = context.CallActivityAsync<string>("PBARParsing", downlaoded);
parsingTasks.Add(parsingTask);
}
await Task.WhenAll(parsingTasks);
}
return filesDownloaded;
}
catch (Exception ex)
{
throw;
}
}
SFTPGetListOfFiles: This function connects to SFTP, gets the list of files in a folder, and returns it.
SFTPDownload: This function is supposed to connect to SFTP, download each file into the Azure Function's temp storage, and return the download path (each file is 10 to 60 MB).
[FunctionName("SFTPDownload")]
public async Task<string> SFTPDownload([ActivityTrigger] string name, ILogger log, Microsoft.Azure.WebJobs.ExecutionContext context)
{
var downloadPath = "";
try
{
using (var session = new Session())
{
try
{
session.ExecutablePath = Path.Combine(context.FunctionAppDirectory, "winscp.exe");
session.Open(GetOptions(context));
log.LogInformation("!!!!!!!!!!!!!!Connected For Download!!!!!!!!!!!!!!!");
TransferOptions transferOptions = new TransferOptions();
transferOptions.TransferMode = TransferMode.Binary;
downloadPath = Path.Combine(Path.GetTempPath(), name);
log.LogInformation("Downloading " + name);
var transferResult = session.GetFiles("/Receive/" + name, downloadPath, false, transferOptions);
log.LogInformation("Downloaded " + name);
// Throw on any error
transferResult.Check();
log.LogInformation("!!!!!!!!!!!!!!Completed Download !!!!!!!!!!!!!!!!");
}
catch (Exception ex)
{
log.LogError(ex.Message);
}
finally
{
session.Close();
}
}
}
catch (Exception ex)
{
log.LogError(ex.Message);
_traceService.TraceException(ex);
}
return downloadPath;
}
PBARParsing: This function has to get a stream of the file and process it (processing a 60 MB file might take a few minutes on an S2 scale-up with scale-out to 10 instances).
[FunctionName("PBARParsing")]
public async Task PBARParsing([ActivityTrigger] string pathOfFile,
ILogger log)
{
var theSplit = pathOfFile.Split("\\");
var name = theSplit[theSplit.Length - 1];
try
{
log.LogInformation("**********Starting" + name);
Stream stream = File.OpenRead(pathOfFile);
I want the download of all files to be completed using SFTPDownload; that's why "await" is in a loop. Then I want parsing to run in parallel.
Question 1: Does the code in the MainOrch function seem correct for doing these 3 things: 1) getting the names of the files, 2) downloading them one by one and not starting the parsing function until all files are downloaded, and then 3) parsing the files in parallel?
I observed that what I mentioned in Question 1 is working as expected.
Question 2: 30% of the files are parsed, and for the rest I see errors saying "Could not find file 'D:\local\Temp\fileName'". Is the Azure Function removing the files after I place them? Is there any other approach I can take? If I change the path to "D:\home" I might see a "File is being used by another process" error, but I haven't tried it yet. Out of the 68 files on SFTP, weirdly, the last 20 ran and the first 40 files were not found at that path, and this is in sequence.
Question 3: I also see this error: "Singleton lock renewal failed for blob 'func-eres-integration-dev/host' with error code 409: LeaseIdMismatchWithLeaseOperation. The last successful renewal completed at 2020-08-08T17:57:10.494Z (46005 milliseconds ago) with a duration of 155 milliseconds. The lease period was 15000 milliseconds." Does it tell me something? It came just once, though.
Update: after using "D:\home" I am no longer getting file-not-found errors.
For others coming across this, the temporary storage is local to an instance of the function app, which will be different when the function scales out.
For such scenarios, D:\home is a better alternative as Azure Files is mounted here, which is the same across all instances.
As for the lock renewal error observed here, this issue tracks it but shouldn't cause issues as mentioned. If you do see any issue because of this, it would be best to share details in that issue.
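As a rough sketch, the change in SFTPDownload could look like this (the "data\sftp" subfolder is illustrative, not a fixed convention; on App Service the HOME environment variable points at D:\home):
// Use the shared D:\home mount (Azure Files) instead of instance-local temp storage
var home = Environment.GetEnvironmentVariable("HOME") ?? @"D:\home"; // HOME is set by App Service
var sharedRoot = Path.Combine(home, "data", "sftp");
Directory.CreateDirectory(sharedRoot); // no-op if the directory already exists
downloadPath = Path.Combine(sharedRoot, name);
Since every instance sees the same Azure Files mount, the path returned by SFTPDownload stays valid no matter which instance PBARParsing lands on.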

P4API.net: how to use P4Callbacks delegates

I am working on a small tool to schedule p4 sync daily at specific times.
In this tool, I want to display the outputs from the P4API while it is running commands.
I can see that the P4API.net has a P4Callbacks class, with several delegates: InfoResultsDelegate, TaggedOutputDelegate, LogMessageDelegate, ErrorDelegate.
My question is: how can I use those? I could not find a single example of that online. A short code example would be amazing!
Note: I am quite a beginner and have never used delegates before.
Answering my own question with an example. I ended up figuring it out by myself; it is a simple event.
Note that this only works with P4Server. My last attempt at getting TaggedOutput from a P4.Connection was unsuccessful; the events were never triggered when running a command.
So, here is a code example:
P4Server p4Server = new P4Server(syncPath);
p4Server.TaggedOutputReceived += P4ServerTaggedOutputEvent;
p4Server.ErrorReceived += P4ServerErrorReceived;
bool syncSuccess = false;
try
{
    P4Command syncCommand = new P4Command(p4Server, "sync", true, syncPath + "\\...");
    P4CommandResult rslt = syncCommand.Run();
    syncSuccess = true;
    // Here you can read the content of the P4CommandResult,
    // but it will only be accessible when the command is finished.
}
catch (P4Exception ex) // Will be caught only when the command has completely failed
{
    Console.WriteLine("P4Command failed: " + ex.Message);
}
And the two methods; these will be triggered while the sync command is being executed:
private void P4ServerErrorReceived(uint cmdId, int severity, int errorNumber, string data)
{
    Console.WriteLine("P4ServerErrorReceived:" + data);
}

private void P4ServerTaggedOutputEvent(uint cmdId, int ObjId, TaggedObject Obj)
{
    Console.WriteLine("P4ServerTaggedOutputEvent:" + Obj["clientFile"]);
}

SharedObject in background worker AS3 AIR (Android, iOS)

I'm developing an AIR app for Android and iOS, and I'm using workers to perform a heavy task in the background (two SWFs). I want to have SharedObjects in the worker SWF (the background worker). Is this possible? Or will the data be lost?
All you need to do to write to the same SharedObject is to specify the same path for both, like:
mySO = SharedObject.getLocal("myObjectFile","/");
More info here: http://help.adobe.com/en_US/as3/dev/WS5b3ccc516d4fbf351e63e3d118a9b90204-7d80.html
Here is some short code to reassure you that data won't be lost; workers actually do the same thing as multiple Flash Players run simultaneously. Just run 2 or more SWFs and see the result:
import flash.net.SharedObject;

var iterations:int = 100;

function writeToSo():void
{
    var mySO:SharedObject = SharedObject.getLocal("myObjectFile", "/");
    if (iterations > 0)
    {
        if (!mySO.data.str) mySO.data.str = "";
        mySO.data.str += int(Math.random() * 10);
        iterations--;
    }
    txt.text = "str: " + mySO.data.str + " symbolsTotal:" + (mySO.data.str.length) + "\n";
    setTimeout(writeToSo, Math.random() * 100);
}

setTimeout(writeToSo, 2000);
Also, you need to think about how to synchronise your threads in case you want to write the data in a specific order.

CPU Utilisation 100% while using OpenOffice4

I'm trying to convert documents (.docx/.xlsx/.pptx) to PDF using JODConverter. I'm using OpenOffice 4.1.2 on CentOS 7. My problem is that I'm getting constant CPU usage of 100% while converting a file, and this is impacting the performance of the overall machine. I have tried every possible command-line option but, unfortunately, haven't been able to mitigate this issue. I have searched a lot of forums and found that a lot of other people are facing the same problem; however, there is no solution out there. Through my reading, I realize this could be because of memory leak problems in OpenOffice. Can someone please help me resolve, or at least mitigate, this?
Below is the command that I use to spawn the OpenOffice instance.
/opt/openoffice4/program/soffice.bin -accept=socket,host=127.0.0.1,port=8016;urp; -env:UserInstallation=file:///tmp/.jodconverter_socket_host-127.0.0.1_port-8016 -headless -nocrashreport -nodefault -nofirststartwizard -nolockcheck -nologo -norestore
The code I'm using to convert the files is as follows:
package org.samples.docxconverters.jodconverter.pdf;

import java.io.File;

import org.apache.commons.io.FilenameUtils;
import org.artofsolving.jodconverter.OfficeDocumentConverter;
import org.artofsolving.jodconverter.office.DefaultOfficeManagerConfiguration;
import org.artofsolving.jodconverter.office.OfficeManager;

public class Word2PdfJod {

    public static void main(String[] args) {
        // 1) Start the office process in headless mode.
        OfficeManager officeManager = null;
        try {
            officeManager = new DefaultOfficeManagerConfiguration()
                    .setOfficeHome(new File("/Applications/OpenOffice.app/Contents/")).buildOfficeManager();
            officeManager.start();
            // 2) Create the JODConverter converter
            OfficeDocumentConverter converter = new OfficeDocumentConverter(officeManager);
            // 3) Create the PDF
            createPDF(converter);
        } finally {
            // 4) Stop the headless office process.
            if (officeManager != null) {
                officeManager.stop();
            }
        }
    }

    private static void createPDF(OfficeDocumentConverter converter) {
        try {
            long start = System.currentTimeMillis();
            String src_file = "/Users/Aman/Documents/WindowsData/DocumentConversionPoc/Powerpoint2Pdf/JODConverterV3/Sample_pptx_files/AdeemSample2.pptx";
            System.out.println(src_file.substring(0, src_file.lastIndexOf(".")) + "_" + FilenameUtils.getExtension(src_file));
            // Actual conversion
            converter.convert(new File(src_file), new File(src_file.substring(0, src_file.lastIndexOf(".")) + "_"
                    + FilenameUtils.getExtension(src_file) + "_Jod.pdf"));
            System.out.println("Time Taken in conversion - " + (System.currentTimeMillis() - start) + "ms");
        } catch (Throwable e) {
            e.printStackTrace();
        }
    }
}
The relevant jars can be downloaded from:
https://drive.google.com/file/d/0B4hS5IGxGOh9OE5Ca0RlbTdVclU/view?usp=sharing
If the CPU is otherwise idle, a process will take 100% of the CPU time by default; that's normal. If this is hindering other processes (highly unlikely), you can set up priorities using nice:
nice <your command>
Or, you can install cpulimit, which makes your program sleep if it reaches a predefined CPU usage. Read about it here.
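For example, to cap the soffice process at half of one core (assuming a standard cpulimit build, where -e matches the executable name and -l is the CPU percentage):
cpulimit -e soffice.bin -l 50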
By reducing the number of cores your application can use, you can prevent the system from being locked:
Process.GetCurrentProcess().ProcessorAffinity = (System.IntPtr)2;
To set the affinity of CPUs using C#

How do I tell my C# application to close a file it has open in a FileInfo object or possibly Bitmap object?

So I was writing a quick application to sort my wallpapers neatly into folders according to aspect ratio. Everything is going smoothly until I try to actually move the files (using FileInfo.MoveTo()). The application throws an exception:
System.IO.IOException
The process cannot access the file because it is being used by another process.
The only problem is, there is no other process running on my computer that has that particular file open. I thought that because of the way I was using the file, some internal system subroutine on a different thread or something had the file open when I tried to move it. Sure enough, a few lines above that, I set a property that raises an event that opens the file for reading. I'm assuming at least some of that happens asynchronously. Is there any way to make it run synchronously? Otherwise I must change that property or rewrite much of the code.
Here are some relevant bits of code; please forgive the crappy Visual C# default names for things, this isn't really a release-quality piece of software yet:
private void button1_Click(object sender, EventArgs e)
{
    for (uint i = 0; i < filebox.Items.Count; i++)
    {
        if (!filebox.GetItemChecked((int)i)) continue;
        // This calls the selectedIndexChanged event to change the 'selectedImg' variable
        filebox.SelectedIndex = (int)i;
        if (selectedImg == null) continue;
        Size imgAspect = getImgAspect(selectedImg);
        // This is gonna be hella hardcoded for now.
        // In the future this should be changed to be generic
        // and use some kind of setting schema to determine
        // the sort/filter results
        FileInfo file = ((FileInfo)filebox.SelectedItem);
        if (imgAspect.Width == 8 && imgAspect.Height == 5)
        {
            finalOut = outPath + "\\8x5\\" + file.Name;
        }
        else if (imgAspect.Width == 5 && imgAspect.Height == 4)
        {
            finalOut = outPath + "\\5x4\\" + file.Name;
        }
        else
        {
            finalOut = outPath + "\\Other\\" + file.Name;
        }
        // Me trying to tell C# to close the file
        selectedImg.Dispose();
        previewer.Image = null;
        // This is where the exception is thrown
        file.MoveTo(finalOut);
    }
}

// The suspected event handler
private void filebox_SelectedIndexChanged(object sender, EventArgs e)
{
    FileInfo selected;
    if (filebox.SelectedIndex >= filebox.Items.Count || filebox.SelectedIndex < 0) return;
    selected = (FileInfo)filebox.Items[filebox.SelectedIndex];
    try
    {
        // The suspected line of code
        selectedImg = new Bitmap((Stream)selected.OpenRead());
    }
    catch (Exception) { selectedImg = null; }
    if (selectedImg != null)
        previewer.Image = ResizeImage(selectedImg, previewer.Size);
    else
        previewer.Image = null;
}
I have a long fix in mind (that's probably more efficient anyway), but it presents still more problems :/
Any help would be greatly appreciated.
Since you are using selectedImg as a class-scoped variable, it is keeping a lock on the file while the Bitmap is open. I would use a using statement and then Clone the Bitmap into the variable you are using; this will release the lock that the Bitmap is keeping on the file.
Something like this:
using (Bitmap img = new Bitmap((Stream)selected.OpenRead()))
{
    selectedImg = (Bitmap)img.Clone();
}
New answer:
I looked at the line where you do an OpenRead(). Clearly, this locks your file. It would be better to provide the file path instead of a stream, because you can't dispose of your stream while the Bitmap is in use; the Bitmap would become erroneous.
Another thing I see in your code that could be a bad practice is binding to FileInfo. Better to create a data-transfer object/value object and bind to a collection of that type (some object that has the properties you need to show in your control). That would help avoid file locks.
On the other hand, you can try a trick: why not show images stretched and compressed to screen resolution, so the image size is much smaller than the actual one, and provide a button called "Show in HQ"? That should solve the problem of preloading HD images. When the user clicks the "Show in HQ" button, load that image into memory, and when it is closed, it gets disposed.
Does that work for you?
If I'm not wrong, FileInfo doesn't lock any file; you're not opening it, just reading its metadata.
On the other hand, if your application shows images, you should move the visible ones to memory and load them into your form from a memory stream.
That's reasonable because you can open a file stream, read its bytes, and move them to a memory stream, releasing the lock on that file.
NOTE: This solution is fine for not-so-large images... Let me know if you're working with HD images.
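A minimal sketch of that approach, reusing the selected FileInfo from the question; a MemoryStream holds no file handle, so the file can be moved afterwards (GDI+ reads lazily, so keep the MemoryStream alive for the Bitmap's lifetime):
// Read the whole file into memory and build the Bitmap from a MemoryStream;
// no handle on the file stays open, so FileInfo.MoveTo() is not blocked.
byte[] bytes = File.ReadAllBytes(selected.FullName);
var ms = new MemoryStream(bytes);
selectedImg = new Bitmap(ms); // keep ms referenced while selectedImg is in use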
using (selectedImg = new Bitmap(selected.OpenRead()))
Will that do it?
