I am trying to get to the bottom of an issue with MediaPlayer within an Android project. The issue is that the live stream stops after exactly 22 minutes, consistently, with the following displayed:
2021-09-15 19:00:59.600 com.projectname W/MediaHTTPConnection: readAt 20709376 / 32768 => java.net.ProtocolException
2021-09-15 19:00:59.600 com.projectname W/MediaHTTPConnection: readAt 20709209 / 32768 => java.net.ProtocolException
2021-08-30 19:45:30.559 com.projectname E/MediaPlayerNative: error (1, -1010)
2021-08-30 19:45:30.560 com.projectname E/MediaPlayerNative: error (1, -1010)
2021-08-30 19:45:30.560 com.projectname E/MediaPlayerNative: error (1, -1010)
2021-08-30 19:45:30.560 com.projectname E/MediaPlayer: Error (1,-1010)
2021-08-30 19:45:30.561 com.projectname E/MediaPlayer: Error (1,-1010)
I have discovered from this page http://androidxref.com/4.0.4/xref/frameworks/base/include/media/stagefright/MediaErrors.h#37 that -1010 means 'Unsupported Media', but the app plays the stream solidly for 22 minutes before the audio stops and the log displays the above (the app does not crash; just the audio stops).
This is the code that starts the live stream playing:
@Override
public void setStreamSource(StreamSource streamSource) {
    try {
        if (!mediaPlayer.isPlaying()) {
            // Point the player at the live stream URL
            mediaPlayer.setDataSource(streamSource.getAudioUrl());
            // Prepare the media player (blocks until ready)
            mediaPlayer.prepare();
            // Start playing audio from the http url
            mediaPlayer.start();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
I have tried changing mediaPlayer.prepare(); to mediaPlayer.prepareAsync();, but then the stream doesn't start at all.
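My understanding is that prepareAsync() only begins preparation, so start() has to be deferred to an OnPreparedListener. This is roughly what my async attempt looked like (a sketch, assuming the same mediaPlayer field as above):
mediaPlayer.setDataSource(streamSource.getAudioUrl());
mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
    @Override
    public void onPrepared(MediaPlayer mp) {
        // start() has to wait until preparation has finished
        mp.start();
    }
});
mediaPlayer.prepareAsync();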
I'm not aware of anything within the app that happens at the 22-minute mark, so my thoughts turn to something like the live stream running out of buffered audio and stopping.
Has anyone seen anything similar and have an idea of what the problem could be?
Does anyone know how to specify a buffer size? (I'm not familiar with how to do that.)
I guess alternatively I could detect when the error happens and restart playback, but that doesn't sound like the best solution.
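If I do go that route, I imagine it would look something like this sketch (MediaPlayer.MEDIA_ERROR_UNSUPPORTED is the -1010 constant; the restart logic is a placeholder):
// Hypothetical fallback: restart playback when error (1, -1010) fires
mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        if (extra == MediaPlayer.MEDIA_ERROR_UNSUPPORTED) { // -1010
            mp.reset();
            // re-set the data source and prepare/start again here
        }
        return true; // true = handled; OnCompletionListener will not fire
    }
});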
Thanks in advance for any assistance.
First time posting here, so sorry if the question is too vague or missing info. Anyway:
I'm trying to send OSC (Open Sound Control) or MQTT messages when I hit specific notes on my MIDI controller. When I try to send an OSC message on a MIDI input, I get unpredictably high latency, varying from about 200-300 milliseconds up to 10 seconds. Here is my test code:
var easymidi = require('easymidi');
var osc = require("osc");

// Find the MIDI input device by name
var inputMidi;
easymidi.getInputs().forEach((midiDevice) => {
    if (midiDevice.includes('Maschine')) {
        inputMidi = midiDevice;
        console.log(`Midi device input found: ${inputMidi}`);
    }
});

var inputMikro = new easymidi.Input(inputMidi);

var rmeOSC = new osc.UDPPort({
    remoteAddress: "192.168.10.148",
    remotePort: 9001
});
rmeOSC.open();

inputMikro.on('cc', function (msg) {
    if (msg.controller == 7) {
        // This is instant
        console.log(msg);
        // This is what becomes unpredictable and slow
        rmeOSC.send({
            address: "/1/mastervolume",
            args: [
                {
                    type: "f",
                    value: msg.value / 127
                }
            ]
        });
    }
});
I've tried to zero in on what makes it slow. When just logging to the console inside the inputMikro.on('cc', ...) event listener, it's instant. If I send that MIDI data to another MIDI device in that same event listener (also using the easymidi library), it's also instant. But sending OSC or MQTT messages there creates that unpredictable latency. I've also tried setting an interval every 20 ms to send OSC, just to make sure it's not a limitation of the OSC library in itself:
var vol = 0;
var sendOsc = () => {
    vol++;
    if (vol == 127) {
        vol = 0;
    }
    rmeOSC.send({
        address: "/1/mastervolume",
        args: [
            {
                type: "f",
                value: vol / 127
            }
        ]
    });
};
setInterval(sendOsc, 20);
That works great, and the same goes for MQTT. I also tried creating a separate volume variable with an event listener that listens for changes to that variable, and updating that variable inside the inputMikro.on('cc', ...) listener. Same result there: if I just log to the console whenever that variable changes, it's instant, but if I try to send OSC or MQTT messages when the variable changes, I get that latency.
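Roughly what that variant looked like (a sketch from memory, assuming Node's built-in EventEmitter for the change notification):
var EventEmitter = require('events').EventEmitter;
var volEvents = new EventEmitter();

// Forward controller 7 values as 'volume' events
inputMikro.on('cc', function (msg) {
    if (msg.controller == 7) {
        volEvents.emit('volume', msg.value / 127);
    }
});

volEvents.on('volume', function (v) {
    // Logging here is instant; sending OSC here shows the same latency
    rmeOSC.send({
        address: "/1/mastervolume",
        args: [{ type: "f", value: v }]
    });
});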
I'm out of ideas and can't tell what's going on. I'd very much appreciate any insight into how I can fix this. I hope my question is clear enough.
I have a client application that receives video stream from a server via UDP or TCP socket.
Originally, when it was written for .NET 2.0, the code used BeginReceive/EndReceive and IAsyncResult.
The client displays each video in its own window and also uses its own thread for communicating with the server.
However, since the client is supposed to stay up for a long period of time, and there might be 64 video streams simultaneously, there is a "memory leak" of IAsyncResult objects that are allocated each time the data-receive callback is called.
This eventually causes the application to run out of memory, because the GC can't release the blocks in time. I verified this using the VS 2010 Performance Analyzer.
So I modified the code to use SocketAsyncEventArgs and ReceiveFromAsync (UDP case).
However, I still see a growth in memory blocks at:
System.Net.Sockets.Socket.ReceiveFromAsync(class System.Net.Sockets.SocketAsyncEventArgs)
I've read all the samples and posts about implementing the code, and still no solution.
Here's what my code looks like:
// class data members
private byte[] m_Buffer = new byte[UInt16.MaxValue];
private SocketAsyncEventArgs m_ReadEventArgs = null;
private IPEndPoint m_EndPoint; // local endpoint from the caller
Initializing:
m_Socket = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
m_Socket.Bind(m_EndPoint);
m_Socket.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveBuffer, MAX_SOCKET_RECV_BUFFER);
//
// initialize the socket event args structure.
//
m_ReadEventArgs = new SocketAsyncEventArgs();
m_ReadEventArgs.Completed += new EventHandler<SocketAsyncEventArgs>(readEventArgs_Completed);
m_ReadEventArgs.SetBuffer(m_Buffer, 0, m_Buffer.Length);
m_ReadEventArgs.RemoteEndPoint = new IPEndPoint(IPAddress.Any, 0);
m_ReadEventArgs.AcceptSocket = m_Socket;
Starting the read process:
bool waitForEvent = m_Socket.ReceiveFromAsync(m_ReadEventArgs);
if (!waitForEvent)
{
    readEventArgs_Completed(this, m_ReadEventArgs);
}
Read completion handler:
private void readEventArgs_Completed(object sender, SocketAsyncEventArgs e)
{
    if (e.BytesTransferred == 0 || e.SocketError != SocketError.Success)
    {
        //
        // we got an error on the socket or the connection was closed
        //
        Close();
        return;
    }

    try
    {
        // try to process a new video frame if enough data was read
        base.ProcessPacket(m_Buffer, e.Offset, e.BytesTransferred);
    }
    catch (Exception ex)
    {
        // log the error
    }

    bool willRaiseEvent = m_Socket.ReceiveFromAsync(e);
    if (!willRaiseEvent)
    {
        readEventArgs_Completed(this, e);
    }
}
Basically the code works fine and I see the video streams perfectly, but this leak is a real pain.
Did I miss anything?
Many thanks!
Instead of recursively calling readEventArgs_Completed after !willRaiseEvent, use a goto to return to the top of the method. I noticed I was slowly chewing up stack space when I had a pattern similar to yours.
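A minimal sketch of that change applied to the handler from the question (the label name is my own):
private void readEventArgs_Completed(object sender, SocketAsyncEventArgs e)
{
ProcessReceive:
    if (e.BytesTransferred == 0 || e.SocketError != SocketError.Success)
    {
        Close();
        return;
    }
    try
    {
        base.ProcessPacket(m_Buffer, e.Offset, e.BytesTransferred);
    }
    catch (Exception ex)
    {
        // log the error
    }
    // If the receive completed synchronously, loop instead of recursing
    // so the stack does not grow with each synchronous completion.
    if (!m_Socket.ReceiveFromAsync(e))
    {
        goto ProcessReceive;
    }
}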
I have the following lines of code:
_printJob = new PrintJob();
if (_printJob.start2(null, _printWorkSettings.usePageSetupDialog)) {
    MonsterDebugger.trace(this, new Date());
    try {
        _printJob.addPage(objectoToPrint, null, _printWorkSettings.jobOptions);
    } catch (error:Error) {
        MonsterDebugger.trace(this, error + "\n" + error.getStackTrace() + "\n" + error.toString());
    }
    _printJob.send();
    MonsterDebugger.trace(this, 'job sent to printer');
    MonsterDebugger.trace(this, new Date());
}
When it is executed on a Linux machine running AIR 2.5 I get error 2057. I checked the time between the start2 call and the send call, and they happen within the same second. I also checked that objectoToPrint is a MovieClip.
This works on a Windows PC, and I'm unable to debug any further than what MonsterDebugger's trace allows, so any ideas on how I can get more information about why addPage is returning this error, or any information about PrintJob on Linux, would be welcome.
By the way, I also tried:
_printJob.addPage(objectoToPrint);
And I get the same result.
The property _printWorkSettings.usePageSetupDialog is always true, so I am showing the print dialog to the user.
Thanks in advance, folks :)
I found the problem: there is only one printer on the Linux system and it was not set as the default, so addPage throws errors.
I got the inspiration from this forum.
I have a problem playing a wav file in my application.
This is my error:
java.lang.IllegalArgumentException
at javax.microedition.media.Manager.createPlayer(), bci=8
at Tajwid.Tajwid.run(Tajwid.java:649)
at Tajwid.Tajwid.actionPerformed(Tajwid.java:186)
at com.sun.lwuit.util.EventDispatcher.fireActionSync(), bci=19
at com.sun.lwuit.util.EventDispatcher.fireActionEvent(EventDispatcher.java:257)
This is my code:
public void run() {
    try {
        InputStream is = getClass().getResourceAsStream("/tes.wav");
        player = Manager.createPlayer(is, "audio/x-wav");
        player.realize();
        // get volume control for player and set volume to max
        vc = (VolumeControl) player.getControl("VolumeControl");
        if (vc != null) {
            vc.setLevel(100);
        }
        player.prefetch();
        player.start();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Device Configuration: CLDC-1.1
Device Profile: MIDP 2.0
The error message you've got has sufficient information to figure out what went wrong in the code.
Look at it a bit closer:
java.lang.IllegalArgumentException
at javax.microedition.media.Manager.createPlayer()...
It says something went wrong in Manager.createPlayer(). From your code, it is apparent that you use method Manager.createPlayer(java.io.InputStream stream, java.lang.String type).
If you look into API documentation for the method you use (available online), you'll find the explanation when this exception occurs:
Throws:
java.lang.IllegalArgumentException - Thrown if stream is null.
The above means that the stream parameter (is in your code) passed to the method is null.
You could add some logging right after the initialization of is to debug this issue more easily:
InputStream is = getClass().getResourceAsStream("/tes.wav");
// add some logging to see if initialization was OK or not:
System.out.println("input stream is null: [" + (is == null) + "]");
That way, when running your MIDlet in emulator, you will see whether is was initialized as expected or not.
Actually, looking at the code I would guess that you made a typo in file name passed to getResourceAsStream: "/tes.wav" looks like a mis-typed "/test.wav".
I am trying to write a listener using the CoreAudio API for when the default audio output is changed (e.g. a headphone jack is plugged in). I found sample code, although a bit old and using deprecated functions (http://developer.apple.com/mac/library/samplecode/AudioDeviceNotify/Introduction/Intro.html), but it didn't work. I rewrote the code in the 'correct' way using the AudioObjectAddPropertyListener method, but it still doesn't seem to work: when I plug in a headphone, the function that I registered is not triggered. I'm a bit at a loss here... I suspect the problem may lie somewhere else, but I can't figure out where...
The Listener Registration Code:
OSStatus err = noErr;
AudioObjectPropertyAddress audioDevicesAddress = { kAudioHardwarePropertyDefaultOutputDevice,
                                                   kAudioObjectPropertyScopeGlobal,
                                                   kAudioObjectPropertyElementMaster };
err = AudioObjectAddPropertyListener(kAudioObjectSystemObject, &audioDevicesAddress, coreaudio_property_listener, NULL);
if (err) trace("error on AudioObjectAddPropertyListener");
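For completeness, coreaudio_property_listener is just a callback with the AudioObjectPropertyListenerProc signature; mine is roughly this (the body is a placeholder):
static OSStatus coreaudio_property_listener(AudioObjectID inObjectID,
                                            UInt32 inNumberAddresses,
                                            const AudioObjectPropertyAddress inAddresses[],
                                            void *inClientData)
{
    // expected to fire when the default output device changes
    trace("default output device changed");
    return noErr;
}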
After searching SourceForge for projects that used the CoreAudio API, I found the RtAudio project, and more importantly these lines:
// This is a largely undocumented but absolutely necessary
// requirement starting with OS-X 10.6. If not called, queries and
// updates to various audio device properties are not handled
// correctly.
CFRunLoopRef theRunLoop = NULL;
AudioObjectPropertyAddress property = { kAudioHardwarePropertyRunLoop,
                                        kAudioObjectPropertyScopeGlobal,
                                        kAudioObjectPropertyElementMaster };
OSStatus result = AudioObjectSetPropertyData(kAudioObjectSystemObject, &property, 0, NULL, sizeof(CFRunLoopRef), &theRunLoop);
if (result != noErr) {
    errorText_ = "RtApiCore::RtApiCore: error setting run loop property!";
    error(RtError::WARNING);
}
After adding this code I didn't even need to register a listener myself.
Try CFRunLoopRun() - it has the same effect, i.e. making sure the event loop that is calling your listener is running.
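A minimal sketch of that suggestion, reusing the registration code from the question:
// Register the listener, then keep this thread's run loop alive so
// Core Audio has a run loop on which to deliver the notification.
AudioObjectAddPropertyListener(kAudioObjectSystemObject,
                               &audioDevicesAddress,
                               coreaudio_property_listener,
                               NULL);
CFRunLoopRun(); // blocks here; the listener fires on this thread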