I want to implement video streaming in Java ME using an RTSP URL. When I test the code on devices, I get a MediaException stating "Prefetch Error -33". Here's my code:
private void startStreaming()
{
    try
    {
        mplayer = Manager.createPlayer(videourl);
        mplayer.addPlayerListener(this);
        mplayer.realize();
        videoControl = (VideoControl) mplayer.getControl("VideoControl");
        if (videoControl != null)
        {
            Item video = (Item) videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
            videoControl.setVisible(true);
            System.out.println("Playing");
            Form v = new Form("Playing Video");
            StringItem si = new StringItem("Status", "Playing....");
            v.append(si);
            v.append(video);
            display.setCurrent(v);
        }
        mplayer.prefetch();
        mplayer.start();
    }
    catch (Exception noCanDo)
    {
        Form f = new Form("Error");
        f.append("Error : " + noCanDo);
        display.setCurrent(f);
    }
}
I have also tried the alternative of calling MIDlet.platformRequest(videourl), which invokes the device's default internal player to play the video. The player starts, but later a connection-timeout prompt appears. I have tested the RTSP URL itself, however, and it works fine. Any suggestions on how I can stream video over an RTSP URL in Java ME?
Use this code for streaming RTSP; it should work on the Nokia Symbian Belle SDK 1.1 and Nokia SDK 2.0:
protected void startApp() throws MIDletStateChangeException {
    VideoCanvas VC = new VideoCanvas(this, url);
    Display.getDisplay(this).setCurrent(VC);
}
// VideoCanvas class
public VideoCanvas(ExampleStreamingMIDlet midlet, String url) {
    this.midlet = midlet;
    this.url = url;
    addCommand(start);
    addCommand(stop);
    addCommand(back);
    addCommand(exit);
    setCommandListener(this);
    this.setFullScreenMode(true);
}
public void commandAction(Command c, Displayable arg1) {
    if (c == start) {
        start();
    }
}
public void start() {
    try {
        // Assign to the field (not a local variable) so stop() can reach the same Player.
        player = Manager.createPlayer(url);
        player.addPlayerListener(this);
        player.realize();
        control = (VideoControl) player.getControl("VideoControl");
        if (control != null) {
            control.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this);
            control.setDisplaySize(176, 144);
            int width = control.getSourceWidth();
            int height = control.getSourceHeight();
            status2 = "Before: SW=" + width + "-SH=" + height + "-DW=" + control.getDisplayWidth() + "-DH=" + control.getDisplayHeight();
        }
        player.start();
    }
    catch (Exception e) {
        Alert error = new Alert("Error", e.getMessage(), null, AlertType.ERROR);
        Display.getDisplay(midlet).setCurrent(error);
    }
}
public void stop() {
    if (player != null) {
        player.deallocate();
        player.close();
    }
}
I have developed a UWP application that plays audio files in the background or when the phone is locked. The application works fine and everything seems perfect for 5-10 minutes. After that, when I run the app, I cannot play audio files and I get the exception mentioned in the subject. However, if I restart the app, everything works fine again. I followed the steps below and added the following code and projects:
Created Universal Project (Windows Universal)
Added following code to send Background Message
BackgroundMediaPlayer.MessageReceivedFromBackground += BackgroundMediaPlayer_MessageReceivedFromBackground;
Added a Windows Runtime Component (Windows Universal) with the following code
Added the Entry Point and Background Task in Package.appxmanifest
public sealed class AudioPlayer : IBackgroundTask {
    private BackgroundTaskDeferral deferral;
    private SystemMediaTransportControls systemmediatransportcontrol;

    public void Run(IBackgroundTaskInstance taskInstance) {
        systemmediatransportcontrol = BackgroundMediaPlayer.Current.SystemMediaTransportControls;
        systemmediatransportcontrol.ButtonPressed += systemmediatransportcontrol_ButtonPressed;
        systemmediatransportcontrol.PropertyChanged += systemmediatransportcontrol_PropertyChanged;
        systemmediatransportcontrol.IsEnabled = true;
        systemmediatransportcontrol.IsPauseEnabled = true;
        systemmediatransportcontrol.IsPlayEnabled = true;
        systemmediatransportcontrol.IsNextEnabled = true;
        systemmediatransportcontrol.IsPreviousEnabled = true;
        BackgroundMediaPlayer.Current.CurrentStateChanged += Current_CurrentStateChanged;
        BackgroundMediaPlayer.MessageReceivedFromForeground += BackgroundMediaPlayer_MessageReceivedFromForeground;
        deferral = taskInstance.GetDeferral();
        taskInstance.Canceled += TaskInstance_Canceled;
        taskInstance.Task.Completed += Taskcompleted;
    }

    void Taskcompleted(BackgroundTaskRegistration sender, BackgroundTaskCompletedEventArgs args) {
        deferral.Complete();
    }

    private void TaskInstance_Canceled(IBackgroundTaskInstance sender, BackgroundTaskCancellationReason reason) {
        try {
            systemmediatransportcontrol.ButtonPressed -= systemmediatransportcontrol_ButtonPressed;
            systemmediatransportcontrol.PropertyChanged -= systemmediatransportcontrol_PropertyChanged;
            BackgroundMediaPlayer.Shutdown(); // shut down the media pipeline
        }
        catch (Exception) {
        }
        deferral.Complete();
    }

    void Current_CurrentStateChanged(MediaPlayer sender, object args) {
        MediaPlayer player = sender;
        switch (player.CurrentState) {
            case MediaPlayerState.Playing:
                systemmediatransportcontrol.PlaybackStatus = MediaPlaybackStatus.Playing;
                break;
            case MediaPlayerState.Paused:
                systemmediatransportcontrol.PlaybackStatus = MediaPlaybackStatus.Stopped;
                break;
        }
    }

    void systemmediatransportcontrol_ButtonPressed(SystemMediaTransportControls sender, SystemMediaTransportControlsButtonPressedEventArgs args) {
        try {
            switch (args.Button) {
                case SystemMediaTransportControlsButton.Play:
                    playTrack();
                    break;
                case SystemMediaTransportControlsButton.Pause:
                    stopBeforePlaying();
                    break;
                case SystemMediaTransportControlsButton.Next:
                    stopBeforePlaying();
                    nextTrack();
                    break;
                case SystemMediaTransportControlsButton.Previous:
                    stopBeforePlaying();
                    previousTrack();
                    break;
            }
        }
        catch (Exception) {
            //Debug.WriteLine(ex.Message);
        }
    }

    void stopBeforePlaying() {
        MediaPlayer player = BackgroundMediaPlayer.Current;
        if (player != null)
            player.Pause();
    }

    void BackgroundMediaPlayer_MessageReceivedFromForeground(object sender, MediaPlayerDataReceivedEventArgs e) {
        object foregroundMessageType;
        if (e.Data.TryGetValue(ApplicationSettingsConstants.ChapterStatus, out foregroundMessageType)) {
            //do something here
        }
    }

    void UpdateUVCOnNewTrack() {
        //update buttons here
    }

    async void playTrack() {
        MediaPlayer player = BackgroundMediaPlayer.Current;
        try {
            if (...) {
                //load track
                player.Play();
            }
            else {
                player.Pause();
                MessageService.SendMessageToForeground(ApplicationSettingsConstants.ChapterStatus, (short)ChapterStatus.ForegroundFileNotFound);
            }
        }
        catch (System.IO.DirectoryNotFoundException) {
            player.Pause();
            MessageService.SendMessageToForeground(ApplicationSettingsConstants.ChapterStatus, (short)ChapterStatus.ForegroundFileNotFound);
        }
        catch (System.IO.FileNotFoundException) {
            player.Pause();
            MessageService.SendMessageToForeground(ApplicationSettingsConstants.ChapterStatus, (short)ChapterStatus.ForegroundFileNotFound);
        }
        finally {
            UpdateUVCOnNewTrack();
        }
    }

    void nextTrack() {
        //load next track
    }

    void previousTrack() {
        //load previous here
    }
}
Why am I getting the above error?
Note: I followed the Microsoft Background Audio for Windows Phone 8.1 sample to enable the background audio player.
Thanks!
Some situations can cause this exception in background audio. Refer to Windows-universal-samples/Samples/BackgroundAudio/cs/BackgroundAudio/Scenario1.xaml.cs, specifically the comment on the ResetAfterLostBackground() function:
"The background task did exist, but it has disappeared. Put the foreground back into an initial state. Unfortunately, any attempts to unregister things on BackgroundMediaPlayer.Current will fail with the RPC error once the background task has been lost."
So add this function, and invoke it where you catch the error:
const int RPC_S_SERVER_UNAVAILABLE = -2147023174; // 0x800706BA

private void ResetAfterLostBackground()
{
    BackgroundMediaPlayer.Shutdown();
    try
    {
        BackgroundMediaPlayer.MessageReceivedFromBackground += BackgroundMediaPlayer_MessageReceivedFromBackground;
    }
    catch (Exception ex)
    {
        if (ex.HResult == RPC_S_SERVER_UNAVAILABLE)
        {
            throw new Exception("Failed to get a MediaPlayer instance.");
        }
        else
        {
            throw;
        }
    }
}
I am working on a soundboard for the Raspberry Pi in Java.
I used the Pi4J library to read the GPIO pins.
To play an MP3, we modified the code of a Pi4J example:
button2.addListener(new GpioPinListenerDigital() {
    @Override
    public void handleGpioPinDigitalStateChangeEvent(GpioPinDigitalStateChangeEvent event) {
        // display pin state on console
        System.out.println(" --> GPIO PIN STATE CHANGE: " + event.getPin() + " = " + event.getState());
        System.out.println(" Button Goat!");
        play("goat.mp3");
    }
});
Our play method:
public static void play(String path) {
    try {
        FileInputStream fis = new FileInputStream(path);
        BufferedInputStream bis = new BufferedInputStream(fis);
        player = new Player(bis);
    } catch (FileNotFoundException | JavaLayerException ex) {
        ex.printStackTrace();
        return; // without a Player there is nothing to start
    }
    new Thread() {
        public void run() {
            try {
                player.play();
            } catch (JavaLayerException ex) {
                ex.printStackTrace();
            }
        }
    }.start();
}
When I put the Java file on my Raspberry Pi and compile it, I get this output:
Image:
http://gyazo.com/0dc5e6cbe84ad00eed7d4a9df2b6b782
What I understand from these errors is that the libraries are not found.
How can I make this work?
Include the classpath to the JavaZoom library on your command line.
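For example (the paths and jar names below are placeholders; substitute the actual locations of the JLayer and Pi4J jars on your Pi):

```shell
# Compile and run with the JavaZoom JLayer jar (and Pi4J) on the classpath.
# Adjust the paths and jar names to match your installation.
javac -cp ".:/opt/pi4j/lib/*:/home/pi/libs/jlayer-1.0.1.jar" SoundBoard.java
java  -cp ".:/opt/pi4j/lib/*:/home/pi/libs/jlayer-1.0.1.jar" SoundBoard
```

The `NoClassDefFoundError`-style failures go away once every jar the code imports from is listed in `-cp` for both the compile and the run step.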
I am working on audio recording on Nokia S40 series phones. I am able to record the message, but I am not able to play the recorded audio back.
Can anyone help me with how to write a player for recorded .amr audio files?
Did anyone come across this issue?
Here is my working example of recording and playing sound:
public class VoiceRecordMidlet extends MIDlet {
    private Display display;

    public void startApp() {
        display = Display.getDisplay(this);
        display.setCurrent(new VoiceRecordForm());
    }

    public void pauseApp() {
    }

    public void destroyApp(boolean unconditional) {
        notifyDestroyed();
    }
}

class VoiceRecordForm extends Form implements CommandListener {
    private StringItem message;
    private StringItem errormessage;
    private final Command record, play;
    private Player player;
    private byte[] recordedAudioArray = null;

    public VoiceRecordForm() {
        super("Recording Audio");
        message = new StringItem("", "Select Record to start recording.");
        this.append(message);
        errormessage = new StringItem("", "");
        this.append(errormessage);
        record = new Command("Record", Command.OK, 0);
        this.addCommand(record);
        play = new Command("Play", Command.BACK, 0);
        this.addCommand(play);
        this.setCommandListener(this);
    }

    public void commandAction(Command comm, Displayable disp) {
        if (comm == record) {
            Thread t = new Thread() {
                public void run() {
                    try {
                        player = Manager.createPlayer("capture://audio");
                        player.realize();
                        RecordControl rc = (RecordControl) player.getControl("RecordControl");
                        ByteArrayOutputStream output = new ByteArrayOutputStream();
                        rc.setRecordStream(output);
                        rc.startRecord();
                        player.start();
                        message.setText("Recording...");
                        Thread.sleep(5000);
                        message.setText("Recording Done!");
                        rc.commit();
                        recordedAudioArray = output.toByteArray();
                        player.close();
                    } catch (Exception e) {
                        errormessage.setLabel("Error");
                        errormessage.setText(e.toString());
                    }
                }
            };
            t.start();
        } else if (comm == play) {
            try {
                if (recordedAudioArray == null) {
                    errormessage.setText("Nothing recorded yet.");
                    return;
                }
                ByteArrayInputStream recordedInputStream = new ByteArrayInputStream(recordedAudioArray);
                // The content type must match what the device actually captured;
                // many S40 devices record AMR, in which case use "audio/amr" here.
                Player p2 = Manager.createPlayer(recordedInputStream, "audio/basic");
                p2.prefetch();
                p2.start();
            } catch (Exception e) {
                errormessage.setLabel("Error");
                errormessage.setText(e.toString());
            }
        }
    }
}
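Stripped of the MMAPI calls, the record-then-play round trip above is just "write the captured bytes into an in-memory buffer, then replay them from a stream". A plain-Java sketch of that buffering, useful for checking the logic off-device (the class name AudioBufferRoundTrip is mine, not part of the original code):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class AudioBufferRoundTrip {
    // Capture side: the RecordControl writes into a growable in-memory buffer.
    static byte[] record(byte[] captured) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        output.write(captured);      // stands in for rc.setRecordStream(output) + rc.commit()
        return output.toByteArray(); // the snapshot handed to the playback Player
    }

    // Playback side: the recorded bytes are replayed from an InputStream.
    static byte[] play(byte[] recorded) throws IOException {
        ByteArrayInputStream in = new ByteArrayInputStream(recorded);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        int b;
        while ((b = in.read()) != -1) {
            sink.write(b);
        }
        return sink.toByteArray();
    }
}
```

The key point the example isolates: `toByteArray()` must be called only after `commit()`, otherwise the array can be empty or truncated, which then shows up as a playback failure rather than a recording failure.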
I am making an application in J2ME with which the user can capture an image and at the same time upload that image to a web server. Whenever I use this app on my Nokia C series phone I am not able to capture an image, and whenever I use it on the computer I can capture an image but the Send command does not work. Please look into the problem and guide me on what I need to do to make this app work. Thanks, Amit. Here it is:
public class myMidlet extends MIDlet implements CommandListener{
private Display display;
private Form form;
private Command exit, back, capture, camera, send;
private Player player;
private VideoControl videoControl;
private Video video;
int status = 0;
byte localData[];
public myMidlet() {
display = Display.getDisplay(this);
form = new Form("My Form");
exit = new Command("Exit", Command.EXIT, 0);
camera = new Command("Camera", Command.SCREEN, 1);
back = new Command("Back", Command.BACK, 2);
capture = new Command("Capture", Command.SCREEN, 3);
send = new Command("Send", Command.OK, 1);
form.addCommand(camera);
form.addCommand(exit);
form.setCommandListener(this);
}
public void startApp() {
display.setCurrent(form);
}
public void pauseApp() {}
public void destroyApp(boolean unconditional){
notifyDestroyed();
}
public void commandAction(Command c, Displayable s){
String label = c.getLabel();
if (label.equals("Exit")){
destroyApp(true);
} else if (label.equals("Camera")) {
showCamera();
} else if (label.equals("Back"))
display.setCurrent(form);
else if (label.equals("Capture")) {
video = new Video(this);
video.start();
form.addCommand(send);
form.removeCommand(camera);
}
else if( label.equalsIgnoreCase("Send") ){
try {
startSendOperation();
} catch (Exception ex) {
}
}
}
public boolean uploadImage( String uri, byte[] rawImage)throws Exception
{
HttpConnection httpConnection;
OutputStream out;
// Open connection to the script
httpConnection = (HttpConnection)Connector.open( uri );
// Setup the request as an HTTP POST and encode with form data
httpConnection.setRequestMethod( HttpConnection.POST );
httpConnection.setRequestProperty( "Content-type", "application/x-www-form-urlencoded" );
// Encode the imagedata with Base64
String encoded = Base64.encode( rawImage ).toString();
// Build the output and encoded string
String output = "imgdata=" + encoded;
// Set the content length header
httpConnection.setRequestProperty("Content-Length", Integer.toString(output.getBytes().length));
// Open the output stream and publish data
out = httpConnection.openOutputStream();
out.write( output.getBytes() );
// Flush the buffer (might not be necessary?)
out.flush();
// Here you might want to read a response from the POST to make
// sure everything went OK.
// Close everything down
if( out != null )
out.close();
if( httpConnection != null )
httpConnection.close();
// All good
return true;
}
public void startSendOperation() throws Exception{
boolean res = uploadImage( "http://www.xxx.com/postFolder?", localData);
}
public void showCamera(){
try{
player = Manager.createPlayer("capture://video");
player.realize();
videoControl = (VideoControl)player.getControl("VideoControl");
Canvas canvas = new VideoCanvas(this, videoControl);
canvas.addCommand(back);
canvas.addCommand(capture);
canvas.setCommandListener(this);
display.setCurrent(canvas);
player.start();
} catch (IOException ioe) {} catch (MediaException me) {}
}
class Video extends Thread {
myMidlet midlet;
public Video(myMidlet midlet) {
this.midlet = midlet;
}
public void run() {
captureVideo();
}
public void captureVideo() {
try {
byte[] photo = videoControl.getSnapshot(null);
localData = photo;
Image image = Image.createImage(photo, 0, photo.length);
form.append(image);
display.setCurrent(form);
player.close();
player = null;
videoControl = null;
} catch (MediaException me) { }
}
};
}
class VideoCanvas extends Canvas {
private myMidlet midlet;
public VideoCanvas(myMidlet midlet, VideoControl videoControl) {
int width = getWidth();
int height = getHeight();
this.midlet = midlet;
videoControl.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this);
try {
videoControl.setDisplayLocation(2, 2);
videoControl.setDisplaySize(width - 4, height - 4);
} catch (MediaException me) {}
videoControl.setVisible(true);
}
public void paint(Graphics g) {
int width = getWidth();
int height = getHeight();
g.setColor(255, 0, 0);
g.drawRect(0, 0, width - 1, height - 1);
g.drawRect(1, 1, width - 3, height - 3);
}
}
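A side note on uploadImage(): Base64 output contains '+', '/' and '=' characters, which must themselves be percent-encoded inside an x-www-form-urlencoded body, or the server will decode them incorrectly; that is a plausible source of corrupted uploads with the code above. A plain Java SE sketch of building the body (java.util.Base64 and URLEncoder stand in for whatever helpers the MIDlet uses; on J2ME you would percent-encode by hand):

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;
import java.util.Base64;

public class ImagePostBody {
    // Builds the "imgdata=<base64>" form body the MIDlet POSTs.
    // URL-encoding the Base64 string protects the '+', '/' and '=' characters.
    static String buildBody(byte[] rawImage) throws UnsupportedEncodingException {
        String encoded = Base64.getEncoder().encodeToString(rawImage);
        return "imgdata=" + URLEncoder.encode(encoded, "UTF-8");
    }
}
```

On the server side, a standard form-parameter decode then recovers the original Base64 string unchanged.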
In the showCamera method, instead of
Manager.createPlayer("capture://video");
try using
Manager.createPlayer("capture://image");
I have two questions.
The first is about updating the UI; the second is that when I try to connect to the camera to get an MJPEG stream and call getResponseCode(), the app locks up there. The MDS shows a lot of data being transferred.
I have some classes like these:
abstract class Http extends Thread {
    public abstract String getUrl();
    public abstract String getBase64Encode();
    public abstract void onReturn(int responseCode, InputStream is, int length);
    protected abstract void onError(Exception e);
}
CameraHttp extends Http, and MjpegHttp extends CameraHttp.
Http connects to a URL, which is the JPEG or MJPEG camera address.
I have a Camera class. It starts a connection with the overridden method mjpegconnection.go();
I also have a static Bitmap on the ViewCam screen, which extends MainScreen.
After it starts:
url = getUrl();
queryString = encodeURL(queryString);
byte postmsg[] = queryString.getBytes("UTF-8");
httpConnection = (HttpConnection) Connector.open(url + ";deviceside=false", Connector.READ_WRITE);
// Note: a body is written below even though the method is GET; some gateways
// stall on that combination, which may explain the hang in getResponseCode().
httpConnection.setRequestMethod(HttpConnection.GET);
httpConnection.setRequestProperty("Authorization", getBase64Encode());
os = httpConnection.openDataOutputStream();
for (int i = 0; i < postmsg.length; i++) {
    os.write(postmsg[i]);
}
if (!cancel) {
    System.out.println(httpConnection.getURL() + " *****" + httpConnection.getPort());
    System.out.println("before onReturn: " + httpConnection.getResponseCode());
    onReturn(httpConnection.getResponseCode(), httpConnection.openInputStream(), (int) httpConnection.getLength());
    System.out.println("after onReturn");
}
os.close();
httpConnection.close();
} catch (Exception e) {
    System.out.println("error " + e.getMessage());
    try {
        httpConnection.close();
        Thread.sleep(60);
    } catch (Exception ie) {
    }
    onError(e);
}
After that it does the following:
// decides between an MJPEG and a plain JPEG stream
// if it is MJPEG, hand the stream to the parser,
// else set the image with setImage() and return to the connection with go();
public void parse(InputStream is, int length) {
    try {
        if (!type.isMjpegStream()) {
            setImage(is, length);
            System.gc();
            StaticVar.ActiveCam.setConnected(true);
        } else {
            if (parser == null) {
                parser = new JpegParser(is, this);
            } else {
                parser.setInputSteam(is, this);
            }
            parser.parse();
            is.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
and
public void setImage(InputStream is, int length) {
    byte[] raw = new byte[length];
    try {
        // read(byte[]) may return before the buffer is full; loop until done
        int off = 0;
        while (off < length) {
            int n = is.read(raw, off, length - off);
            if (n < 0) break; // stream ended early
            off += n;
        }
        currentImage = Bitmap.createBitmapFromBytes(raw, 0, raw.length, 1);
        ViewCam.ViewCam = currentImage; // static var.
    } catch (IOException e) {
        e.printStackTrace();
    }
}
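Note that InputStream.read(byte[]) may return fewer bytes than requested, so a fixed-length image body should always be read in a loop. A self-contained helper in plain Java SE (the class name ReadFully is mine):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFully {
    // Reads exactly `length` bytes, looping because a single read() call
    // is allowed to return after delivering only part of the buffer.
    static byte[] readFully(InputStream is, int length) throws IOException {
        byte[] raw = new byte[length];
        int off = 0;
        while (off < length) {
            int n = is.read(raw, off, length - off);
            if (n < 0) {
                throw new IOException("stream ended after " + off + " of " + length + " bytes");
            }
            off += n;
        }
        return raw;
    }
}
```

A partial read here is exactly the kind of bug that produces occasionally-corrupt frames: the bitmap decoder then sees trailing zero bytes instead of JPEG data.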
How can I repaint the screen to show the bitmap?
And my ViewCam
public class ViewCam extends MainScreen {
Header header;
String headerString;
public static Bitmap ViewCam;// cam image shows
private static Thread My;// runs connection
void OnStart() {
My = new Thread() {
public void run() {
System.out.println("ONSTART");
StaticVar.ActiveCam.go();
};
};
My.start();
Bitmap bitmap = Bitmap.getBitmapResource("res/main.png");
Bitmap bmp2 = ResizeImage.resizeBitmap(bitmap, Display.getWidth(),
Display.getHeight());
Background bg = BackgroundFactory.createBitmapBackground(bmp2);
this.setBackground(bg);
this.getMainManager().setBackground(bg);
}
public ViewCam() {
StaticVar.ActiveCam.getIp();
OnStart();
headerString ="Cam View";
header = new Header("res/bartop.png", headerString, 0);
add(header);
ViewCam = Bitmap.getBitmapResource("res/spexco_splash.png");
ViewCam = ResizeImage.bestFit(ViewCam, Display.getWidth(), Display
.getHeight());
BitmapField bf = new BitmapField(ViewCam);
add(bf);
}
}
Try Screen.invalidate()
public void invalidate(int x, int y, int width, int height)
Invalidates a region of this screen.
This method marks a region of this screen as needing a repaint. The repainting is handled later by the main event dispatch thread.
Note: Any thread can safely invoke this method; it does not need to synchronize on the event lock.
Overrides:
invalidate in class Manager
Parameters:
x - Left edge of the region in ContentRect coordinates.
y - Top edge of the region in ContentRect coordinates.
width - Width (in pixels) of the region.
height - Height (in pixels) of the region.