How to simulate camera midlet in Nokia C6? - java-me

I need to simulate the behaviour of the default camera midlet from Nokia.
It's for Nokia C6, and I am writing it in J2ME.
I use MMAPI. The problem is the size of the VideoControl item: I call videoControl.setDisplayFullScreen(true); but it is not fullscreen at all, and the setDisplaySize method doesn't help either. The video itself takes up roughly one third of the display (the rest of the desired display size is just black). Here's a code sample:
public CameraCanvas(Evidence_elektromeru midlet, ManagePhotos caller, String name) {
    super(true);
    this.midlet = midlet;
    this.caller = caller;
    this.name = name;
    this.setFullScreenMode(true);
    try {
        player = Manager.createPlayer("capture://devcam0");
        player.realize();
        // player.prefetch();
        if (videoControl2 != null)
            videoControl2.setVisible(false);
        videoControl1 = (VideoControl) player.getControl("VideoControl");
        videoControl1.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this);
        videoControl1.setDisplayLocation(0, 0);
        videoControl1.setDisplaySize(360, 500);
    } catch (MediaException me2) {
        try {
            videoControl1.setDisplayFullScreen(true);
        } catch (Exception e) {}
    } catch (Exception e) {
    } finally {
        try {
            player.start();
        } catch (Exception e) {}
        videoControl1.setVisible(true);
    }
}

Try using:
mCamera = Manager.createPlayer("capture://video");
mCamera.realize();
mCamera.prefetch();
Or you can replace mCamera = Manager.createPlayer("capture://video"); with
mCamera = Manager.createPlayer("capture://image");
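Putting that together, here is a minimal sketch of a fullscreen viewfinder canvas along those lines. The class name is mine, and the choice of locator is an assumption: "capture://image" may need to be "capture://video" or "capture://devcam0" depending on what the device supports.

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;
import javax.microedition.media.Manager;
import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;

// Minimal sketch, not the exact Nokia camera MIDlet; locator support differs per device.
public class PreviewCanvas extends Canvas {
    private Player player;
    private VideoControl videoControl;

    public PreviewCanvas() {
        setFullScreenMode(true);
        try {
            player = Manager.createPlayer("capture://image");
            player.realize();
            videoControl = (VideoControl) player.getControl("VideoControl");
            videoControl.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this);
            try {
                // Let the platform size the viewfinder instead of hard-coding 360x500.
                videoControl.setDisplayFullScreen(true);
            } catch (Exception e) {
                // Fall back to an explicit size matching this canvas.
                videoControl.setDisplaySize(getWidth(), getHeight());
            }
            videoControl.setVisible(true);
            player.start();
        } catch (Exception e) {
            // Player creation or start failed; the screen stays blank.
        }
    }

    protected void paint(Graphics g) {
        // With USE_DIRECT_VIDEO the preview is drawn over the canvas; nothing to paint here.
    }
}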

Related

Camera cannot take image when app goes to background but works fine when app is in foreground

I am making an application that takes an image from the front camera on a Firebase remote command. The app works fine and takes the picture without user interaction, but when the app is closed or goes to the background, it starts giving the error "Fail to connect to camera service". As soon as the app is opened, it captures the image.
I run a foreground service with a notification, which is working, but I still get the same camera failure error and cannot take a picture.
try {
    Log.d("kkkk", "Preparing to take photo");
    Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
    int frontCamera = cam;
    // int backCamera = 0;
    Camera.getCameraInfo(frontCamera, cameraInfo);
    try {
        camera = Camera.open(frontCamera);
    } catch (RuntimeException e) {
        Log.d("kkkk", "Camera not available: " + e.getMessage());
        camera = null;
        // takePicture(0);
    }
    try {
        if (null == camera) {
            Log.d("kkkk", "Could not get camera instance");
        } else {
            Log.d("kkkk", "Got the camera, creating the dummy surface texture");
            try {
                camera.setPreviewTexture(new SurfaceTexture(0));
                camera.startPreview();
            } catch (Exception e) {
                Log.d("kkkk", "Could not set the surface preview texture");
                e.printStackTrace();
            }
            camera.takePicture(null, null, new Camera.PictureCallback() {
                @Override
                public void onPictureTaken(byte[] data, Camera camera) {
                    Log.d("kkkk", "clicked");
                    // Encode the byte array into a base64 string
                    // String imageString = android.util.Base64.encodeToString(imageBytes, android.util.Base64.DEFAULT);
                    // Log.d("error200", imageString);
                    FirebaseStorage storage = FirebaseStorage.getInstance();
                    StorageReference storageRef = storage.getReference();
                    String path = "images/" + username.toLowerCase() + device.replace(" ", "");
                    StorageReference imageRef = storageRef.child(path);
                    Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length); // Replace this with your bitmap image
                    ByteArrayOutputStream baos = new ByteArrayOutputStream();
                    bitmap.compress(Bitmap.CompressFormat.JPEG, 10, baos);
                    byte[] data0 = baos.toByteArray();
                    UploadTask uploadTask = imageRef.putBytes(data0);
                    uploadTask.addOnFailureListener(new OnFailureListener() {
                        @Override
                        public void onFailure(@NonNull Exception exception) {
                            // Handle unsuccessful uploads
                            Log.d("pic", "fail" + exception.getMessage());
                        }
                    }).addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                        @Override
                        public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                            // Handle successful uploads
                            Log.d("pic", "done");
                        }
                    });
                    camera.release();
                }
            });
        }
    } catch (Exception e) {
        camera.release();
    }
} catch (Exception e) {
    Log.d("errorData", e.getMessage());
}
In the onDestroy method I release the camera, but I still get the same error:
@Override
public void onDestroy() {
    super.onDestroy();
    if (camera != null) {
        camera.stopPreview();
        camera.release();
        camera = null;
    }
}
Add the START_ACTIVITIES_FROM_BACKGROUND permission in the manifest if you have not already; you need this permission for using the camera from a foreground service. But I'm not sure newer Android versions support using the camera without user interaction.
Please refer to the official Android documentation here.
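On Android 11 and later, a foreground service also has to declare the camera foreground-service type before the system allows camera access from it. A minimal sketch of that part (the service class name and the notification channel "cam" are assumptions, minSdk 26+ is assumed, and the manifest <service> entry additionally needs android:foregroundServiceType="camera"):

import android.app.Notification;
import android.app.Service;
import android.content.Intent;
import android.content.pm.ServiceInfo;
import android.os.Build;
import android.os.IBinder;

public class CameraCaptureService extends Service {

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Assumes a notification channel "cam" has already been created.
        Notification notification = new Notification.Builder(this, "cam")
                .setContentTitle("Capturing")
                .setSmallIcon(android.R.drawable.ic_menu_camera)
                .build();
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            // Declare the camera type so the system permits camera use while backgrounded.
            startForeground(1, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_CAMERA);
        } else {
            startForeground(1, notification);
        }
        // ... open the camera and take the picture here, as in the code above ...
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}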

how to stop MediaPlayer correctly?

I can stop the audio, but if I check my Android Studio debug window, I see that the MediaPlayer is still working in the background.
I play the sound like this:
try {
    myMediaPlayer1 = new MediaPlayer();
    myMediaPlayer1.setAudioStreamType(AudioManager.STREAM_MUSIC);
    myMediaPlayer1.setDataSource("http://myweb.com/audios/1.mp3");
    myMediaPlayer1.prepare();
    myMediaPlayer1.start();
} catch (Exception e) {
    // TODO: handle exception
}
I stop the MediaPlayer via a click on a button or in onDestroy like this:
@Override
protected void onDestroy() {
    super.onDestroy();
    myMediaPlayer1.stop();
}
I debug via a USB connection. After stopping the sound, I can see the MediaPlayer still working in the background.
You need to release the media player using mediaPlayer.release(). If the MediaPlayer is not playing any song/audio, you shouldn't call stop() on it. See the code below for more help.
private void releaseMediaPlayer() {
    try {
        if (mediaPlayer != null) {
            if (mediaPlayer.isPlaying())
                mediaPlayer.stop();
            mediaPlayer.release();
            mediaPlayer = null;
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
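For example, you can call it from onDestroy() (and from the stop button's click handler) instead of calling stop() directly; a small sketch, assuming mediaPlayer is the same field used above:

@Override
protected void onDestroy() {
    super.onDestroy();
    // Stop if needed, then free the native player resources in one place.
    releaseMediaPlayer();
}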

Voice or Audio player for .amr file in Java ME

I am working on audio recording in Nokia S40 series mobiles. I am able to record the message, but I am not able to play the recorded audio message.
Can anyone help me write a voice player for recorded .amr audio files? Did anyone come across this issue?
Here is my working example of recording and playing sound:
public class VoiceRecordMidlet extends MIDlet {
    private Display display;

    public void startApp() {
        display = Display.getDisplay(this);
        display.setCurrent(new VoiceRecordForm());
    }

    public void pauseApp() {
    }

    public void destroyApp(boolean unconditional) {
        notifyDestroyed();
    }
}

class VoiceRecordForm extends Form implements CommandListener {
    private StringItem message;
    private StringItem errormessage;
    private final Command record, play;
    private Player player;
    private byte[] recordedAudioArray = null;

    public VoiceRecordForm() {
        super("Recording Audio");
        message = new StringItem("", "Select Record to start recording.");
        this.append(message);
        errormessage = new StringItem("", "");
        this.append(errormessage);
        record = new Command("Record", Command.OK, 0);
        this.addCommand(record);
        play = new Command("Play", Command.BACK, 0);
        this.addCommand(play);
        this.setCommandListener(this);
    }

    public void commandAction(Command comm, Displayable disp) {
        if (comm == record) {
            Thread t = new Thread() {
                public void run() {
                    try {
                        player = Manager.createPlayer("capture://audio");
                        player.realize();
                        RecordControl rc = (RecordControl) player.getControl("RecordControl");
                        ByteArrayOutputStream output = new ByteArrayOutputStream();
                        rc.setRecordStream(output);
                        rc.startRecord();
                        player.start();
                        message.setText("Recording...");
                        Thread.sleep(5000);
                        message.setText("Recording Done!");
                        rc.commit();
                        recordedAudioArray = output.toByteArray();
                        player.close();
                    } catch (Exception e) {
                        errormessage.setLabel("Error");
                        errormessage.setText(e.toString());
                    }
                }
            };
            t.start();
        } else if (comm == play) {
            try {
                ByteArrayInputStream recordedInputStream = new ByteArrayInputStream(recordedAudioArray);
                Player p2 = Manager.createPlayer(recordedInputStream, "audio/basic");
                p2.prefetch();
                p2.start();
            } catch (Exception e) {
                errormessage.setLabel("Error");
                errormessage.setText(e.toString());
            }
        }
    }
}
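One note on the play branch: if the recorded data is actually AMR (a common default capture format on S40 devices), the content type passed to createPlayer may need to match. A hedged variation of the play branch, with the exact type ideally taken from the RecordControl rather than hard-coded:

// Assumption: the device records AMR by default; better, read rc.getContentType()
// after commit() and pass that value instead of a literal.
ByteArrayInputStream recordedInputStream = new ByteArrayInputStream(recordedAudioArray);
Player p2 = Manager.createPlayer(recordedInputStream, "audio/amr");
p2.prefetch();
p2.start();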

Implement video streaming in java me using rtsp

I want to implement video streaming in Java ME using an RTSP URL. When I tested the code on devices, I got a MediaException stating "Prefetch Error-33". Here's my code:
private void startStreaming() {
    try {
        mplayer = Manager.createPlayer(videourl);
        mplayer.addPlayerListener(this);
        mplayer.realize();
        videoControl = (VideoControl) mplayer.getControl("VideoControl");
        if (videoControl != null) {
            Item video = (Item) videoControl.initDisplayMode(VideoControl.USE_GUI_PRIMITIVE, null);
            videoControl.setVisible(true);
            System.out.println("Playing");
            Form v = new Form("Playing Video");
            StringItem si = new StringItem("Status", "Playing....");
            v.append(video);
            display.setCurrent(v);
        }
        mplayer.prefetch();
        mplayer.start();
    } catch (Exception noCanDo) {
        Form f = new Form("Error");
        f.append("Error : " + noCanDo);
        display.setCurrent(f);
    }
}
I have also tried the alternative of using the MIDlet.platformRequest(videourl) method, which invokes the device's default internal player to play the video file. The player starts, but later a connection timeout prompt appears. I have, however, tested the RTSP URL and it works fine elsewhere. Any suggestions as to how I can do video streaming using an RTSP URL in Java ME?
Use this code for streaming RTSP; it should work for the Nokia Symbian Belle SDK 1.1 and Nokia SDK 2.0:
protected void startApp() throws MIDletStateChangeException {
    VideoCanvas VC = new VideoCanvas(this, url);
    Display.getDisplay(this).setCurrent(VC);
}

// VideoCanvas class
public VideoCanvas(ExampleStreamingMIDlet midlet, String url) {
    this.midlet = midlet;
    this.url = url;
    addCommand(start);
    addCommand(stop);
    addCommand(back);
    addCommand(exit);
    setCommandListener(this);
    this.setFullScreenMode(true);
}

public void commandAction(Command c, Displayable arg1) {
    if (c == start) {
        start();
    }
}

public void start() {
    try {
        // Assign to the field (not a local variable) so stop() can reach the same player.
        player = Manager.createPlayer(url);
        player.addPlayerListener(this);
        player.realize();
        control = (VideoControl) player.getControl("VideoControl");
        if (control != null) {
            control.initDisplayMode(VideoControl.USE_DIRECT_VIDEO, this);
            control.setDisplaySize(176, 144);
            int width = control.getSourceWidth();
            int height = control.getSourceHeight();
            status2 = "Before: SW=" + width + "-SH=" + height + "-DW=" + control.getDisplayWidth() + "-DH=" + control.getDisplayHeight();
        }
        player.start();
    } catch (Exception e) {
        Alert erro = new Alert("Error", e.getMessage(), null, AlertType.ERROR);
        Display.getDisplay(midlet).setCurrent(erro);
    }
}

public void stop() {
    if (player != null) {
        player.deallocate();
        player.close();
    }
}
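The code above registers the canvas as a PlayerListener but does not show the callback; here is a minimal sketch of it, which is useful for seeing exactly where the -33 prefetch error is reported (status2 is assumed to be the same String field used in start()):

// javax.microedition.media.PlayerListener callback.
public void playerUpdate(Player p, String event, Object eventData) {
    // Typical events: started, stopped, endOfMedia, error, bufferingStarted, ...
    System.out.println("Player event: " + event + " (" + eventData + ")");
    if (PlayerListener.ERROR.equals(event)) {
        // eventData carries the error message, e.g. the prefetch error code.
        status2 = "Player error: " + eventData;
    }
}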

How can I repaint part of screen on blackberry while connections run?

I have two questions.
The first is about updating the UI; the second is that when I try to connect to the camera to get an MJPEG stream and call getResponseCode(), the app locks up there. The MDS shows a lot of data being transferred.
I have some classes like these:
abstract class Http extends Thread {
    public abstract String getUrl();
    public abstract String getBase64Encode();
    public abstract void onReturn(int responseCode, InputStream is, int length);
    protected abstract void onError(Exception e);
}
CameraHttp extends Http, and MjpegHttp extends CameraHttp.
Http connects to a URL, which is the JPEG or MJPEG camera address.
I have a Camera class. It starts a connection with the overridden method mjpegconnection.go();
I also have a static bitmap on the ViewCam screen, which extends MainScreen.
After it starts:
try {
    url = getUrl();
    queryString = encodeURL(queryString);
    byte postmsg[] = queryString.getBytes("UTF-8");
    httpConnection = (HttpConnection) Connector.open(url + ";deviceside=false",
            Connector.READ_WRITE);
    httpConnection.setRequestMethod(HttpConnection.GET);
    httpConnection.setRequestProperty("Authorization", getBase64Encode());
    os = httpConnection.openDataOutputStream();
    for (int i = 0; i < postmsg.length; i++) {
        os.write(postmsg[i]);
    }
    if (!cancel) {
        System.out.println(httpConnection.getURL() + " *****" + httpConnection.getPort());
        System.out.println("before onReturn " + httpConnection.getResponseCode());
        onReturn(httpConnection.getResponseCode(), httpConnection.openInputStream(),
                (int) httpConnection.getLength());
        System.out.println("after onReturn");
    }
    os.close();
    httpConnection.close();
} catch (Exception e) {
    System.out.println("error " + e.getMessage());
    try {
        httpConnection.close();
        Thread.sleep(60);
    } catch (Exception ie) {
    }
    onError(e);
}
After that it does something like this: it decides whether the stream is MJPEG or JPEG; if it is MJPEG, the stream is directed to the parser, otherwise it sets the image with setImage() and returns to the connection with go().
public void parse(InputStream is, int length) {
    try {
        if (!type.isMjpegStream()) {
            setImage(is, length);
            System.gc();
            StaticVar.ActiveCam.setConnected(true);
        } else {
            if (parser == null) {
                parser = new JpegParser(is, this);
            } else {
                parser.setInputSteam(is, this);
            }
            parser.parse();
            is.close();
        }
    } catch (Exception e) {
    }
}
and
public void setImage(InputStream is, int length) {
    byte[] raw = new byte[length];
    try {
        is.read(raw);
        currentImage = Bitmap.createBitmapFromBytes(raw, 0, raw.length, 1);
        ViewCam.ViewCam = currentImage; // static var.
    } catch (IOException e) {
        System.out.println("catch***********");
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
How can I repaint the screen to show the bitmap?
And here is my ViewCam class:
public class ViewCam extends MainScreen {
    Header header;
    String headerString;
    public static Bitmap ViewCam; // cam image shows here
    private static Thread My; // runs the connection

    void OnStart() {
        My = new Thread() {
            public void run() {
                System.out.println("ONSTART");
                StaticVar.ActiveCam.go();
            }
        };
        My.start();
        Bitmap bitmap = Bitmap.getBitmapResource("res/main.png");
        Bitmap bmp2 = ResizeImage.resizeBitmap(bitmap, Display.getWidth(), Display.getHeight());
        Background bg = BackgroundFactory.createBitmapBackground(bmp2);
        this.setBackground(bg);
        this.getMainManager().setBackground(bg);
    }

    public ViewCam() {
        StaticVar.ActiveCam.getIp();
        OnStart();
        headerString = "Cam View";
        header = new Header("res/bartop.png", headerString, 0);
        add(header);
        ViewCam = Bitmap.getBitmapResource("res/spexco_splash.png");
        ViewCam = ResizeImage.bestFit(ViewCam, Display.getWidth(), Display.getHeight());
        BitmapField bf = new BitmapField(ViewCam);
        add(bf);
    }
}
Try Screen.invalidate():
public void invalidate(int x, int y, int width, int height)
Invalidates a region of this screen. This method marks a region of this screen as needing a repaint. The repainting is handled later by the main event dispatch thread.
Note: any thread can safely invoke this method, and it does not need to synchronize on the event lock.
Overrides: invalidate in class Manager
Parameters:
x - Left edge of the region in ContentRect coordinates.
y - Top edge of the region in ContentRect coordinates.
width - Width (in pixels) of the region.
height - Height (in pixels) of the region.
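A sketch of how that fits the code above: instead of only assigning the static Bitmap, keep a reference to the BitmapField and push each new frame to it on the event thread, which triggers the repaint for you. Here camField is an assumed reference to the BitmapField created in the ViewCam constructor.

import net.rim.device.api.system.Bitmap;
import net.rim.device.api.ui.UiApplication;
import net.rim.device.api.ui.component.BitmapField;

public class FrameUpdater {
    // Hands a decoded frame to the UI; safe to call from the connection thread.
    public static void show(final BitmapField camField, final Bitmap frame) {
        // invokeLater queues the update on the event dispatch thread,
        // so the connection thread never touches the UI directly.
        UiApplication.getUiApplication().invokeLater(new Runnable() {
            public void run() {
                camField.setBitmap(frame); // repaints the field's region
            }
        });
    }
}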
