Image capture in LWUIT [duplicate] - java-me

I have tried using MediaComponent, but since it is now deprecated it won't be good moving forward. Also, I was not able to get it to resize to full screen on the form. I am trying to use VideoComponent to capture a snapshot on an S40 device, but I can't find how to instantiate the VideoComponent properly for capturing an image rather than playing a video.

You can use VideoComponent for capturing an image.
First, to instantiate the VideoComponent, you need to create a native peer:
VideoComponent videoComponent = VideoComponent.createVideoPeer("capture://video");
Player player = (Player) videoComponent.getNativePeer();
player.realize();
VideoControl videoControl = (VideoControl) player.getControl("VideoControl");
To capture the image, start the video component and call getSnapshot on the VideoControl (it returns the raw image bytes):
videoComponent.start();
byte[] image = videoControl.getSnapshot(null);
If you want to resize the video component to full screen, you can use:
videoComponent.setFullScreen(true);
Another possibility is:
videoComponent.setPreferredH(Display.getInstance().getDisplayHeight());
videoComponent.setPreferredW(Display.getInstance().getDisplayWidth());
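Putting those pieces together, here is a minimal sketch of a capture form. The CaptureForm class, the snap() helper, and the placement of start() are illustrative assumptions, not a canonical LWUIT recipe; the "capture://video" locator is the one from the steps above.
import java.io.IOException;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;
import com.sun.lwuit.Display;
import com.sun.lwuit.Form;
import com.sun.lwuit.VideoComponent;

public class CaptureForm extends Form {

    private final VideoComponent videoComponent;
    private final VideoControl videoControl;

    public CaptureForm() throws IOException, MediaException {
        super("Capture");

        // Ask for a camera peer rather than a playback peer.
        videoComponent = VideoComponent.createVideoPeer("capture://video");

        // The native peer is a plain MMAPI Player; realize it so that
        // the VideoControl becomes available.
        Player player = (Player) videoComponent.getNativePeer();
        player.realize();
        videoControl = (VideoControl) player.getControl("VideoControl");

        // Size the viewfinder to the full display.
        videoComponent.setPreferredW(Display.getInstance().getDisplayWidth());
        videoComponent.setPreferredH(Display.getInstance().getDisplayHeight());
        addComponent(videoComponent);

        videoComponent.start();
    }

    // Grab a frame; returns raw image bytes in the device's default encoding.
    public byte[] snap() throws MediaException {
        return videoControl.getSnapshot(null);
    }
}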

Right now the VideoComponent is mostly designed for playback and doesn't really work well for capture. We will try to improve it in the near future to make it more flexible. It's relatively easy to migrate from MediaComponent to VideoComponent, so that shouldn't be a problem; a before/after sketch follows.
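A hedged sketch of what the migration looks like in practice. The Player-based MediaComponent constructor and the Migration helper class are assumptions based on the older API, not confirmed by this thread:
import java.io.IOException;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import com.sun.lwuit.Form;
import com.sun.lwuit.MediaComponent;
import com.sun.lwuit.VideoComponent;

public final class Migration {
    // Before: the deprecated MediaComponent wraps an MMAPI Player
    // that the application creates and manages itself.
    static void oldWay(Form form) throws IOException, MediaException {
        Player player = Manager.createPlayer("capture://video");
        player.realize();
        form.addComponent(new MediaComponent(player));
        player.start();
    }

    // After: VideoComponent creates the native peer from the locator
    // and exposes start()/stop() directly.
    static void newWay(Form form) throws IOException {
        VideoComponent video = VideoComponent.createVideoPeer("capture://video");
        form.addComponent(video);
        video.start();
    }
}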

Related

How can I start playing the audio track of an AVPlayer in the background?

How do you play another item when the one playing is finished (think playlist...)
I want to be able to create a new AVPlayerLayer, attach a player to it, and start it in the background (i.e. audio only).
Thanks
I know I can keep the audio of a streaming video in the background by doing self?.playerLayer?.player = nil when leaving the foreground. That is not what I am asking about.
You can skip a playerLayer if you like. AVPlayerLayer is only a visual class that uses an AVPlayer.
player = [AVPlayer playerWithURL:theURL];
[player play];
or
[yourPlayerLayer.player play];
Bob's your auntie

Detect whether RPScreenRecorder is recording

In my application, I use AVPlayer to play videos. However, I do not use FairPlay.
I tried to use RPScreenRecorder to detect whether screen recording is on:
[RPScreenRecorder sharedRecorder].isRecording
but that does not work.
Does anyone know how to get whether screen recording is on?
Get the window of your app delegate and check isCaptured:
[appDel.window.screen isCaptured]
That will indicate if anything is "using" the screen: RPScreenRecorder, AirPlay, or mirroring.
Normally RPScreenRecorder is not compatible with AVPlayer; that might be the cause of what you are seeing.
The documentation states it explicitly:
ReplayKit is incompatible with AVPlayer content.

Painting frames while media session is paused

I'm working on a custom video player using the Media Foundation framework.
Currently, I can play, pause, stop or change the rate of the playback using an IMFMediaSession.
I can also retrieve a single frame using an IMFSourceReader.
I am currently able to render a frame (IMFSample) to a window area (an HWND), but only when the media session is stopped.
My goal is to be able to render a frame while the media session is paused (i.e., doing frame-stepping using a source reader instead of the media session).
I'm using GetDC, CreateBitmap, SelectObject and BitBlt to render my frame.
I tried using Direct3D interfaces to fill it with a solid color (I'm really new to Direct3D, so I followed a basic tutorial), but it didn't work.
Here is what I did: retrieving an IDirect3DDeviceManager9 with MR_VIDEO_ACCELERATION_SERVICE, then calling OpenDeviceHandle, LockDevice, Clear, Begin/EndScene, and Present.
None of these calls fail but I suspect the EVR is still painting the last frame.
So basically, I want the EVR to stop repainting its frame on demand, and of course I also need to be able to re-enable its painting afterwards.
Any idea how to do that?
Thanks
I finally got it working.
If you're interested, do the following:
retrieve IMFVideoDisplayControl and IMFVideoMixerBitmap from the media session using MFGetService
set up an MFVideoAlphaBitmap structure and feed it to IMFVideoMixerBitmap::SetAlphaBitmap (there is a working example on the dedicated MSDN page)
call IMFVideoDisplayControl::RepaintVideo to update the output
To hide the previous frame, leave the bitmap fully opaque (don't blend its alpha) so it covers the old video content.
Call IMFVideoMixerBitmap::ClearAlphaBitmap to get the previous content back.
And voilà!

Simple UIWebView

I'm making a tab bar app and one of the tabs is a YouTube channel. It works fine and is perfect, but the one problem is that when I click on a video it plays only in the standard (portrait) view and not in landscape. Any ideas on how to fix this? I'm sure it's really simple, but I've been having no luck.
This question probably answers what you are trying to do:
Webview resizes automatically to portrait and landscape view in iphone
The key thing to take away from it is that you need your view controller to tell the app that it supports the other orientations, not just portrait.
- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)orientation
{
    return YES;
}

How to take a Snapshot in J2me with the MMAPI while using LWUIT

I want to make an application that takes a snapshot from the camera and sends it to the server. I can do this pretty easily in the high-level UI using a MediaPlayer, but it seems that I can't use the same code in LWUIT.
VideoControl vc;
Item videoItem = (Item) vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null);
Since that didn't work, I used MediaComponent instead:
MediaComponent videoItem = (MediaComponent) vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null);
but that didn't work either; a ClassCastException is thrown. Can you please tell me how to implement it?
You don't need to call initDisplayMode with LWUIT; it is invoked internally. Just add the video item to the LWUIT Form and initDisplayMode will be called for you, as in the sketch below.
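A hedged sketch of the complete flow. The "capture://video" locator, the SnapshotHelper wrapper, and MediaComponent's Player-based constructor are assumptions; in a real MIDlet you would take the snapshot from a command handler rather than right after start():
import java.io.IOException;
import javax.microedition.media.Manager;
import javax.microedition.media.MediaException;
import javax.microedition.media.Player;
import javax.microedition.media.control.VideoControl;
import com.sun.lwuit.Form;
import com.sun.lwuit.MediaComponent;

public final class SnapshotHelper {

    public static byte[] showAndSnap(Form form) throws IOException, MediaException {
        Player player = Manager.createPlayer("capture://video");
        player.realize();
        VideoControl vc = (VideoControl) player.getControl("VideoControl");

        // No initDisplayMode call: adding the component to the LWUIT
        // Form invokes it internally.
        MediaComponent videoItem = new MediaComponent(player);
        form.addComponent(videoItem);
        form.show();

        player.start();
        return vc.getSnapshot(null); // default encoding
    }
}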
