Get Camera resolution in Flex Builder

I am developing an application using Flex Builder and I am a newbie to it.
My task is to develop an application in Flex Builder that uses two cameras: one for video chat and one for capturing images and sending them to the friend you are chatting with.
I have developed the application and written a script for it, but I am facing a problem: if I have an array of cameras, how will I know which one is the HD cam and which one is the normal webcam? Based on that, I have to assign one camera for chatting and the other for capturing.
So my question is: how can I select the HD cam in my application when two cameras are connected at the same time? Is there any built-in function to get the resolution, or something like that?

Related

Live streaming from UWP to Linux/Python Server

I have a UWP app that captures a live video stream (webcam), encodes it in H.264, and sends it through a TCP socket (on a local network; I need high performance) to a Linux device.
Is there a way to do this? I don't need the video for playback, only for extracting single frames. I could do that with OpenCV, but it requires a local video file, whereas I'm working with a live stream.
I would send photos instead of a video stream if the time needed to capture one were acceptable, but it takes about 250 ms.
Is RTP required? Does UWP (Windows) provide a way to achieve this?
Thank you
P.S.: The UWP app runs on a HoloLens.
You can use WebRTC to transmit live video from the HoloLens to just about any target. That's probably the easiest way to do it without going really low level.
For an introduction, grab this repo and try the sample app, which runs perfectly on the HoloLens: https://github.com/webrtc-uwp/PeerCC/tree/e95f231e1dc9c248ca2ffa040276b8a1265da145/Client

How do I make a game for the Chromecast

I want to create a simple game for the Chromecast, where the game is on the TV and the phone is the controller.
I found two possible ways to do this, the Game Manager API and the Remote Display API, but both seem to be deprecated according to the documentation at these links:
https://developers.google.com/cast/docs/gaming
https://developers.google.com/cast/docs/remote
What are Google's future plans for developing games (and other non-media-streaming apps) for the Chromecast?
Is it possible to create a game without the APIs above? (The game will be a simple quiz game.)

Programming webcam on Linux

I want to be able to capture images from a webcam on Linux. This is still a project requirement, and I'm having difficulty finding up-to-date information about capturing images from a webcam on Linux. Is it true that every webcam has a different API (unlike on Windows, where I can use a common API), so that I must write the program for a specific webcam?
Webcams on Linux are accessed through the Video4Linux API, which is common across all camera models.
There are plenty of existing framegrabbers for webcams that use this API - you could look at these for ideas, or just use one as-is.
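To give a concrete idea of what that common API looks like, here is a minimal, untested C sketch that opens a camera through Video4Linux2, prints the driver name, and lists the pixel formats it offers (the device path /dev/video0 is an assumption - yours may differ):

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        /* /dev/video0 is an assumption; your camera may be video1, video2, ... */
        int fd = open("/dev/video0", O_RDWR);
        if (fd < 0) {
            perror("open /dev/video0");
            return 1;
        }

        /* Ask the driver to identify itself - works the same for every V4L2 webcam. */
        struct v4l2_capability cap;
        memset(&cap, 0, sizeof(cap));
        if (ioctl(fd, VIDIOC_QUERYCAP, &cap) < 0) {
            perror("VIDIOC_QUERYCAP");
            close(fd);
            return 1;
        }
        printf("driver: %s, card: %s\n", (char *)cap.driver, (char *)cap.card);

        /* Enumerate the pixel formats the camera can deliver. */
        struct v4l2_fmtdesc fmt;
        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        for (fmt.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &fmt) == 0; fmt.index++)
            printf("format %u: %s\n", fmt.index, (char *)fmt.description);

        close(fd);
        return 0;
    }

Actual frame capture then continues with VIDIOC_S_FMT and memory-mapped buffers, which is exactly the plumbing that existing framegrabbers and libraries such as OpenCV handle for you.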

Mute application volume in win CE

I have developed a media-based application that runs on a device with Windows CE. I need a mute/unmute button for controlling the application volume. I developed the app using the .NET Compact Framework.
I've used waveOutSetVolume for mute in WinCE:
http://www.pinvoke.net/default.aspx/coredll/waveOutSetVolume.html
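For reference, the call being P/Invoked there is an ordinary waveOut API. A rough C sketch of mute/unmute against the default wave device might look like the following (device id 0 is an assumption, and using an id rather than an open handle affects the device as a whole, not just your application):

    #include <windows.h>
    #include <mmsystem.h>   /* on Windows CE these functions live in coredll */

    static DWORD g_savedVolume;   /* remember the level so we can unmute later */

    void MuteWaveOut(void)
    {
        /* Device id 0, cast to HWAVEOUT, addresses the first wave output device.
         * Low word = left channel volume, high word = right channel volume. */
        waveOutGetVolume((HWAVEOUT)0, &g_savedVolume);
        waveOutSetVolume((HWAVEOUT)0, 0x00000000);   /* both channels to zero */
    }

    void UnmuteWaveOut(void)
    {
        waveOutSetVolume((HWAVEOUT)0, g_savedVolume); /* restore the previous level */
    }

From the .NET Compact Framework you would declare these functions with DllImport against coredll.dll, following the same pattern as the pinvoke.net page above.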
You cannot control the sound volume for a specific application only. What you could do is define a static Mute boolean: when it is set to true, the sounds are simply not played.
Or you could look for a custom library that allows playing audio files and controlling their volume - although that is overkill if you are only interested in mute/unmute.

Can J2ME access the camera (image capture) event on an N73 device

I am working on a project where I need to catch the image capture event.
It's for the Nokia N73, which runs the S60 3rd Edition platform.
Is there any possible way to do this using J2ME only (without using Symbian)?
Description:
The J2ME application runs in the background; when the user captures an image with the camera, the J2ME application starts and comes to the front, takes the captured image, transfers it to the J2ME app, and displays it on screen.
If it is not possible using J2ME, is there any possible way using Symbian? Can anyone provide a tutorial or code snippet?
Thank you.
Regards,
Rajiv
It is not possible to access the native camera application from J2ME. You'd need to get the user to start your app first, then access the camera from your app (using JSR 135; spec here, introduction and examples here). Then you can use the captured image however you wish.
HTH
The N73 in particular has a fairly large hardware limitation when you want to use the camera.
You need to have the user manually open the camera cover before you can use the camera.
This launches the native camera application included in S60.
The user then needs to close that application.
From that point on, J2ME can use the camera, via the mobile media API defined in JSR-135.
If the user reboots the phone, the camera cover needs to be re-opened before J2ME can use the camera again.
You may have better luck using J2ME and JSR-135 to capture images with the front camera on the N73.
I seriously doubt that J2ME would see the user pressing the camera key in javax.microedition.lcdui.Canvas.keyPressed().
JSR-135 doesn't really provide a system-wide camera capture event for J2ME.

Resources