Facing issue with Camera in Xamarin iOS 13 [duplicate]

Android emulators can simulate a camera device (see screenshot).
For example, I can test how my video recording module works.
What about iOS Simulators? When I try to run my app, which uses the camera, I get the following error:
Thread 5: Fatal error: Unexpectedly found nil while unwrapping an Optional value
at line
let videoDeviceInput = try AVCaptureDeviceInput(device: defaultVideoDevice!)
So are no simulated camera devices available in the iOS Simulator?

According to the Apple documentation, using the camera in the Simulator is not supported:
The following hardware is not supported in Simulator:
Ambient light sensor
Audio input, except for using Siri by choosing Hardware > Siri.
Barometer
Bluetooth
Camera
Motion support (accelerometer and gyroscope)
Proximity sensor

There is one known workaround that can sometimes be useful: https://github.com/YuigaWada/iCimulator
However, it does not work with third-party libraries like WebRTC.

Related

Kinect DK camera not showing color image

The situation is very strange: when I got the device over a year ago, it worked fine. I have since updated the SDK, Windows 10, and the firmware; all are at the latest versions.
The behavior in the official "Azure Kinect Viewer" is the following:
1. The device is shown and opens fine.
2. If I hit Start on the defaults, I get a "camera failed: timed out" after a few seconds, and all other sensors (depth, IR) show nothing; I think they are not the issue here (see 3).
3. If I de-select the color camera (after a stop and start), the depth and IR streams are shown at 30 fps as expected.
So my problem is with the color camera. Since everything runs over the same wires, I'm assuming the device itself is fine and that the cable works.
I have tried the color camera with and without depth, across all frame rate and resolution options; none work at all, the color camera always fails. In some cases I get a cryptic error message in the log, something involving libUSB; I rarely get it, so I'm not including the exact output.
When I open the standard Windows 10 Camera app just to see if it detects the color camera, I do get an image (eureka!), but the refresh rate is about one frame every 5 seconds or more; I've never seen anything like that.
I'm very puzzled by this behavior. I'm also including my Device Manager layout for the device in case it gives someone a hint; as far as I can see, this should be a supported configuration (I tried switching ports, but in most cases the device was either never detected or showed the same color image behavior).
Any hints on how to move forward, much appreciated.
Bonus question:
If I solve this issue with the color camera, is there a way to work with the DK camera from a Remote Desktop session? (The microphones seem not to be detected over Remote Desktop.) My USB device manager looks fine as far as I can see.
P.S.
I also tried "disable streaming LED", and instead of "camera failed" in the DK viewer I can get a color image, but with a frame rate of about one frame per 4 seconds or slower.
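One way to take the viewer out of the equation is to drive the color stream directly through the Azure Kinect Sensor SDK's C API and see where it times out. Below is a minimal sketch; the MJPG/720p/15 fps settings are just a low-bandwidth starting point I chose, not anything the viewer uses (link against k4a):

#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
    {
        fprintf(stderr, "failed to open device\n");
        return 1;
    }

    /* Color-only configuration; depth stays off so any failure is
       attributable to the color camera alone. */
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.color_format = K4A_IMAGE_FORMAT_COLOR_MJPG;
    config.color_resolution = K4A_COLOR_RESOLUTION_720P;
    config.camera_fps = K4A_FPS_15;

    if (k4a_device_start_cameras(device, &config) != K4A_RESULT_SUCCEEDED)
    {
        fprintf(stderr, "failed to start cameras\n");
        k4a_device_close(device);
        return 1;
    }

    /* Pull a handful of captures with a generous timeout. */
    for (int i = 0; i < 10; i++)
    {
        k4a_capture_t capture = NULL;
        k4a_wait_result_t res = k4a_device_get_capture(device, &capture, 5000);
        if (res == K4A_WAIT_RESULT_SUCCEEDED)
        {
            printf("capture %d ok\n", i);
            k4a_capture_release(capture);
        }
        else if (res == K4A_WAIT_RESULT_TIMEOUT)
        {
            printf("capture %d timed out\n", i);
        }
        else
        {
            fprintf(stderr, "capture %d failed\n", i);
            break;
        }
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}

If even this times out at the lowest settings, the problem sits below the viewer (SDK/USB layer), which would fit the libUSB messages you occasionally see in the log.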

PCIe cards interfere with each other's function

Hello Good People of the Internet!
First time asking...
I have a modern PC running Fedora 24 with a real-time patch (CCRMA audio tools) and an ASUS Essence STX II stereo sound card installed on PCIe. With it we run a playback/capture application. We also need to integrate CAN and BLE into the system and have a PCIe card for each of these functions. The CAN PCIe card is from PEAK, and the BLE card is an Intel 8260 M.2 card that HP has put on a PCIe adapter (AFAIK).
With only the audio card installed everything works fine (using ALSA as the API). When the CAN and BLE cards are installed, the following is observed:
Playback works as before.
One capture channel only returns zero (0) or minus one (-1) in all samples.
The other capture channel returns values in the range -2..2, and after applying our application's signal processing, the expected results appear, low quality but detectable.
The ALSA API reports no problems in setup and configuration.
CAN and BLE functions as expected.
Without any deeper PCIe experience, I suspect that the CAN and/or BLE PCIe cards jumble the mapping of the sound card's functions.
Can someone:
- Tell me if my hunch is in the ballpark?
- Point me to where I might find information on how to rectify the problem?
- ...or, share a solution?
Thanks!
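To rule out the application layer, it can help to read the capture channels straight through ALSA and print per-channel peak values. A minimal sketch, assuming the STX II shows up as hw:0,0 (check with arecord -l) and supports S16_LE stereo at 48 kHz; compile with -lasound:

#include <stdio.h>
#include <stdlib.h>
#include <alsa/asoundlib.h>

int main(void)
{
    snd_pcm_t *pcm;
    short buf[2 * 480]; /* 480 interleaved stereo frames, S16_LE */

    /* "hw:0,0" is an assumption; pick the STX II device from `arecord -l`. */
    if (snd_pcm_open(&pcm, "hw:0,0", SND_PCM_STREAM_CAPTURE, 0) < 0)
        return 1;
    if (snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE,
                           SND_PCM_ACCESS_RW_INTERLEAVED,
                           2, 48000, 1, 500000) < 0)
        return 1;

    for (int i = 0; i < 100; i++)
    {
        snd_pcm_sframes_t n = snd_pcm_readi(pcm, buf, 480);
        if (n < 0)
            n = snd_pcm_recover(pcm, n, 0);
        if (n <= 0)
            continue;

        int lmax = 0, rmax = 0; /* per-channel peak magnitude */
        for (long f = 0; f < n; f++)
        {
            if (abs(buf[2 * f]) > lmax)     lmax = abs(buf[2 * f]);
            if (abs(buf[2 * f + 1]) > rmax) rmax = abs(buf[2 * f + 1]);
        }
        printf("block %d: peak L=%d R=%d\n", i, lmax, rmax);
    }

    snd_pcm_close(pcm);
    return 0;
}

If one channel's peaks stay at 0 or 1 even here, the fault is below ALSA (driver or hardware level) rather than in your signal processing; comparing /proc/interrupts and lspci -v output with and without the CAN/BLE cards installed would be my next step, to see whether the new cards end up sharing resources with the sound card.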

Bluetooth headphone music quality deteriorates when launching iOS simulator

The situation goes a little something like this:
I am programming in Xcode whilst concurrently listening to music on my Bluetooth headphones... you know, to block out the world.
Then, I go to launch my app in the iOS simulator and BOOM all of a sudden my crystal clear music becomes garbled and super low quality like it is playing in a bathtub 2 blocks away... in the 1940s.
Note: the quality deterioration does NOT occur if I am playing music on my laptop or cinema display and I launch the sim. It seems to be exclusively a Sim -> Bluetooth issue.
The problem is more than just annoying, because often, after stopping the simulator, the crappy bathtub-quality music continues. To fix it, I have to open Sound preferences in OS X and briefly toggle back to my laptop sound and then back to my Bluetooth headphones.
This is a big deal because I launch the simulator 50x a day and have to do this toggle thing every time as well as suffer listening to 40s era mono ham radio quality music.
For your information, the headphones I am using are Plantronics BackBeat Pro and I am up to date on firmware. I am on OSX 10.11.4 and Xcode 7.3... but this problem has persisted through all versions for 2+ years now. Can you save me from the 1940s?
I've managed to fix it, and it actually seems to be a microphone issue. Go to System Preferences -> Sound, select the Input tab, and set Internal Microphone as the input (mine was set to my headphones').
The crappy sound goes away after that =)
EDIT (May 30 2018):
I've found an easier way to do the same as above. Instead of opening System Preferences, you can go to the Mac OS X menu bar, Option (alt) + click on the sound icon, and then select "Internal Microphone" from the "Input Device" list. Screenshot as follows.
If you're using Xcode 9 or higher, you can set a default audio input and output for the simulator. This can be done by launching the simulator from Xcode and navigating to I/O > Audio Input within the menu bar and selecting Internal Microphone. This solution will save your audio preference so you won't have to change it on every launch.
In the Simulator, select:
I/O -> Audio Input -> MacBook [Pro]
Done.
It seems the years of suffering are finally over; from the Xcode 12 Beta Release Notes:
Simulator defaults to the internal microphone unless you explicitly choose a different audio source. This avoids triggering phone call mode on Bluetooth headsets which degrades audio quality while listening to music. (59338925, 59803381)
You can also switch to the Mac's internal mic in System Preferences -> Sound; that's how I usually fix this bug (I have a Sony WH-1000XM3).

Bypassing Android Invensense Motion Library to create custom magnetometer calibration

I am looking to get the raw values from the magnetometer chip: raw as in uncalibrated, unaltered, straight from the I2C interface. I've traced through the Android Java and native source and the InvenSense API down to the HAL, and found that for the Galaxy Nexus (which uses the Yamaha GMR magnetometer) InvenSense applies an "adaptive filter" and a threshold filter (dead zone), then performs a real-time magnetometer calibration to compensate for hard- and soft-iron biases. I would like to bypass that calibration algorithm and replace it with my own.
Is there a way to intercept the magnetometer data after the serial-comm code but before the calibration? Can you access the InvenSense libraries through the Android NDK?
Currently developing on :
- Galaxy Nexus
- Android 4.0 ICS
- Eclipse IDE, Windows environment
You can change the kernel driver to print those raw values directly.
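To sketch that idea (I can't give the exact patch, since the Yamaha driver source varies between kernel trees, so every name below is hypothetical): in the driver's data-read path, dump the values to the kernel log right after the I2C transfer, before anything in user space or the InvenSense library sees them.

#include <linux/i2c.h>
#include <linux/kernel.h>

#define YAS_REG_DATA 0x00 /* placeholder; use the chip's real data register */

/* Hypothetical read function; the real driver has its own equivalent. */
static int yas_read_raw(struct i2c_client *client, s16 xyz[3])
{
    u8 buf[6];
    int err;

    err = i2c_smbus_read_i2c_block_data(client, YAS_REG_DATA,
                                        sizeof(buf), buf);
    if (err < 0)
        return err;

    /* Byte order is a guess; check the chip's datasheet. */
    xyz[0] = (s16)((buf[0] << 8) | buf[1]);
    xyz[1] = (s16)((buf[2] << 8) | buf[3]);
    xyz[2] = (s16)((buf[4] << 8) | buf[5]);

    /* Raw, uncalibrated readings go to dmesg, ahead of the HAL and
       the InvenSense calibration pipeline. */
    printk(KERN_INFO "yas raw: %d %d %d\n", xyz[0], xyz[1], xyz[2]);
    return 0;
}

You could then collect the values in user space by parsing dmesg (or exposing a debugfs file if you want something cleaner), completely bypassing the InvenSense calibration.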

Capturing webcam stream under linux

I'm trying to get images from a Minoru 3D webcam, which is actually two Vimicro webcams plus a USB hub in a single package. The problem is that OpenCV always opens streams at maximum resolution, making simultaneous capture from two webcams impossible (due to USB bandwidth constraints). How do I set the resolution or FPS? For some reason, the OpenCV calls
cvSetCaptureProperty( capture, CV_CAP_PROP_FRAME_WIDTH, 320 );
cvSetCaptureProperty( capture, CV_CAP_PROP_FRAME_HEIGHT, 240 );
don't work. I don't have to use OpenCV; any other library doing the same job is good for me. The webcam uses the UVC driver from kernel 2.6.30, with V4L2. I tried the custom module from http://linuxtv.org/hg/~pinchartl/uvcvideo on my Ubuntu box with a 2.6.27 kernel.
I used luvcview and v4l2cam for my purposes. The latter is specifically written for the Minoru.
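If OpenCV keeps ignoring those property calls, you can negotiate the resolution and frame rate directly through the V4L2 ioctl interface before capturing. A minimal sketch, assuming the camera shows up as /dev/video0 and supports YUYV; the driver adjusts the requested values to the nearest mode it supports:

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR); /* device path is an assumption */
    if (fd < 0) { perror("open"); return 1; }

    /* Request 320x240 YUYV; the driver may round to a supported size. */
    struct v4l2_format fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    fmt.fmt.pix.width = 320;
    fmt.fmt.pix.height = 240;
    fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
    fmt.fmt.pix.field = V4L2_FIELD_ANY;
    if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) { perror("VIDIOC_S_FMT"); return 1; }
    printf("got %ux%u\n", fmt.fmt.pix.width, fmt.fmt.pix.height);

    /* Frame rate via the streaming parameters: 1/15 s per frame = 15 fps,
       which halves the USB bandwidth each camera needs. */
    struct v4l2_streamparm parm;
    memset(&parm, 0, sizeof(parm));
    parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    parm.parm.capture.timeperframe.numerator = 1;
    parm.parm.capture.timeperframe.denominator = 15;
    if (ioctl(fd, VIDIOC_S_PARM, &parm) < 0) perror("VIDIOC_S_PARM");

    close(fd);
    return 0;
}

From here you would set up mmap-based streaming (VIDIOC_REQBUFS / VIDIOC_STREAMON) in the same program rather than handing the device back to OpenCV, since OpenCV renegotiates the format when it opens the device.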
