Camera capture looks different on Windows and Linux

Hey, I'm trying to use a Raspberry Pi with OpenCV to process some images, but when I capture images from the webcam with the same settings on Windows (on my PC) and on Ubuntu MATE (on the Pi), they look very different.
(image: Linux capture)
(image: Windows capture)
Does anyone know why there is any difference?

It looks like your USB webcam probably has a compatibility problem with the Pi (or with its drivers in Ubuntu MATE).
Try googling the camera model together with "Linux compatibility" to see whether there are any known workarounds.
Googling "raspberry pi usb camera compatibility" gives you various lists of known compatible cameras.
In terms of performance (e.g., frame rate), a USB webcam is generally worse than the Raspberry Pi camera module attached to the Camera Serial Interface (CSI).
To get that working in OpenCV, you can use this library:
RaspiCam: C++ API for using the Raspberry Pi camera with/without OpenCV
www.uco.es/investiga/grupos/ava/node/40
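Before blaming the hardware, it may also be worth ruling out different driver defaults: the Windows (DirectShow) and Linux (V4L2) drivers often apply different auto-exposure and auto-white-balance settings. A minimal sketch for inspecting and pinning those controls on the Ubuntu MATE side, assuming the v4l-utils package is installed and the webcam is /dev/video0 (the exact control names vary per camera):

    sudo apt-get install v4l-utils          # V4L2 command-line tools
    v4l2-ctl -d /dev/video0 --list-ctrls    # show every control and its current value

    # example only: lock white balance and exposure so both systems start from
    # comparable settings (control names/values depend on your camera)
    v4l2-ctl -d /dev/video0 --set-ctrl=white_balance_temperature_auto=0
    v4l2-ctl -d /dev/video0 --set-ctrl=exposure_auto=1
    v4l2-ctl -d /dev/video0 --set-ctrl=exposure_absolute=250

If the images still differ after fixing the controls, a driver compatibility issue as described above becomes more likely.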

Related

Raspberry Pi Camera Module undetectable by common applications

I bought a 5 MP camera module from Amazon for my Raspberry Pi 4 (2 GB) model. I configured it and tested it with raspistill and raspivid, and it works as expected. But since it is a module connected to the CSI port and not a USB camera, it is not detectable by some common applications, e.g. OBS (from Pi-Apps) and Zoom (from Pi-Apps, Pi-Kiss and its web portal).
What I tried:
Virtual camera through OBS: I was able to install OBS, but I wasn't able to compile its virtual-camera and camera-module plugin; it had numerous errors.
IP camera adapter: the idea was to stream the camera feed on the local network and then convert that feed into a virtual camera. Yes, there are many such applications, but they are all available only for Windows/Mac and not for Linux, and even the few that are available don't support the Raspberry Pi's architecture.
Is there any workaround or trick to make the module work like a normal camera?
P.S.: If you are wondering why this question is on Stack Overflow, I feel this is a software-related question and Stack Overflow is the best place for that ;).
Have you tried looking into libcamera?
https://www.arducam.com/docs/cameras-for-raspberry-pi/raspberry-pi-libcamera-guide/
That might not be the most straightforward answer you're looking for, but I recently did some work with a PiCam and found libcamera to work wonders. I used it at a fairly low level and didn't try to point it at additional programs, but perhaps you can find something useful in there! Good luck.
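If the goal is simply to make the CSI module show up as a normal /dev/video* webcam that OBS or Zoom can open, one common trick with the libcamera stack is to feed it into a v4l2loopback device. This is only a sketch, assuming the libcamera-apps, v4l2loopback-dkms and ffmpeg packages are installed; the resolution and flags are illustrative and may need adjusting for your OS release. (Since raspistill/raspivid work for you, you may still be on the legacy camera stack, in which case loading the bcm2835-v4l2 module, as in the Node.js question further down, is the simpler route.)

    sudo modprobe v4l2loopback video_nr=10 card_label="PiCam" exclusive_caps=1   # creates /dev/video10
    libcamera-hello --list-cameras                                               # confirm libcamera sees the module

    # stream raw frames from the CSI camera into the loopback device
    libcamera-vid -t 0 --width 640 --height 480 --codec yuv420 -o - | \
        ffmpeg -f rawvideo -pix_fmt yuv420p -s 640x480 -i - -f v4l2 /dev/video10

    # applications can now open /dev/video10 like any USB webcam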

How to play audio in a Docker container on a Raspberry Pi?

I would like to play audio on my Raspberry Pi; the audio player (a console application) should run in a Docker container. I have seen multiple articles on the web, and they commonly suggest adding --device /dev/snd to the docker run command, but I just can't seem to get the audio through. Audio should be played on the device attached via the HDMI cable. I tried the sox player, took Ubuntu 18.04 as the base image, and also tried this sample solution. I'm flexible with what the media player application is, as this is a hobby project (it would be nice if I could "pipe" an mp3 file to the program via bash while it is being streamed from somewhere). On the host I recently installed "Raspberry Pi OS with desktop and recommended software" (release date October 30th 2021, kernel version 5.10, whatever came with it). Pi 2 Model B 1.1 (2014).
UPDATE: audio does play on the 3.5mm jack
It seems that with the --device /dev/snd option, audio does play on the 3.5mm jack. I don't know, though, how to make this work over HDMI.
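Since --device /dev/snd already gets audio to the jack, the remaining issue is usually just ALSA device selection: on recent Raspberry Pi OS releases the HDMI output and the headphone jack appear as separate ALSA cards. A sketch, assuming alsa-utils inside the container and an illustrative test file mounted at /media/test.wav; the card/device numbers must be taken from the aplay -l output:

    # list the playback devices visible inside the container, note the HDMI card number
    docker run --rm --device /dev/snd ubuntu:18.04 \
        sh -c "apt-get update -qq && apt-get install -y -qq alsa-utils && aplay -l"

    # play directly on the HDMI card (replace 1,0 with whatever aplay -l reported)
    docker run --rm --device /dev/snd -v "$PWD:/media" ubuntu:18.04 \
        sh -c "apt-get update -qq && apt-get install -y -qq alsa-utils && aplay -D plughw:1,0 /media/test.wav"

Alternatively, set the HDMI card as the default device in /etc/asound.conf inside the image so the player does not need an explicit -D flag.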

QR scanning using the Raspberry Pi camera and Node.js

I am creating a Node.js application that can be used to scan QR codes on the Raspberry Pi 3 board.
I am able to successfully use a USB camera and scan QR codes using the Instascan node module.
However, when I try to use the Raspberry Pi camera, Instascan is not able to find it and cannot show the camera.
I have found many such options using Python and OpenCV, but not with Node.js or Electron.
Can someone help with this?
Finally, this is resolved.
By default the Raspberry Pi camera does not show up in the /dev/video* list,
so I had to enable the V4L2 driver with
modprobe bcm2835-v4l2
This made the CSI camera appear in the /dev/video* list, and the application started detecting the Raspberry Pi camera.
Thanks, everyone.
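To avoid repeating the modprobe after every reboot, the module can be loaded automatically and the device list verified; a small sketch, assuming the same legacy camera stack as in the answer above:

    sudo modprobe bcm2835-v4l2                     # load the driver now (as above)
    v4l2-ctl --list-devices                        # from v4l-utils; or: ls /dev/video*
    echo bcm2835-v4l2 | sudo tee -a /etc/modules   # load the module automatically at boot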

UVC function config interface

I'm reading Linux Documentation about UVC function. I'm struggling to understand an example that starts here and goes until here. What exactly is this going to do and where exactly do I create these files?
Any help is appreciated.
From your other posts I gather that you are attempting to implement a UVC gadget with a Xilinx device. Nonetheless, as Linux devices share the same opaque kernel documentation, the procedure is just as error-prone on the Raspberry Pi Zero and other OTG-enabled devices.
What exactly is this going to do
The idea of a UVC gadget is to build something that acts like a webcam. Once completed, you could potentially connect that device to a Mac or PC and use it as your video source for FaceTime or Skype.
Depending on your goals you could stream synthetic images, a recorded video, or passthrough video from an add-on like a MIPI CSI camera.
where exactly do I create these files?
Here's a great intro to ConfigFS: link. Again it's for Raspberry Pi Zero rather than your Xilinx device, but the same concepts apply.
While gadget-testing.txt is inconveniently curt, if you start off by running:
modprobe libcomposite
cd /sys/kernel/config/usb_gadget/
then you can proceed with the steps mkdir functions/uvc.usb0/control/header/h ...
Here is a more detailed post covering various caveats on Raspberry Pi Stack Exchange.
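For orientation, here is a condensed sketch of the ConfigFS boilerplate that comes before the UVC-specific directories: create a gadget, give it IDs and strings, create the uvc function and a configuration, link them, and finally bind to the UDC. The vendor/product IDs, names, and the UDC below are placeholders; the UVC control/streaming headers and frame descriptors follow the gadget-testing.txt example from the commented line onwards.

    modprobe libcomposite
    cd /sys/kernel/config/usb_gadget/

    mkdir g1 && cd g1                      # one directory per gadget
    echo 0x1d6b > idVendor                 # placeholder IDs (Linux Foundation test IDs)
    echo 0x0104 > idProduct
    mkdir strings/0x409
    echo "My UVC gadget" > strings/0x409/product

    mkdir configs/c.1                      # one configuration
    mkdir functions/uvc.usb0               # the UVC function itself
    # ... UVC-specific setup from gadget-testing.txt goes here:
    #     mkdir functions/uvc.usb0/control/header/h, the streaming headers,
    #     frame descriptors, and the class/* symlinks ...
    ln -s functions/uvc.usb0 configs/c.1/  # put the function into the configuration

    ls /sys/class/udc                      # find the name of your UDC
    echo <udc-name-from-previous-step> > UDC   # bind; the gadget then enumerates on the host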

Kinect for Xbox 360 freezes and disconnects from USB after running Processing SimpleOpenNi depth image example

Please help.
I've been trying to set up a Kinect for Xbox 360 to run on Ubuntu in order to start developing an application to control a humanoid robot. For the past four days I've been searching, downloading, installing, and trying dozens of libraries and drivers to get the Kinect to work on Ubuntu. In the beginning nothing was working and I was only able to read the RGB camera with Camorama and guvcview, no matter what library or driver I attempted to run. Finally, I installed a fresh copy of Ubuntu and installed the libfreenect libraries using Synaptic (I'm kind of a newbie), and I also installed the following packages
https://code.google.com/p/simple-openni/downloads/detail?name=OpenNI_NITE_Installer-Linux64-0.27.zip&can=4&q=
along with Processing 2.0 and SimpleOpenNi-0.27
I start Processing -> Examples -> OpenNI -> DepthImage and run it,
and the Kinect starts for 3 to 10 seconds, giving the image below (sometimes along with the RGB image and sometimes without it); then the frame freezes. When I list the USB devices ($ lsusb) there are no Kinect camera or audio devices listed, so the Kinect must be unplugged from the adapter and USB and then re-inserted, and the problem still occurs after running the sketch.
Attempted solutions:
1. removing and blacklisting the gspca kernel module
2. disabling USB auto-suspend
but the problem still occurs...
I'm using a Kinect for Xbox 360 with a 12 V / 1.08 A USB AC power adapter:
http://www.walmart.com/ip/INSTEN-USB-AC-Power-Adapter-For-Microsoft-Xbox-360-Kinect-Sensor/28882271
My laptop is a Dell Inspiron 1525, Intel Core 2 Duo, 2 GB RAM,
running Ubuntu 14.04.2 LTS, release 14.04, codename trusty.
Can anyone help me, please?
I had a similar problem, and after I used the Kinect with Windows I found that the problem was with the Kinect itself.
The following tips will help you get started using your Kinect:
If a non-Microsoft driver for the Kinect is installed on your computer, the Kinect for Windows drivers might not install or function correctly. To fix this, uninstall the non-Microsoft drivers before installing the Kinect for Windows SDK.
Connect the power supply for the Kinect to an external power source; if the Kinect has only power from the USB connection, it will be minimally functional and light the LED, but it must be connected to an external power source to be fully functional.
No tools are required for calibration of audio and video.
Your Kinect should be the only device plugged into a USB hub on your computer. If you have more than one Kinect, connect them to different USB controllers. If 2 hubs are connected to the same controller, only 1 Kinect can work at a time.
The Kinect is protected from overheating by a fan. It is controlled by the sensor's firmware, which turns off the camera at 90 degrees Celsius. There is no software interface for applications to control the fan.
Reasonable lighting, neither extremely dark nor extremely bright, is important for capturing images with the RGB camera. Incandescent, fluorescent, and natural lighting provide no special obstacles, but do not point an intense or constant light source at the camera because this can blind the RGB sensor.
The depth sensor functions adequately in typical and reduced lighting, although in near darkness there is increased noise in the signal.
The depth sensor reads depth information from reflected light. Objects that are highly reflective (mirrors and shiny metal) or highly absorptive (fluffy and/or dark materials) may not be registered by the depth sensor as successfully as other objects.
For detailed instructions on setting up a Kinect sensor, please follow:
https://msdn.microsoft.com/en-us/library/hh855356
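Before going back to Processing, it can also help to confirm that the sensor stays enumerated and works with libfreenect alone. A sketch, assuming the libfreenect demo tools are available (on Ubuntu 14.04 the package is typically freenect or libfreenect-demos):

    # 045e is Microsoft's USB vendor ID; the Xbox NUI Motor, Camera and Audio
    # entries should stay present while the sketch runs
    lsusb | grep -i 045e

    # install and run libfreenect's own depth/RGB viewer, independent of Processing
    sudo apt-get install freenect
    freenect-glview

If freenect-glview runs stably, the hardware, cabling, and power adapter are fine, and the freezes are more likely coming from the OpenNI/SimpleOpenNI layer than from the Kinect itself.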
