Make website audio output play on the Raspberry Pi

I created a website that uses an iframe to embed a YouTube video. I use the Apache2 web server on my Raspberry Pi and open the website from another device. The audio plays on my own device instead of on the Raspberry Pi. I need it to play on the Raspberry Pi.
My code:
<iframe width="560" height="315" src="https://www.youtube.com/embed/b4Bj7Zb-YD4" frameborder="0" allowfullscreen></iframe>
Should I use JavaScript or PHP?
How should I do this?

You are loading it as a web page from the server; that's why it plays on your device.
If you would like the sound to play on the RPi, the RPi itself has to connect as a client to that server.
I would recommend using sockets to play your sound on a Node server. For that you will need Node.js and sockets. There is a Node module for playing audio: https://www.npmjs.com/package/play-sound
You should build your own server with Node and have the other device send a socket event to the RPi server to play some MP3 files.
If you need to see the video on your other device while the sound plays on the RPi, home-theater style, a server won't help: that sound never touches the RPi, because your other device fetches the YouTube video directly.
I would recommend connecting a display to the RPi and having a browser on the RPi connect to localhost (the Apache server).
I can post an example if a sound player is needed (see the sketch below).
Also, use this as a starting point for building the server; I posted an answer here:
Server
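A minimal sketch of such a player server, assuming socket.io and play-sound are installed (npm install socket.io play-sound), that an alert.mp3 file sits next to the script, and that the "play" event name is made up for illustration:

// server.js - runs on the RPi and plays a sound when a client emits "play"
const { Server } = require('socket.io');
const player = require('play-sound')(); // shells out to mplayer/aplay/etc.

const io = new Server(3000); // listen for socket.io clients on port 3000

io.on('connection', (socket) => {
  socket.on('play', (file) => {
    // Only play a whitelisted file; never trust arbitrary client paths.
    if (file === 'alert.mp3') {
      player.play(file, (err) => {
        if (err) console.error('playback failed:', err);
      });
    }
  });
});

From the page served by Apache, the other device would then connect with the socket.io client and emit the event, e.g. io('http://<rpi-address>:3000').emit('play', 'alert.mp3').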

Related

Live streaming from Raspberry Pi to NodeJS server hosted on Google App Engine

Description:
I have a Raspberry Pi controlling a small vehicle, and it has a RealSense camera attached to it. What I need to do is send a live stream from the camera to an HTML page/Node.js server hosted on Google App Engine, so that the user can see the stream on their page. The stream will be used to manually control the vehicle, so low latency is very important.
What I attempted:
My current solution is just a simple socket connection using Socket.IO: I send a frame through the socket, decode it, and display it in the page. The problem is that this method is extremely slow, and from what I understood it is not a good way to stream to a remote client, which is why I need to find a different way.
I tried using uv4l. When I run the line uv4l --driver uvc --device-id "realsense camera id" it says the camera is recognized, but then it immediately stops without any error. When I try to open the stream with my IP and click "call" I get the error "invalid input device". I could not find any solution for this problem.
I also thought about using WebRTC. I tried to follow this example (which is the closest I found to what I need): https://dev.to/whitphx/python-webrtc-basics-with-aiortc-48id , but it uses a Python server and I want to use my GAE/Node.js server, and I'm struggling to figure out how to convert this code to use a Python client and a Node.js server.
If anyone can provide some information or advice I'd really appreciate it.
If you want to control the vehicle, latency is extremely important. Ideally it should be around 100 ms, and it should not exceed 400 ms even when the network jitters for a while.
Latency is introduced everywhere: by the encoder on the Raspberry Pi, the transfer to the media server, and the H5 (HTML5) player. The encoder and the player contribute the most.
The best solution is to use a UDP-based protocol like WebRTC:
Raspberry Pi                                       PC (Chrome H5)
Camera --> Encoder ---UDP--> Media Server --UDP--> WebRTC Player
So I recommend using WebRTC to encode and send the frames to the media server, with an H5 WebRTC player on the other end. You could test this solution by replacing the encoder with an H5 WebRTC publisher; the latency is about 100 ms, please see this wiki. The architecture is below:
Raspberry Pi                                       PC (Chrome H5)
Camera --> WebRTC ---UDP--> Media Server --UDP---> WebRTC Player
Note: The WebRTC stack is complex, so you could build from H5 to H5 first and test the latency, then move the media server from the intranet to the internet and test again, and finally replace the H5 publisher with your Raspberry Pi and test the latency once more.
If you want to get the solution running ASAP, FFmpeg is a better encoder: it encodes the frames from the camera and packages them as RTMP packets, then publishes to the media server over RTMP, and finally the stream is played by the H5 WebRTC player, please read this wiki. The latency is larger than with the WebRTC encoder, I think around 600 ms, but it should be OK for running the demo. The architecture is below:
Raspberry Pi                                       PC (Chrome H5)
Camera --> FFmpeg ---RTMP--> Media Server --UDP--> WebRTC Player
Or use SRT instead of RTMP; note that SRT is also a realtime protocol, with about 200~500 ms of latency.
Note that you could also run the media server on the Raspberry Pi and use the WebRTC player to play the stream from it when they are on the same WiFi. The latency should then be minimal, because the transport stays on the intranet.
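For the FFmpeg/RTMP path, the encoder invocation on the Pi might look like the line below. Treat it as a sketch: the device path (/dev/video0), the x264 settings, and the server URL are all assumptions to adapt to your setup (the RealSense may also need explicit -video_size/-framerate options):

ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset ultrafast -tune zerolatency -f flv rtmp://your-media-server/live/stream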

Jack2 internal audio routing in Ubuntu 14.04

I am trying to accomplish the following in Ubuntu 14.04.
I have installed the SIP client Linphone and want to connect its audio to Adobe Connect, which runs in a browser (Firefox, for instance). So what I need is two-way communication such that:
Linphone audio output --> Adobe Connect audio input
Adobe Connect audio output --> Linphone audio input
I understand that Jack2 (http://jackaudio.org/) is supposed to be able to route audio between different applications. I guess what I have to do here is configure it so that all the audio that comes from Firefox is routed to Linphone's input and all the audio that comes from Linphone is routed to Firefox's input.
I succeeded in installing Jack2 with QjackCtl, but I am unable to configure it. When going to "Connect -> Audio" I was expecting to be able to select between the various running applications in order to reroute the audio. Instead, all I can do is connect my microphone's input to either of my speakers.
What would be the right workflow to follow here? Do I have to configure some virtual microphones/speakers to make it work? If so, how?
Any help would be greatly appreciated.

How to run Node.js on the ESP8266 (NodeMCU dev board)?

I am trying to connect Apple HomeKit to a NodeMCU board. I found a tutorial that works on my computer, but I wonder if there is any way to load and run Node.js on a NodeMCU board (ESP8266)?
You can run Espruino on it. I have the HiLetgo NodeMCU ESP-12E Module and was easily able to load the firmware.
1. Download and install the flasher: https://github.com/thingsSDK/flasher.js
2. Press and hold the flash button, then press the reset button once. It will flash once.
3. Open up the flasher app and flash the Espruino firmware.
4. Unplug the ESP8266 and plug it back in.
5. Download the Web IDE for Espruino: https://chrome.google.com/webstore/detail/espruino-web-ide/bleoifhkdalbjfbobjackfdifdneehpo?hl=en
6. Change the baud rate: Settings --> Communications --> Baud Rate --> 115200.
7. Connect and enjoy!
Some samples to try out
https://github.com/mertenats/NodeMCU-and-JavaScript
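Once Espruino is flashed, you write plain JavaScript in the Web IDE. A minimal sketch to prove the board is alive on your network, where YOUR_SSID and YOUR_PASSWORD are placeholders:

// Join Wi-Fi, then serve a tiny web page on port 80
var wifi = require('Wifi');
wifi.connect('YOUR_SSID', { password: 'YOUR_PASSWORD' }, function (err) {
  if (err) { console.log('Wi-Fi error:', err); return; }
  wifi.getIP(function (err, info) {
    console.log('Connected, IP:', info.ip);
  });
  require('http').createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello from Espruino on ESP8266!');
  }).listen(80);
});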
I think you're confusing NodeMCU, which is a firmware for ESP8266 boards that interprets the language Lua, with the JavaScript runtime Node.js, which runs on computers.
So the answer is: you cannot run Node.js on an ESP8266.

Live stream video and capture input from the user and send it to Raspberry Pi Robot

Need Help!
Let's assume I have a robot in a room with a camera on it. I need the video source to be available live on a website. I don't want any latency at all (assuming that I have a good internet connection). Also, if a user presses any keys while on the website, the robot needs to detect it and act accordingly. Now, I can handle all the actions the robot needs to perform once I get the keys. There's a Raspberry Pi on the robot.
What would be the easiest way to achieve bi-directional communication (one direction being video and the other plain text) between a browser and my robot, keeping the communication as fast as possible?
PS: I tried initiating a Google Hangout and embedding the video, but there's a latency of at least 1 minute.
Simple to do. Get the camera for the Raspberry Pi from here:
http://www.adafruit.com/products/1367
You could use Motion JPEG to transmit the video. Follow the instructions below:
http://blog.miguelgrinberg.com/post/stream-video-from-the-raspberry-pi-camera-to-web-browsers-even-on-ios-and-android
Once you have the IP of your video stream, you can display it in a website (see the sketch below).
To send commands to the Raspberry Pi, what's the complication? If your Raspberry Pi has an internet connection (it must, for the video stream), you can write a program that reads commands from your browser.
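To sketch the browser side of that setup (the stream URL, the WebSocket port, and the message format below are all made up for illustration; adjust them to whatever the tutorial and your own Pi-side program use):

<!-- control.html: show the live stream and send key presses to the robot -->
<img src="http://raspberrypi.local:5000/video_feed" alt="robot camera">
<script>
  var ws = new WebSocket('ws://raspberrypi.local:8000');
  document.addEventListener('keydown', function (e) {
    // Send a plain-text command the program on the Pi can act on.
    ws.send(JSON.stringify({ type: 'key', key: e.key }));
  });
</script>

On the Pi, a few lines with a WebSocket library (for example the ws package in Node.js) can receive these messages and drive the motors.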

How to display webcam on client side only?

I'd like to show a webcam inside a browser without any server interaction.
Everything should happen client side with minimal plugins usage.
This would replicate most default webcam software bundled with the cam itself.
You would need a plugin for each major browser (or an ActiveX control for IE) to communicate locally with the cam. Because of security, you cannot interact with local devices from a browser.
You could, however, write a little server that runs on localhost and serves a Flash or MPEG stream. Then it would be easy to link that webcam into a web page running on the same computer. This would not require any plugins, but it does require that you write an HTTP server that talks to the cam and serves its stream to the browser.
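With that localhost-server approach the page itself stays trivial; assuming the local server exposes an MJPEG stream at a URL like the hypothetical one below, embedding it is a single tag:

<img src="http://localhost:8080/webcam.mjpeg" alt="local webcam">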
