Streaming live camera feed (RTSP) using J2ME on Nokia phones - java-me

I want to create an app that can play a live camera feed on Nokia devices.
I created a sample app as described here:
http://www.developer.nokia.com/Community/Wiki/How_to_play_video_streaming_in_Java_ME
Using this, I am able to play YouTube or file-based RTSP streams, but not direct camera feeds.
Further details:
The IP camera sends its live captures as an RTSP feed
The video is in MPEG-4 format
The feed plays in RealPlayer and VLC on desktop systems
The feed does not play in RealPlayer on the mobile device
How do I create an app that can play this live feed?
It would also be fine for me if there is an existing media player capable of playing live feeds.

Related

How to play an RTSP stream from an IP video camera and NVR on a user web page

I want to play RTSP streams from IP video cameras (MP4, H.264) on my intranet web page; I use React. I have 12 cameras and an NVR.
I have not found a way to do this without an intermediate server (WebRTC is not suitable) that spends resources transcoding the H.264 stream to MJPEG.
If I set a high resolution and quality for the stream, a lot of resources are spent on transcoding, and, most importantly, streaming MJPEG images takes a lot of traffic.
Is there a way to stream from the IP camera directly to the web page so that decoding happens in the user's web browser?
This would free the intermediate server from the heavy load of large streams.
Playback also needs to work on mobile phones.
Thanks for the answer.
There is no way to stream an RTSP camera's H.264 video directly to a web browser.
But cameras can output still JPEG images, so you can create a web page that displays a fresh image from the camera every 200 ms or so.
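A minimal sketch of that polling approach, assuming the camera exposes a snapshot endpoint (the URL and element ID below are hypothetical):

// Refresh an <img id="cam"> with a new snapshot from the camera every 200 ms.
// The snapshot path is hypothetical; check your camera's documentation for its still-image endpoint.
const SNAPSHOT_URL = "http://camera.local/snapshot.jpg";
const img = document.getElementById("cam") as HTMLImageElement;

setInterval(() => {
  // A timestamp query parameter bypasses the browser cache.
  img.src = `${SNAPSHOT_URL}?t=${Date.now()}`;
}, 200);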
If you are not happy with that approach, you must use a media server in between, which pulls the RTSP stream from the camera and repackages it into a protocol the browser understands. You are mistaken on one point: no video transcoding is involved. I don't know why WebRTC is not an option for you, but most media servers offer four types of output:
Low latency:
WebRTC
WebSockets to MSE (Media Source Extensions)
High latency:
HLS
MPEG-DASH
None of these methods require transcoding of your original H.264 video as encoded by the RTSP camera/NVR. Some media servers you can use: Unreal Media Server, Wowza, Janus.
Live demo: http://www.umediaserver.net/umediaserver/demos.html
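As an illustration of the high-latency path, HLS playback in the browser is typically just a few lines with a player library such as hls.js; the manifest URL below is a placeholder for whatever your media server publishes:

import Hls from "hls.js"; // npm install hls.js

// Placeholder URL: use the HLS manifest published by your media server.
const src = "https://media-server.example/cameras/cam1/index.m3u8";
const video = document.querySelector("video")!;

if (Hls.isSupported()) {
  // MSE-based playback (Chrome, Firefox, Edge, Android).
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari and iOS play HLS natively.
  video.src = src;
}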
No browser has native RTSP support, so if you want decoding to happen on the end user's side, you'll have to use a custom web player.
You can start by looking at an open-source solution like this one:
git://github.com/Streamedian/html5_rtsp_player.git
It works on PC and Android, but it didn't work on iPhone for me (you can try it yourself at https://streamedian.com/demonstration/; maybe it's just my issue). You may find a better alternative, or fork it and make it work on all devices.
It still requires a middle-man proxy server, because it works over WebSockets, but since that proxy does no video conversion or decoding, it shouldn't take up many resources at all.

How can I stream and have the video available to download using Azure Media Services

I need to stream a TV signal (I have the rights) using Azure Media Services. At the same time, I need it to be available as a video that can be accessed and downloaded, at least in part. But how can I access part of this continuous video? I thought an encoding job was the tool for this, but I can't find a way. Is there any way to do it?
Solution 1: Use FFmpeg to download any Azure Media Services video or live stream.
For this you need FFmpeg installed; it doesn't matter whether you are using Windows, Linux, or macOS.
Download the latest FFmpeg here: https://ffmpeg.org/download.html
You also need the Azure Media Services smooth streaming URL of the video you are watching. Typically, this URL ends with 'manifest'.
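As a rough sketch (the URL is a placeholder; appending (format=m3u8-aapl) asks the streaming endpoint for the HLS form of the manifest, which FFmpeg can read, and -codec copy avoids re-encoding):

ffmpeg -i "https://<your-endpoint>/<locator>/<asset>.ism/manifest(format=m3u8-aapl)" -codec copy output.mp4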
Refer to this document for the step-by-step procedure to download a video or a live stream:
https://anduin.aiursoft.com/post/2020/5/15/download-any-azure-media-service-video-or-live-stream-with-ffmpeg
Solution 2: A live event can be set to either pass-through (an on-premises live encoder sends a multiple-bitrate stream) or live encoding (an on-premises live encoder sends a single-bitrate stream). For details about live streaming in Media Services v3, see Live events and live outputs.
Pass-through Live Event:
When using the pass-through Live Event, you rely on your on-premises live encoder to generate a multiple bitrate video stream and send that as the contribution feed to the Live Event (using RTMP or fragmented-MP4 input protocol). The Live Event then carries through the incoming video streams to the dynamic packager (Streaming Endpoint) without any further transcoding.
Live Encoding:
When using cloud encoding with Media Services, you would configure your on-premises live encoder to send a single bitrate video as the contribution feed (up to 32Mbps aggregate) to the Live Event (using RTMP or fragmented-MP4 input protocol). The Live Event transcodes the incoming single bitrate stream into multiple bitrate video streams at varying resolutions to improve delivery and makes it available for delivery to playback devices via industry standard protocols like MPEG-DASH, Apple HTTP Live Streaming (HLS), and Microsoft Smooth Streaming.
For more details, refer to the Azure Media Services live streaming documentation.

Trying to route audio from MuseScore to Ableton Live 10 via a JACK audio connection?

My goal is to write sheet music in MuseScore and have the audio output of its playback routed to Ableton Live.
I've tried using loopMIDI and LoopBe1 as virtual MIDI cables.
I have the JACK audio driver set in Ableton's audio preferences under ASIO drivers. As seen in the photo, Ableton recognizes the virtual MIDI cables as an input. I have MuseScore's JACK audio settings enabled, and a MIDI instrument set up in Ableton. However, when I play back audio in MuseScore, Ableton doesn't seem to recognize any input.
I was trying to follow along with this tutorial, but it seemed to omit certain details. For example, as seen in my image, I was only able to route general sound/MIDI devices together, not a specific [left1, right1] pair to another [in1, in2].
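For what it's worth, individual JACK ports can usually be listed and wired with the command-line tools that ship with JACK (the port names below are hypothetical; jack_lsp shows the real names on your system):

jack_lsp                                             # list every available input/output port
jack_connect "MuseScore:left1"  "ableton_live:in1"   # hypothetical port names
jack_connect "MuseScore:right1" "ableton_live:in2"

QjackCtl's Connections/Patchbay window does the same thing graphically.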

Take the audio of the YouTube video element

Intro:
I want to play a YouTube video clip and be able to control its state during the session (to sync between users). I want the YouTube video to be played on the currently chosen devices (this is a WebRTC app). E.g., I can choose a specific audio output for the app from the three that I have.
The problem that I have:
I am trying to get the YouTube video's audio in order to route it to the relevant audio output device. Currently, when I play the YouTube video, the audio is played through the default audio output device, not the one chosen in my app (I have the selected device ID saved).
What I actually want to achieve:
I want to play the YouTube player and hear the video's audio track through the chosen audio output device (i.e. the chosen speaker), not the default one.
What I am using:
Currently React-Player with my own add-ons.
I am using React and Node
Again:
The problem here is that the video is played on the default audio output of each client (I cannot attach it to a specific one).
setSinkId is not reachable.
Ideas:
Take the video element and grab its audio track: not possible with an iframe.
Use the YouTube API for it: I have never seen such an option.
I have some ideas about saving the audio as MP3, serving it myself, and doing the sync footwork, but I'd prefer not to.
Please let me know if you have an idea.
Thanks!
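For context, HTMLMediaElement.setSinkId can only be called on a media element your page owns, which is exactly what the YouTube iframe does not expose. A minimal sketch for an element you do control (the device ID would come from enumerateDevices):

// Route a media element you own to a specific audio output device.
// The YouTube iframe's internal <video> is cross-origin, so this cannot be applied to it.
async function playOnDevice(media: HTMLMediaElement, deviceId: string) {
  // Depending on your TypeScript lib version, a cast may be needed for setSinkId.
  await (media as HTMLMediaElement & { setSinkId(id: string): Promise<void> }).setSinkId(deviceId);
  await media.play();
}

// Example: pick the first audio output device the browser reports.
async function firstAudioOutput(): Promise<string | undefined> {
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.find((d) => d.kind === "audiooutput")?.deviceId;
}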

How to stream media playing in Songbird?

Songbird is an open-source media player with a lot of plugins. I want to broadcast the media playing in my Songbird over a network. Kindly do not suggest using another player.
Use PulseAudio as your audio framework; it has support for acting as a streaming server. It can play to both your local speakers and a network destination simultaneously.
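One way to do that, as a sketch assuming PulseAudio's stock RTP modules (the sink name below is only an example; list yours with pactl):

# Every sink has a ".monitor" source that mirrors what it is playing.
pactl list short sources
# Broadcast that monitor over RTP while local playback continues; replace the name with yours.
pactl load-module module-rtp-send source=alsa_output.pci-0000_00_1b.0.analog-stereo.monitor
# On the receiving machine, load the matching receiver module.
pactl load-module module-rtp-recv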
