I am using the navigator.webkitGetUserMedia API to capture the desktop and using microphone to capture audio. When I make the following call
navigator.webkitGetUserMedia({
    audio: true,
    video: {
        mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: id,
            maxWidth: screen.width,
            maxHeight: screen.height
        }
    }
}, gotStream, getUserMediaError);
I am getting a screen capture error. Does this API not support the above scenario?
I am able to capture audio and desktop video individually, but not together. Also, since I am capturing the desktop rather than webcam video, does that make any difference?
Chrome does not allow you to request an audio stream alongside a chromeMediaSource.
See "Why Screen Sharing Fails" for more info.
You may be able to work around this by making two separate getUserMedia requests - one for the audio stream and one for the desktop video - and then combining the results, as sketched below.
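For illustration, a rough sketch of the two-request approach, reusing the id, gotStream and getUserMediaError names from the question (merging the microphone track into the desktop stream is an assumption about what the application needs):
// first request: desktop video only
navigator.webkitGetUserMedia({
    audio: false,
    video: {
        mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: id,
            maxWidth: screen.width,
            maxHeight: screen.height
        }
    }
}, function (desktopStream) {
    // second request: microphone audio only
    navigator.webkitGetUserMedia({ audio: true, video: false }, function (audioStream) {
        // copy the microphone track into the desktop stream so gotStream receives a single stream
        audioStream.getAudioTracks().forEach(function (track) {
            desktopStream.addTrack(track);
        });
        gotStream(desktopStream);
    }, getUserMediaError);
}, getUserMediaError);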
I am using Azure Media Service to live stream an event. My source is a HD SDI feed being captured via an AJA Kona LHi card and sent to Azure using Wirecast.
I'm using the default settings in Wirecast for Azure Media Service. I am using standard encoding and the 720p preset. I am using the Azure Media Player embedded in a private site.
Everything seems to work; however, iOS devices are unable to AirPlay the stream. The video plays correctly on the device, but no AirPlay controls are available.
If I use the exact same setup to stream my webcam the AirPlay controls are available. Is there some specific configuration required to make this work reliably?
On iPad, Azure Media Player streams video in DASH by default. You need to specify the techOrder to give HLS a higher priority, e.g.:
var myOptions = {
    techOrder: ["html5", "azureHtml5JS", "flashSS"],
    autoplay: false,
    controls: true,
    width: "640",
    height: "400"
};
var myPlayer = amp("azuremediaplayer", myOptions);
If you don't see the AirPlay button on the video player, you can swipe down on the upper-right region of the screen to bring up the "Quick controls" and access AirPlay from there.
If this still doesn't work, reply with the url of your webpage and I can take a look.
"fluent-ffmpeg": "^2.1.2",
"ffmpeg": "^0.0.4",
node : 8
Code to reproduce
// fluent-ffmpeg wraps the ffmpeg CLI; tempFilePath, watermarkFilePath and targetTempFilePath are defined elsewhere
const ffmpeg = require('fluent-ffmpeg');

let command = ffmpeg()
    .input(tempFilePath)
    .input(watermarkFilePath)
    .complexFilter([
        // overlay the watermark 20px in from the bottom-right corner of the main video
        "[0:v][1:v]overlay=W-w-20:H-h-20"
    ])
    .videoBitrate(2500)
    .videoCodec('libx264')
    .audioCodec('aac')
    .format('mp4')
    .output(targetTempFilePath);

command.run();
When applying this ffmpeg encoding command to the attached video, the output plays fine on a local device; the issue is that after uploading to Facebook/WhatsApp the audio and video become out of sync.
Any ideas on what I need to change in the video/audio settings so that the audio and video stay in sync, even when uploaded to the various social networks?
Here's a link to the 3 video files (original, post ffmpeg, post whatsapp upload that includes delay) if you want to get a better idea!
https://wetransfer.com/downloads/445dfaf0f323a73c56201b818dc0267b20191213052112/24e635
Thank you and appreciate any help!!
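For reference only, fluent-ffmpeg also exposes explicit audio output options; a minimal sketch of adding them to the command built above (the specific values are assumptions, not a verified fix for the sync issue):
// hypothetical additions to the command chain above
command
    .audioBitrate('128k')   // pin the audio bitrate rather than relying on defaults
    .audioFrequency(44100)  // force a constant audio sample rate
    .audioChannels(2);      // force stereo output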
I am working on my own specialized VoIP client for Windows 10 Mobile and desktop. The basics work OK.
However, I cannot get audio output to the speakers on my old Lumia.
// pick the loudspeaker render endpoint by its display name
foreach (var item in (await DeviceInformation.FindAllAsync(DeviceClass.AudioRender)).Where(i => i.Name.Contains("Speakers")))
    RendererId = item.Id;
There is "Speakers (WaveRT)" in device list so RendererId is valid.
Later application tries to open audio device (WSAPI) with found RendererId. But anyway phone plays to receiver only.
I modified Voip sample app in attempt to reproduce issue - yes, it happens with Voip sample app also.
My collegue confirms he has same problem on his phone.
Is it possible to play audio via speaker for voip app ?
Thank you!
On phone devices only, you can use the AudioRoutingManager to change the audio output.
// to get the audio manager
IsAudioRoutingSupported = ApiInformation.IsApiContractPresent(typeof(PhoneContract).FullName, 1);
if (IsAudioRoutingSupported)
{
    // audio routing is supported, so register for the output change events
    m_audioRoutingManager = AudioRoutingManager.GetDefault();
    m_audioRoutingManager.AudioEndpointChanged += OnAudioEndpointChanged;
}

// to change the output
m_audioRoutingManager.SetAudioEndpoint(AudioRoutingEndpoint.Speakerphone);
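For completeness, a minimal sketch of the OnAudioEndpointChanged handler registered above (the body is an assumption; GetAudioEndpoint() returns the endpoint that is currently active):
private void OnAudioEndpointChanged(AudioRoutingManager sender, object args)
{
    // query which endpoint is active now (Earpiece, Speakerphone, Bluetooth, ...)
    AudioRoutingEndpoint current = sender.GetAudioEndpoint();
    // react to the change here, e.g. update the call UI on the dispatcher
}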
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function
playSound_: function(mp3_file) {
    var snd = new Audio("audio/" + mp3_file);
    snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser, it works fine.
What's the correct way to play audio on the receiver?
You can use either a media element (such as an <audio> tag) or the Web Audio API. The simplest is probably a media element, as sketched below.
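A minimal sketch of the media-element approach, assuming the receiver page declares an <audio id="player"> element and the mp3 files are served from the same audio/ path as above (the element id is an assumption):
playSound_: function(mp3_file) {
    // reuse a single <audio> element declared in the receiver page
    var player = document.getElementById('player');
    player.src = 'audio/' + mp3_file;
    // play() returns a promise in Chrome; surface failures instead of ignoring them
    player.play().catch(function(err) {
        console.error('Audio playback failed:', err);
    });
},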
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get it to play videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else I am missing in implementing MPMoviePlayerController for video streaming? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or in both.
I'm pretty sure you need a streaming server to use a video file from an HTTP URL. In any case, it's a requirement for applications on the App Store to do so:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).
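For illustration, if the media server exposes an HLS playlist, only the URL in the code above changes (the playlist name below is hypothetical):
// hypothetical HLS playlist published by the streaming server
string url = @"http://testhost.com/playlist.m3u8";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;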