I am writing an iOS app and I need to stream m4a files over HTTP Live Streaming. I am using AVPlayer and an HLS Simple Media Playlist file.
If I link AVPlayer directly to an mp3 or m4a, it streams with no issues.
URL = [NSURL URLWithString:@"http://fembotics.com/bells.m4a"];
item = [AVPlayerItem playerItemWithURL:URL];
player = [AVPlayer playerWithPlayerItem:item];
[player play];
The problem happens when I set the URL to an HLS playlist. Strangely, mp3 works and m4a does not.
http://fembotics.com/basicmp3.m3u8 - Working
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5220
#EXTINF:20.0,
/bells.mp3
#EXT-X-ENDLIST
http://fembotics.com/basicm4a.m3u8 - Broken
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5220
#EXTINF:20.0,
/bells.m4a
#EXT-X-ENDLIST
I tried using absolute URLs in the playlists and it makes no difference.
Has anyone seen this type of issue before? Should I try other sources for my m4a files? (I tried one purchased from iTunes and one transcoded from wav by MediaHuman Audio Converter).
HLS (m3u8) on iOS only supports segmented .aac and .ts containers.
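If serving that m4a over HLS is still the goal, the usual approach is to segment the audio server-side and point AVPlayer at the generated playlist. Below is one possible sketch using ffmpeg through the fluent-ffmpeg Node wrapper; the file names and segment length are placeholders, and it assumes a local ffmpeg build with the HLS muxer.
// Sketch: repackage an m4a as HLS (.ts segments plus an index playlist).
const ffmpeg = require('fluent-ffmpeg');

ffmpeg('bells.m4a')                            // placeholder input file
    .audioCodec('aac')                         // AAC audio inside MPEG-TS segments
    .outputOptions([
        '-hls_time 10',                        // roughly 10-second segments
        '-hls_list_size 0',                    // keep every segment listed (VOD style)
        '-hls_segment_filename bells_%03d.ts'  // segment naming pattern
    ])
    .format('hls')
    .output('bells.m3u8')                      // playlist to hand to AVPlayer
    .on('end', () => console.log('HLS segments and playlist written'))
    .on('error', (err) => console.error(err))
    .run();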
"fluent-ffmpeg": "^2.1.2",
"ffmpeg": "^0.0.4",
node: 8
Code to reproduce
const ffmpeg = require('fluent-ffmpeg');

let command = ffmpeg()
    .input(tempFilePath)                     // source video
    .input(watermarkFilePath)                // watermark image
    .complexFilter([
        "[0:v][1:v]overlay=W-w-20:H-h-20"    // watermark in the bottom-right corner
    ])
    .videoBitrate(2500)
    .videoCodec('libx264')
    .audioCodec('aac')
    .format('mp4')
    .output(targetTempFilePath);

command.run();
When applying the ffmpeg encoding command above to the attached video, it plays fine on a local device. The issue, however, is that after uploading to Facebook/WhatsApp the audio and video become out of sync.
Any ideas on what I need to change in the video/audio settings so that the audio and video stay in sync, even after being uploaded to the various social networks?
Here's a link to the 3 video files (original, post-ffmpeg, and post-WhatsApp upload that includes the delay) if you want to get a better idea!
https://wetransfer.com/downloads/445dfaf0f323a73c56201b818dc0267b20191213052112/24e635
Thank you and appreciate any help!!
I am using the navigator.webkitGetUserMedia API to capture the desktop, and the microphone to capture audio. When I make the following call:
navigator.webkitGetUserMedia({
    audio: true,
    video: {
        mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: id,
            maxWidth: screen.width,
            maxHeight: screen.height
        }
    }
}, gotStream, getUserMediaError);
I am getting a screen capture error. Does this API not support the above scenario?
I am able to capture audio and desktop video individually, but not together. Also, since I am capturing the desktop and not the webcam video, does that make any difference?
Chrome does not allow you to request an audio stream alongside a chromeMediaSource.
See Why Screen Sharing Fails here for more info.
You may be able to circumvent this by sending individual getUserMedia requests - one for the audio stream and one for the desktop video.
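As a rough illustration of that workaround (not a confirmed fix), the sketch below requests the desktop video and the microphone audio separately and then merges the audio track into the desktop stream. It reuses the id variable and the gotStream/getUserMediaError callbacks from the question.
// Sketch: two separate capture requests, merged into one MediaStream.
navigator.webkitGetUserMedia({
    audio: false,
    video: {
        mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: id,
            maxWidth: screen.width,
            maxHeight: screen.height
        }
    }
}, function (desktopStream) {
    // Second request: microphone only.
    navigator.webkitGetUserMedia({ audio: true, video: false }, function (audioStream) {
        // Copy the microphone track(s) into the desktop stream.
        audioStream.getAudioTracks().forEach(function (track) {
            desktopStream.addTrack(track);
        });
        gotStream(desktopStream);
    }, getUserMediaError);
}, getUserMediaError);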
I am trying to play an HLS stream (.m3u8) containing .ts segments.
It was playing fine before when I tried some open sources. Now that I am trying to play the streams provided by my service provider, it plays in all major browsers except Chrome.
P.S.: I am using video.js to accomplish this. I also tested with Viblast, but no luck there.
For reference, I am posting my code:
<script>
    // myPlayer is my video.js player for the <video> element
    myPlayer.src({
        type: "application/x-mpegURL; application/vnd.apple.mpegurl",
        src: encodeURI(some m3u8 link)
    });
    myPlayer.play();
</script>
This code works in every other browser, but in Chrome it fails.
The error response from Chrome is below:
VIDEOJS: ERROR: (CODE:3 MEDIA_ERR_DECODE) The media playback was
aborted due to a corruption problem or because the media used features
your browser did not support.
MediaError {code: 3, message: "The media playback was aborted due to a
corruption…media used features your browser did not support."}
Note: I am getting my streams from ScaleEngine.
I came across this error when using an MP4 with a WebM fallback. Everything worked well in Firefox, but I was getting this error in Chrome. I switched the order of the fallbacks so that video.js used WebM first and MP4 as the fallback. This fixed it, for me at least.
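For illustration, here is a minimal sketch of what that reordering might look like with video.js, passing the sources as an array so the WebM entry is tried first; the URLs are placeholders, not taken from the question.
// Sketch: list the WebM source before the MP4 fallback.
myPlayer.src([
    { type: 'video/webm', src: 'https://example.com/video.webm' },
    { type: 'video/mp4',  src: 'https://example.com/video.mp4'  }
]);
myPlayer.play();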
I am struggling to stream hosted audio files like the one below:
http://res.cloudinary.com/karmo/raw/upload/v1415468388/kdzu36kr8t7aowkeqrn7.mp4
I have tried using AVPlayer with initWithURL:, and while I do not get any errors, I never hear any audio playback.
NSURL *url = [NSURL URLWithString:@"http://res.cloudinary.com/karmo/raw/upload/v1415468388/kdzu36kr8t7aowkeqrn7.mp4"];
player = [[AVPlayer alloc] initWithURL:url];
songPlayer = player;
In the previous Stack Overflow examples I have found, I have the same issue: no errors, but I never hear any audio playback.
Streaming mp3 audio with AVPlayer
Audio streaming in ios using AVPlayer
What am I missing...
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get the MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos streamed from a media server.
Here is the code I am using to play the video:
string url = @"http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else in implementing the MPMoviePlayerController for video streaming that I am missing? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or in a combination of the two.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, using HTTP Live Streaming is a requirement for applications (on the App Store) in some cases:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).