I am struggling to stream hosted audio files like the one below:
http://res.cloudinary.com/karmo/raw/upload/v1415468388/kdzu36kr8t7aowkeqrn7.mp4
I have tried using AVPlayer with initWithURL:, and while I do not get any errors, I never hear any audio playback.
NSURL *url = [NSURL URLWithString:@"http://res.cloudinary.com/karmo/raw/upload/v1415468388/kdzu36kr8t7aowkeqrn7.mp4"];
player = [[AVPlayer alloc] initWithURL:url];
songPlayer = player;
With previous Stack Overflow examples I have found, I hit the same issue: no errors, but I never hear any audio playback:
Streaming mp3 audio with AVPlayer
Audio streaming in ios using AVPlayer
What am I missing?
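For reference, here is a minimal playback sketch. It assumes two things the snippet above doesn't show: that the player is held in a strong property (a locally scoped AVPlayer is deallocated before it makes a sound) and that play is actually called; the songPlayer property name is carried over from the question.

#import <AVFoundation/AVFoundation.h>

// Configure the session so playback is audible even when the ring/silent switch is muted.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];

NSURL *url = [NSURL URLWithString:@"http://res.cloudinary.com/karmo/raw/upload/v1415468388/kdzu36kr8t7aowkeqrn7.mp4"];
// songPlayer is assumed to be a strong property so the player outlives this scope.
self.songPlayer = [[AVPlayer alloc] initWithURL:url];
[self.songPlayer play];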
"fluent-ffmpeg": "^2.1.2",
"ffmpeg": "^0.0.4",
node: 8
Code to reproduce
const ffmpeg = require('fluent-ffmpeg');

let command = ffmpeg()
    .input(tempFilePath)
    .input(watermarkFilePath)
    .complexFilter([
        // overlay the watermark 20px in from the bottom-right corner
        "[0:v][1:v]overlay=W-w-20:H-h-20"
    ])
    .videoBitrate(2500)
    .videoCodec('libx264')
    .audioCodec('aac')
    .format('mp4')
    .output(targetTempFilePath);
command.run();
When applying the ffmpeg encoding command to the attached video, it plays fine on a local device. The issue, however, is that after uploading to Facebook/WhatsApp the audio and video become out of sync.
Any ideas on what I need to change in the video/audio settings so that the audio and video stay in sync, even when uploaded to the various social networks?
Here's a link to the 3 video files (original, post-ffmpeg, and post-WhatsApp-upload with the delay) if you want to get a better idea!
https://wetransfer.com/downloads/445dfaf0f323a73c56201b818dc0267b20191213052112/24e635
Thank you and appreciate any help!!
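One variation that is often suggested for this kind of drift, offered here as a hedged sketch rather than a confirmed fix: resample the audio against its timestamps and force a constant frame rate, since some platforms re-mux variable-frame-rate video poorly. The aresample filter and the -vsync/-movflags options are standard ffmpeg features; whether they cure this particular clip is an assumption.

const ffmpeg = require('fluent-ffmpeg');

let command = ffmpeg()
    .input(tempFilePath)
    .input(watermarkFilePath)
    .complexFilter(["[0:v][1:v]overlay=W-w-20:H-h-20"])
    .videoBitrate(2500)
    .videoCodec('libx264')
    .audioCodec('aac')
    // stretch/squeeze audio samples to match their timestamps
    .audioFilters('aresample=async=1')
    // constant frame rate, plus the moov atom at the front of the file
    .outputOptions(['-vsync', 'cfr', '-movflags', '+faststart'])
    .format('mp4')
    .output(targetTempFilePath);
command.run();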
Playing audio is the main feature of my app; this is what I set:
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];
When the audio changes, I set the metadata for nowPlayingInfo:
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:playingInfo];
It works well: audio can play in the background, and the lock screen shows the metadata I set.
But when I play a video in the app, for example a YouTube link in a webView, the app loses its background audio permission even after the video has been closed. The lock screen no longer shows the metadata, and the app can't change audio in the background.
This problem happens only on iOS 12; the same code still works well on iOS 11. Does anyone have a suggestion for this issue?
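A workaround sketch that is sometimes suggested for this, assuming the web view's video player deactivated or re-routed the shared session: re-assert the category, the active state, and the now-playing metadata after the video is dismissed. The trigger point and the playingInfo property are assumptions, not a confirmed iOS 12 fix.

// Call after the in-app video is closed to re-claim background playback.
- (void)reactivateAudioSession {
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:&error];
    [session setActive:YES error:&error];
    if (error != nil) {
        NSLog(@"Failed to reactivate audio session: %@", error);
    }
    // Re-publish the metadata so the lock screen shows it again.
    // (playingInfo is assumed to be the dictionary built earlier.)
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:self.playingInfo];
}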
I am writing an iOS app and I need to stream m4a files over HTTP Live Streaming. I am using AVPlayer and an HLS Simple Media Playlist file.
If I link AVPlayer directly to an mp3 or m4a, it streams with no issues.
URL = [NSURL URLWithString:@"http://fembotics.com/bells.m4a"];
item = [AVPlayerItem playerItemWithURL:URL];
player = [AVPlayer playerWithPlayerItem:item];
[player play];
The problem happens when I set URL to an HLS playlist. Strangely, mp3 works and m4a does not.
http://fembotics.com/basicmp3.m3u8 - Working
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5220
#EXTINF:20.0,
/bells.mp3
#EXT-X-ENDLIST
http://fembotics.com/basicm4a.m3u8 - Broken
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5220
#EXTINF:20.0,
/bells.m4a
#EXT-X-ENDLIST
I tried using absolute URLs in the playlists and it makes no difference.
Has anyone seen this type of issue before? Should I try other sources for my m4a files? (I tried one purchased from iTunes and one transcoded from wav by MediaHuman Audio Converter).
m3u8 on iOS only supports segmented .aac and .ts containers.
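In other words, the playlist would need to point at segmented ADTS .aac (or .ts) files rather than at a whole .m4a. As a sketch, assuming the audio has been split into 10-second segments with the hypothetical names bells0.aac and bells1.aac, the playlist would look like:

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXTINF:10.0,
/bells0.aac
#EXTINF:10.0,
/bells1.aac
#EXT-X-ENDLIST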
I'm trying to get my receiver to play an mp3 file hosted on the server with the following function:
playSound_: function(mp3_file) {
    var snd = new Audio("audio/" + mp3_file);
    snd.play();
},
However, most of the time it doesn't play, and when it does, it's delayed. When I load the receiver in my local browser, it works fine.
What's the correct way to play audio on the receiver?
You can use either a MediaElement tag or Web Audio API. Simplest is probably a MediaElement.
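A minimal sketch of the MediaElement approach, assuming the receiver page can add an audio element to its DOM and that the files live under the receiver's own audio/ path as in the question; the element reuse and the error logging are just illustrative choices:

// Create one reusable <audio> element instead of a new Audio() per call.
var sndEl = document.createElement('audio');
document.body.appendChild(sndEl);

function playSound(mp3File) {
    sndEl.src = 'audio/' + mp3File;
    // play() returns a promise in modern Chrome; surface failures instead of dropping them.
    var p = sndEl.play();
    if (p && p.catch) {
        p.catch(function (err) { console.error('playback failed:', err); });
    }
}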
I am creating a MonoTouch iPhone app that will display streaming videos. I have been able to get MPMoviePlayerController working with a local file (NSUrl.FromFile), but have not been able to get videos streamed from a media server.
Here is the code I am using to play the video:
string url = "http://testhost.com/test.mp4";
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
//enable AirPlay
//mp.AllowsAirPlay = true;
//Add the MPMoviePlayerController View
this.View.AddSubview(mp.View);
//set the view to be full screen and show animated
mp.SetFullscreen(true, true);
//MPMoviePlayer must be set to PrepareToPlay before playback
mp.PrepareToPlay();
//Play Movie
mp.Play();
Is there something else in implementing MPMoviePlayerController for video streaming that I am missing? I also read that videos for iOS should be streamed using Apple's HTTP Live Streaming on the media server; is that a requirement? I have never worked with video streaming on an iOS device before, so I am not sure whether something is lacking in my code, in the media server setup, or both.
I'm pretty sure you don't need a streaming server just to play a video file from an HTTP URL. However, using HTTP Live Streaming is a requirement for applications on the App Store in some cases:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming. See "Requirements for Apps."
The good news is that your existing code should not have to be changed to handle this (it's server, not client side).
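For completeness, a sketch of what the client looks like once the server exposes an HLS playlist; the .m3u8 URL is hypothetical, and everything else matches the question's code:

// Same player setup as above; only the URL changes to the server's HLS playlist.
string url = "http://testhost.com/test.m3u8"; // hypothetical playlist URL
var nsurl = NSUrl.FromString(url);
mp = new MPMoviePlayerController(nsurl);
mp.SourceType = MPMovieSourceType.Streaming;
this.View.AddSubview(mp.View);
mp.SetFullscreen(true, true);
mp.PrepareToPlay();
mp.Play();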