How to make Safari play a Web Audio API buffer source node?

I have code that loads an audio file with fetch, decodes it to an AudioBuffer, and then creates an AudioBufferSourceNode in the Web Audio API to receive that buffer and play it when I press a button on my web page.
In Chrome my code works fine. But in Safari... no sound.
Reading Safari-related Web Audio API questions, some people say the Web Audio API needs to receive input from the user in order to play sound.
In my case I have a button that is tapped in order to play the sound, so there is already a user input. But it is not working.
I found an answer saying that decodeAudioData does not work with promises in Safari and that you must use the older callback syntax. I have tried handling decodeAudioData the way that answer suggests, but there is still no sound...
Can somebody please help me here? Thanks for any help!
<button ontouchstart="bPlay1()">Button to play sound</button>

window.AudioContext = window.AudioContext || window.webkitAudioContext;
const ctx = new AudioContext();
let au1;

window.fetch("./sons/BumboSub.ogg")
  .then(response => response.arrayBuffer())
  .then(arrayBuffer => ctx.decodeAudioData(arrayBuffer,
    audioBuffer => {
      au1 = audioBuffer;
      return au1;
    },
    error => console.error(error)
  ));

function bPlay1() {
  ctx.resume();
  bot = "Botão 1"; // `bot` is a global defined elsewhere on the page
  var playSound1b = ctx.createBufferSource();
  var vb1 = document.getElementById('sld1').value;
  playSound1b.buffer = au1;
  var gain1b = ctx.createGain();
  playSound1b.connect(gain1b);
  gain1b.connect(ctx.destination);
  gain1b.connect(dest); // `dest` is defined elsewhere (not shown in this snippet)
  gain1b.gain.value = vb1;
  console.log(au1);         // shows in console!
  console.log(playSound1b); // shows in console!
  playSound1b.start(ctx.currentTime);
}
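For reference, here is a minimal sketch of the pattern those Safari answers describe: decode with the callback form of decodeAudioData (older Safari does not return a promise from it) and call resume() from inside the gesture handler before starting the source. One more thing worth checking: Safari cannot decode Ogg Vorbis, so a .ogg file will fail regardless; the sketch below assumes a hypothetical .m4a version of the file.

window.AudioContext = window.AudioContext || window.webkitAudioContext;
const sketchCtx = new AudioContext();
let sketchBuf;

fetch("./sons/BumboSub.m4a") // hypothetical AAC version; Safari cannot decode .ogg
  .then(response => response.arrayBuffer())
  .then(arrayBuffer => {
    // The callback form works in both Safari and Chrome; older Safari has no promise form.
    sketchCtx.decodeAudioData(arrayBuffer,
      audioBuffer => { sketchBuf = audioBuffer; },
      error => console.error(error));
  });

function sketchPlay() {
  // resume() must run inside a user gesture on Safari/iOS.
  sketchCtx.resume().then(() => {
    const src = sketchCtx.createBufferSource();
    src.buffer = sketchBuf;
    src.connect(sketchCtx.destination);
    src.start(0);
  });
}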

Related

Live Audio HLS stream fails to play

We are trying to play an HLS live stream that is audio-only.
It looks OK spec-wise, and we're able to play it in every browser and native player we have, but it fails to play on Chromecast.
URL: http://rcavliveaudio.akamaized.net/hls/live/2006635/P-2QMTL0_MTL/playlist.m3u8
Content-Type: vnd.apple.mpegURL
Steps to reproduce
Force this content URL and content type into the Chromecast player.
Expected
To hear audio playing, as on any other player we try.
Actual result
There is no playback. The master playlist is fetched, the chunk playlist is fetched, and the first chunks are fetched, but there is no playback. It stops after a few chunks.
The player is stuck in the "processing segment" phase, and then it stops.
Please change the content type to audio/mp4 and set AAC as the segment format: mediaInfo.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.AAC;
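For completeness, on the sender side the equivalent configuration would look roughly like this sketch (Web Sender API; streamUrl and castSession are assumed to exist, and the hlsSegmentFormat property name should be verified against the current Cast SDK docs):

// Sender-side sketch: declare the HLS segment format up front.
const mediaInfo = new chrome.cast.media.MediaInfo(streamUrl, 'application/x-mpegurl');
mediaInfo.hlsSegmentFormat = chrome.cast.media.HlsSegmentFormat.AAC;
mediaInfo.streamType = chrome.cast.media.StreamType.LIVE;
const request = new chrome.cast.media.LoadRequest(mediaInfo);
castSession.loadMedia(request); // castSession obtained from the Cast framework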
Using Anjaneesh's comment, here's how I ended up solving this. On the receiver's JavaScript:
const instance = cast.framework.CastReceiverContext.getInstance();
const options = new cast.framework.CastReceiverOptions();
options.disableIdleTimeout = true;
options.supportedCommands = cast.framework.messages.Command.ALL_BASIC_MEDIA;
instance.start(options);
const playerManager = instance.getPlayerManager();
playerManager.setMessageInterceptor(cast.framework.messages.MessageType.LOAD, (req) => {
  req.media.hlsSegmentFormat = cast.framework.messages.HlsSegmentFormat.TS_AAC;
  req.media.streamType = cast.framework.messages.StreamType.LIVE;
  return req;
});
The key is setting the message interceptor callback for the LOAD event/message. There, you can override hlsSegmentFormat from the client. In my case, I needed to indicate that my segments were in TS format.
I'm not entirely sure why this is necessary. It isn't necessary when there is a video track... only when video is missing.

Google Actions MediaResponse Callback Not Working on iPhone Google Assistant App, Works in Simulator and on Google Home Mini

I'm having an issue with my Google Assistant Action when using it in the Google Assistant mobile app.
I am trying to play a tracklist of 1-3 minute mp3s using Media Responses and callbacks. It works perfectly in the simulator and on my Google Home Mini, but not in the Google Assistant app on my phone.
What I've noticed is that the MediaResponse callback isn't sent when I test on iPhone. The first MediaResponse will play, but then the app is silent. It doesn't exit my action, though; it leaves the mic open, and when I try to talk to it again, whatever I say is sent to my action. This part is very similar to Starfish Mint's problem, though mine seems to work on my Google Home device. They said they fixed it by:
"After waiting 6 months, We manage to solve it ourselves. On MEDIA_FINISHED, we must return Audio text within your media response to get subsequent MEDIA_FINISHED event. We tested this with playlist of 100 media files and it plays like a charm."
though I'm not entirely sure what that means.
This might be an obvious answer to my question, but where it says "Media responses are supported on Android phones and on Google Home", does this mean that they aren't supported on iPhone and that's the issue? Are there any workarounds for this, like using a Podcast action or something?
I have tried another audio-playing app, the Music Player Sample app, which is one of Google's sample Dialogflow apps, and it also doesn't work on my phone, though it does everywhere else. Maybe it is just an iPhone thing?
The thing that I find confusing, though, is that when I look at the capabilities of the action on my phone with conv.surface.capabilities.has("actions.capability.MEDIA_RESPONSE_AUDIO"), it includes actions.capability.MEDIA_RESPONSE_AUDIO in its capabilities. If it didn't have this, I would be more inclined to believe media responses don't cover iPhones, but it seems weird that it would be listed in the capabilities and then not work.
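For reference, a guard based on that capability check would look roughly like this sketch (actions-on-google v2 Node.js library; the intent name and fallback wording are made up):

// Sketch: only send a media response if the surface supports it.
app.intent('SomeIntent', (conv) => {
  if (!conv.surface.capabilities.has('actions.capability.MEDIA_RESPONSE_AUDIO')) {
    conv.close('Sorry, this device cannot play audio tracks.'); // hypothetical fallback
    return;
  }
  // ...otherwise build and send the MediaObject as in the code below...
});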
Here's the code where I am playing the first track:
app.intent('TreatmentResponse', (conv, context, params) => {
  var treatmentTracks = [{url: 'url', name: 'name'}, {url: 'url', name: 'name'}];
  var result = playNext(treatmentTracks[0].url, treatmentTracks[0].name);
  var response = result[0];
  conv.data.currentTreatment = 'treatment';
  conv.data.currentTreatmentName = 'treatmentName';
  conv.data.treatmentPos = 1;
  conv.data.treatmentTracks = treatmentTracks;
  conv.ask("Excellent, I'll play some tracks in that category.");
  conv.ask(response);
  conv.ask(new Suggestions(['skip']));
});
and here is my callback function:
app.intent('Media Status', (conv) => {
  const mediaStatus = conv.arguments.get('MEDIA_STATUS');
  var { treatmentPos, treatmentTracks, currentTreatment, currentTreatmentName } = conv.data;
  if (mediaStatus && mediaStatus.status === 'FINISHED' && treatmentPos < treatmentTracks.length) {
    playNextTrack(conv, treatmentPos, treatmentTracks);
  } else {
    endConversation(conv, currentTreatment);
  }
});
Here's playNextTrack()
function playNextTrack(conv, pos, medias) {
  conv.data.treatmentPos = pos + 1;
  var result = playNext(medias[pos].url, medias[pos].name);
  var response = result[0];
  var ssml = result[1];
  conv.ask(ssml);
  conv.ask(response);
  conv.ask(new Suggestions(['skip']));
}
and playNext()
function playNext(url, name) {
  const response = new MediaObject({
    name: name,
    url: url,
  });
  var ssml = new SimpleResponse({
    text: 'Up next:',
    speech: '<speak><break time="1s" /></speak>' // SSML break times need a unit, e.g. "1s"
  });
  return [response, ssml];
}
The other issue: when the MediaResponse is playing on my iPhone and I interrupt it to say "Next" or "Skip", rather than triggering my "NextOrSkip" intent like it does in the simulator and on the Google Home Mini, it just says "sure" or "alright" (I don't have that anywhere in my code) and then goes silent (while still listening).

wavesurfer.js doesn't play before loading is completed on safari

I've set up a wavesurfer audio model, which works perfectly fine in Chrome and Firefox; it starts right away. When I hit play in Safari, it waits for the whole file to be downloaded completely and only then plays... I've experienced a similar problem on other pages that I open in Safari as well. Any ideas why this could be the case and what to do against it?
audioModel = WaveSurfer.create({
  container: createdContainer,
  waveColor: waveColor,
  progressColor: waveColorProg,
  height: height,
  backend: 'MediaElement',
  cursorWidth: 0,
});
This might be the bug https://github.com/katspaugh/wavesurfer.js/issues/1215
Try the workaround proposed by DrLongGhost, which uses the MediaElement backend in Safari (only); other browsers work better with the WebAudio backend. https://github.com/katspaugh/wavesurfer.js/issues/1215#issuecomment-415083308
// Only use MediaElement backend for Safari
const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent || '') ||
/iPad|iPhone|iPod/i.test(navigator.userAgent || '');
const wavesurferArgs = {
  container: document.getElementById('wavesurferContainerInternal'),
  plugins
};
if (isSafari) {
  wavesurferArgs.backend = 'MediaElement';
}
_wavesurfer = window.WaveSurfer.create(wavesurferArgs);
Update: ah, you are already using MediaElement. Well, then I'm not sure what the problem is.
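One other thing that may be worth trying with the MediaElement backend (a sketch; it assumes you can generate a peaks array ahead of time, e.g. with a server-side tool) is passing pre-computed peaks to load(), so wavesurfer can draw the waveform without fetching and decoding the whole file first:

// Sketch: feed pre-computed peaks to the MediaElement backend.
fetch('./peaks/track.json') // hypothetical pre-computed peaks file
  .then(res => res.json())
  .then(peaksData => {
    audioModel.load('./audio/track.mp3', peaksData, 'metadata');
  });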

Using getUserMedia to get audio from the microphone as it is being recorded?

I'm trying to write a Meteor.js app that uses peer-to-peer "radio" communication. When a user presses a button, it broadcasts their microphone output to other people.
I have some code that gets permission to record audio, and it successfully gets a MediaStream object, but I can't figure out how to get the data from the MediaStream object as it is being recorded.
I see there is a method defined somewhere for getting all of the tracks of the recorded audio. I'm sure I could find a way to write some kind of loop that notifies me when audio has been added, but it seems like there should be a native, event-driven way to retrieve the audio from getUserMedia. Am I missing something? Thanks.
What you will want to do is access the stream through the Web Audio API (for the recording part). This is after assigning a variable to the stream that was grabbed through getUserMedia (I call it localStream). You can create as many MediaStreamSource nodes as you want from one stream, so you can record it WHILE sending it to numerous people through different RTCPeerConnections.
var audioContext = new (window.AudioContext || window.webkitAudioContext)();
var source = audioContext.createMediaStreamSource(localStream);

var AudioRecorder = function (source) {
  var recording = false;
  var worker = new Worker(WORKER_PATH); // WORKER_PATH is defined elsewhere
  var config = {};
  var bufferLen = 4096;
  this.context = source.context;
  this.node = (this.context.createScriptProcessor ||
               this.context.createJavaScriptNode).call(this.context, bufferLen, 2, 2);
  this.node.onaudioprocess = function (e) {
    var sample = e.inputBuffer.getChannelData(0);
    // do what you want with the audio sample: push it to a blob or send it over a WebSocket
  };
  source.connect(this.node);
  this.node.connect(this.context.destination);
};
Here is a version I wrote/modified to send audio over WebSockets for recording on a server.
For sending the audio only when it is available, you could use WebSockets or a WebRTC peer connection.
You grab the stream through the getUserMedia success callback (you should have a global variable holding the stream for all your connections). When it becomes available, you use a signalling server to forward the requesting SDPs to the audio supplier, and you can set the requesting SDPs to receive-only.
PeerConnection example 1
PeerConnection example 2
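As a rough sketch of that flow (sendToSignalling is a placeholder for your own signalling code; the modern addTrack/createOffer API is assumed):

// Sketch: forward the microphone stream to a peer over WebRTC.
const pc = new RTCPeerConnection();
localStream.getAudioTracks().forEach(track => pc.addTrack(track, localStream));
pc.onicecandidate = (e) => {
  if (e.candidate) sendToSignalling({ candidate: e.candidate }); // your signalling channel
};
pc.createOffer()
  .then(offer => pc.setLocalDescription(offer))
  .then(() => sendToSignalling({ sdp: pc.localDescription }));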
Try with code like this:
navigator.webkitGetUserMedia({audio: true, video: false},
  function (stream) { // success callback
    var audioElement = document.createElement("audio");
    document.body.appendChild(audioElement);
    audioElement.src = URL.createObjectURL(stream);
    audioElement.play();
  },
  function () { // error callback
    console.log("error");
  });
You may use the stream from the success callback to create an object URL and pass it into an HTML5 audio element.
Fiddle around in http://jsfiddle.net/veritas/2B9Pq/

iOS 7: Audio only plays in Safari, not Web App

I'm trying to build an iOS web app that uses audio. While it has been a very fickle endeavor, I finally managed to get it to work in mobile Safari (interestingly enough, it worked in mobile Chrome long before that; I don't know why…). Yet when I save it as a web app on the home screen, the audio mysteriously stops working…
Here is the audio code. window.helpers.gong is a base64-encoded mp3 file.
I checked the console output of the web app via desktop Safari, yet there are no errors thrown.
Any ideas what might be going wrong?
window.helpers.audio = {
  myAudioContext: null,
  mySource: null,
  myBuffer: null,
  init: function () {
    if ('AudioContext' in window) {
      this.myAudioContext = new AudioContext();
    } else if ('webkitAudioContext' in window) {
      this.myAudioContext = new webkitAudioContext();
    } else {
      alert('Your browser does not yet support the Web Audio API');
    }
    var self = this;
    var load = (function (url) {
      var arrayBuff = window.helpers.Base64Binary.decodeArrayBuffer(window.helpers.gong);
      self.myAudioContext.decodeAudioData(arrayBuff, function (audioData) {
        self.myBuffer = audioData;
      });
    }());
  },
  play: function () {
    this.mySource = this.myAudioContext.createBufferSource();
    this.mySource.buffer = this.myBuffer;
    this.mySource.connect(this.myAudioContext.destination);
    if ('AudioContext' in window) {
      this.mySource.start(0);
    } else if ('webkitAudioContext' in window) {
      this.mySource.noteOn(0);
    }
  }
};
The code is called like this on load:
window.helpers.audio.init();
And later it is triggered through a user action:
...
$('#canvas').click(function () {
  if (this.playing == false) {
    window.helpers.audio.play();
  }
}.bind(this));
...
Ouch, the answer was blindingly simple:
I had the mute switch on the side of the iPhone set to mute the whole time.
So it turns out that Safari plays audio even when the switch is set to mute, yet when you save the page as a web app, it doesn't anymore.
If I understand correctly, the audio works in desktop Safari but not in mobile Safari?
This could be the result of a limitation placed on mobile Safari that requires any sound that is played to be triggered by a user action (for example, a click).
Read more here:
http://buildingwebapps.blogspot.com/2012/04/state-of-html5-audio-in-mobile-safari.html
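For reference, the usual workaround for that limitation is the silent-buffer "unlock" trick: start an empty source from inside the first touch handler, after which programmatic playback is allowed. A sketch (not code from this page):

// Sketch: "unlock" the audio context on the first user touch (iOS).
var unlocked = false;
document.addEventListener('touchend', function unlock() {
  if (unlocked) return;
  var ctx = window.helpers.audio.myAudioContext;
  var source = ctx.createBufferSource();
  source.buffer = ctx.createBuffer(1, 1, 22050); // one silent sample
  source.connect(ctx.destination);
  if (source.start) { source.start(0); } else { source.noteOn(0); }
  unlocked = true;
  document.removeEventListener('touchend', unlock);
}, false);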
