I've set up a wavesurfer audio model, which works perfectly fine in Chrome and Firefox: playback starts right away. But when I hit play in Safari, it waits for the whole file to finish downloading and only then plays... I've experienced a similar problem on other pages I open in Safari as well. Any ideas why this could be the case and what to do about it?
audioModel = WaveSurfer.create({
    container: createdContainer,
    waveColor: waveColor,
    progressColor: waveColorProg,
    height: height,
    backend: 'MediaElement',
    cursorWidth: 0,
});
This might be the bug: https://github.com/katspaugh/wavesurfer.js/issues/1215
Try the workaround proposed by DrLongGhost, which uses the MediaElement backend in Safari only; other browsers work better with the WebAudio backend. https://github.com/katspaugh/wavesurfer.js/issues/1215#issuecomment-415083308
// Only use the MediaElement backend for Safari
const isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent || '') ||
    /iPad|iPhone|iPod/i.test(navigator.userAgent || '');
const wavesurferArgs = {
    container: document.getElementById('wavesurferContainerInternal'),
    plugins
};
if (isSafari) {
    wavesurferArgs.backend = 'MediaElement';
}
_wavesurfer = window.WaveSurfer.create(wavesurferArgs);
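The key part of that workaround is the user-agent check. Pulled out as a standalone function (the helper name is mine, not from the linked issue), it can be exercised against sample UA strings:

```javascript
// Matches Safari on macOS (excluding Chrome, whose UA string also
// contains "Safari") and any iOS device, where every browser uses
// the WebKit engine and shows the same buffering behaviour.
function isSafariUA(ua) {
    ua = ua || '';
    return /^((?!chrome|android).)*safari/i.test(ua) ||
           /iPad|iPhone|iPod/i.test(ua);
}
```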
Update: ah, you're already using MediaElement. Well, then I'm not sure what the problem is.
Related
I have code that loads an audio file with fetch, decodes it into an AudioBuffer, and then creates a BufferSourceNode in the Web Audio API to receive that audio buffer and play it when I press a button on my web page.
In Chrome my code works fine. But in Safari... no sound.
Reading Web Audio API questions related to Safari, some people say the Web Audio API needs to receive input from the user in order to play sound.
In my case I have a button that must be tapped to play the sound, so there is user input already. But it is not working.
I found an answer saying that decodeAudioData does not work with promises in Safari and must use the older callback syntax. I have tried handling decodeAudioData the way that answer suggests, but still no sound...
Can somebody please help me here? Thanks for any help!
<button ontouchstart="bPlay1()">Button to play sound</button>
window.AudioContext = window.AudioContext || window.webkitAudioContext;
const ctx = new AudioContext();
let au1;

window.fetch("./sons/BumboSub.ogg")
    .then(response => response.arrayBuffer())
    .then(arrayBuffer => ctx.decodeAudioData(arrayBuffer,
        audioBuffer => {
            au1 = audioBuffer;
            return au1;
        },
        error => console.error(error)
    ));

function bPlay1() {
    ctx.resume();
    bot = "Botão 1";
    var playSound1b = ctx.createBufferSource();
    var vb1 = document.getElementById('sld1').value;
    playSound1b.buffer = au1;
    var gain1b = ctx.createGain();
    playSound1b.connect(gain1b);
    gain1b.connect(ctx.destination);
    gain1b.connect(dest);
    gain1b.gain.value = vb1;
    console.log(au1); // shows in console!
    console.log(playSound1b); // shows in console!
    playSound1b.start(ctx.currentTime);
}
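One Safari pitfall the question mentions is that older Safari only implements the callback form of decodeAudioData, not the promise form. A wrapper like the following (my sketch, not code from the original page) normalizes both into a promise, so the same fetch chain works in either browser:

```javascript
// Wrap callback-style decodeAudioData in a promise so the same
// .then() chain works in Chrome (promise form) and in older Safari,
// which only supports the success/error callback form.
function decodeAudioDataCompat(ctx, arrayBuffer) {
    return new Promise(function (resolve, reject) {
        ctx.decodeAudioData(arrayBuffer, resolve, reject);
    });
}
```

The fetch chain then becomes `.then(arrayBuffer => decodeAudioDataCompat(ctx, arrayBuffer)).then(audioBuffer => { au1 = audioBuffer; })`. Note that `ctx.resume()` should still be called inside the tap handler, since Safari only unlocks the audio context from a user gesture.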
Recently we decided to play some video in the browser at my company. We want to support Safari, Firefox, and Chrome. To stream video, Safari requires that we implement HTTP range requests in ServiceStack. Our server supports range requests, as indicated by the 'Accept-Ranges: bytes' header being returned in the response.
Looking at previous questions, it seems we want to add a pre-request filter, but I don't understand the details of doing so. Adding this to the Configure method in our AppHost.cs does do something:
PreRequestFilters.Add((req, res) => {
    if (req.GetHeader(HttpHeaders.Range) != null) {
        var rangeHeader = req.GetHeader(HttpHeaders.Range);
        rangeHeader.ExtractHttpRanges(req.ContentLength, out var rangeStart, out var rangeEnd);
        if (rangeEnd > req.ContentLength - 1) {
            rangeEnd = req.ContentLength - 1;
        }
        res.AddHttpRangeResponseHeaders(rangeStart, rangeEnd, req.ContentLength);
    }
});
Setting a breakpoint, I can see that this code is hit. However, rangeEnd always equals -1 and ContentLength always equals 0. A rangeEnd of -1 is invalid per the spec, so something is wrong. Most importantly, adding this code breaks video playback in Chrome as well, though it does not break the loading of pictures. I'm not sure I'm on the right track.
If you would like the details of the request/response headers from the network tab, let me know.
I'm having an issue with my Google Assistant Action and using it in the Google Assistant Mobile app.
I am trying to play a tracklist of 1-3 minute mp3s using Media Responses and callbacks and it is working perfectly in the simulator and on my Google Home Mini, but not on the Google Assistant app on my phone.
What I've noticed is that the MediaResponse callback isn't sent when I test on iPhone. The first MediaResponse plays, but then the app is silent. It doesn't exit my action, though; it leaves the mic open, and when I try to talk to it again, whatever I say is sent to my action. This part is very similar to Starfish Mint's problem, though mine works on my Google Home device. They said they fixed it by:
"After waiting 6 months, We manage to solve it ourselves. On
MEDIA_FINISHED, we must return Audio text within your media response
to get subsequent MEDIA_FINISHED event. We tested this with playlist
of 100 media files and it plays like a charm."
though I'm not entirely sure what that means.
This might be an obvious answer to my question, but where it says "Media responses are supported on Android phones and on Google Home", does this mean they aren't supported on iPhone and that's the issue? Are there any workarounds for this, like using a Podcast action or something?
I have tried another audio-playing app, the Music Player Sample (one of Google's sample Dialogflow apps), and it also doesn't work on my phone, though it does everywhere else. Maybe it is just an iPhone thing?
The thing I find confusing, though, is that when I look at the capabilities of the action on my phone via conv.surface.capabilities.has("actions.capability.MEDIA_RESPONSE_AUDIO"), actions.capability.MEDIA_RESPONSE_AUDIO is included in its capabilities. If it weren't, I would be more inclined to believe media responses don't cover iPhones, but it seems weird that the capability would be reported and then not work.
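To make that capability check actionable, one option is to gate the media response behind it and fall back to plain speech. The helper below is my own sketch (the helper name and fallback wording are assumptions), written against the actions-on-google style conv object:

```javascript
// Ask with a media response only when the surface reports media
// support; otherwise fall back to a plain spoken message. Returns
// whether the media path was taken, so the caller can adjust state.
function askMediaOrFallback(conv, mediaResponse, fallbackText) {
    var canPlay = conv.surface.capabilities.has(
        'actions.capability.MEDIA_RESPONSE_AUDIO');
    conv.ask(canPlay ? mediaResponse : fallbackText);
    return canPlay;
}
```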
Here's the code where I am playing the first track:
app.intent('TreatmentResponse', (conv, context, params) => {
    var treatmentTracks = [{url: 'url', name: 'name'}, {url: 'url', name: 'name'}];
    var result = playNext(treatmentTracks[0].url, treatmentTracks[0].name);
    var response = result[0];
    conv.data.currentTreatment = 'treatment';
    conv.data.currentTreatmentName = 'treatmentName';
    conv.data.treatmentPos = 1;
    conv.data.treatmentTracks = treatmentTracks;
    conv.ask("Excellent, I'll play some tracks in that category.");
    conv.ask(response);
    conv.ask(new Suggestions(['skip']));
});
and here is my callback function:
app.intent('Media Status', (conv) => {
    const mediaStatus = conv.arguments.get('MEDIA_STATUS');
    var { treatmentPos, treatmentTracks, currentTreatment, currentTreatmentName } = conv.data;
    if (mediaStatus && mediaStatus.status === 'FINISHED' && treatmentPos < treatmentTracks.length) {
        playNextTrack(conv, treatmentPos, treatmentTracks);
    } else {
        endConversation(conv, currentTreatment);
    }
});
Here's playNextTrack()
function playNextTrack(conv, pos, medias) {
    conv.data.treatmentPos = pos + 1;
    var result = playNext(medias[pos].url, medias[pos].name);
    var response = result[0];
    var ssml = result[1];
    conv.ask(ssml);
    conv.ask(response);
    conv.ask(new Suggestions(['skip']));
}
and playNext()
function playNext(url, name) {
    const response = new MediaObject({
        name: name,
        url: url,
    });
    var ssml = new SimpleResponse({
        text: 'Up next:',
        speech: '<speak><break time="1" /></speak>'
    });
    return [response, ssml];
}
The other issue is that when the MediaResponse is playing on my iPhone, if I interrupt it to say "Next" or "Skip", rather than triggering my "NextOrSkip" intent as it does in the simulator and on the Google Home Mini, it just says "sure" or "alright" [I don't have that anywhere in my code] and then is silent (and listening).
I'm trying to build an iOS web app that uses audio. While it has been a very fickle endeavor, I finally managed to get it to work in mobile Safari (interestingly, it worked in mobile Chrome long before that; I don't know why…). Yet when I save it as a web app on the home screen, the audio mysteriously stops working…
Here is the audio code. window.helpers.gong is a base64-encoded mp3 file.
I checked the console output of the web app via desktop Safari, yet no errors are thrown.
Any ideas what might be going wrong?
window.helpers.audio = {
    myAudioContext: null,
    mySource: null,
    myBuffer: null,
    init: function() {
        if ('AudioContext' in window) {
            this.myAudioContext = new AudioContext();
        } else if ('webkitAudioContext' in window) {
            this.myAudioContext = new webkitAudioContext();
        } else {
            alert('Your browser does not support yet Web Audio API');
        }
        var self = this;
        var load = (function(url) {
            var arrayBuff = window.helpers.Base64Binary.decodeArrayBuffer(window.helpers.gong);
            self.myAudioContext.decodeAudioData(arrayBuff, function(audioData) {
                self.myBuffer = audioData;
            });
        }());
    },
    play: function() {
        this.mySource = this.myAudioContext.createBufferSource();
        this.mySource.buffer = this.myBuffer;
        this.mySource.connect(this.myAudioContext.destination);
        if ('AudioContext' in window) {
            this.mySource.start(0);
        } else if ('webkitAudioContext' in window) {
            this.mySource.noteOn(0);
        }
    }
};
The code is called like this on load:
window.helpers.audio.init();
And later it is triggered through user action:
...
$('#canvas').click(function() {
    if (this.playing == false) {
        window.helpers.audio.play();
    }
}.bind(this));
...
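The play() method above feature-detects on the window object. Another approach (my sketch, not from the original app) is to detect on the source node itself, since old WebKit nodes expose noteOn() while newer implementations expose start():

```javascript
// Start a buffer source with whichever method the node provides:
// start() on modern implementations, noteOn() on old WebKit.
function startSource(source, when) {
    if (typeof source.start === 'function') {
        source.start(when || 0);
    } else {
        source.noteOn(when || 0);
    }
}
```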
Ouch, the answer was blindingly simple:
I had the mute switch on the side of the iPhone set to mute the whole time.
So it turns out that Safari plays audio even when the switch is set to mute, yet when you save the page as a web app, it doesn't anymore.
If I understand correctly, the audio works in desktop Safari but not in mobile Safari?
This could be the result of a limitation in mobile Safari that requires any sound to be triggered by a user action (for example, a click).
Read more here:
http://buildingwebapps.blogspot.com/2012/04/state-of-html5-audio-in-mobile-safari.html
I've read all the cross domain iframe posts here (my thanks to all of you!) and elsewhere.
The postMessage script at cross-domain iframe resizer? works beautifully in Firefox 5 and up. It resizes the iframe every time a page is clicked within the iframe perfectly.
But it doesn't resize at all in IE (7 8 or 9) on my computer. I checked the security settings and the one in IE for access across domains was checked to enable.
Does postMessage not work in IE? - Or is there something else that needs to be added? thanks
It's a great script from thomax - it also works on mobile, so you can use iframes on iPhones and Android.
For IE7 and IE8, you have to use window.attachEvent instead of window.addEventListener.
The event name should also be onmessage instead of message (see below). PS: you also need to do the same on the server side, with the framed content posting its size.
<script type="text/javascript">
if (window.addEventListener) {
    function resizeCrossDomainIframe(id) {
        var iframe = document.getElementById(id);
        window.addEventListener('message', function(event) {
            var height = parseInt(event.data) + 32;
            iframe.height = height + "px";
        }, false);
    }
} else if (window.attachEvent) {
    function resizeCrossDomainIframe(id) {
        var iframe = document.getElementById(id);
        // attachEvent takes no capture flag, unlike addEventListener
        window.attachEvent('onmessage', function(event) {
            var height = parseInt(event.data) + 32;
            iframe.height = height + "px";
        });
    }
}
</script>
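Both branches above share the same height calculation. Pulled out as a standalone helper (the function name and the null-on-bad-input behavior are my additions; the 32 matches the padding used in the listeners above):

```javascript
// Convert a postMessage payload into a CSS height string, adding the
// same fixed padding the listeners above use. Returns null for
// non-numeric payloads so a stray message can be ignored.
function messageToHeight(data, padding) {
    var h = parseInt(data, 10);
    return isNaN(h) ? null : (h + padding) + 'px';
}
```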
Using Peter's code and some ideas from here, you could separate the compatibility handling from the executable code and add some cross-site validation.
<script type="text/javascript">
// Create a browser-compatible event handler.
var eventMethod = window.addEventListener ? "addEventListener" : "attachEvent";
var eventer = window[eventMethod];
var messageEvent = eventMethod == "attachEvent" ? "onmessage" : "message";

// Listen for a message from the iframe.
eventer(messageEvent, function(e) {
    if (e.origin !== 'http://yourdomain.com' || isNaN(e.data)) return;
    document.getElementById('iframe_id_goes_here').style.height = e.data + 'px';
}, false);
</script>
Also, for completeness, you could use the following code within the iframe whenever you want to trigger the resize.
parent.postMessage(document.body.offsetHeight, '*');
You can use the implementation of Ben Alman. Here is an example of cross-domain communication, including an example of iframe resize.
http://benalman.com/code/projects/jquery-postmessage/examples/iframe/
According to the documentation, it works on Internet Explorer 6-8, Firefox 3, Safari 3-4, Chrome, Opera 9.
Having looked at lots of different solutions to this, I ended up writing a simple jQuery plugin that takes account of a number of different use cases, as I needed a solution that supported multiple user-generated iFrames on a portal page, supported browser resizes, and could cope with the host page's JavaScript loading after the iFrame. I also added support for sizing to width, a callback function, and the ability to override body.margin, as you will likely want this set to zero.
https://github.com/davidjbradshaw/iframe-resizer
The host page uses jQuery; the iframe code is just a little self-contained JavaScript, so it's a good guest on other people's pages.
The jQuery plugin is then initialised on the host page and has the following available options. More details on what these do are on the GitHub page.
$('iframe').iFrameSizer({
    log: false,
    contentWindowBodyMargin: 8,
    doHeight: true,
    doWidth: false,
    enablePublicMethods: false,
    interval: 33,
    autoResize: true,
    callback: function(messageData) {
        $('p#callback').html('<b>Frame ID:</b> ' + messageData.iframe.id +
            ' <b>Height:</b> ' + messageData.height +
            ' <b>Width:</b> ' + messageData.width +
            ' <b>Event type:</b> ' + messageData.type);
    }
});
If you set enablePublicMethods, the page inside the iframe gets access to methods that manually set the iFrame size and can even remove the iframe from the host page.