I have a sound component with one slot set to autoplay. On iOS, if I tap the screen, the audio plays as expected. However, if I swipe the screen first rather than tapping, the audio does not play, and subsequent taps do not cause it to play either.
Here is a simple example. It’s just a blank project with a sound component added.
I wonder if this is a known PlayCanvas issue or an Apple issue?
Is there anything I can do in JavaScript to fix it?
I can reproduce the issue on my phone; it seems to be a design decision on iOS. Here is a relevant thread; the following may be a solution, using a really large element as the initial input receiver:
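A minimal sketch of that idea: a full-screen element that swallows the very first touch and starts the audio from inside the gesture handler, then removes itself. The function name, the `doc` parameter, and the `onFirstTouch` callback are all hypothetical; wire the callback to however your app starts its sound.

```javascript
// Hypothetical helper: cover the page with an invisible element so the
// first touch always lands on it, unlock/start audio in the handler,
// then remove the overlay so normal input works afterwards.
function createAudioUnlockOverlay(doc, onFirstTouch) {
    var overlay = doc.createElement('div');
    overlay.style.cssText =
        'position:fixed;top:0;left:0;width:100%;height:100%;z-index:9999;';
    overlay.addEventListener('touchstart', function () {
        onFirstTouch(); // e.g. play the autoplay slot here
        overlay.parentNode.removeChild(overlay);
    });
    doc.body.appendChild(overlay);
    return overlay;
}
// Usage in the browser:
// createAudioUnlockOverlay(document, function () { /* start audio */ });
```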
Thanks, both your comments were helpful. All the touch events are received on the canvas. The iOS default behaviour is that the time between touchstart and touchend needs to be about the length of a "click" for the sound to play. If you delay the touchend too much, the sound doesn't play. Conversely, if you do a really quick swipe, e.g. just a couple of touchmove events, the sound still plays. I have been able to override this behaviour by registering a touchstart handler and starting the sound manually in there:
var Touchtest = pc.createScript('touchtest');

Touchtest.prototype.initialize = function () {
    // Cache the sound slots so the first one can be played by name later
    this.slots = [];
    for (var slot in this.entity.sound.slots) {
        this.slots.push(this.entity.sound.slots[slot]);
    }

    var touch = this.app.touch;
    if (touch) {
        touch.on(pc.EVENT_TOUCHSTART, this.onTouchStart, this);
    }
};

Touchtest.prototype.onTouchStart = function (event) {
    // Starting playback inside touchstart counts as a user gesture on iOS,
    // so the sound plays even if the gesture turns into a swipe
    this.entity.sound.play(this.slots[0].name);
};
This seems to be related to the SoundManager class. iOS does seem to ignore a touch drag as a valid interaction to unlock audio, but subsequent taps should correctly trigger it. It looks like the SoundManager gets stuck in its unlocking state. I've raised an issue here and you can test it on this project
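Until that is fixed, a generic workaround is to resume the suspended Web Audio context yourself on any touchstart. This is only a sketch, not the engine's official API: `ctx` and `el` are assumptions for whatever AudioContext and input element your app exposes (PlayCanvas keeps its context on the sound manager, but the exact property can differ between engine versions).

```javascript
// Sketch of a manual unlock: resume a suspended AudioContext on any
// touchstart, then remove the handler once the context is running.
// `ctx` is an AudioContext; `el` is the element receiving touch events
// (both are assumptions -- wire them to whatever your app exposes).
function installTouchUnlock(ctx, el) {
    function unlock() {
        if (ctx.state !== 'suspended') {
            el.removeEventListener('touchstart', unlock);
            return;
        }
        // resume() returns a promise; drop the handler once it resolves
        ctx.resume().then(function () {
            el.removeEventListener('touchstart', unlock);
        });
    }
    el.addEventListener('touchstart', unlock);
}
```

Because the handler fires on touchstart rather than on a completed tap, the context unlocks even when the gesture becomes a swipe, which matches the workaround found above.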