How Do You Actually Use an Audio Sprite?

I have looked around at the GFS tutorial, the Zenva tutorial, and the Phaser examples. I’ve also looked in this forum and the HTML5gamedevs forum. There are a few questions, but none of them have been answered. The current examples are useless for getting a basic understanding, and the docs don’t help, e.g. this syntactic poop.

How do you play a specific sound after you load an audiosprite?
### Preload

    this.load.audioSprite('soundsprite', 'assets/soundsprite.json', [
        'assets/soundsprite.mp3'
    ]);

Do you load it like this?

    var specificSound = this.sound.add('soundsprite', 'specificSound');

Or do you just do this?

    var specificSound = this.sound.addAudioSprite('soundsprite');

Secondly, what is the difference between Web Audio, HTML5 Audio, and No Audio? Is there something that explains this?

Finally, is the time property in fractional seconds or in ms? If it’s in seconds, is there a way to convert to ms?

If you use the audiosprite utility, which creates the JSON and four different audio formats, then when loading you only need to load the JSON:

    this.load.audioSprite('your_cache_key', 'assets/audio/youraudio.json');
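
For reference, the JSON the audiosprite utility generates looks roughly like this; the file names, marker name, and times below are invented for illustration, and the start/end values are fractional seconds:

    {
        "resources": [
            "youraudio.ogg",
            "youraudio.m4a",
            "youraudio.mp3",
            "youraudio.ac3"
        ],
        "spritemap": {
            "collect_coin_sound": {
                "start": 0,
                "end": 1.48,
                "loop": false
            }
        }
    }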

You play your audio like this:

    this.sound.playAudioSprite('your_cache_key', 'collect_coin_sound');

Anyway, you shouldn’t worry too much about which kind of audio it is, since Phaser takes care of that, and with four different formats it will work on most devices. I’m not sure what you mean about seconds and milliseconds, but the audio sprite functionality finds the right time offsets in the audio file automatically, just like a spritesheet does for images.
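
On the Web Audio / HTML5 Audio / No Audio part: Web Audio is the modern browser audio API, HTML5 Audio is the fallback that plays through plain `<audio>` elements, and No Audio is a stub sound manager for running without sound. Phaser picks Web Audio when available, but you can force a backend through the game config. A minimal sketch, assuming an otherwise standard Phaser 3 setup:

    // Minimal game config forcing the HTML5 Audio backend.
    const config = {
      type: Phaser.AUTO,
      width: 800,
      height: 600,
      audio: {
        disableWebAudio: true // use HTML5 Audio instead of Web Audio
        // noAudio: true      // or run with the No Audio stub
      },
      scene: [MainScene] // your scene class
    }

    new Phaser.Game(config)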

I would do it like this:

    export default class MainScene extends Phaser.Scene {
      sfx

      constructor() {
        super({ key: 'MainScene' })
      }

      preload() {
        // Loading from the Phaser Labs server, so allow cross-origin requests.
        this.load.crossOrigin = 'anonymous'
        this.load.baseURL = 'https://labs.phaser.io/'

        // One JSON sprite map plus the audio in two formats;
        // Phaser picks whichever format the browser can play.
        this.load.audioSprite('sfx', 'assets/audio/SoundEffects/fx_mixdown.json', [
          'assets/audio/SoundEffects/fx_mixdown.ogg',
          'assets/audio/SoundEffects/fx_mixdown.mp3'
        ])
      }

      create() {
        // Create one sound object for the whole sprite,
        // then play individual markers by name.
        this.sfx = this.sound.addAudioSprite('sfx')
        this.sfx.play('escape')
      }
    }

That’s a perfect example, thank you. I just got it to work by frantically trying all of the possibilities.

What really got me off track was that the evil folks at Google decided they knew best. I was logging too many things and missed the Web Audio warning about autoplay. I think I need to switch browsers for development now. Since I am doing this for myself, I don’t need to support Chrome if I get too frustrated. :slight_smile:

The follow-up, I guess, is: how would one stop Google from being evil and blocking your audio?

Is a preloader with a forced click the only solution?

It’s in seconds, as I found out after getting it to work. :slight_smile: Milliseconds are the standard in the audio world for SMPTE stuff. I can’t think of an immediate downside to having it as a floating-point number, though. It does save weight, so that’s one in the positive column.
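
If you do need milliseconds, it’s a straight multiplication. A quick sketch, using a made-up spritemap entry:

    // Times in the generated JSON are fractional seconds.
    const marker = { start: 0, end: 1.48 } // hypothetical values

    const startMs = marker.start * 1000
    const durationMs = (marker.end - marker.start) * 1000 // 1480 ms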

Yes, you need a mouse interaction/click before audio can start playing, for example a start button. It’s best to take that into account in your game design.
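
A minimal sketch of that idea, dropped into the create() of the MainScene example above; the prompt text and coordinates are made up:

    create() {
      const prompt = this.add.text(400, 300, 'Click to start', { fontSize: '32px' }).setOrigin(0.5)

      // Browsers only allow audio after a user gesture, so wait for one click.
      this.input.once('pointerdown', () => {
        prompt.destroy()
        this.sound.playAudioSprite('sfx', 'escape')
      })
    }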

Thank you! I was afraid of that.

The problem is that it’s not so much a game as a dapp, where stuff plays and people watch before interacting. I guess I will just default the mute button to off for Chrome. Google seems to think people only use sound for ads and video playback.
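
This won’t get around the block, but Phaser exposes a locked flag and an 'unlocked' event on the sound manager, so you can queue the playback to start on the first interaction instead of silently failing. A sketch, assuming the same 'sfx' key as above:

    create() {
      if (this.sound.locked) {
        // Audio is blocked until a user gesture; start as soon as Phaser unlocks it.
        this.sound.once('unlocked', () => {
          this.sound.playAudioSprite('sfx', 'escape')
        })
      } else {
        this.sound.playAudioSprite('sfx', 'escape')
      }
    }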

I’m having a fun time learning Phaser. I hope I’ll be able to contribute some to the docs once I get it figured out. There’s a lot of stuff that’s just missing basic examples. The majority of devs have gaming experience, but Phaser 3 plus a custom build via Webpack opens the door to a whole bunch of uses.