
AppleSequencer: mismatch between hostTime(forBeats:) and actual play time #2944

@ImmuneToBoredom

macOS Version(s) Used to Build

macOS 13 Ventura

Xcode Version(s)

Xcode 14

Description

In AppleSequencer, there's a mismatch between:

  • the return value of the hostTime(forBeats:) method;
  • the actual time at which the sequencer plays that beat.

More precisely:
Setup: I add a track to the sequencer that plays a note at beat 1.0 through a MIDISampler, then I start the sequencer and get t = hostTime(forBeats: 1.0), and finally I ask an AudioPlayer to play(at: t).
Result: the first output of the AudioPlayer lands ~1300 frames (≈29 ms at 44.1 kHz) later than the first output of the MIDISampler.

This lag does not occur when I replace AppleSequencer with AVAudioSequencer.
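For comparison, here is a minimal sketch of the AVAudioSequencer variant that does not exhibit the lag. It is a sketch only, not the full repro: it assumes the same `engine`, `player`, and "D.midi" file as the code below, and uses `engine.avEngine`, AudioKit's handle to the underlying AVAudioEngine.

```swift
import AVFoundation
import AudioKit

// Sketch of the AVAudioSequencer variant (assumes the repro's
// `engine: AudioEngine`, `player: AudioPlayer`, and "D.midi").
func playWithAVAudioSequencer(engine: AudioEngine, player: AudioPlayer) throws {
    let avSequencer = AVAudioSequencer(audioEngine: engine.avEngine)
    try avSequencer.load(
        from: Bundle.main.url(forResource: "D", withExtension: "midi")!,
        options: [])
    try avSequencer.start()
    // With AVAudioSequencer, hostTime(forBeats:) lines up with playback:
    let hostTime = try avSequencer.hostTime(forBeats: 1.0)
    player.play(at: AVAudioTime(hostTime: hostTime))
}
```
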

Crash Logs, Screenshots or Other Attachments (if applicable)

Here's the repro code:

import AVFoundation
import AudioKit

class Conductor: ObservableObject {
    let syncStart = true
    let beatShift = 1.0

    let engine = AudioEngine()
    let player = AudioPlayer()
    let mixer: Mixer
    let sampler = MIDISampler()
    let sequencer = AppleSequencer()
    let midiCallback = MIDICallbackInstrument()

    init() {
        mixer = Mixer(player, sampler)
        engine.output = mixer
        loadMp3()
        loadMidi()
        plugSampler()
        if syncStart {
            shiftAllNotes(by: beatShift)
        }
        globalSyncDebug()
        do { try engine.start() } catch {
            print("Could not start engine: \(error)")
        }
    }

    func play() {
        prepare()
        sequencer.play()
        if syncStart {
            // Polling workaround taken from https://stackoverflow.com/questions/52902746/avaudioengine-synchronization-for-midi-playback-and-recording/52960011#52960011
            let hostTimeStartMargin = 0.1
            while sequencer.currentPosition.beats <= hostTimeStartMargin {
                usleep(1_000)
            }
            }
            let hostTime = try! sequencer.hostTime(forBeats: beatShift)
            let t = AVAudioTime(hostTime: hostTime)
            player.play(at: t)
        } else {
            player.play()
        }
    }

    func stop() {
        player.stop()
        sequencer.stop()
        sequencer.rewind()
    }

    private func loadMp3() {
        do {
            try player.load(
                url: Bundle.main.url(
                    forResource: "Drums", withExtension: "mp3")!, buffered: true
            )
        } catch {
            print("Could not load mp3 file: \(error)")
            return
        }
    }

    private func loadMidi() {
        sequencer.addMIDIFileTracks(
            Bundle.main.url(forResource: "D", withExtension: "midi")!,
            useExistingSequencerLength: false)
    }

    private func plugSampler() {
        for track in sequencer.tracks {
            track.setMIDIOutput(sampler.midiIn)
        }
    }

    private func shiftAllNotes(by beats: Double) {
        for track in sequencer.tracks {
            let shiftedNotes = track.getMIDINoteData().map {
                return MIDINoteData(
                    noteNumber: $0.noteNumber,
                    velocity: $0.velocity,
                    channel: $0.channel,
                    duration: $0.duration,
                    position: $0.position + Duration(beats: beats))
            }
            track.replaceMIDINoteData(with: shiftedNotes)
        }
    }

    private func prepare() {
        sequencer.preroll()
        player.playerNode.prepare(withFrameCount: 8192)
    }

    private func globalSyncDebug() {
        installOutputTap(sampler.avAudioNode, msg: "=== SAMPLER STARTED ===")
        installOutputTap(player.avAudioNode, msg: "=== PLAYER STARTED ===")
    }

    private func installOutputTap(_ node: AVAudioNode, msg: String) {
        node.installTap(
            onBus: 0, bufferSize: 1, format: node.outputFormat(forBus: 0)
        ) { buffer, when in
            guard let channelData = buffer.floatChannelData else { return }
            let frameCount = Int(buffer.frameLength)
            let audioData = Array(
                UnsafeBufferPointer(start: channelData[0], count: frameCount))
            guard let index = audioData.firstIndex(where: { $0 != 0 }) else {
                return
            }
            print(msg)
            print("Audio Format: \(node.outputFormat(forBus: 0).settings)")
            print("Output Time: \(when)")
            print(
                "The first nonzero frame is at index: \(index) / \(frameCount)."
            )
        }
    }
}
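A possible (untested) workaround, assuming the measured offset is stable across runs: subtract the ~1300-frame lag from the host time before scheduling the player. The fixed lag value is an assumption taken from my measurement above; `sequencer`, `beatShift`, and `player` refer to the repro code.

```swift
import AVFoundation

// Hypothetical compensation, assuming a stable ~1300-frame lag at 44.1 kHz.
let lagSeconds = 1_300.0 / 44_100.0                     // ≈ 0.0295 s
let lagHostTicks = AVAudioTime.hostTime(forSeconds: lagSeconds)

// In play(), schedule the player slightly earlier:
// let hostTime = try sequencer.hostTime(forBeats: beatShift)
// player.play(at: AVAudioTime(hostTime: hostTime - lagHostTicks))
```
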
