synaudio is a JavaScript and WebAssembly library that finds the synchronization points between two or more similar audio clips.
Audio clips are synchronized using the Pearson correlation coefficient to compare the samples of the digital audio. The clips are shifted slightly against each other, compared, and the correlation coefficient for each shift is stored. The shift offset with the highest correlation coefficient is returned as the synchronization point.
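The core idea can be sketched in plain JavaScript. This is an illustrative, unoptimized version of the offset search using Pearson correlation; the library's actual implementation is hand-optimized WebAssembly SIMD, and `findBestOffset` here is a hypothetical helper, not part of the synaudio API:

```javascript
// Pearson correlation coefficient between two equal-length sample windows
const pearson = (a, b) => {
  const n = a.length;
  let sumA = 0, sumB = 0, sumAB = 0, sumA2 = 0, sumB2 = 0;
  for (let i = 0; i < n; i++) {
    sumA += a[i];
    sumB += b[i];
    sumAB += a[i] * b[i];
    sumA2 += a[i] * a[i];
    sumB2 += b[i] * b[i];
  }
  const cov = n * sumAB - sumA * sumB;
  const den =
    Math.sqrt(n * sumA2 - sumA * sumA) * Math.sqrt(n * sumB2 - sumB * sumB);
  return den === 0 ? 0 : cov / den;
};

// naive search: slide a window of `comparison` across `base`
// and keep the offset with the highest correlation
const findBestOffset = (base, comparison, windowSize) => {
  const window = comparison.subarray(0, windowSize);
  let best = { sampleOffset: 0, correlation: -Infinity };
  for (let offset = 0; offset <= base.length - windowSize; offset++) {
    const correlation = pearson(
      base.subarray(offset, offset + windowSize),
      window
    );
    if (correlation > best.correlation) best = { sampleOffset: offset, correlation };
  }
  return best;
};
```

A real implementation avoids the quadratic cost with a coarse first pass (see the `initialGranularity` option below) and SIMD, but the result is the same: the offset whose window correlates best.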
- Correlation algorithm implemented as hand-optimized WebAssembly 128-bit SIMD instructions
- Works in all major browsers and JavaScript runtimes
- Built-in Web Worker implementations for parallel processing
- Syncing audio playback for streaming radio when the stream switches bit-rates, codecs, or reconnects. See IcecastMetadataPlayer
- Syncing multiple recordings from analog tape where the speed has slightly varied between captures. See synaudio-cli
- Syncing multiple digital audio recordings where the digital clocks were mismatched.
- Syncing audio to an existing audio / video stream.
- Syncing audio clips that have been cut or recorded in multiple segments so they can be reconstructed.
- Syncing multitrack recordings that have slight delays, phase differences, or mismatched clocks.
View the live Demo!
Install via NPM:

```
npm i synaudio
```
- Create a new instance of `SynAudio`.

  ```js
  import SynAudio from 'synaudio';

  const synAudio = new SynAudio({
    correlationSampleSize: 5000,
    initialGranularity: 16,
  });
  ```
SynAudio can synchronize two clips: one base and one comparison. The comparison clip must be a subset of the base clip in order for there to be a valid match. If you don't know the ordering of the clips, see Sync Multiple Clips
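For example, a known-good base/comparison pair for experimentation can be built by slicing the comparison's channel data directly out of the base. `subarray` creates a view over the same buffer, so this is cheap; single-channel audio is assumed here for brevity:

```javascript
// a known-good test pair: `comparison` is a slice of `base`,
// so the best match should be found at sample offset 2000
const leftChannel = new Float32Array(10000).map((_, i) => Math.sin(i * 0.07));

const base = {
  channelData: [leftChannel],
  samplesDecoded: leftChannel.length,
};

const comparisonChannel = leftChannel.subarray(2000, 5000);
const comparison = {
  channelData: [comparisonChannel],
  samplesDecoded: comparisonChannel.length,
};
```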
- Call the `sync`, `syncWorker`, or `syncWorkerConcurrent` method on the instance to find the synchronization point, in samples, between two audio clips.
  - See the API section below for details on these methods.

  ```js
  // example "base" object
  const base = {
    channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
    samplesDecoded: 5678,
  };

  // example "comparison" object
  const comparison = {
    channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
    samplesDecoded: 1234,
  };

  const {
    sampleOffset, // position relative to `base` where `comparison` matches best
    correlation, // correlation coefficient of the match, ranging from -1 (worst) to 1 (best)
  } = await synAudio.syncWorkerConcurrent(
    base, // audio data to use as the base for the comparison
    comparison, // audio data to compare against the base
    4 // number of threads to spawn
  );
  ```
syncOneToMany will synchronize one base to one or more comparison clips. The comparison clip(s) must be derived from the same audio source as the base clip in order for there to be a valid match.
- Note: This method requires `SharedMemory` to be enabled on your runtime platform.
- Call the `syncOneToMany` method on the instance to find the synchronization points, in samples, between the base and each comparison audio clip.
  - See the API section below for details on this method.

  ```js
  // example "base" object
  const base = {
    channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
    samplesDecoded: 20000,
  };

  // example "comparisonClips" array
  const comparisonClips = [
    {
      name: "clip1",
      data: {
        // first comparison clip
        channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
        samplesDecoded: 1234,
      },
      syncStart: 0,
      syncEnd: 10000,
    },
    {
      name: "clip2",
      data: {
        // second comparison clip
        channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
        samplesDecoded: 1234,
      },
      syncStart: 5000,
      syncEnd: 15000,
    },
  ];

  const results = await synAudio.syncOneToMany(
    base, // audio data to use as the base for the comparison
    comparisonClips, // array of audio data to compare against the base
    16, // number of comparison clips to sync concurrently
    onProgressUpdate // optional callback function
  );
  ```
- The return value will contain an array of `MultipleClipMatch` objects that represent the correlation and sample offset for each comparison clip. The order of the results matches the order of the input comparison clips.

  ```js
  // results example
  [
    {
      // first comparison clip
      name: "clip1",
      sampleOffset: 1234, // position relative to `base` where the first `comparison` matches best
      correlation: 0.8,
    },
    {
      // second comparison clip
      name: "clip2",
      sampleOffset: 5678, // position relative to `base` where the second `comparison` matches best
      correlation: 0.9,
    },
  ];
  ```
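Since `sampleOffset` is expressed in samples, converting a result to a time offset only needs the sample rate the audio was decoded at. A 44100 Hz rate is an assumption here, and `toSeconds` is an illustrative helper, not part of the synaudio API:

```javascript
// convert a match result's sample offset into seconds
const sampleRate = 44100; // sample rate used when the audio was decoded

const toSeconds = (match) => match.sampleOffset / sampleRate;

const match = { name: "clip1", sampleOffset: 1234, correlation: 0.8 };
console.log(toSeconds(match).toFixed(3)); // prints "0.028"
```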
syncMultiple will find the best linear match(es) between a set of two or more clips. Internally, SynAudio determines the correlation of every ordered combination of clips and then finds the path(s) through this graph where the correlation is highest.
- Call the `syncMultiple` method on the instance to find the best synchronization path between two or more audio clips.

  ```js
  // example "clips" array
  const clips = [
    {
      name: "clip1",
      data: {
        channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
        samplesDecoded: 64445,
      },
    },
    {
      name: "clip2",
      data: {
        channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
        samplesDecoded: 24323,
      },
    },
    {
      name: "clip3",
      data: {
        channelData: [leftChannelFloat32Array, rightChannelFloat32Array],
        samplesDecoded: 45675,
      },
    },
  ];

  const results = await synAudio.syncMultiple(
    clips, // array of clips to compare
    8 // number of threads to spawn
  );
  ```
- The `results` object will contain a two-dimensional array of match groups containing matching clips. Each match group represents an ordered list of matching audio clips where each clip relates to the previous. The sample offset within each match group is relative to the first clip in the series.
- In the below example, there are two match groups, the first containing three clips and the second containing two clips. There was no significant correlation (no correlation >= `options.correlationThreshold`) found between the clips in the two match groups. If a clip were to exist that relates the two groups together, then the result would contain only one match group and relate all other clips to the first one in sequential order.

  ```js
  // results example
  [
    // first match group (no relation to second group)
    [
      {
        name: "cut_1601425_Mpeg", // first clip in match
        sampleOffset: 0,
      },
      {
        name: "cut_2450800_Mpeg",
        correlation: 0.9846370220184326,
        sampleOffset: 849375, // position where this clip starts relative to the first clip
      },
      {
        name: "cut_2577070_Mpeg",
        correlation: 0.9878544973345423,
        sampleOffset: 975645, // position where this clip starts relative to the first clip
      },
    ],
    // second match group (no relation to first group)
    [
      {
        name: "cut_194648_Mpeg",
        sampleOffset: 0,
      },
      {
        name: "cut_287549_Mpeg",
        correlation: 0.9885798096656799,
        sampleOffset: 92901, // position where this clip starts relative to the first clip
      },
    ],
  ];
  ```
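Because offsets within a match group are relative to that group's first clip, a flat list tagged with a group index can be derived from the nested results. `toEditList` below is an illustrative sketch, not part of the synaudio API:

```javascript
// flatten syncMultiple-style results into one list of
// { group, name, sampleOffset } entries
const toEditList = (results) =>
  results.flatMap((group, groupIndex) =>
    group.map(({ name, sampleOffset }) => ({
      group: groupIndex, // which match group this clip belongs to
      name,
      sampleOffset, // still relative to the first clip in this group
    }))
  );
```

Entries from different groups share no common time base, so a downstream editor would place each group on its own timeline.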
Class that finds the synchronization point between two or more similar audio clips.
```js
new SynAudio({
  correlationSampleSize: 1234,
  initialGranularity: 1,
});
```

```ts
declare interface SynAudioOptions {
  correlationSampleSize?: number; // default 11025
  initialGranularity?: number; // default 16
  correlationThreshold?: number; // default 0.5
  shared?: boolean; // default false
}
```

- `correlationSampleSize` *optional, defaults to 11025*
  - Number of samples to compare while finding the best offset
  - Higher numbers will increase accuracy at the cost of slower computation
- `initialGranularity` *optional, defaults to 16*
  - Number of samples to jump while performing the first pass search
  - Higher numbers will decrease accuracy at the benefit of much faster computation
- `correlationThreshold` *optional, defaults to 0.5*
  - Threshold that filters out any low correlation matches
  - Only applicable to `syncMultiple`
- `shared` *optional, defaults to false*
  - Enables the use of `SharedMemory` between the main and worker threads
  - Runtime environment must support WebAssembly SIMD and Shared Memory features
  - Must be enabled to use `syncOneToMany`
```ts
declare class SynAudio {
  constructor(options?: SynAudioOptions);

  /*
   * Two Clips
   */
  public sync(
    base: PCMAudio,
    comparison: PCMAudio
  ): Promise<TwoClipMatch>;
  public syncWorker(
    base: PCMAudio,
    comparison: PCMAudio
  ): Promise<TwoClipMatch>;
  public syncWorkerConcurrent(
    base: PCMAudio,
    comparison: PCMAudio,
    threads?: number // default 1
  ): Promise<TwoClipMatch>;

  /*
   * One Base, Multiple Comparison Clips
   */
  public syncOneToMany(
    base: PCMAudio,
    comparisonClips: AudioClip[],
    threads?: number, // default 1
    onProgressUpdate?: (progress: number) => void
  ): Promise<MultipleClipMatch[]>;

  /*
   * Multiple Clips
   */
  public syncMultiple(
    clips: AudioClip[],
    threads?: number // default 8
  ): Promise<MultipleClipMatchList[]>;
}
```

`sync(base: PCMAudio, comparison: PCMAudio): Promise<TwoClipMatch>`

- Executes on the main thread.
- Parameters
  - `base` Audio data to compare against
  - `comparison` Audio data to use as a comparison
- Returns
  - Promise resolving to `TwoClipMatch` that contains the `correlation` and `sampleOffset`
`syncWorker(base: PCMAudio, comparison: PCMAudio): Promise<TwoClipMatch>`

- Executes in a separate thread as a web worker.
- Parameters
  - `base` Audio data to compare against
  - `comparison` Audio data to use as a comparison
- Returns
  - Promise resolving to `TwoClipMatch` that contains the `correlation` and `sampleOffset`
`syncWorkerConcurrent(base: PCMAudio, comparison: PCMAudio, threads?: number): Promise<TwoClipMatch>`

- Splits the incoming data into chunks and spawns multiple workers that execute concurrently.
- Parameters
  - `base` Audio data to compare against
  - `comparison` Audio data to use as a comparison
  - `threads` Number of threads to spawn *optional, defaults to 1*
- Returns
  - Promise resolving to `TwoClipMatch` that contains the `correlation` and `sampleOffset`
`syncOneToMany(base: PCMAudio, comparisonClips: AudioClip[], threads?: number, onProgressUpdate?: (progress: number) => void): Promise<MultipleClipMatch[]>`

- Executes on worker threads, one for each comparison clip.
- Parameters
  - `base` Audio data to use as the base of comparison
  - `comparisonClips` Array of `AudioClip`(s) to compare
    - `syncStart` and `syncEnd` can be set for each comparison `AudioClip` to define the sync range of the base clip. If you have a known lower or upper bound, this will speed up the sync operation by limiting the search to those bounds.
  - `threads` Maximum number of concurrent comparisons to run *optional, defaults to 1*
  - `onProgressUpdate` Callback function that is called with a number from 0 to 1 indicating the progress of the sync operation
- Returns
  - Promise resolving to an array of `MultipleClipMatch` that contains the `name`, `correlation`, and `sampleOffset` for each comparison clip
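A minimal `onProgressUpdate` callback might just format the 0 to 1 progress value as a percentage. This is a sketch; how progress is surfaced is up to the application:

```javascript
// format the 0..1 progress value reported by syncOneToMany
const formatProgress = (progress) =>
  `sync progress: ${Math.round(progress * 100)}%`;

// pass this as the fourth argument to syncOneToMany
const onProgressUpdate = (progress) => console.log(formatProgress(progress));
```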
`syncMultiple(clips: AudioClip[], threads?: number): Promise<MultipleClipMatch[][]>`

- Executes on worker threads, one for each comparison.
- Parameters
  - `clips` Array of `AudioClip`(s) to compare
  - `threads` Maximum number of threads to spawn *optional, defaults to 8*
- Returns
  - Promise resolving to `MultipleClipMatchList[]`, a two-dimensional array containing a list of matching audio clip groups
```ts
interface PCMAudio {
  channelData: Float32Array[];
  samplesDecoded: number;
}
```

- `channelData`
  - Array of Float32Array of audio data
  - Each Float32Array represents a single channel
  - Each channel should be exactly the same length
- `samplesDecoded`
  - Total number of samples in a single audio channel
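In a browser, this shape maps directly onto the Web Audio API's `AudioBuffer`, so a buffer decoded with `AudioContext.decodeAudioData` can be adapted with a small helper. `toPCMAudio` is an illustrative sketch, not part of the synaudio API:

```javascript
// adapt a Web Audio API AudioBuffer into the PCMAudio shape
const toPCMAudio = (audioBuffer) => ({
  // one Float32Array per channel, in channel order
  channelData: Array.from(
    { length: audioBuffer.numberOfChannels },
    (_, channel) => audioBuffer.getChannelData(channel)
  ),
  // AudioBuffer.length is the sample count of a single channel
  samplesDecoded: audioBuffer.length,
});
```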
```ts
interface AudioClip {
  name: string;
  data: PCMAudio;
  syncStart?: number;
  syncEnd?: number;
}
```

- `name`
  - Name of the audio clip
- `data`
  - Audio data for the clip
- `syncStart`
  - Sample offset in the base file at which to start syncing
  - Defaults to `0` (start of the base audio clip)
- `syncEnd`
  - Sample offset in the base file at which to end syncing
  - Defaults to the length of the base audio clip (end of the base audio clip)
```ts
interface TwoClipMatch {
  correlation: number;
  sampleOffset: number;
}
```

- `correlation`
  - Correlation coefficient of the `base` and `comparison` audio at the `sampleOffset`
  - Ranging from -1 (worst) to 1 (best)
- `sampleOffset`
  - Number of samples relative to `base` where `comparison` has the highest correlation
```ts
declare type MultipleClipMatchList =
  | []
  | [MultipleClipMatchFirst, ...MultipleClipMatch[]];

declare interface MultipleClipMatchFirst {
  name: string;
  sampleOffset: 0;
}

declare interface MultipleClipMatch {
  name: string;
  correlation: number;
  sampleOffset: number;
}
```

- `name`
  - Name of the matching clip
- `correlation`
  - Correlation coefficient between the previous clip and this clip
  - Ranging from -1 (worst) to 1 (best)
- `sampleOffset`
  - Number of samples relative to the root clip (first clip in the match)
- Install Emscripten by following these instructions.
  - This repository has been tested with Emscripten 4.0.7.
- Make sure to `source` the Emscripten path in the terminal you want to build in.
  - i.e. `$ source path/to/emsdk/emsdk_env.sh`
- Run `npm i` to install the dependencies.
- Run `make clean` and `make` to build the libraries.
  - You can run `make -j8`, where `8` is the number of CPU cores on your system, to speed up the build.
- The builds will be located in the `dist` folder.
- Run `npm run test` to run the test suite.