Tone.js

    A Web Audio framework for making interactive music in the browser.

    Tone.js is a Web Audio framework for creating interactive music in the browser. The architecture of Tone.js aims to be familiar to both musicians and audio programmers looking to create web-based audio applications. At a high level, Tone offers common DAW (digital audio workstation) features, like a global transport for scheduling events, and prebuilt synths and effects. For signal-processing programmers (coming from languages like Max/MSP), Tone provides a wealth of high-performance building blocks to create your own synthesizers, effects, and complex control signals.





    • download
    • npm install tone
    • dev -> npm install tone@next


    You can import the entire library:

    import * as Tone from "tone";

    or individual modules:

    import { Synth } from "tone";

    Hello Tone

    //create a synth and connect it to the main output (your speakers)
    const synth = new Tone.Synth().toDestination();
    //play a middle 'C' for the duration of an 8th note
    synth.triggerAttackRelease("C4", "8n");


    Tone.Synth is a basic synthesizer with a single oscillator and an ADSR envelope.
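    The envelope's role can be written down in plain numbers. The following is an illustrative sketch, not Tone's implementation: the output amplitude t seconds after the attack begins, before the release stage.

```javascript
// Illustrative sketch, not Tone's implementation: amplitude of an ADSR
// envelope t seconds after the attack begins (release stage not modeled).
function adsrLevel(t, { attack, decay, sustain }) {
  if (t < attack) return t / attack;        // attack: ramp from 0 up to 1
  if (t < attack + decay) {                 // decay: fall from 1 down to sustain
    const progress = (t - attack) / decay;
    return 1 - progress * (1 - sustain);
  }
  return sustain;                           // sustain: hold until release
}
```

    With attack: 0.02, decay: 0.1, sustain: 0.2, the level reaches 1 at 20 ms, then settles at 0.2 until the note is released.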


    triggerAttackRelease is a combination of two methods: triggerAttack, which starts the amplitude rising (for example from a 'key down' or 'note on' event), and triggerRelease, which ramps the amplitude back to 0 ('key up' / 'note off').

    The first argument to triggerAttackRelease is the frequency, which can be either a number (like 440) or a string in "pitch-octave" notation (like "D#2"). The second argument is the duration the note is held. This value can be given in seconds or as a tempo-relative value. The third (optional) argument is the time, along the AudioContext's timeline, at which the note should play; it can be used to schedule events in the future.


    Tone.js abstracts away the AudioContext time. Instead of defining all values in seconds, any method which takes time as an argument can accept a number or a string. For example "4n" is a quarter-note, "8t" is an eighth-note triplet, and "1m" is one measure.

    Read about Time encodings.
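    The arithmetic behind these encodings is straightforward. Here is a hypothetical helper (not part of Tone's API) that converts a few of the notations above into seconds at a given BPM:

```javascript
// Hypothetical helper, not Tone's API: tempo-relative notation to seconds.
// At a given BPM, one quarter-note lasts 60/bpm seconds.
function notationToSeconds(notation, bpm = 120) {
  const quarter = 60 / bpm;                   // seconds per quarter-note
  if (notation === "1m") return quarter * 4;  // one 4/4 measure
  const match = /^(\d+)(n|t)$/.exec(notation);
  if (!match) throw new Error("unsupported notation: " + notation);
  const subdivision = Number(match[1]);
  let seconds = (quarter * 4) / subdivision;  // "8n" = half a quarter-note
  if (match[2] === "t") seconds *= 2 / 3;     // triplet: 2/3 of the note value
  return seconds;
}
```

    At 120 BPM, "4n" is 0.5 seconds, "8t" is one sixth of a second, and "1m" is 2 seconds.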

    Starting Audio

    Browsers will not play any audio until a user clicks something (like a play button) and the AudioContext has had a chance to start. Run your Tone.js code only after calling Tone.start() from an event listener triggered by a user action such as "click" or "keydown".

    Tone.start returns a promise; the audio will be ready only after that promise resolves. Scheduling or playing audio before the AudioContext is running will result in silence or incorrect scheduling.

    //attach a click listener to a play button
    document.querySelector('button').addEventListener('click', async () => {
        await Tone.start()
        console.log('audio is ready')
    })



    Tone.Transport is the master timekeeper, allowing for application-wide synchronization and scheduling of sources, signals and events along a shared timeline. Time expressions (like the ones above) are evaluated against the Transport's BPM which can be set like this: Tone.Transport.bpm.value = 120.
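    The idea behind scheduling along a shared timeline can be sketched with a hypothetical helper (not Tone's API): at a given BPM, an event is quantized to the next beat boundary.

```javascript
// Hypothetical sketch of beat-aligned scheduling, not Tone's API:
// quantize a time (in seconds) to the next beat boundary at a given BPM.
function nextBeat(now, bpm) {
  const beat = 60 / bpm;                // seconds per beat
  return Math.ceil(now / beat) * beat;  // next multiple of the beat length
}
```

    At 120 BPM an event requested at 1.1 seconds would land on the beat at 1.5 seconds; the Transport performs this kind of musical-time bookkeeping for you.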


    Tone.js provides higher-level abstractions for scheduling events. Tone.Loop is a simple way to create a looped callback that can be scheduled to start and stop.

    const synth = new Tone.Synth().toDestination();
    //play a note every quarter-note
    const loop = new Tone.Loop(time => {
        synth.triggerAttackRelease("C2", "8n", time);
    }, "4n");

    Since JavaScript callbacks are not precisely timed, the sample-accurate time of the event is passed into the callback function. Use this time value to schedule the events.

    You can then start and stop the loop along the Transport's timeline.

    //loop between the first and fourth measures of the Transport's timeline
    loop.start("1m").stop("4m");

    Then start the Transport to hear the loop:

    Tone.Transport.start();


    Read about Tone.js' Event classes and scheduling events with the Transport.


    //pass in some initial values for the oscillator and envelope
    const synth = new Tone.Synth({
        oscillator : {
            type : "pwm",
            modulationFrequency : 0.2
        },
        envelope : {
            attack : 0.02,
            decay : 0.1,
            sustain : 0.2,
            release : 0.9
        }
    }).toDestination();
    //start the note "D3" one second from now
    synth.triggerAttack("D3", "+1");

    All instruments are monophonic (one voice) but can be made polyphonic by passing the constructor as the first argument to Tone.PolySynth.

    //a 4 voice Synth
    const polySynth = new Tone.PolySynth(Tone.Synth).toDestination();
    //play a chord
    polySynth.triggerAttackRelease(["C4", "E4", "G4", "B4"], "2n");

    Read more about Instruments.


    In the above examples, the synthesizer was always connected directly to the speakers, but the output of the synth could also be routed through one (or more) effects before going to the speakers.

    //create a distortion effect
    const distortion = new Tone.Distortion(0.4).toDestination();
    //connect a synth to the distortion
    synth.connect(distortion);

    Read more about Effects.


    Tone has a few basic audio sources: Tone.Oscillator, which has sine, square, triangle, and sawtooth waveforms; a buffer player (Tone.Player); a noise generator (Tone.Noise); a few additional oscillator types (pwm, pulse, fat, fm); and external audio input (when WebRTC is supported).

    //a pwm oscillator which is connected to the speaker and started right away
    const pwm = new Tone.PWMOscillator("Bb3").toDestination().start();
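    The basic waveform shapes can be written down directly. This is a naive sketch for illustration only — real oscillators, including Tone's, generate band-limited waveforms to avoid aliasing:

```javascript
// Naive sketch of the basic waveform shapes at phase p in [0, 1).
// Real oscillators are band-limited; this is for illustration only.
function waveSample(type, p) {
  switch (type) {
    case "sine": return Math.sin(2 * Math.PI * p);
    case "square": return p < 0.5 ? 1 : -1;
    case "sawtooth": return 2 * p - 1;
    case "triangle": return 1 - 4 * Math.abs(p - 0.5); // peak of 1 at p = 0.5
  }
}
```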

    Read more about Sources.


    Like the underlying Web Audio API, Tone.js is built with audio-rate signal control over nearly everything. This is a powerful feature which allows for sample-accurate synchronization and scheduling of parameters.

    Read more.
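    What an audio-rate parameter ramp computes can be sketched in a few lines. This is a hypothetical illustration of the arithmetic behind a linear ramp, not Tone's or the Web Audio API's implementation:

```javascript
// Hypothetical sketch (not Tone's code): the value of a parameter at time t
// during a linear ramp from v0 at time t0 to v1 at time t1. An audio-rate
// signal evaluates this for every sample, which is what makes the control
// sample-accurate.
function linearRampValue(t, t0, v0, t1, v1) {
  if (t <= t0) return v0;
  if (t >= t1) return v1;
  return v0 + (v1 - v0) * ((t - t0) / (t1 - t0));
}
```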


    Tone.js creates an AudioContext when it loads and shims it for maximum browser compatibility using standardized-audio-context. The AudioContext can be accessed at Tone.context. Or set your own AudioContext using Tone.setContext(audioContext).


    To use MIDI files, you'll first need to convert them into a JSON format which Tone.js can understand using Midi.


    Tone.js makes extensive use of the native Web Audio Nodes such as the GainNode and WaveShaperNode for all signal processing, which enables Tone.js to work well on both desktop and mobile browsers.
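    A WaveShaperNode applies a transfer curve to every sample of its input. As an illustration, here is a hypothetical soft-clipping curve of the kind a distortion effect might be built on (not Tone.Distortion's exact formula):

```javascript
// Hypothetical soft-clipping transfer curve (not Tone.Distortion's formula):
// small signals pass nearly unchanged, while peaks are squashed toward ±1.
function softClip(x) {
  return Math.tanh(2 * x);
}
```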

    This wiki article has some best-practice suggestions related to performance.


    Tone.js runs an extensive test suite using mocha and chai with nearly 100% coverage. Each commit and pull request is run on Travis-CI across browsers and versions. Passing builds on the 'dev' branch are published on npm as tone@next.


    There are many ways to contribute to Tone.js. Check out this wiki if you're interested.

    If you have questions (or answers) that are not necessarily bugs/issues, please post them to the forum.
