Interactive navigable audio visualization using Web Audio and Canvas.
wavesurfer.js works only in modern browsers that support Web Audio (Chrome, Firefox, Safari, Opera, etc.).
It will fall back to the Audio element in other browsers (without graphics). You can also try wavesurfer.swf, which is a Flash-based fallback with graphics.
To report a bug, please create a failing test case and submit a pull request. The test case can be either a Jasmine spec, or a simple HTML page demonstrating the problem.
For paid consultancy, feel free to email me at katspaugh@gmail.com. For trivial questions about programming, please refer to StackOverflow.
Yes, if you use the `backend: 'MediaElement'` option. See here: http://wavesurfer-js.org/example/audio-element/. The audio will start playing as soon as you press play. A thin line will be displayed until the whole audio file is downloaded and decoded, at which point the waveform is drawn.
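A minimal sketch of that configuration (the container selector and audio URL are placeholders):

```javascript
// Use the MediaElement backend so playback can start before the
// file is fully downloaded; the waveform is drawn once ready.
var wavesurfer = Object.create(WaveSurfer);

wavesurfer.init({
    container: '#wave',
    backend: 'MediaElement'
});

wavesurfer.load('example/media/demo.wav');
```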
No. Web Audio needs the whole file to decode it in the browser. You can, however, load pre-decoded waveform data to draw the waveform immediately. See here: http://wavesurfer-js.org/example/audio-element/ (the "Pre-recorded Peaks" section).
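A sketch of loading with pre-decoded peaks, assuming your version accepts a peaks array as the second argument to `load()` as in the linked example (the values below are placeholders for real pre-computed data):

```javascript
// Pre-computed peak data, e.g. exported on the server.
var peaks = [0.1, 0.4, 0.8, 0.3, 0.2];

// Passing the peaks lets wavesurfer draw the waveform
// without decoding the whole file in the browser.
wavesurfer.load('example/media/demo.wav', peaks);
```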
Create an instance:

```javascript
var wavesurfer = Object.create(WaveSurfer);
```

Initialize it with a container element (plus some options):

```javascript
wavesurfer.init({
    container: '#wave',
    waveColor: 'violet',
    progressColor: 'purple'
});
```

Subscribe to some events:

```javascript
wavesurfer.on('ready', function () {
    wavesurfer.play();
});
```

Load an audio file from a URL:

```javascript
wavesurfer.load('example/media/demo.wav');
```

See the example code here.
For a list of other projects using wavesurfer.js, check out the wiki where you can also add your own project.
| option | type | default | description |
|---|---|---|---|
| `audioContext` | object | none | Use your own previously initialized AudioContext or leave blank. |
| `audioRate` | float | 1 | Speed at which to play audio. Lower number is slower. |
| `backend` | string | WebAudio | `WebAudio` or `MediaElement`. In most cases you don't have to set this manually. `MediaElement` is a fallback for unsupported browsers. |
| `barWidth` | number | none | If specified, the waveform will be drawn like this: ▁ ▂ ▇ ▃ ▅ ▂ |
| `container` | mixed | none | CSS selector or HTML element where the waveform should be drawn. This is the only required parameter. |
| `cursorColor` | string | #333 | The fill color of the cursor indicating the playhead position. |
| `cursorWidth` | integer | 1 | Measured in pixels. |
| `fillParent` | boolean | true | Whether to fill the entire container or draw only according to `minPxPerSec`. |
| `height` | integer | 128 | The height of the waveform. Measured in pixels. |
| `hideScrollbar` | boolean | false | Whether to hide the horizontal scrollbar when one would normally be shown. |
| `interact` | boolean | true | Whether the mouse interaction will be enabled at initialization. You can switch this parameter at any time later on. |
| `minPxPerSec` | integer | 50 | Minimum number of pixels per second of audio. |
| `normalize` | boolean | false | If true, normalize by the maximum peak instead of 1.0. |
| `pixelRatio` | integer | window.devicePixelRatio | Can be set to 1 for faster rendering. |
| `progressColor` | string | #555 | The fill color of the part of the waveform behind the cursor. |
| `scrollParent` | boolean | false | Whether to scroll the container with a lengthy waveform. Otherwise the waveform is shrunk to the container width (see `fillParent`). |
| `skipLength` | float | 2 | Number of seconds to skip with the `skipForward()` and `skipBackward()` methods. |
| `waveColor` | string | #999 | The fill color of the waveform after the cursor. |
All methods are intentionally public, but the most readily available are the following:
- `init(options)` – Initializes with the options listed above.
- `destroy()` – Removes events, elements and disconnects Web Audio nodes.
- `empty()` – Clears the waveform as if a zero-length audio is loaded.
- `getCurrentTime()` – Returns current progress in seconds.
- `getDuration()` – Returns the duration of an audio clip in seconds.
- `isPlaying()` – Returns true if currently playing, false otherwise.
- `load(url)` – Loads audio from URL via XHR. Returns XHR object.
- `loadBlob(url)` – Loads audio from a `Blob` or `File` object.
- `on(eventName, callback)` – Subscribes to an event. See WaveSurfer Events section below for a list.
- `un(eventName, callback)` – Unsubscribes from an event.
- `unAll()` – Unsubscribes from all events.
- `pause()` – Stops playback.
- `play([start[, end]])` – Starts playback from the current position. Optional `start` and `end` measured in seconds can be used to set the range of audio to play.
- `playPause()` – Plays if paused, pauses if playing.
- `seekAndCenter(progress)` – Seeks to a progress and centers view `[0..1]` (0 = beginning, 1 = end).
- `seekTo(progress)` – Seeks to a progress `[0..1]` (0 = beginning, 1 = end).
- `setFilter(filters)` – For inserting your own WebAudio nodes into the graph. See Connecting Filters below.
- `setPlaybackRate(rate)` – Sets the speed of playback (`0.5` is half speed, `1` is normal speed, `2` is double speed and so on).
- `setVolume(newVolume)` – Sets the playback volume to a new value `[0..1]` (0 = silent, 1 = maximum).
- `skip(offset)` – Skip a number of seconds from the current position (use a negative value to go backwards).
- `skipBackward()` – Rewind `skipLength` seconds.
- `skipForward()` – Skip ahead `skipLength` seconds.
- `stop()` – Stops and goes to the beginning.
- `toggleMute()` – Toggles the volume on and off.
- `toggleInteraction()` – Toggle mouse interaction.
- `toggleScroll()` – Toggles `scrollParent`.
- `zoom(pxPerSec)` – Horizontally zooms the waveform in and out. The parameter is a number of horizontal pixels per second of audio. It also changes the parameter `minPxPerSec` and enables the `scrollParent` option.
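A short sketch of typical playback control once an instance is ready (the button ids are hypothetical placeholders):

```javascript
wavesurfer.on('ready', function () {
    wavesurfer.setVolume(0.8);   // 0 = silent, 1 = maximum
    wavesurfer.play(0, 10);      // play the first ten seconds
});

// Hypothetical UI hooks, assuming these elements exist on the page.
document.querySelector('#btn-play').addEventListener('click', function () {
    wavesurfer.playPause();
});
document.querySelector('#btn-forward').addEventListener('click', function () {
    wavesurfer.skipForward();    // skips skipLength seconds ahead
});
```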
You can insert your own Web Audio nodes into the graph using the method `setFilter()`. Example:

```javascript
// Create a filter node from the same AudioContext wavesurfer uses,
// then insert it into wavesurfer's audio graph.
var lowpass = wavesurfer.backend.ac.createBiquadFilter();
wavesurfer.backend.setFilter(lowpass);
```

General events:

- `audioprocess` – Fires continuously as the audio plays. Also fires on seeking.
- `error` – Occurs on error. Callback will receive (string) error message.
- `finish` – When it finishes playing.
- `loading` – Fires continuously when loading via XHR or drag'n'drop. Callback will receive (integer) loading progress in percents [0..100] and (object) event target.
- `mouseup` – When a mouse button goes up. Callback will receive `MouseEvent` object.
- `pause` – When audio is paused.
- `play` – When play starts.
- `ready` – When audio is loaded, decoded and the waveform drawn.
- `scroll` – When the scrollbar is moved. Callback will receive a `ScrollEvent` object.
- `seek` – On seeking. Callback will receive (float) progress [0..1].
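A sketch of subscribing to a few of these events:

```javascript
wavesurfer.on('loading', function (percent) {
    console.log('Loading: ' + percent + '%');
});

wavesurfer.on('error', function (message) {
    console.error('wavesurfer error: ' + message);
});

wavesurfer.on('finish', function () {
    console.log('Playback finished');
});
```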
Region events (exposed by the Regions plugin):
- `region-in` – When playback enters a region. Callback will receive the `Region` object.
- `region-out` – When playback leaves a region. Callback will receive the `Region` object.
- `region-mouseenter` – When the mouse moves over a region. Callback will receive the `Region` object and a `MouseEvent` object.
- `region-mouseleave` – When the mouse leaves a region. Callback will receive the `Region` object and a `MouseEvent` object.
- `region-click` – When the mouse clicks on a region. Callback will receive the `Region` object and a `MouseEvent` object.
- `region-dblclick` – When the mouse double-clicks on a region. Callback will receive the `Region` object and a `MouseEvent` object.
- `region-created` – When a region is created. Callback will receive the `Region` object.
- `region-updated` – When a region is updated. Callback will receive the `Region` object.
- `region-update-end` – When dragging or resizing is finished. Callback will receive the `Region` object.
- `region-removed` – When a region is removed. Callback will receive the `Region` object.
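A sketch of handling a couple of these events (requires the Regions plugin described below):

```javascript
wavesurfer.on('region-click', function (region, e) {
    e.stopPropagation();   // keep the click from also seeking the waveform
    region.play();         // play just this region
});

wavesurfer.on('region-update-end', function (region) {
    console.log('Region now spans', region.start, '-', region.end, 'seconds');
});
```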
Regions are visual overlays on the waveform that can be used to play and loop portions of audio. Regions can be dragged and resized.

Visual customization is possible via CSS (using the selectors `.wavesurfer-region` and `.wavesurfer-handle`).

To enable the plugin, add the script `plugin/wavesurfer.regions.js` to your page.

After doing that, use `wavesurfer.addRegion()` to create `Region` objects.
- `addRegion(options)` – Creates a region on the waveform. Returns a `Region` object. See Region Options, Region Methods and Region Events below.
- `clearRegions()` – Removes all regions.
- `enableDragSelection(options)` – Lets you create regions by selecting areas of the waveform with the mouse. `options` are Region objects' params (see below).
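A quick sketch of these plugin methods in use (the color value is only an illustration):

```javascript
// Let users create regions by dragging on the waveform;
// the options are the same Region params described below.
wavesurfer.enableDragSelection({
    color: 'rgba(0, 100, 200, 0.2)'
});

// Later, remove every region at once.
wavesurfer.clearRegions();
```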
| option | type | default | description |
|---|---|---|---|
| `id` | string | random | The id of the region. |
| `start` | float | 0 | The start position of the region (in seconds). |
| `end` | float | 0 | The end position of the region (in seconds). |
| `loop` | boolean | false | Whether to loop the region when played back. |
| `drag` | boolean | true | Allow/disallow dragging the region. |
| `resize` | boolean | true | Allow/disallow resizing the region. |
| `color` | string | "rgba(0, 0, 0, 0.1)" | HTML color code. |
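A sketch of `addRegion()` using the options above (the values are illustrative):

```javascript
// Create a looping region covering seconds 5–15 of the audio.
var region = wavesurfer.addRegion({
    id: 'intro',
    start: 5,
    end: 15,
    loop: true,
    drag: true,
    resize: true,
    color: 'rgba(0, 0, 0, 0.1)'
});
```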
- `remove()` – Remove the region object.
- `update(options)` – Modify the settings of the region.
- `play()` – Play the audio region from the start to end position.
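A sketch of these methods on an existing `Region` object (continuing the `region` variable from the example above):

```javascript
// Extend the region and stop it from looping.
region.update({
    end: 20,
    loop: false
});

// Play only this region, then get rid of it.
region.play();
region.remove();
```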
General events:
- `in` – When playback enters the region.
- `out` – When playback leaves the region.
- `remove` – Happens just before the region is removed.
- `update` – When the region's options are updated.
Mouse events:
- `click` – When the mouse clicks on the region. Callback will receive a `MouseEvent`.
- `dblclick` – When the mouse double-clicks on the region. Callback will receive a `MouseEvent`.
- `over` – When the mouse moves over the region. Callback will receive a `MouseEvent`.
- `leave` – When the mouse leaves the region. Callback will receive a `MouseEvent`.
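A sketch of subscribing to a couple of these, assuming the `Region` object exposes the same `on()` subscription method as the main wavesurfer instance:

```javascript
region.on('in', function () {
    console.log('Playhead entered the region');
});

region.on('click', function (e) {
    console.log('Region clicked at', e.clientX, e.clientY);
});
```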
Install grunt-cli using npm:
```
npm install -g grunt-cli
```
Install development dependencies:
```
npm install
```
Build a minified version of the library and plugins. This command also checks for code-style mistakes and runs the tests:
```
grunt
```
Generated files are placed in the dist directory.
Running tests only:
```
grunt test
```
Creating a coverage report:
```
grunt coverage
```
The HTML report can be found in coverage/html/index.html.
Initial idea by Alex Khokhulin. Many thanks to the awesome contributors!
This work is licensed under a Creative Commons Attribution 3.0 Unported License.

