    curtains.js

    curtains.js is a lightweight vanilla WebGL javascript library that turns HTML elements into interactive textured planes.

    What is it?

    Shaders are the next big thing in front-end web development, with the ability to create very powerful 3D interactions and animations. A lot of very good javascript libraries already handle WebGL, but with most of them it's kind of a headache to position your meshes relative to the DOM elements of your web page.

    curtains.js was created with just that issue in mind. It is a small vanilla WebGL javascript library that converts HTML elements containing images and videos into 3D WebGL textured planes, allowing you to animate them via shaders.
    You can define each plane's size and position via CSS, which makes it super easy to add responsive WebGL planes all over your pages.

    Knowledge and technical requirements

    It is easy to use, but you will of course need a good grasp of HTML, CSS and javascript basics.

    If you've never heard of shaders, you may want to learn a bit more about them on The Book of Shaders, for example. You will need to understand what vertex and fragment shaders are, how uniforms are used, and the basics of the GLSL syntax.

    Installation

    In a browser:
    
    <script src="curtains.min.js"></script>
    
        
    Using npm:
    
    npm i curtainsjs
    
        
    Load ES module:
    
    import {Curtains} from 'curtainsjs';
    
        

    Examples

    Images

    Vertex coordinates helper
    Simple plane
    Multiple textures with a displacement shader
    Multiple planes
    Asynchronous textures loading
    AJAX navigation
    AJAX navigation with plane removal
    Canvas size, performance and perspective

    Video

    Simple video plane
    Multiple video textures with a displacement shader

    Canvas

    Simple canvas plane

    Basic setup example

    See it live

    HTML

    The HTML setup is pretty easy. Just create a div that will hold your canvas and a div that will hold your image.

    
    <body>
        <!-- div that will hold our WebGL canvas -->
        <div id="canvas"></div>
        <!-- div used to create our plane -->
        <div class="plane">
            <!-- image that will be used as texture by our plane -->
            <img src="path/to/my-image.jpg" />
        </div>
    </body>
    
        

    CSS

    The CSS is also very easy. Make sure the div that will wrap the canvas fits the document, and apply any size you want to your plane div element.

    
    body {
        /* make the body fit our viewport */
        position: relative;
        width: 100%;
        height: 100vh;
        margin: 0;
        overflow: hidden;
    }
    
    #canvas {
        /* make the canvas wrapper fit the document */
        position: absolute;
        top: 0;
        right: 0;
        bottom: 0;
        left: 0;
    }
    
    .plane {
        /* define the size of your plane */
        width: 80%;
        height: 80vh;
        margin: 10vh auto;
    }
    
    .plane img {
        /* hide the img element */
        display: none;
    }
    
        

    Javascript

    There's a bit more work in the javascript part: we need to instantiate our WebGL context, create a plane with basic uniform parameters and use it.

    
    window.onload = function() {
        // get our canvas wrapper
        var canvasContainer = document.getElementById("canvas");
        // set up our WebGL context and append the canvas to our wrapper
        var webGLCurtain = new Curtains("canvas");
        // get our plane element
        var planeElement = document.getElementsByClassName("plane")[0];
        // set our initial parameters (basic uniforms)
        var params = {
            vertexShaderID: "plane-vs", // our vertex shader ID
            fragmentShaderID: "plane-fs", // our framgent shader ID
            uniforms: {
                time: {
                    name: "uTime", // uniform name that will be passed to our shaders
                    type: "1f", // this means our uniform is a float
                    value: 0,
                },
            }
        }
        // create our plane mesh
        var plane = webGLCurtain.addPlane(planeElement, params);
        // use the onRender method of our plane fired at each requestAnimationFrame call
        plane.onRender(function() {
            plane.uniforms.time.value++; // update our time uniform value
        });
    }
    
        

    Shaders

    Here are some basic vertex and fragment shaders. Just put them inside your body tag, right before you include the library.

    
    <!-- vertex shader -->
    <script id="plane-vs" type="x-shader/x-vertex">
        #ifdef GL_ES
        precision mediump float;
        #endif
        // those are the mandatory attributes that the lib sets
        attribute vec3 aVertexPosition;
        attribute vec2 aTextureCoord;
        // those are mandatory uniforms that the lib sets and that contain our model view and projection matrix
        uniform mat4 uMVMatrix;
        uniform mat4 uPMatrix;
        // if you want to pass your vertex and texture coords to the fragment shader
        varying vec3 vVertexPosition;
        varying vec2 vTextureCoord;
        void main() {
            vec3 vertexPosition = aVertexPosition;
            gl_Position = uPMatrix * uMVMatrix * vec4(vertexPosition, 1.0);
            // set the varyings
            vTextureCoord = aTextureCoord;
            vVertexPosition = vertexPosition;
        }
    </script>
    <!-- fragment shader -->
    <script id="plane-fs" type="x-shader/x-fragment">
        #ifdef GL_ES
        precision mediump float;
        #endif
        // get our varyings
        varying vec3 vVertexPosition;
        varying vec2 vTextureCoord;
        // the uniform we declared inside our javascript
        uniform float uTime;
        // our texture sampler (default name, to use a different name please refer to the documentation)
        uniform sampler2D uSampler0;
        void main() {
            vec2 textureCoord = vec2(vTextureCoord.x, vTextureCoord.y);
            // displace our pixels along the X axis based on our time uniform
            // textures coords are ranging from 0.0 to 1.0 on both axis
            textureCoord.x += sin(textureCoord.y * 25.0) * cos(textureCoord.x * 25.0) * (cos(uTime / 50.0)) / 25.0;
            gl_FragColor = texture2D(uSampler0, textureCoord);
        }
    </script>
    
        

    Et voilà !

    Images uniform sampler names

    Let's say you want to build a slideshow with 3 images and a displacement image to create a nice transition effect.
    By default, the texture sampler uniforms will be named after their indexes inside your plane element. If you have something like this:

    
    <!-- div used to create our plane -->
    <div class="plane">
        <!-- images that will be used as textures by our plane -->
        <img src="path/to/displacement.jpg" />
        <img src="path/to/my-image-1.jpg" />
        <img src="path/to/my-image-2.jpg" />
        <img src="path/to/my-image-3.jpg" />
    </div>
    
    

    Then, in your shaders, your texture samplers would have to be declared this way:

    
    uniform sampler2D uSampler0; // bound to displacement.jpg
    uniform sampler2D uSampler1; // bound to my-image-1.jpg
    uniform sampler2D uSampler2; // bound to my-image-2.jpg
    uniform sampler2D uSampler3; // bound to my-image-3.jpg
    
    

    This is handy, but it can also get confusing.
    By using a data-sampler attribute on the <img /> tag, you can specify a custom uniform sampler name to use in your shader. With the example above, this would become:

    
    <!-- div used to create our plane -->
    <div class="plane">
        <!-- images that will be used as textures by our plane -->
        <img src="path/to/displacement.jpg" data-sampler="uDisplacement" />
        <img src="path/to/my-image-1.jpg" data-sampler="uSlide1" />
        <img src="path/to/my-image-2.jpg" data-sampler="uSlide2" />
        <img src="path/to/my-image-3.jpg" data-sampler="uLastSlide" />
    </div>
    
    

    
    uniform sampler2D uDisplacement; // bound to displacement.jpg
    uniform sampler2D uSlide1;       // bound to my-image-1.jpg
    uniform sampler2D uSlide2;       // bound to my-image-2.jpg
    uniform sampler2D uLastSlide;    // bound to my-image-3.jpg
    
    

    Using videos as textures

    Yes, videos as textures are supported! However, there are a few downsides you need to know about.
    First, a video will always fit the plane: your plane's aspect ratio has to match your video's so it won't appear distorted (you can handle that with CSS).
    Second, most mobile devices won't autoplay videos without a user gesture. Unless you don't care about mobile users, you will have to start video playback after a user interaction such as a click event.
    Besides that, videos are really easy to use (and can be mixed with images as well). Let's see how to handle them:

    HTML

    
    <!-- div used to create our plane -->
    <div class="plane">
        <!-- video that will be used as a texture by our plane -->
        <video src="path/to/my-video.mp4"></video>
    </div>
    
    

    Like with images, you can use a data-sampler attribute to set a uniform sampler name. You can use one or more videos, or mix them with images if you want:

    
    <!-- div used to create our plane -->
    <div class="plane">
        <!-- elements that will be used as textures by our plane -->
        <img src="path/to/displacement.jpg" data-sampler="displacement" />
        <video src="path/to/my-video-1.mp4" data-sampler="firstVideo"></video>
        <video src="path/to/my-video-2.mp4" data-sampler="secondVideo"></video>
    </div>
    
    

    Javascript

    There's only one change in our javascript: we need to tell our plane when to start playing the videos. We use the playVideos() method inside an event listener added in our onReady() callback:

    
    window.onload = function() {
        // get our canvas wrapper
        var canvasContainer = document.getElementById("canvas");
        // set up our WebGL context and append the canvas to our wrapper
        var webGLCurtain = new Curtains("canvas");
        // get our plane element
        var planeElement = document.getElementsByClassName("plane")[0];
        // set our initial parameters (basic uniforms)
        var params = {
            vertexShaderID: "plane-vs", // our vertex shader ID
            fragmentShaderID: "plane-fs", // our framgent shader ID
            uniforms: {
                time: {
                    name: "uTime", // uniform name that will be passed to our shaders
                    type: "1f", // this means our uniform is a float
                    value: 0,
                },
            }
        }
        // create our plane mesh
        var plane = webGLCurtain.addPlane(planeElement, params);
        plane.onReady(function() {
            // set an event listener to start our playback
            document.getElementById("start-playing").addEventListener("click", function() {
                plane.playVideos();
            });
        }).onRender(function() {
            // use the onRender method of our plane fired at each requestAnimationFrame call
            plane.uniforms.time.value++; // update our time uniform value
        });
    }
    
    

    And that's it. Check the video examples (and source codes) if you want to see what's possible.

    Using canvas as texture

    Last but not least, you can use a canvas as a texture. It is once again really easy to use. You just have to insert a canvas tag inside your HTML, or alternatively create it in your javascript and load it using the loadCanvases() method.

    HTML

    
    <!-- div used to create our plane -->
    <div class="plane">
    
        <!-- canvas that will be used as a texture by our plane -->
        <canvas id="canvas-texture"></canvas>
    
    </div>
    
    

    You can use multiple canvases and data-sampler attributes as well, like you'd do with images or videos.

    Javascript

    The javascript code remains almost the same. We just set the size of our canvas, get its context and draw a simple rotating red rectangle inside our animation loop.

    
    window.onload = function() {
        // get our canvas wrapper
        var canvasContainer = document.getElementById("canvas");
    
        // set up our WebGL context and append the canvas to our wrapper
        var webGLCurtain = new Curtains("canvas");
    
        // get our plane element
        var planeElement = document.getElementsByClassName("plane")[0];
    
        // set our initial parameters (basic uniforms)
        var params = {
            vertexShaderID: "plane-vs", // our vertex shader ID
            fragmentShaderID: "plane-fs", // our framgent shader ID
            uniforms: {
                time: {
                    name: "uTime", // uniform name that will be passed to our shaders
                    type: "1f", // this means our uniform is a float
                    value: 0,
                },
            }
        }
    
        // create our plane mesh
        var plane = webGLCurtain.addPlane(planeElement, params);
    
        // our texture canvas
        var textureCanvas = document.getElementById("canvas-texture");
        var textureCanvasContext = textureCanvas.getContext("2d");
    
        // set the size of our canvas
        textureCanvas.width = planeElement.clientWidth;
        textureCanvas.height = planeElement.clientHeight;
    
        // use the onRender method of our plane fired at each requestAnimationFrame call
        plane.onRender(function() {
            plane.uniforms.time.value++; // update our time uniform value
    
            // here we will handle our canvas texture animation
            // clear scene
            textureCanvasContext.clearRect(0, 0, textureCanvas.width, textureCanvas.height);
    
            // continuously rotate the canvas
            textureCanvasContext.translate(textureCanvas.width / 2, textureCanvas.height / 2);
            textureCanvasContext.rotate(Math.PI / 360);
            textureCanvasContext.translate(-textureCanvas.width / 2, -textureCanvas.height / 2);
    
            // draw a red rectangle
            textureCanvasContext.fillStyle = "#ff0000";
            textureCanvasContext.fillRect(textureCanvas.width / 2 - textureCanvas.width / 8, textureCanvas.height / 2 - textureCanvas.height / 8, textureCanvas.width / 4, textureCanvas.height / 4);
        });
    
    }
    
    

    Documentation

    Curtains object

    Instantiate

    You will first have to create a Curtains object that will handle the scene containing all your planes. It will also create the WebGL context, append the canvas and handle the requestAnimationFrame loop. You just have to pass the ID of the HTML element that will wrap the canvas:

    
    var curtains = new Curtains("canvas"); // "canvas" is the ID of our HTML element
    
    

    You can pass a boolean as a second parameter to indicate whether you want to use the library in production mode. If set to true, it will remove all console warnings. Defaults to false.

    
    var curtains = new Curtains("canvas", true); // use the library in "production" mode
    
    

    Methods

    • addPlane(planeElement, params):
      planeElement (HTML element): an HTML element
      params (object): an object containing the plane parameters (see the Plane object).

      This function will add a plane to our Curtains wrapper (a short usage sketch follows this list).

    • removePlane(plane):
      plane (plane object): the plane to remove

      This function will remove a plane from our Curtains object.

    • dispose():

      This function will cancel the requestAnimationFrame loop, remove all planes and delete the WebGL context.
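
    A minimal usage sketch of these methods, reusing the ".plane" element and the "plane-vs" / "plane-fs" shader scripts from the basic setup example:

    var curtains = new Curtains("canvas");
    var planeElement = document.getElementsByClassName("plane")[0];

    // add a plane to the scene
    var plane = curtains.addPlane(planeElement, {
        vertexShaderID: "plane-vs",
        fragmentShaderID: "plane-fs",
    });

    // later on, remove that plane from the scene...
    curtains.removePlane(plane);

    // ...or tear the whole thing down: cancel the requestAnimationFrame loop,
    // remove every plane and delete the WebGL context
    curtains.dispose();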

    Plane object

    Those are the planes we will be manipulating. They are instantiated internally each time you call the addPlane method on the parent Curtains object.

    Properties

    • vertexShader (string): your vertex shader as a string. Be careful with the line breaks as they may throw javascript errors. Will look for the vertexShaderID param if not specified.
    • vertexShaderID (string): the vertex shader ID. If omitted, will look for a data-vs-id data attribute on the plane HTML element. Will use a default vertex shader and throw a warning if nothing is specified.
    • fragmentShader (string): your fragment shader as a string. Be careful with the line breaks as they may throw javascript errors. Will look for the fragmentShaderID param if not specified.
    • fragmentShaderID (string): the fragment shader ID. If omitted, will look for a data-fs-id data attribute on the plane HTML element. Will use a default fragment shader that draws only black pixels and throw a warning if nothing is specified.
    • widthSegments (integer, optional): plane definition along the X axis (1 by default).
    • heightSegments (integer, optional): plane definition along the Y axis (1 by default).
    • mimicCSS (bool, optional): defines if the plane should copy its HTML element position (true by default).
    • imageCover (bool, optional): defines if the images must imitate CSS background-size: cover or just fit the plane (true by default).
    • crossOrigin (string, optional): defines the cross-origin policy used to load images, if any.
    • fov (integer, optional): defines the perspective field of view (defaults to 75).
    • uniforms (object, optional): the uniforms that will be passed to the shaders (if no uniforms are specified there won't be any interaction with the plane). Each uniform should have three properties: a name (string), a type (string, e.g. "1f" for a float) and a value.

    Parameters basic example

    
    var params = {
        vertexShaderID: "plane-vs", // our vertex shader ID
        fragmentShaderID: "plane-fs", // our framgent shader ID
        uniforms: {
            time: {
                name: "uTime", // uniform name that will be passed to our shaders
                type: "1f", // this means our uniform is a float
                value: 0,
            },
        }
    }
    
    

    Methods

    • loadImages(imgElements):
      imgElements (HTML image elements): a collection of HTML image elements to load into your plane.

      This function is automatically called internally when a new Plane is instantiated, but you can use it if you want to create an empty plane and then assign it some textures later. See the asynchronous textures loading example.

    • loadVideos(videoElements):
      videoElements (HTML video elements): a collection of HTML video elements to load into your plane.

      This function is automatically called internally when a new Plane is instantiated. It works exactly the same as the loadImages() method.

    • loadCanvases(canvasElements):
      canvasElements (HTML canvas elements): a collection of HTML canvas elements to load into your plane.

      This function is automatically called internally when a new Plane is instantiated. It works exactly the same as the loadImages() method.

    • onLoading():

      This function will be fired each time an image of the plane has been loaded. Useful to handle a loader.

    • onReady():

      This function will be called once our plane is all set up and ready to be drawn. This is where you may want to add event listeners to interact with it or update its uniforms.

    • onRender():

      This function will be triggered at each requestAnimationFrame call. Useful to update a time uniform, change the plane rotation or scale, etc.

    • playVideos():

      This function will automatically start playback of all of your plane's videos. If you are not calling it after a user action, it might not work on mobile.

    • planeResize():

      This method is called internally each time the WebGL canvas is resized, but if you remove the plane HTML element and append it again later (typically with an AJAX navigation, see the AJAX navigation example), you will have to manually reset the plane size by calling it.

    • updatePosition():

      The plane positions are updated only when the canvas container is resized. But if you are updating your plane HTML element position without resizing the container (typically animating its CSS position or transform values), call this method in your animation loop at the same time.
      Only effective if the mimicCSS property is set to true.

    • setPerspective(fieldOfView, nearPlane, farPlane):
      fieldOfView (integer): the perspective field of view. Should be greater than 0 and lower than 180. Defaults to 75.
      nearPlane (float, optional): closest point where a mesh vertex is displayed. Defaults to 0.1.
      farPlane (float, optional): farthest point where a mesh vertex is displayed. Defaults to 150 (two times the default field of view).

      Reset the perspective. The smaller the field of view, the more perspective.

    • setScale(scaleX, scaleY):
      scaleX (float): the scale to set along the X axis.
      scaleY (float): the scale to set along the Y axis.

      Set the plane's new scale.

    • setRotation(angleX, angleY, angleZ):
      angleX (float): the angle in radians to rotate around the X axis.
      angleY (float): the angle in radians to rotate around the Y axis.
      angleZ (float): the angle in radians to rotate around the Z axis.

      Set the plane rotation.

    • setRelativePosition(translationX, translationY):
      translationX (float): the translation value to apply along the X axis, in pixels.
      translationY (float): the translation value to apply along the Y axis, in pixels.

      Set the plane translation based on pixel units.

    • mouseToPlaneCoords(xMousePosition, yMousePosition):
      xMousePosition (float): mouse event clientX value.
      yMousePosition (float): mouse event clientY value.

      Get the mouse coordinates relative to the plane's clip space values. Use it to send the position to a uniform and interact with your plane (see the sketch after this list). Plane coordinates range from (-1, 1) in the top left corner to (1, -1) in the bottom right corner, which means the values along the Y axis are inverted.

    • enableDepthTest(shouldEnableDepthTest):
      shouldEnableDepthTest (bool): enable or disable the depth test for that plane.

      Switches the depth test on/off for that plane. You might want to disable the depth test if you have transparency issues.

    • moveToFront():

      Lets the plane overlay all other planes. Be careful: this silently disables the depth test for that plane, so you might want to switch it back on later.
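
    A hedged sketch combining a few of these methods on an existing plane (it assumes the object returned by mouseToPlaneCoords exposes x and y properties; adjust to whatever your version of the library actually returns):

    var mouse = { x: 0, y: 0 };

    plane.onReady(function() {
        // the plane is ready: convert mouse moves to plane clip space coordinates
        document.addEventListener("mousemove", function(e) {
            var planeCoords = plane.mouseToPlaneCoords(e.clientX, e.clientY);
            mouse.x = planeCoords.x; // -1 (left) to 1 (right)
            mouse.y = planeCoords.y; // 1 (top) to -1 (bottom)
        });
    }).onRender(function() {
        plane.uniforms.time.value++;
        // tilt the plane slightly towards the mouse position (angles are in radians)
        plane.setRotation(mouse.y / 10, mouse.x / 10, 0);
    });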

    Canvas height, perspective and performance

    The canvas size will directly impact the global perspective and performance.

    The perspective is calculated based on each plane position inside the canvas as well as the canvas width and height. The center of the field of view will be located at the center of your canvas, which means that with a tall or wide canvas you might end up with exaggerated perspectives.
    Also keep in mind that a wide or tall canvas will noticeably impact performance (thanks to Colin Peyrat for pointing that out).

    There's a way to avoid these annoyances: set your canvas to fit the window size instead of the document size, and set the position of each plane based on the scroll inside your render loop. Check the canvas size, performance and perspective example if you want to see the differences.
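
    A rough sketch of the idea, assuming the #canvas wrapper has been switched to position: fixed so it only covers the viewport; calling updatePosition() on every frame to re-sync the plane with its scrolled HTML element is an assumption made here, and the linked example remains the reference implementation:

    // #canvas is now position: fixed and sized to the window, so the page scrolls underneath it
    plane.onRender(function() {
        // keep the plane in sync with its HTML element as the page scrolls
        plane.updatePosition(); // only effective if mimicCSS is set to true
    });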

    Other performance tips

    • Be careful with each plane's definition. A lot of vertices has a big impact on performance. If you plan to use more than one plane, try to reduce the number of vertices.
    • Large images have a bigger impact on performance. Try to scale your images so they fit your plane's maximum size. The same goes for videos, of course: try to keep them as light as possible.
    • Try to use as little javascript as possible in the planes' onRender() methods, as this code gets executed at each draw call. Try not to use too many uniforms either, as they are updated at every draw call as well.
    • If you use multiple planes with multiple textures, you should set the dimensions of your plane to fit the aspect ratio of your images in CSS (you could use the padding-bottom hack, see the multiple planes example HTML & CSS and the sketch after this list) and set the imageCover plane property to false when adding it.
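
    A minimal sketch of the padding-bottom hack mentioned above, assuming a 16:9 image (adjust the percentages to your own layout and image ratio):

    .plane {
        width: 80%;
        height: 0;
        /* padding-bottom percentages are relative to the parent's width,
           so 80% * 9 / 16 = 45% gives this 80%-wide plane a 16:9 ratio */
        padding-bottom: 45%;
    }

    Combined with the imageCover property set to false, the texture then simply fills a plane that already matches its aspect ratio.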

    Changelog

    Version 1.6

    • Added an updatePosition() method to the Plane object.
    • Added a "production" parameter to the Curtains instanciation process to remove console warnings in production.
    • Improved video handling and removed warnings.

    Version 1.5

    • Added a removePlane() method to the Curtains object.
    • Slightly improved video textures performance.

    Version 1.4

    • Added support for canvases as textures.
    • Improved pixel ratio handling.

    Version 1.3

    • Added the possibility to set/unset the depth test for each plane via an enableDepthTest() method. This might be useful to handle transparency problems.
    • Added a plane moveToFront() method so that a plane could overlay all other planes.

    Version 1.2

    • Added support for videos as textures.
    • Sort planes by their vertices length in order to avoid redundant buffer binding calls during draw loop.
    • Refactored and cleaned code.

    Version 1.1

    • WebGL context viewport size now based on drawingBufferWidth and drawingBufferHeight.
    • Cleaned and refactored code in order to add support for lost and restored context events.

    About

    This library is released under the MIT license, which means it is free to use for personal and commercial projects.

    All images used in the examples were taken by Marion Bornaz during the Mirage Festival.

    All examples video footages were shot by Analogue Production.

    Many thanks to webglfundamentals.org tutorials which helped me a lot.

    The author of this library is Martin Laxenaire, a French creative front-end developer based in Lyon.
    Found a bug? Have questions? Do not hesitate to email me or send me a tweet: @webdesign_ml.

    Showcase

    Here's a list of websites that use curtains.js with their own custom shaders:

    Nördik Impakt 2018
    Olaian Lookbook
