FAQ
Programming
What is the recommended way to initialize WebGL?
It is recommended that you check whether WebGL initializes successfully. If it fails, distinguish between failure because the browser doesn't support WebGL and failure for some other reason. If the browser does not support WebGL, present the user with a link to "http://get.webgl.org". If WebGL failed for some other reason, present the user with a link to "http://get.webgl.org/troubleshooting/".
You can determine if the browser supports WebGL by checking for the existence of WebGLRenderingContext.
if (window.WebGLRenderingContext) {
  // browser supports WebGL
}
If the browser supports WebGL and canvas.getContext("webgl") returns null, then WebGL failed for some reason other than lack of browser support (no GPU, out of memory, etc.).
if (!window.WebGLRenderingContext) {
  // the browser doesn't even know what WebGL is
  window.location = "http://get.webgl.org";
} else {
  var canvas = document.getElementById("myCanvas");
  var ctx = canvas.getContext("webgl");
  if (!ctx) {
    // browser supports WebGL but initialization failed.
    window.location = "http://get.webgl.org/troubleshooting";
  }
}
Note: You MUST check that the browser supports WebGL to know what getting null from canvas.getContext() means. If the browser has no WebGLRenderingContext, null means the browser lacks support; otherwise it means initialization failed.
There is a wrapper that will do all of this for you here.
Example using the wrapper
<html>
<body>
<script type="text/javascript" src="localpath/webgl-utils.js"></script>
<script>
function init() {
  canvas = document.getElementById("c");
  gl = WebGLUtils.setupWebGL(canvas);
  if (!gl) {
    return;
  }
  ...
}
window.onload = init;
</script>
<canvas id="c"></canvas>
</body>
</html>
What is the recommended way to implement a rendering loop?
It is HIGHLY recommended you use requestAnimationFrame if it is available. There is a small cross-browser implementation available here.
Example using the wrapper.
<html>
<body>
<script type="text/javascript" src="localpath/webgl-utils.js"></script>
<script>
window.onload = init;

function init() {
  canvas = document.getElementById("c");
  ...
  // render the first frame.
  render();
}

function render() {
  // request render to be called for the next frame.
  window.requestAnimFrame(render, canvas);
  ...
  // render scene
  ...
}
</script>
<canvas id="c"></canvas>
</body>
</html>
What is the recommended way to handle lost context?
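A minimal sketch of one common approach, not a full answer: listen for the webglcontextlost and webglcontextrestored events on the canvas, stop rendering when the context is lost, and recreate all GL resources when it is restored. The requestId variable and the initResources() function below are assumptions standing in for your own render loop and re-initialization code.

var canvas = document.getElementById("c");
var requestId;  // id returned by your requestAnimationFrame call (assumed)

canvas.addEventListener("webglcontextlost", function(event) {
  // Calling preventDefault signals that we intend to handle restoration.
  event.preventDefault();
  // Stop the render loop; every GL resource is now invalid.
  window.cancelAnimationFrame(requestId);
}, false);

canvas.addEventListener("webglcontextrestored", function(event) {
  // Recreate all textures, buffers, shaders and programs, then resume rendering.
  initResources();  // hypothetical function that rebuilds GL state
  render();
}, false);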
How do I make my content render in high resolution on HiDPI machines?
See Handling High DPI displays in WebGL.
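A minimal sketch of the usual approach (assumed here, not taken from the linked article): size the canvas's drawing buffer by window.devicePixelRatio while keeping its CSS size fixed, then set the viewport to the drawing buffer size.

// Assumes the canvas's display size is set with CSS (e.g. width: 400px; height: 300px;).
var canvas = document.getElementById("c");
var dpr = window.devicePixelRatio || 1;

// Make the drawing buffer match the number of device pixels the canvas covers.
canvas.width  = Math.round(canvas.clientWidth  * dpr);
canvas.height = Math.round(canvas.clientHeight * dpr);

// Render to the full drawing buffer.
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);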
Why don't textures work in my vertex shader on browser/machine/os XXX?
WebGL conforms to the OpenGL ES 2.0 spec in this regard. Implementations advertise how many textures can be accessed in a vertex shader with
var numTexturesAvailableInVertexShader = gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS)
That value is allowed to be 0, in which case your code must take appropriate action.
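For example, a sketch of one way to take that action; the fallback function is hypothetical.

var numTexturesAvailableInVertexShader =
    gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS);

if (numTexturesAvailableInVertexShader < 1) {
  // No vertex texture fetch on this implementation; switch to a shader
  // variant that does not sample textures in the vertex stage.
  useNonVertexTextureShaders();  // hypothetical fallback
}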
Why don't my textures render?
WebGL conforms to the OpenGL ES 2.0 spec in this regard. Textures that are non-power-of-two (NPOT) do not render unless texture filtering is set so it does not require mips and texture wrapping is set to CLAMP_TO_EDGE. The default parameters of a texture require mips and are set to wrap so you must set the texture parameters appropriately.
gl.bindTexture(gl.TEXTURE_2D, someNPOTTexture);
// Turn off the need for mips.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// Set wrapping to CLAMP_TO_EDGE
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
Why do I get an error with gl.framebufferTexture2D on certain machines/browsers?
There is a bug in many NVidia drivers that requires that textures passed to framebufferTexture2D be complete. In other words, they must either have all required mip levels or filtering must be set so that no more than the base mip level is required. Since the texture is not complete without these settings, set the texture parameters before calling framebufferTexture2D.
var fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
// Turn off the need for mips.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
// Set wrapping to CLAMP_TO_EDGE
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
Why do I get poor performance with specific vertex attribute types on some GPUs?
There's an obscure slow-down on some ATI/AMD configs when using single-byte vertex attributes (e.g. passing size=1 and type=UNSIGNED_BYTE or BYTE to vertexAttribPointer). A simple workaround is to use size=4, or use type=FLOAT so that the vertex attribute size is a multiple of 4 bytes. This doesn't seem to be a WebGL problem, but a problem in the underlying GPU driver.
Configurations where this problem has been reproduced:
- 2011 MacBook Pro OSX 10.7.5 with AMD Radeon HD 6750M or HD 6770M
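As an illustration of the workaround described above, a hedged sketch that pads a one-byte attribute out to four bytes; the buffer (attribBuffer) and attribute location (attribLocation) are assumed to come from your own setup code.

// Instead of size=1 / BYTE, upload the data padded to 4 bytes per vertex
// and declare the attribute as size=4 so each attribute is 4-byte aligned.
gl.bindBuffer(gl.ARRAY_BUFFER, attribBuffer);      // assumed buffer
gl.enableVertexAttribArray(attribLocation);        // assumed location
gl.vertexAttribPointer(attribLocation, 4, gl.UNSIGNED_BYTE, false, 4, 0);

// Alternatively, convert the data to floats and use type=FLOAT:
// gl.vertexAttribPointer(attribLocation, 1, gl.FLOAT, false, 4, 0);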