Category Archives: WebGL

FX Digital – WebGL

Presentation https://webglworkshop.com/fxdigital-webgl/

CodeSandbox.io – Preview

Triangle
Vertex Shader
Animation
Image Processing
Transition

Hands on

Skellington

HTML shell

Colour and square

Tasks
Answers

Fragment Shader

Tasks
Answers

Functions

Tasks
Answers
Answers using step

Animation

Tasks
Answers

Image Processing

Tasks
Answers

Transitions

Tasks
Answers

Extra

More Fragment Shader plus Shapes
More Image Processing
More Transitions

Introduction to WebGL – Algorithmic Art

16th October 2017

Presentation
On-line presentation

Worksheet
Worksheet as pdf

Demos

Hello, world!
The first thing you write in a new language… done properly!


Have a heart… for all your heart shaped geometry needs
Heart Geometry for Three.js
Demo showing the various options (here).
The geometry is the surface derived from the formula:
(2x^2 + y^2 + z^2 - 1)^3 - 0.1x^2z^3 - y^2z^3 = 0
(Based on the C implementation by Mateusz Malczak)
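
In code the implicit function is tiny; points where it evaluates to zero lie on the surface. A minimal sketch, transcribed directly from the formula above:

var heart = function(x, y, z) {
  var a = 2*x*x + y*y + z*z - 1;
  // negative inside the heart, positive outside, zero on the surface
  return a*a*a - 0.1*x*x*z*z*z - y*y*z*z*z;
};
// a marching-cubes style polygoniser turns this into a Three.js geometry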

Heart Geometry for Three.js
Simple example (here).

Hmm, crunchy!
Making an apple algorithmically with Three.js’s ParametricBufferGeometry.
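
The gist of ParametricBufferGeometry is a callback mapping (u, v) in [0, 1]² to a point in space. A minimal sketch (an illustrative blob, not the actual apple formula; note that newer Three.js releases pass a target vector to write into rather than expecting a returned one):

var appleish = function(u, v) {
  var theta = u * 2 * Math.PI;   // around the core
  var phi = v * Math.PI;         // pole to pole
  // start from a sphere, squash it and pinch the poles
  var r = 1 - 0.2 * Math.pow(Math.cos(phi), 6);
  return new THREE.Vector3(
    r * Math.sin(phi) * Math.cos(theta),
    0.8 * r * Math.cos(phi),
    r * Math.sin(phi) * Math.sin(theta)
  );
};

var geometry = new THREE.ParametricBufferGeometry(appleish, 64, 64);
scene.add(new THREE.Mesh(geometry, new THREE.MeshNormalMaterial()));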


Happy Hallowe’en!
Skully


Swole!
Transformation to a sphere.


Psychedelic!
Psychedelic Skully


Browser based AR and VR

OK, so “WebAR and WebVR” may have been a snappier title, but it would also have been technically incorrect. “WebVR” is an actual standard still under development, and “WebAR” doesn’t exist as such (oh look, I spoke too soon! It looks like it may be a pending standard after all).

The terms VR¹ and AR², along with MR³, RR⁴, ER⁵, XR⁶ and god knows what else can be used to mean a surprising variety of things (check Wikipedia for more info).
¹ Virtual Reality
² Augmented Reality
³ Mixed Reality
⁴ Real Reality (WTF?)
⁵ Extended Reality
⁶ X Reality (I don’t think they’re even trying anymore)

Milgram’s Continuum of Mixed Reality

Google’s version of Milgram’s Continuum

These are supposedly convenient labels for different things on a spectrum, but I think they largely serve to muddy the waters and are little more than an exercise in marketing bollocks. To this end, I prefer to just use the terms AR and VR.

Definition of terms
I claim no authority, but I’ll be defining AR and VR as follows (hopefully to provide clarity for the length of this article only):
AR – adding digital content to the existing environment
VR – (near) total immersion in an artificial environment

A “quick” note about hardware
We’ll be browser-based, so there’s no bulky, expensive or high-end equipment (although we might burn out your GPU or drain your battery). I’m using my Nexus 4 (ancient by modern standards). A tablet can be used for AR, but might be a bit awkward for VR 🙂 .

While researching this article, I used a SketchFab cardboard and a pair of Go4D glasses. Neither was ideal: the cardboard provides an easy touch-button input, while the Go4D has adjustable lenses. A proper experience needed both features in one device, so I was sadly disappointed.

Researching this article is the first time I’ve used VR for any length of time on a mobile, and it’s both literally headache-inducing and arm-tiring. I’ve found dedicated systems/headsets far more comfortable to use (although they do make my head hot).
I’m not sure what caused the headaches; possibly poor resolution, or poor placement causing eye-strain. Adjusting the inter-ocular distance seemed to ease the problem.
It’s evident that for prolonged use some kind of strap is required, along with some kind of cushion for the nose.

APIs

Jerome Etienne’s AR.js
argon.js
Aruco

VR Demos

https://airtable.com/embed/shr2Lc7pmlJis02R4/tblZbV2S0W0T5DDth?viewControls=on

AR Demos

Hatsune Miku Dancing in Augmented Reality
WebAR Playground: AR in a Few Clicks
(more info)

Browser Based VR

Requirements
Phone
Headset
Orientation/Positioning/Tracking
Input method

Implementation

VR is based on old-fashioned stereoscopic viewing (and I do mean old-fashioned). The basic requirement is to provide the viewer with two slightly offset stereoscopic images giving the impression of depth. True “VR” is achieved by adding positional awareness — as the viewer moves the images change to reflect the viewer’s new point of view.

Stereoscopy is OLD!

Without positional tracking the VR experience is severely limited, only allowing the user to look around; the result is an experience akin to 360° video. Teleporting is a practical alternative to physical movement, and is already a widely used way of getting about a virtual environment.

While the WebVR standard makes VR easier to implement, it’s not widely supported (at the time of writing only Edge, Firefox and Chrome for Android have limited support), so polyfills and various other hacks are used to achieve the necessary functionality. Happily, all the required legwork has already been done by many brave, hearty souls, leaving us simply to pick the API we like the most.
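
In fact, even without WebVR, a cardboard-style viewer is only a few lines using two helpers from the Three.js examples folder. A minimal sketch, assuming StereoEffect.js and DeviceOrientationControls.js are loaded:

var effect = new THREE.StereoEffect(renderer);   // renders side-by-side left/right eye views
effect.setSize(window.innerWidth, window.innerHeight);

var controls = new THREE.DeviceOrientationControls(camera);   // phone gyro "look around"

function animate() {
  requestAnimationFrame(animate);
  controls.update();              // orient the camera from the device sensors
  effect.render(scene, camera);   // draw both eye views
}
animate();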

A number of headsets are available, most notably the cardboard, which becomes uncomfortable after a short time.
Recent iterations have moved away from the neodymium magnet button, which was the most expensive part of the cardboard and could interfere with the magnetometer/compass functionality of the phone.

The “big three” (Three.js, Babylon.js and PlayCanvas) make it relatively easy to generate VR content.

Polyfill
WebVR
Early browser VR

Browser Based AR

Requirements
Positioning (markers or visual positioning)
Camera
Image processing
Input

AR has significantly more requirements, as the rendered images additionally have to interact (to some extent) with the local visual environment.

Editors

ARKit – Apple only; I can’t find actual demos, only movies (Apple restricted?)

A-Frame

Code it yourself or use an on-line editor

APIs and Code

VR

A-Frame
Three.js
Babylon.js
PlayCanvas

AR

A-Frame
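
The canonical A-Frame + AR.js scene fits in a few lines of HTML; a minimal sketch based on Jerome Etienne’s hiro-marker example (script URLs and versions are illustrative):

<script src="https://aframe.io/releases/0.6.0/aframe.min.js"></script>
<script src="https://jeromeetienne.github.io/AR.js/aframe/build/aframe-ar.js"></script>
<body style="margin: 0px; overflow: hidden;">
  <a-scene embedded arjs>
    <!-- the box is drawn on top of the printed hiro marker -->
    <a-box position="0 0.5 0" material="opacity: 0.5;"></a-box>
    <a-marker-camera preset="hiro"></a-marker-camera>
  </a-scene>
</body>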

Three.js




// a transparent renderer, so the webcam feed shows through behind the 3D scene
var renderer = new THREE.WebGLRenderer({
  antialias: true,
  alpha: true
});

renderer.setClearColor(new THREE.Color('lightgrey'), 0)
renderer.setSize( 640, 480 );
renderer.domElement.style.position = 'absolute'
renderer.domElement.style.top = '0px'
renderer.domElement.style.left = '0px'
document.body.appendChild( renderer.domElement );

// per-frame update functions, run from the render loop at the bottom
var onRenderFcts = [];

var scene = new THREE.Scene();

// AR.js drives this camera from the marker pose, so a bare THREE.Camera will do
var camera = new THREE.Camera();
scene.add(camera);

// use the webcam as the video source
var arToolkitSource = new THREEx.ArToolkitSource({ sourceType : 'webcam' });

arToolkitSource.init(function onReady() {
  onResize()
});

window.addEventListener('resize', function() {
  onResize()
});

function onResize() {
  arToolkitSource.onResize()
  arToolkitSource.copySizeTo(renderer.domElement)
  if( arToolkitContext.arController !== null ) {
    arToolkitSource.copySizeTo(arToolkitContext.arController.canvas)
  }
}

// the AR.js context does the actual marker detection on each video frame
var arToolkitContext = new THREEx.ArToolkitContext({
  cameraParametersUrl: THREEx.ArToolkitContext.baseURL + '../data/data/camera_para.dat',
  detectionMode: 'mono',
});

arToolkitContext.init(function onCompleted() {
    camera.projectionMatrix.copy( arToolkitContext.getProjectionMatrix() );
});

// update the AR state each frame; only show the scene while the marker is tracked
onRenderFcts.push(function() {
  if( arToolkitSource.ready === false ) return

  arToolkitContext.update( arToolkitSource.domElement )
  scene.visible = camera.visible
});

// position the camera from the printed hiro marker
var markerControls = new THREEx.ArMarkerControls(arToolkitContext, camera, {
  type : 'pattern',
  patternUrl : THREEx.ArToolkitContext.baseURL + '../data/data/patt.hiro',
  changeMatrixMode: 'cameraTransformMatrix'
});

// hidden until the marker is first detected
scene.visible = false

// a semi-transparent unit cube resting on the marker plane
var geometry = new THREE.CubeGeometry(1,1,1);
var material = new THREE.MeshNormalMaterial({
  transparent : true,
  opacity: 0.5,
  side: THREE.DoubleSide
});

var mesh = new THREE.Mesh( geometry, material );
mesh.position.y = geometry.parameters.height/2
scene.add( mesh );

// plus a torus knot floating above it, spun from the render loop
var geometry = new THREE.TorusKnotGeometry(0.3,0.1,64,16);
var material = new THREE.MeshNormalMaterial();
var mesh = new THREE.Mesh( geometry, material );
mesh.position.y = 0.5
scene.add( mesh );
onRenderFcts.push(function(delta) {
  mesh.rotation.x += Math.PI*delta
});

onRenderFcts.push(function() {
  renderer.render( scene, camera );
});

// the render loop: compute the frame delta and run every registered update function
var lastTimeMsec = null
requestAnimationFrame(function animate(nowMsec) {
  requestAnimationFrame( animate );
  lastTimeMsec = lastTimeMsec || nowMsec - 1000/60
  var deltaMsec = Math.min(200, nowMsec - lastTimeMsec)   // clamp long pauses (e.g. tab switches)
  lastTimeMsec = nowMsec
  onRenderFcts.forEach(function(onRenderFct) {
    onRenderFct(deltaMsec/1000, nowMsec/1000)
  });
});

Babylon.js


var canvas = document.getElementById("renderCanvas");
var engine = new BABYLON.Engine(canvas, true);

function createScene(){
  var scene = new BABYLON.Scene(engine);
  scene.clearColor = new BABYLON.Color4(0, 0, 0, 0);

  var camera = new BABYLON.ArcRotateCamera("Camera", 1.0, 1.0, 12, BABYLON.Vector3.Zero(), scene);

  var light = new BABYLON.HemisphericLight("hemi", new BABYLON.Vector3(0, 1, 0), scene);

  light.groundColor = new BABYLON.Color3(0.5, 0, 0.5);

  var box = BABYLON.Mesh.CreateBox("mesh", 1, scene);
  box.position.y = 0.5;
  box.showBoundingBox = true;

  var material = new BABYLON.StandardMaterial("std", scene);
  material.diffuseColor = new BABYLON.Color3(0.5, 0, 0.5);

  box.material = material;

  return scene;
}

var scene = createScene();
// a transparent clear colour again, so the webcam feed shows through
scene.clearColor = new BABYLON.Color4(0, 0, 0, 0);

// hand the engine, scene and its camera over to AR.js
ARjs.Babylon.init(engine, scene, scene.cameras[0])

engine.runRenderLoop(function () {
  scene.render();
});

window.addEventListener("resize", function () {
  engine.resize();
});

Material Viewer

Motivation
For a while now I’ve wanted to write a simple editor/viewer to show the basic interaction between light and materials (based in no small part on this web page: http://www.barradeau.com/nicoptere/dump/materials.html).

I also noticed that even simple default scenes rendered very differently with Three.js, Babylon.js and the PlayCanvas engine (referred to as PlayCanvas.js hereafter). So once I started, I got the idea that it would be interesting to compare how the different libraries rendered lights and materials. This would give me an opportunity to learn, compare and contrast each, as well as (hopefully) learn and apply some new ES6 features.

Use cases
 1. Investigate material properties
 2. Investigate light properties
 3. Show and compare how different renderers implement lighting

Light Properties
The following light types are supported:
 ● ambient
 ● spot
 ● point
 ● directional
 ● hemisphere ¹
¹ Babylon.js only

Material Properties
The following material properties are supported:
 ● ambient ²
 ● diffuse
 ● specular
 ● emissive
² Not in Three.js, where the ambient light colour interacts directly with the material’s diffuse colour
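
For example, in Three.js the ambient term is simply the ambient light’s colour modulated by the material’s diffuse colour; there is no separate material.ambient to set (a sketch):

// Three.js: no material.ambient; the ambient light modulates material.color
var ambient = new THREE.AmbientLight(0x404040);
scene.add(ambient);

var material = new THREE.MeshPhongMaterial({ color: 0x8844aa });
// effective ambient contribution ≈ light colour × material.color,
// whereas Babylon.js and PlayCanvas expose a separate ambient material colour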

GUI
I like dat-gui, but I’ve wanted a floating control for a long time. After several abortive attempts at writing my own (I just got bored dealing with all the fiddly bits), I finally found an old but working solution that resizes to its content. There were a couple of bugs, but they were readily sorted (e.g. dat-gui hangs over the bottom of a containing div, easily dealt with by adding a bottom border).

Still, I’m unhappy with both DraggableLayer and dat-gui: DraggableLayer will only work with one element, while dat-gui has limited input options (e.g. no vector entry) and its colour selection can be fiddly.

Renderer Controllers
In “JavaScript: The Good Parts”, Crockford eschews the use of new, preferring maker (factory) functions instead. I try to follow suit with makeXXX functions. All the renderer-specific code is localised to a single file, although this is rather spoiled by the matching <script> tag polluting the global namespace.

Each renderer has the same interface and hides any workarounds (e.g. each renderer implements ambient lighting differently). Adding a new renderer should be straightforward.
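
A minimal sketch of that maker style (the names here are illustrative, not the project’s actual code):

// no 'new': state lives in the closure, and the returned object is the interface
function makeRendererController(spec) {
  var lights = [];

  function addLight(type, params) {
    // each controller hides its engine's quirks behind this common interface
    lights.push({ type: type, params: params });
  }

  function render() { /* engine-specific drawing goes here */ }

  return Object.freeze({ addLight: addLight, render: render });
}

var controller = makeRendererController({ canvas: document.getElementById('three') });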

Three.js and Babylon.js were generally easy to code for, but PlayCanvas.js kept throwing up little issues and I found its syntax less intuitive.

PlayCanvas.js problems
PlayCanvas uses degrees rather than radians. While more an idiosyncrasy than a problem, it necessitates conversion between radians and degrees.
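
PlayCanvas helpfully ships the conversion factors, so the shims are one-liners (a sketch; the variable names are illustrative):

// pc.math provides the constants
var pitchDeg = pitchRad * pc.math.RAD_TO_DEG;
var backToRad = pitchDeg * pc.math.DEG_TO_RAD;
entity.setLocalEulerAngles(0, yawRad * pc.math.RAD_TO_DEG, 0);   // expects degrees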

Despite providing a torus primitive in the editor (along with a capsule), there is no equivalent addComponent method. Using the createXXX methods requires additional code to add the mesh to the render list, maintain it, and so on. However, there was a workaround: generating an arbitrary mesh, then replacing it with a “created” meshInstance, as below.

// a placeholder entity with a stock model component
var mesh1 = new pc.Entity('cube');
mesh1.addComponent('model', {
  type: 'box'
});

// build the real geometry (geometryTypes and geometryDefaults are the app's own lookup tables)
var mesh = geometryTypes[type](app.graphicsDevice, geometryDefaults[type]);

var material = new pc.StandardMaterial();
// reuse the placeholder's graph node for the new mesh instance
var meshInstance = new pc.MeshInstance(mesh1.model.meshInstances[0].node, mesh, material);

mesh1.model.meshInstances = [meshInstance];

material.ambient.set(...materialValue.ambient);
material.diffuse.set(...materialValue.diffuse);
material.emissive.set(...materialValue.emissive);
material.specular.set(...materialValue.specular);
material.fresnelModel = pc.FRESNEL_NONE;      // no Fresnel term
material.shadingModel = pc.SPECULAR_PHONG;    // switch from physical to Phong shading
material.update();
mesh1.model.material = material;

app.root.addChild(mesh1);

To get the ambient material setting working required fiddling around with material properties. The default shading model is physical, and it needed to be switched to Phong: not a single switch, but setting a couple of properties (fresnelModel and shadingModel, commented in the code above).

The createCylinder method had a bug and used opts.baseRadius rather than the correct opts.radius. I actually raised a PR to fix it (yippee!).

I still can’t figure out how to set the rotation of the orbit camera correctly; I suspect it may be related to rounding errors when converting between degrees and radians.

GitHub
After some initial work in my mega-repo, I split it off into its own repository (always fun). It can be found here, with the live version here, mirrored here.

Future work
Sort out the camera rotation bug in PlayCanvas (der).

Either find better UI libraries that support draggable elements and more input types or D-I-Y my own.

Use glTF as the interchange format rather than my own. However, there may be some unique settings that aren’t supported. Further investigation is needed.

Add more light types — directional, spot, etc.
Add light parameters — position, fall off, penumbra, etc.
Add interactive control — click ‘n’ drag
More advanced lighting/shading/materials (AO, PBM)

Pop up info/help

Add/delete models
Shadows
Import models

Down the rabbit hole with FBX

PlayCanvas

PlayCanvas does an admirable job of importing FBX models and animations, but animation control seems limited. In particular, there seem to be no events for animation start, end, etc. (a polling workaround is sketched below).

Animated Sackboy
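
One workaround is to poll the animation component every frame and fire an event of your own when the clip runs out; a hedged sketch (anim:end is a made-up event name, and this assumes a non-looping clip):

// no built-in start/end events, so watch the clock ourselves
app.on('update', function () {
  var anim = entity.animation;
  if (anim && !anim.loop && anim.currentTime >= anim.duration) {
    app.fire('anim:end', entity);   // hypothetical event, handled elsewhere
  }
});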

This immediately prompted me to stop and reassess all my requirements (and not just animation).

Workflow

  1. I felt that I would almost certainly need more control over animation events than PlayCanvas provides.
  2. PlayCanvas only has version control (very important) for legacy scripts.
  3. A fall back is needed if PlayCanvas fails for some reason (e.g. it’s cloud based, so poor internet or even just high contention would be problematic).

Using an API would immediately address issues 2 and 3. This leaves FBX support as the main issue.

It’s apparent that game creation requires a smooth workflow, and getting assets into the game easily is very important, regardless of their source. Be it pre-existing from a store or newly created by an on-site 3D artist, assets have to come from somewhere, and FBX is one of the more common formats, particularly for animation. FBX support is very important.

Checking out the main alternatives:

Babylon.js

Babylon.js doesn’t support FBX directly. I was advised to use the Babylon exporter for Unity or Blender. Alas, the results were not impressive.

Source model (Pearl.fbx) as it should be

Model after exporting to Babylon (from Unity and Blender)

Clara.io has an “export to Babylon” option; unfortunately its own FBX import is lacking, and I’m still trying to work out how to add animation to a model.

Model after import into Clara.io

Three.js

Three.js currently only supports ASCII FBX, not binary (Takahiro is doing some sterling work to add binary FBX support).

Current state of FBX import to Three.js (not released yet)

So, it may be a contender if support is available in time, but for now it’s back to PlayCanvas.

Conclusion (if you can call it that)

Converting 3D assets seems inevitable, to reduce file size and minimise load time (even PlayCanvas converts FBX to JSON).

I will look into FBX-to-glTF or FBX-to-JSON converters, but for now I’m returning to PlayCanvas (in fact, the PlayCanvas engine is available standalone, so it’s its own fallback).