Augmented Reality for the web with AR.js and THREEAR

Augmented reality is changing the world. To substantiate this claim, just look at the many areas where it has been applied in the last few years: you will find it used in military, healthcare, and enterprise applications, among others. As demand for a technology grows, so does the number of systems that support its development, along with the number of execution platforms. So why not bring augmented reality to the best-known and most efficient distribution platform in the world: the Web?

AR.js is a powerful framework based on ARToolKit and a stack of JavaScript technologies, which allows building marker-based augmented reality applications that run directly in your browser. To complete the technology stack, we are going to use AR.js in an Angular application, through a TypeScript wrapper named THREEAR.

The AR.js technologies stack

To learn how to use this framework, we are going to analyse an example application, so you need to check out this ready-to-use code repository. We are also going to use Visual Studio Code as our development IDE and Node.js LTS (including the npm package manager) as the server-side back-end. Thus, if you haven’t done so yet, download and install them on your computer. Once the setup is complete, use the Visual Studio Code terminal to install the Angular CLI and run the application, by executing the following commands inside the root folder of the repository:

npm install -g @angular/cli
npm install
ng serve

The last two commands install the dependencies and run the example web application at the following address: http://localhost:4200/. To test the application, you also have to print out the marker below:

Now put the printed marker in front of your computer’s camera. Et voilà! An animated virtual model appears! As you can see, if you move the marker around, the virtual model follows it. This process is called marker detection and tracking. Conversely, if you try the same thing at the following address: http://localhost:4200/?loadModel=true, an animated 3D object will be loaded instead of the default model.

Great! But how does it work?

Let’s take a look at the code behind it. Inside the app.component.ts file, the ngAfterViewInit function contains the main logic. The value of the loadModel variable is obtained from the URL query parameters, and switches the application’s behaviour by loading a locally stored 3D object instead of the default model. Let’s now analyse the steps of the main function.
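As a sketch of how such a flag could be parsed, consider the helper below. Note that readLoadModelFlag is a hypothetical name, and the repository may read the parameter through Angular’s ActivatedRoute instead; the standard URLSearchParams API is used here just to keep the snippet self-contained:

```typescript
// Hypothetical helper: parses the loadModel flag from a URL query string.
// The real app may use Angular's ActivatedRoute; URLSearchParams is
// available in browsers and Node, so the sketch stays self-contained.
function readLoadModelFlag(search: string): boolean {
    const params = new URLSearchParams(search);
    // Only the literal string 'true' enables model loading
    return params.get('loadModel') === 'true';
}
```

With a helper like this, ?loadModel=true enables the 3D object, while any other query string falls back to the default virtual model.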

Render creation

To render our scene, we need a WebGLRenderer, a component that draws through the WebGL API. Once initialised with the antialias and alpha parameters, the renderer is appended to the #content div (the main HTML tag of the application), contained inside the app.component.html file:

// Initialise the renderer
const renderer = new THREE.WebGLRenderer({
    antialias: true,
    alpha: true
});

renderer.setClearColor(new THREE.Color('lightgrey'), 0);
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.domElement.style.position = 'absolute';
renderer.domElement.style.top = '0px';
renderer.domElement.style.left = '0px';

// Append the renderer to the content div
var content = document.getElementById('content');
content.appendChild(renderer.domElement);

Scene initialisation

Now we have to create a Scene where our virtual objects can be rendered, and add an AmbientLight to it, to illuminate the instantiated objects. Once the scene is ready, we can instantiate a new Camera to observe the existing virtual objects. We also need to create a new Group, a mesh container that will be shown during the marker detection and tracking phases. Finally, we have to initialise the augmented reality context by using the static THREEAR.initialize function, which takes a THREEAR Source object as a parameter:

// Initialise the scene
const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xcccccc));

// Initialise the camera
const camera = new THREE.Camera();

// Initialise the group and add it to the scene
const markerGroup = new THREE.Group();
scene.add(markerGroup);

// Initialise the source
var source = new THREEAR.Source({ renderer, camera });

// Initialise the context
THREEAR.initialize({ source: source }).then((controller) => {
    // Your code here
});

Model loading

Depending on the value of the loadModel variable, the application will show a virtual model or a 3D object. If the value of the variable is false, the application creates a new TorusKnotGeometry, which is associated with a MeshNormalMaterial through the creation of a new Mesh. By contrast, if the value of the loadModel variable is true, the MTLLoader loads the object material, while the OBJLoader loads the description of the object’s shape, associating it with the loaded material. Finally, the resulting mesh is added to the previously created marker group:

var mesh: any;

if (!this._loadModel) {
    // Add a torus knot geometry
    const geometry = new THREE.TorusKnotGeometry(0.3, 0.1, 64, 16);
    const material = new THREE.MeshNormalMaterial();

    // Create a mesh and add it to the marker group
    mesh = new THREE.Mesh(geometry, material);
    mesh.position.y = 0.5;
    markerGroup.add(mesh);

} else {

    // Load a model
    var mtlLoader = new MTLLoader();
    mtlLoader.load('fish-2.mtl', (materials) => {
        var objLoader = new OBJLoader();
        objLoader.setMaterials(materials);
        objLoader.load('fish-2.obj', (group) => {
            mesh = group.children[0];
            mesh.material.side = THREE.DoubleSide;
            mesh.position.y = 0.25;
            mesh.scale.set(0.25, 0.25, 0.25);

            // Add the loaded mesh to the marker group
            markerGroup.add(mesh);
        });
    });
}

Marker tracking

The THREEAR PatternMarker class allows creating a new marker tracker, passing as parameters the local marker pattern file path, the mesh container to show when the marker is detected, and the minimum confidence level, used as a threshold when evaluating the marker detection result. Finally, the instantiated marker tracker is added to the augmented reality context:

// Create a pattern marker
var patternMarker = new THREEAR.PatternMarker({
    patternUrl: 'assets/markers/hiro.patt',
    markerObject: markerGroup,
    minConfidence: 0.4
});

// Start to track the pattern marker
controller.trackMarker(patternMarker);

Rendering loop

At last, the application needs to start the frame animation loop, which calls the update function for each object inside the scene. Here, the mesh container associated with the marker is animated, updating its rotation on each cycle. Finally, at the end of each cycle, the entire updated scene is rendered by the renderer:

// Run the rendering loop
let lastTimeMilliseconds = 0;
requestAnimationFrame(function animate(nowMsec) {
    // Keep looping
    requestAnimationFrame(animate);

    // Measure time
    lastTimeMilliseconds = lastTimeMilliseconds || nowMsec - 1000 / 60;
    const deltaMilliseconds = Math.min(200, nowMsec - lastTimeMilliseconds);
    lastTimeMilliseconds = nowMsec;

    // Call the update function for each object inside the scene
    controller.update(source.domElement);

    // Set the object rotation
    if (mesh) { mesh.rotation.y += deltaMilliseconds / 1000 * Math.PI; }

    // Render the scene
    renderer.render(scene, camera);
});


What’s next?

The latest version of AR.js lets us use the “Location Based” mode, combined with the “Marker Based” one. This new mode allows augmented reality objects to appear at specific positions in world space, through the GeoAR.js library. Take a look at this article for more details.

And that’s it!
Before starting to play with AR.js, I suggest you take a look at the ARToolKit and Three.js documentation. Also, at this link, you can find many examples and different use cases for AR.js, which can easily be implemented with THREEAR.

Happy coding!