Get started with Augmented Reality on the web using Three.js and WebXR | Part 1

Over the last few years, augmented reality has become a commonplace feature in many mobile apps. From games and social networking to navigation and commerce apps - we’ve seen many of the major players experiment with augmented reality in various forms. Industry giants like Microsoft, Meta, and Apple are investing heavily in the field, and there are still tons of exciting applications for this tech out there just waiting to be discovered. And if you’re a web developer, there has never been a better time to get started developing AR applications!

When it comes to creating augmented reality (AR) and virtual reality (VR) experiences for the web, the introduction of the WebXR Device API represents a significant leap forward. This API is proving to deliver both the performance and capability needed for developers to create truly immersive AR and VR applications that can run right here in the browser.

At the time of writing, WebXR for mobile AR is best supported in Chrome and Samsung Internet on Android, while the Oculus Browser and Wolvic are probably your best bet when it comes to VR. Support for WebXR in WebKit on iOS is still not there yet, but it is hopefully right around the corner.

This does, however, mean that the app created in this project will not run on iOS for the time being. Fingers crossed this changes soon 🤞

Throughout this post, we will create a simple augmented reality web application that lets you place 3D models in your environment. My goal with this project is to introduce you to the main concepts you need to create a basic augmented reality application using the WebXR API — and also to get you fired up about working with AR on the web! ⚡

Over the two parts, you will learn how to set up a basic WebXR app, how to render a model in an augmented reality space with the Three.js library, and how you can respond to touch events from the user’s device.

You will also be introduced to some neat tips and tricks for getting your WebXR development environment set up. While we are using many familiar web technologies in this project, the WebXR space is still somewhat nascent compared to traditional web development, and it comes with its own set of challenges.

As you might already have noticed, this post is rather lengthy, but there are good reasons for this: we are delving into both 3D development and WebXR at the same time. Without a good grasp of the underlying concepts, this project could quickly become confusing and frustrating. I will therefore try to provide a certain level of detail throughout so that you can get familiar with all the moving parts that go into creating the AR experience.

If you’re curious about the final result, you can try out a live demo of the application by opening the link below on an Android phone.

Project app demo: https://webxr-koalas.vercel.app/

Prerequisites

To build and run the project app, there are a few requirements you should fulfill.

You will need:

  • A computer with node and npm installed. Your node install should be version 12.13.0 or newer.
  • Some familiarity with TypeScript (or JavaScript). We will be using TypeScript throughout this guide, but you should also be able to follow along if you know a bit of JavaScript.
  • An Android device with a recent version of Chrome. As WebXR is not yet implemented in the WebKit browser engine, this application will sadly not work on iOS for the time being.

If you don’t have an Android device on hand you could try using the WebXR Emulator on either Firefox or Chrome. Running the app on the emulator can’t be compared to running on a real device, but you will be able to test the code and play around within the emulator as you go along. The emulator can also be very handy for debugging along the way.

This takes us to another important point. As the app is targeted towards mobile devices, you might find it difficult to debug the usual way in the desktop browser console. The emulator helps, but a good solution to this is to use the remote debugging tools in Chrome which will allow you to inspect the mobile browser and see any console messages being output there: https://developer.chrome.com/docs/devtools/remote-debugging/

Before we clone the starter repository with the skeleton code, let’s have a quick look at the technologies that make up the app, beginning with WebXR itself.

The WebXR Device API is a native browser API that enables users to run AR and VR experiences entirely inside a browser, without having to rely on downloading a dedicated native application.

In short, the WebXR Device API enables your browser to communicate with all the sensors, cameras, and other components in your phone or head-mounted display that are needed for the device to get an understanding of your position, movements, and surrounding environment.

The web standards document describing the API is still in active development, but it started solidifying a while back, and the major browser engines have long since started to implement the most essential parts of the WebXR API.

Combine this API with a 3D rendering library, and you’ve got all you need to make a sophisticated AR or VR experience in the browser.

Speaking of 3D libraries for the web, let’s also take a glance at Three.js.

If you’ve ever played around with 3D on the web, you might already be familiar with the name Three.js. First released over 10 years ago, this open-source library is still going as strong as ever, with new releases every month. It’s a favorite of many thanks to its user-friendly API, its extensive functionality, and the huge community of users online providing support, example code, and amazing 3D experiences to explore and learn from.

Curious about what Three.js is capable of in the hands of seasoned developers? There are plenty of impressive examples and showcases to be found around the web.

In addition to being a solid all-around library for 3D on the web, it also happens to be one of the easiest ways to get started with WebXR! 🙌

If you’ve never previously done any 3D development, there are some key terms and concepts you should be familiar with to better understand what is going on in a typical Three.js project.

Let’s look at a few of these.

Scene

In the context of Three.js and many other 3D libraries, a scene represents the main 3D environment in the application. This is where you place all your models, lights, and cameras and arrange them into a coherent experience.

Renderer

The renderer is the part responsible for actually drawing our scene on the screen. For a game, animation, or otherwise interactive application, we will typically create a render loop: a function responsible for repeatedly drawing our scene to the screen at a given interval (e.g. 60 times per second). This is necessary, as each call to the render function is like taking a snapshot of the current state of the scene. We can then perform updates to the scene and move our models, cameras, and lights around, and these changes will be reflected on the next call to render.
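
To make the idea a little more concrete, here is a minimal, generic sketch of a render loop in Three.js (just an illustration, not code from the project):

```typescript
import { PerspectiveCamera, Scene, WebGLRenderer } from "three";

const renderer = new WebGLRenderer();
const scene = new Scene();
const camera = new PerspectiveCamera();

// Called repeatedly, typically around 60 times per second.
function renderLoop() {
  // ...update models, lights, and cameras here...

  // Draw a "snapshot" of the scene in its current state.
  renderer.render(scene, camera);
}

// Let the renderer drive the loop.
renderer.setAnimationLoop(renderLoop);
```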

Camera

The camera represents the user’s view of the scene. There are several different camera types in Three.js, depending on what kind of experience you want to create. The one we’re going to be using is the Perspective Camera, which is designed to replicate how the human eye sees things.

Light

Another key component in any 3D experience is Light. Without a source of light, the scene will appear like a pitch-black void when rendered to the canvas. Even though we will be working in the context of augmented reality — with the real world acting as our environment — lighting is still important to provide realism to our scene and to make our models look their best. 😎

Models & Meshes

Three.js provides the possibility of both loading pre-made 3D models and creating 3D objects from the ground up, and we will be doing both. You can make your own 3D objects in Three.js by specifying the geometry of a shape, applying a material to it, and then creating a Mesh from these two parts. This can however quickly become convoluted, and if you’re creating anything with a little complexity, it’s better to use a dedicated 3D modeling tool like Blender to create your model before importing it into Three.js, as illustrated in the sketch below.
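
As a rough illustration of both approaches, here is a small sketch (the GLTFLoader import path and the model path are placeholders, not files from this project):

```typescript
import { BoxGeometry, Mesh, MeshStandardMaterial, Scene } from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader";

const scene = new Scene();

// Building an object from scratch: geometry + material = mesh.
const geometry = new BoxGeometry(0.5, 0.5, 0.5);
const material = new MeshStandardMaterial({ color: 0x00ff00 });
scene.add(new Mesh(geometry, material));

// Loading a pre-made model exported from a tool like Blender (placeholder path).
const loader = new GLTFLoader();
loader.load("/models/my-model.glb", (gltf) => {
  scene.add(gltf.scene);
});
```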

In this project, we are going to use a pre-made koala bear model made for Google Poly.

The above section only covers the very minimum of what we should know before starting, and the Three.js library itself is packed with so much functionality you could probably spend years getting to know it all. We are however going to keep it simple for now, and I’ll explain the relevant concepts we need along the way.

Setting up a new project with webpack & friends can sometimes be a chore. To minimize time spent on tooling and project setup I’ve therefore created a small starter repository to work from.

This project includes all the dependencies we will need. I’ve also defined some empty functions and declared every import we will be using in our code. This is just to make everything a little easier when we begin writing the application code, ensuring you can focus on the code rather than on fixing build issues.

Finally, I have also added a scaled and processed koala bear model to the project files, so it’s ready to be used in our app. 🐨

Visit the link below, and clone the repository to your local machine:

https://github.com/sopra-steria-norge/get-started-with-ar-web/

Next, navigate to the repository with your terminal and run the below commands to install all the project dependencies and start the app in development mode.

npm install # Install dependencies
npm run start:live # Start dev server and enable tunneling

The start:live command above will launch a development server, and then enable tunneling to localhost through the internet using the package localtunnel.

❗ Be aware — this exposes your local development server to the internet!

I’ve set things up this way, as I find that using a tunneling service like localtunnel or ngrok instead of accessing the development server directly greatly reduces friction during development when previewing the app on a physical mobile device.

This also simplifies running everything under HTTPS. WebXR applications are required to be served over HTTPS, so without localtunnel we would have to set up certificates for local development in order to use the WebXR Device API.

Check your terminal after running the code.

If localtunnel was successful in setting up a connection, it should print a randomly generated URL where you can now access the development server.

The resulting page should be blank, an empty canvas for our AR app.

Make sure there are no errors printed to the terminal, and feel free to poke around and get familiar with the starter project.

Alright, enough chit-chat — it’s time to get started on developing our app. ⚡

Fire up your IDE and begin by opening the file index.ts

This is the main entry point to our app.

Find the function named initializeXRApp and insert the code from the snippet below. Don’t fret about simply copying and pasting it; as we move forward, I’ll walk through every code snippet to explain what is going on at each step.
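
The snippet boils down to something like the sketch below, reconstructed from the walkthrough that follows (the welcome-message helper name and its import path are placeholders, so check the starter repo for the actual ones):

```typescript
import { WebGLRenderer } from "three";
import { ARButton } from "three/examples/jsm/webxr/ARButton";
import { createScene } from "./scene";
// Placeholder name and path: the starter repo's welcome-message helper may differ.
import { displayIntroductionMessage } from "./utils/domUtils";

export function initializeXRApp() {
  // Pixel ratio and layout viewport dimensions of the device.
  const { devicePixelRatio, innerHeight, innerWidth } = window;

  // Create a renderer with antialiasing and an alpha (transparency) buffer.
  const renderer = new WebGLRenderer({ antialias: true, alpha: true });

  // Render at the correct resolution for the device.
  renderer.setSize(innerWidth, innerHeight);
  renderer.setPixelRatio(devicePixelRatio);

  // XR functionality is disabled by default, so enable it explicitly.
  renderer.xr.enabled = true;

  // Add the renderer's canvas to the DOM.
  document.body.appendChild(renderer.domElement);

  // Let Three.js set up the AR session and add a button for entering AR.
  // Requiring hit-testing means the app won't start if the feature isn't available.
  document.body.appendChild(
    ARButton.createButton(renderer, { requiredFeatures: ["hit-test"] })
  );

  // Set up the scene (defined in scene.ts).
  createScene(renderer);

  // Show a simple welcome message to the user.
  displayIntroductionMessage();
}
```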

In the above function, we start by getting the pixel ratio of the device, along with the height and width of the window’s layout viewport. Then we create a new renderer object using the WebGLRenderer constructor provided by Three.js. When calling this constructor, we pass in two arguments: antialias and alpha.

By enabling antialiasing, our models will look cleaner, and less jagged when rendered to the scene. And enabling the alpha buffer will let us make use of transparency on anything rendered within our scene.

Next, we set the size and pixel ratio on the renderer using the values we extracted from the window object. This will help ensure our app renders with the correct resolution for the device. We also have to specifically enable XR-functionality on the renderer, as this is turned off by default. Then, we add the renderer to the DOM.

The next step is setting up the AR experience using a built-in helper function from Three.js. By calling ARButton.createButton() and passing in our configuration object, Three.js will take care of setting everything up, and will even give our page a nice little button to enter into the AR experience.

The configuration object we pass into the createButton-function specifies that we require the device browser to support hit-testing before allowing our app to be run. This means that if hit-testing is not available, the app won’t start. This helps ensure users don’t accidentally experience a broken/incomplete version of the app.

Near the bottom, we call the createScene-function and pass in our renderer. This function is imported from the scene.ts file, which is currently empty — but don’t worry, we’ll get to this one soon.

Oh — and we’re also calling a helper function at the end there. All it does is display a simple welcome message to greet any users opening our app.

If you run the app now, you’ll notice nothing is happening — as we haven’t actually called the initializeXRApp function yet!

Move to the start-function, and add the below code.
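
It should end up looking roughly like this (the helper names match the ones described below; the import path is a placeholder):

```typescript
// Placeholder import path: the helpers live somewhere in the starter repo.
import {
  browserHasImmersiveArCompatibility,
  showUnsupportedBrowserMessage,
} from "./utils/domUtils";

async function start() {
  // Check whether the browser supports the immersive-ar session type.
  const immersiveArSupported = await browserHasImmersiveArCompatibility();

  if (immersiveArSupported) {
    // initializeXRApp is defined earlier in index.ts.
    initializeXRApp();
  } else {
    // Tell the user why the app won't run on their device.
    showUnsupportedBrowserMessage();
  }
}
```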

Here we are checking if the user’s browser supports the immersive-ar session with WebXR before starting the app, and rendering a message if no support is found.

To do this, we are using two helper functions I’ve included in the project. These are included mainly to reduce the scope of this project, letting us focus on the actual AR functionality.

The verbosely named browserHasImmersiveArCompatibility-function just calls a built-in function on the navigator.xr object to check for WebXR-support, and the showUnsupportedBrowserMessage-function simply renders some HTML to the DOM containing a message to the user that their browser is not supported.
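
For reference, the compatibility check essentially boils down to something like this simplified sketch (not necessarily the exact implementation shipped with the starter project):

```typescript
async function browserHasImmersiveArCompatibility(): Promise<boolean> {
  // navigator.xr is only present in browsers that implement the WebXR Device API.
  if (window.navigator.xr) {
    // isSessionSupported is the built-in check for a given session type.
    return await navigator.xr.isSessionSupported("immersive-ar");
  }
  return false;
}
```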

With this being a tutorial project we could of course skip this step, but it’s always good practice to let the user know that they won’t be able to run the app if their device is not supported yet, instead of just showing a blank page. This is especially relevant when dealing with cutting-edge APIs that are not yet fully adopted by all browser engines, potentially leaving some users out.

After adding the above code, you can open the app on your device. Use the localtunnel URL, and ensure you are accessing the HTTPS-URL.

You should now see a button with the text “START AR”.

Press it, and a camera feed is created and rendered by the WebXR functionality in Three.js.

And all this by writing just a few lines of code — pretty neat, huh? 👏

Let’s continue by setting up the scene for our app.

In the previous step, you passed the renderer object into the createScene function. This function is currently empty, so let's get started on that one next.

Open the scene.ts file and locate the createScene function. We’re now going to add a scene, a camera, and a render loop.

Declare a variable named scene near the top, and create a new Scene-object by calling the constructor.
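
At this point, the function could look roughly like the following sketch (the field-of-view value of 70 is just a typical default and an assumption here; only the near and far plane values are prescribed below):

```typescript
import { PerspectiveCamera, Scene, WebGLRenderer } from "three";

export function createScene(renderer: WebGLRenderer) {
  // The scene houses all the 3D content in the app.
  const scene = new Scene();

  // A camera acting as our view into the Three.js scene.
  // Parameters: field of view, aspect ratio, near plane, far plane.
  const camera = new PerspectiveCamera(
    70,
    window.innerWidth / window.innerHeight,
    0.02,
    20,
  );

  // The main render loop, called on every frame.
  const renderLoop = () => {
    renderer.render(scene, camera);
  };

  renderer.setAnimationLoop(renderLoop);
}
```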

We start by calling the Scene-constructor, which initializes a new empty scene object. This scene is what will house all the 3D content in our app.

Next, we create a PerspectiveCamera using the constructor provided by Three.js. While this app also uses the actual device camera which we got from the ARButton, we still need to instantiate a camera that will function as a view into the Three.js scene. The first parameter we pass in is the field of view. The second one is the aspect ratio, which we calculate by dividing the width of the viewport by the height. Finally, we provide a value for the near plane, and then another value for the far plane. Any objects closer than the near plane or further than the far plane will not be rendered, so here we are actually defining the boundaries for how near or far we will allow the user to be to any objects.

In this case, we pass in the values 0.02 and 20, which might at first seem like arbitrary numbers. However, Three.js generally uses SI units for physical quantities, with meters as the distance unit. This translates well to working in AR, and it means we will render objects in the scene within a range of 0.02 to 20 meters.

Finally, we set up the main render loop for the app and pass it into the setAnimationLoop function on the renderer. This ensures that our renderLoop function is called on every frame, which in turn renders our scene.

If you are familiar with the window.requestAnimationFrame function in the browser, the setAnimationLoop-function is pretty similar. It is however strongly advised by Three.js to use the latter function when dealing with WebXR, so that’s what we’re going to do.

We can now open the app on our device. If everything is working correctly, you should still see the camera feed being rendered to your screen.

But as you might notice, there’s nothing rendered in the scene yet. 👀

We’ll do something about that right now!

I promised you Koalas, and I intend to keep that promise. But before we get carried away with marsupials, let’s begin with something a little simpler to get our toes wet.

We are going to render a simple cube to the scene. Rendering a cube is almost like the “Hello world” of Three.js (and 3D development in general), so you can consider this your official rite of passage! 🎉

Update your createScene function to reflect the code below:
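
Something along these lines should do (the cube dimensions, its position, and the field of view are placeholder values, so feel free to tweak them):

```typescript
import {
  BoxGeometry,
  Mesh,
  MeshBasicMaterial,
  PerspectiveCamera,
  Scene,
  WebGLRenderer,
} from "three";

export function createScene(renderer: WebGLRenderer) {
  const scene = new Scene();

  const camera = new PerspectiveCamera(
    70,
    window.innerWidth / window.innerHeight,
    0.02,
    20,
  );

  // A cube shape: width, height, and depth, in meters (placeholder size).
  const boxGeometry = new BoxGeometry(1, 1, 1);

  // A basic material that needs no lights to be visible. 0xff0000 = red.
  const boxMaterial = new MeshBasicMaterial({ color: 0xff0000 });

  // Combine geometry and material into a mesh...
  const box = new Mesh(boxGeometry, boxMaterial);

  // ...place it a few meters in front of the starting viewpoint (an added
  // assumption, just so the cube is visible without moving around)...
  box.position.z = -3;

  // ...and don't forget to add it to the scene!
  scene.add(box);

  const renderLoop = () => {
    // Rotate the cube by 0.01 radians on the x and y axes every frame.
    box.rotation.x += 0.01;
    box.rotation.y += 0.01;

    renderer.render(scene, camera);
  };

  renderer.setAnimationLoop(renderLoop);
}
```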

The first thing we do is create a BoxGeometry. This is a pre-defined shape from Three.js, which gives us a Geometry object shaped like a cube that we can use when creating a mesh. The three parameters we pass in define the width, height, and depth of that cube — in that order.

Next, we create a MeshBasicMaterial for our cube. We need a material in order for our cube to be drawn to the scene. Three.js provides several types of material ready to use, but the MeshBasicMaterial doesn’t require any lights for us to see it, which lets us keep things simple for now. The constructor accepts a configuration object to set various properties on the material, but as you might notice we are only making use of the color property. Three.js generally recommends using hexadecimal values for colors, so we will follow suit. We pass in the value 0xff0000, which gives us the color red.

Finally, we can create the box mesh itself, by passing the geometry and material to the Mesh constructor. This will return a nice red cube that we can render to our AR view.

We then add it to the scene using the .add() method on the scene object.

You’ll be calling this add-function a lot when working with Three.js, but it’s also one of the easiest things to forget. If you are ever wondering why your object is missing from the scene, always check if you called the .add() function.

To add a bit of flair to this otherwise simple demonstration, we are also going to make the cube spin. We do this by manipulating the rotation of the cube within the render loop, continuously adding to the x and y values of the object’s rotation.

This means that for each frame being rendered by Three.js, the cube will rotate 0.01 radians on each of these axes. You can play around with this value to increase or decrease the rotation speed.

If you open the app on your device, you should see something like this:

[Screenshot: the red cube rendered in AR, spinning in the camera view]

Look at that cube go! 🟥

Spinning around in AR space without a worry in the world.

Wanna see it go faster? Try setting the rotation to 1 within the render function, and watch the cube go bananas! 💫

This concludes the first part of this project. Great job on getting this far!

Hopefully, I’ve managed to spark your interest in WebXR development and made you feel somewhat comfortable working with Three.js. Don’t be afraid to play around with the code yourself. Try changing the size of the cube, or adding another one in the scene!

I find this kind of development to be a great way to get creative with code, and as you’ve now seen it really does not have to be that complicated. Once you get the hang of the basics, it feels like a world of possibilities opening up.

Take care for now, and thanks for reading! 🙌

Ready for more?

Click here to read the second part of this project, where we will finally add hit testing and learn how we can render our koala model onto the ground in AR space.

PS: The WebXR Device API is still in development and parts of it might change over time. Anything not working correctly? Drop me a comment!
