
Getting Started with Google ARCore, Part 2: Visualizing Planes & Placing Objects

Following the basic project setup of the first part of this article, we now get to the fascinating details of the ARCore SDK. Learn how to find and visualize planes. Additionally, I’ll show how to instantiate objects and how to anchor them to the real world using Unity.

Finding Planes with ARCore

The ARCore example project contains a simple script that visualizes planes and point clouds and places the Android mascot. We'll create a shorter version of that script here.

Create a new empty game object and call it “Managers”. Add a new script component (call it PlaneVisualizationManager.cs) and copy the following code:

using System.Collections.Generic;
using UnityEngine;
using GoogleARCore;
using GoogleARCore.HelloAR;

public class PlaneVisualizationManager : MonoBehaviour
{
    /// <summary>
    /// A prefab for tracking and visualizing detected planes.
    /// </summary>
    public GameObject TrackedPlanePrefab;

    private List<TrackedPlane> _newPlanes = new List<TrackedPlane>();

    // Update is called once per frame
    void Update()
    {
        // Ask ARCore for all planes that were newly detected in this frame.
        Frame.GetNewPlanes(ref _newPlanes);

        // Iterate over planes found in this frame and instantiate corresponding
        // GameObjects to visualize them.
        foreach (var curPlane in _newPlanes)
        {
            // Instantiate a plane visualization prefab and set it to track the new plane.
            // The transform is set to the origin with an identity rotation since the mesh
            // for our prefab is updated in Unity world coordinates.
            var planeObject = Instantiate(TrackedPlanePrefab, Vector3.zero, Quaternion.identity,
                transform);
            planeObject.GetComponent<TrackedPlaneVisualizer>().SetTrackedPlane(curPlane);

            // Apply a random color and grid rotation to the plane material.
            planeObject.GetComponent<Renderer>().material.SetColor("_GridColor",
                new Color(Random.Range(0.0f, 1.0f), Random.Range(0.0f, 1.0f), Random.Range(0.0f, 1.0f)));
            planeObject.GetComponent<Renderer>().material.SetFloat("_UvRotation",
                Random.Range(0.0f, 360.0f));
        }
    }
}

The code only has a single public property: it needs a reference to a visualization prefab for the tracked planes. In the Unity editor, you can assign it from Assets > GoogleARCore > HelloARExample > Prefabs > “TrackedPlaneVisualizer”. That prefab contains a mesh renderer and a script that dynamically generate triangles covering the plane detected so far.

Plane Visualization Manager script for ARCore in Unity

The static Frame.GetNewPlanes() method comes from the ARCore SDK; it fills the list passed by reference with all planes that were newly detected in this frame. The PlaneVisualizationManager then loops over these new planes and instantiates one instance of the TrackedPlaneVisualizer prefab per plane.
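The official HelloAR example additionally skips the frame entirely while ARCore has lost tracking, which can be a useful guard at the top of Update(). A minimal sketch, assuming the preview SDK's Frame.TrackingState property and FrameTrackingState enum as used in the HelloAR controller:

// Optional guard at the start of Update(), modeled on the HelloAR example.
// Frame.TrackingState / FrameTrackingState are assumed from the ARCore preview SDK.
if (Frame.TrackingState != FrameTrackingState.Tracking)
{
    // Keep the screen awake a bit longer while tracking is being (re-)established.
    const int lostTrackingSleepTimeout = 15;
    Screen.sleepTimeout = lostTrackingSleepTimeout;
    return;
}
Screen.sleepTimeout = SleepTimeout.NeverSleep;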

To make the difference between the planes more visible, each new plane is customized with a random color and a random texture rotation. If ARCore later discovers that two previously separate planes actually belong together, they are merged automatically. This can happen if, for example, you scan the left side of a table, move your camera to the floor, and then scan the right side of the table: ARCore creates two separate table planes. Once you scan the middle of the table, the previously separate planes are merged and one of them is marked as outdated (subsumed).

Go to File > Build & Run in Unity to compile the project and run it on an attached Android phone. Move your phone around for a few seconds while pointing at a flat surface. After a short while, the app starts visualizing planes, with each separate plane drawn in a different color. While the tracking is not nearly as fast or accurate as on a Tango device or a Microsoft HoloLens, both of which include real depth sensors, it's amazing what ARCore can do with just a single color camera!

ARCore - Plane Detection running on the Google Pixel 2

Handle Touch Events

Like in the main HelloARExample, the plan is to instantiate a new prefab on a plane every time the user taps the screen. For this behavior, add a new script called “InstantiateObjectOnTouch” and add it to the “Managers” game object in our scene hierarchy.

The script needs two public references that you assign from the Unity editor. The first is the First Person Camera, which is already in your scene under MainScene > ARCore Device > First Person Camera. The second public property is a reference to the game object you want to instantiate on each tap (PlaceGameObject).

Add the following code to your newly created script:

using UnityEngine;
using GoogleARCore;
using GoogleARCore.HelloAR;

public class InstantiateObjectOnTouch : MonoBehaviour {
    /// <summary>
    /// The first-person camera being used to render the passthrough camera.
    /// </summary>
    public Camera FirstPersonCamera;

    /// <summary>
    /// The gameobject to place when tapping the screen.
    /// </summary>
    public GameObject PlaceGameObject;

    // Update is called once per frame
    void Update ()
    {
        // Get the touch position from Unity to see if we have at least one touch event currently active
        Touch touch;
        if (Input.touchCount < 1 || (touch = Input.GetTouch(0)).phase != TouchPhase.Began)
        {
            return;
        }

        // Now that we know that we have an active touch point, do a raycast to see if it hits
        // a plane where we can instantiate the object on.
        TrackableHit hit;
        var raycastFilter = TrackableHitFlag.PlaneWithinBounds | TrackableHitFlag.PlaneWithinPolygon;

        if (Session.Raycast(FirstPersonCamera.ScreenPointToRay(touch.position), raycastFilter, out hit) && PlaceGameObject != null)
        {
            // Create an anchor to allow ARCore to track the hitpoint as understanding of the physical
            // world evolves.
            var anchor = Session.CreateAnchor(hit.Point, Quaternion.identity);

            // Instantiate a game object as a child of the anchor; its transform will now benefit
            // from the anchor's tracking.
            var placedObject = Instantiate(PlaceGameObject, hit.Point, Quaternion.identity,
                anchor.transform);

            // Game object should look at the camera but still be flush with the plane.
            placedObject.transform.LookAt(FirstPersonCamera.transform);
            placedObject.transform.rotation = Quaternion.Euler(0.0f,
                placedObject.transform.rotation.eulerAngles.y,
                placedObject.transform.rotation.eulerAngles.z);

            // Use a plane attachment component to maintain the game object's y-offset from the plane
            // (occurs after anchor updates).
            placedObject.GetComponent<PlaneAttachment>().Attach(hit.Plane);
        }
    }
}

This script first checks whether a new touch event was registered, using the standard Unity function Input.GetTouch(). The guard at the top returns early unless there is at least one active touch and that touch has just begun (TouchPhase.Began); only then does the rest of the script execute.
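If your scene also contains Unity UI elements (buttons, sliders, etc.), you may want to ignore taps that land on the UI before doing the AR raycast. This is an optional addition, sketched here with Unity's standard EventSystem:

using UnityEngine.EventSystems;  // additional using directive at the top of the file

// Inside Update(), right after the touch check: skip taps that hit a UI element.
if (EventSystem.current != null &&
    EventSystem.current.IsPointerOverGameObject(touch.fingerId))
{
    return;
}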

Anchor Virtual Objects to the Real World

Now that we know a new touch event was registered, the next part of the code performs a raycast from the touch position to see which plane it hits. This is done with the Session.Raycast() method of the ARCore SDK. Through the filter parameter, we specify that we are only interested in tracked planes; we could also query for collisions with the point cloud.
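The filter is a bitmask, so it could be broadened if you also wanted to accept hits on individual feature points. A sketch of such a filter; note that TrackableHitFlag.PointCloud is an assumption based on the preview SDK and may be named differently:

// Sketch: a broader raycast filter that would also accept point-cloud hits.
// TrackableHitFlag.PointCloud is assumed here and not verified against the SDK.
var raycastFilter = TrackableHitFlag.PlaneWithinBounds |
                    TrackableHitFlag.PlaneWithinPolygon |
                    TrackableHitFlag.PointCloud;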

If a hit is detected, the method returns true and stores the result in the “hit” parameter. This GoogleARCore.TrackableHit instance contains information about the hit point in the 3D scene.

We use this information to create an Anchor, which glues the game object to a position and rotation in the world relative to ARCore's current understanding of the real environment. Whenever ARCore learns more about the real world or inaccuracies accumulate, drift may occur. The anchor ensures that game objects retain their physical location in the real world, even if their virtual coordinates in the Unity coordinate system get adjusted.
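To make this pattern reusable, the anchor creation and instantiation could be wrapped in a small helper method. This is just a sketch that reuses the same SDK calls shown above; the PlaceAnchored name is made up for illustration:

// Hypothetical helper: place a prefab at a hit point and parent it to a new
// anchor so that it follows ARCore's pose corrections over time.
private GameObject PlaceAnchored(GameObject prefab, TrackableHit hit)
{
    var anchor = Session.CreateAnchor(hit.Point, Quaternion.identity);
    return Instantiate(prefab, hit.Point, Quaternion.identity, anchor.transform);
}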

Instantiate Objects

Based on the hit position and the anchor, we can now instantiate a new object from the prefab we want to place in the real world. The next two lines ensure that the object stays flush with the plane while still facing the user.
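An alternative to the LookAt-then-flatten approach is to compute a yaw-only rotation directly, which avoids going through Euler angles twice. A small sketch using standard Unity math:

// Sketch: face the camera while staying upright, without an intermediate LookAt().
Vector3 toCamera = FirstPersonCamera.transform.position - placedObject.transform.position;
toCamera.y = 0.0f;  // ignore the vertical component so the object stays flush with the plane
if (toCamera.sqrMagnitude > 0.0001f)
{
    placedObject.transform.rotation = Quaternion.LookRotation(toCamera);
}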

If your prefab then ends up rotated differently than you expect, check the earlier prefab setup steps again: make sure the top-level game object in your prefab has a position and rotation of 0/0/0, and apply any necessary alignment transformation only to the child object. The parent transform is overwritten by this script, so any adaptations you make there are lost.

The last line attaches the new object to the plane. This ensures that the y-offset (vertical offset) of the game object is adjusted whenever the plane moves. The “PlaneAttachment” script is the one we added to the prefab before; it's a helper script from the example project. It also updates the plane reference whenever a plane is subsumed by another plane (e.g., when ARCore merges two planes), and it only shows the game object while the plane is valid.
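Since PlaneAttachment ships with the ARCore example project, you don't have to write it yourself. For reference, here is a simplified sketch of what such a component could look like; the TrackedPlane members used below (Position, IsValid, SubsumedBy) are assumptions based on the preview SDK, not verified API:

using UnityEngine;
using GoogleARCore;

// Simplified sketch of a plane-attachment component (not the original example script).
public class SimplePlaneAttachment : MonoBehaviour
{
    private TrackedPlane _attachedPlane;
    private float _planeYOffset;

    public void Attach(TrackedPlane plane)
    {
        _attachedPlane = plane;
        // Remember how far above the plane the object was placed.
        _planeYOffset = transform.position.y - plane.Position.y;
    }

    void Update()
    {
        if (_attachedPlane == null)
        {
            return;
        }

        // Follow merged planes: if this plane was subsumed, track its successor.
        while (_attachedPlane.SubsumedBy != null)
        {
            _attachedPlane = _attachedPlane.SubsumedBy;
        }

        // Hide the object while the plane is not valid (e.g., tracking was lost).
        foreach (var childRenderer in GetComponentsInChildren<Renderer>())
        {
            childRenderer.enabled = _attachedPlane.IsValid;
        }

        // Keep the original vertical offset relative to the (possibly updated) plane.
        transform.position = new Vector3(transform.position.x,
            _attachedPlane.Position.y + _planeYOffset, transform.position.z);
    }
}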

Finishing the Scene Setup

Finally, wire up both public properties of our second script on the “Managers” game object: drag the “First Person Camera” from the scene into the first slot, and drag the prefab you want to instantiate into the “Place Game Object” slot; in this case, it's the “BrainPrefab”.

ARCore script: Instantiate Object On Touch

Go to File > Build & Run again to execute the final project on the phone. Move around until ARCore detects some planes. Then, simply tap on the screen to place a 3D object at that position.

The speed of the plane detection is impressive, especially if the floor has enough texture, as in the screenshot below. The tracking is not as accurate as on devices with a depth sensor like the HoloLens: the planes might be larger than the actual surfaces, or ARCore might merge planes that are separate and at slightly different heights (e.g., a table surface and a nearby windowsill). However, given that this runs in real time on a phone with a simple RGB camera instead of active depth sensors, the performance is amazing.

For many scenarios, especially simply placing 3D models in the real world, the tracking is good enough. With ARCore expected to reach more than 100 million phones soon, it'll be interesting to see the first powerful mass-market augmented reality come to life. Combined with Apple's ARKit, AR solutions can then reach most people who own a reasonably new phone, opening the market for developers to create new innovations.

Download the example source code built in this article on GitHub.

Placing 3D models of brains on planes detected by Google ARCore - Screenshot running on the Google Pixel 2

ARCore article series