Environmental HDR Lighting & Reflections in ARCore: Implementation in Unity 3D (Part 3)

How can real-time HDR lighting and reflections be made possible on a smartphone? The previous parts covered the unique properties of human perception, as well as the challenges of capturing the world’s state and applying it to virtual objects. Given these constraints, is it possible at all?

Google found an interesting approach that uses Artificial Intelligence to fill in the missing information. In this article, we’ll take a look at how ARCore handles this. The practical implementation of this research is available in the ARCore SDK for Unity. Based on this, a short hands-on guide demonstrates how to create a sphere that reflects the real world – even though the smartphone only captures a fraction of it.

Google ARCore Approach to Environmental HDR Lighting

To still make environmental HDR lighting possible in real time on smartphones, Google uses an innovative approach, which they also published as a scientific paper. Here, I’ll give you a short, high-level overview of their approach:

First, Google captured a massive amount of training data. The video feed of the smartphone camera recorded both the environment and three different spheres. The setup is shown in the image below.
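
The hands-on part of this article builds on the ARCore SDK for Unity. For reference, the same Environmental HDR estimate is also exposed by the ARCore SDK for Android; the following minimal Kotlin sketch shows how an app could query it per frame. It assumes an already configured ARCore Session and leaves out all rendering:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate
import com.google.ar.core.Session

// Enable Environmental HDR light estimation on an existing session.
fun enableEnvironmentalHdr(session: Session) {
    val config = session.config
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    session.configure(config)
}

// Read the per-frame lighting estimate (call from the render loop).
fun readLightEstimate(frame: Frame) {
    val estimate: LightEstimate = frame.lightEstimate
    if (estimate.state != LightEstimate.State.VALID) return

    // Main directional light: direction (x, y, z) and RGB intensity.
    val direction: FloatArray = estimate.environmentalHdrMainLightDirection
    val intensity: FloatArray = estimate.environmentalHdrMainLightIntensity

    // Ambient spherical harmonics: 3 color channels x 9 coefficients.
    val sphericalHarmonics: FloatArray =
        estimate.environmentalHdrAmbientSphericalHarmonics

    // HDR cube map for reflections: six Image faces that must be closed.
    val cubeMap = estimate.acquireEnvironmentalHdrCubeMap()
    try {
        // Upload the faces to a cube-map texture here.
    } finally {
        cubeMap.forEach { it.close() }
    }
}
```

The main light values drive a virtual directional light, the spherical harmonics approximate the ambient lighting, and the cube map is what ultimately lets a virtual sphere reflect the real world.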

Continue reading “Environmental HDR Lighting & Reflections in ARCore: Implementation in Unity 3D (Part 3)”

Environmental HDR Lighting & Reflections in ARCore: Virtual Lighting (Part 2)

In part 1, we looked at how humans perceive lighting and reflections – vital background for estimating how realistic these cues need to be. The most important goal is that the scene looks natural to human viewers. Therefore, the virtual lighting needs to be closely aligned with real lighting.

But how do you measure lighting in the real world, and how do you apply it to virtual objects?

Virtual Lighting

How do you need to set up virtual lighting to satisfy the criteria mentioned in part 1? Humans recognize if an object doesn’t fit in:

Comparing a simple scene setup to environmental HDR lighting. Image adapted from the Google Developer documentation.

The image above from the Google Developer Documentation shows both extremes. Even though you might still recognize that the rocket is a virtual object in the right image, you’ll need to look a lot harder. The image on the left is clearly wrong, especially due to the misplaced shadow.

Continue reading “Environmental HDR Lighting & Reflections in ARCore: Virtual Lighting (Part 2)”

Environmental HDR Lighting & Reflections in ARCore: Human Perception (Part 1)

Realistically merging virtual objects with the real world in Augmented Reality has a few challenges. The most important:

  1. Realistic positioning, scale and rotation
  2. Lighting and shadows that match the real-world illumination
  3. Occlusion with real-world objects

The first is working very well in today’s AR systems. Number 3, occlusion, is working OK on the Microsoft HoloLens, and it’s soon also coming to ARCore (a private preview is currently running through the ARCore Depth API, which is probably based on the research by Flynn et al.).
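
To give you an idea of what this looks like in practice, here is a minimal Kotlin sketch of depth access through the ARCore SDK for Android once the Depth API is available on a device – an illustration based on the released API, not on the private preview:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Enable depth estimation if the device supports it.
fun enableDepth(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Acquire the latest depth image for the current frame.
fun readDepth(frame: Frame) {
    try {
        frame.acquireDepthImage().use { depthImage ->
            // DEPTH16 format: each pixel holds the distance from the
            // camera in millimeters - the basis for occluding virtual
            // objects behind real geometry.
            val width = depthImage.width
            val height = depthImage.height
        }
    } catch (e: NotYetAvailableException) {
        // Depth needs a few frames of device motion before it's ready.
    }
}
```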

But what about the second item? Google put a lot of effort into this recently. So, let’s look behind the scenes. How does ARCore estimate HDR (high dynamic range) lighting and reflections from the camera image?

Remember that ARCore needs to scale to a variety of smartphones; thus, it must also work on phones that only have a single RGB camera – like the Google Pixel 2.

Continue reading “Environmental HDR Lighting & Reflections in ARCore: Human Perception (Part 1)”

How-To: Retrofit, Moshi, Coroutines & Recycler View for REST Web Service Operations with Kotlin for Android

It might be overwhelming to choose the best way to access a web service from your Android app. Maybe all you want is to parse JSON from a web service and show it in a list in your Kotlin app for Android, while still being future-proof with a library like Retrofit. As a bonus, it’d be great if you could also perform CRUD operations (create, read, update, delete) with the data.

You can choose from basic Java-style HTTP requests, or go up to full-scale MVVM design patterns with the new Android Architecture Components. Your source code will look entirely different depending on which approach you choose – so it’s important to make a good choice right at the beginning.

In this article, I’ll show a walk-through using many of the newest components for a modern solution.
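
To preview where we’re heading, here is a minimal sketch of the central building block: a Retrofit service declared with suspend functions plus a Moshi converter. The Post model and the base URL are placeholders for illustration:

```kotlin
import com.squareup.moshi.Moshi
import com.squareup.moshi.kotlin.reflect.KotlinJsonAdapterFactory
import retrofit2.Retrofit
import retrofit2.converter.moshi.MoshiConverterFactory
import retrofit2.http.Body
import retrofit2.http.DELETE
import retrofit2.http.GET
import retrofit2.http.POST
import retrofit2.http.Path

// Placeholder data class; Moshi maps JSON fields to these properties.
data class Post(val id: Int, val title: String, val body: String)

// CRUD operations as suspend functions: callable from coroutines
// without blocking the main thread (Retrofit 2.6+).
interface PostService {
    @GET("posts")
    suspend fun getPosts(): List<Post>

    @POST("posts")
    suspend fun createPost(@Body post: Post): Post

    @DELETE("posts/{id}")
    suspend fun deletePost(@Path("id") id: Int)
}

// KotlinJsonAdapterFactory enables reflection-based mapping of
// Kotlin data classes; the base URL is a public placeholder API.
val moshi: Moshi = Moshi.Builder()
    .add(KotlinJsonAdapterFactory())
    .build()

val service: PostService = Retrofit.Builder()
    .baseUrl("https://jsonplaceholder.typicode.com/")
    .addConverterFactory(MoshiConverterFactory.create(moshi))
    .build()
    .create(PostService::class.java)
```

An update endpoint works the same way with @PUT, and calling these suspend functions from a coroutine scope keeps the UI thread free while the RecyclerView waits for its data.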

Continue reading “How-To: Retrofit, Moshi, Coroutines & Recycler View for REST Web Service Operations with Kotlin for Android”

Basics of AR: SLAM – Simultaneous Localization and Mapping

In the first part, we took a look at how an algorithm identifies keypoints in camera frames. These form the basis for tracking and recognizing the environment.

For Augmented Reality, the device has to know more: its 3D position in the world. It calculates this through the spatial relationship between itself and multiple keypoints. This process is called “Simultaneous Localization and Mapping” – SLAM for short.

Sensors for Perceiving the World

The high-level view: when you first start an AR app using Google ARCore, Apple ARKit or Microsoft Mixed Reality, the system doesn’t know much about the environment. It starts processing data from various sources – mostly the camera. To improve accuracy, the device combines data from other useful sensors like the accelerometer and the gyroscope.

Based on this data, the algorithm has two aims:

  1. Build a map of the environment
  2. Locate the device within that environment
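
To make these two intertwined aims more tangible, here is a deliberately simplified Kotlin sketch of that loop. All types are invented for illustration; real SLAM systems estimate full 6-DoF poses with probabilistic filters or bundle adjustment rather than a 2D average:

```kotlin
// Invented, minimal 2D types for illustration only.
data class Observation(val descriptor: Long, val camX: Float, val camY: Float)
data class MapPoint(val descriptor: Long, val worldX: Float, val worldY: Float)
data class Pose(val x: Float, val y: Float)

class SimpleSlam {
    // Aim 1: build a map of the environment.
    private val map = mutableMapOf<Long, MapPoint>()

    // Aim 2: locate the device within that map.
    var pose = Pose(0f, 0f)
        private set

    fun processFrame(observed: List<Observation>) {
        // Match observed keypoints against the map via their descriptors.
        val matches = observed.mapNotNull { obs ->
            map[obs.descriptor]?.let { known -> obs to known }
        }

        if (matches.isNotEmpty()) {
            // Localization: estimate the device translation that best
            // explains where the known points appear (simple average).
            val dx = matches.map { (o, k) -> k.worldX - o.camX }.average().toFloat()
            val dy = matches.map { (o, k) -> k.worldY - o.camY }.average().toFloat()
            pose = Pose(dx, dy)
        }

        // Mapping: register newly seen keypoints at their estimated
        // world positions, based on the current pose.
        for (obs in observed) {
            map.getOrPut(obs.descriptor) {
                MapPoint(obs.descriptor, obs.camX + pose.x, obs.camY + pose.y)
            }
        }
    }
}
```
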
Continue reading “Basics of AR: SLAM – Simultaneous Localization and Mapping”

Basics of AR: Anchors, Keypoints & Feature Detection

Creating apps that work well with Augmented Reality requires some background knowledge of the image processing algorithms that work behind the scenes. One of the most fundamental concepts involves anchors. These rely on keypoints and their descriptors, detected in the recording of the real world.

Anchor Virtual Objects to the Real World

AR development APIs hide much of the complexity. As a developer, you simply anchor virtual objects to the world. This ensures that the hologram stays glued to the physical location where you put it.
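
As a concrete example of how little of that complexity reaches the developer: in the ARCore SDK for Android, anchoring comes down to a single call on a hit-test result. A minimal Kotlin sketch, assuming the current ARCore Frame and the screen coordinates of a tap:

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

// Hit-test a screen tap against detected planes and anchor a virtual
// object at the intersection. ARCore keeps the anchor "glued" to the
// physical position as its understanding of the environment improves.
fun anchorAtTap(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    for (hit in frame.hitTest(tapX, tapY)) {
        val trackable = hit.trackable
        if (trackable is Plane && trackable.isPoseInPolygon(hit.hitPose)) {
            return hit.createAnchor()
        }
    }
    return null
}

// Later, per frame: only render while the anchor is being tracked.
fun isAnchorVisible(anchor: Anchor): Boolean =
    anchor.trackingState == TrackingState.TRACKING
```

Continue reading “Basics of AR: Anchors, Keypoints & Feature Detection”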

How to Record a Video from a Unity ARCore App on Android

A video is a great way to showcase your Unity app. To capture the full visual fidelity of your app, you need to record at the highest possible quality with a smooth frame rate.

Several screen recording apps are available in the Google Play Store. However, there’s an easy and completely free way that provides the highest possible quality.

This short guide demonstrates how to record the screen with an APK file generated by Unity. Of course, it works for both AR and non-AR apps.

Continue reading “How to Record a Video from a Unity ARCore App on Android”

Remote ARCore with Unity’s Experimental ARInterface

Overall, the AR ecosystem is still small. Nevertheless, it’s fragmented. Google develops ARCore, Apple creates ARKit and Microsoft is working on the Mixed Reality Toolkit. Fortunately, Unity started unifying these APIs with the ARInterface.

At Unite Austin, two of the Unity engineers introduced the new experimental ARInterface. In November 2017, they released it to the public via GitHub. It looks like this will be integrated into Unity 2018 – the new features of Unity 2018.1 include “AR Crossplatform Kit (ARCore/ARKit API)”.

Remote Testing of AR Apps

The traditional mobile AR app development cycle includes compiling and deploying apps to a real device. That takes a long time and is tedious for quick testing iterations.

A big advantage of ARKit so far has been the ARKit Unity Remote feature. The iPhone runs a simple “tracking” app that transmits its captured live data to the PC. Your actual AR app runs directly in the Unity Editor on the PC, based on the data it gets from the device. Through this approach, you can run the app by simply pressing the Play button in Unity, without native compilation.

This is similar to the Holographic Emulation for the Microsoft HoloLens, which has been available for Unity for some time.

The great news is that the new Unity ARInterface finally adds a similar feature to Google ARCore: ARRemoteInterface. It’s available cross-platform for ARKit and ARCore.

ARInterface Demo App

In this article, I’ll explain the steps to get AR Remote running on Google ARCore. For reference: “Pirates Just AR” also posted a helpful short video on YouTube.

Continue reading “Remote ARCore with Unity’s Experimental ARInterface”

NFC Tags, NDEF and Android (with Kotlin)

In this article, you will learn how to add NFC tag reading to an Android app. The app registers to auto-start when the user taps a specific NDEF NFC tag with the phone. In addition, it reads the NDEF records from the tag.
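
As a taste of the reading part: once Android delivers the NDEF intent, extracting the records only takes a few lines of Kotlin. A minimal sketch – the intent filter in the manifest that enables the auto-start is omitted here:

```kotlin
import android.content.Intent
import android.nfc.NdefMessage
import android.nfc.NfcAdapter

// Extract and parse NDEF messages from the intent that Android
// delivers when the app is started by a tag tap.
fun readNdefRecords(intent: Intent) {
    if (intent.action != NfcAdapter.ACTION_NDEF_DISCOVERED) return

    val rawMessages = intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES)
        ?: return

    rawMessages.filterIsInstance<NdefMessage>().forEach { message ->
        message.records.forEach { record ->
            // Raw payload bytes; for text records, the first byte
            // encodes the status and language-code length.
            println("TNF=${record.tnf}, payload=${record.payload.size} bytes")
        }
    }
}
```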

NFC & NDEF

Apple added support for reading NFC tags with iOS 11 in September 2017. All iPhones starting with the iPhone 7 offer an API to read NFC tags. While Android has included NFC support for many years, this was the final missing piece to bring NFC tag scenarios to the masses.

Continue reading “NFC Tags, NDEF and Android (with Kotlin)”

How To: RecyclerView with a Kotlin-Style Click Listener in Android

In this article, we add a click listener to a RecyclerView on Android. Advanced language features of Kotlin make it far easier than it used to be with Java. However, you need to understand a few core concepts of the Kotlin language.
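
The core idea: the adapter takes a lambda as its click listener and invokes it from the bound ViewHolder. A minimal sketch – the string items and the built-in layout are placeholders for illustration:

```kotlin
import android.view.LayoutInflater
import android.view.ViewGroup
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView

// The adapter takes a Kotlin lambda as the click listener: no extra
// listener interface needed, unlike the classic Java pattern.
class PartAdapter(
    private val items: List<String>,
    private val onClick: (String) -> Unit
) : RecyclerView.Adapter<PartAdapter.ViewHolder>() {

    class ViewHolder(val textView: TextView) : RecyclerView.ViewHolder(textView)

    override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): ViewHolder {
        val view = LayoutInflater.from(parent.context)
            .inflate(android.R.layout.simple_list_item_1, parent, false) as TextView
        return ViewHolder(view)
    }

    override fun onBindViewHolder(holder: ViewHolder, position: Int) {
        val item = items[position]
        holder.textView.text = item
        // The lambda captures the clicked item: a concise Kotlin-style listener.
        holder.itemView.setOnClickListener { onClick(item) }
    }

    override fun getItemCount() = items.size
}
```

Hooked up with trailing-lambda syntax – `PartAdapter(items) { item -> showDetails(item) }`, where showDetails stands in for whatever your app does with the click – the call site stays compact.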

To get started with the RecyclerView, follow the steps in the previous article or check out the finished project on GitHub.

Continue reading “How To: RecyclerView with a Kotlin-Style Click Listener in Android”