Categories: Android App Development, AR / VR

How to Record a Video from a Unity ARCore App on Android

A video is a great way to showcase your Unity app. To capture its full visual fidelity, you need to record at the highest possible quality and with a smooth frame rate.

Several screen recording apps are available in the Google Play Store. However, there’s an easy and completely free way that provides the highest possible quality.

This short guide demonstrates how to record the screen of an app deployed as an APK built with Unity. Of course, this works for both AR and non-AR apps.

Categories: Android App Development, AR / VR

Remote ARCore with Unity’s Experimental ARInterface

Overall, the AR ecosystem is still small. Nevertheless, it is already fragmented: Google develops ARCore, Apple creates ARKit, and Microsoft is working on the Mixed Reality Toolkit. Fortunately, Unity has started unifying these APIs with the ARInterface.

At Unite Austin, two of the Unity engineers introduced the new experimental ARInterface. In November 2017, they released it to the public via GitHub. It looks like this will be integrated into Unity 2018 – the new features of Unity 2018.1 include “AR Crossplatform Kit (ARCore/ARKit API)”.

Remote Testing of AR Apps

The traditional mobile AR app development cycle includes compiling and deploying apps to a real device. That takes a long time and is tedious for quick testing iterations.

A big advantage of ARKit so far has been the ARKit Unity Remote feature. The iPhone runs a simple “tracking” app that transmits the captured live data to the PC. Your actual AR app runs directly in the Unity Editor on the PC, based on the data it receives from the device. With this approach, you can run the app by simply pressing the Play button in Unity, without native compilation.

This is similar to the Holographic Emulation for the Microsoft HoloLens, which has been available for Unity for some time.

The great news is that the new Unity ARInterface finally adds a similar feature to Google ARCore: ARRemoteInterface. It’s available cross-platform for ARKit and ARCore.

ARInterface Demo App

In this article, I’ll explain the steps to get AR Remote running on Google ARCore. For reference: “Pirates Just AR” also posted a helpful short video on YouTube.

Categories: Android App Development, NFC

NFC Tags, NDEF and Android (with Kotlin)

In this article, you will learn how to add NFC tag reading to an Android app. The app registers to auto-start when the user taps a specific NDEF NFC tag with the phone. In addition, it reads the NDEF records from the tag.
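As a rough sketch of the reading part (assuming an activity that is launched through an ACTION_NDEF_DISCOVERED intent filter in the manifest; the class name and log tag here are placeholders, not the article's final code), the NDEF messages arrive as extras of the launch intent:

```kotlin
import android.content.Intent
import android.nfc.NdefMessage
import android.nfc.NfcAdapter
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // If the activity was auto-started by tapping a tag, the intent action
        // is ACTION_NDEF_DISCOVERED and the NDEF messages are attached as extras.
        if (intent?.action == NfcAdapter.ACTION_NDEF_DISCOVERED) {
            readNdefFromIntent(intent)
        }
    }

    private fun readNdefFromIntent(intent: Intent) {
        val rawMessages = intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES) ?: return
        rawMessages.filterIsInstance<NdefMessage>().forEach { message ->
            message.records.forEach { record ->
                // Text records prefix the payload with a status byte and a language
                // code; for a quick look, we simply dump the raw payload as UTF-8.
                Log.d("NfcDemo", "NDEF record payload: ${String(record.payload, Charsets.UTF_8)}")
            }
        }
    }
}
```

To make the app auto-start, the manifest additionally declares an intent filter for android.nfc.action.NDEF_DISCOVERED, matching the MIME type or URI stored on the tag.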

NFC & NDEF

Apple added support for reading NFC tags with iOS 11 in September 2017. All iPhones starting with the iPhone 7 offer an API to read NFC tags. While Android has included NFC support for many years, this was the final missing piece to bring NFC tag scenarios to the masses.

Categories: Android App Development

How To: RecyclerView with a Kotlin-Style Click Listener in Android

In this article, we add a click listener to a RecyclerView on Android. Advanced language features of Kotlin make this far easier than it used to be with Java. However, you need to understand a few core concepts of the Kotlin language.

To get started with the RecyclerView, follow the steps in the previous article or check out the finished project on GitHub.
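To give an idea of where this leads (a minimal sketch, not the article's exact code; Part, PartAdapter and the built-in list item layout are placeholder choices), the adapter simply receives the click behavior as a lambda and calls it from the ViewHolder:

```kotlin
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView

data class Part(val name: String)

class PartAdapter(
    private val parts: List<Part>,
    private val onItemClicked: (Part) -> Unit
) : RecyclerView.Adapter<PartAdapter.PartViewHolder>() {

    inner class PartViewHolder(view: View) : RecyclerView.ViewHolder(view) {
        val nameText: TextView = view.findViewById(android.R.id.text1)

        init {
            // Forward clicks on the row to the lambda supplied by the caller.
            view.setOnClickListener { onItemClicked(parts[bindingAdapterPosition]) }
        }
    }

    override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): PartViewHolder {
        val view = LayoutInflater.from(parent.context)
            .inflate(android.R.layout.simple_list_item_1, parent, false)
        return PartViewHolder(view)
    }

    override fun onBindViewHolder(holder: PartViewHolder, position: Int) {
        holder.nameText.text = parts[position].name
    }

    override fun getItemCount() = parts.size
}
```

The calling activity or fragment then passes its reaction as a trailing lambda, e.g. `PartAdapter(parts) { part -> /* handle click */ }` – no extra listener interface required.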

Updated on December 15th, 2020: the solution projects on GitHub have been migrated to the latest versions and dependencies. Most importantly, the new solutions now also use Jetpack View Bindings instead of Kotlin synthetics. The text in this article is still the original.

Updated on July 4th, 2019: Google is transitioning the additional libraries to AndroidX. Nothing changes in terms of behavior with regard to our example. I’ve updated the source code examples on GitHub to use AndroidX instead of the Android Support libraries.

Categories: Android App Development

Kotlin & RecyclerView for High Performance Lists in Android

RecyclerView is the best approach to show scrolling lists on Android. It ensures high performance & smooth scrolling, while providing list elements with flexible layouts. Combined with modern language features of Kotlin, the code overhead of the RecyclerView is greatly reduced compared to the traditional Java approach.

Updated on December 15th, 2020: the solution projects on GitHub have been migrated to the latest versions and dependencies. Most importantly, the new solutions now also use Jetpack View Bindings instead of Kotlin synthetics. The text in this article is still the original.

Updated on July 4th, 2019: Google is transitioning the additional libraries to AndroidX. Nothing changes in terms of behavior with regard to our example. I’ve updated the source code examples on GitHub to use AndroidX instead of the Android Support libraries.

Sample Project: PartsList – Getting Started

In this article, we’ll walk through a sample scenario: a scrolling list for a maintenance app, listing machine parts: “PartsList”. However, this scenario only affects the strings we use – you can copy this approach for any use case you need.
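As a hedged sketch of the wiring (assuming a layout containing a RecyclerView with the id recycler_view and an adapter like the PartAdapter sketched earlier; all names are placeholders, not necessarily those used in the article), hooking up the list takes only a few lines:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.recyclerview.widget.LinearLayoutManager
import androidx.recyclerview.widget.RecyclerView

class PartsListActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_parts_list)

        // Sample data: in a real maintenance app, this would come from a database or API.
        val parts = listOf(Part("Gear"), Part("Bearing"), Part("Shaft"))

        val recyclerView = findViewById<RecyclerView>(R.id.recycler_view)
        // A vertical LinearLayoutManager produces the classic scrolling list.
        recyclerView.layoutManager = LinearLayoutManager(this)
        // All item views have the same size, which lets RecyclerView optimize further.
        recyclerView.setHasFixedSize(true)
        recyclerView.adapter = PartAdapter(parts) { /* react to item clicks here */ }
    }
}
```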

Categories: App Development, Artificial Intelligence, Digital Healthcare

Using Natural Language Understanding, Part 4: Real-World AI Service & Socket.IO

Updated: January 18th, 2023 – changed info from Microsoft LUIS to Microsoft Azure Cognitive Services / Conversational Language Understanding. 

In this last part, we bring the vital sign checklist to life. Artificial Intelligence interprets assessments spoken in natural language. It extracts the relevant information and manages an up-to-date, browser-based checklist. Real-time communication is handled through Web Sockets with Socket.IO.

The example scenario focuses on a vital signs checklist in a hospital. The same concept applies to countless other use cases.

In this article, we’ll query the Microsoft Azure Conversational Language Understanding Service from a Node.js backend. The results are communicated to the client through Socket.IO.

Connecting Language Understanding to Node.js

In the previous article, we verified that our language understanding service works fine. Now, it’s time to connect all components. The aim is to query our model endpoint from our Node.js backend.

Categories: App Development, Artificial Intelligence, Digital Healthcare

Using Natural Language Understanding, Part 3: Conversational Language Understanding Service

Updated: December 12th, 2022 – changed info from Microsoft LUIS to Microsoft Azure Cognitive Services / Conversational Language Understanding. 

Training Artificial Intelligence to perform real-life tasks has been painful. The latest AI services now offer more accessible user interfaces. These require little knowledge about machine learning. The Microsoft Azure Conversational Language Understanding Service performs an amazing task: interpreting natural language sentences and extracting relevant parts. You only need to provide 5+ sample sentences per scenario.

In this article series, we’re creating a sample app that interprets assessments from vital signs checks in hospitals. It filters out relevant information like the measured temperature or pupillary response. Yet, it’s easy to extend the scenario to any other area.

Language Understanding

After creating the backend service and the client user interface in the first two parts, we now start setting up the actual language understanding service. I’m using the Conversational Language Understanding service from Microsoft, which is based on the Cognitive Services of Microsoft Azure.

Categories: App Development, Artificial Intelligence, Digital Healthcare

Using Natural Language Understanding, Part 2: Node.js Backend & User Interface

Updated: December 12th, 2022 – changed info from Microsoft LUIS to Microsoft Azure Cognitive Services / Conversational Language Understanding. 

The vision: automatic checklists, filled out by simply listening to users explaining what they observe. The sample app is based on a lightweight architecture: HTML5, Node.js & the Microsoft Conversational Language Understanding Cognitive Service in the cloud.

Such an app would be incredibly useful in a hospital, where nurses need to perform and log countless vital sign checks with patients every day.

In part 1 of this article series, I explained the overall architecture of the service. In this part, we get hands-on and start implementing the Node.js-based backend. It will ultimately handle all the central messaging. It communicates both with the client user interface running in a browser and with the Microsoft Conversational Language Understanding service in the Azure cloud.

Creating the Node Backend

Node.js is a great fit for such a service. It’s easy to set up and uses JavaScript for development. The code runs locally during development, allowing rapid testing, yet it’s easy to deploy to a dedicated server or the cloud later.

I’m using the latest version of Node.js LTS (currently version 18) and the free Visual Studio Code IDE for editing the script files.

Categories: Android App Development, AR / VR

Augmented Reality Christmas Tree with Google ARCore Developer Preview 2 – in 5 Minutes

We don’t have a Christmas tree in our apartment. But in today’s world, this is what Augmented Reality is for, right? Therefore, I decided to create an AR Christmas Tree in 5 minutes. This also gave me an opportunity to check out the new Google ARCore Developer Preview 2.

Christmas Tree 3D Model

First off, you need a 3D model of a Christmas tree. Two of the most accessible sources are Google Poly and Microsoft Remix 3D. Sticking to models created directly by Google and Microsoft, these two are the choices:

Christmas Tree by Poly by Google

Categories: App Development, Artificial Intelligence, Digital Healthcare

Using Natural Language Understanding, Part 1: Introduction & Architecture

Updated: December 12th, 2022 – changed info from Microsoft LUIS to Microsoft Azure Cognitive Services / Conversational Language Understanding. 

During the last few years, cognitive services have become immensely powerful. Especially interesting is natural language understanding. Using the latest tools, training the computer to understand spoken sentences and to extract information is reduced to a matter of minutes. We as humans no longer need to learn how to speak with a computer; it simply understands us.

I’ll show you how to use the Conversational Language Understanding Cognitive Service from Microsoft. The aim is to build an automated checklist for nurses working at hospitals. Every morning, they record the vital signs of every patient. At the same time, they document the measurements on paper checklists.

With the new app developed in this article, the process is much easier. While checking the vital signs, nurses usually talk to the patients about their assessments. The “Vital Signs Checklist” app filters out the relevant data (e.g., the temperature or the pupillary response) and marks it in a checklist. Nurses no longer have to pick up a pen to manually record the information.

The Result: Vital Signs Checklist

In this article, we’ll create a simple app that uses the conversational language understanding APIs of the Microsoft Azure Cognitive Services. The service extracts the relevant data from freely written or spoken assessments.