Node.js and Cloud NoSQL Databases: Azure Cosmos DB

Azure Cosmos DB Quickstart

Learn how to access a cloud-based NoSQL database from Node.js. Azure Cosmos DB stores documents (e.g., JSON), scales out for better performance, and adds geo-redundancy with a single click. On top of that, its access interface supports the familiar SQL query syntax.

This guide uses the latest Azure Cosmos DB JavaScript module, which reached its final release just 17 days ago. The article is also based on the ES 2017 standard: the async/await syntax keeps the code short and readable. In contrast to many other tutorials, it focuses on the minimum code required to understand the concepts.
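To give a first impression of how compact this becomes, here is a minimal sketch using the current @azure/cosmos client with async/await. The endpoint, key, and the database and container names are placeholders, and the exact API surface may differ slightly depending on the module version you install.

const { CosmosClient } = require("@azure/cosmos");

// Placeholder credentials – use the endpoint and primary key from your Azure portal.
const client = new CosmosClient({
  endpoint: "https://my-account.documents.azure.com",
  key: "<primary-key>"
});

async function main() {
  // Create (or open) a database and a container.
  const { database } = await client.databases.createIfNotExists({ id: "demo-db" });
  const { container } = await database.containers.createIfNotExists({ id: "items" });

  // Store a JSON document and read it back with a familiar SQL query.
  await container.items.create({ id: "1", name: "Vital Signs", type: "checklist" });
  const { resources } = await container.items
    .query("SELECT * FROM c WHERE c.type = 'checklist'")
    .fetchAll();
  console.log(resources);
}

main().catch(console.error);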

The complete source code of this article is available on GitHub.

Continue reading “Node.js and Cloud NoSQL Databases: Azure Cosmos DB”

Asynchronous JavaScript with Promises & Async/Await in JavaScript

From the perspective of a C# developer, the introduction of async/await in the latest JavaScript version (ECMAScript 2017+) is a welcome addition: it makes asynchronous code a lot cleaner and more readable.

However, plenty of legacy libraries and code snippets are still out there, so it’s usually difficult to go all-in with async/await. This article is a short introduction to error handling and the evolution of asynchronous development in JavaScript.

Error Handling in JavaScript

Most asynchronous operations, such as web requests, can fail. Thus, let’s spend a minute reviewing the basics of how errors flow through the code.
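As a minimal, self-contained sketch of that flow, the same fictitious fetchData() operation is handled below once with classic promise chaining and once with async/await plus try/catch; the function name and timings are purely illustrative.

// A fictitious asynchronous operation that may fail.
function fetchData(shouldFail) {
  return new Promise((resolve, reject) => {
    setTimeout(() => shouldFail ? reject(new Error("Network error")) : resolve("payload"), 100);
  });
}

// Classic promise chaining: errors end up in .catch().
fetchData(true)
  .then(data => console.log("then:", data))
  .catch(err => console.error("caught via .catch():", err.message));

// ES 2017 async/await: the same flow reads almost like synchronous code.
async function run() {
  try {
    const data = await fetchData(false);
    console.log("await:", data);
  } catch (err) {
    console.error("caught via try/catch:", err.message);
  }
}
run();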

Continue reading “Asynchronous JavaScript with Promises & Async/Await in JavaScript”

Using Natural Language Understanding, Part 4: Real-World AI Service & Socket.IO

The final vital sign checklist app with natural language understanding

In this last part, we bring the vital signs checklist to life. Artificial intelligence interprets assessments spoken in natural language, extracts the relevant information, and keeps a browser-based checklist up to date. Real-time communication is handled through WebSockets with Socket.IO.

The example scenario focuses on a vital signs checklist in a hospital. The same concept applies to countless other use cases.

In this article, we’ll query the Microsoft LUIS Language Understanding service from a Node.js backend. The results are communicated to the client through Socket.IO.
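As a rough sketch of that pipeline – the endpoint URL, app ID, key, and event names below are placeholders, not the exact code from the article – the backend forwards the spoken sentence to the LUIS REST endpoint and broadcasts the interpreted result to every connected browser:

const http = require("http");
const fetch = require("node-fetch");   // any HTTP client works here
const socketIo = require("socket.io");

const server = http.createServer().listen(3000);
const io = socketIo(server);

// Placeholder LUIS endpoint – region, app ID, and key come from your Azure portal.
const LUIS_URL = "https://westeurope.api.cognitive.microsoft.com/luis/v2.0/apps/<app-id>"
  + "?subscription-key=<key>&q=";

io.on("connection", socket => {
  // The browser sends the transcribed sentence, e.g. "temperature is 38.2 degrees".
  socket.on("assessment", async sentence => {
    const response = await fetch(LUIS_URL + encodeURIComponent(sentence));
    const result = await response.json();
    // Broadcast intent and entities so every open checklist updates in real time.
    io.emit("checklistUpdate", {
      intent: result.topScoringIntent,
      entities: result.entities
    });
  });
});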

Connecting LUIS to Node.js

In the previous article, we verified that our LUIS service works fine. Now, it’s time to connect all components. The aim is to query LUIS from our Node.js backend. Continue reading “Using Natural Language Understanding, Part 4: Real-World AI Service & Socket.IO”

Using Natural Language Understanding, Part 3: LUIS Language Understanding Service

Pre-built entities in intents, in use with LUIS

Training artificial intelligence to perform real-life tasks used to be painful. The latest AI services offer far more accessible user interfaces that require little knowledge about machine learning. The Microsoft LUIS service (Language Understanding Intelligent Service) performs an amazing task: it interprets natural language sentences and extracts the relevant parts. You only need to provide five or more sample sentences per scenario.

In this article series, we’re creating a sample app that interprets assessments from vital signs checks in hospitals. It extracts the relevant information, such as the measured temperature or the pupillary response. Yet, the scenario is easy to extend to any other area.
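To make that concrete, a LUIS result for such a spoken assessment looks roughly like the following JSON; the intent and entity names are illustrative and not necessarily those used in this series.

{
  "query": "the patient's temperature is 38.2 degrees and both pupils react to light",
  "topScoringIntent": { "intent": "RecordVitalSign", "score": 0.97 },
  "entities": [
    { "entity": "38.2 degrees", "type": "builtin.temperature" },
    { "entity": "pupils react to light", "type": "PupillaryResponse" }
  ]
}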

Language Understanding

After creating the backend service and the client user interface in the first two parts, we now start setting up the actual language understanding service. I’m using the LUIS Language Understanding service from Microsoft, which is part of the Microsoft Azure Cognitive Services. Continue reading “Using Natural Language Understanding, Part 3: LUIS Language Understanding Service”

Using Natural Language Understanding, Part 2: Node.js Backend & User Interface

User Interface for our Vital Sign Checklist app that uses the LUIS Language Understanding Service from Microsoft

The vision: automatic checklists, filled out by simply listening to users explain what they observe. The sample app is based on a lightweight architecture: an HTML5 client, a Node.js backend, and the LUIS service in the cloud.

Such an app would be incredibly useful in a hospital, where nurses need to perform and log countless vital sign checks with patients every day.

In part 1 of the article, I explained the overall architecture of the service. In this part, we get hands-on and start implementing the Node.js-based backend. It will ultimately handle all the central messaging, communicating both with the client user interface running in a browser and with the Microsoft LUIS language understanding service in the Azure cloud.

Creating the Node Backend

Node.js is a great fit for such a service. It’s easy to set up and uses JavaScript for development. The code runs locally during development, which allows rapid testing, and it’s easy to deploy to a dedicated server or the cloud later.
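To illustrate how little boilerplate such a backend needs, here is a minimal sketch based on Express; the package choice, folder name, and port are assumptions rather than the exact setup used in the article.

// server.js – a minimal backend serving the HTML5 client
const express = require("express");
const app = express();

// Serve the browser-based user interface from the "public" folder.
app.use(express.static("public"));

app.listen(8080, () => console.log("Backend running on http://localhost:8080"));

Running node server.js locally is enough to test the user interface in a browser; the same script can later be deployed unchanged to a dedicated server or to the cloud.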

I’m using the latest version of Node.js (currently 9.3) and the free Visual Studio Code IDE for editing the script files. Continue reading “Using Natural Language Understanding, Part 2: Node.js Backend & User Interface”

Using Natural Language Understanding, Part 1: Introduction & Architecture

Vital Signs Checklist Architecture

During the last few years, cognitive services have become immensely powerful. Natural language understanding is especially interesting: with the latest tools, training the computer to understand real spoken sentences and to extract information takes only a matter of minutes. We as humans no longer need to learn how to speak to a computer; it simply understands us.

I’ll show how to use the Language Understanding Cognitive Service (LUIS) from Microsoft. The aim is to build an automated checklist for nurses working at hospitals. Every morning, they record the vital signs of every patient. At the same time, they document the measurements on paper checklists.

With the new app developed in this article, the process becomes much easier. While checking the vital signs, nurses usually talk to the patients about their assessments. The “Vital Signs Checklist” app extracts the relevant data (e.g., the temperature or the pupillary response) and marks it in a checklist. Nurses no longer have to pick up a pen to record the information manually.

The Final Result: Vital Signs Checklist

In this article, we’ll create a simple app that uses the natural language understanding APIs (“LUIS”) of the Microsoft Cognitive Services on Microsoft Azure. The service extracts the relevant data from freely spoken assessments.

LUIS just moved from preview to general availability. This important milestone brings SLAs and additional availability regions worldwide. So, it’s a great time to start using it! Continue reading “Using Natural Language Understanding, Part 1: Introduction & Architecture”

Basics of Web Technology: HTML5, CSS, JavaScript, Accessibility & WordPress

Basics of Web Technologies

It’s important to know the basics of how the web works. In one of the introductory lectures at Digital Healthcare, we take a look at HTML5, CSS, JavaScript & more. Understanding basic web technology is required to ensure that websites offer excellent usability as well as accessibility.

This includes making sure that navigating the page works with screen readers, and that the design works well for people with, for example, a (color) vision deficiency. Accessibility is especially important for websites and mobile apps of public sector bodies, based on a new EU directive that will soon come into effect in all member states. Continue reading “Basics of Web Technology: HTML5, CSS, JavaScript, Accessibility & WordPress”