Photo by Krisztián Korhetz on Unsplash

And How Does CX Compare To Other Environments…

Introduction

Looking at a previous Medium story I wrote on Dialogflow, which is now referred to as Dialogflow ES, the one big issue I raised was the lack of a dialog development and management environment.

Google Dialogflow CX Main Console

And that ES would have to be used as an API, with native code handling dialog and state management.

This has been solved with Dialogflow CX.

In other words, an interface where the state of the conversation can be managed.
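
As a minimal sketch (assuming the google-cloud-dialogflowcx Python client; the project, location, agent and session IDs below are placeholders), a single conversational turn against a CX agent looks roughly like this. The point is that the agent itself tracks the active flow and page across turns, rather than your application code having to do so:

# Minimal sketch: one conversational turn against a Dialogflow CX agent.
# Assumes `pip install google-cloud-dialogflow-cx` and application default credentials.
from google.cloud import dialogflowcx_v3 as cx

# Placeholder identifiers: replace with your own project, location, agent and session.
project, location, agent, session = "my-project", "us-central1", "my-agent-id", "session-123"

client = cx.SessionsClient(
    client_options={"api_endpoint": f"{location}-dialogflow.googleapis.com"}
)

response = client.detect_intent(
    request=cx.DetectIntentRequest(
        session=client.session_path(project, location, agent, session),
        query_input=cx.QueryInput(
            text=cx.TextInput(text="I would like to book a flight"),
            language_code="en",
        ),
    )
)

# The agent, not the calling code, keeps track of conversational state (flow and page).
print(response.query_result.intent.display_name)
print(response.query_result.current_page.display_name)

In ES, the equivalent state (which step of the conversation the user is on) would have had to live in the calling application.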

Currently there are five distinct groupings in the market when it comes to approaching dialog design and development…

Dialog configurations, design canvas, native code, ML stories… …



Actions are new in Watson Assistant… here is what I have learned from using them.

Photo by David Monje on Unsplash

Introduction

A new feature called Actions has been added to IBM Watson Assistant. This new feature allows users to develop dialogs rapidly.

Moving steps in an action to alter the sequence of conversational events

The approach IBM has taken with Actions is extremely non-technical in nature. The interface is intuitive and requires virtually no prior development knowledge or training. User input variables (entities) are picked up automatically with a descriptive reference.

Conversational steps can be freely rearranged and moved to update the flow of the dialog.

Updates can be saved automatically, and machine learning takes place in the background.

And the application (action) can be tested in a preview pane. …


Photo by Beeline Navigation on Unsplash

An Enhanced Environment for Telephony, Collaboration, Design Canvases And More

Introduction

Currently Google offers three chatbot or Conversational UI development environments: Actions SDK, the familiar Dialogflow ES (Essentials) and the newly added Dialogflow CX (Customer Experience).

In principle, the main differences between ES and CX are:

  • Cost
  • Collaboration
  • Enterprise Orientated: Google Cloud Integration & Cognitive Services
  • Design Canvas
  • Telephony & Enterprise Integration

Cost

From the two tables below, detailing cost, it is evident that the cost of running CX is exorbitant, to say the least. The only explanation I can think of is that CX is aimed at an enterprise environment and implementation, whereas ES often serves as a starting point for individual users. …


Photo by Nick Fewings on Unsplash

10 Elements Of A Conceptual Process To Derive General Conversational Rules & Concepts

Introduction

When creating, or rather crafting, a chatbot conversation, we as designers must draw inspiration and guidance from real-world conversations.

Still life by the Dutch painter Henk Helmantel. I found viewing his paintings in real life jarring: you know it is an abstraction of reality, yet it appears to be real.

Elements of human conversation should be identified and abstracted to be incorporated in our chatbot conversation.

General rules and concepts of human conversations must be derived and implemented via technically astute means.

Below I list 10 elements of human conversation which can be incorporated into a Conversational AI interface. Conversational designers want users to speak to their chatbot as they would to a human… hence it is time for the chatbot to converse in a more human-like way.

Christoph Niemann has fascinating ideas on abstraction and when visual design becomes too abstract.



And why it matters for capturing unstructured data accurately and efficiently.


Introduction

Looking at the chatbot development tools and environments currently available, there are three problems that require a solution:

  • Compound contextual entities.
  • Entity decomposition.
  • Doing away with what is known as rigid state machine dialog management.

In this story I will focus only on entities, and on how entities are managed in three conversational technology environments.


Entities 101

What is an entity?

The three words used most often in relation to chatbots are:

  • utterances,
  • intents and
  • entities.

An utterance is really anything the user says. The utterance can be a sentence, in some cases a few sentences, or simply a word or a phrase. During the design phase, you anticipate what your users might say to your bot. …
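
To make the three terms concrete, here is a toy, framework-agnostic illustration of how an NLU engine might represent one parsed utterance (the intent and entity names are invented for the example):

# Toy illustration only: one utterance, the intent it maps to, and the entities extracted from it.
utterance = "I want to fly from Cape Town to London next Friday"

parsed = {
    "intent": {"name": "book_flight", "confidence": 0.92},
    "entities": [
        {"entity": "city", "role": "origin", "value": "Cape Town"},
        {"entity": "city", "role": "destination", "value": "London"},
        {"entity": "date", "value": "next Friday"},
    ],
}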


Photo by Ant Rozetsky on Unsplash

How To Handle User Conversations Which Are Out-Of-Domain

Introduction

How do you develop for user input which is not relevant to your design…

In general, chatbots are designed and developed for a specific domain. These domains are narrow and applicable to the concern of the organization they serve. Hence chatbots are custom and purpose-built as an extension of the organization’s operation, usually to allow customers to self-service.

https://www.wordnik.com/words/domain

As an added element to make the chatbot more interactive and lifelike, and to anthropomorphize the interface, small talk (also referred to as chitchat) is introduced.

But what happens if a user utterance falls outside this narrow domain? With most implementations the highest-scoring intent is assigned to the user’s utterance, in a frantic attempt to field the query. …
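
A common mitigation, sketched below with illustrative thresholds and function names (nothing here is tied to a specific framework), is to only accept the top-scoring intent when its confidence clears a threshold and beats the runner-up by a clear margin; everything else is routed to an explicit out-of-domain handler:

def route_intent(scores, threshold=0.6, margin=0.15):
    """Decide whether the top intent is trustworthy, or the input is out-of-domain.

    scores: dict mapping intent name -> confidence, as returned by the NLU engine.
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top_intent, top_score = ranked[0]
    second_score = ranked[1][1] if len(ranked) > 1 else 0.0

    # Low absolute confidence, or two intents scoring too close together,
    # suggests the utterance falls outside the designed domain.
    if top_score < threshold or (top_score - second_score) < margin:
        return "out_of_domain_fallback"
    return top_intent


# Example: a frantic "best guess" of 0.31 is rejected instead of being served to the user.
print(route_intent({"check_balance": 0.31, "report_fraud": 0.28, "small_talk": 0.22}))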


Photo by Ståle Grut on Unsplash

Actions Are New In Watson Assistant… This Is What I Have Learnt

Introduction

A new feature has been added to IBM Watson Assistant called Actions. This new feature allows users to develop dialogs in a rapid fashion.

Moving Steps In An Action To Alter The Sequence Of Conversational Events

The approach taken with Actions is of an extremely non-technical nature. The interface is intuitive and requires virtually no prior development knowledge or training. User input variables (entities) are picked up automatically with a descriptive reference.

Conversational steps can be re-arranged and moved freely to update the flow of the dialog.

Updates can be saved automatically, and machine learning takes place in the background.

And the application (action) can be tested in a preview pane.

There is something about Actions which reminds me of Microsoft’s Power Virtual Agents interface. The same general idea is there, but with Watson the interface is simpler and more minimalistic, and perhaps a more natural extension of the current functionality. …


Photo by Christian Storz on Unsplash

This Is How To Enhance the IBM Watson NLU API With Watson Knowledge Studio ML Models

Introduction

You can extend Natural Language Understanding with custom models to support specific features.

When using an NLU API, there are often generic and general Natural Language Understanding features available.

NLU from the IBM Cloud services list.

One example of such an interface is spaCy, where relatively advanced functionality is available out of the box.

IBM Watson Natural Language Understanding is another such example, where on activation you have access to considerable NLU capability.

In certain use cases and instances the standard approach is sufficient. …
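
Where it is not, the generic model can be swapped for a custom Watson Knowledge Studio model by referencing it from the analyze call. A minimal sketch with the ibm-watson Python SDK (the API key, service URL, version date, model ID and sample sentence below are placeholders):

# Minimal sketch: calling Watson NLU with a custom Watson Knowledge Studio model.
# Assumes `pip install ibm-watson`; all credentials and IDs are placeholders.
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, EntitiesOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

nlu = NaturalLanguageUnderstandingV1(
    version="2021-08-01",
    authenticator=IAMAuthenticator("your-api-key"),
)
nlu.set_service_url("your-nlu-service-url")

result = nlu.analyze(
    text="The patient was prescribed 20mg of medication X for hypertension.",
    features=Features(
        # Point the entities feature at the custom WKS model instead of the system model.
        entities=EntitiesOptions(model="your-wks-model-id")
    ),
).get_result()

for entity in result.get("entities", []):
    print(entity["type"], "->", entity["text"])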


Photo by Ma Fushun on Unsplash

Iterative Prototyping Is Useful In Developing Conversational AI Applications

Introduction

Getting started with chatbots in particular and Conversational AI in general can be daunting. You might have heard about chatbots and have a basic understanding of the gist. And now the next step is to get started by building something…but how?

Rasa’s Prototype Training Indicator.

Questions often asked include:

  • How do I access the software?
  • Do I need a GPU or extensive computing power?
  • Will it be expensive?
  • Do I need a specific design or prototyping tool?
  • How do I gather training data?
  • Do only the big players have what I need, i.e. Google, Facebook or AWS?
  • Where and how do I host my bot? …



Calculate the effort a customer has to make to resolve their queries with your chatbot.

Photo by Robert Tjalondo on Unsplash

Introduction

  • What is customer effort, and how can it be measured in chatbot conversations?
  • And how can disambiguation improve it?
  • Can machine learning be employed to reduce customer effort over time?

Here we will look at three elements which, combined, have a big impact on the conversational experience you are building:

  • Disambiguation.
  • Machine learning.
  • Customer effort.

What is disambiguation?

At the bottom of this page there are links to detailed articles on disambiguation. In summary, however…

When a user enters an ambiguous utterance, instead of defaulting to the intent with the highest score, the chatbot checks the differences between the highest intent scores. …
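
The underlying idea can be sketched as follows (the threshold and intent names are illustrative, not Watson Assistant’s actual settings): when several intents score within a narrow band of each other, offer them back to the user as options instead of committing to one.

def disambiguate(scores, spread=0.1, max_options=3):
    """Return a single intent, or a short list of candidate intents to offer the user."""
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    top_score = ranked[0][1]

    # Collect every intent whose score sits within `spread` of the winner.
    candidates = [name for name, score in ranked if top_score - score <= spread]

    if len(candidates) == 1:
        return {"action": "answer", "intent": candidates[0]}
    # Ambiguous: ask the user to choose; the extra turn is the customer-effort trade-off.
    return {"action": "clarify", "options": candidates[:max_options]}


print(disambiguate({"card_lost": 0.62, "card_stolen": 0.58, "new_card": 0.20}))
# -> {'action': 'clarify', 'options': ['card_lost', 'card_stolen']}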

About

Cobus Greyling

NLP/NLU, Chatbots, Voice, Conversational UI/UX, CX Designer, Developer, Ubiquitous User Interfaces. www.cobusgreyling.me
