How Did We Get From Siloed Apps To Siloed Chatbots?

Why the chatbot ecosystem needs the kind of fragmentation we saw with Apps

The Problem With Apps

In June 2020 I published a Medium post asking the question: what is wrong with Apps?

In that article I argued that Apps force users into a siloed approach to accessing their data and services.

With Apps, boundaries are placed around our data, and we as users are constrained by a small GUI. From this GUI we need to select a silo (an App) and drill down through a narrow, constrained user experience, navigating top-down via graphic affordances like tabs, buttons and menu trees to reach the desired service or function.

I also argued that Apps draw users into a synchronous, single-threaded, narrow “domain of squares”, while what we really want is multi-threaded, asynchronous conversations.

Chatbots Fragmented The App Landscape

Some could not see a world beyond Apps, and envisaged ever-larger screens for a richer user experience. But all along we wanted to have conversations, in text and speech. Hence we witnessed Conversational AI (CAI) fragment the App landscape.

Chatbots ushered in an alternative: the option to host functionality within messaging platforms. Companies could choose to host all of their functionality in a chatbot, or only a portion of it.

A few advantages of chatbots compared to Apps are:

  • Meeting users where they spend most of their time, in their favourite messaging app.
  • No download friction: no jumping from app to app, no on-device version control, etc.
  • Users can have synchronous or asynchronous conversations, compound intents, etc.
  • Built-in disambiguation, digression and other conversational elements.
  • Chatbots can respond with multi-turn dialog, or any other conversational component, like images, cards, web-views, etc.
  • Conversational interfaces have the advantage of invisible design affordances, which avoids the pitfalls of a crowded GUI.
  • Chatbots are a necessary next step in the evolution of the human-machine interface. As seen below: ⬇️
The evolution of the human-machine interface, from the command line through to where we are today with chatbots and voicebots.
  • Part of ambient orchestration is the ability of a bot to read gestures, facial expressions, etc.

Chatbots Need Fragmentation

I would argue that chatbots need fragmentation on two fronts…

1️⃣ Implementation

Enterprises are again implementing chatbots in a siloed manner, with segmented and separate chatbots for different departments. I have seen large corporates run one chatbot for HR, another for IT support, another for customer-facing queries, another for agent assist… the list goes on.

Often chatbot implementations are also siloed in terms of the medium they are surfaced on, with no underlying orchestration. There are separate chatbot squads for Messenger, WhatsApp, Web Chat, etc., with no overarching shared strategy for designing and sharing NLU training data and process flows, or for building, scaling and leveraging the CAI ecosystem.
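The alternative to per-channel squads can be sketched as a thin adapter per channel feeding one shared bot core. This is a hypothetical illustration, not any specific platform's API: the channel names, payload fields and the `handle` logic are all assumptions made for the sketch.

```python
from dataclasses import dataclass

# Hypothetical orchestration sketch: each channel squad writes only a
# thin adapter that normalises its payload into a common Message, so a
# single shared core (NLU + dialog) serves every channel.

@dataclass
class Message:
    channel: str
    user_id: str
    text: str

def handle(message: Message) -> str:
    """One shared core: the same intents and flows for every channel."""
    text = message.text.lower()
    if "balance" in text:
        return "Your balance is available in the account menu."
    return "Sorry, I didn't understand that."

# Channel adapters: the only channel-specific code in the system.
def from_whatsapp(payload: dict) -> Message:
    return Message("whatsapp", payload["from"], payload["body"])

def from_webchat(payload: dict) -> Message:
    return Message("webchat", payload["session"], payload["message"])

print(handle(from_whatsapp({"from": "u1", "body": "What is my balance?"})))
print(handle(from_webchat({"session": "s9", "message": "balance please"})))
```

The point of the sketch is that training data, flows and fixes live in one place (`handle`), rather than being duplicated per channel.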

2️⃣ Technology

Companies are starting to realise that a single Conversational AI platform will not suffice. The approach of a singular CAI development framework is being fragmented by the need for voicebots and by more use-case-specific Contact Centre AI (CCAI) demands.

Voicebots in particular demand elements like automatic speech recognition (ASR) and speech synthesis. There is also the emergence of NLU Design latent spaces: no-code environments where intent-driven design and development can be used to establish data best practice, converting unstructured data into NLU training data.
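The core idea of converting unstructured data into NLU training data can be sketched as grouping raw utterances into candidate intents. This is a deliberately simplified stand-in: real NLU Design tooling uses embeddings and latent spaces, whereas the sketch below uses plain string similarity, and all names and the threshold are assumptions.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Crude stand-in for semantic similarity in a latent space.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def group_utterances(utterances, threshold=0.6):
    """Greedily cluster raw utterances; each cluster seeds one intent."""
    clusters = []
    for utt in utterances:
        for cluster in clusters:
            if similarity(utt, cluster[0]) >= threshold:
                cluster.append(utt)
                break
        else:
            clusters.append([utt])
    return clusters

# Unstructured input, e.g. mined from chat logs or call transcripts.
raw = [
    "I want to reset my password",
    "How do I reset my password?",
    "Where is my order",
    "Track my order status",
]

for i, cluster in enumerate(group_utterances(raw)):
    print(f"candidate_intent_{i}: {cluster}")
```

Each resulting cluster is a candidate intent whose member utterances become its NLU training examples, which a designer would then review and name.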

Organisations are realising that customer needs and expectations demand an astute Conversational AI strategy built on a fragmented approach: one where the various CAI demands are segmented into smaller use-cases, key challenges are identified, and best-in-class tools are put to work on each.

Part of this is determining which components, such as an ASR acoustic model or NLU Design data, can be reused across other CAI-related implementations.

As Gartner has noted, organisations fail at CAI when they start with the technology instead of starting with an understanding of their customers’ needs.

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language, ranging from LLMs, chatbots, voicebots and development frameworks to data-centric latent spaces and more.

https://www.linkedin.com/in/cobusgreyling