Why I love working on NLP, CX Agents and Automation

This is a brief overview of why I love working on NLP, CX Agents and Automation, and of my journey building StateSet ReSponse.

Computational Linguistics - the branch of linguistics in which the techniques of computer science are applied to the analysis and synthesis of language and speech.

Prolog - a logic programming language well-suited for developing logic-based artificial intelligence applications.

In 2013-2014 I was taking my first computer science classes at the University of Arizona, including an introduction to computational linguistics. The class taught me how to think about building grammar parsers using a logic programming language called Prolog. My professor helped us reason about how we humans interpret language, meaning and sentiment. It reminded me of elementary school grammar trees, but at a lower level: constructing a language interpretation engine, not just breaking down a single sentence. That semester we built a language parser that translated English to Japanese. The course also covered how to handle idioms and other more nuanced ways we express meaning with words. Around the same time I read the book The Stuff of Thought by Steven Pinker. My notes at the time:

Semantics are the human window into the language of thought.

We understand entities in terms of four things:

who or what brought it about

what it's made of

what shape it has

what it's for

To ease the listener's understanding of a coinage, we create a metaphor that reminds them of the idea and, we hope, evokes a similar idea in their mind. Then at some point the metaphorical ladder is kicked out, and the word just is. At that point everyone agrees on and understands what you mean.

These ideas about semantics, intent and how they relate to computers were fascinating. I really wanted to understand the field of computational linguistics and eventually apply it. Here is an example of the English-language parser module we built:

:- module(english, []).

% Clauses with complementizers (S-bar), plus object- and
% subject-relative variants.
sbar(sbar(C,S)) --> complementizer(C), s(S).
objrel_sbar(sbar(C,S)) --> complementizer(C), objrel_s(S).
subjrel_sbar(sbar(C,S),Number,Person) -->
    complementizer(C), subjrel_s(S,Number,Person).

% Wh-questions ("what did the dog chase?"): the wh-phrase fills an
% argument slot of the predicate-argument structure.
sbarq(PA) --> whnp(X), sq(PA), {arg(2,PA,X)}.
sbarq(PA) --> whnp(X), sq(PA), {arg(3,PA,X)}.
whnp(X) --> wp(X).
wp(what) --> [what].
wp(who) --> [who].

% Inverted question clauses, with do-support and agreement checks.
sq(predarg(V+TNS,NP,NP2,yesno)) -->
    v_do(do+TNS,Tag),
    np(NP,Person,Number,nom),
    verb(V,vb),
    np(NP2,_,_,acc),
    {check(Person,Number,Tag)}.
sq(predarg(V,_,NP,wh)) -->
    verb(V,Tag), np(NP,_,_,acc), {check(3,sg,Tag)}.
sq(PA) -->
    v_do(do+TNS,Tag), np(NP,Person,Number,nom),
    vp(PA,vb,V),
    {check(Person,Number,Tag), arg(2,PA,NP), arg(1,PA,V+TNS)}.

% Do-support auxiliaries carrying tense and agreement tags.
v_do(do+past,vbd) --> [did].
v_do(do+pres,vbz) --> [does].
v_do(do+pres,vbp) --> [do].

% Declarative sentences: active, progressive and passive.
s(PA) --> np(NP,Person,Number,nom), vp(PA,Tag,Number),
    {arg(2,PA,NP), check(Person,Number,Tag)}.
s(s(NP,VP)) --> np(NP,Person,Number,nom), vp_progressive(VP,Tag,Number),
    {check(Person,Number,Tag)}.
s(PA) --> np(NP,Person,Number,nom), vp_passive(PA,Tag,Number),
    {arg(3,PA,NP), check(Person,Number,Tag)}.
objrel_s(s(NP,VP)) -->
    np(NP,Person,Number,nom), objrel_vp(VP,Tag,Number),
    {check(Person,Number,Tag)}.
subjrel_s(s(NP,VP),Number,Person) -->
    empty_np(NP), vp(VP,Tag,Number), {check(Person,Number,Tag)}.

% Verb phrases; note the idiom "kicked the bucket" parsed as die+past.
vp(predarg(V,_,none,decl),Tag,_) --> verb_intran(V,Tag).
vp(predarg(die+past,_,none,decl),vbd,_) --> [kicked,the,bucket].
vp(predarg(_,_,_,wh),Tag,V) --> verb(V,Tag).
vp(predarg(V,_,NP,decl),Tag,_) --> verb(V,Tag), np(NP,_,_,acc).
vp(vp(V,VP),Tag,_) --> v_have(V,Tag), vp_progressive(VP,vbn,_).
vp(vp(V,VP),Tag,_) --> v_have(V,Tag), vp_main(VP,vbn,_).
vp(vp(V,VP),Tag,_) --> v_have(V,Tag), vp_passive(VP,vbn,_).
vp_progressive(vp(V,VP),Tag,Number) -->
    v_aux(V,Tag,Number), vp_main(VP,vbg,_).
vp_progressive(vp(V,VP),Tag,Number) -->
    v_aux(V,Tag,Number), vp_passive(VP,vbg,_).
objrel_vp(vp(V,NP),Tag,_) --> verb(V,Tag), empty_np(NP).
vp_main(vp(V,NP),Tag,_) --> verb(V,Tag), np(NP,_,_,acc).
vp_passive(predarg(V,_,_),Tag,Number) -->
    v_aux(_Aux,Tag,Number), verb(V,vbn).
vp_passive(predarg(V,NP,_),Tag,Number) -->
    v_aux(_Aux,Tag,Number), verb(V,vbn), pp(NP).

% Prepositional phrase for passive agents ("by the dog").
pp(NP) --> p(_P), np(NP,_,_,acc).
p(in(by)) --> [by].

Prolog was one of the first programming languages I learned, alongside some Java through a graphics program called Processing. Though I haven't used Prolog or Java much since, they led me down a path of working through Codecademy, learning basic HTML, CSS, JavaScript and Node.js, and leaning further into software development.

Working on Agents in the Salesforce Ecosystem

Fast forward two years and I am working as a Sales Engineer at Apttus, an ISV in the Salesforce ecosystem. After around a year of doing presales I started building demos using a Salesforce Node.js library called nforce. It was great: it gave me scripting capabilities for CRUD operations in Salesforce, and I could use Node.js. I combined this with two new bot frameworks at the time, Botkit and Botbuilder, and extended the application to use LUIS, the Language Understanding Cognitive Service from Microsoft. It was my first time building bots and agents that could detect utterances, intents and parameters. At the time, here were my thoughts:
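
To make the intent/parameter idea concrete, here is a minimal sketch of turning an intent-detection result into a bot action. The response shape loosely resembles what LUIS returned at the time, but the exact fields and names here are illustrative, not the real API schema:

```javascript
// Turn an intent-detection result into a structured bot action.
// The `luisResult` shape below is a hypothetical approximation.
function interpret(luisResult) {
  const intent = luisResult.topScoringIntent.intent;
  const entities = {};
  for (const e of luisResult.entities) {
    entities[e.type] = e.entity; // map entity type -> detected value
  }
  return { intent, entities };
}

// A mock result for the utterance "close the Acme opportunity".
const mock = {
  query: 'close the Acme opportunity',
  topScoringIntent: { intent: 'CloseOpportunity', score: 0.92 },
  entities: [{ entity: 'Acme', type: 'AccountName' }],
};

console.log(interpret(mock));
// { intent: 'CloseOpportunity', entities: { AccountName: 'Acme' } }
```

Once the utterance is reduced to an intent plus named parameters, the rest of the bot is ordinary request handling.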

When building a bot, it is important to understand that you are abstracting away the UI with intelligent cognitive services. There is an abundance of web-based services and applications that are locked into their own interfaces. We have arrived at a time when we often interact with apps without a UI. Why do I need a phone to call an Uber? Can't I just tell the Uber bot in a Slack channel to pick me up from work and drop me off at the train? The bot needs two parameters: from and to.

This new paradigm of intelligent assistants has been deemed conversation-as-a-service.

The front-end is the conversation you have with the bot to gather the parameters, which are passed to your server in the form of a serialized JSON object. The server deserializes it and responds back to your bot with its own serialized JSON object.

The user interacts with the bot --> the bot formats the request with the parameters --> the API retrieves the data --> responds with error/data --> the bot formats the response to the user.
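
The serialize/deserialize cycle above can be sketched in a few lines. This is a toy illustration of the flow, not any particular framework's API; the intent name and handler are hypothetical:

```javascript
// The "front-end": the conversation has gathered the parameters,
// which are serialized into a JSON payload for the server.
function buildRequest(intent, params) {
  return JSON.stringify({ intent, params });
}

// The "server": deserialize, dispatch to an API handler, and respond
// with its own serialized JSON object (error or data).
function handleRequest(body, handlers) {
  const { intent, params } = JSON.parse(body);
  const handler = handlers[intent];
  if (!handler) {
    return JSON.stringify({ error: `unknown intent: ${intent}` });
  }
  return JSON.stringify({ data: handler(params) });
}

// Example: a ride-request intent with the two parameters the bot needs.
const handlers = {
  requestRide: ({ from, to }) => `Ride booked from ${from} to ${to}`,
};

const req = buildRequest('requestRide', { from: 'work', to: 'the train' });
const res = JSON.parse(handleRequest(req, handlers));
console.log(res.data); // prints: Ride booked from work to the train
```

The bot layer's only real job is collecting parameters; everything after that is a plain request/response round trip.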

I built bots that could query contacts, update Accounts, close won Opportunities and more. I then learned we could build this into Slack, which around 2016 gave us a great market to build into. This eventually turned into Max, the Quote-to-Cash intelligent agent. Max was used in Slack and could perform actions like generating a contract and sending it to the primary contact on an account in Salesforce. At the time, the co-routines we used for determining the conversational flows were an array of predetermined responses, but ultimately it worked. We were able to deliver a working product that helped salespeople streamline their revenue-driving operations.
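
A flow driven by an array of predetermined responses, like the one described above, can be approximated as follows. The function names and the quote-to-cash steps are illustrative, not the actual Botkit/Botbuilder code we shipped:

```javascript
// A scripted conversational flow: an ordered array of prompts, each
// collecting one named parameter per turn.
function createFlow(steps) {
  let i = 0;
  const answers = {};
  return {
    // The next predetermined prompt, or null when the flow is done.
    prompt() {
      return i < steps.length ? steps[i].prompt : null;
    },
    // Record the user's reply; return the gathered parameters once
    // every step has been answered, otherwise null.
    answer(text) {
      answers[steps[i].param] = text;
      i += 1;
      return i >= steps.length ? answers : null;
    },
  };
}

// A quote-to-cash style flow: pick the account, then the contact.
const flow = createFlow([
  { param: 'account', prompt: 'Which account is this contract for?' },
  { param: 'contact', prompt: 'Who is the primary contact?' },
]);

console.log(flow.prompt()); // prints: Which account is this contract for?
flow.answer('Acme Corp');
console.log(flow.prompt()); // prints: Who is the primary contact?
console.log(flow.answer('Jane Doe'));
// { account: 'Acme Corp', contact: 'Jane Doe' }
```

Rigid, but predictable: every conversation walks the same script, which is exactly why it worked reliably in a sales workflow.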

Shopify AI Agents

Seven years later I am working in the eCommerce / Shopify ecosystem, delivering workflow automation using a best-in-class workflow engine and AI Agents. The technology has advanced so much that the things we can build now were simply impossible then. We have generative models that can answer the most nuanced product questions. We have deterministic workflow frameworks that can schedule API calls. We have out-of-the-box APIs for text-to-speech and speech-to-text that can be imported with npm. It's an amazing time to be building AI agents, and software in general.

As a software developer and technologist I always gravitate toward whatever new tech I can get my hands on and build with. When GPT-2 at the CLI wrote a somewhat coherent story out of thin air, it was unbelievable. When GPT-3 first came out and I got access, I knew something groundbreaking was happening, and I wanted the next application I built to ride this next wave.

We first built our integration into our StateSet platform by connecting what became ReSponse CX with Zendesk. We could read the subject and message of a Zendesk ticket and then use GPT-3 to generate a response that could be edited and sent back to the customer.

First Shopify App Launch

We then built our first Shopify App on this idea and heard crickets... it was a little early.

StateSet ReSponse

Using StateSet ReSponse, Shopify merchants can generate responses to customer service tickets based on their knowledge base and real-time customer and order data from Shopify. Merchants upload a set of documents, and ReSponse answers each ticket by parsing its subject and body. The uploaded documents can be a company knowledge base, product data, FAQ data or any other source of truth.
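
The retrieval step can be illustrated with a toy scorer: rank each uploaded document by word overlap with the ticket's subject and body, and use the best match to ground the generated answer. This is a deliberately naive sketch, not the actual ReSponse implementation (which today uses embeddings and a vector store):

```javascript
// Lowercase word tokens from a piece of text.
function tokenize(text) {
  return text.toLowerCase().match(/[a-z0-9]+/g) || [];
}

// Score each document by how many of its words appear in the ticket,
// returning the best-scoring document (or null if nothing overlaps).
function bestDocument(ticket, docs) {
  const ticketWords = new Set(tokenize(ticket.subject + ' ' + ticket.body));
  let best = null;
  let bestScore = 0;
  for (const doc of docs) {
    const score = tokenize(doc.text).filter((w) => ticketWords.has(w)).length;
    if (score > bestScore) {
      bestScore = score;
      best = doc;
    }
  }
  return best;
}

const docs = [
  { id: 'faq-returns', text: 'Returns are accepted within 30 days of delivery.' },
  { id: 'faq-shipping', text: 'Orders ship within 2 business days.' },
];
const ticket = {
  subject: 'Return request',
  body: 'Can I return my order after delivery?',
};
console.log(bestDocument(ticket, docs).id); // prints: faq-returns
```

In a real pipeline the retrieved passage is then placed into the model's prompt alongside the ticket, which is the essence of the RAG pattern described below.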

In March 2021 the market wasn't yet ready: the Answers API we built the original product on was eventually deprecated, RAG and vector databases weren't really a thing yet, and we didn't have any real feedback loop from the market.

The Next Generation of Autonomous Operations

Fast forward to 2024, and we are working with multiple customers, processing thousands of requests a month across product questions, return requests, warranty replacements, order cancellations and changes, subscription changes and more.

We have developed an entire operating system and RAG AI platform that gives us an edge in making our customers successful with the latest advancements in AI. Our ReSponse app consists of:

  • Channels and Messaging

  • Knowledge Base Management

  • Rules Engines

  • Function Calling APIs

  • Workflow Scheduling
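
As a sketch of how a rules engine like the one listed above might route an incoming ticket to one of these capabilities: ordered rules are tried in turn, the first match wins, and a catch-all falls through to knowledge-base answering. The rule predicates and action names are hypothetical, not the actual ReSponse configuration:

```javascript
// Ordered rules mapping a ticket to an action: a function-calling API
// invocation, a scheduled workflow, or a knowledge-base answer.
const rules = [
  { match: (t) => /refund|return/i.test(t.body), action: 'startReturnWorkflow' },
  { match: (t) => /cancel/i.test(t.body), action: 'cancelOrder' },
  { match: () => true, action: 'generateKnowledgeBaseAnswer' }, // fallback
];

// First matching rule wins; the fallback guarantees a result.
function route(ticket) {
  return rules.find((rule) => rule.match(ticket)).action;
}

console.log(route({ body: 'I want to return my order' }));
// prints: startReturnWorkflow
console.log(route({ body: 'Please cancel my subscription' }));
// prints: cancelOrder
console.log(route({ body: 'What sizes do you carry?' }));
// prints: generateKnowledgeBaseAnswer
```

Keeping routing deterministic while leaving answer generation to the model is what lets the workflow engine and the generative side coexist cleanly.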

We have built our own framework, hosting services and pipelines for building agents that power the workflows of the fastest growing DTC brands.

We are redefining what operations can look like for commerce. With our state-of-the-art workflow engine combined with the AI advancements of the last year, it's hard to overstate how transformative this will be across industries.

I am excited about the market and category we are building in, and I can't wait to see how we can continue to drive the next generation of software.