Heather Jones joined Geckoboard as Customer Success Champion in 2017. She has a keen interest in AI and last year she built Geckoboard’s first Customer Success Bot. Here she tells us what's happened since then.

Last year, the Customer Success team set out to build a bot that could improve our customers’ experience and speed up our first response and full resolution times.

Since our last post, we embarked on an adventure to improve the structure of Geckobot by expanding the number of topics the bot could respond to and increasing the quality of its responses.

We also reviewed its success and decided whether to continue using it or to suspend the project – it’s Judgment Day!


Let’s talk about the improvements first:

Structural Improvements

Originally, all inputs and outputs were contained within one CMS on the Meya.ai platform, and this CMS was connected to the NLP (natural language processing) platform Dialogflow, previously known as API.ai. This approach worked well in the beginning and enabled us to get the bot up and running quickly with minimal code. More details on the original setup can be found in Part 1.

While adding new topics for Geckobot to handle, we quickly ran into quota issues with Dialogflow (part of the Google Cloud Platform) and had to wait weeks for an increase. During this time, we discovered a much better organizational structure that would resolve the quota issues, improve training, and increase our response quality.

To realize this structural change, we migrated all of our CMS content to Dialogflow, while continuing to use Meya.ai to host and manage our Intercom integration. In Dialogflow, the CMS content was organized into topics and added to Agents (NLU modules). Each agent contains Intents (developer-defined components, where each intent maps to a specific topic), and each intent in turn contains training phrases (possible user inputs) and responses. The improved structure allowed for better natural language processing and increased the quality of responses, since all data was now processed directly within Dialogflow rather than the CMS.


A New Feature

While considering improvements we could make to Geckobot, we looked at the big picture, which included chats the bot was not currently trained for or designed to handle. We found there were a large number of chats that only contained chit-chat phrases (such as “hi”, or “hello there”), rather than queries with substantial content.

Before the restructure, because of how quickly our bot takes action, chats containing chit-chat phrases (and nothing else) would be automatically re-assigned to the Customer Success team queue. In theory this was fine, but if a user later added more detail to a query already in the queue, Geckobot never had an opportunity to respond, even when the follow-up was a topic the bot was trained for.

We viewed this as a good opportunity to implement a new feature that listened for chit-chat phrases and waited for further input from the user. If more context was added later, Geckobot would respond as normal if trained on the topic, or re-assign the chat to the Customer Success team.

After several rounds of research and testing, we added a new intent to an agent in Dialogflow and connected it to a flow in Meya.ai. To prevent Geckobot from responding too soon, and to ensure it ‘listened’ for any new context added to the query, the intent response body in Dialogflow was left empty, so nothing was returned to the user. The flow in Meya ensured it was not transferred to a queue preemptively, which gave Geckobot the opportunity to respond if additional context was added later on by the user.
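The routing logic can be sketched roughly as follows. The phrase list, the trained topic, and the queue name are made-up examples; in the real setup the "say nothing" branch corresponds to the empty intent response body in Dialogflow, and the "don't transfer" behavior to the Meya flow.

```python
# Chit-chat-only messages get no reply and no queue transfer:
# the bot stays silent and keeps listening for more context.
CHIT_CHAT = {"hi", "hello", "hello there", "hey"}

def handle_message(text: str):
    """Return (reply, route) for an incoming message.
    reply=None means stay silent; route=None means no transfer."""
    cleaned = text.strip().lower().rstrip("!.")
    if cleaned in CHIT_CHAT:
        # Empty response body: say nothing, don't transfer, keep listening.
        return None, None
    if "invoice" in cleaned:
        # A topic the bot is trained on: respond directly.
        return "You can download invoices from your account settings.", None
    # Untrained topic: hand the conversation over to humans.
    return None, "customer-success-queue"

print(handle_message("Hello there!"))          # (None, None): wait for context
print(handle_message("I need an invoice"))     # bot replies, no transfer
print(handle_message("My dashboard is blank")) # silent, routed to the team
```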

After implementing this new feature, the number of chats that Geckobot was able to participate in and respond to dramatically increased!


Team Communication

As a team, we’re all responsible for monitoring the bot’s queue regularly to ensure the correct responses are being sent, following up as needed. To this end, I created a Slack channel dedicated to Geckobot. This channel served as a central place to share announcements about the bot as improvements were made and a place for my teammates to flag any unruly bot behavior so I could review and correct it.

We also created tags in Intercom to better track Geckobot's successes and failures, and to separate out test messages. These tags gave us a clearer view of unruly behavior, as well as of the issues Geckobot was regularly and successfully responding to.


Challenges

Building Geckobot hasn’t been without challenges. The Intercom integration on the Meya.ai platform uses Intercom’s Replying to a User’s Last Conversation endpoint rather than the specific conversation id provided by Intercom. It’s easy to see why this endpoint would be attractive for a bot platform, but it presents a problem when an Intercom Visitor Auto Message is triggered.

Because the last-conversation endpoint is used and no conversation id is specified, if the bot responds in any capacity while multiple conversations are open (whether user-initiated or triggered by a visitor auto message), there is a high probability it will reply to the wrong conversation, or that empty messages will be created in Intercom.
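A toy reproduction of this race, to show why replying to "the last conversation" goes wrong. Intercom's real API is not modeled here; the conversation ids and messages are invented.

```python
conversations = {}         # conversation id -> list of messages
last_conversation_id = None

def start_conversation(conv_id, first_message):
    """A new conversation becomes the 'last' one."""
    global last_conversation_id
    conversations[conv_id] = [first_message]
    last_conversation_id = conv_id

def reply_to_last(text):
    """How the integration replies: no conversation id specified."""
    conversations[last_conversation_id].append(text)

# A user asks a question; then a visitor auto message opens a
# second conversation before the bot replies.
start_conversation("conv-1", "user: how do I export my data?")
start_conversation("conv-2", "auto: Hi there, need any help?")

# The bot's answer, meant for conv-1, lands in conv-2 instead.
reply_to_last("bot: You can export your data from the dashboard.")
print(conversations["conv-1"])  # the question, with no answer
print(conversations["conv-2"])  # the auto message plus a misdirected reply
```

Passing the conversation id explicitly on every reply would remove the ambiguity entirely.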

As you can imagine, this can create quite a bit of confusion for both the end user and the Customer Success team! Fortunately, we only experienced this scenario on a few rare occasions.

Measuring Geckobot’s results and deciding its future

From previous analysis, we knew that about 15% of all support requests were ‘repetitive’: for example, questions about the options for getting data in when we don’t have a pre-built integration with a particular service (the service itself varies), people asking for discounts or payment options, customers asking for invoices, and so on.

We wanted Geckobot to own, reply to, and resolve those requests as quickly as possible.

If we succeeded, we could then bring down our first response time for the following reasons:

  1. Up to 15% of queries would be replied to in a matter of seconds.
  2. We’d be able to focus on the remaining 85%.
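The expected effect on mean first response time is a simple weighted average. The 15% share comes from our analysis above; the minute values below are purely hypothetical, for illustration.

```python
bot_share = 0.15            # share of queries the bot could answer
bot_response_min = 0.1      # assumed bot first response (~6 seconds)
human_response_min = 30.0   # assumed human first response time

before = human_response_min
after = bot_share * bot_response_min + (1 - bot_share) * human_response_min
print(f"before: {before:.1f} min, after: {after:.1f} min")
```

Even with the bot answering instantly, the mean only drops in proportion to the share of queries it handles, which is part of why the gains in practice were smaller than hoped.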

That was the theory, but the reality proved to be slightly different. The truth is we never quite got rid of that 15% of repetitive questions, because doing so requires ongoing training and supervision. Training and supervision require time that we could otherwise spend replying to those queries, especially since they are normally the easiest (if not the most exciting) to deal with.

We also wanted to delight customers as some of their questions would be answered in a flash and the team would be able to focus on more complex ones. Again, the reality was slightly different from the theory.

Geckobot was indeed able to delight customers in some cases, due to its impressive response times.


But there were also times when Geckobot misfired, creating some confusion. Luckily, in most of these cases, we were able to turn things around.


What we learned

All in all, our experience building and using Geckobot was very enriching. We learned that customers value speedy responses, but they also value personal attention. Bots in this regard represent a trade-off. Interestingly, the trade-off is largely removed when the bot doesn’t have a persona. Let me explain:

We started using Geckobot with the avatar of a robot. We thought this would make it clear that users were talking to a robot, but that wasn’t entirely true. The avatar gave our bot a personality, and I believe some people felt deceived. Our next approach was to strip Geckobot of its persona, so we used our logo (Geckoboard’s “G”) as its avatar. That did the trick.

Beyond all of that learning, we also realized that, at this stage, looking after Geckobot requires as much time and attention (if not more) as looking after customers’ simple enquiries ourselves. If our volumes were different and we were getting ten times as many tickets (or even somewhat fewer), we could easily justify the time, but at present it seems a better use of our time to simply handle those requests the good old-fashioned way.