We recently documented our own experiment building an IBM Watson Assistant chatbot that integrates with Atlassian's Jira. If you’re thinking about doing the same, you could benefit from the lessons we learned.
Docs can work against you
To IBM’s credit, there is a lot of documentation on Watson Assistant. However, the way it’s organized makes it surprisingly difficult to sift through to find information on some topics. Sometimes details on key functionality, for example Webhooks, are nested three levels deep in their navigation tree (left).
By the end, we had ten pages’ worth of Google search history of ‘how to… in Watson Assistant’, which made the whole development process quite tedious.
IBM also tends to bury important information within tutorial-style articles, which can be lengthy and are typically sectioned with their own on-page navigation (right).
On the plus side, there is a search box (red arrow), which (contrary to its position) actually searches across all Watson Assistant articles – something that definitely comes in handy. There’s a wealth of information on Stack Overflow, too.
Complexity escalates very quickly
Building simple static question-and-answer chatbots is super easy with IBM Watson Assistant. But if you want your chatbot to have any practical use, it probably needs to be able to communicate with an external API, which is about as complicated as landing a SpaceX rocket in the ocean.
If you follow our build guide, you’ll see that we make requests to third-party services (in our case, Jira) via Watson Assistant’s Webhooks. Webhooks are the simplest way to communicate with an external service in Watson Assistant, but they have a few caveats:
- Only POST requests are supported
- Only JSON is supported in the body of the request
- Only Basic authentication is supported
- You can’t transform the response, only read it
These caveats can be quite restrictive; at worst, it means that any API that requires an OAuth token is off-limits. Ouch!
If you’re a confident developer and want more control over your API calls, IBM’s solution is IBM Cloud Functions. Cloud Functions offers a lot more flexibility, but don’t forget that this service exists outside the scope of IBM Watson Assistant, so it has its own rate limiting and pricing (and there’s no free tier).
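For reference, an IBM Cloud Functions action (the service is built on Apache OpenWhisk) is just a function that takes a dictionary and returns one – which is exactly what gives you room to transform a response before handing it back to the chatbot, something webhooks can’t do. A minimal sketch, with field names that are purely illustrative:

```python
# A minimal IBM Cloud Functions (Apache OpenWhisk) Python action.
# Unlike a webhook, it can reshape data before returning it to
# Watson Assistant. The field names here are illustrative only.
def main(params):
    issues = params.get("issues", [])
    # Boil a verbose API payload down to just what the dialog needs.
    summaries = [i.get("summary", "(no summary)") for i in issues]
    return {"count": len(summaries), "summaries": summaries}

# Local smoke test:
print(main({"issues": [{"summary": "VPN down"}, {"summary": "PTO request"}]}))
```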
Editor’s comment:
If you need to create more complex requests, or need to transform your data before handing it off to a chatbot, Postman is a popular tool for building and testing API requests (and it’s totally free, unlike IBM Cloud Functions).
Watson Assistant unfortunately doesn’t integrate with Postman (which I think is a missed opportunity). But you can upload Postman requests directly to free tools like Digital Assistant, and start using them with the chatbot straight away!
Lack of workplace app integrations
IBM is the world’s original B2B champion, and with Watson Assistant they boldly promise you can “connect to any channel”.
So you’d be forgiven for thinking they’d have a plug-and-play way to deploy Watson Assistant to widely used workplace apps like Slack or Cisco Webex. Alas, you’d be mistaken.
The list of available integrations for your chatbot highlights a real weak spot for Watson Assistant – you can’t really use it anywhere! Slack is a recent addition to their lineup, but there’s no support for Microsoft Teams (which has over 115 million users).
If you want to make your chatbot accessible through other voice assistants like Google Assistant or Amazon Alexa, you’ll have to build that functionality from the ground up (again using Cloud Functions, which isn’t free).
I think this area most clearly shows that Watson Assistant was designed to be customer-facing. Consequently, any use case where a chatbot interacts with employees and answers their questions is a little out of scope for it. If that’s your intended use case, it might be worthwhile exploring other options.
The Plus plan is not-so-optional
Watson Assistant is free to get started, which is great for when you’re just dipping your toes in. Looking at the pricing page, you could be forgiven for thinking that 100 Dialog nodes per skill (and you’re allowed 5 skills) is plenty to get you started. But don’t be fooled. You’ll rip through these rather quickly, and end up being forced to choose between subscribing to the Plus plan or switching to another platform.
There are other limitations to the Lite plan, too. You’ll get 10,000 messages a month from your Assistant, across up to 1,000 users. Even with only 100 users, that averages out at three or four messages per user, per day. Imagine how quiet your workday would become if all your conversations were like that!
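The arithmetic on that quota is easy to sanity-check (assuming a 30-day month):

```python
# Lite plan quota arithmetic: 10,000 messages a month, shared by all users.
messages_per_month = 10_000
users = 100
days_per_month = 30  # assumption: a 30-day calendar month

per_user_per_day = messages_per_month / users / days_per_month
print(round(per_user_per_day, 1))  # 3.3
```

Even under these generous assumptions, each user gets only a handful of exchanges per day before the whole company runs dry.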
Unless your chatbot is designed to be fairly simple and address only one use case, you’re most likely going to need to shell out for the Plus plan. $120 isn’t outrageous – but if you’re not expecting it, it can come as an unwelcome surprise.
Watson Discovery integration
In case you haven’t heard of Watson Discovery, it’s actually a very cool product. In a nutshell, it intelligently classifies documents and mines them for answers and data trends. It’s a perfect fit if you want to create a smart workplace assistant that can respond accurately to complex questions like “How many machines did my customer order last year?”.
If this is your intended chatbot use case, Watson Discovery can be a great add-on to your chatbot to add search and data analysis capabilities to your business. Not only that, the Lite plan is surprisingly competitive – especially for small businesses with lower volumes of data. The catch is that you’ll need to be on the Standard Watson Assistant plan to integrate with Discovery since search skills aren’t available on the Lite plan.
Microsoft has released a more internally focused competitor to Watson Discovery, called SharePoint Syntex. At $5 per user, per month, it’s not exactly cheaper than Watson Discovery; however, Microsoft has already announced that an “automated Q&A” feature is coming sometime in 2021, which Watson doesn’t currently support.
Single use-case limit
A chatbot built with Watson Assistant can support just one dialog skill (the component that lets you train all the conversation flows related to your use case). This means each bot will be capable of doing one thing and one thing only, be it booking a training session or requesting PTO.
This might be fine if you don’t mind creating separate, specialized chatbots for each unique use-case you have; the Lite plan has a generous limit of 100 Assistants per instance.
It could nevertheless become a huge time sink, as you effectively have to duplicate any updates across multiple use cases. Combine that with the other Lite plan limitations and you have a recipe for opening your wallet… again.
Having a single use-case limit in and of itself isn’t especially unusual; neither Dialogflow nor Botpress offers the ability to train one bot to differentiate between multiple use cases. But at least one of them is free and open-source…
On the other hand, if supporting multiple use cases in one environment is what you need, tools like Digital Assistant support an unlimited number of Connectors and use AI to figure out the user’s intent.
Prepare to write a big check!
Even if you stick it out till the end, the work is far from over once you’ve deployed your chatbot.
The analytics offered in the Lite plan are particularly light on detail, which can make it hard to even spot where your chatbot has gaps in recognition. Identifying these gaps is crucial to making your chatbot user-friendly and functional. Nobody likes to hear “I didn’t understand”.
IBM does offer a solution to help you, and it’s called: pay for Plus (and, of course, pay developers to do the analysis and improvements).
On top of that, if your chatbot uses any third-party API integrations, you’re most probably going to have to sign up for IBM’s Cloud Functions service, which over time costs – you guessed it – money. Theoretically, you could use another cloud functions service, but that means hosting Watson Assistant yourself, which is only possible on the highest price plan, ominously called “Contact us”.
And you’d better hope that your chatbot doesn’t hit any usage limits. Those messages (only three or four a day if you have 100 users) and cloud function minutes can add up quickly, so if you’re building a chatbot for company-wide use, you should figure out how to keep an eye on your usage lest you deplete your quota and your chatbot takes a forced nap.
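If your messages pass through your own middleware, even a crude counter against the monthly cap is better than nothing. A toy sketch – the threshold and limit are illustrative, not anything IBM exposes:

```python
# A toy quota watcher: count messages relayed to the assistant and warn
# before the Lite plan's 10,000-message monthly cap is reached.
# The 80% warning threshold is an arbitrary illustrative choice.
MONTHLY_LIMIT = 10_000
WARN_AT = 0.8  # start warning at 80% of quota

def check_quota(messages_sent):
    used = messages_sent / MONTHLY_LIMIT
    if used >= 1.0:
        return "exhausted"
    if used >= WARN_AT:
        return "warning"
    return "ok"

print(check_quota(7_999))   # ok
print(check_quota(8_000))   # warning
print(check_quota(10_000))  # exhausted
```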
~
I hope you found some useful tips and advice for your chatbot project in this article. You can also find our account of building an actual Watson chatbot for the workplace on our blog. What do you like or dislike about Watson Assistant? Anything you think I got wrong? Let me know in the comments below.