Watson Conversation with Alchemy Entity Extraction

Overview of Watson Conversation

Recently, there has been a proliferation of bots across applications and platforms to address all kinds of needs, from ordering pizza, to booking a flight, to providing customer support. The Watson Conversation service recently became generally available to offer developers a turnkey service for building bots and virtual agents. If you haven’t tried it already, I strongly encourage you to take it for a test drive by checking out the demo and browsing the documentation or, better yet, by building your own bot. To build your own chatbot or virtual agent, you’ll need to sign up for Bluemix (free for a 30-day trial), IBM’s platform-as-a-service solution and the environment through which you can provision any of the Watson services, including the Conversation service.

The best way to understand the Watson Conversation service is to look at its three constituent components: Intents, Entities, and Dialog.

Intents

The purpose of intents is to map “what a user says” to “what a user means”. Because users express the same intent with a wide variety of utterances, Watson Conversation leverages deep learning technology to extract intents from utterances. You, as the app developer, have complete control over which intents are relevant to your application. Once you’ve defined your list of intents, you need to teach Watson how to recognize them by providing sample utterances and how they map to those intents. Probably the most common question we get from developers is “how many sample utterances do I need to provide?” While the tool requires a minimum of five sample utterances, the general guideline is “the more the better”. We’ve typically seen good results with 20-30 sample utterances per intent. The key is to capture real end-user utterances and map them to the intents you’ve defined for your application.
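For illustration, a weather intent might be trained with sample utterances roughly along these lines (the wording below is illustrative, not taken from the actual workspace):

Intent: #weather
Sample utterances:
  what is the weather like today
  is it going to rain this afternoon
  do I need an umbrella
  how hot will it get this weekend
  what is the forecast for tomorrow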

Entities

Entities, on the other hand, provide specific information that your bot can leverage to give the most relevant response. For example, if the user’s utterance is “I would like to book a flight to Paris”, then the intent is “book_flight” and the entity is “Paris”. You, as the app developer, define the entities that are relevant for your application by providing a list of entity names and, for each entity, a list of values, each with a corresponding list of synonyms.
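For example, a City entity for a travel bot might be defined roughly as follows (the entity, value, and synonym names here are illustrative):

Entity: @City
  Value: Paris      Synonyms: paris
  Value: New York   Synonyms: NYC, new york city
  Value: Austin     Synonyms: ATX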

Dialog

Once intents and entities are defined in the Conversation service, it is the Dialog that actually orchestrates the conversation based on the extracted intents and entities, as well as context provided by the application. Context is an extremely important concept, as it is the bridge that links the Conversation service to your bot (or application). Any information communicated back and forth between the Conversation service and the bot is passed through context variables. Dialog consists of a number of user-defined nodes, where each node executes if its condition is true. Think of each node as an “if” condition, where the condition checked is based on a combination of intents, entities, and context variables (or any derived variables). If the condition is true, the node executes; if not, the flow continues to the next node in the dialog. Note that the order of the nodes is important, as the flow executes top to bottom, left to right.
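As a rough sketch, two dialog nodes for the book_flight intent might look like this, using the service’s #intent, @entity, and $context_variable notation (node names and responses are illustrative). Because nodes are evaluated top to bottom, the more specific condition comes first:

Node 1: condition: #book_flight && @City
        response:  "OK, looking up flights to that city."
Node 2: condition: #book_flight
        response:  "Which city would you like to fly to?"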

Integration of Conversation with Alchemy Language

In the rest of this blog, I’d like to focus on a very common question I get from partners and clients: “How can I integrate the Conversation service with another service, such as Alchemy Language, to extract relevant entities from the user utterance?” Once you’ve experimented with the Conversation service, you’ll love how intuitive and simple it is to create powerful bots, but you’ll probably run into the same need.

Currently, you have two options for extracting entities from the user utterance:

  • You define the list of entity values and synonyms in the Conversation service.
  • Your application manages the extraction of relevant entities and passes those as context variables to the Conversation service.

While the first option is quite easy to implement, it does require providing an extensive list of entity values. The second option, on the other hand, allows you to leverage another service to extract relevant entities in your application and pass them along to the Conversation service.

To illustrate the second option, we will start with the cognitive car dashboard sample application. Before proceeding, please follow the steps in the conversation-simple repository and make sure you have the application running locally per the instructions. Once it is running, you will see the initial prompt from the application:

[Screenshot: initial prompt from the car dashboard application]

Go ahead and type in the following utterance to ask about the weather in Austin today.

[Screenshot: user asks about the weather in Austin today]

Verify that you get the following response:

[Screenshot: the bot’s response to the weather question]

Note that the application understood you are asking about the weather, but it hasn’t been updated to actually look up the weather based on your input. We will update the Conversation service and the application to be able to respond to such utterances. Specifically, we’ll update the application to extract relevant entities (in this case, we mainly care about the City for which weather is requested) and then pass those entities to the Conversation service via the context variable. To do so, we will leverage the Alchemy Language entity extraction API.

All the code changes are available in the joe4k/conversation-simple GitHub repo, which was forked from the original conversation-simple repo. For reference, here are the code changes required. First, we need to add a service wrapper for Alchemy Language (as shown below), which requires an API key to authenticate to the service. You get the API key by creating an Alchemy Language service on Bluemix; once created, the service credentials will provide the API key. If you need more details on how to create an Alchemy Language service, check out this AlchemyLab with code uploaded to GitHub.
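A minimal sketch of such a wrapper, assuming the watson-developer-cloud Node.js SDK already used by conversation-simple and an ALCHEMY_API_KEY environment variable (both names are assumptions), might look like this:

// Sketch: wrapper for the Alchemy Language service
var watson = require('watson-developer-cloud');

var alchemyLanguage = watson.alchemy_language({
  api_key: process.env.ALCHEMY_API_KEY
});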

[Screenshot: Alchemy Language service wrapper added to the application]

Next, we will modify the code to first call Alchemy Language entity extraction on the input utterance, update the context variable with the extracted entities, and then pass the input plus context to the Conversation service.

Original

[Screenshot: original code calling the Conversation service directly]

Modified

[Screenshot: modified code calling Alchemy Language entity extraction before the Conversation service]
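In spirit, the modified handler does something like the simplified sketch below: extract entities from the utterance, copy any City entity into the appCity context variable, and only then call the Conversation service. It reuses the conversation wrapper that conversation-simple already sets up plus the alchemyLanguage wrapper above; error handling is trimmed and the actual repository code differs in detail:

// Simplified sketch of the modified /api/message handler
app.post('/api/message', function(req, res) {
  var text = (req.body.input && req.body.input.text) || '';
  var context = req.body.context || {};

  // 1. Extract entities from the user utterance with Alchemy Language
  alchemyLanguage.entities({ text: text }, function(err, result) {
    if (!err && result && result.entities) {
      result.entities.forEach(function(entity) {
        // 2. Store any recognized City in the context for the dialog to check
        if (entity.type === 'City') {
          context.appCity = entity.text;
        }
      });
    }

    // 3. Pass the input plus the enriched context to the Conversation service
    conversation.message({
      workspace_id: process.env.WORKSPACE_ID,
      input: { text: text },
      context: context
    }, function(err, response) {
      if (err) {
        return res.status(err.code || 500).json(err);
      }
      return res.json(response);
    });
  });
});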

The next step is to update the Watson Conversation service to perform some action based on the context variable appCity. As you can see in the Modified version below, we’re adding an extra node after the intent is identified as #weather to check whether the city context variable is defined. If it is, the Conversation service responds indicating it will provide the weather for that city. If not, it asks the user to specify the city of interest.

The actual code in the GitHub repository includes some extra error checking and handling which was omitted here for readability. Furthermore, the application has also been integrated with the Weather Underground API: it makes a REST call to that API and returns the weather for the requested city.

Original

[Screenshot: original dialog tree with the #weather node]

Modified

[Screenshot: modified dialog tree with an extra node checking the $appCity context variable]
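Conceptually, the dialog change amounts to the following two checks; because nodes are evaluated top to bottom, the node with the more specific condition comes first (the response wording is illustrative):

Node 1: condition: #weather && $appCity
        response:  "Let me get the weather for $appCity."
Node 2: condition: #weather
        response:  "For which city would you like the weather?"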

After making these edits, you can validate the results by running the code from the joe4k/conversation-simple GitHub repository:

  • Clone the code from the GitHub repository:

git clone https://github.com/joe4k/conversation-simple.git

  • Create a new Conversation workspace by importing the training/car_workspace_alchemy.json file
  • Obtain an Alchemy Language API key
  • Obtain a Weather Underground API key
  • Update the .env file with credentials for the Conversation, Alchemy Language, and Weather Underground services (a sample is sketched below)
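A sample .env might look roughly like this (the variable names here are assumptions; check the repository’s .env template for the authoritative list):

# Watson Conversation credentials and workspace
CONVERSATION_USERNAME=<conversation-username>
CONVERSATION_PASSWORD=<conversation-password>
WORKSPACE_ID=<workspace-id>

# Alchemy Language API key
ALCHEMY_API_KEY=<alchemy-api-key>

# Weather Underground API key
WEATHER_API_KEY=<weather-underground-api-key>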

Run the application locally (npm start) and verify you get the following results:

[Screenshot: the application responds with the weather for the requested city]

Conclusion

In conclusion, the Watson Conversation service offers an intuitive and powerful interface for building bots and conversational agents. In this blog, we illustrated how to extend Watson Conversation by extracting entities with the Alchemy Language service and enriching the conversation with them through context variables. While our example focused on extracting the City entity, which comes readily available with Alchemy Language’s standard (pre-trained) model, the same approach works with any entities you’d extract using Alchemy Language custom models.
