diff --git a/Lab_ChatBot/cards/1.md b/Lab_ChatBot/cards/1.md
new file mode 100644
index 00000000..3aee7880
--- /dev/null
+++ b/Lab_ChatBot/cards/1.md
@@ -0,0 +1,13 @@
+# Chatbot
+
+
+
+A chatbot is an AI-powered program that you can “chat with.” Many businesses use these “virtual assistants” for customer service or tech support, because they make it easy to answer user queries in real time through a messaging platform, with machine learning doing the heavy lifting.
+
+Chatbots typically integrate with messaging apps such as Facebook Messenger, or appear as popups on websites.
+
+## What is DialogFlow?
+
+Natural language understanding (NLU) tends to be a tricky component of building a chatbot. How do you make sure your bot actually understands what the user says and parses their requests correctly? That’s where DialogFlow comes in and fills the gap. It takes care of the NLU parsing so that you can focus on other areas, like your business logic!
+
+DialogFlow is a tool that allows you to make bots (or assistants, or agents) that understand human conversation, string together a meaningful API call with the appropriate parameters after parsing that conversation, and come back with an adequate reply. You can then deploy this bot to any platform of your choosing — Facebook Messenger, Slack, Google Assistant, Twitter, Skype, or your own app or website as well!
diff --git a/Lab_ChatBot/cards/2.md b/Lab_ChatBot/cards/2.md
new file mode 100644
index 00000000..8b51900c
--- /dev/null
+++ b/Lab_ChatBot/cards/2.md
@@ -0,0 +1,33 @@
+# The building blocks of DialogFlow
+
+### Agent:
+
+DialogFlow allows you to make NLU modules called agents (these are basically the face of your bot). The agent handles the conversation and connects to your backend, which holds the business logic.
+
+### Intent:
+
+An agent is made up of intents. Intents are actions that a user can perform on your agent: each intent maps what a user says to the action that should be taken. They’re entry points into a conversation.
+
+In short, a user may phrase the same request in many different ways, restructuring their sentences. But in the end, all of those phrasings should resolve to a single intent.
+
+Here are some example phrases, each of which would trigger an intent:
+“What’s the weather like in Mumbai today?” or “What is the recipe for an omelet?”
+
+You can create as many intents as your business logic needs, and even correlate them using **contexts**. An intent decides what API to call, with what parameters, and how to respond to a user’s request.
+
+### Entity:
+
+An agent on its own wouldn’t know what values to extract from a given user’s input. This is where entities come into play. Any piece of information in a sentence that is critical to your business logic is an entity. This includes things like dates, distances, currencies, etc. There are system entities, provided by DialogFlow for simple things like numbers and dates. And then there are developer-defined entities: for example, a “category” entity for a bot about Pokemon! We’ll dive into how to make a custom developer entity later in the post.
+
+### Context:
+
+A final concept before we can get started with coding is “Context.” This is what makes the bot truly conversational. A context-aware bot can remember things, and hold a conversation the way humans do. Consider the following conversation:
+
+“Hey, are you coming for piano practice tonight?”
+“Sorry, I’ve got dinner plans.”
+“Okay, what about tomorrow night then?”
+“That works!”
+
+Did you notice what just happened?
+The first question is straightforward to parse: the time is “tonight,” and the event, “piano practice.”
+
+However, the second question, “Okay, what about tomorrow night then?” doesn’t specify anything about the actual event. It’s *implied* that we’re talking about “piano practice.” This sort of understanding comes naturally to us humans, but bots have to be explicitly programmed to understand the context across these sentences.
diff --git a/Lab_ChatBot/cards/3.md b/Lab_ChatBot/cards/3.md
new file mode 100644
index 00000000..0971ddea
--- /dev/null
+++ b/Lab_ChatBot/cards/3.md
@@ -0,0 +1,13 @@
+
+# Making a Reddit chatbot using DialogFlow
+Now that we’re well equipped with the basics, let’s get started! We’re going to make a Reddit bot that tells a joke or an interesting fact from the day’s top threads on specific subreddits. We’ll also sprinkle in some context awareness so that the bot doesn’t feel “rigid”.
+NOTE: You’ll need a billing-enabled account on Google Cloud Platform (GCP) if you want to follow along with this tutorial. It’s free to set up and just needs your credit card details.
+
+
+## Creating an Agent
+1. Log in to the DialogFlow dashboard using your Google account. [Here’s the link.](https://console.dialogflow.com/api-client/#/login)
+2. Click on “Create Agent”.
+3. Enter the details as below, and hit “Create”. You can also select any other Google project, as long as it has billing enabled.
+
+
+
diff --git a/Lab_ChatBot/cards/4.md b/Lab_ChatBot/cards/4.md
new file mode 100644
index 00000000..be906806
--- /dev/null
+++ b/Lab_ChatBot/cards/4.md
@@ -0,0 +1,17 @@
+# Setting up a “Welcome” Intent
+As soon as you create the agent, you’ll see this Intents page:
+
+
+
+The “Default Fallback” Intent exists in case the user says something unexpected that falls outside the scope of your intents. We won’t worry too much about that right now. Go ahead and click on the “Default Welcome Intent”. There are a lot of options here that we can tweak.
+Let’s start with a triggering phrase. Notice the “User Says” section? We want our bot to activate as soon as we say something along the lines of:
+
+
+
+Let’s fill that in. After that, scroll down to the “Responses” tab. You can see that some generic welcome messages are provided. Get rid of them, and put in something more personal to our bot, as follows:
+
+
+
+Now, this does a couple of things. Firstly, it lets the user know that they’re talking to our bot. It also guides the user to the next point in the conversation: here, an “or” question.
+
+Hit “Save” and let’s move on.
\ No newline at end of file
diff --git a/Lab_ChatBot/cards/5.md b/Lab_ChatBot/cards/5.md
new file mode 100644
index 00000000..482f1af7
--- /dev/null
+++ b/Lab_ChatBot/cards/5.md
@@ -0,0 +1,10 @@
+# Creating a Custom Entity
+
+Before we start playing around with Intents, I want to set up a Custom Entity real quick. If you remember, Entities are what we extract from the user’s input for further processing. I’m going to call our Entity “content”, since what the user requests will be a piece of content — either a joke or a fact. Let’s go ahead and create that. Click on the “Entities” tab in the left sidebar and click “Create Entity”.
+
+Fill in the following details:
+
+
+
+As you can see, we have two possible values for our content: “joke” and “fact”. We have also entered synonyms for each of them, so that if the user says something like “I want to hear something funny”, we know they want a “joke”. Click “Save” and let’s proceed to the next section!
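+
+To see why the synonyms matter, here is a quick, purely illustrative sketch in plain Node.js (the synonym lists below are made up for illustration, not copied from the screenshot). Whatever phrasing the user types, Dialogflow matches it against the entity’s synonyms and hands our code the canonical value, “joke” or “fact”; the snippet simply mimics that idea:
+
+```js
+// A rough sketch of what the "content" entity amounts to: two canonical
+// values, each with a handful of synonyms. Dialogflow stores and matches
+// these for us; this code only illustrates the idea.
+const contentEntity = {
+  joke: ['joke', 'funny', 'laugh', 'something funny'],
+  fact: ['fact', 'interesting', 'something informative']
+};
+
+// Resolve whatever the user said to a canonical entity value.
+function resolveContent(userPhrase) {
+  const phrase = userPhrase.toLowerCase();
+  for (const [value, synonyms] of Object.entries(contentEntity)) {
+    if (synonyms.some((s) => phrase.includes(s))) {
+      return value;
+    }
+  }
+  return null; // no match: the agent would have to ask a follow-up question
+}
+
+console.log(resolveContent('I want to hear something funny')); // "joke"
+```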
+
diff --git a/Lab_ChatBot/cards/6.md b/Lab_ChatBot/cards/6.md
new file mode 100644
index 00000000..b07280f1
--- /dev/null
+++ b/Lab_ChatBot/cards/6.md
@@ -0,0 +1,21 @@
+# Attaching our new Entity to the Intent
+
+Create a new Intent called “say-content”. Add the phrase “Let’s hear a joke” in the “User Says” section, like so:
+
+
+
+Right off the bat, we notice a couple of interesting things. Dialogflow parsed this input and associated the “content” entity with it, with the correct value (here, “joke”). Let’s add a few more inputs:
+
+
+
+PS: Make sure all the highlighted words are the same color and are associated with the same entity. Dialogflow’s NLU isn’t perfect and sometimes assigns different Entities. If that’s the case, just remove the wrong one, double-click the word, and assign the correct Entity yourself!
+
+Let’s add a placeholder text response to see it work. To do that, scroll down to the “Response” section at the bottom, and fill it in like so:
+
+
+
+The “$content” is a variable holding the value extracted from the user’s input, as we saw above.
+
+Let’s see this in action. On the right side of every page on Dialogflow’s platform, you’ll see a “Try It Now” box. Use that to test your work at any point in time. I’m going to go ahead and type “Tell a fact” into the box. Notice that the “Tell a fact” phrase wasn’t present in the samples that we gave earlier. Dialogflow keeps training using its NLU modules and can extract data from similarly structured sentences:
+
+
\ No newline at end of file
diff --git a/Lab_ChatBot/cards/7.md b/Lab_ChatBot/cards/7.md
new file mode 100644
index 00000000..a68c605e
--- /dev/null
+++ b/Lab_ChatBot/cards/7.md
@@ -0,0 +1,82 @@
+# A Webhook to process requests
+
+To keep things simple, I’m going to write a small JS function that fulfills the request by querying Reddit and returning the appropriate content. Luckily for us, Reddit doesn’t require authentication for reading in JSON format. Here’s the code:
+
+```js
+'use strict';
+const http = require('https');
+
+// Entry point: Dialogflow calls this Cloud Function whenever an intent needs
+// fulfillment. The matched entity value arrives under result.parameters.
+exports.appWebhook = (req, res) => {
+  let content = req.body.result.parameters['content'];
+  getContent(content).then((output) => {
+    res.setHeader('Content-Type', 'application/json');
+    res.send(JSON.stringify({ 'speech': output, 'displayText': output }));
+  }).catch((error) => {
+    // If there is an error, let the user know
+    res.setHeader('Content-Type', 'application/json');
+    res.send(JSON.stringify({ 'speech': error, 'displayText': error }));
+  });
+};
+
+// Map the requested content type to a subreddit and a display label.
+function getSubreddit (content) {
+  if (content == "funny" || content == "joke" || content == "laugh") {
+    return {sub: "jokes", displayText: "joke"};
+  } else {
+    return {sub: "todayILearned", displayText: "fact"};
+  }
+}
+
+// Fetch the day's top threads from the chosen subreddit and build a reply.
+function getContent (content) {
+  let subReddit = getSubreddit(content);
+  return new Promise((resolve, reject) => {
+    console.log('API Request: to Reddit');
+    http.get(`https://www.reddit.com/r/${subReddit["sub"]}/top.json?sort=top&t=day`, (resp) => {
+      let data = '';
+      resp.on('data', (chunk) => {
+        data += chunk;
+      });
+      resp.on('end', () => {
+        let response = JSON.parse(data);
+        // Pick a random thread from the day's top listing (about 25 threads).
+        let thread = response["data"]["children"][(Math.floor((Math.random() * 24) + 1))]["data"];
+        let output = `Here's a ${subReddit["displayText"]}: ${thread["title"]}`;
+        if (subReddit['sub'] == "jokes") {
+          // For jokes, the punchline lives in the thread's body text.
+          output += " " + thread["selftext"];
+        }
+        output += "\nWhat do you want to hear next, a joke or a fact?";
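+        // At this point `output` reads something like:
+        //   "Here's a joke: <thread title> <joke body>\nWhat do you want to hear next, a joke or a fact?"
+        // Resolving it hands the string back to the exported handler above, which
+        // returns it to Dialogflow as both the 'speech' and 'displayText' fields.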
+        console.log(output);
+        resolve(output);
+      });
+    }).on("error", (err) => {
+      console.log("Error: " + err.message);
+      reject(err);
+    });
+  });
+}
+```
+
+Now, before going ahead, follow steps 1–5 mentioned [here](https://cloud.google.com/functions/docs/quickstart) religiously.
+
+NOTE: For step 1, select the same Google Project that you created/used when creating the agent.
+
+Now, to deploy our function using gcloud:
+`$ gcloud beta functions deploy appWebhook --stage-bucket BUCKET_NAME --trigger-http`
+
+To find the BUCKET_NAME, go to your Google project’s console and click on Cloud Storage under the Resources section.
+
+After you run the command, make a note of the *httpsTrigger* URL in the output. On the Dialogflow platform, find the “Fulfillment” tab in the sidebar. We need to enable the webhook and paste in that URL, like this:
+
+
+
+Hit “Done” at the bottom of the page, and now for the final step. Visit the “say-content” Intent page and do a couple of things.
+
+1. Make the “content” parameter mandatory. This will make the bot explicitly ask the user for the parameter if it isn’t clear from their message:
+
+
+
+2. Notice that a new section called “Fulfillment” has been added to the bottom of the screen. Enable the “Use webhook” checkbox:
+
+
+
+3. Click “Save” and that’s it! Time to test this Intent out!
+
+
+
+Reddit’s crappy humor aside, this looks neat. Our replies always drive the conversation to the places (Intents) that we want it to go.
+
diff --git a/Lab_ChatBot/cards/8.md b/Lab_ChatBot/cards/8.md
new file mode 100644
index 00000000..248245a9
--- /dev/null
+++ b/Lab_ChatBot/cards/8.md
@@ -0,0 +1,21 @@
+# Adding Context to our Bot
+
+Even though this works perfectly fine, there’s one more thing I’d like to add quickly. We want the user to be able to say “More” or “Give me another one” and have the bot understand what that means. This is done by emitting and absorbing contexts between intents.
+
+First, to emit the context, scroll up on the “say-content” Intent’s page and find the “Contexts” section. We want this intent to output a context, with a lifespan of 5. The lifespan makes sure the bot remembers what the “content” was for up to 5 back-and-forths in the current conversation.
+
+Now, we want to create a new Intent that absorbs this context and makes sense of phrases like “More please”:
+
+
+
+
+
+
+
+
+Finally, since we want this intent to behave the same way, we’ll set up its Action and Fulfillment sections exactly as we did for the “say-content” Intent:
+
+
+
+
+
+And that’s it! **Your bot is ready!!**
\ No newline at end of file
diff --git a/Lab_ChatBot/cards/9.md b/Lab_ChatBot/cards/9.md
new file mode 100644
index 00000000..4fd4090d
--- /dev/null
+++ b/Lab_ChatBot/cards/9.md
@@ -0,0 +1,20 @@
+# Integrations
+
+Dialogflow provides integrations with probably every messaging service in Silicon Valley, and more. But we’ll use the Web Demo. Go to the “Integrations” tab in the sidebar and enable the “Web Demo” setting. Your bot should work like this:
+
+
+
+
+
+
+
+
+And that’s it! Your bot is ready to face a real person! Now you can easily keep adding more subreddits, like news, sports, bodypainting, dankmemes, or whatever your hobbies in life are! Or make it understand a few more parameters. For example, “A joke about Donald Trump”.
+
+
+
+
+
+
+
+
+Consider that your homework. You can also add a “Bye” intent, and make the bot stop. Our bot currently isn’t so great with goodbyes, sort of like real people.
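+
+If you want a head start on the subreddit part of that homework, one possible approach (just a sketch, and the subreddit choices below are only examples) is to turn the webhook’s `getSubreddit` function into a lookup table, so that each new content type becomes a one-line addition:
+
+```js
+// Illustrative extension of getSubreddit from the webhook above. Add whichever
+// subreddits you like here, and remember to add matching values and synonyms
+// to the "content" entity in Dialogflow as well.
+const SUBREDDITS = {
+  joke: { sub: 'jokes',         displayText: 'joke' },
+  fact: { sub: 'todayILearned', displayText: 'fact' },
+  news: { sub: 'worldnews',     displayText: 'news headline' },
+  meme: { sub: 'dankmemes',     displayText: 'meme' }
+};
+
+function getSubreddit (content) {
+  // Fall back to a fact, just like the original version does.
+  return SUBREDDITS[content] || SUBREDDITS['fact'];
+}
+```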
+
diff --git a/Lab_ChatBot/images/.DS_Store b/Lab_ChatBot/images/.DS_Store
new file mode 100644
index 00000000..5008ddfc
Binary files /dev/null and b/Lab_ChatBot/images/.DS_Store differ
diff --git a/Lab_ChatBot/images/1.png b/Lab_ChatBot/images/1.png
new file mode 100755
index 00000000..2432508b
Binary files /dev/null and b/Lab_ChatBot/images/1.png differ
diff --git a/Lab_ChatBot/images/2_2.png b/Lab_ChatBot/images/2_2.png
new file mode 100755
index 00000000..435a0777
Binary files /dev/null and b/Lab_ChatBot/images/2_2.png differ
diff --git a/Lab_ChatBot/images/2_3.png b/Lab_ChatBot/images/2_3.png
new file mode 100755
index 00000000..3edfd424
Binary files /dev/null and b/Lab_ChatBot/images/2_3.png differ
diff --git a/Lab_ChatBot/images/2_welcome intent.png b/Lab_ChatBot/images/2_welcome intent.png
new file mode 100755
index 00000000..32725fc9
Binary files /dev/null and b/Lab_ChatBot/images/2_welcome intent.png differ
diff --git a/Lab_ChatBot/images/3_2.png b/Lab_ChatBot/images/3_2.png
new file mode 100755
index 00000000..4c268e08
Binary files /dev/null and b/Lab_ChatBot/images/3_2.png differ
diff --git a/Lab_ChatBot/images/3_3.png b/Lab_ChatBot/images/3_3.png
new file mode 100755
index 00000000..5c7d9f6c
Binary files /dev/null and b/Lab_ChatBot/images/3_3.png differ
diff --git a/Lab_ChatBot/images/3_4.png b/Lab_ChatBot/images/3_4.png
new file mode 100755
index 00000000..11c96c18
Binary files /dev/null and b/Lab_ChatBot/images/3_4.png differ
diff --git a/Lab_ChatBot/images/3_5.png b/Lab_ChatBot/images/3_5.png
new file mode 100755
index 00000000..88b8fc19
Binary files /dev/null and b/Lab_ChatBot/images/3_5.png differ
diff --git a/Lab_ChatBot/images/3_Entity.png b/Lab_ChatBot/images/3_Entity.png
new file mode 100755
index 00000000..bfaf5578
Binary files /dev/null and b/Lab_ChatBot/images/3_Entity.png differ
diff --git a/Lab_ChatBot/images/4-webhook.png b/Lab_ChatBot/images/4-webhook.png
new file mode 100755
index 00000000..c57f9b3c
Binary files /dev/null and b/Lab_ChatBot/images/4-webhook.png differ
diff --git a/Lab_ChatBot/images/4_2.png b/Lab_ChatBot/images/4_2.png
new file mode 100755
index 00000000..af30efdc
Binary files /dev/null and b/Lab_ChatBot/images/4_2.png differ
diff --git a/Lab_ChatBot/images/4_3.png b/Lab_ChatBot/images/4_3.png
new file mode 100755
index 00000000..a4e7ed12
Binary files /dev/null and b/Lab_ChatBot/images/4_3.png differ
diff --git a/Lab_ChatBot/images/4_4.png b/Lab_ChatBot/images/4_4.png
new file mode 100755
index 00000000..2b7e1da6
Binary files /dev/null and b/Lab_ChatBot/images/4_4.png differ
diff --git a/Lab_ChatBot/images/5_2.png b/Lab_ChatBot/images/5_2.png
new file mode 100755
index 00000000..e7866c93
Binary files /dev/null and b/Lab_ChatBot/images/5_2.png differ
diff --git a/Lab_ChatBot/images/5_add context.png b/Lab_ChatBot/images/5_add context.png
new file mode 100755
index 00000000..67ef598d
Binary files /dev/null and b/Lab_ChatBot/images/5_add context.png differ
diff --git a/Lab_ChatBot/images/6_2.png b/Lab_ChatBot/images/6_2.png
new file mode 100755
index 00000000..e3b15641
Binary files /dev/null and b/Lab_ChatBot/images/6_2.png differ
diff --git a/Lab_ChatBot/images/6_Integrations.png b/Lab_ChatBot/images/6_Integrations.png
new file mode 100755
index 00000000..c072e425
Binary files /dev/null and b/Lab_ChatBot/images/6_Integrations.png differ
diff --git a/Lab_ChatBot/images/chatbot.png b/Lab_ChatBot/images/chatbot.png
new file mode 100644
index 00000000..5bd5c959
Binary files /dev/null and b/Lab_ChatBot/images/chatbot.png differ