Basic Concept of Chatbot using Wit.ai - node.js

I am trying to create a chatbot application where users can create their own bot, like Botengine. After going through Google I saw that I need some NLP API to process the user's query. As per the wit.ai basic example I can set and get data. Now I am confused: how am I going to create a bot engine?
So, as far as I understand the flow, here is an example for pizza delivery:
User will enter a welcome message, i.e. Hi, Hello ...
The welcome reply will be saved by the bot owner in my database.
User will enter some query; then I will hit the wit.ai API to process that query. Example: the user's query is "What kind of pizzas are available in your store" and wit.ai will respond with the details of the intent "pizza_type".
Then I will search my database for the intent returned by Wit.
So, is that the right flow to create a chatbot? Am I going in the right direction? Could anyone give me some link or example I can go through? I want to create this application using Node.js. I have also found some examples in node-wit, but can't figure out how to implement them.
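In case it helps to make the question concrete, here is a minimal sketch of the flow I have in mind, using node-wit with an in-memory table standing in for the bot owner's database (the token, intent names and replies are placeholders):

```
const { Wit } = require('node-wit');

const client = new Wit({ accessToken: 'WIT_SERVER_ACCESS_TOKEN' }); // placeholder token

// Replies the bot owner would have stored in the database, keyed by intent.
const repliesByIntent = {
  greeting: 'Hi! Welcome to our pizza store.',
  pizza_type: 'We have Margherita, Pepperoni and Veggie pizzas.',
};

async function handleUserMessage(text) {
  const data = await client.message(text, {});
  // The exact response shape depends on the Wit app version;
  // newer apps return an `intents` array.
  const intent = data.intents && data.intents[0] ? data.intents[0].name : null;
  return repliesByIntent[intent] || "Sorry, I didn't get that.";
}

handleUserMessage("What kind of pizzas are available in your store?")
  .then(reply => console.log(reply));
```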
Thanks

What you need is a webhook. You need to call different APIs based on the user's intent. I believe you can distinguish between different intents using the parameters available in the request. Check this out - Creating nodejs webhook for dialogflow
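For example, a minimal Node.js webhook along those lines could look roughly like this (a sketch assuming Express and the Dialogflow v2 fulfillment request/response format; the intent name is made up):

```
const express = require('express');
const app = express();
app.use(express.json());

// Dialogflow (v2) sends the matched intent in queryResult.intent.displayName.
app.post('/webhook', (req, res) => {
  const intent = req.body.queryResult.intent.displayName;

  let reply;
  if (intent === 'pizza_type') {          // example intent name
    reply = 'We have Margherita, Pepperoni and Veggie pizzas.';
  } else {
    reply = "Sorry, I didn't understand that.";
  }

  // Respond in the fulfillment format Dialogflow expects.
  res.json({ fulfillmentText: reply });
});

app.listen(3000, () => console.log('Webhook listening on port 3000'));
```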

Related

Connecting Alexa to my own NodeJS back-end

I'm back again with a question about NLP. I made my own back-end, which on one side can connect to websites, the Google Assistant and Facebook Messenger, and on the other end to Dialogflow. On the side, it logs interactions and does some other database stuff.
Now, I'm trying to connect this back-end to Alexa. I made a project which calls my endpoint. This project has one intent, which has a parameter that should capture the raw user input, send it to my back-end, process it, parse it, and send the response back. I feel like there is no real way to collect and send the raw user input, so that I can process it myself (on Dialogflow) instead of using the Amazon way of mapping intents and such.
I know Dialogflow can export to Alexa, but this is not an option for me. I really hope one of you can point me in the right direction.
I just need a way to collect the raw user input, and respond in an Alexa accepted response format.
For Actions on Google for example, I'm using a Custom Project Action Package.
Thanks a lot in advance!
To accept or get any user input, you can use sys.any in Google Assistant and AMAZON.SearchQuery in Amazon Alexa.
In Alexa, you have to add a carrier phrase to use AMAZON.SearchQuery, and you can't combine any other slot with AMAZON.SearchQuery.
So there are some limitations. I hope this answer helps you.
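For illustration, a bare-bones endpoint that pulls the raw text out of an AMAZON.SearchQuery slot and answers in the Alexa response format could look roughly like this (a sketch assuming Express; the intent name RawInputIntent and the slot name query are made up, and request signature verification is omitted):

```
const express = require('express');
const app = express();
app.use(express.json());

// Alexa posts the whole request envelope to your HTTPS endpoint.
app.post('/alexa', async (req, res) => {
  const request = req.body.request;

  let speech = 'Welcome, what would you like to ask?';
  if (request.type === 'IntentRequest' && request.intent.name === 'RawInputIntent') {
    // The AMAZON.SearchQuery slot carries the (mostly) raw user utterance.
    const rawText = request.intent.slots.query.value;
    speech = await forwardToBackend(rawText); // your Dialogflow-backed back-end
  }

  // Minimal Alexa-accepted response format.
  res.json({
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text: speech },
      shouldEndSession: false,
    },
  });
});

// Placeholder for the call to your own NLP back-end.
async function forwardToBackend(text) {
  return `You said: ${text}`;
}

app.listen(3000);
```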

How to create a chatbot with Facebook Messenger-like templates

I'm trying to create a chatbot for use in a chat app I've created. I basically need the chatbot to send me replies that have message templates like in Facebook Messenger. For example, if I type in "what's the weather like", I want my chatbot's reply to look like Facebook's media template, linked here: Media Template
Does anyone have any tutorials or links I can follow?
Thank you in advance.
Cheers!
Usually the workflow of a chat application is as follows:
Message providers (Facebook, Twitter, Slack, etc.) receive messages from the user.
The message is sent to the configured endpoint (your web server) according to the settings provided in the Facebook developer page (reference).
In the web server you classify the intent, prepare the response according to the request, and send the response back.
So in the 3rd step the web server gives responses based on the platform you are responding to (reference); since in your case it's your own platform, you need to design your own UI based on the response format, or you can use some predefined HTML templates.
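As a rough illustration, the web server could return a payload shaped like Messenger's generic template and let your own chat UI render it as a card (a sketch; the route, request fields and weather data are made up):

```
const express = require('express');
const app = express();
app.use(express.json());

// Sketch of a webhook reply that mimics Messenger's template structure
// so your own chat UI can render a card instead of plain text.
app.post('/message', (req, res) => {
  const text = req.body.text || '';

  if (/weather/i.test(text)) {
    // Shaped after Messenger's generic template; your UI decides how to draw it.
    return res.json({
      attachment: {
        type: 'template',
        payload: {
          template_type: 'generic',
          elements: [{
            title: 'Weather today',
            subtitle: 'Sunny, 24°C',                       // made-up data
            image_url: 'https://example.com/sunny.png',
            buttons: [{ type: 'web_url', url: 'https://example.com/forecast', title: 'Full forecast' }],
          }],
        },
      },
    });
  }

  res.json({ text: 'Sorry, I can only do weather right now.' });
});

app.listen(3000);
```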
I hope this answer gives you some direction to work on.

Control your device with custom commands using Actions on Google

Just getting started with Assistant features on the RPi. I am able to successfully implement up to this point and am wondering a few things.
Scenario:
user: hey google "please turn on my living room lights"
my code in hotword.py: has a function to perform the same action based on ON_RECOGNIZING_SPEECH_FINISHED
RPi/Google Home: I am not sure how to respond to that
I was able to capture the request query asked by the user using ON_RECOGNIZING_SPEECH_FINISHED = Args.text(str) and use it in my logic to perform the task. However, at the same time, "OK Google" is also responding with its own answer.
To mitigate this problem, I created a Google Action; now it understands my query and responds with the intent from API.AI. However, it doesn't act on turning the lights ON. So I am wondering how I can read the response from Google Home/API.AI as text and change my code to act on it locally.
appreciate it.
You will not get the response as text.
To get the response to the client app, use a webhook in API.AI and send a message using FCM to the client app.
Read the FCM message in the client app and perform the corresponding actions.
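For example, the API.AI webhook could push the fulfilment to the device with firebase-admin and FCM, roughly like this (a sketch assuming the API.AI v1 webhook format; the intent name, device token handling and data payload keys are assumptions):

```
const express = require('express');
const admin = require('firebase-admin');

admin.initializeApp(); // assumes GOOGLE_APPLICATION_CREDENTIALS is set

const app = express();
app.use(express.json());

const DEVICE_FCM_TOKEN = 'your-device-registration-token'; // the RPi client's token

app.post('/webhook', async (req, res) => {
  // API.AI (Dialogflow v1) puts the matched intent under result.metadata.intentName.
  const intent = req.body.result.metadata.intentName;

  if (intent === 'turn_on_lights') {                        // example intent name
    await admin.messaging().send({
      token: DEVICE_FCM_TOKEN,
      data: { action: 'lights_on', room: 'living_room' },   // made-up payload keys
    });
  }

  // Let the Assistant speak a confirmation.
  res.json({
    speech: 'Okay, turning on the living room lights.',
    displayText: 'Okay, turning on the living room lights.',
  });
});

app.listen(3000);
```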
I finally was able to figure out multiple ways. I answered this in another Stack question; find more details in this post.
There are multiple ways to handle this; since Google doesn't give the voice transcript, we let Google say our transcript, which is kind of a solution for now.

Creating a Follow-up Intent in API.AI

I am trying to create a chatbot through API.AI. My chatbot has a flow and it responds based on the user's query, so I have created many follow-up intents. But my follow-up intents are not connecting: it is returning the same response as for the initial question. Where did I go wrong?
It would help to see what your intent looks like. From what you're saying, though, it may be that the previous question is not getting the answer it's looking for, so it will just continue asking it until it does. Check your entity value under action within your intent.

How to implement a chatbot to human executive switch using Microsoft Bot Framework?

The exact point being that I've created a bot that can take input from users as free-form text and return relevant web links. The problem is that in case the bot is not able to understand the user's query, control of the conversation has to be passed on to a human executive.
I've researched for over 2 days but could not find any such implementation. The closest I came was third-party applications like ChatFuel and letsclap.io, which provide such a provision. So there should be a way; I am just not able to find it.
Any help on this would be appreciated.
One possible way is to make a bridge; the idea is as follows:
user sends something that the bot cannot reply to (conv-1)
make a new conversation with your human executive (conv-2)
forward the user's message to conv-2
the human executive replies to the bot (conv-2)
capture the message and forward it back to conv-1
See this link on how to start a new conversation:
https://docs.botframework.com/en-us/csharp/builder/sdkreference/routing.html#sendtoconversation
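With the Node SDK (botbuilder v3) the same bridge idea can be sketched using proactive messages: save each side's conversation address and relay text across them. This is only a sketch; it assumes the human executive registers by sending a "register" message, and it handles a single waiting user:

```
const builder = require('botbuilder');
const restify = require('restify');

const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});
const bot = new builder.UniversalBot(connector);

const server = restify.createServer();
server.post('/api/messages', connector.listen());
server.listen(3978);

let humanAddress = null;        // conv-2: captured when the executive says "register"
let pendingUserAddress = null;  // conv-1: the user currently waiting for a human

bot.dialog('/', (session) => {
  const text = session.message.text;

  if (text === 'register') {
    humanAddress = session.message.address;           // the executive's conversation
    return session.send('Registered as human executive.');
  }

  if (humanAddress && session.message.address.conversation.id === humanAddress.conversation.id) {
    // Reply from the executive (conv-2): forward it back to the waiting user (conv-1).
    if (pendingUserAddress) {
      bot.send(new builder.Message().address(pendingUserAddress).text(text));
    }
    return;
  }

  if (botCanAnswer(text)) {
    session.send(answer(text));
  } else if (humanAddress) {
    // Bot cannot reply: forward the user's message to the executive (conv-2).
    pendingUserAddress = session.message.address;
    bot.send(new builder.Message().address(humanAddress).text(`User asked: ${text}`));
    session.send('Let me check with a colleague...');
  }
});

// Placeholders for your own understanding/answering logic.
function botCanAnswer(text) { return false; }
function answer(text) { return ''; }
```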
Hope it helps,
Maybe you can create some APIs in a web application that will be used by your bot.
If the LUIS intent "None" is called, you make a call to that API and start a new conversation with a human.
You can use this same process to manage all conversations in a web application chat control.
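With the Node SDK (botbuilder v3) that could be sketched roughly as follows: route the LUIS "None" intent to a handler that calls your handoff API (the LUIS model URL, the "FindLinks" intent and the handoff function are placeholders):

```
const builder = require('botbuilder');
const restify = require('restify');

const connector = new builder.ChatConnector({
  appId: process.env.MICROSOFT_APP_ID,
  appPassword: process.env.MICROSOFT_APP_PASSWORD,
});
const bot = new builder.UniversalBot(connector);

const server = restify.createServer();
server.post('/api/messages', connector.listen());
server.listen(3978);

// LUIS model URL is a placeholder.
const recognizer = new builder.LuisRecognizer(process.env.LUIS_MODEL_URL);
const intents = new builder.IntentDialog({ recognizers: [recognizer] });
bot.dialog('/', intents);

intents.matches('FindLinks', (session) => {
  session.send('Here are some links: ...');  // your normal flow
});

// "None" fires when LUIS could not understand the query: hand off to a human.
intents.matches('None', (session) => {
  notifyHandoffApi(session.message.address, session.message.text); // your web application's API
  session.send('Let me connect you to a human executive.');
});

// Placeholder: call your web application, which opens the conversation with a human.
function notifyHandoffApi(address, text) { /* ... */ }
```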
