Can one import a DialogFlow agent into LUIS - dialogflow-es

I have a fully mature agent in DialogFlow with utterances, intents and entities. Is there a way to quickly bring this info over into a new app in Luis? I really don't want to reenter the information. I know DialogFlow allows for a zip export. Is this format recognized by Luis?

This format is not recognized by LUIS, but both Dialogflow and LUIS have APIs; you can use those APIs to extract the utterances from Dialogflow and then POST them into LUIS.
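As a rough sketch of that idea (untested; it reads the unpacked Dialogflow ES zip export instead of calling the Dialogflow API, and the export file layout, region, and v2.0 batch-examples route are assumptions to verify against the current LUIS authoring docs), a Node.js script could look something like this:

```typescript
import { readFileSync, readdirSync } from "fs";
import { join } from "path";

// Placeholder paths and keys -- adjust to your own export and LUIS app.
const EXPORT_DIR = "./dialogflow-export/intents";
const LUIS_ENDPOINT = "https://westus.api.cognitive.microsoft.com";
const LUIS_APP_ID = "<app-id>";
const LUIS_VERSION = "0.1";
const LUIS_AUTHORING_KEY = "<authoring-key>";

// A Dialogflow ES export contains one "<intent>_usersays_en.json" file per intent,
// each holding an array of { data: [{ text: string }] } training phrases.
function readUtterances(): { text: string; intentName: string }[] {
  const examples: { text: string; intentName: string }[] = [];
  for (const file of readdirSync(EXPORT_DIR)) {
    const match = file.match(/^(.*)_usersays_en\.json$/);
    if (!match) continue;
    const intentName = match[1];
    const phrases = JSON.parse(readFileSync(join(EXPORT_DIR, file), "utf8"));
    for (const phrase of phrases) {
      // Each training phrase is split into text fragments; join them back together.
      const text = phrase.data.map((d: { text: string }) => d.text).join("");
      examples.push({ text, intentName });
    }
  }
  return examples;
}

// POST the labeled examples to the LUIS authoring API in batches
// (the batch endpoint accepts up to 100 examples per call). Requires Node 18+ for fetch.
async function importToLuis(examples: { text: string; intentName: string }[]) {
  const url = `${LUIS_ENDPOINT}/luis/api/v2.0/apps/${LUIS_APP_ID}/versions/${LUIS_VERSION}/examples`;
  for (let i = 0; i < examples.length; i += 100) {
    const res = await fetch(url, {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": LUIS_AUTHORING_KEY,
        "Content-Type": "application/json",
      },
      body: JSON.stringify(examples.slice(i, i + 100)),
    });
    if (!res.ok) throw new Error(`LUIS import failed: ${res.status}`);
  }
}

importToLuis(readUtterances()).catch(console.error);
```

You will likely need to create the intents in the LUIS app first (in the portal or via the authoring API) before attaching labeled examples to them.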

Related

Is it possible to use multiple LUIS applications in a single bot service?

I have an Azure Bot Service integrated with a LUIS application where I have some intents.
What I want is to map the same example utterances to different intents in LUIS.
But there is a limitation that the same example can't be assigned to multiple intents.
So I created another application in LUIS with the same examples but a different intent name.
My problem is that I now have to connect a single Bot Service to two different LUIS applications.
Can I do this in Azure Bot Services in Node.js?
Yes, it is absolutely possible to make use of multiple LUIS applications in one bot. The dispatch tool helps determine which LUIS model best matches the user input; it does this by creating a single LUIS app that routes user input to the correct model. Also make sure your intents do not overlap; you can use prediction data from LUIS to check whether they do.
The dispatch model is used in cases when:
Your bot consists of multiple modules and you need to route users' utterances to those modules and evaluate the bot integration.
You want to evaluate the quality of intent classification for a single LUIS model.
You want to create a text classification model from text files.
Refer to the NLP with Dispatch sample, which is a bot dealing with multiple LUIS models, and to the documentation that provides more information about using multiple LUIS and QnA Maker models.
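The routing pattern from that sample looks roughly like the sketch below in Node.js. This is a minimal illustration, not the sample itself: the app IDs, keys, and intent names (l_Faq, l_Orders) are placeholders; the real l_* intent names are whatever the dispatch tool generated for your child apps.

```typescript
import { ActivityHandler, TurnContext } from "botbuilder";
import { LuisRecognizer } from "botbuilder-ai";

// Hypothetical endpoint and keys -- replace with your dispatch and child LUIS apps.
const endpoint = "https://westus.api.cognitive.microsoft.com";
const endpointKey = "<prediction-key>";

const dispatchRecognizer = new LuisRecognizer(
  { applicationId: "<dispatch-app-id>", endpointKey, endpoint },
  { includeAllIntents: true },
  true
);
const faqRecognizer = new LuisRecognizer(
  { applicationId: "<first-luis-app-id>", endpointKey, endpoint }
);
const ordersRecognizer = new LuisRecognizer(
  { applicationId: "<second-luis-app-id>", endpointKey, endpoint }
);

export class DispatchBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context: TurnContext, next) => {
      // The dispatch app only decides which child LUIS app should handle this turn.
      const dispatchResult = await dispatchRecognizer.recognize(context);
      const topIntent = LuisRecognizer.topIntent(dispatchResult);

      if (topIntent === "l_Faq") {
        const result = await faqRecognizer.recognize(context);
        await context.sendActivity(`FAQ intent: ${LuisRecognizer.topIntent(result)}`);
      } else if (topIntent === "l_Orders") {
        const result = await ordersRecognizer.recognize(context);
        await context.sendActivity(`Orders intent: ${LuisRecognizer.topIntent(result)}`);
      } else {
        await context.sendActivity("Sorry, I didn't understand that.");
      }
      await next();
    });
  }
}
```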
Hope this helps!!

Extract LUIS intent and score and train the LU from LUIS portal instead of from composer UI

I am exploring Composer. I already use the Bot Framework C# SDK in Azure, connected to LUIS. We developed a one-step FAQ bot (not multi-turn conversations) with only one main dialog. We extract the LUIS intent and score whenever a user types text, and we get the answer from a SharePoint list of FAQs by passing the identified LUIS intent as a column filter.
Is it possible to do the same in Composer, i.e. get the identified intent and score for each input we receive from the user?
Also, instead of training the LU model in Composer, can we train the utterances in the LUIS portal and consume the same intent/score in Composer?
Yes, it is possible to use LUIS inside Bot Framework Composer to recognize intents and extract entities. Composer allows you to put conditions on recognized intents, which you can use to make sure the next steps only fire when the score is >= x.
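For example, an intent trigger's condition in Composer is an adaptive expression; with a made-up intent name it could be:

```
#IdentifyFaq.Score >= 0.8
```

The recognizer output also lands in memory under turn.recognized (for example turn.recognized.intent and turn.recognized.score, per the Composer memory documentation), which covers the "extract the intent and score" part of your scenario.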
You can see how to use this in action and how the scoring works here (starting at 27:38):
https://youtu.be/ZNcfIgO8biw?t=1658
You can still make changes in the LUIS portal, but at this time it is not recommended, as your changes will be overwritten the next time you publish to LUIS from Bot Framework Composer. If you do make changes in the LUIS portal directly, make sure to export the .lu and re-integrate your LUIS model in Bot Framework Composer so you don't lose those changes.
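For reference, the .lu format you would round-trip is plain text: an intent heading followed by its example utterances, e.g. with a hypothetical intent:

```
# CheckOrderStatus
- where is my order
- has my package shipped yet
```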

What is LUIS "Speech Requests"?

The LUIS pricing page states
Standard-Web/Container
Text Requests €1.265 per 1,000 transactions
Speech Requests €4.639 per 1,000 transactions
"https://azure.microsoft.com/en-in/pricing/details/cognitive-services/language-understanding-intelligent-services/"
What does "speech requests" refer to?
Does this mean it's possible to send audio to LUIS instead of text? I can find only API and SDK methods accepting text input. Where is the corresponding documentation?
Yes, it is possible. The documentation includes a tutorial that walks you through the steps to use the Speech SDK to develop a C# console application that derives intents from utterances spoken into your device's microphone.
LUIS integrates with the Speech Services to recognize intents from speech. You don't need a Speech Services subscription, just LUIS.
The price corresponding to the "Speech Requests" feature is for the speech transcriptions you do using LUIS.
The Speech SDK is also available in many languages and for many platforms.
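Since the SDK exists outside C# too, here is a rough Node.js sketch of the same idea (the package name and calls are from the JavaScript Speech SDK as I understand it; the keys, region, app ID, and WAV file are placeholders to adapt):

```typescript
import * as fs from "fs";
import * as sdk from "microsoft-cognitiveservices-speech-sdk";

// Per the docs quoted above, the LUIS endpoint key and region can be used
// directly as the speech subscription -- no separate Speech resource needed.
const speechConfig = sdk.SpeechConfig.fromSubscription("<luis-endpoint-key>", "<luis-region>");

// In Node.js the simplest input is a WAV file; in a browser you could use
// AudioConfig.fromDefaultMicrophoneInput() instead.
const audioConfig = sdk.AudioConfig.fromWavFileInput(fs.readFileSync("utterance.wav"));

const recognizer = new sdk.IntentRecognizer(speechConfig, audioConfig);

// Attach the LUIS app so its intents are returned alongside the transcript.
const model = sdk.LanguageUnderstandingModel.fromAppId("<luis-app-id>");
recognizer.addAllIntents(model);

recognizer.recognizeOnceAsync(
  (result) => {
    if (result.reason === sdk.ResultReason.RecognizedIntent) {
      console.log(`Text: ${result.text}`);
      console.log(`Intent: ${result.intentId}`);
    } else {
      console.log(`No intent recognized (reason ${result.reason}): ${result.text}`);
    }
    recognizer.close();
  },
  (err) => {
    console.error(err);
    recognizer.close();
  }
);
```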

How to view missed user utterances in Microsoft Azure Bot framework

I have developed a bot using the Microsoft Azure Bot Framework, and now I want to see which user utterances were missed / not mapped to an intent. I tried to look around but couldn't find anything related to this. Is it possible to see a log of missed utterances?
There is no built-in capability in Bot Framework or LUIS for this. You will have to log the utterances that go through the None intent somewhere, or use an analytics service such as Application Insights, to get the information you are looking for.
Some initial information is available at https://learn.microsoft.com/en-us/bot-framework/portal-analytics-overview
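A minimal sketch of that logging idea in a Node.js bot (untested; the keys, LUIS app ID, score threshold, and event name are assumptions), using the applicationinsights package to record anything that falls into None or scores very low:

```typescript
import { ActivityHandler, TurnContext } from "botbuilder";
import { LuisRecognizer } from "botbuilder-ai";
import * as appInsights from "applicationinsights";

appInsights.setup("<instrumentation-key-or-connection-string>").start();
const telemetry = appInsights.defaultClient;

const recognizer = new LuisRecognizer({
  applicationId: "<luis-app-id>",
  endpointKey: "<prediction-key>",
  endpoint: "https://westus.api.cognitive.microsoft.com",
});

export class LoggingBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context: TurnContext, next) => {
      const result = await recognizer.recognize(context);
      const topIntent = LuisRecognizer.topIntent(result);
      const score = result.intents[topIntent]?.score ?? 0;

      // Treat None (or a very low score on any intent) as a "missed" utterance
      // and record it so it can be reviewed and added to the LUIS model later.
      if (topIntent === "None" || score < 0.5) {
        telemetry.trackEvent({
          name: "MissedUtterance",
          properties: { text: context.activity.text, topIntent, score: String(score) },
        });
      }
      await next();
    });
  }
}
```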
LUIS provides a feature called Active Learning where it tracks utterances it is relatively unsure of, and places them in a list called Suggested Utterances. More information can be found here: https://learn.microsoft.com/en-us/azure/cognitive-services/luis/label-suggested-utterances
In the active learning process, LUIS examines all the utterances that have been sent to it, and calls to your attention the ones that it would like you to label. LUIS identifies the utterances that it is relatively unsure of and asks you to label them. Suggested utterances are the utterances that your LUIS app suggests for labeling.

How to transfer conversation from Bot to human agents?

I have an MS Bot client which the user will interact with. The bot will use LUIS models for NLP. For any queries tagged under the None intent, we should involve a human agent immediately. What is the best way to transfer the conversation from the bot client to a human agent?
I think the None intent is not the right trigger for diverting the chat to a human assistant. You should use Text Analytics to find the sentiment of the user's replies, and if the sentiment scores negative, redirect the chat to a human agent.
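A minimal sketch of that sentiment check using the Azure Text Analytics SDK for JavaScript (the resource endpoint, key, and 0.7 threshold are placeholders; tune the threshold to your own traffic):

```typescript
import { TextAnalyticsClient, AzureKeyCredential } from "@azure/ai-text-analytics";

const client = new TextAnalyticsClient(
  "https://<your-resource>.cognitiveservices.azure.com/",
  new AzureKeyCredential("<text-analytics-key>")
);

// Returns true when the user's message reads as clearly negative,
// which the bot can use as its cue to hand the conversation to a human agent.
async function shouldHandOff(userText: string): Promise<boolean> {
  const [result] = await client.analyzeSentiment([userText]);
  if (result.error) return false;
  return result.sentiment === "negative" && result.confidenceScores.negative >= 0.7;
}

// Example use inside the bot's message handler:
// if (await shouldHandOff(context.activity.text)) { /* route to agent queue */ }
```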
Microsoft Bot Framework provides a setting to transfer from bot to human when needed or when the bot is unable to understand; there are documented steps for setting this up on the Microsoft platform. This changes depending on the chatbot platform: in Dialogflow, for example, you use a JSON response in which you can reference the human agent ID and set up something like a "Talk to Human" button to make the transfer.
