Microsoft Bot Framework - very high response times - Azure

I am seeing 10-second response times through every channel (WebChat and Facebook).
My endpoint is a PaaS instance located in the western United States.
The Web App is an S3 instance size and the response times are constant (even when there is only one conversation).
I have the following questions:
Is there any way to optimize this?
What are the Azure Bot Framework SLAs?

As the Bot Framework is a preview product, there are currently no SLAs.
Are you using the default state storage? If so, part of the slowdown you mentioned is probably related to it. We highly recommend implementing your own state service. There is a blog article here discussing the implementations, and there is also a repository here with samples. This is probably not 100% of your issue, but it is likely at least part of it.
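If you are on the Node.js SDK, swapping out the default state service looks roughly like the sketch below. This is a minimal example assuming the botbuilder-azure package and an Azure Table storage account; the table name, account name, and key are placeholders:

import * as builder from 'botbuilder';
import * as azure from 'botbuilder-azure';

// Placeholder storage credentials - substitute your own account values.
const tableName = 'botdata';
const azureTableClient = new azure.AzureTableClient(tableName, '<storage-account-name>', '<storage-account-key>');
const tableStorage = new azure.AzureBotStorage({ gzipData: false }, azureTableClient);

const connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,
    appPassword: process.env.MICROSOFT_APP_PASSWORD
});

const bot = new builder.UniversalBot(connector, (session) => {
    session.send('You said: %s', session.message.text);
});

// Tell the bot to use your own state store instead of the default state service.
bot.set('storage', tableStorage);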
Another thing to keep in mind is where your bot is located in relation to your WebChat client and which instance of the Bot Connector you are using; this blog post may provide more info. Please see the "Geographic Direct Line endpoints" section.

Related

Concept of Conversations in Teams Bot development

I am developing a Microsoft Teams bot using the Node.js v4 Bot Framework. This is the first time I have developed a bot, and it seems to me it is missing a core concept: conversations / previous message context. When the bot asks how I am doing and I answer "good", it doesn't seem to store my answer in an object anywhere for the next and following messages.
I have a workaround for this by pushing answers into an array, but it just seems strange that previous message context hasn't been implemented... Am I missing something?
I think what you might be missing is an understanding of Bot Framework state management. This link gives an overview of the types of state (user vs conversation) as well as the places you can store state (e.g. memory, Azure Blob storage, etc.). Be aware that Cosmos DB, proposed in the article, can be an expensive option because of the high read rate of bots (every turn results in a read, which is part of what Cosmos pricing is based on), so MongoDB, for instance, could be another possible option.
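For the v4 Node.js SDK specifically, a minimal sketch of that state machinery might look like the following. MemoryStorage here is development-only (you would swap it for a durable provider in production), and the 'mood' property is just an example name:

import { ActivityHandler, ConversationState, MemoryStorage, TurnContext, UserState } from 'botbuilder';

// Development-only storage; swap for blob/Cosmos/Mongo-backed storage in production.
const storage = new MemoryStorage();
const conversationState = new ConversationState(storage);
const userState = new UserState(storage);

// A property accessor for the piece of state we want to remember between turns.
const moodAccessor = userState.createProperty<string>('mood');

class MoodBot extends ActivityHandler {
    constructor() {
        super();
        this.onMessage(async (context, next) => {
            const previousMood = await moodAccessor.get(context, 'unknown');
            await context.sendActivity(`Last time you told me you were: ${previousMood}`);
            await moodAccessor.set(context, context.activity.text || 'unknown');
            await next();
        });
    }

    // Persist whatever state changed during this turn.
    async run(context: TurnContext): Promise<void> {
        await super.run(context);
        await conversationState.saveChanges(context);
        await userState.saveChanges(context);
    }
}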
Another approach to "state", though, is the concept of "dialogs", where there is a specific "guided conversation" the user might be going through. As an example, in a flight booking scenario you would need departure location, destination, date, time, etc., so this is a multi-turn "mini conversation", and dialogs do their own state management in this context. See "Dialogs within the Bot Framework".
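As a rough illustration of that dialog-driven approach in the v4 Node.js SDK (the 'bookFlight' and 'textPrompt' names are made up for this example):

import { ConversationState, MemoryStorage, TurnContext } from 'botbuilder';
import { DialogSet, DialogTurnStatus, TextPrompt, WaterfallDialog, WaterfallStepContext } from 'botbuilder-dialogs';

interface FlightValues { origin?: string; destination?: string; }

const conversationState = new ConversationState(new MemoryStorage());
const dialogStateAccessor = conversationState.createProperty('dialogState');
const dialogs = new DialogSet(dialogStateAccessor);

dialogs.add(new TextPrompt('textPrompt'));
dialogs.add(new WaterfallDialog('bookFlight', [
    async (step: WaterfallStepContext) => step.prompt('textPrompt', 'Where are you flying from?'),
    async (step: WaterfallStepContext) => {
        (step.values as FlightValues).origin = step.result;   // earlier answers live in dialog state
        return step.prompt('textPrompt', 'Where are you flying to?');
    },
    async (step: WaterfallStepContext) => {
        const values = step.values as FlightValues;
        values.destination = step.result;
        await step.context.sendActivity(`Searching flights from ${values.origin} to ${values.destination}...`);
        return step.endDialog();
    }
]));

// Inside your bot's message handler:
async function onTurn(context: TurnContext): Promise<void> {
    const dc = await dialogs.createContext(context);
    const result = await dc.continueDialog();        // resume a dialog already in progress
    if (result.status === DialogTurnStatus.empty) {
        await dc.beginDialog('bookFlight');           // otherwise start the guided conversation
    }
    await conversationState.saveChanges(context);
}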
As an aside, the "array" approach you're taking is somewhat similar to the in-memory state option, but it requires you to manage everything yourself, it can't easily be scaled (with the built-in state support it's easy to swap memory out for another option), and it might not be multi-user safe (depending on how you're working with the array, e.g. whether you're saving one per user).
Hope that helps

How do I send out notification messages to Skype groups using the Azure Bot Framework?

I am new to the Azure Bot Framework and I am trying to do something that I think is quite simple.
I have a Node application that needs to send notifications to various Skype groups a few times a day. I have been using skype-http for a while, but it is unreliable and not officially supported by Microsoft. So, I am looking to rebuild the notification system using Azure services.
It seems like I should be creating an Azure Bot Function, but the Bot Functions use the V3 API, which is deprecated.
It looks like Web App Bots are now the recommended option for creating bots, but they seem to solve a slightly different problem. Going this route, it also looks like I still need to set up an Azure Function as well as storage, so it seems vastly more complex than my current implementation.
My question is: Should I be using a Bot Function, a Web App Bot, or something else entirely to send notifications to multiple chat groups?
EDIT: To be clear, I am looking for an officially supported solution. skype-http regularly breaks for us due to API changes, and the other node-based skype libraries are similarly brittle.
Or, if there is no Microsoft-supported solution, please let me know; then at least I know I will be stuck using private APIs.
You should look at Azure Functions as your platform instead. Azure Functions would allow you to make the API calls you need. Additionally, you can set it to run via a trigger (timer or HTTP request).
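For example, a timer-triggered Function in Node.js/TypeScript could look roughly like this; the sendToSkypeGroups helper is hypothetical and stands in for whatever client library or API call you end up choosing for delivery:

import { AzureFunction, Context } from '@azure/functions';

// Hypothetical helper - replace with the actual client/API you choose for delivery.
async function sendToSkypeGroups(message: string, groupIds: string[]): Promise<void> {
    for (const groupId of groupIds) {
        // e.g. call your chosen library or REST endpoint here
        console.log(`Sending "${message}" to group ${groupId}`);
    }
}

// Bound to a timer trigger (the schedule lives in function.json, e.g. a CRON expression).
const timerTrigger: AzureFunction = async function (context: Context, myTimer: any): Promise<void> {
    const groups = (process.env.SKYPE_GROUP_IDS || '').split(',');
    await sendToSkypeGroups('Daily status update', groups);
    context.log('Notifications sent at', new Date().toISOString());
};

export default timerTrigger;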
Instead of skype-http, check out sky-web-plus (https://www.npmjs.com/package/sky-web-plus). It's basically a port of several other packages already out there, but is getting regular love from its developer (so, hopefully stable and up-to-date) and appears to do what you need. As the developer is active, you can reach out to him/her with any specific questions.
The Azure Bot Framework alone would not be a good fit for this purpose.
Hope this helps!

MassTransit - publish vs send and how to manage messages

I have just started using MassTransit in my project (.NET Core 2.0). It is great, but there are some concerns:
What is the difference between Publish and Send? In my scenario, I have one email service that sends email to the outside. Other services pass requests to the email service via RabbitMQ. So, in this case, should we use Publish or Send?
With Send, we need to pass the full URI of the endpoint. Is there any best practice for managing endpoints? If we have 10 commands, do we need to manage 10 endpoints?
Related to events (Publish): if one service is deployed on multiple instances and an event is published to the queue, will it be processed once, or once on each instance?
Could you please share a unit test for a consumer? With the test harness, it seems we only verify that the message was queued.
Is MassTransit ready for .NET Core 2.1?
Many thanks,
There are way too many questions for one post tbh; on SO it is better to ask more specific questions, one by one. Some of your questions already have answers on SO.
The difference between publishing events and sending commands is similar to what you expect. We actually cover some of it in the documentation.
You can handle as many message types as you want on one receive endpoint, but you need to be aware of the consequences. The best practice is to have one endpoint per command type, or at least one endpoint for related commands. The risk otherwise is that an important command might get stuck waiting in the queue until other, less important commands have been processed.
If you publish events, each endpoint (queue) gets a copy of the event, but if you have several instances of one endpoint, only one of those instances will process it. The same applies to sending commands, except that only one endpoint gets the message, and again only one of its instances will process it.
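To make the copy-per-queue vs. competing-consumer distinction concrete, here is a small sketch using the plain amqplib package rather than MassTransit itself (MassTransit builds an equivalent RabbitMQ topology for you; the exchange and queue names are just examples):

import * as amqp from 'amqplib';

async function demo(): Promise<void> {
    const connection = await amqp.connect('amqp://localhost');
    const channel = await connection.createChannel();

    // "Publish" an event: a fanout exchange delivers a copy to every bound queue
    // (one queue per subscribing service). Instances of the SAME service share a
    // queue and compete, so each service processes the event exactly once.
    await channel.assertExchange('order-submitted', 'fanout', { durable: true });
    channel.publish('order-submitted', '', Buffer.from(JSON.stringify({ orderId: 42 })));

    // "Send" a command: deliver straight to the one queue owned by the email
    // service; exactly one of its instances will pick it up and process it.
    await channel.assertQueue('send-email', { durable: true });
    channel.sendToQueue('send-email', Buffer.from(JSON.stringify({ to: 'user@example.com' })));

    await channel.close();
    await connection.close();
}

demo().catch(console.error);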
Although there is no documentation for MassTransit testing just yet, you can look at this test to see how it is done.
MassTransit is compiled for .NET 4.6 and .NET Standard 2.0. There is nothing specifically different in .NET Core 2.1 that would have any effect on MassTransit.

Chatbot - Possible to call a Watson API to respond to user queries?

A chatbot has been developed using IBM Bluemix to respond to queries from grade one students.
Suppose a question is raised: "What is the life cycle of a leaf?" As of now, the chatbot has no entities related to leaf, life cycle, etc.
The chatbot identifies the above query as irrelevant. For this case, is it possible to call any Watson knowledge API to answer such queries?
Or
Can we make any third-party searches (Google/Bing)?
Or
Is the only option to teach the chatbot more relevant entities?
You can use the Watson Discovery service:
https://www.ibm.com/watson/services/discovery/
As @Rabindra said, you can use Discovery. IBM developers built one example using the Conversation and Discovery services in Java, and I built one example in Node.js based on the Conversation simple example. You can read the README to understand how it works.
Basically, you need to know that this example has one action variable that triggers a call to Discovery when Conversation doesn't have the "relevant information" to answer the user; the Discovery service is then called to get relevant answers.
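A rough sketch of that fallback call with the (older) watson-developer-cloud Node SDK might look like this; the credentials, environment_id, and collection_id are placeholders:

const DiscoveryV1 = require('watson-developer-cloud/discovery/v1');

const discovery = new DiscoveryV1({
    username: '<discovery-username>',
    password: '<discovery-password>',
    version_date: '2017-11-07'
});

// Called when Conversation flags the question as something it has no intent/entity for.
function fallbackToDiscovery(userQuestion: string): void {
    discovery.query(
        {
            environment_id: '<environment-id>',
            collection_id: '<collection-id>',
            natural_language_query: userQuestion,
            count: 3
        },
        (err: Error | null, response: any) => {
            if (err) {
                console.error(err);
                return;
            }
            // Surface the top results back to the user as the bot's answer.
            console.log(JSON.stringify(response.results, null, 2));
        }
    );
}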
You can see more in this video from the official IBM Watson channel.
See more about Discovery Service.
See the API Reference for using Discovery Service.
You can also check out the Entity Linking Intelligence Service from Bing: https://azure.microsoft.com/en-us/services/cognitive-services/entity-linking-intelligence-service/. It is in preview for now, so you will get a limited number of queries per second, but it is free to use.

AFIncrementalStore with Parse

I am developing a social app on iOS that has many-to-many relations, local persistence, and user interaction. I have tried using the native Parse API on iOS and found it too cumbersome to do all the client-server logic, so my focus shifted to finding a syncing solution.
After some research I found AFIncrementalStore quite easy to use, and it's highly integrated with Core Data. I have just started working on this and I have two questions to ask:
1) How to do the authentication process? Is it in AFRESTClient?
2) How to set up AFRESTClient to match Parse's REST API? (an example would be great!)
P.S. I also found FTASync, which seems to be another solution. Any thoughts on this framework?
Any general suggestions on client-server syncing solutions will be highly appreciated!
Thanks,
Lei Zhang
Back with iOS 5, Apple silently rolled out NSIncrementalStore to manage the connection between APIs and persistent stores. Because I couldn't word it better myself:
NSIncrementalStore is an abstract subclass of NSPersistentStore designed to "create persistent stores which load and save data incrementally, allowing for the management of large and/or shared datasets". And while that may not sound like much, consider that nearly all of the database adapters we rely on load incrementally from large, shared data stores. What we have here is a goddamned miracle.
Source: http://nshipster.com/nsincrementalstore/
That being said, I've been working on my own NSIncrementalStore (built specifically for Parse and utilizing the Parse iOS/OS X SDK) and you're welcome to check out/use/contribute to the project at https://github.com/sbonami/PFIncrementalStore.
Take a look at this StackOverflow question and at Chris Wagner's article on raywenderlich.com.
The linked SO question has examples for how to include the authentication token with each request to Parse. So you'll just need to have the user log in first, and store their token to include it with each subsequent request.
Chris Wagner's tutorial has a sample AFHTTPClient named SDAFParseApiClient to communicate with the Parse REST API. You'd have to adapt it to be an AFRESTClient subclass, but it should give you a start.
Some other thoughts between the two solutions you're considering:
AFIncrementalStore does not allow the user to make any changes without a network connection, while FTASync keeps a full Core Data SQLite store locally and syncs changes to the server when you tell it to.
FTASync requires you to make all your synced managed objects subclasses of FTASyncParent, with extra properties for sync metadata. AFIncrementalStore keeps its metadata behind the scenes, not in your model.
FTASync appears not to be widely used and hasn't been updated in over a year; if you use it you will likely be maintaining it.
