azure, botframework, azure-language-understanding, direct-line-botframework, azure-stack

Is it possible to use the Direct Line API protocol, which normally goes through an Azure Bot Service, in an on-premise environment?


The question I have is: can you use the Microsoft Bot Framework service in an on-premise solution, ideally through a Docker container, ~~or at least an Azure Stack installation~~ (not currently available through Azure Stack)? We need a 100% on-premise solution that utilizes LUIS and other Azure services but keeps the chat bot itself on-premise.

The problem is that the bot effectively requires the Direct Line API, which authenticates with a token. That token, unless you use the secret directly, is generated by an Azure service, and the Direct Line API itself is only available through a bot application registered with Azure.
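
For reference, this is the piece that ties you to the cloud: a client normally exchanges the channel secret for a short-lived token against the Azure-hosted Direct Line endpoint. A minimal sketch of that exchange (endpoint and response shape per the public Direct Line 3.0 docs; the environment variable name is just illustrative):

```typescript
// Sketch: standard (cloud) Direct Line token generation.
// The secret comes from the bot's Direct Line channel registration in Azure,
// which is exactly the dependency that blocks a fully on-premise deployment.
const DIRECT_LINE_SECRET = process.env.DIRECT_LINE_SECRET ?? "";

async function generateDirectLineToken(): Promise<string> {
  const res = await fetch(
    "https://directline.botframework.com/v3/directline/tokens/generate",
    {
      method: "POST",
      headers: { Authorization: `Bearer ${DIRECT_LINE_SECRET}` },
    }
  );
  if (!res.ok) {
    throw new Error(`Token generation failed: ${res.status}`);
  }
  const body = await res.json();
  return body.token; // short-lived token scoped to a single conversation
}
```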

Although there is LUIS container support, meaning a local Docker container that can pull down an Azure Cognitive Services model and serve it from that container, there doesn't seem to be any equivalent support for the Bot Framework service, which seems bizarre: why have one without the other?

https://learn.microsoft.com/en-us/azure/cognitive-services/cognitive-services-container-support
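
To illustrate what the container route gives you for LUIS specifically: once the container from the linked docs is running locally, a bot can query it instead of the cloud endpoint. This sketch assumes the container listens on port 5000 and exposes the same v2.0 prediction route as the hosted service; the app ID is a placeholder:

```typescript
// Sketch: querying a locally running LUIS container instead of the cloud endpoint.
// Assumes the container was started per the linked Cognitive Services docs and
// serves the usual prediction route on port 5000.
const LUIS_APP_ID = "<your-luis-app-id>"; // placeholder
const LOCAL_LUIS_ENDPOINT = "http://localhost:5000";

async function predictIntent(utterance: string) {
  const url =
    `${LOCAL_LUIS_ENDPOINT}/luis/v2.0/apps/${LUIS_APP_ID}` +
    `?q=${encodeURIComponent(utterance)}`;
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`LUIS container request failed: ${res.status}`);
  }
  return res.json(); // { topScoringIntent, entities, ... } as in the cloud API
}
```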

But that's OK if an Azure Stack installation would perhaps solve a lot of the on-premise requirements. It could even be a hybrid variation where LUIS and other aspects go through traditional cloud services, but the bot service would have to be on-premise and able to utilize the Direct Line API, if that's possible. Or what is another solution?

Would it have to be traditional RESTful API calls, and what would be missing compared to a Node.js or C# bot deployed to the cloud? Perhaps I am missing something in the architecture, but the need described is 100% on-premise.


Solution

  • You will want to look into offline-directline. This is an unofficial package, but it is open source; a minimal hosting sketch follows below.
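
As a rough idea of how it is wired up (based on the package's README; the exact function signature may differ between versions, and the ports/URLs below are just examples): offline-directline mounts the Direct Line REST routes on a local Express server and forwards activities to a locally hosted bot, so no Azure-issued token is needed.

```typescript
// Sketch: hosting a local Direct Line-compatible endpoint with offline-directline.
// The package ships without type definitions, so plain require() keeps TypeScript happy.
const express = require("express");
const directline = require("offline-directline");

const app = express();

// Expose the Direct Line surface on port 3000 and forward incoming activities
// to a bot running locally at /api/messages (the default Bot Framework SDK route).
directline.initializeRoutes(app, 3000, "http://127.0.0.1:3978/api/messages");
```

A Direct Line client (for example Web Chat) can then be pointed at the local endpoint (e.g. `http://127.0.0.1:3000/directline`) instead of the Azure-hosted service, keeping the whole chat path on-premise.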