Chatbots – how the Azure bot framework is changing the AI game

What are Chatbots?

Communication underpins intelligence, and language underpins communication. But language is complex and must be interpreted through the prism of intent and context. For example:

Take the term “thong”: in Australian slang it means flip-flops, a meaning lost on anyone unfamiliar with that usage, because in most other countries the same word means underwear.

This is where bots, specifically chatbots, come into play. They allow users to interact with computer systems through natural language, and, through machine learning and cognitive APIs, they support the training of models that capture language, intent and meaning.

A chatbot must be able to leverage a trained understanding of human language so that it knows how to respond to a user request and what to do next. So when “John” (who you will meet below) asks the computer “do you sell thongs?”, the computer understands the question in the correct context.

Sounds cool, but complicated? Things have become much easier

Five years ago, embarking on a project to build an intelligent chatbot would have meant engaging an array of natural language processing specialists. It wasn’t affordable for companies outside the Fortune 500.

How times have changed. With the development of natural language processing toolkits and bot-building frameworks such as wit.ai and api.ai, web application and serverless developers now have the means to create simple yet intelligent chatbots without needing a natural language processing specialist.

There are several options for building a chatbot, but in this article we investigate the Microsoft Bot Framework and introduce our own EVA (the Exposé Virtual Agent), a chatbot built with that framework. But first, let’s have a quick look at why businesses should care – what are the business benefits?

Why should businesses care?

It’s mostly about your customer experience!

We have all dealt with customer call centres. The experience can be slow and painful, mainly because the staff member on the other side of the call has to navigate multiple CRM and other systems to find the appropriate answers and next actions.

Chatbots are different. Provided they can hold a conversation with the customer, they can dig through huge amounts of information to pick out the best “nugget” for that customer. They can then troubleshoot and find a solution, or even recommend or initiate the next course of action.

Let’s look at how this can be achieved with the Microsoft Bot Framework.

What is the Microsoft bot framework?

The Microsoft Bot Framework is a platform for building, connecting, testing and deploying intelligent, powerful bots. It brings all of Microsoft’s bot-related technologies together easily and efficiently, and its core foundation is the Azure Bot Service.

The Azure Bot Service manages the desired interaction points, natural language processing tools and data sources. All interactions pass through the Bot Service before any natural language or cognitive toolkits are invoked, and the service uses those interactions to draw on information from a variety of data sources, for example Azure SQL Database.

In Figure 1, “John” interacts with the Bot Service via a channel (the medium he uses to communicate with the computer in natural language). Many readers will already have used channels such as Skype and Slack to interact with other humans; they can now use them to interact with computers too.

Figure 1 – Bot interaction

John is essentially asking about thongs and their availability, and he ends up with all the information he needs to buy the product. The Bot Framework interacts with the broader Cognitive Services APIs (in this example Language Understanding and a knowledge base) and various external sources of information, whilst machine learning continually learns from the conversation.
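To make the moving parts concrete, below is a minimal sketch of a dialog built with the Bot Builder SDK v3 for C# (the SDK that was current at the time of writing). The class name and reply text are illustrative only; a production bot would hand the incoming message to Language Understanding or a knowledge base rather than simply echo it.

using System;
using System.Threading.Tasks;
using Microsoft.Bot.Builder.Dialogs;
using Microsoft.Bot.Connector;

// A minimal Bot Framework dialog: every message the user sends via a channel
// (Skype, Slack, web chat, etc.) arrives in MessageReceivedAsync.
[Serializable]
public class RootDialog : IDialog<object>
{
    public Task StartAsync(IDialogContext context)
    {
        // Wait for the first message from the user.
        context.Wait(MessageReceivedAsync);
        return Task.CompletedTask;
    }

    private async Task MessageReceivedAsync(IDialogContext context, IAwaitable<IMessageActivity> result)
    {
        var activity = await result;

        // This is where the utterance would be passed to LUIS or a knowledge base.
        await context.PostAsync($"You asked: \"{activity.Text}\" – let me check that for you.");

        // Keep the conversation going.
        context.Wait(MessageReceivedAsync);
    }
}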

Let’s look at a local government example:

A council ratepayer interacts with the council’s bot via the council website and asks for information on rubbish collection. At this point the bot simply refers to a particular knowledge base and, in addition, to other sources of information such as the website, an intranet site or a database. The bot’s response at this stage is informative. A response could, for example, be: “Rubbish is collected each week in Parkside on Friday mornings between 5:30am and 9am. General waste must go in the red bin and is collected each week. Recyclables in the Yellow bin and Garden Waste in the Green bin are alternated each week.”

The user realises he has no Green bin and so asks the bot where he can obtain one.

The bot now uses the Language Understanding APIs and picks up the words “where can…be obtained” as the user’s intent, and “Bin” and “Green” as entities (these could just as easily have been “Yellow Bin” or “Rates Bill”, etc.). This invokes an interaction with the council’s asset application to order the required asset, and likely also raises any associated charges through the billing system.
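To make this concrete, the sketch below shows how a bot might pass the ratepayer’s utterance to the LUIS (v2) REST endpoint and read back the top-scoring intent and entities. The region, app ID, key and the intent name in the comments are placeholders, not values from a real council model.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

// A sketch of calling LUIS directly over REST. Region, app ID and key are placeholders.
public static class LuisSketch
{
    private const string Region = "australiaeast";
    private const string AppId = "<your-luis-app-id>";
    private const string Key = "<your-luis-subscription-key>";

    public static async Task HandleUtteranceAsync(string utterance)
    {
        using (var client = new HttpClient())
        {
            var url = $"https://{Region}.api.cognitive.microsoft.com/luis/v2.0/apps/{AppId}" +
                      $"?subscription-key={Key}&q={Uri.EscapeDataString(utterance)}";

            var json = JObject.Parse(await client.GetStringAsync(url));

            // e.g. an intent such as "ObtainAsset" with entities "bin" and "green".
            var intent = (string)json["topScoringIntent"]["intent"];
            Console.WriteLine($"Intent: {intent}");

            foreach (var entity in json["entities"])
                Console.WriteLine($"Entity: {entity["entity"]} ({entity["type"]})");

            // A real bot would now branch on the intent, e.g. raise an order
            // in the council's asset system and a charge in the billing system.
        }
    }
}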

The question therefore leads to a booking, a delivery and a bill – all without the ratepayer having to visit or call the council office, and with no on-hold telephone waits.

Who is our own Eva?

Eva – Exposé Virtual Assistant

It’s just been Christmas time, and Eva joined the festivities 😊

If you browse to the Exposé website, http://exposedata.com.au/, and select “Chat with us now”, you will meet Eva. Eva was initially (in version 1) built to act as an intermediary between the website visitor and our knowledge base of questions and answers: you ask a question and she returns an answer. She learns from the questions and answers using machine learning in order to improve the accuracy of her responses. The net result is users spending less time searching for information on our website.

Eva version 2 was meant to solve our main pain point: what happens when the content on the web (or blog) site changes? With version 1 we would have had to retrain Eva to align with the new or altered content. So in version 2 we allowed Eva to dynamically search our WordPress blog site (where most of the content changes occur) so that she can answer user questions with up-to-date information.
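As an illustration of the approach (rather than Eva’s actual implementation), the standard WordPress REST API exposes a search endpoint that a bot can query for matching posts. A minimal sketch, assuming a WordPress site that exposes the wp/v2 API at a placeholder URL:

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

// A sketch of searching a WordPress blog for posts on a topic.
// The site URL is a placeholder; swap in the blog's real address.
public static class BlogSearchSketch
{
    public static async Task SearchAsync(string topic)
    {
        using (var client = new HttpClient())
        {
            var url = "https://<your-wordpress-site>/wp-json/wp/v2/posts" +
                      $"?search={Uri.EscapeDataString(topic)}&per_page=3";

            var posts = JArray.Parse(await client.GetStringAsync(url));

            // Each post carries a rendered title and a link the bot can hand back to the visitor.
            foreach (var post in posts)
                Console.WriteLine($"{post["title"]["rendered"]} -> {post["link"]}");
        }
    }
}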

And if a user’s question cannot be answered, we log it to an analytics platform to gain insight into the questions visitors are asking.

Eva – Analytics

In addition, we trained a language model in Microsoft’s Language Understanding Intelligent Service (LUIS) and built functionality inside the Azure Bot Service to draw on content from the Exposé WordPress blog.

An example of an interaction with Eva can be seen below. As a few blog posts include videos, Eva will identify these and advise the visitor if there is a video on the requested subject.

Eva – example interaction

Eva clearly found a video on predictive analytics on the blog site and so returned a link to it. But she could not find anything on cats (we believe everyone loves cat videos 😊) and informed the visitor of this gap. She then presented the visitor with an option to contact us for more information.

Eva has learnt to understand the context of the topic in question. The answer is tailored to how the question about “Predictive Analytics” is asked. For example…

Chat

Go and try this for yourself, replacing “predictive analytics” with any of the topics below to get a relevant and contextual answer.

  • Advanced Analytics
  • Artificial Intelligence
  • Virtual Reality *
  • Augmented Reality *
  • Big Data *
  • Bot Framework
  • Business Intelligence
  • Cognitive Services *
  • Data Platform
  • Data Visualization *
  • Data Warehouse
  • Geospatial
  • IoT *
  • Machine Learning *
  • Predictive Analytics *

* Note that at the time of publishing this article we only have videos for the topics marked with an asterisk. A comprehensive list of videos can be found here

Eva is ever evolving, and she will soon become better at answering leading and chained questions too.

GOTCHA: Eva was developed whilst the Azure Bot Service was in preview; bot names must now contain at least four characters, so a three-character name like “Eva” could no longer be registered.

Did this really help?

Often technology that looks appealing lacks a true business application.

But as you have seen from the example with Eva, we asked her about a video on a particular topic. Now imagine Eva drawing instead on your intranet (e.g. SharePoint), data held in a database, or even an operational system as her source of information.

Authors: Chris Antonello (Data Analytics Consultant, Exposé) & Etienne Oosthuysen (Head of Technology and Solutions, Exposé)

Transforming the business into a data-centric organisation through an Advanced Analytics and Big Data solution – our ACH Group case study

Big Data

An Advanced Analytics and Big Data solution allows for the acquisition, aggregation and blending of large volumes of data, often derived from multiple disparate sources, and incorporates IoT, smart devices and predictive analytics.
Our ACH Group case study shows how a clever data platform architecture and design facilitates transformation into a data-centric organisation, both in response to comprehensive regulatory changes and to leverage the opportunities presented by technology to create a better experience for customers and staff.

See the case study here: Exposé case study – ACH Group

See more about advanced analytics

Breaking: Microsoft announced a brand new Azure Machine Learning service, available in the Australian region

Exciting news hot off the press – Microsoft just announced new features in the Azure Machine Learning offering, available in preview in the following regions: East US, West Central US, and Australia East

One of the new offerings is Azure Machine Learning Services, which provides additional functionality and regions over the existing, user-friendly Azure Machine Learning Studio offering.

Azure Machine Learning Services includes:

  • ML on a bigger scale
  • AI-powered data wrangling
  • Spark
  • Docker
  • Cognitive Toolkit
  • TensorFlow
  • Caffe
  • Etc.
  • And relevant to Australian customers with data sovereignty concerns, the ability to provision this service in Australia East.

The differentiation seems to be that this new offering targets professional data scientists and allows for an end-to-end data science solution, whereas Azure Machine Learning Studio will still be used by data analytics professionals who are more casual data scientists.

In Azure Machine Learning Services, model development and training occur in Machine Learning Experimentation (i.e. the Azure Machine Learning Workbench application). When this is provisioned in Azure, it invokes a local Workbench installer.

Once installed, the user can access the Workbench app, which is where ML training and scoring projects are managed. The screenshot below clearly shows it is very different from its Azure Machine Learning Studio predecessor.

Once you log in to the Workbench you will be in the Workbench Dashboard, which is where projects are created and managed. It also contains templates that can be used by new users as starting points to learn from.

There is a good high-level overview here.

Transmissions Dashboard – our Energy Infrastructure Provider case study

The Transmissions Dashboard solution we designed and developed for a national Energy Infrastructure Provider involved both a data platform and analytical dashboards.

It improved potential reporting turnaround by over 1000% (more than a tenfold reduction in turnaround time), allowing staff to focus on business-critical tasks.

It embedded a trusted source of truth, along with quality and consistency of analysis.

It improved analytical agility and timely decision making, in and out of the office.

Please see our case study here: exposé case study – Energy Infrastructure Provider – Transmission Dashboard

Better Client Care with IoT – Our Eldercare Case Study

Utilising internet-connected devices and real-time Advanced Analytics, we created a solution that provides better client care and a reduction in cost, time and potential health risks.

The solution helps the organisation overcome client wellbeing challenges: medication and food must be kept at constant temperatures. It uses the Azure IoT Suite and Cortana Intelligence Suite of technologies to create a proactive monitoring solution that ensures the maximum wellbeing of aged care clients.

See the video case study here:

See the full case study here: exposé case study – Eldercare

We test drive Azure Time Series Insights

azure time series

In this video, we take Azure Time Series Insights for a test drive. It is the new “fully managed analytics, storage, and visualization service that makes it simple to explore and analyse billions of IoT events simultaneously”.

We also importantly look at the differences between Power BI’s real-time visualisation capabilities and Azure Time Series Insights.

See more about Azure

Augmented Reality meets Advanced Analytics – it changes the way we plan

Augmented_reality_1

Take your historical data, put it on steroids with the help of machine learning, then overlay it with augmented reality. All of a sudden you can see what impact changes in the physical space will have on your data – you can almost “experience” what the changes will look like.

Combining AR/VR with Advanced Analytics is especially relevant wherever planning in the physical space applies – from city planning through to event planning and everything in between. It’s about integrating data sources to bring contextually relevant information into your maps.

The experience is accessible via HoloLens as well as through your mobile or tablet device.

In this demo, we show a segment of the Adelaide city map. It shows how foot traffic is affected by factors such as weather, time of day, the day of the week, etc. It also peers into the future by tying the solution into machine learning to predict the likely effect on foot traffic, including the effect of substantial infrastructure changes such as replacing a building with a park.

Please see a more comprehensive brochure here – exposé and Cortex Interactive Virtual Planner Solution

See more about Advanced Analytics

info@exposedata.com.au

We test drive Azure Analysis Services – the good, and the not so good.

Microsoft’s announcement of Azure Analysis Services, which delivers SSAS Tabular solutions as a PaaS offering, has come at just the right time. See the related article here – https://exposedata.wordpress.com/2016/10/30/old-faithful-is-renewed-welcome-to-cubes-as-a-service/

In the modern world of Advanced Analytics, semantic models rather than data warehouses will increasingly play the core role in corporate data landscapes: they are where data is blended into models that business users can understand, they provide single sources of truth, they can increasingly embrace data from anywhere, they serve as centralised repositories for business rules, and they are highly responsive to business change.

In addition, as business adoption of cloud platforms increases and organisations move away from server-based/Infrastructure as a Service (IaaS) solutions towards Platform as a Service (PaaS) solutions, the need for PaaS-based semantic models grows.

But as with most new services and technologies, it can take a while for them to settle down, so we gave it a few months and then decided to give Azure Analysis Services a proper test drive. In this article, we share our experiences in setting up an Azure Analysis Services solution: we run through its set-up and automation, provide hints worth sharing, and discuss the advantages of adopting Azure Analysis Services over its virtual machine/IaaS-based siblings.

1 Benefits

Many organisations are moving away from on-premises, server-based solutions for their analytical workloads; some are even trying to avoid IaaS-based solutions altogether. This is mostly due to the cost advantages and flexibility of PaaS over IaaS, and the sheer convenience of support, maintenance and availability.

But some services in the Microsoft Business Intelligence stack, notably SQL Server Reporting Services (SSRS) for paginated reports, SQL Server Integration Services (SSIS) for batch ETL workloads, Master Data Services (MDS) for master and master reference data management, and, until recently, SQL Server Analysis Services (SSAS) for semantic model databases, could only be deployed as virtual machine/IaaS or server-based solutions.

HINT: Some of the other IaaS-based services are on a current PaaS roadmap, so we should hopefully soon see PaaS versions of previously IaaS-only services. Bear this in mind when planning your environments.

Azure Analysis Services is available in the Australian regions (Australia East and Australia Southeast). For those of us who often have to deal with questions around data sovereignty, this means that Azure Analysis Services and its underlying storage (Azure Storage Accounts) are both local, so it presents no data sovereignty issues at all. By the way, most major regions worldwide are now supported.

It ties in nicely with DevOps processes. It is, for example, much more responsive to changing business needs, such as unforeseen usage trends, without the need to manage VMs.

There are obvious cost benefits. The PaaS model for Azure Analysis Services means that the solution can be scaled up or down to meet varying workload requirements, and even paused when not used. This all means a much more flexible PAYG cost model.

No new development technologies or skills are required. Solutions are still created using Visual Studio (SSDT templates), still managed and monitored through SQL Server Management Studio (SSMS), and still deployed in much the same way as before. There are, however, some deployment gotchas and automation pitfalls, but these can easily be overcome; we discuss them later in this article.

Power BI connects to Azure Analysis Services far more simply, as there is no Enterprise Gateway to worry about. This makes for an even simpler architecture.

Below are our test drive results for Azure Analysis Services. We assessed it through the following criteria:

  • How easy is it to set up the service?
  • What is the development experience like?
  • What does deployment look like?
  • How easy is it to actually access and use it?
  • Are there any operationalisation pitfalls, and how can they be overcome?

Say “hello Azure Analysis Services!”

2 Set up

Set up was really easy – you select a name, resource group, location, administrator (a named user from your AAD tenancy), and a pricing tier.

1 setup

HINT: The pricing tier comes in Developer, Basic and Standard options. Our suggestion is to opt for the Developer tier for evaluation, dev and test purposes, and then advance to either Basic or Standard for production, depending on whether perspectives, multiple partitions or DirectQuery mode were used during the development cycle.

1 setup 2

https://azure.microsoft.com/en-au/pricing/details/analysis-services/

The most basic Developer tier (D0) equates to approximately $126 (AUD) per month if run 24 x 7 – roughly 17 cents per hour – based on standard Microsoft pricing as at the date of this article.

Administrators can pause and resume the server as required. There are no charges whilst the server is paused. The developer cost quoted is, therefore, worst case scenario for this tier.

In addition, Azure Analysis Services uses very cost-effective Azure Storage Accounts as its primary storage mechanism.

HINT: For performance reasons, we suggest selecting the same location for the storage account as was selected for Azure Analysis Services. We selected Australia Southeast.

3 Development

Azure Analysis Services uses the same development regime as SSAS Tabular; only the deployment is different. So there should be no issues as long as you know your way around SSDT.

2 Dev

Developers can use SQL Server Data Tools in Visual Studio for creating models and deploying them to the service. Administrators can manage the models using SQL Server Management Studio and investigate issues using SQL Server Profiler.

HINT: Make sure you have the latest versions of SSMS and SSDT, otherwise you may run into trouble:

https://docs.microsoft.com/en-au/sql/ssms/download-sql-server-management-studio-ssms

https://docs.microsoft.com/en-au/sql/ssdt/download-sql-server-data-tools-ssdt

4 Deploy

Deployment can be tricky, but nothing so severe that it cannot easily be overcome. In theory it is very straightforward, but we suspect most people will run into compatibility-level problems – we fell into that trap ourselves. It’s safe to say many people will still be using SQL Server 2012 or 2014, which requires some upgrades before you can deploy to Azure Analysis Services.

3 Deploy

4.1 Ensure a consistent compatibility level

HINT: Deployment to Azure Analysis Services will only work at a compatibility level of 1200 or higher. Therefore (as in our case) upgrade the local version of SQL Server (or whatever is used as the workspace server) from 2014 to 2016.

HINT: Ensure Visual Studio also uses the correct compatibility level.

When we initially created the Visual Studio solution, we would have selected a compatibility level and quite likely ticked “Do not show this message again”.

3 Deploy 2

This means that all of our projects will have defaulted to the compatibility level we originally specified (1103). To change this, we had to log into the Tabular database in SSDT, then select Tools > Options. We then had to set the compatibility level as per below.

3 Deploy 3

HINT: Keep in mind, upgrading the compatibility level is irreversible.

4.2 Actual deployment

  • Right-click the project
  • Select Properties
  • Alter your server (the server name format is noted below), the database and the model name
  • Click Apply
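The server name is shown on the Azure Analysis Services overview blade in the Azure portal and typically follows the pattern asazure://<region>.asazure.windows.net/<servername>, where the region and server name are placeholders for your own values.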

3 Deploy 4

Back in your project

  • Right-click the project
  • Select Deploy
  • Log in to your subscription
  • The Deployment should now start and hopefully complete without errors

5 Using the Azure Analysis Services Model

You may want to connect to your new Azure Analysis Services cube via SQL Server Management Studio (SSMS) or via a reporting tool such as Power BI or Excel.

All of these mechanisms proved seamless and simple, as long as you are familiar with accessing IaaS or on-premises SSAS Tabular databases. The fact that there are no Enterprise Gateway complications also makes the PaaS option for Azure Analysis Services very compelling.

4 Use

5.1 SSMS

We opted for Active Directory Password Authentication.

4 Use 2

Browsing the Tabular database was exactly the same as with the IaaS or on-premises equivalent. We found performance to be pretty good too, even at the D0 tier.

4 Use 3

5.2 Power BI

The connection to Azure Analysis Services is found under the Azure section of the Get Data functionality.

4 Use 4

We opted for a live connection, i.e. allowing the data to remain in Azure Analysis Services rather than importing it into Power BI.

4 Use 5

A simple login using our organisational account.

And we’re in!

4 Use 6

5.3 Other

Other 3rd party BI tools can also connect to Azure Analysis Services. This article, for example, discusses connections from Tableau.

https://azure.microsoft.com/en-au/blog/connect-tableau-to-an-azure-analysis-services-server/

6 Operationalise, Processing and other management tasks

You can, of course, process your Tabular database manually through SSMS, but in this section we automate that processing on a schedule. This could still be achieved via ETL, but as we are highlighting a PaaS solution here, we limit the discussion to PaaS only (SSIS was, at the date of authoring this article, still an IaaS-only service).

This automation proved to be the most difficult of all the criteria assessed. It is relatively simple once it’s up and running, but it took quite a bit of effort to get to that point.

Our lower rating here is therefore not so much a judgement on Azure Analysis Services as on Azure’s overall automation regime, which is still very code-heavy and, in our opinion, not as nicely integrated and mature as the integration between some other services – for example, between Azure Event Hubs and Azure Stream Analytics.

5 Oper

In the sections below, we step you through how to automate the processing of Azure Analysis Services.

6.1 Azure Functions

In order to automate Azure Analysis Services processing, it’s important to understand Azure Functions. Azure Functions lets us run small pieces of code without having to worry about an application or the infrastructure needed to run it, and it offers a choice of development languages (C#, F#, Node.js, Python and PHP). Some of its key features are listed below.

HINT: Azure Functions comes in two pricing plans:

“Consumption plan – When your function runs, Azure provides all the necessary computational resources. You don’t have to worry about resource management, and you only pay for the time that your code runs.

App Service plan – Run your functions just like your web, mobile, and API apps. When you are already using App Service for your other applications, you can run your functions on the same plan at no additional cost.”

https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview
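Before wiring anything up to Analysis Services, it helps to see the basic shape of a timer-triggered C# script function. Below is a minimal sketch of a run.csx for the Functions (v1) runtime that was current at the time of writing; the log message is illustrative only.

// run.csx – a minimal timer-triggered C# script function.
// TimerInfo and TraceWriter are supplied by the Functions runtime;
// the schedule itself lives in function.json rather than in the code.
using System;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"Timer trigger fired at: {DateTime.Now}");
}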

6.2 Automating the scheduled processing of Azure Analysis Services using Azure Functions

In this section, we walk you through how to use an Azure Function to trigger processing of the Azure Analysis Services database or its tables.

6.2.1 Create the Azure Function

5 Oper 2

After creating a new Function App, open the app and choose a “Timer” trigger with the “CSharp” language to create a new function:

5 Oper 3

6.2.2 Set up the Timer

Click on Integrate and set the timestamp parameter name and the schedule. In this example, we set the schedule to trigger the function every 8 hours (as an NCRONTAB expression, 0 0 */8 * * *).

5 Oper 4

6.2.3 Function App Configuration Setting

To work with Azure Analysis Services from the function, upload the following assemblies to the function app. These files can be found on your local computer:

HINT: Make sure you have the latest data providers installed on your computers.

To get more info and to download the latest data providers (if not available on your local computer), please see https://docs.microsoft.com/en-us/azure/analysis-services/analysis-services-data-providers

C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.Core\Microsoft.AnalysisServices.Core.DLL

C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.Tabular\Microsoft.AnalysisServices.Tabular.DLL

The triggered function needs to use the above files. To load them into the app, click on the Function App > Platform features > Advanced tools (Kudu).

5 Oper 5

In the new window, select Debug console > CMD, navigate to the “bin” folder and upload the files into it.

5 Oper 6
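Once the two assemblies are sitting in the function’s bin folder, the C# script can reference them with #r directives at the top of run.csx – a sketch of just those lines:

// At the top of run.csx: reference the Tabular Object Model assemblies
// that were uploaded to the function's bin folder.
#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"

using Microsoft.AnalysisServices.Tabular;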

6.2.4 Set up the Connection Strings

To set the connection string, select Platform features > Application settings:

5 Oper 7

Fill in the connection string’s “name”, “value” and type (for example “SQL Server”). The “name” is what will be referenced in your C# code, and the server address used in the “value” can be copied from the Azure Analysis Services server overview page:

5 Oper 8

5 Oper 9
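For reference, a typical connection string value follows the pattern below. The region, server, database, user and password are placeholders, and the exact set of properties may differ in your environment.

Data Source=asazure://australiasoutheast.asazure.windows.net/<yourserver>;Initial Catalog=<yourdatabase>;User ID=<user@yourtenant.com>;Password=<password>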

6.2.5 Add the code

HINT: Remember to change the Run function input, connection string name, and the database and table names to match your model.

5 Oper 10

https://azure.microsoft.com/en-au/blog/automating-azure-analysis-services-processing-with-azure-functions/
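The code in the screenshot is not reproduced here, but the sketch below, closely modelled on the Microsoft blog post linked above, shows the general shape of the timer function: it connects to the server via the Tabular Object Model and requests a full refresh. “AzureASConnectionString”, “YourTabularDatabase” and “YourTable” are placeholders that must match your own environment.

#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"

using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;

// run.csx – a timer-triggered C# script that processes an Azure Analysis Services model.
public static void Run(TimerInfo myTimer, TraceWriter log)
{
    var connectionString =
        ConfigurationManager.ConnectionStrings["AzureASConnectionString"].ConnectionString;

    var server = new Server();
    server.Connect(connectionString);

    var database = server.Databases["YourTabularDatabase"];

    // Refresh the whole model, or target a single table instead.
    database.Model.RequestRefresh(RefreshType.Full);
    // database.Model.Tables["YourTable"].RequestRefresh(RefreshType.Full);

    database.Model.SaveChanges();   // this is what actually executes the refresh on the server
    server.Disconnect();

    log.Info($"Processing of {database.Name} completed at {DateTime.Now}");
}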

Click Save and Run. You should see the following logs:

5 Oper 11

Voilà!


Contributors: Etienne Oosthuysen, Shaun Poursoltan