Get more from your Retail data with Predictive Analytics

This case study showcases our solution that uses retail data and predictive analytics to allow Sales and Marketing to match customers to the products they are most likely to buy. This case study is just as relevant today as it was just shy of one year ago when we created it. Combining this solution with Cognitive Intelligence such as facial recognition (as shown in the article here) provides even more opportunity in the retail sector.

Better Client Care with IoT – Our Eldercare Case Study

Utilising Internet-connected devices and real-time Advanced Analytics, we created a solution that provides better client care and reduces cost, time, and potential health risks.

The solution helps the organisation overcome client wellbeing challenges: medication and food must be kept at constant temperatures. This clever solution uses the Azure IoT Suite and Cortana Intelligence Suite of technologies to create a proactive monitoring solution that helps ensure the wellbeing of aged care clients.

See the video case study here:

See the full case study here: exposé case study – Eldercare

We test drive Azure Time Series Insights


In this video, we take Azure Time Series Insights for a test drive. It is the new “fully managed analytics, storage, and visualization service that makes it simple to explore and analyse billions of IoT events simultaneously”.

Importantly, we also look at the differences between Power BI’s real-time visualisation capabilities and Azure Time Series Insights.

See more about Azure

Augmented Reality meets Advanced Analytics – it changes the way we plan


Take your historical data, put it on steroids with the help of Machine Learning, then overlay it with Augmented Reality. All of a sudden you can see what impact changes in the physical space will have on your data. You can almost “experience” what the changes will look like.

Combining AR/VR with Advanced Analytics is especially relevant wherever planning in the physical space applies – from city planning through to event planning, and everything in between. It’s about integrating data sources to bring contextually relevant information into your maps.

The experience is accessible via HoloLens as well as through your mobile or tablet device.

In this demo, we show a segment of the Adelaide city map. It shows how foot traffic is affected by factors such as weather, time of day, the day of the week, etc. It also peers into the future by tying the solution into Machine Learning to predict the likely effect on foot traffic, including after substantial infrastructure changes such as replacing a building with a park.

Please see a more comprehensive brochure here – exposé and Cortex Interactive Virtual Planner Solution

See more about Advanced Analytics

info@exposedata.com.au

We test drive Azure Analysis Services – the good, and the not so good.

Microsoft’s announcement of the release of Azure Analysis Services, which brings SSAS Tabular solutions to PaaS, has come at just the right time. See the related article here – https://exposedata.wordpress.com/2016/10/30/old-faithful-is-renewed-welcome-to-cubes-as-a-service/

In the modern world of Advanced Analytics, Semantic Models, rather than data warehouses, will increasingly play the core role in corporate data landscapes: they are places where data is blended into models that business users can understand; they provide a single source of truth; they can increasingly embrace data from anywhere; they serve as centralised repositories for business rules; and they are highly responsive to business change.

In addition, as business adoption of cloud platforms increases and organisations move away from server-based/Infrastructure as a Service (IaaS) solutions towards Platform as a Service (PaaS) solutions, the need for PaaS-based Semantic Models grows.

But as with most new services and technologies, they can take a while to settle down, so we gave Azure Analysis Services a few months before deciding to give it a proper test drive. In this article, we share our experiences in setting up an Azure Analysis Services solution – we run through its set-up and automation, we provide hints worth sharing, and we discuss the advantages of adopting Azure Analysis Services over its Virtual Machine/IaaS-based siblings.

1         Benefits

Many organisations are moving away from on-premises server-based solutions, and some are even trying to avoid IaaS-based solutions for their Analytical Workloads. This is mostly due to the cost advantages and flexibility of PaaS over IaaS, and the sheer convenience of support, maintenance, and availability.

But some services in the Microsoft Business Intelligence stack, notably SQL Server Reporting Services (SSRS) for paginated reports, SQL Server Integration Services (SSIS) for batch ETL workloads, Master Data Services (MDS) for Master and Master Reference Data management and, until recently, SQL Server Analysis Services (SSAS) for semantic model databases, could only be deployed as Virtual Machine/IaaS or server-based solutions.

HINT: Please also note that some of the other IaaS-based services are on an imminent PaaS roadmap, so we should hopefully soon see PaaS versions of previously IaaS-only services. Bear this in mind when planning your environments.

Azure Analysis Services is available in the Australian regions (Australia East and Australia Southeast). For those of us who often have to deal with questions around data sovereignty, both Azure Analysis Services and its underlying storage (Azure Storage Accounts) are local, so no data sovereignty issues arise at all. By the way – most major regions worldwide are now supported.

It ties in nicely with DevOps processes. It is, for example, much more responsive to changing business needs, such as unforeseen usage trends, without the need to manage VMs.

There are obvious cost benefits. The PaaS model for Azure Analysis Services means that the solution can be scaled up or down to meet varying workload requirements, and even paused when not in use. This all means a much more flexible PAYG cost model.

There are no new dev technologies or skills required. The solutions are still created using Visual Studio (SSDT templates), still managed and monitored through SQL Server Management Studio (SSMS), and still deployed in much the same way as before. There are, however, some deployment gotchas and automation pitfalls, but these can be easily overcome. We discuss these later in this article.

Power BI connects to Azure Analysis Services much more simply, as there is no Enterprise Gateway to worry about. This makes for an even simpler architecture.

Below are our test drive results for Azure Analysis Services. We assessed it against the following criteria:

·       How easy is it to set up the service?

·       What about the development experience?

·       What does deployment look like?

·       How easy is it to actually access and use it?

·       Are there any operationalisation pitfalls? We step through how to overcome them.

Say “hello Azure Analysis Services!”

2       Set up

Setup was really easy – you select a name, resource group, location, administrator (a named user from your AAD tenancy), and a pricing tier.


HINT: The pricing tier comes in Developer, Basic and Standard options. Our suggestion is to opt for the Developer tier for evaluation, dev and test purposes, and then to advance to either Basic or Standard for production, depending on whether Perspectives, Multiple Partitions, or DirectQuery mode were used during the development cycle.


https://azure.microsoft.com/en-au/pricing/details/analysis-services/

The most basic Developer tier (D0) equates to approximately $126 (AUD) per month if run 24x7, based on standard Microsoft pricing as at the date of this article.

Administrators can pause and resume the server as required, and there are no charges whilst the server is paused. The Developer cost quoted above is therefore the worst-case scenario for this tier.

In addition, Azure Analysis Services uses very cost-effective Azure Storage Accounts as its primary storage mechanism.

HINT: We suggest selecting the same location for the storage account as was selected for Azure Analysis Services, for performance reasons. We selected Australia Southeast.

3       Development

Azure Analysis Services uses the same development regime as SSAS Tabular; it is only the deployment that is different. So, there should be no issues as long as you know your way around SSDT.


Developers can use SQL Server Data Tools in Visual Studio for creating models and deploying them to the service. Administrators can manage the models using SQL Server Management Studio and investigate issues using SQL Server Profiler.

HINT: Make sure you have the latest versions of SSMS and SSDT, otherwise you may run into trouble:

https://docs.microsoft.com/en-au/sql/ssms/download-sql-server-management-studio-ssms

https://docs.microsoft.com/en-au/sql/ssdt/download-sql-server-data-tools-ssdt

4       Deploy

Deployment can be tricky, but nothing that cannot easily be overcome. Deployment is in theory very straightforward, but we suspect most people may run into compatibility level problems – we fell into that trap. We think it’s safe to say most people will still be using SQL Server 2012 or 2014, which requires some upgrades before you can deploy to Azure Analysis Services.


4.1      Ensure a consistent compatibility level

HINT: Deployment to Azure Analysis Services will only work at a compatibility level of 1200 or higher. Therefore (as in my case) upgrade the local version of SQL Server (or whatever is used as the workspace server) from 2014 to 2016.

HINT: Ensure Visual Studio also uses the correct compatibility level.

When we initially created the Visual Studio solution, we would have selected a compatibility mode and quite likely “Do not show this message again”.


This means that all of our projects will have defaulted to the compatibility level we originally specified (1103). To change this, we had to log into the Tabular database in SSDT, then select Tools > Options. We then had to set the compatibility level as per below.

[Screenshot: setting the default compatibility level under Tools > Options in SSDT]

HINT: Keep in mind that upgrading the compatibility level is irreversible.
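A quick way to verify the level (our suggestion, not an official step): from compatibility level 1200 onwards, the Model.bim file is JSON rather than XML, so the level can be read near the top of the file. A minimal sketch, with an illustrative model name:

```json
{
  "name": "MyTabularModel",
  "compatibilityLevel": 1200,
  "model": {
    "culture": "en-AU"
  }
}
```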

4.2      Actual deployment

  • Right-click the project
  • Select Properties
  • Alter your server, database, and model names
  • Click Apply


Back in your project:

  • Right-click the project
  • Select Deploy
  • Log in to your subscription
  • The deployment should now start and hopefully complete without errors

5       Using the Azure Analysis Services Model

You may want to connect to your new Azure Analysis Services cube via SQL Server Management Studio (SSMS) or via a reporting tool such as Power BI or Excel.

All of these mechanisms proved seamless and simple, provided you are familiar with accessing IaaS or on-premises SSAS Tabular databases. The fact that no Enterprise Gateway complications exist also makes using PaaS for Azure Analysis Services a very compelling option.
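In each case, you connect using the server’s full name, which is shown on the Azure Analysis Services overview blade in the Azure portal and takes the form below (the region and server name here are illustrative):

```
asazure://australiasoutheast.asazure.windows.net/myaasserver
```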


5.1      SSMS

We opted for Active Directory Password Authentication.


Browsing the Tabular database was exactly the same as with the IaaS or on-premises equivalent. We found performance to be pretty good too, even at the D0 tier.


5.2         Power BI

The connection to Azure Analysis Services is found under the Azure section of the Get Data functionality.


We opted for a Live Connection, i.e. allowing the data to remain in Azure Analysis Services rather than bringing it into Power BI.


A simple login using my organisational account…

And we’re in!


5.3         Other

Other third-party BI tools can also connect to Azure Analysis Services. The article below, for example, discusses connections from Tableau.

https://azure.microsoft.com/en-au/blog/connect-tableau-to-an-azure-analysis-services-server/

6       Operationalisation, processing, and other management tasks

You can, of course, process your Tabular database manually through SSMS, but in this section we automate that process on a schedule. This can still be achieved via ETL, but as we are highlighting a PaaS solution here, we limit this discussion to PaaS only (SSIS, at the date of authoring this article, is still an IaaS-only service).

This automation proved to be the most difficult of all the criteria assessed. It is relatively simple once it’s up and running, but it took quite a bit of effort to get to that point.

Our lower rating here is therefore not so much a judgement on Azure Analysis Services as on Azure’s overall automation regime, which is still very code-heavy and, in our opinion, not as nicely integrated and mature as, for example, the integration between Azure Event Hubs and Azure Stream Analytics.


In the sections below, we step you through how to automate the processing of Azure Analysis Services.

6.1      Azure Functions

In order to automate Azure Analysis Services, it’s important to understand Azure Functions. Azure Functions enables us to run small pieces of code without having to worry about the application or infrastructure needed to run it. It also gives us the choice of various development languages (C#, F#, Node.js, Python, and PHP). One key consideration is the pricing plan:

HINT: Azure Functions comes in two pricing plans:

“Consumption plan – When your function runs, Azure provides all the necessary computational resources. You don’t have to worry about resource management, and you only pay for the time that your code runs.

App Service plan – Run your functions just like your web, mobile, and API apps. When you are already using App Service for your other applications, you can run your functions on the same plan at no additional cost.”

https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview

6.2      Automating the scheduled processing of Azure Analysis Service using Azure Functions

In this section, we walk you through how to use Azure Functions to trigger the processing of an Azure Analysis Services database or its tables.

6.2.1        Create the Azure Function


After creating the new Function App, open it and choose “Timer” with the “CSharp” language to create a new function.


6.2.2       Set up the Timer

Click on Integrate and select the Timestamp parameter name and schedule. In this example, we set the schedule to trigger the function every 8 hours.

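Behind the portal UI, the schedule is stored as a six-field CRON expression in the function’s function.json binding. A minimal sketch of an every-8-hours timer binding is shown below; the parameter name is the portal default, and this file is our illustration rather than one shown in the original screenshots:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 */8 * * *"
    }
  ],
  "disabled": false
}
```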

6.2.3        Function App Configuration Setting

To process the Azure Analysis Services model, upload the following files to the existing function. These files can be found on your local computer.

HINT: Make sure you have the latest data providers installed on your computers.

To get more info and to download the latest data providers (if they are not available on your local computer), please see https://docs.microsoft.com/en-us/azure/analysis-services/analysis-services-data-providers

C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.Core\Microsoft.AnalysisServices.Core.DLL

C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.AnalysisServices.Tabular\Microsoft.AnalysisServices.Tabular.DLL

The triggering function needs to use the above files. To load them into the app, click on the Function App > Platform features > Advanced tools (Kudu).


In the new window, select Debug Console > CMD. Navigate to the “bin” folder and upload the files into it.


6.2.4        Set up the Connection Strings

To set the connection string, select Platform features > Application settings.


Under Connection strings, fill in the “name” and “value”, and select “SQL Server” as the type. The “name” will be used in your C# code; the “value” can be taken from the SSAS server overview page.


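For illustration, a completed entry might look like the following sketch. The setting name is arbitrary (your C# code looks it up), the server address comes from the overview page, and the credentials shown are just one possible authentication option; exact connection string properties may vary for your environment:

```
Name:  AzureASConnString
Type:  SQL Server
Value: Data Source=asazure://australiasoutheast.asazure.windows.net/myaasserver;User ID=serviceaccount@mytenant.onmicrosoft.com;Password=<password>;Initial Catalog=MyTabularModel
```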

6.2.5        Add the code

HINT: Remember to change the Run function input, Connection string, database and table names based on your model.


https://azure.microsoft.com/en-au/blog/automating-azure-analysis-services-processing-with-azure-functions/
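The code itself follows the pattern from the Microsoft article linked above. Below is a minimal sketch of the run.csx script; the connection string setting name (AzureASConnString, from the previous section), the database name, and the table name are all illustrative and must be changed to match your model and settings, as per the hint above:

```csharp
#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"

using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"Processing started at: {DateTime.Now}");
    try
    {
        // Connection string configured under Application settings (name is illustrative)
        var connStr = ConfigurationManager.ConnectionStrings["AzureASConnString"].ConnectionString;

        var server = new Microsoft.AnalysisServices.Tabular.Server();
        server.Connect(connStr);

        // Queue a full refresh of the whole model ("MyTabularModel" is illustrative)
        Database db = server.Databases["MyTabularModel"];
        db.Model.RequestRefresh(RefreshType.Full);
        // Or refresh a single table instead:
        // db.Model.Tables["MyTable"].RequestRefresh(RefreshType.Full);

        // SaveChanges executes the queued refresh operations against the server
        db.Model.SaveChanges();
        server.Disconnect();
    }
    catch (Exception e)
    {
        log.Info($"Processing failed: {e.ToString()}");
    }
    log.Info($"Processing finished at: {DateTime.Now}");
}
```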

Click Save and Run. You should see the following logs:

[Screenshot: function execution logs]

Voilà!


Contributors: Etienne Oosthuysen, Shaun Poursoltan

Important Power BI news release


Starting June 1, 2017, Microsoft is making some changes to the way Power BI is licensed, and there are also some important changes to the Power BI Service. So if you use Power BI, or intend to use it, please be aware of these changes.

Exposé has been at the forefront of the Power BI revolution, and we view these changes as further positive steps towards a cost-effective, scalable and maturing BI and Analytics platform. We have found that organisations really benefit from some guidance on the administration side of Power BI. If you’d like further advice on these changes, or assistance with this transition and how it affects you and your organisation, please don’t hesitate to get in touch.

Here are the changes:

Power BI free tier

Microsoft is now giving all free tier users the following capabilities:

  • the ability to connect to all of the data sources that Pro users can connect to
  • a storage quota increase from 1GB to 10GB
  • an increase in the maximum data refresh rate from once daily to once hourly
  • an increase in streaming data rates from ten thousand rows per hour to one million rows per hour

But in doing this they will be removing the following capabilities:

  • sharing reports and dashboards with other users
  • using group workspaces (now to be called app workspaces)
  • export to PowerPoint, CSV, Excel
  • analyze in Excel

This makes the free tier truly for personal use only, as all private sharing capabilities are no longer available within the Power BI free license.

To help ease the transition to the new licensing model, Microsoft is allowing people who had a license for the Power BI service on or before May 2, 2017, and who signed in at least once between May 2, 2016 and May 2, 2017, to apply for an extended trial of a Power BI Pro license. This license will enable the use of all Power BI Pro features until May 31, 2018. If you meet these requirements, you will be sent an email from Microsoft and will also see a notification when you log in to the service.

If you require organisational use of Power BI, you will now need either to license all users for Power BI Pro or to adopt the new tier, Power BI Premium.

Power BI Premium

Power BI Premium is a new capacity-based licensing model coming late in the second quarter of 2017. It allows organizations to acquire only Power BI Pro licenses for report creators and the rest of the organization to consume these reports and dashboards without having to purchase a Pro license.

The charging model is based on a Premium node within the Azure environment that can be scaled according to an organisation’s performance requirements. Microsoft has provided a calculator service here to help estimate costs.

Power BI Report Server

Coming late in the second quarter of 2017, Microsoft will be offering the capability to publish Power BI reports on-premise using Power BI Report Server.

The on-premises server will allow the deployment and distribution of interactive Power BI reports and traditional paginated reports within the boundaries of an organization’s firewall.

To enable the use of Power BI Report Server, you will need to either be licensed under Power BI Premium or have a per-core license of SQL Server Enterprise Edition with Software Assurance.

Power BI Apps

Power BI content packs are changing to become known as Power BI apps.

At the moment, there won’t be a large difference between apps and content packs, mostly a change in interface and publishing process. But Microsoft has a roadmap for improvement under the new app model.

They are planning the following enhancements to app workspaces in the coming months:

  • Creating app workspaces will not create corresponding entities in O365 like group workspaces do. So you can create any number of app workspaces without worrying about different O365 groups being created behind the scenes (you can still use an O365 group’s OneDrive for Business to store your files).
  • Today you can add only individuals to the members and admin lists. In the next iteration, you will be able to add multiple AD security groups or modern groups to these lists to allow for easier management.

The impact, for now, is that Microsoft will rename all group workspaces to app workspaces and you can publish an app from any of these workspaces.

Power BI Embedded

Microsoft has also announced the convergence of the Power BI Embedded service with the Power BI service. This means there will be one Power BI API with feature parity with the current Power BI Embedded service, so any existing apps built using Embedded today should continue to function, but you will need to prepare for migration to the new service.

Power BI Service Interface

Finally, for those who may not have been aware, Microsoft has been trialling a new interface for the Power BI service over the past few months. As of May, this interface will become the default. We’d recommend taking some time to understand the new interface, as there are some large changes to the workflow you may be used to.

Can you predict if a student will drop out? Yes!


This is our case study on a Higher Education Institution and the 360-degree student attrition solution we designed and developed: an intelligent way to understand attrition in the past, and to use that understanding to predict attrition in the future.

It allows the institution to foster relationships with at-risk students before the event, ultimately decreasing attrition and avoiding revenue leakage.

 

See more about student attrition here

From operational challenges to a modern, automated and simplified organisation – Our Business SA case study

An Exposé case study about our advanced analytics solution for Business SA, the ‘voice of business in South Australia’. The solution was an important component of a large digital transformation program that saw Business SA transition to a modern, automated and simplified organisation, underpinned by the following technology changes:

• A cloud-first strategy that reduced Business SA’s dependence on resources that provided no market differentiation
• A simplified technology landscape, with a few core systems performing specific functions
• A modular architecture that is better able to accommodate change
• A digital strategy to support automated, self-service and 24/7 service delivery
• Improved data quality through simpler and more intuitive means of data entry and validation
• The latest desktop productivity tools, providing instant mobility capabilities

See the full case study here: exposé case study – Business SA