The Purple Peeps’ Seven Favourites from Ignite 2021

What a wonderful couple of days at Ignite! It’s fascinating to hear all these exciting announcements and top tips from experts globally. One of the best things about being a Microsoft Partner is the access to technologies, products and the tech community, and together these empower us to drive impact and accelerate our customers’ data journey.

As we come to the end of this year’s Ignite, of all the sessions around data and AI, here are the ones our team liked most.

Etienne’s Best Picks

One of our focus areas over the past year has been Synapse, and more recently Purview. Exposé has in fact recently concluded a major program of work in which Synapse performed a central analytical function: unifying data across multiple sources, contending with billions of records, and making visual insights available to users within seconds.

It is therefore fantastic to see the various improvements and announcements regarding Synapse, and a closer integration with Purview. Here are some highlights:

Azure Synapse Link

Link provides near real-time integration from Dataverse, Cosmos DB and SQL. With Link, operational data can be integrated into Synapse Analytics without needing to build ETL pipelines. Link for Dataverse is now GA, Link for Cosmos DB is in private preview, and Link for SQL Server 2022 is expected soon through the upcoming CTP. The business benefit here is an immediate analytics layer over your operational data. This changes the whole paradigm of what a modern data warehouse is, as operational data now flows through it in near real time.

Azure Synapse Database templates

Database templates are now in preview. These templates are designed around common industry data warehouse models and will accelerate the build of your modern data warehouse. Each template includes base entities that can be used as a blueprint (or a starter) for your data warehouse, along with a code-free designer that allows users to map, transform and load data into these models. The business benefit here is a massively accelerated path towards a data warehouse design that applies to your industry (a starter model that can of course be changed to suit your specific needs), plus a code-free way to load your data into the resulting models.

Azure Synapse Data Explorer

Now forming a part of Synapse, Data Explorer allows you to analyse text heavy log or event data in near real time and contextualise that data with your other business data to assess impact and gain insights. Structured and unstructured data is easily mashed together, and it does this in near real time and at scale. The business benefit lies in the ability to contextualise your event or log data with business data quickly and at scale which adds another vantage point within your modern data warehouse.
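As a toy, local illustration of that idea, the sketch below enriches raw, text-heavy device logs with business reference data (a store lookup). The device IDs, stores and log lines are entirely hypothetical, and a real workload would do this with KQL in Data Explorer at scale rather than in Python:

```python
# Contextualise raw log/event lines with business reference data.
from datetime import datetime

# Text-heavy log events, as they might arrive from an application.
raw_logs = [
    "2021-11-03T10:15:02 ERROR device=D-102 payment gateway timeout",
    "2021-11-03T10:16:40 INFO  device=D-317 heartbeat ok",
]

# Business reference data (e.g. from the warehouse).
devices = {"D-102": {"store": "Adelaide CBD"}, "D-317": {"store": "Glenelg"}}

def contextualise(line):
    """Parse one log line and join it to the business data by device ID."""
    ts, level, device_kv, *message = line.split()
    device_id = device_kv.split("=")[1]
    return {
        "timestamp": datetime.fromisoformat(ts),
        "level": level,
        "device": device_id,
        "store": devices.get(device_id, {}).get("store", "unknown"),
        "message": " ".join(message),
    }

enriched = [contextualise(line) for line in raw_logs]
print(enriched[0]["store"], enriched[0]["level"])  # Adelaide CBD ERROR
```

The enriched records now carry both the operational detail (level, message) and the business context (store), which is the vantage point the paragraph above describes.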

Azure Purview Catalogue Search now in Synapse

Purview, only recently released, already has over 57 billion data assets catalogued. It is going to become an invaluable piece of technology for data discovery and governance. The business benefit: a data worker, such as a data engineer building authoritative data transformations in Synapse who does not yet know what data is available, can now search and discover datasets across the data ecosystem right there within the workspace, which is a huge productivity gain. Please feel free to see our recent video discussion on Purview and modern data governance here.


Jake’s Best Picks

Azure OpenAI Service

Azure OpenAI Service brings OpenAI’s powerful GPT-3 natural language model to the Azure platform. Already used in some of Microsoft’s existing services, and in GitHub’s Copilot tool that helps developers write code, this functionality can now be leveraged by developers in their own applications. The business benefit here is that the OpenAI Service delivers this powerful natural language model with Azure’s enterprise capabilities, scale and security.

Azure Percept

Azure Percept seamlessly builds and manages intelligence at the edge with Azure AI and Machine Learning, via a complete platform of software and hardware components that accelerate IoT development and prototyping. The business benefit here is a complete packaged solution, providing pre-configured IoT hardware devices, pre-trained models, and software for configuring and managing these devices, allowing customers to fast-track AI development at the edge.


Willem’s Best Picks

Attending this year’s MS Ignite left me with two primary takeaways. Firstly, I am excited to see Microsoft’s continuing investment in the Azure Synapse product. It is proving to be a powerhouse on the platform and has been the cornerstone of each of my Azure deliveries to clients within the past year or two. Secondly, when we consult with business users (not just IT departments), we see an increasing desire to get closer to data. It is becoming more common to see ‘citizen developers’ within businesses now. Thus, the Power Platform plays a significant role, as it helps move things along with low-code projects while still working within a framework that satisfies IT security requirements.

Power Platform

This platform has again become more powerful (pun intended) with its low- and no-code approach to common data workloads. Some additions include Process Advisor in Power Automate, the pay-as-you-go pricing model for Power Apps, and the Power BI app for MS Teams. Process Advisor gives an easy low-code starting point for clients to engage with a commonly asked question: analysing a specific process to look for bottlenecks. The business benefit here is that Power Apps is now more approachable for ‘proof of value’ type projects, with lower cost instead of per-user licensing models. Users also now have access to their Power BI environment within MS Teams and can collaborate right there (where they are already working) without having to swap to the browser.

Honorary mention

The Microsoft and Qlik collaboration for data integration is a great example of how one can combine the forces of these two providers to solve business problems. The Qlik Replicate and Azure Synapse demo will be especially valuable to those clients working with SAP environments who are looking at moving towards a data lake or data warehouse in Azure Synapse.

Chatbots – how the Azure bot framework is changing the AI game

What are Chatbots?

Communication underpins intelligence. And language underpins communication. But language is complex and must be understood through the prism of intent and understanding. For example:

Take the term “thong”: in Australian slang this means flip-flops, a meaning lost on someone not familiar with Australian slang, as it means underwear in most other countries.

This is where bots, specifically chatbots come into play. They allow users to interact with computer systems through natural language, and they facilitate the learning and training of, amongst others, language, intention and understanding through machine learning and cognitive APIs.

It is important for the chatbot to be able to leverage trained understanding of human language so that it knows how to respond to the user request, and what to do next. And so, when “John” (who you will meet below) interacts with the computer with the question “do you sell thongs?” the computer understands what it means within the correct context.
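To make this concrete, here is a minimal, rule-based sketch of what the trained language understanding layer does for a bot: mapping an utterance to an intent plus entities, with a context that disambiguates “thongs”. The intent and entity names are illustrative assumptions, not the actual output of any Cognitive Services API:

```python
# A toy stand-in for language understanding: intent + entity + context.
RETAIL_ENTITIES = {"thongs": "footwear", "hats": "headwear"}

def understand(utterance, context="retail"):
    """Map an utterance to a hypothetical intent/entity/sense triple."""
    text = utterance.lower().rstrip("?!. ")
    if text.startswith(("do you sell", "do you stock")):
        product = text.split()[-1]
        # In a retail context, "thongs" resolves to footwear, not underwear.
        sense = RETAIL_ENTITIES.get(product, "unknown")
        return {"intent": "product_enquiry", "entity": product, "sense": sense}
    return {"intent": "none", "entity": None, "sense": None}

result = understand("Do you sell thongs?")
print(result)
```

A real bot would delegate this step to a trained model; the point of the sketch is only that the response depends on both the recognised intent and the context in which the entity is interpreted.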

Sounds cool, but complicated? Things have become much easier

Five years ago, embarking on a project to build an intelligent chatbot would have been an exercise involving an array of natural language processing specialists. It wasn’t something that was affordable for companies outside the Fortune 500.

How times have changed, with the development of natural language processing toolkits and bot-building frameworks such as wit.ai and api.ai. These tools have given web application developers the means to create intelligent yet simple chatbots without needing a natural language processing specialist.

There are different options available to build a chatbot, but in this article, we investigate the Microsoft bot framework and introduce our own EVA (the Exposé Virtual Agent) – a chatbot built within the Microsoft bot framework. But first, let’s have a quick look at why businesses should care (i.e. what are the business benefits)?

Why should businesses care?

It’s mostly about your customer experience!

We have all dealt with customer call centres. The experience can be slow and painful. This is mainly due to the human staff member on the other side of the call having to deal with multiple CRM and other systems to find the appropriate answers and next actions.

Chatbots are different. Because they converse with the customer directly, they are not limited in the same way: they can dig through huge amounts of information to pick out the best “nugget” for a customer, then troubleshoot and find a solution, or even recommend or initiate the next course of action.

Let’s look at how this can be achieved with the Microsoft Bot Framework.

What is the Microsoft bot framework?

The Microsoft bot framework is a platform for building, connecting, testing and deploying intelligent and powerful bots. The framework provides a tool that brings together all the Microsoft bot-related technologies, easily and efficiently. The core foundation of this framework is the Azure Bot Service.

The Azure Bot Service manages the desired interaction points, natural language processing tools and data sources. All interactions go through the bot service before they make use of any natural language or cognitive toolkits, and the service can draw on information from a variety of data sources, for example an Azure SQL Database.

In figure 1, “John” interacts with the Bot Service via a channel (the thing he uses to communicate with the computer in natural language). Many readers will already have used Skype and Slack to interact with other humans; they can now use these channels to interact with computers too.

Bot Interaction
Figure 1

John is essentially asking about Thongs, its availability and ends up with all the information he needs to buy the product. The Bot framework interacts with the broader Cognitive Services APIs (in this example Language Understanding and Knowledge Base) and various external sources of information, whilst Machine Learning continually learns from the conversation.

Let’s look at a local government example:

A council ratepayer interacts with the council’s bot via the council website and asks for information on rubbish collection. At this point, the bot simply refers to a particular knowledge base, plus other sources of information such as the website, an intranet site or a database. The bot’s response is at this stage informative. A response could, for example, be, “Rubbish is collected each week in Parkside on Friday mornings between 5:30am and 9am. General waste must go in the red bin and is collected each week. Recyclables in the yellow bin and garden waste in the green bin are collected on alternating weeks”.

The user realises he has no green bin and so asks the bot where one can be obtained.

The bot now uses the Language Understanding APIs and picks up “where can…be obtained” as the user’s intent, and “Bin” and “Green” as entities (these could just as easily have been “Yellow Bin” or “Rates Bill”, etc.). This invokes an interaction with the council’s asset application to order the required asset, likely along with any associated financials through the billing system.

The question, therefore, leads to a booking, a delivery and a bill, all without having to visit or call the council office, and with no on-hold telephone waits.
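The intent and entity extraction in this council example can be sketched with a simple regular expression standing in for the Language Understanding API. The pattern and the intent name below are illustrative assumptions only:

```python
# A regex stand-in for the "where can ... be obtained" intent described above.
import re

OBTAIN_PATTERN = re.compile(
    r"where can (?:i|one) (?:get|obtain|order) (?:an|a|the)?\s*"
    r"(?P<entity>[\w ]+?)\s*\??$",
    re.IGNORECASE,
)

def extract(utterance):
    """Return a hypothetical intent/entity pair for the utterance."""
    match = OBTAIN_PATTERN.search(utterance)
    if match:
        return {"intent": "ObtainAsset", "entity": match.group("entity").strip()}
    return {"intent": "None", "entity": None}

print(extract("Where can I get a green bin?"))
```

A trained model generalises far beyond one hand-written pattern, of course; the sketch only shows the shape of the output the bot acts on (an intent that triggers the asset order, and an entity that says which asset).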

Who is our own Eva?

Eva
Eva – Exposé Virtual Assistant

It’s just been Christmas time, and Eva joined the festivities 😊

If you browse to the Exposé website, http://exposedata.com.au/, you will meet Eva if you select “Chat with us now”. Eva was initially (Eva version 1) built to act as an intermediary between the website visitor and our knowledge base of questions and answers.  She is a tool that allows you to insert a series of questions and she returns answers. She learns from the questions and the answers using machine learning in order to improve the accuracy of responses. The net result is users spending less time searching for information on our website.

Eva version 2 was meant to solve our main pain point: what happens if the content on the web (or blog) site changes? With Eva version 1, we would have had to re-train Eva to align with new or altered content. So, in version 2, we allowed Eva to dynamically search our WordPress blog site (where most of the content changes occur) so as to better answer user questions with up-to-date information.
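Many WordPress sites expose a REST endpoint at /wp-json/wp/v2/posts that accepts a search parameter. Assuming the blog does too (we make no claim that this is exactly how Eva is implemented), a dynamic search request for a visitor’s question could be built like this:

```python
# Build a WordPress REST API search URL for a visitor's question.
from urllib.parse import urlencode

def wordpress_search_url(site, question, per_page=3):
    """Return the posts-search URL for the given WordPress site."""
    query = urlencode({"search": question, "per_page": per_page})
    return f"{site.rstrip('/')}/wp-json/wp/v2/posts?{query}"

url = wordpress_search_url("http://exposedata.com.au", "predictive analytics")
print(url)
```

The bot would then fetch that URL and shape the matching posts into an answer, which is what lets the responses stay up to date as blog content changes.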

And if the user’s question could not be answered, then we log this to an analytics platform to give us insight as to the questions visitors are asking.

Analytics
Eva – Analytics

In addition, we trained a language model in Microsoft Language Understanding Intelligent Service (LUIS) and built functionality inside of the Azure bot service to utilize functionality from the WordPress Exposé blog.

An example of an interaction with Eva can be seen below. As there are a few blogs that involve videos, Eva will identify the videos and advise the visitor if there is a video on the requested subject.

EvaInteraction

Eva clearly found a video on predictive analytics on the blog site and so she returns a link to it. But she could not find anything on cats (we believe everyone loves cat videos 😊) and informs the visitor of this gap. She then presents the visitor with an option to contact us for more information.

Eva has learnt to understand the context of the topic in question. The answer is tailored depending on how the question is asked about “Predictive Analytics”. For example…

Chat

Go and try this for yourself, replacing “predictive analytics” with any of the topics below to get a relevant and contextual answer.

  • Advanced Analytics
  • Artificial Intelligence
  • Virtual Reality *
  • Augmented Reality *
  • Big Data *
  • Bot Framework
  • Business Intelligence
  • Cognitive Services *
  • Data Platform
  • Data Visualization *
  • Data Warehouse
  • Geospatial
  • IoT *
  • Machine Learning *
  • Predictive Analytics *

* Note that at the time of publishing of this article we only have videos for these topics. A comprehensive list of videos can be found here

Eva is ever evolving and she will soon become better at answering leading chained questions too.

GOTCHA: Eva was developed whilst the Azure Bot Service was in preview; bot names must now contain at least 4 characters, so a three-character name like “Eva” could no longer be registered.

Did this really help?

Often technology that looks appealing lacks a true business application.

But as you have seen from the example, we asked Eva about a video on a particular topic. Imagine instead using your intranet (e.g. SharePoint), data held in a database, or even an operating system as sources of information for Eva to interact with.

Authors: Chris Antonello (Data Analytics Consultant, Exposé) & Etienne Oosthuysen (Head of Technology and Solutions, Exposé)

Transforming the business into a data centric organisation through an Advanced Analytics and Big Data solution – our ACH Group case study

Big Data

An Advanced Analytics and Big Data solution allows for the acquisition, aggregation and blending of large volumes of data, often derived from multiple disparate sources, incorporating IoT, smart devices and predictive analytics into the solution.
Our ACH Group case study shows how a clever data platform architecture and design facilitates transformation into a data-centric organisation, in response to comprehensive regulatory changes, and leverages the opportunities presented by technology to create a better experience for customers and staff.

See the case study here: Exposé case study – ACH Group

See more about advanced analytics

Power BI Report Server – A Quick Walk Through

Power BI

Just about any implementation of a reporting environment within an organisation will have different styles of reports for different purposes. These reports may be operational/paginated, self-service using Excel, and/or highly interactive and user-friendly Power BI reports. The central question then becomes: how do all our users access these reports in one easily accessible location?

Typically, businesses that are heavy users of Microsoft products use SharePoint to provide a portal to the various documents and reports that need to be shared within the organisation, including self-service and operational style reports. SharePoint has been an ideal intranet tool for over 15 years and continues to provide an excellent service for content management. However, Power BI is taking a substantial share of the BI reporting market, even more so with organisations already heavily invested in Microsoft products. Therefore, having a portal that interacts with this reporting tool is becoming more and more necessary.

If your organisation wants a one-stop shop for its entire suite of reports, including Power BI, or would like to provide a portal for accessing reports without the overhead of a SharePoint installation, then you’ll be happy to learn that as of July 2017 there’s something for you!

We are happy to announce that the SQL Server Reporting Services portal, released with SQL Server 2016, now has an enhanced version named Power BI Report Server. Power BI Report Server provides a portal for several types of Microsoft reports, including Excel, but most importantly it hosts interactive Power BI reports. Previously, there was no mechanism for interacting with Power BI reports on-premises alongside other report types.

In this article, I will:

  • Discuss what this means for businesses.
  • Explain why you would use Power BI Report Server.
  • Provide some examples of what the portal looks like.
  • Highlight this as a great stepping stone from on-premises Power BI Desktop reporting to off-premises Power BI as a Service reporting.

What does this mean for businesses (business benefits)

For organisations who already have SQL Server Enterprise with Software Assurance:

The Power BI Report Server is available to you as part of your Software Assurance licensing agreement with SQL Server Enterprise Edition. Your business is primed to make use of the benefits that Power BI Report Server provides with a relatively simple installation process.

Instead of using a network file share or other methods to provide access to report files, upload the report files to the Power BI Report Server portal. Folders and security can be applied within the portal to restrict what AD members can see and do. Documentation files and other file types can also be uploaded, much like a content management system.

For organisations already using SharePoint for their Reporting Portal:

It is no longer necessary to use SharePoint as a portal to provide access to the suite of reports your business uses. Transitioning from SharePoint to Power BI Report Server can save considerably on licensing, as well as the time and skills needed to maintain your SharePoint server. Instead of continuing to depend on a complex and costly on-premises content management system, the Power BI Report Server is an almost like-for-like replacement for accessing reports, with one other significant benefit: interaction with Power BI.

For organisations wanting to use Power BI on premises:

Power BI Report Server is the on-premises portal of Power BI. Power BI is also the replacement for Power View, which has existed since SQL Server 2012, and is a significantly more feature-rich tool in comparison. Therefore, if you want to deploy Power BI Desktop reports on-premises within your organisation, alongside other reports such as Reporting Services and Excel, and make these reports accessible to a range of users through a portal, then Power BI Report Server is your best option.

See this short video on how the single Power BI Report Server hosts Excel, Power BI and Reporting Services reports and how the Portal is used to access all three.

Why Power BI Report Server

Power BI Report Server is the content management system for Power BI Desktop reports and other Microsoft-compatible reports, although it can store other file types too. It is an on-premises solution which provides a portal for, and integration of, several reporting services in one location, including Power BI Desktop, SSRS and Excel. The Power BI Report Server is more or less the SQL Server Reporting Services portal, except it provides, for the first time, a portal for the entire suite of reports that Microsoft offers. Power BI Report Server is the SQL Server Reporting Services portal on steroids!

  • Power BI Report Server is compatible with SSRS, Excel and Power BI Desktop files.
  • Other report styles are also offered within this portal such as:
    • Mobile Reports which are mobile device friendly
    • Report Builder for ad-hoc report generation
    • KPIs (Key Performance Indicators), which are simple reports that give an at-a-glance indication of a trend.
  • Data Connections can be created and subsequently used throughout the reports uploaded to the portal. Direct Connections include:
    • Microsoft SQL Server
    • Microsoft Azure SQL Database
    • Microsoft SQL Server Analysis Services
    • Microsoft SharePoint Lists
    • Oracle Essbase
    • SAP BW
    • OLE DB
    • ODBC
    • XML
  • Documents can be uploaded to the portal for use as training guides.
  • Branding, including colour schemes, can be applied
  • Active Directory user groups/members can be used to set access controls over what users can see and/or connect to.
  • Power BI essentially replaces the need for Power View, and is a much more powerful tool.
  • Power BI Desktop (on-premises) doesn’t have all the functionality that Power BI as a Service (cloud) has, such as natural language queries, dashboards and Power BI apps.
  • Power BI Desktop files render within the reporting portal, but can also be downloaded and opened in Power BI Desktop, allowing the user to make changes to the report, which can then be uploaded as a new report.
  • An Excel worksheet/template can be uploaded with a particular look and feel, ready for users to interact with without starting from scratch. The Excel file can be rendered within the browser if Office Online Server (OOS) has been installed on the server; otherwise, it will need to be downloaded to the user’s PC before it can be manipulated.

An example of what the Portal looks like

Power BI Report Server is a great stepping stone towards Power BI as a Service

An alternative to using Power BI Desktop is Power BI as a Service, the cloud offering of Power BI. Power BI in the cloud has its own style of content management system, but off-premises. Using Power BI Desktop on-premises, with Power BI Report Server as the portal to share these reports, is a great stepping stone towards a transition to Power BI in the cloud, as Power BI Desktop reports are fully compatible with the cloud service and can therefore be uploaded when you are ready in the future.

Conclusion

Power BI Report Server is the recommended portal for accessing reports generated by the entire suite of Microsoft reporting tools. Power BI is an industry leader in BI functionality and is highly recommended. The Power BI Report Server seamlessly integrates SSRS, Excel and Power BI Desktop reports, along with other reporting styles like Mobile and KPIs. It serves a similar role to SharePoint for content management, but with much less installation and administration overhead. Security via Windows AD groups can also be applied like-for-like compared with SharePoint, which makes for quite a simple transition from SharePoint to Power BI Report Server. Using Power BI Report Server and getting used to developing BI reports within Power BI also future-proofs your organisation, as Power BI as a Service will readily accept any Power BI reports developed for this platform, making them usable within the cloud.

* Update – 5th of November 2017

As mentioned in this video walkthrough, Power BI Report Server has its own release cycle, ensuring updates and enhancements are more frequently provided than would be if associated with SQL Server releases. The latest release of Power BI Report Server was in late October 2017 with these features:

  • Support for imported data in Power BI reports
  • Ability to view Excel workbooks within the web portal. This is done by configuring Office Online Server.
  • Support for the new Power BI table and matrix visuals.
  • REST API support
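On that last point, the Report Server REST API exposes the portal’s catalog (folders, reports, uploaded files) under an /api/v2.0/CatalogItems endpoint, per the public documentation. As a minimal sketch (the server URL is hypothetical, and only the request URL is built here, not sent):

```python
# Build the URL for listing catalog items on a Power BI Report Server portal.
def catalog_items_url(portal):
    """Return the CatalogItems endpoint for the given web portal URL."""
    return f"{portal.rstrip('/')}/api/v2.0/CatalogItems"

# Hypothetical on-premises portal address.
print(catalog_items_url("http://reports.contoso.local/reports"))
```

A GET against that URL (with Windows authentication) returns the items in the portal, which makes it possible to script deployments and inventory reports rather than managing everything by hand.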

More detail about the release notes is provided on the Power BI Website.

See another blog post about Power BI Here

Game changing – see how Power BI models become Azure Analysis Services models

Businesses are increasingly embracing an empowered regime when it comes to data analytics and business intelligence. Subject matter experts inside business units are increasingly at the forefront of the creation of data models on which reports and dashboards rely. Various technologies (such as Tableau, Qlik and Power BI) now facilitate user access to a wide variety of data sources and make the task of data modelling easier than ever before.

Until now, these technologies did not draw a clear separation between modelling on the one hand and reports and dashboards on the other. It meant that businesses locked themselves into one technology, which usually had costly licensing implications. One way businesses could overcome this inflexibility was to create data models in more advanced ICT-based semantic model technology such as SQL Server Analysis Services, Oracle Essbase or IBM Cognos TM1. But authoring models in these technologies was often not within the skill set of business-based subject matter experts.

In an ideal world, data workers (including the business based subject matter experts) want easy and cost-effective environments to create and deploy data models (data acquisition, data transformation, enhancements and relationships) without having to learn very complex coding and technical skills. And then for the business to leverage such deployed models, either in a related visual technology or in another technology they may prefer altogether.

We are happy to announce that this is becoming increasingly possible. Microsoft recently introduced the ability to import Power BI Desktop files into Analysis services, and this is a serious game changer.

In this article, I will:

  • Discuss what this means for businesses.
  • Briefly delve into why tools such as Power BI, Qlik and Tableau lacked modularity and how this is now changing with the play between Power BI Desktop and Analysis Services.
  • Walk the reader through deploying the Power BI Desktop authored model and how to make it an Analysis Services model.
  • Describe some examples of what is possible with a deployed model.
  • Use Tableau to connect to my new model for reporting and dash-boarding.

What does this mean for businesses (business benefits)

The user creates his/her model using Power BI Desktop, a free, easy-to-use business analytics tool provided by Microsoft. It has both comprehensive semantic modelling and reporting and dash-boarding capabilities, but for this article we focus on the data modelling rather than the visual capabilities. Once created, the user can deploy the solution to a Power BI environment, or import it into an Azure Analysis Services model.

Deploy to a Power BI environment – in this deployment model there is only limited separation between models, reports and dashboards. This applies to both the Power BI Service (cloud) and Power BI Report Server (on-premises). The deployed models remain available mostly to Power BI visualisations, and to some extent Excel.

Deploy to Analysis Services – if the solution is imported into Analysis Services, then the separation of model versus reports and dashboards is maximised. The advantages of this are:

  • The Analysis Services model (that started life in Power BI Desktop) can now be accessed through other BI tools the business may choose to use for visualisation and self-service (reports and dashboards), for example, Power BI, Excel, Tableau and Qlik.
  • It becomes easier for businesses to change their BI tools from one technology to another as the underlying data model (now in Analysis Services) remains in place.
  • The business can control performance by changing the pricing tier of Analysis Services and scale up during peak workloads, and scale down when there is less demand for the data model.

The business can better control cost by pausing Analysis Services during zero demand periods. This is typically a much more compelling cost model compared to conventional annual licences.
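A back-of-envelope sketch of that cost point, with an entirely hypothetical hourly rate (actual Azure pricing varies by tier and region):

```python
# Compare an always-on server with one paused outside business hours.
HOURLY_RATE = 2.0          # hypothetical cost per hour while the server runs
HOURS_PER_WEEK = 24 * 7

def weekly_cost(active_hours_per_week, rate=HOURLY_RATE):
    """Cost for a week when the server is paused outside active hours."""
    return active_hours_per_week * rate

always_on = weekly_cost(HOURS_PER_WEEK)    # never paused: 168 hours
business_hours = weekly_cost(5 * 10)       # paused nights and weekends
print(always_on, business_hours)           # 336.0 100.0
```

Under these made-up numbers, pausing outside a 50-hour working week cuts the weekly spend to less than a third, which is the essence of why the consumption model compares well to conventional annual licences.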

A question of modularity

Ever since Microsoft introduced Power Query in Power BI version 1 a few years back, data workers have had a powerful data modelling ally that gives them modelling capabilities (data acquisition, transformations, relationships, calculated columns and measures, and hierarchies) without having to understand complex coding or data modelling. Competitors such as Qlik and Tableau have similar capabilities, so a business’ preference for Power BI versus Qlik or Tableau (etc.) came down to factors such as familiarity, loyalty, perception and cost.
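Those modelling steps (acquisition, a calculated column, a relationship between tables, then the aggregation a visual would show) can be sketched in plain Python as a code analogy. Power Query itself uses the M language, and the tables below are made up:

```python
# A code analogy for semantic-model building: acquire, calculate, relate.
from collections import defaultdict

# Acquisition: two hypothetical source tables.
sales = [
    {"product_id": 1, "qty": 3, "price": 10.0},
    {"product_id": 1, "qty": 1, "price": 10.0},
    {"product_id": 2, "qty": 5, "price": 4.0},
]
products = {1: "Thongs", 2: "Hat"}

# Calculated column: derive revenue per row.
for row in sales:
    row["revenue"] = row["qty"] * row["price"]

# Relationship: resolve the product_id key against the products table,
# then aggregate the way a report visual would.
revenue_by_product = defaultdict(float)
for row in sales:
    revenue_by_product[products[row["product_id"]]] += row["revenue"]

print(dict(revenue_by_product))  # {'Thongs': 40.0, 'Hat': 20.0}
```

The value of a semantic model layer is that these definitions (the revenue calculation, the sales-to-products relationship) live once in the model, and every report built on top of it inherits them.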

The problem with this was a lack of modularity: a lack of separation between the model itself and the interactive report and dashboard capabilities the tools provide. If you created a model in Power BI Desktop, Qlik or Tableau, you were pretty much stuck with visualisations within your selected tool. There was no logical separation between the model and the visuals.

It is now possible to achieve modularity and separation of the model and visuals through the close relationship between Power BI and Analysis Services:

  • The data worker creates his/her model using Power BI Desktop.
  • The Power BI Desktop file (a PBIX file) is then imported into Analysis Services, and it becomes an Analysis Services model.
  • The Analysis Services model can then be accessed for development and enhancement by the business and ICT.
  • The Analysis Services model can be accessed by creators of self-service reports and dashboards through BI tools of their choice.

Gotcha – “Please note that for PBIX import, only Azure SQL Database, Azure SQL Data Warehouse, Oracle, and Teradata are supported as model data sources. Also, Direct Query models are not yet supported for import. Microsoft will be adding new connection types for import every month and add additional functionality.” Read more here

Caveat – I am not saying modularity is a prerequisite for great BI solutions! Some businesses are pretty happy wholly adopting a comprehensive BI technology that includes the data model, visuals and many other capabilities, but some of the business benefits achieved with modularity just won’t be available in such deployments.

I have a Power BI model; I want to maximize its use in other BI technologies, how do I do that?

If you do not already have an Azure Analysis Services service provisioned, one needs to be created:

  • Log into your Azure tenant.
  • Select New > Data + Analytics > Analysis Services.
  • Complete all the required settings, and Create.

1 provision aas

Add your Power BI model

  • Open your Analysis Services service and, if it’s not already started, click Start (please note you pay for this service for every minute it runs; billing stops when it is paused).
  • In the Overview pane, open the Web Designer (as at 7th October 2017, this feature is still in preview, so functionality is still a work in progress).
    • “Microsoft Azure Analysis Services Web Designer” is a new Analysis Services experience, currently still in Preview, that allows developers to create and manage Azure Analysis Services (AAS) semantic models quickly and easily. SQL Server Data Tools and SQL Server Management Studio remain the primary tools for development, but this new experience is intended to make simple changes fast and easy (including the ability to import Power BI Desktop authored models quickly).
  • Click on Add a new Model.
  • Select an appropriate name, and select the source of your data (either Azure SQL Database, Azure SQL Data Warehouse, or a Power BI Desktop file). For this article, we will select a Power BI Desktop file.

[Screenshot: importing a Power BI Desktop (PBIX) file]
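Because billing accrues per minute while the server runs, the pause/start step mentioned above is worth scripting. A minimal sketch using the Azure CLI, with the same hypothetical server and resource group names as before:

```shell
# Pause the server when not in use -- billing stops while it is suspended.
az analysisservices server suspend --name exposeaas --resource-group expose-bi-rg

# Resume it before users need to query the model again.
az analysisservices server resume --name exposeaas --resource-group expose-bi-rg
```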

  • Navigate to your Power BI Desktop file and select Import.
  • If your Power BI model uses a data source that is not yet supported, you will receive an error as per below, and you will have to wait until your data source becomes available. See the gotcha earlier in this article.

[Screenshot: unsupported data source error]

  • If your Power BI model uses supported data sources and functionality, then your Power BI Desktop file (PBIX) is converted to an Analysis Services model.

Quick access to Web Designer to edit and query the model

You can immediately access your model right here in Web Designer and perform simple drag and drop queries, basic development changes, and edit relationships:

  • Perform simple query drag and drop:

[Screenshots: drag-and-drop query]

  • Perform some basic development changes:

[Screenshot: basic development changes]

  • Edit relationships:

[Screenshot: editing relationships]

More comprehensive editing and querying

Alternatively, you can open the model in one of the following technologies:

  • Visual Studio – for comprehensive editing
  • Power BI Desktop – for comprehensive editing and visualisations
  • Excel – for visualisations

[Screenshot: opening the model in other tools]

Build Visualisations from your Model (in this example via Tableau)

  • Open your BI tool of choice, for example Power BI, Excel or Tableau. In this example, I am using Tableau Desktop (10.4 Professional). Each BI technology may have a slightly different way of doing this.

[Screenshot: Tableau Desktop start page]

  • Connect to a server.

[Screenshot: Tableau – connect to a server]

  • Select Microsoft Analysis Services.

[Screenshot: Tableau – select Microsoft Analysis Services]

  • Navigate to your Analysis Services service and copy the server information from the overview page:

[Screenshot: Analysis Services server name on the overview page]
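The server name copied from the overview page follows a predictable pattern: `asazure://<region>.asazure.windows.net/<servername>`. A tiny sketch that assembles it (the region and server name below are hypothetical placeholders):

```shell
# Hypothetical region and server name -- substitute your own values.
region="australiasoutheast"
server="exposeaas"

# Azure Analysis Services server URIs follow this pattern.
echo "asazure://${region}.asazure.windows.net/${server}"
```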

  • Paste the server information into the Server field.
  • Select “Use a specific username and password”.
    • Use the Azure Active Directory (O365) credentials associated with your Azure services.
  • Select Sign-in.

[Screenshot: Tableau – sign-in dialog]

  • Now select the specific Database and Cube.
    • The “Premiums and Claims” database and Model originated as a Power BI Desktop file, which was imported into Azure Analysis Services as explained earlier in this article.

[Screenshot: Tableau – select Database and Cube]

  • To start authoring your Visual reports, click on “Sheet 1”.

[Screenshot: Tableau – new sheet]

  • Here is an example of my Tableau report created over my Analysis Services model that originated in Power BI Desktop:

[Screenshot: example Tableau report over the Analysis Services model]

Scale the performance within minutes (up/down)

In case performance needs to be improved, navigate back to Analysis Services.

  • Select the Pricing Tier and scale the tier up to a higher performing service level (note this will mean higher pay-as-you-go costs).
    • Queries won’t be able to run during the scale-up/down process.
  • I observed significant performance improvements through the Tableau client when I scaled Analysis Services from tier S0 to S2.
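The same scale operation can be scripted; a sketch with the Azure CLI, using the same hypothetical names as earlier (and note the caveat above that queries cannot run while the scale operation is in progress):

```shell
# Scale the hypothetical server from tier S0 up to S2
# (more query processing capacity, at a higher pay-as-you-go cost).
az analysisservices server update \
  --name exposeaas \
  --resource-group expose-bi-rg \
  --sku S2
```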

Conclusion

As the Web Designer and its ability to import PBIX files as Analysis Services models reach general availability and mature thereafter, businesses that want to leverage the advantages of modularity should keep an eye on this functionality.

This is game-changing as it will put pressure on competitors (such as Qlik and Tableau) to open up their models to other BI vendors.

Breaking: Microsoft announced a brand new Azure Machine Learning service, available in the Australian region

Exciting news hot off the press – Microsoft just announced new features in the Azure Machine Learning offering, available in preview in the following regions: East US, West Central US, and Australia East.

One of the new offerings is Azure Machine Learning Services, which provides additional functionality and regions over the existing, user-friendly Azure Machine Learning Studio offering.

Azure Machine Learning Services includes:

  • ML on a bigger scale
  • AI-powered data wrangling
  • Spark
  • Docker
  • Cognitive Toolkit
  • TensorFlow
  • Caffe
  • Etc.
  • And relevant to Australian customers with data sovereignty concerns, the ability to provision this service in Australia East.

The differentiation seems to be that this new offering targets professional data scientists and allows for an end-to-end data science solution, whereas Azure Machine Learning Studio will still be used by data analytics professionals who are more casual data scientists.

In Azure Machine Learning Services, model development and training occur in Machine Learning Experimentation (i.e. the Azure Machine Learning Workbench application). When this is provisioned in Azure, it invokes a local Workbench installer.

Once installed, the user can access the Workbench app. This is where ML training and scoring projects are managed. The screenshot below clearly shows it is quite different from its Azure Machine Learning Studio predecessor.

Once you log in to the Workbench you will be in the Workbench Dashboard, which is where projects are created and managed. It also contains templates that can be used by new users as starting points to learn from.

There is a good high-level overview here.

Employee Expenses Reporting Platform – our South Australian Government case study

A solution to provide a regular view of employee expenses against budgets. Our solution overcomes a lack of capacity and flexibility in the incumbent Human Resources system. See our case study on the solution that combines data from disparate systems and presents it to the business in an easy-to-consume format, allowing the business to gain an unprecedented aggregated and detailed view of its expenses to enable cost reductions.

exposé case study – SA Government – Employee Expenses Reporting Solution

Transmissions Dashboard – our Energy Infrastructure Provider case study

The Transmissions Dashboard solution we designed and developed for a national Energy Infrastructure Provider involved both a data platform and analytical dashboards.

It reduced reporting turnaround potential by over 1000%, allowing staff to focus on business-critical tasks.

It embedded a trusted source of truth, improving the quality and consistency of analysis.

It improved analytical agility and timely decision making, in and out of the office.

Please see our case study here: exposé case study – Energy Infrastructure Provider – Transmission Dashboard