See how we used modern methodology, cloud analytical technologies and thought leadership to architect and create this public-facing interactive export analytics solution that empowers Australian wine exporters to make informed, data-driven decisions.
Exposé designed and developed a solution that uncovered whether there were strong relationships between known characteristics of branches and their success, in order to determine which new locations should be considered and the services they would offer.
Data preparation is undoubtedly one of the most competency-reliant and time-consuming parts of report generation. For this reason, it is fast becoming the new focal area for further development and we are seeing a large uptick in the number of options being made available to help alleviate these issues.
One recent entrant to this space is Tableau with the announcement of their new tool, Tableau Prep. This tool brings a new user experience to the artform of data preparation and follows the same user-centred design approach as their reporting tool.
Tableau Prep concentrates on providing a ‘no-code required’ solution to data preparation, with a view to making data preparation accessible to a greater number of users and giving organisations a quicker turnaround in wrangling datasets.
Every step within Tableau Prep is visual and shows the immediate effects of any transforms on data. Its strength lies in hiding the complex smart algorithms that carry out the data manipulation and surfacing them as one-click operations, which greatly simplifies the data preparation process.
The preparation paradigm concentrates on having the user set up a pathway from the dataset through to the output, introducing the required transformations along the way.
By clicking on an element in the workflow, it will bring up a secondary pane showing more details relevant to the step selected.
Adding steps within Tableau Prep is as simple as clicking on the “+” icon and choosing the appropriate method.
Interacting with data within Tableau Prep is similarly a visual experience. In the example below, in performing a group and replace operation, Tableau Prep has recognised that as a result of joining datasets together, some used the full state name of “California” and others used the contraction of “CA”. It then groups these together utilising a fuzzy matching algorithm and presents options so the user can choose the preferred representation of the data points.
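To give a feel for what such a group and replace operation does under the hood, here is a toy sketch in Python using the standard library’s difflib. This is only an illustration of the general idea, not Tableau Prep’s actual algorithm (which is smarter, offering pronunciation and common-character matching); the state values and threshold are made up for the example.

```python
from difflib import SequenceMatcher

def group_and_replace(values, threshold=0.3):
    """Toy fuzzy 'group and replace': cluster similar strings and map
    every member of a cluster to its longest spelling."""
    groups = []
    for value in values:
        for group in groups:
            # Compare case-insensitively against the group's first member.
            if SequenceMatcher(None, value.lower(), group[0].lower()).ratio() >= threshold:
                group.append(value)
                break
        else:
            groups.append([value])
    # Choose the longest member of each group as the canonical spelling.
    return {v: max(g, key=len) for g in groups for v in g}

mapping = group_and_replace(["California", "CA", "Texas", "TX"])
# "CA" is grouped with "California", "TX" with "Texas"
```

Tableau Prep performs this kind of grouping for you and, importantly, lets the user override which spelling becomes the chosen representation.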
Tableau Prep provides a visual summary of all changes that have been made within each step. As changes to data are made, it updates the preview in real time, allowing the user to see the effect this has had.
After creating the transformation pathway, generating an output file is done as a final step. Currently Tableau Prep is very focussed on its integration with the Tableau product set. It automatically publishes to Tableau Desktop, Server and Online, but it also offers output in CSV format.
In summary, Tableau Prep is going to enhance the ability of analysts who are used to working with the Tableau product suite. Whilst it won’t replace other more mature and prevalent data preparation products on the market such as Alteryx, Trifacta or KNIME, it does offer a significant productivity opportunity to Tableau-focussed organisations.
With the release of Tableau Prep, Tableau has also introduced new subscription offerings.
The new subscription levels – Tableau Creator, Explorer and Viewer – have been packaged around expected usage within an organisation. Tableau Prep has been included in the Creator package along with Tableau Desktop.
If you’ve been following Microsoft’s recent press releases, chances are you’ll have been exposed to the term “Common Data Service” (CDS). Read on as we shed light on the exact nature of CDS and what it can mean to your business.
Back in November 2016, Microsoft released their Common Data Service to general availability. In a nutshell, CDS is Microsoft’s attempt at providing a solution to counter the time and effort customers are spending to bring together disparate apps, services and solutions. At its most basic level, it provides a way to connect disparate systems around a focal point of your data. The intention is that Microsoft will provide the “heavy lifting” required to ensure the data flows back and forth as required.
To achieve this, Microsoft has defined a set of core business entities and then built them into what is known as the Common Data Model (CDM). For example, they have exposed entities for managing data around Accounts, Employees, Products, Opportunities and Sales Orders (for a full list see: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/reference/about-entity-reference). Where there isn’t an existing entity to suit a business requirement, Microsoft has made the CDM extensible, which allows you to add to your organisation’s instance of the CDM to meet your needs. As your organisation adapts and changes your CDS instance, Microsoft will then monitor this and look for common patterns amongst the business community that it will use to modify and extend the standard CDS.
Microsoft is committed to making their applications CDS aware and is working with their partners to get third party applications to interact effectively with the CDS.
When establishing CDS integration from an organisational use perspective, it should ideally be a simple configuration of a connector from a source application to the CDS, aligning its data entities with the reciprocal entities within the CDS. This ensures that as products are changed to meet business needs over time, the impact on other systems should be almost negligible. It also negates the need for an organisation to spend excessive time architecting a solution to bring together disparate apps and siloed information; this can now be handled through the CDS.
Since its release in 2016, CDS has evolved, with Microsoft recently announcing two new services: Common Data Service for Apps (CDS for Apps) and Common Data Service for Analytics (CDS for Analytics).
CDS for Apps was released in January 2018, with CDS for Analytics expected for release in the second quarter of 2018. As a snapshot of how the various “pieces” fit together, Figure 1 provides a logical view of how the services will interact.
Common Data Service for Apps
CDS for Apps was initially designed for businesses to engage with their data on a low-code/no-code basis through Microsoft’s PowerApps product. This allows a business to rapidly develop scalable, secure and feature-rich applications.
For organisations needing further enhancement, Microsoft offers developer extensions to engage with CDS for Apps.
Common Data Service for Analytics
CDS for Analytics was designed to function with Power BI as the visual reporting product. Just as CDS for Apps is extensible by developers, CDS for Analytics will also provide extensibility options.
Figure 2 below provides the current logic model for how CDS for Analytics will integrate.
Implementing CDS for Apps and CDS for Analytics will enable you to easily capture data and then accelerate your ability to deliver insights into your business data.
To assist in this acceleration, Microsoft and its partners, including Exposé, will be building industry-specific apps that immediately surface deep insights into an organisation’s data. An initial example is currently being developed by Microsoft: Power BI for Sales Insights will address the maximisation of sales productivity by providing insights into which opportunities are at risk and where salespeople could be spending their time more efficiently.
This ease of development and portability of solutions isn’t possible, however, without a standardised data model. With the suite of Microsoft’s platform products being CDS aware, leveraging tools such as Azure Machine Learning and Azure Databricks for deeper analysis of your organisation’s data becomes transformational.
If you’d like to understand more about how to take advantage of the Common Data Service or for further discussion around how it can assist your business, please get in touch.
Power BI has truly evolved over the past few years: from an add-on in Excel to a true organisation-wide BI platform, capable of scaling to meet the demands of large organisations both in terms of data volumes and the number of users. Power BI now has multiple flavours and a much more complicated licencing model. So, in this article, we demystify this complexity by describing each flavour of Power BI and its associated pricing. We summarise it all at the end with some scenarios and a single cheat sheet for you to use.
Desktop, Cloud, On-premise, Pro, Premium, Embedded – what does all of this mean?
I thought it best to separate the “why” (i.e. why do you use Power BI – development or consumption), the “what” (i.e. what can you do given your licence variant), and the “how much” (i.e. how much is it going to cost you), because combining these concepts often leads to confusion: there isn’t necessarily an easy mapping between why, what and how much.
Let’s first look at the “why”
“Why” deals with the workload performed with Power BI based on its deployment – i.e. why do you use Power BI: is it for development or for consumption? This is very much related to the deployment platform (i.e. Desktop, Cloud, On-Premise or Embedded).
The term “consumption” for the purpose of this article could range from a narrow meaning (i.e. the consumption of Power BI content only) to a broad meaning (i.e. consumption of, collaboration over, and management of Power BI content – I refer to this as “self-serve creators”).
Now let’s overlay the “why” with “what”
In the table above, I not only dealt with the “why”, but I also introduced the variants of Power BI; namely Desktop, Free, Pro, On-Premise and Embedded. Variants are related to the licence under which the user operates, and they determine what a user can do.
Confused? Stay with me…all will become clearer.
Lastly let’s look at the “how much”
The Power BI journey (mostly) starts with development in Desktop, then proceeds to a deployed environment where it is consumed (with or without self-serve). Let’s close the loop on understanding the flavours of Power BI by looking at what this means from a licencing cost perspective.
Disclaimer: The pricing supplied in the following table is based on US, Australian, New Zealand and Hong Kong dollars. These $ values are by no means quotes but are merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.
**Other ways to embed Power BI content are via REST APIs (authenticated), SharePoint Online (via Pro licencing) and Publish to Web (unauthenticated), but that is a level of detail for another day. For the purpose of this article, we focus on Power BI Embedded as the only embedded option.
Pro is pervasive
Even if you deploy to the Cloud and intend to make content available only to pure consumers (non-self-serve users), whether in powerbi.com or as embedded visuals, you will still need at least one Pro licence to manage your content. The more visual content creators (self-serve creators) you have, the more Pro licences you will need. It is worth considering the mix between Pro and Premium licences: both Pro and Premium users can consume shared content, but only Pro users can create shared content (via self-service), so the mix should be determined by a cost vs capacity ratio (as discussed below).
A little bit more about Premium
Premium allows users to consume shared content only; it does not allow for any self-service capabilities. Premium licences are not per user but are instead based on planned capacity, so you pay for a dedicated node to serve your users. Consider Premium licencing for organisations with large numbers of consumers (non-self-serve) that also require dedicated compute to handle capacity. The organisation would still require one or more Pro licences for content management and any self-serve workload.
Premium licencing is scaled as Premium 1, 2 or 3 (P1, P2 or P3), dependent on the number of users and required capacity. You can scale out your capacity by adding more P1, P2 or P3 nodes, or scale up from P1 to P2, and from P2 to P3.
The mix between Pro and Premium
Given that Pro users can do more than Premium users, and given that you will need to buy one or more Pro licences anyway, why would you not simply use Pro rather than Premium? There are two reasons:
There is a tipping point where Pro becomes more expensive compared to Premium, and
With Pro licences you use a shared pool of Azure resources, so Pro is not as performant as Premium, which uses dedicated resources. There is therefore a second tipping point where your capacity requirements won’t be sufficiently served by Pro.
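The first of these tipping points is simple arithmetic. The sketch below uses the Australian dollar figures quoted later in this article ($12.70 per Pro licence and $6,350 per P1 node per month); real pricing varies by region and date, so treat the numbers as illustrative only.

```python
# Figures from this article (A$ per month); illustrative only.
PRO_PER_USER = 12.70   # one Power BI Pro licence
P1_NODE = 6350.00      # one dedicated Premium P1 node

def tipping_point_users(pro=PRO_PER_USER, node=P1_NODE):
    """Number of pure consumers at which one Premium node costs the
    same as giving every consumer their own Pro licence."""
    return node / pro

# Beyond roughly 500 pure consumers, a P1 node becomes the cheaper option.
```

The capacity tipping point is harder to compute in advance, as it depends on workload and the performance of the shared Pro pool at any given time.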
The diagram below shows the user and capacity tipping points (discussed further in scenario 1 below):
Put this all together
Right, you now understand the “why”, “what” and “how much” – let’s put it all together through examples (I will use Australian $ only for illustrative purposes). Please note that there are various ways to achieve the scenarios below and this is not a comprehensive discussion of all the options.
A large organisation has 10 Power BI developers; their Power BI rollout planning suggests that they will grow to 50 self-serve creators and 1,450 additional high-activity consumers in 12 months, and to 125 self-serve creators and 5,000 high-activity consumers in 48 months:
Initially, they will require
10 x Power BI Desktop licences = $0 x 10 = $0
500 x Power BI Pro licences to cover both self-serve users and consumers = $12.70 x 500 = $6,350
Total – A$6,350.00pm
Once they exceed 500 users, they can switch to
50 x Power BI Pro licences to cover self-serve users = $12.70 x 50 = $635
1 x P1 node to cover the next tranche of high activity consumers = $6,350
Total – A$6,985.00pm
Add Power BI Pro licences as required up to their planned 125 = $12.70 x 125 = $1,588
Add 1 additional P1 node at 1,450 users, and again at 2,900 users, and again at 4,250 users = $25,400 for 4 x P1 nodes
Total after 4 years at 5000 high activity consumers and 125 self-serve creators – A$26,988.00pm
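The arithmetic in this scenario can be sanity-checked in a few lines, working in cents to avoid floating-point rounding (prices are the article’s A$ figures):

```python
# Article prices in Australian cents per month (integers avoid
# floating-point rounding in the totals).
PRO = 1270      # one Power BI Pro licence
P1 = 635000     # one Premium P1 node

initial = 500 * PRO             # 50 creators + 450 consumers, all on Pro
switched = 50 * PRO + 1 * P1    # past the tipping point: creators on Pro, one P1 node
final = 125 * PRO + 4 * P1      # 48-month plan: 125 creators on Pro, 4 P1 nodes

print(initial / 100, switched / 100, final / 100)
```

This gives A$6,350.00, A$6,985.00 and A$26,987.50 per month respectively; the article rounds the 125-licence Pro subtotal up to $1,588, hence the A$26,988 total shown above.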
A small organisation with 1 Power BI developer, 5 additional self-service creators and 10 additional consumers of visual content, with no custom applications/ websites.
1 x Free version of Power BI Desktop: 1 x $0
15 x Pro licences as both visual creators and mere consumers will take part in shared content: 15 x $12.70
Total – A$190.50pm
A small ISV organisation with 3 Power BI developers wants to embed Power BI content in an application that they sell. The application must be up 24x7 and does not require a very high volume of concurrent users, but licencing cannot be on a per-user basis.
3 x Free version of Power BI Desktop: 3 x $0
1 x Pro licence acting as the master of the shared content: 1 x $12.70
A1 Node pricing: 1 x $937
Total – A$950.00pm
A medium-sized organisation with 5 Power BI developers wants to embed Power BI content in an internal portal such as SharePoint, which is used by potentially 250 users. They also have 10 self-service creators and 25 consumers of Power BI content through the Power BI portal.
5 x Free version of Power BI Desktop: 5 x $0
26 x Pro licences (1 master of the shared content and 25 consumers): 26 x $12.70 = $330.20
A1 Node pricing: 1 x $937
Total – A$1,267.20pm
Power BI – licence variant, workload, deployment & cost cheat sheet
All prices are shown in Australian $
Disclaimer: The pricing supplied in the following table are by no means quotes, but merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.
The Internet of Mice – Our IoT and Advanced Analytics Solution
Understanding how animals involved in research move and eliminating as much human handling as possible makes for a much more humane environment for the animals. The outcome is more accurate results for the researchers. See how our IoT and Advanced Analytics solution developed for our customer strives towards a humane research environment and delivers more intelligent insights to researchers.
Businesses are increasingly embracing an empowered regime when it comes to data analytics and business intelligence. Subject matter experts inside business units are increasingly at the forefront of creating the data models on which reports and dashboards rely. Various technologies (such as Tableau, Qlik and Power BI) now facilitate user access to a wide variety of data sources and make the task of data modelling easier than ever before.
Until now, these technologies did not draw a clear separation between modelling and reporting/dashboarding. This meant that businesses locked themselves into one technology, which usually had costly licencing implications. One way businesses could overcome this inflexibility was to create data models in more advanced ICT-based semantic model technology such as SQL Server Analysis Services, Oracle Essbase or IBM Cognos TM1. But authoring models in these technologies was often not within the skill set of business-based subject matter experts.
In an ideal world, data workers (including the business based subject matter experts) want easy and cost-effective environments to create and deploy data models (data acquisition, data transformation, enhancements and relationships) without having to learn very complex coding and technical skills. And then for the business to leverage such deployed models, either in a related visual technology or in another technology they may prefer altogether.
We are happy to announce that this is becoming increasingly possible. Microsoft recently introduced the ability to import Power BI Desktop files into Analysis services, and this is a serious game changer.
In this article, I will:
Discuss what this means for businesses.
Briefly delve into why tools such as Power BI, Qlik and Tableau lacked modularity and how this is now changing with the interplay between Power BI Desktop and Analysis Services.
Walk the reader through deploying the Power BI Desktop authored model and how to make it an Analysis Services model.
Describe some examples of what is possible with a deployed model.
Use Tableau to connect to my new model for reporting and dash-boarding.
What does this mean for businesses (business benefits)
The user creates his/her model using Power BI Desktop; a free, easy to use business analytics tool provided by Microsoft. It has both comprehensive semantic modelling and reporting and dash-boarding capabilities, but for this article, we focus on the data modelling rather than the visual capabilities. Once created, the user can deploy their solution to a Power BI environment, or import it into an Azure Analysis Service model.
Deploy to a Power BI environment – In this deployment model, there is only limited separation between models, reports and dashboards. This applies to both Power BI Service (cloud) and Power BI Report Server (on-premise). The deployed models remain available mostly to Power BI visualisations and, to some extent, Excel.
Deploy to Analysis Services – if the solution is imported into Analysis Services, then separation of the model from reports and dashboards is maximised. The advantages of this are:
The Analysis Services model (that started life in Power BI Desktop) can now be accessed through other BI tools the business may choose to use for visualisation and self-service (reports and dashboards), for example, Power BI, Excel, Tableau and Qlik.
It becomes easier for businesses to change their BI tools from one technology to another as the underlying data model (now in Analysis Services) remains in place.
The business can control performance by changing the pricing tier of Analysis Services, scaling up during peak workloads and scaling down when there is less demand for the data model.
The business can better control cost by pausing Analysis Services during zero demand periods. This is typically a much more compelling cost model compared to conventional annual licences.
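Because Analysis Services bills per minute of uptime, pausing it outside demand periods translates directly into savings. A minimal sketch of the comparison, with a purely hypothetical hourly rate (check the Azure pricing calculator for real per-region, per-tier figures):

```python
# Hypothetical hourly rate for an Analysis Services tier; check the
# Azure pricing calculator for real per-region, per-tier figures.
HOURLY_RATE = 1.00  # A$ per hour (made up for illustration)

def paused_monthly_cost(hours_per_weekday, weekdays=22, rate=HOURLY_RATE):
    """Monthly cost when the service runs only during business hours
    and is paused the rest of the time."""
    return hours_per_weekday * weekdays * rate

always_on = 24 * 30 * HOURLY_RATE   # ~720 billable hours per month
paused = paused_monthly_cost(10)    # 10h x 22 weekdays = 220 billable hours
```

Under these illustrative assumptions, pausing outside a 10-hour business day cuts billable hours by roughly two thirds, which is the “compelling cost model” referred to above.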
A question of modularity
Ever since Microsoft introduced Power Query in Power BI version 1 a few years back, data workers have had a powerful data modelling ally that gave them modelling capabilities (data acquisition, transformations, relationships, calculated columns and measures, and hierarchies) without having to understand complex coding or data modelling skills. Competitors such as Qlik and Tableau have similar capabilities, so a business’ preference for Power BI vs Qlik or Tableau (etc.) came down to factors such as familiarity, loyalty, perception and cost.
The problem with this was a lack of modularity – a lack of separation between the model itself and the interactive visual report and dashboard capabilities the tools provide. If you created a model in Power BI Desktop, Qlik or Tableau, you were pretty much stuck with the visualisations within your selected tool. There was no logical separation between the model and the visuals.
It is now possible to achieve modularity and separation of the model and visuals through the close relationship between Power BI and Analysis Services:
The data worker creates his/her model using Power BI Desktop.
The Power BI Desktop file (a PBIX file) is then imported into Analysis Services, and it becomes an Analysis Services model.
The Analysis Services model can then be accessed for development and enhancement by the business and ICT.
The Analysis Services model can be accessed by creators of self-service reports and dashboards through BI tools of their choice.
Gotcha – “Please note that for PBIX import, only Azure SQL Database, Azure SQL Data warehouse, Oracle, and Teradata are supported as model data sources. Also, Direct Query models are not yet supported for import. Microsoft will be adding new connection types for import every month and add additional functionality” – Read more here
Caveat – I am not saying modularity is a prerequisite to great BI solutions! Some businesses are pretty happy with wholesale adoption of a comprehensive BI technology that includes the data model, visuals and many other capabilities, but some of the business benefits achieved with modularity just won’t be available in such deployments.
I have a Power BI model; I want to maximise its use in other BI technologies, how do I do that?
If you do not already have an Azure Analysis Services service provisioned, one needs to be created:
Log into your Azure tenant.
Select New > Data + Analytics > Analysis Services.
Complete all the required settings, and Create.
Add your Power BI model
Open your Analysis Services service and, if it’s not already started, click on Start (please note you pay for this service for every minute it runs; costs stop when it is paused).
In the Overview pane, open the Web Designer (as at 7th October 2017, this feature is still in preview, so functionality is still a work in progress).
“Microsoft Azure Analysis Services Web Designer” is a new Analysis Services experience, currently still in Preview, that allows developers to create and manage Azure Analysis Services (AAS) semantic models quickly and easily. SQL Server Data Tools and SQL Server Management Studio remain the primary tools for development, but this new experience is intended to make simple changes fast and easy (including the ability to import Power BI Desktop authored models quickly).
Click on Add a new Model.
Select an appropriate name, and select the source of your data (this is either Azure SQL Database, Azure SQL Data warehouse, or Power BI Desktop file). For this article, we will select a Power BI Desktop file.
Navigate to your Power BI Desktop file and select Import.
If your Power BI model uses a data source not yet supported, then you will receive an error as per below, and you will have to wait until your data source becomes available. See the gotcha earlier in this article.
If your Power BI model uses available data sources and functionality, then your Power BI Desktop file (PBIX) is converted to Analysis Services.
Quick access to Web Designer to edit and query the model
You can immediately access your model right here in Web Designer and perform simple drag and drop queries, basic development changes, and edit relationships:
Perform simple query drag and drop:
Perform some basic development changes:
More comprehensive editing and querying
You can alternatively also open the model in one of the following technologies:
Visual Studio – for comprehensive editing
Power BI Desktop – for comprehensive editing and visualisations
Excel – for visualisations
Build Visualisations from your Model (in this example via Tableau)
Open your BI tool of choice, for example, Power BI, Excel or Tableau. In this example, I am using Tableau Desktop (10.4 Professional). Each BI technology may have slightly different ways of doing this.
Connect to a server.
Select Microsoft Analysis Services.
Navigate to your Analysis Services service and copy the server information from the overview page:
Paste the server information into the Server field.
Select “Use a specific username and password”.
Use the Azure Active Directory (O365) credentials used for your Azure services.
Now select the specific Database and Cube.
The “Premiums and Claims” database and Model originated as a Power BI Desktop file, which was imported into Azure Analysis Services as explained earlier in this article.
To start authoring your Visual reports, click on “Sheet 1”.
Here is an example of my Tableau report created over my Analysis Services model that originated in Power BI Desktop:
Scale the performance within minutes (up/ down)
In case performance needs to be improved, navigate back to Analysis Services.
Select the Pricing Tier and scale the tier up to a higher-performing service level (note this will mean higher pay-as-you-go costs).
During the scale up/down process, queries won’t be able to run.
I observed significant performance improvements through the Tableau client when I scaled Analysis Services from tier S0 to S2.
As the Web Designer and its ability to import PBIX files as Analysis Services models goes into general availability and matures thereafter, businesses that want to leverage the advantages of modularity will have to keep an eye on this functionality.
This is game-changing as it will put pressure on competitors (such as Qlik and Tableau) to open up their models to other BI vendors.