Our Local Government case study shows how establishing a corporate report style and data model framework helped business report designers build their operational reports in a more streamlined and consistent fashion, and provided the IT team with a foundation to expand on.
See how we used modern methodology, cloud analytical technologies and thought leadership to architect and build this public-facing, interactive export analytics solution that empowers Australian wine exporters to make informed, data-driven decisions.
Exposé designed and developed a solution that uncovered whether there were strong relationships between the known characteristics of branches and their success, in order to determine which new locations should be considered and the services they would offer.
Data preparation is undoubtedly one of the most competency-reliant and time-consuming parts of report generation. For this reason, it is fast becoming the new focal area for further development and we are seeing a large uptick in the number of options being made available to help alleviate these issues.
One recent entrant to this space is Tableau, with the announcement of their new tool, Tableau Prep. This tool brings a new user experience to the art of data preparation and follows a similar user-centred design approach to their reporting tool.
Tableau Prep concentrates on providing a ‘no code required’ approach to data preparation, with a view to making it accessible to a greater number of users and giving organisations a quicker turnaround when wrangling datasets.
Every step within Tableau Prep is visual and shows the immediate effect of any transform on the data. Its strength lies in hiding the complex algorithms that carry out the data manipulation and surfacing them as one-click operations, which greatly simplifies the data preparation process.
The preparation paradigm concentrates on having the user set up a pathway from the dataset through to the output, introducing the required transformations along the way.
Clicking on an element in the workflow brings up a secondary pane showing more detail relevant to the selected step.
Adding steps within Tableau Prep is as simple as clicking on the “+” icon and choosing the appropriate method.
Interacting with data within Tableau Prep is similarly a visual experience. In the example below, when performing a group and replace operation, Tableau Prep has recognised that, as a result of joining datasets together, some records used the full state name of “California” and others used the contraction “CA”. It groups these together using a fuzzy matching algorithm and presents the options so the user can choose which value should represent the grouped data points.
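For readers who like to see what sits behind such a step, the short sketch below approximates a comparable group-and-replace operation in Python using pandas and difflib. The column name, sample values and similarity cut-off are illustrative assumptions only; this is not Tableau Prep’s actual algorithm, which remains hidden behind its one-click interface.

```python
# A rough, hand-rolled equivalent of a "group and replace" clean-up step.
# Column name, sample values and the 0.8 cut-off are illustrative assumptions.
import difflib
import pandas as pd

df = pd.DataFrame({"state": ["California", "Calfornia", "californa", "Texas", "Texsa"]})

canonical = ["California", "Texas"]  # the values the user chooses to keep

def group_value(value: str, choices: list[str], cutoff: float = 0.8) -> str:
    """Map a value to its closest canonical spelling, or leave it unchanged."""
    match = difflib.get_close_matches(value.title(), choices, n=1, cutoff=cutoff)
    return match[0] if match else value

df["state_grouped"] = df["state"].map(lambda v: group_value(v, canonical))
print(df)

# Pure abbreviations such as "CA" are better handled with an explicit lookup,
# e.g. df["state_grouped"] = df["state_grouped"].replace({"CA": "California"})
```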
Tableau Prep provides a visual summary of all changes that have been made within each step. As changes to data are made, it updates the preview in real time, allowing the user to see the effect this has had.
After creating the transformation pathway, generating an output is done as a final step. Currently, Tableau Prep is very focussed on its integration with the Tableau product set: it publishes directly to Tableau Desktop, Server and Online, but it also offers output in CSV format.
In summary, Tableau Prep is going to enhance the ability of analysts who are used to working with the Tableau product suite. Whilst it won’t replace other, more mature and prevalent data preparation products on the market such as Alteryx, Trifacta or KNIME, it does offer a significant productivity opportunity to Tableau-focussed organisations.
With the release of Tableau Prep, Tableau has also introduced new subscription offerings.
The new subscription levels, Tableau Creator, Explorer and Viewer, have been packaged around expected usage within an organisation. Tableau Prep has been included in the Creator package along with Tableau Desktop.
If you’ve been following Microsoft’s recent press releases, chances are you’ll have been exposed to the term “Common Data Service” (CDS). Read on as we shed light on the exact nature of CDS and what it can mean to your business.
Back in November 2016, Microsoft released their Common Data Service to general availability. In a nutshell, CDS is Microsoft’s attempt at providing a solution to counter the time and effort customers are spending to bring together disparate apps, services and solutions. At its most basic level, it provides a way to connect disparate systems around a focal point of your data. The intention is that Microsoft will provide the “heavy lifting” required to ensure the data flows back and forth as required.
To achieve this, Microsoft has defined a set of core business entities and then built them into what is known as the Common Data Model (CDM). For example, they have exposed entities for managing data around Accounts, Employees, Products, Opportunities and Sales Orders (for a full list see: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/reference/about-entity-reference). Where there isn’t an existing entity to suit a business requirement, Microsoft has made the CDM extensible, which allows you to add to your organisation’s instance of the CDM to meet your needs. As organisations adapt and change their CDS instances, Microsoft monitors for common patterns across the business community and uses them to modify and extend the standard model.
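To make the entity model a little more concrete, the hypothetical sketch below shows how an application might read standard “account” records from a CDS environment through its OData Web API. The environment URL, API version and the access token are placeholders for illustration; the exact endpoint and authentication flow will depend on your own environment.

```python
# Hypothetical sketch: reading standard CDM "account" records from a CDS
# environment via its OData Web API. The environment URL and the way the
# OAuth2 token is obtained are assumptions for illustration only.
import requests

ENVIRONMENT_URL = "https://yourorg.crm6.dynamics.com"  # placeholder environment
ACCESS_TOKEN = "<OAuth2 bearer token from Azure AD>"    # obtained separately

response = requests.get(
    f"{ENVIRONMENT_URL}/api/data/v9.0/accounts",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
    },
    params={"$select": "name,accountnumber", "$top": "10"},
)
response.raise_for_status()

# The OData response wraps the returned entities in a "value" array.
for account in response.json()["value"]:
    print(account["name"], account.get("accountnumber"))
```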
Microsoft is committed to making their applications CDS aware and is working with their partners to get third party applications to interact effectively with the CDS.
From an organisational perspective, establishing CDS integration should ideally be a simple matter of configuring a connector from a source application to the CDS, aligning its data entities with the corresponding entities within the CDS. This ensures that as products change to meet business needs over time, the impact on other systems is almost negligible. It also removes the need for an organisation to spend excessive time architecting a solution to bring together disparate apps and siloed information; this can now be handled through the CDS.
Since its release in 2016, CDS has evolved, with Microsoft recently announcing two new services: Common Data Service for Apps (CDS for Apps) and Common Data Service for Analytics (CDS for Analytics).
CDS for Apps was released in January 2018, with CDS for Analytics expected for release in the second quarter of 2018. As a snapshot of how the various “pieces” fit together, Figure 1 provides a logical view of how the services will interact.
Common Data Service for Apps
CDS for Apps was initially designed for businesses to engage with their data on a low-code/no-code basis through Microsoft’s PowerApps product. This allows a business to rapidly develop scalable, secure and feature-rich applications.
For organisations needing further enhancement, Microsoft offers developer extensions to engage with CDS for Apps.
Common Data Service for Analytics
CDS for Analytics was designed to function with Power BI as the visual reporting product. In the same way that CDS for Apps is extensible by developers, CDS for Analytics will also provide extensibility options.
Figure 2 below provides the current logic model for how CDS for Analytics will integrate.
Implementing CDS for Apps and CDS for Analytics will enable you to capture data easily and accelerate your ability to derive insights from your business data.
To assist in this acceleration, Microsoft and partners such as Exposé will be building industry-specific apps that immediately surface deep insights into an organisation’s data. An initial example is currently being developed by Microsoft: Power BI for Sales Insights will help maximise sales productivity by providing insights into which opportunities are at risk and where salespeople could be spending their time more efficiently.
This ease of development and portability of solutions would not be possible, however, without a standardised data model. By leveraging Microsoft’s new common data services, and with Microsoft’s platform of products being CDS aware, using tools such as Azure Machine Learning and Azure Databricks for deeper analysis of your organisation’s data becomes transformational.
If you’d like to understand more about how to take advantage of the Common Data Service or for further discussion around how it can assist your business, please get in touch.
Power BI has truly evolved over the past few years: from an add-on in Excel to a true organisation-wide BI platform, capable of scaling to meet the demands of large organisations, both in terms of data volumes and the number of users. Power BI now has multiple flavours and a much more complicated licensing model. So, in this article, we demystify this complexity by describing each flavour of Power BI and its associated pricing. We summarise it all at the end with some scenarios and a single cheat sheet for you to use.
Desktop, Cloud, On-premise, Pro, Premium, Embedded – what does all of this mean?
I thought it best to separate the “why” (i.e. why do you use Power BI: Development or Consumption), the “what” (i.e. what you can do given your licence variant) and the “how much” (i.e. how much it is going to cost you), as combining these concepts often leads to confusion; there isn’t necessarily an easy mapping between the why, the what and the how much.
Let’s first look at the “why”
“Why” deals with the workload performed with Power BI based on its deployment, i.e. why do you use Power BI: is it for Development or for Consumption? This is very much related to the deployment platform (i.e. Desktop, Cloud, On-Premise or Embedded).
The term “consumption”, for the purpose of this article, could range from a narrow meaning (i.e. the consumption of Power BI content only) to a broad meaning (i.e. consumption of, collaboration over, and management of Power BI content; I refer to these users as “self-serve creators”).
Now let’s overlay the “why” with “what”
In the table above, I not only dealt with the “why”, but also introduced the variants of Power BI, namely Desktop, Free, Pro, On-Premise and Embedded. Variants relate to the licence under which the user operates and determine what a user can do.
Confused? Stay with me…all will become clearer.
Lastly let’s look at the “how much”
The Power BI journey (mostly) starts with development in Desktop, then proceeds to a deployed environment where it is consumed (with or without self-serve). Let’s close the loop on understanding the flavours of Power BI by looking at what this means from a licensing cost perspective.
Disclaimer: The pricing supplied in the following table is based on US, Australian, New Zealand and Hong Kong dollars. These $ values are by no means quotes, but are merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.
**Other ways to embed Power BI content are via REST APIs (authenticated), SharePoint Online (via Pro licensing) and Publish to Web (unauthenticated), but that is a level of detail for another day. For the purpose of this article, we focus on Power BI Embedded as the only embedded option.
Pro is pervasive
Even if you deploy to the cloud and intend to make content available only to pure consumers (non-self-serve users), whether through PowerBI.com or as embedded visuals, you will still need at least one Pro licence to manage your content. The more visual content creators (self-serve creators) you have, the more Pro licences you will need. It is worth considering the mix between Pro and Premium licences: both Pro and Premium users can consume shared content, but only Pro users can create shared content (via self-service), so the mix must be determined by a cost versus capacity trade-off (as discussed below).
A little bit more about Premium
Premium allows users to consume shared content only; it does not allow for any self-service capabilities. Premium licences are not per user but are instead based on planned capacity, so you pay for a dedicated node to serve your users. Consider Premium licensing for organisations with large numbers of consumers (non-self-serve) that also require dedicated compute to handle capacity. The organisation would still require one or more Pro licences for content management and any self-serve workload.
Premium licensing is scaled as Premium 1, 2 or 3 (P1, P2 or P3) depending on the number of users and the required capacity. You can scale out your capacity by adding more P1, P2 or P3 nodes, or scale up from P1 to P2, and from P2 to P3.
The mix between Pro and Premium
Given that Pro users can do more than Premium users, and given that you will need to buy one or more Pro licences anyway, why would you not simply use Pro rather than Premium? There are two reasons:
There is a tipping point where Pro becomes more expensive than Premium, and
With Pro licences you use a shared pool of Azure resources, which is not as performant as Premium’s dedicated resources, so there is a second tipping point where your capacity requirements won’t be sufficiently served by Pro.
The diagram below shows the user and capacity tipping points (discussed further in scenario 1 below):
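If you prefer numbers to pictures, a few lines of Python make the user tipping point explicit, using the indicative Australian prices quoted in the scenarios below (roughly A$12.70 per Pro user per month and A$6,350 per P1 node per month). Treat this as a back-of-the-envelope sketch, not official pricing.

```python
# Back-of-the-envelope check of the Pro vs Premium user tipping point, using
# the indicative Australian prices quoted in this article (illustrative
# figures only, not official Microsoft pricing).
PRO_PER_USER_PER_MONTH = 12.70   # A$ per Pro user per month
P1_NODE_PER_MONTH = 6350.00      # A$ per Premium P1 node per month

# How many Pro-licensed users cost the same as one dedicated P1 node?
tipping_point_users = P1_NODE_PER_MONTH / PRO_PER_USER_PER_MONTH
print(f"One P1 node costs the same as ~{tipping_point_users:.0f} Pro users")
# -> ~500 users; beyond that, moving pure consumers onto Premium capacity
#    becomes the cheaper option.
```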
Put this all together
Right, you now understand the “why”, “what” and “how much” – let’s put it all together through examples (I will use Australian $ only for illustrative purposes). Please note that there are various ways to achieve the scenarios below and this is not a comprehensive discussion of all the options.
Scenario 1: A large organisation has 10 Power BI developers. Their Power BI rollout planning suggests that they will grow to 50 self-serve creators and 1,450 additional high-activity consumers within 12 months, and to 125 self-serve creators and 5,000 high-activity consumers within 48 months:
Initially, they will require
10 x Power BI Desktop licences = $0 x 10 = $0
500 x Power BI Pro licences to cover both self-serve users and consumers = $12.70 x 500 = $6,350
Total – A$6,350.00pm
Once they exceed 500 users, they can move to
50 x Power BI Pro licences to cover self-serve users = $12.70 x 50 = $635
1 x P1 node to cover the next tranche of high activity consumers = $6,350
Total – A$6,985.00pm
Add Power BI Pro licences as required up to their planned 125 = $12.70 x 125 = $1,588
Add 1 additional P1 node at 1,450 users, and again at 2,900 users, and again at 4,250 users = $25,400 for 4 x P1 nodes
Total after 4 years at 5000 high activity consumers and 125 self-serve creators – A$26,988.00pm
Scenario 2: A small organisation with 1 Power BI developer, 5 additional self-serve creators and 10 additional consumers of visual content, with no custom applications or websites.
1 x Free version of Power BI Desktop: 1 x $0
15 x Pro licences, as both the visual creators and the consumers will take part in shared content: 15 x $12.70
Total – A$190.50pm
Scenario 3: A small ISV organisation with 3 Power BI developers wants to embed Power BI content in an application that they sell. The application must be up 24 x 7 and does not require a very high volume of concurrent users, but licensing cannot be on a per-user basis.
3 x Free version of Power BI Desktop: 3 x $0
1 x Pro licence acting as the master of the shared content: 1 x $12.70
A1 Node pricing: 1 x $937
Total – A$949.70pm
Scenario 4: A medium-sized organisation with 5 Power BI developers wants to embed Power BI content in an internal portal such as SharePoint, which is used by potentially 250 users. They also have 10 self-serve creators and 25 consumers of Power BI content through the Power BI portal.
5 x Free version of Power BI Desktop: 5 x $0
26 x Pro licences, acting as 1 master of the shared content and 25 consumers: 26 x $12.70 = $330.20
A1 Node pricing: 1 x $937
Total – A$1,267.20pm
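If you would like to experiment with these numbers yourself, a small cost model such as the hypothetical sketch below reproduces the scenario totals from the same indicative Australian prices (Pro at A$12.70 per user, P1 at A$6,350 per node and A1 at A$937 per node, per month).

```python
# Minimal monthly cost model reproducing the scenario totals above. Prices are
# the indicative Australian figures used in this article, not official quotes.
PRO_PER_USER = 12.70   # A$ per Power BI Pro user per month
P1_NODE = 6350.00      # A$ per Premium P1 node per month
A1_NODE = 937.00       # A$ per Embedded A1 node per month

def monthly_cost(pro_users: int = 0, p1_nodes: int = 0, a1_nodes: int = 0) -> float:
    """Power BI Desktop is free, so only Pro seats and capacity nodes add cost."""
    return pro_users * PRO_PER_USER + p1_nodes * P1_NODE + a1_nodes * A1_NODE

print(round(monthly_cost(pro_users=500), 2))             # scenario 1, initial: 6350.0
print(round(monthly_cost(pro_users=50, p1_nodes=1), 2))  # scenario 1, year 1:  6985.0
print(round(monthly_cost(pro_users=15), 2))              # scenario 2:          190.5
print(round(monthly_cost(pro_users=1, a1_nodes=1), 2))   # scenario 3:          949.7
print(round(monthly_cost(pro_users=26, a1_nodes=1), 2))  # scenario 4:          1267.2
```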
Power BI – licence variant, workload, deployment & cost cheat sheet
All pricing is shown in Australian $
Disclaimer: The pricing supplied in the following table is by no means a quote, but is merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.
The Internet of Mice – Our IoT and Advanced Analytics Solution
Understanding how animals involved in research move, and eliminating as much human handling as possible, makes for a much more humane environment for the animals; the outcome is more accurate results for the researchers. See how the IoT and Advanced Analytics solution we developed for our customer strives towards a humane research environment and delivers more intelligent insights to researchers.