Power BI Report Server and BI Platform Upgrade – our PIRSA case study

The Department for Primary Industries and Regions South Australia (PIRSA) faced increasing demand for the consumption of Power BI dashboards and reports in a modern, centralised platform. With the business not quite ready to move to the cloud, we architected and created a solution that serves as a stepping stone: it satisfies their current needs, provides mobile reports, retains the ability to access paginated and Excel reports, leverages newer capabilities introduced into SQL Server since 2012 and provides a roadmap for later cloud adoption.

Read our case study here.

Building a unified model for consistent and coherent team reporting – our City of Adelaide case study

We created a unified model for the City of Adelaide across fragmented data sources (including on-premises, cloud, file-based and SaaS) to deliver a consistent and coherent reporting solution across different teams.

Manual, costly and inconsistent preparation and curation effort was eliminated through the automated solution, delivering a single source of interactive Customer Program performance reporting.

See our case study here.

Say that Again? Power BI Commentary extends to Reports

Power BI recently announced the extension of its commentary capability to Power BI reports. Yes, you can now add comments to both report pages and specific visuals to improve your data discussions!

These conversations are automatically bookmarked, so the report context is retained exactly as it was when the comment was written, complete with the original filters. Reporting by exception is embraced: anyone tagged with an @mention receives a push notification on their mobile device to alert them.

Whilst commentary is nothing new in BI tools – Power BI is a bit late to the game – it's here now, and we've put it through its paces to see how it stacks up!

Backstory

The following exposé samples show the analysis for a retail organisation. The data, which updates hourly, is sourced from 3 different on-premises systems and modelled into a user-friendly sales model with a specific focus on Products, Customers and Suppliers & Export. The Head of Sales noticed an unusual spike in sales (in $ terms) back in April and created a comment for his sales managers to see. A sales manager picked up the comment, conducted the visual analysis and found the reason for the spike. Because the conversation is retained, anyone with access to the sales analysis can play back what was said and see the visual context of the discussion.

This saves staff time – they don't need to rediscover the answer to what may well be a very common question.

In the sections below, we step through these events, culminating in our conclusions on this new functionality in Power BI.

Let’s have a look

The first set of images shows the 4 relevant visuals the Head of Sales would have initially looked at, either on his laptop or on his mobile phone. They analyse sales through the lenses of Product, Customer Country, Export (Supplier Country) and Sales (over time) respectively.

The Head of Sales picks up the unusual spike in April in the 4th visual, Product Sales, and posts his first comment.
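As an aside, a spike like this needn't rely on someone eyeballing the chart – the same check can be scripted. Here is a minimal sketch in Python (the monthly figures are invented for illustration) that flags any month sitting more than two standard deviations above the mean:

```python
from statistics import mean, stdev

# Hypothetical monthly sales totals ($k); April carries the unusual spike
monthly_sales = {
    "Jan": 410, "Feb": 395, "Mar": 420, "Apr": 790,
    "May": 405, "Jun": 415, "Jul": 400, "Aug": 425,
}

values = list(monthly_sales.values())
mu, sigma = mean(values), stdev(values)

# Flag months sitting more than two standard deviations above the mean
spikes = [month for month, v in monthly_sales.items() if v > mu + 2 * sigma]
print(spikes)  # → ['Apr']
```

A check like this could run on a schedule and post the alert itself, with the commentary thread then taking over for the human discussion.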

This comment is then picked up by one of the Sales Managers, who conducts some interactive analysis and responds to the Head of Sales. The Head of Sales is notified and clicks on the comment to see the full visual context – selecting the comment plays the visual back to how it appeared when the comment was made, and spotlights that visual, clearly showing the 4 products.

The Head of Sales now posts a further comment, asking for clarification as to where these 4 specific products are sold.

This specific Sales Manager (note I simply use one of our guest accounts to represent him) is notified of the comment, does some further interactive analysis and responds.

The Head of Sales is notified of the new comment and clicks on it to see the full visual context – selecting the comment again plays the visual back to how it looked when the comment was made, and spotlights that visual, clearly showing the 2 countries.

This now gives the Head of Sales enough context to understand what led to the spike. He (or his delegate) now jumps into Power BI and creates a new visual from the user-friendly sales model that will continue to track and trend these 4 specific 'focus' products within the Germany and US 'high volume' markets. This shows them that the products are becoming popular and that they should invest in some additional marketing around those 4 products.

How this works

Using commentary requires no update or reinstall. Simply navigate to your report in the Power BI Service and create comments. Comments can be made on the visuals themselves after analysis has been done, retaining the context.

Or on the report page in its totality.

In my sales example here, I used a combination of report-page and specific contextual visual commentary in my discussion. The comments pane shows all relevant comments, and selecting any one of them plays the report and its context back to the time the comment was made.

Conclusion

The new commentary capabilities are still object based, and not intimately linked to the data as they were in, for example, Business Objects, where commentary is written back to the solution at an actual intersection of data (say, a Sales Value for Product X on 1 January 2019, in Vancouver, Canada, by Mary Jackson). The difference, however, can be quite subtle, as Power BI allows a comment on a visual that has been filtered to exactly that intersection: Product X, 1 January 2019, Vancouver, Canada, Mary Jackson.

One of the main downsides of this object-based approach is that the commentary data itself remains inaccessible if you, for example, wanted to use it as raw, contextual, time-based data in its own right. Disclaimer: I say this data is inaccessible as I am unaware of where it is stored or how it could be accessed. Happy to be advised to the contrary.

The ability to play the report and visuals back to what they looked like when the comment was made is, however, a very nice feature: the reader can, as it were, "step back in time" and see what happened when the comment was made. This seems to hold even as more data is appended to the model (in this case) on an hourly basis.

There is no workflow attached to the commentary; such workflows are quite common in financial reporting, where commentary and narrative undergo review and approval.

This feature is not available for public-facing reports using the "Embed to Web" functionality. But if you're interested in looking at the sample reports I used for this user story, they can be viewed and interacted with here.

Establishing a corporate report style and data model framework – our Local Government case study

Our Local Government case study shows how establishing a corporate report style and data model framework helped business report designers build their operational reports in a more streamlined fashion with greater consistency, and provided the IT team a foundation to expand on.

Read the case study here: exposé case study – Local Government – Data Model and PBI Templates

Bringing Australian Wine to the World – our Wine Australia case study

See how we used modern methodology, cloud analytical technologies and thought leadership to architect and create this public facing interactive export analytical solution that empowers Australian wine exporters to make informed, data-driven decisions.

See our case study here.

Have a look at the solution in the link below: Market Explorer Tool. Use any of the "Get Started" questions to start your journey.

See our short video here.

Branch locations and success criteria using predictive analytics – our Agribusiness case study

Exposé designed and developed a solution that uncovered whether there were strong relationships between the known characteristics of branches and their success, in order to determine which new locations should be considered and which services they should offer.

See our case study here: exposé case study – Agribusiness – Branch locations and success criteria using predictive analytics

Tableau Prep – we test drive this new user-centred data preparation tool

Data preparation is undoubtedly one of the most competency-reliant and time-consuming parts of report generation. For this reason, it is fast becoming the new focal area for further development and we are seeing a large uptick in the number of options being made available to help alleviate these issues.

One recent entrant to this space is Tableau, with the announcement of their new tool, Tableau Prep. This tool brings a new user experience to the artform of data preparation and follows the same user-centred design as their reporting tool.

Tableau Prep concentrates on providing a 'no-code required' approach to data preparation, with a view to making it accessible to a greater number of users and giving organisations a quicker turnaround in wrangling datasets.

Every step within Tableau Prep is visual and shows the immediate effects of any transforms on the data. Its strength lies in hiding the complex, smart algorithms that carry out the data manipulation and surfacing them as one-click operations, greatly simplifying the data preparation process.

The preparation paradigm concentrates on having the user set up a pathway from the dataset through to the output, introducing the required transformations along the way.
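The pathway idea can be pictured as an ordered chain of transform steps applied from input to output. A minimal Python sketch of the same concept (the step names and data are invented for illustration):

```python
# A prep "flow" modelled as an ordered list of transform functions
def clean_whitespace(rows):
    """Strip stray whitespace from every string value."""
    return [{k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
            for r in rows]

def drop_empty_names(rows):
    """Remove rows whose 'name' field is missing or blank."""
    return [r for r in rows if r.get("name")]

def run_flow(rows, steps):
    """Apply each step in order, mirroring a left-to-right prep pathway."""
    for step in steps:
        rows = step(rows)
    return rows

raw = [{"name": "  Alice "}, {"name": ""}, {"name": "Bob"}]
output = run_flow(raw, [clean_whitespace, drop_empty_names])
print(output)  # → [{'name': 'Alice'}, {'name': 'Bob'}]
```

Tableau Prep's contribution is that each of these steps is built and inspected visually rather than in code, with the intermediate results previewed at every node.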

Preparation Workflow

Clicking on an element in the workflow brings up a secondary pane showing more details relevant to the selected step.

Dataset input step

Adding steps within Tableau Prep is as simple as clicking on the “+” icon and choosing the appropriate method.

Add prep step menu

Interacting with data within Tableau Prep is similarly a visual experience. In the example below, while performing a group and replace operation, Tableau Prep has recognised that, as a result of joining datasets together, some records used the full state name of "California" while others used the contraction "CA". It groups these together using a fuzzy matching algorithm and presents options so the user can choose the preferred representation of the data points.

Group and Replace
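For a feel of what a group and replace operation does under the hood, here is a rough Python sketch using the standard library's difflib. The values and cutoff are illustrative only, and note that pure edit-distance matching won't resolve abbreviations like "CA", so a small lookup table handles those (real tools like Tableau Prep use more sophisticated matching than this):

```python
from difflib import get_close_matches

# Canonical values, plus known abbreviations a real tool would ship with
canonical = ["California", "Texas"]
abbreviations = {"CA": "California", "TX": "Texas"}

def group_and_replace(value: str) -> str:
    """Map a raw value to a canonical one via lookup, then fuzzy matching."""
    if value in abbreviations:               # exact abbreviation hit
        return abbreviations[value]
    match = get_close_matches(value, canonical, n=1, cutoff=0.6)
    return match[0] if match else value      # fall back to the raw value

# Inconsistent values arriving from joined datasets
raw_states = ["California", "CA", "Calefornia", "Texsa", "TX"]
print([group_and_replace(v) for v in raw_states])
# → ['California', 'California', 'California', 'Texas', 'Texas']
```

The value of Tableau Prep is that none of this needs to be written: the grouping is suggested automatically and the user simply confirms or adjusts the canonical representation.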

Tableau Prep provides a visual summary of all changes that have been made within each step. As changes to data are made, it updates the preview in real time, allowing the user to see the effect this has had.

Example of field change descriptions

After creating the transformation pathway, generating an output is done as a final step. Currently, Tableau Prep is very focussed on its integration with the Tableau product set: it publishes directly to Tableau Desktop, Server and Online, but it also offers output in CSV format.

Conclusion

In summary, Tableau Prep is going to enhance the ability of analysts who are used to working with the Tableau product suite. Whilst it won't replace more mature and prevalent data preparation products on the market such as Alteryx, Trifacta or KNIME, it does offer a significant productivity opportunity to Tableau-focussed organisations.

New Subscriptions

With the release of Tableau Prep, Tableau has also introduced new subscription offerings.

Subscription offerings

The new subscription levels – Tableau Creator, Explorer and Viewer – have been packaged around expected usage within an organisation. Tableau Prep has been included in the Creator package, along with Tableau Desktop.

Also see our video test drive here.

Common Data Service (CDS) – A Common Data Model. Accelerate your ability to provide insights into your business data

If you’ve been following Microsoft’s recent press releases, chances are you’ll have been exposed to the term “Common Data Service” (CDS). Read on as we shed light on the exact nature of CDS and what it can mean to your business.

Back in November 2016, Microsoft released their Common Data Service to general availability. In a nutshell, CDS is Microsoft’s attempt at providing a solution to counter the time and effort customers are spending to bring together disparate apps, services and solutions. At its most basic level, it provides a way to connect disparate systems around a focal point of your data. The intention is that Microsoft will provide the “heavy lifting” required to ensure the data flows back and forth as required.

To achieve this, Microsoft has defined a set of core business entities and built them into what is known as the Common Data Model (CDM). For example, they have exposed entities for managing data around Accounts, Employees, Products, Opportunities and Sales Orders (for a full list see: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/reference/about-entity-reference). Where there isn't an existing entity to suit a business requirement, Microsoft has made the CDM extensible, allowing you to add to your organisation's instance of the CDM to meet your needs. As organisations adapt and change their CDS instances, Microsoft monitors for common patterns across the business community and uses them to modify and extend the standard CDS.
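To make the extensibility idea concrete, here is a hedged Python sketch of layering organisation-specific attributes on top of a standard entity. The field names are invented for illustration; the real CDM defines its entities as metadata, not classes.

```python
from dataclasses import dataclass

@dataclass
class Account:
    """Core attributes, as a standard CDM-style entity might define them."""
    account_id: str
    name: str
    industry: str

@dataclass
class ExtendedAccount(Account):
    """Organisation-specific extension: extra fields layered on the core,
    so anything that only knows the standard entity still works."""
    wine_region: str = ""        # hypothetical custom attribute
    export_licence: bool = False

acct = ExtendedAccount("A-001", "Example Winery", "Beverages", "Barossa", True)
print(acct.name, acct.wine_region)  # → Example Winery Barossa
```

The key point mirrored here is that an extended instance is still a valid `Account`: applications built against the standard entity keep functioning while the organisation's extensions travel alongside the core data.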

Microsoft is committed to making their applications CDS aware and is working with their partners to get third party applications to interact effectively with the CDS.

When establishing CDS integration from an organisational perspective, it should ideally be a simple matter of configuring a connector from a source application to the CDS, aligning its data entities with the reciprocal entities within the CDS. This ensures that as products change to meet business needs over time, the impact on other systems should be almost negligible. It also removes the need for an organisation to spend excessive time architecting a solution to bring together disparate apps and siloed information; this can now be handled through the CDS.

Since its release in 2016, CDS has evolved, with Microsoft recently announcing two new services: Common Data Service for Apps (CDS for Apps) and Common Data Service for Analytics (CDS for Analytics).

CDS for Apps was released in January 2018, with CDS for Analytics expected for release in the second quarter of 2018. As a snapshot of how the various "pieces" fit together, Figure 1 provides a logical view of how the services will interact.

Figure 1 – Common Data Service Logical Model

Common Data Service for Apps

CDS for Apps was initially designed for businesses to engage with their data on a low-code/no-code basis through Microsoft’s PowerApps product. This allows a business to rapidly develop scalable, secure and feature-rich applications.

For organisations needing further enhancement, Microsoft offers developer extensions to engage with CDS for Apps.

Common Data Service for Analytics

CDS for Analytics was designed to function with Power BI as the visual reporting product. Similarly to the way CDS for Apps is extensible by developers, CDS for Analytics will also provide extensibility options.

Figure 2 below provides the current logic model for how CDS for Analytics will integrate.

Figure 2 – CDS for Analytics Logic Model

Business Benefits

Implementing CDS for Apps and CDS for Analytics will enable you to easily capture data and then accelerate your ability to gain insights into your business data.

To assist in this acceleration, Microsoft and its partners, exposé included, will be building industry-specific apps that immediately surface deep insights into an organisation's data. An initial example is currently being developed by Microsoft: Power BI for Sales Insights will address maximising sales productivity by providing insights into which opportunities are at risk and where salespeople could be spending their time more efficiently.

This ease of development and portability of solutions isn't possible, however, without a standardised data model. By leveraging Microsoft's new common data services, and with Microsoft's suite of platform products being CDS aware, using tools such as Azure Machine Learning and Azure Databricks for deeper analysis of your organisation's data becomes transformational.

If you’d like to understand more about how to take advantage of the Common Data Service or for further discussion around how it can assist your business, please get in touch.