Transforming the business into a data-centric organisation through an Advanced Analytics and Big Data solution – our ACH Group case study

Big Data

An Advanced Analytics and Big Data solution allows for the acquisition, aggregation and blending of large volumes of data, often derived from multiple disparate sources, and incorporates IoT, smart devices and predictive analytics.
Our ACH Group case study shows how a clever data platform architecture and design facilitated the transformation into a data-centric organisation, both in response to comprehensive regulatory changes and to leverage the opportunities technology presents to create a better experience for customers and staff.

See the case study here: Exposé case study – ACH Group

See more about advanced analytics

Important Power BI news release

Power BI News

Starting June 1, 2017, Microsoft is making some changes to the way Power BI is licensed, and there are also some important changes to the Power BI service. If you use, or intend to use, Power BI, please be aware of these changes.

Exposé has been at the forefront of the Power BI revolution, and we view these changes as further positive steps towards a cost-effective, scalable and maturing BI and analytics platform. We have found that organisations really benefit from guidance on the administration side of Power BI. If you would like further advice on these changes, or assistance with the transition and how it affects you and your organisation, please don’t hesitate to get in touch.

Here are the changes:

Power BI free tier

Microsoft is now giving all free tier users the following capabilities:

  • the ability to connect to all of the data sources that Pro users can connect to
  • a storage quota increase from 1 GB to 10 GB
  • an increase in the maximum data refresh rate from once daily to once hourly
  • an increase in streaming data rates from 10,000 rows per hour to one million rows per hour

But in doing so, Microsoft is removing the following capabilities:

  • sharing reports and dashboards with other users
  • using group workspaces (now to be called app workspaces)
  • export to PowerPoint, CSV, Excel
  • analyze in Excel

This makes the free tier truly for personal use only, as all private sharing capabilities are no longer available under the Power BI free license.

To help ease the transition to the new licensing model, Microsoft is allowing anyone who held a license with the Power BI service on or before May 2, 2017, and who signed in at least once between May 2, 2016 and May 2, 2017, to apply for an extended trial of a Power BI Pro license. This license will enable the use of all Power BI Pro features until May 31, 2018. If you meet these requirements, you will receive an email from Microsoft and a notification will appear when you log in to the service.

If you require organisational use of Power BI, you will now need to either license all users for Power BI Pro or adopt the new tier, Power BI Premium.

Power BI Premium

Power BI Premium is a new capacity-based licensing model coming late in the second quarter of 2017. It allows organisations to acquire Power BI Pro licenses only for report creators, while the rest of the organisation consumes those reports and dashboards without having to purchase a Pro license.

The charging model is based on a Premium node within the Azure environment that can be scaled according to an organisation’s performance requirements. Microsoft has provided a calculator service here to help estimate costs.

Power BI Report Server

Coming late in the second quarter of 2017, Microsoft will be offering the capability to publish Power BI reports on-premises using Power BI Report Server.

The on-premises server will allow the deployment and distribution of interactive Power BI reports and traditional paginated reports within the boundaries of an organization’s firewall.

To enable the use of Power BI Report Server, you will need to either be licensed under Power BI Premium or have a per-core license of SQL Server Enterprise Edition with Software Assurance.

Power BI Apps

Power BI content packs are being renamed Power BI apps.

At the moment there won’t be a large difference between apps and content packs – mostly a change in interface and publishing process – but Microsoft has a roadmap for improvement under the new app model.

They are planning the following enhancements to app workspaces in the coming months:

  • Creating app workspaces will not create corresponding entities in O365 the way group workspaces do, so you can create any number of app workspaces without worrying about different O365 groups being created behind the scenes (you can still use an O365 group’s OneDrive for Business to store your files).
  • Today you can add only individuals to the members and admin lists. In the next iteration, you will be able to add multiple AD security groups or modern groups to these lists to allow for easier management.

The impact, for now, is that Microsoft will rename all group workspaces to app workspaces, and you will be able to publish an app from any of these workspaces.

Power BI Embedded

Microsoft has also announced the convergence of the Power BI Embedded service with the Power BI service. This means there will be one Power BI API with feature parity with the current Power BI Embedded service, so any existing apps built using Embedded today should continue to function, but you will need to prepare for migration to the new service.
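
As a rough illustration of what working against the converged API could look like, here is a minimal Python sketch that requests an embed token for a report via the Power BI REST API. The workspace ID, report ID and Azure AD access token are placeholders you would supply yourself, and the exact migration path for an existing Embedded app will depend on Microsoft’s guidance.

    import requests

    # Placeholders - supply your own Azure AD access token, workspace
    # (group) ID and report ID.
    AAD_TOKEN = "<access-token-from-azure-ad>"
    GROUP_ID = "<workspace-id>"
    REPORT_ID = "<report-id>"

    # The Power BI REST API exposes a GenerateToken endpoint that returns
    # an embed token for a report within a workspace.
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
        f"/reports/{REPORT_ID}/GenerateToken"
    )

    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {AAD_TOKEN}"},
        json={"accessLevel": "View"},  # read-only embedding
    )
    response.raise_for_status()

    # The embed token is then handed to the client-side embedding library.
    embed_token = response.json()["token"]
    print(embed_token)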

Power BI Service Interface

Finally, and for those who may not have been aware, Microsoft has been trialling a new interface for the Power BI service over the past few months. As of May, this interface will become the default. I’d recommend taking some time to get familiar with it, as there are some significant changes to the workflow you may be used to.

From operational challenges to a modern, automated and simplified organisation – Our Business SA case study

An Exposé case study around our advanced analytics solution for the ‘voice of business in South Australia’, Business SA. The solution was an important component of a large digital transformation program that saw Business SA transition to a modern, automated and simplified organisation, underpinned by the following technology changes:

• A cloud-first strategy that reduced Business SA’s dependence on resources providing no market differentiation
• A simplified technology landscape, with a few core systems performing specific functions
• A modular architecture that is better able to accommodate change
• A digital strategy supporting automated, self-service, 24/7 service delivery
• Improved data quality through simpler and more intuitive means of data entry and validation
• The latest desktop productivity tools, providing instant mobility capabilities

See the full case study here: Exposé case study – Business SA

Is the Data Warehouse Dead?

data warehouse

I am increasingly asked by customers: is the Data Warehouse dead?

In technology terms, 30 years is a long time, and that is how old the Data Warehouse is – an old-timer by any measure. Should we consider it a mature yet productive worker, or one gearing up for a pension?

I come from the world of Data Warehouse architecture, and in the mid to late noughties (2003 to 2010), whilst working for various high-profile financial services institutions in the London market, Data Warehouses were considered all-important, and companies spent a lot of money on their design, development and maintenance. The prevailing consensus was that you could not get meaningful, validated and trusted information to business users for decision support without a Data Warehouse (whether it followed an Inmon or a Kimball methodology – the pros and cons of which are not under the spotlight here). The only alternative for companies without the means to commit to the substantial investment typically associated with a Data Warehouse was to allow report writers to develop code against the source system’s database (or a landed version thereof), but this, of course, led to a proliferation of reports, caused a massive maintenance nightmare, and went against every notion of a single trusted source of the truth.

Jump ahead to 2011, and businesses started showing a reluctance to invest in Data Warehouses – a trend that accelerated from that point onward. In my observation, the reasons ranged from the cost involved, the lack of quick ROI and low take-up rates, to the difficulty of aligning a Data Warehouse with ongoing business change and, more recently, the change in the variety, volume and velocity of data that businesses are interested in.

In a previous article, “From watercooler discussion to corporate Data Analytics in record time” (https://exposedata.wordpress.com/2016/09/01/from-watercooler-discussion-to-corporate-data-analytics-in-record-time/), I stated that the recent acceleration of changes in the technology space “…now allows for fast response to red-hot requirements…”, and that the “…advent of a plethora of services in the form of Platform-, Infrastructure- and Software as a Service (PaaS, IaaS and SaaS)… are proving to be highly disruptive in the Analytics market, and true game changers.”

Does all of this mean the Data Warehouse is dead or dying? Is it an old-timer getting ready for a pension, or does it still have years of productive contribution to the corporate data landscape left?

My experience across the Business Intelligence and Data Analytics market, spanning multiple industries and technologies, has taught me that:

A Data Warehouse is no longer a must-have for getting meaningful, validated and trusted information to business users for decision support. As explained in the previous article, the PaaS, SaaS and IaaS services that focus on Data Analytics (for example the Cortana Intelligence Suite in Azure (https://www.microsoft.com/en-au/cloud-platform/cortana-intelligence-suite) or the Amazon Analytics Products (https://aws.amazon.com/products/analytics/)) allow for modular solutions that can be provisioned as required, and that collectively answer the Data Analytics challenges and ensure data gets to users fast, validated and in a business-friendly format, no matter where it originates or what its format, velocity or volume.

But this does not mean that these modular Data Platforms, built from a clever mix of PaaS, SaaS and IaaS services, can easily provide some of the fundamental services associated with a Data Warehouse (or, more accurately, with components typically associated with a Data Warehouse), such as:

  • Where operational systems do not track history, but the analytical requirements call for history to be tracked (for example through type 2 slowly changing dimensions) – see the sketch after this list.
  • Where business rules and transformations are so complex that it makes sense to define them by way of detailed analysis and to hard-code them into the landscape through code and materialised data, in structures that the business can understand and that are often reused (for example dimensions and facts resulting from complex business rules and transformations).
  • Where complex hierarchies are required by the reporting and self-service layer.
  • To support regulatory requirements such as proven data lineage, reconciliation, and legally mandated retention (for example under Solvency II, Basel II and III, and Sarbanes-Oxley).
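
To make the first point concrete, here is a minimal Python/pandas sketch of type 2 slowly changing dimension handling – the kind of history tracking an operational system typically does not keep. The table and column names (valid_from, valid_to, is_current) are illustrative assumptions, not a prescription; in practice this logic would live in your ETL layer or warehouse.

    import pandas as pd

    # Current dimension table: one row per version of each customer.
    # Column names (valid_from, valid_to, is_current) are illustrative.
    dim = pd.DataFrame({
        "customer_id": [1],
        "city": ["Adelaide"],
        "valid_from": [pd.Timestamp("2016-01-01")],
        "valid_to": [pd.NaT],
        "is_current": [True],
    })

    def apply_scd2(dim, customer_id, city, effective):
        """Close the current row for the customer and append a new version."""
        current = (dim["customer_id"] == customer_id) & dim["is_current"]
        if current.any() and dim.loc[current, "city"].iloc[0] != city:
            # Expire the old version rather than overwriting it - this is
            # precisely the history the operational system does not keep.
            dim.loc[current, "valid_to"] = effective
            dim.loc[current, "is_current"] = False
            new_row = pd.DataFrame({
                "customer_id": [customer_id], "city": [city],
                "valid_from": [effective], "valid_to": [pd.NaT],
                "is_current": [True],
            })
            dim = pd.concat([dim, new_row], ignore_index=True)
        return dim

    # A change of address creates a second version, not an in-place update.
    dim = apply_scd2(dim, 1, "Melbourne", pd.Timestamp("2017-05-01"))
    print(dim)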

Where these requirements exist, a Data Warehouse (or, more accurately, components typically associated with a Data Warehouse) is required. But even in those cases, those components will merely form part of a much larger Data Analytics landscape: they will perform the workloads described above, while complementary services deliver the larger data story.

In the past, Data Warehouses were key to delivering optimised analytical models that normally manifested as materialised Data Mart star schemas (the end result of a series of layers such as ODS, staging, etc.). Such optimised analytical models are now handled instead by business-friendly metadata layers (e.g. semantic models) that source data from any appropriate source of information, bringing fragmented sources together in models that are quick to develop and easy for the business to consume. These sources include the objects typically associated with a Data Warehouse/Mart (for example materialised SCD2 dimensions, materialised facts resulting from complex business rules, entities created for regulatory purposes, etc.), and they are blended with data from a plethora of additional sources, as the sketch below illustrates. The business user still experiences a clean, easy-to-consume, star-schema-like model. The business-friendly metadata layer becomes the Data Mart, but one that is easier to develop, provides a quicker ROI and is much more responsive to business change.
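
As a simple illustration of that blending, the sketch below joins a materialised warehouse dimension with a measure sourced directly from outside the warehouse into one business-friendly, star-schema-like model. The datasets and column names are invented for the example; a real semantic layer (e.g. a tabular model) would express this declaratively rather than in Python.

    import pandas as pd

    # A materialised dimension from the warehouse (e.g. an SCD2 customer
    # dimension). Names are invented for this example.
    dim_customer = pd.DataFrame({
        "customer_key": [10, 11],
        "customer_name": ["Acme Pty Ltd", "Birch & Co"],
        "segment": ["SME", "Corporate"],
    })

    # A measure sourced directly from an operational or external system,
    # never landed in the warehouse.
    web_sessions = pd.DataFrame({
        "customer_key": [10, 10, 11],
        "sessions": [12, 9, 4],
    })

    # The metadata layer presents the blend as one clean model: the user
    # still sees a dimension joined to measures, star-schema style.
    model = (
        web_sessions.groupby("customer_key", as_index=False)["sessions"].sum()
        .merge(dim_customer, on="customer_key")
    )
    print(model)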

Conclusion

The Data Warehouse is not dead, but its primary role as we knew it is fading. It is becoming complementary to the larger Data Analytics platforms we see evolving. Some of its components will continue to fulfil a central role, but they will be surrounded by all manner of services, and collectively these will fulfil the organisation’s data needs.

In addition, we see the evolution of the Data Warehouse as a Service (DWaaS). This is not a Data Warehouse in the typical sense of the word as used in this article, but rather a service optimised for analytical workloads. Can it serve those requirements typically associated with a Data Warehouse, such as SCD2, materialisation due to complex rules, hierarchies or regulatory requirements? Absolutely. But its existence does not change the need for modular, targeted architectures, nor for the much larger Data Analytics landscape that uses a variety of PaaS (including DWaaS), IaaS and SaaS. It merely makes the hosting of typical Data Warehouse workloads much simpler, better performing and more cost-effective. Examples of DWaaS are Microsoft’s Azure SQL DW and Amazon’s Redshift.

Advanced Analytics and Big Data Platform – RAA Case Study

data platform

An Exposé case study around our advanced analytics and big data platform for RAA, which allows for the acquisition and blending of large volumes of fragmented geospatial data, transforms it using massive processing capacity, applies predictive analytics to assess the risk of millions of properties, and provides interactive geospatial visualisations of the blended data and results.

This video case study shows a solution summary:

See the full case study here: expose-case-study-raa

See another big data solution here

Disrupting the banking market

This video shows a comprehensive solution built around disruption in the Banking market, from transactions through to advanced analytics.

The viewer meets three different customers, sees their challenges, and learns how the bank responds to them.