Artificial Intelligence and Occupational Health and Safety – AI: an enabler or a threat?

We increasingly hear statements like “machines are smarter than us” and “they will take over our jobs”. The fact of the matter is that computers can simply compute faster, and more accurately, than humans can. So, in the short video below, we instead focus on how machines can be used to assist us in doing our jobs better, rather than viewing AI as an imminent threat. It shows how AI can assist with better occupational health and safety in the hospitality industry. The approach, however, applies to many use cases across many industries, and positions AI as an enabler. An extended description of the solution follows the video demo.

Image and video recognition – a new dimension of data analytics

With the introduction of video, image and video streaming analytics, the realm of advanced data analytics and artificial intelligence just stepped up a notch.

All the big players are currently competing to provide the best and most powerful versions: Microsoft with Azure Cognitive Services APIs, Amazon with AWS Rekognition, Google with Cloud Video Intelligence, and IBM with Intelligent Video Analytics.

Not only can we analyse textual or numerical data historically or in real time, we’re now able to extend this to video and image use cases. Currently, there are APIs available to carry out these conceptual tasks:

  • Face Detection

o   Identify a person from a repository/collection of faces

o   Celebrity recognition

  • Facial Analysis

o   Identify emotion, age, and other demographics within individual faces

  • Object, Scene and Activity Detection

o   Return objects the algorithm has identified within specific frames, e.g. cars, hats, animals

o   Return location settings, e.g. kitchen, beach, mountain

o   Return activities from video frames, e.g. riding, cycling, swimming

  • Tracking

o   Track the movement/path of people within a video

  • Unsafe Content Detection

o   Auto-moderate inappropriate content, e.g. adult-only content

  • Text Detection

o   Recognise text in images

The business benefits

Thanks to cloud computing, this complex and resource-demanding functionality can be used with relative ease by businesses. Instead of having to develop complex systems and processes to accomplish such tasks, a business can now leverage the intelligence and immense processing power of cloud products, freeing it up to focus on how best to apply the output.

In a nutshell, vendors offering video and image services essentially provide users with APIs that interact with the cloud hosts the vendors maintain globally. All the user needs to do, therefore, is provide the input and manage the responses returned by the calls made through those APIs. The exposé team currently has the skills and capability to ‘plug and play’ with these APIs, with many use cases already outlined.

Potential use cases

As capable as these functions already are, improvements are happening all the time. While the potential scope is staggering, the following cases are based on currently available functionality. There are potentially many, many more – the sky really is the limit.

Cardless, pinless entry using facial recognition only

Here, a camera views a person’s face and the image is passed to the facial recognition APIs. The response can then be used to either open the entry or leave it shut. Not only does this improve security, preventing the use of someone else’s card or PIN, but if someone were to follow another person through the entry, security can be immediately alerted. Additional cameras can be placed throughout the secure location to ensure that only authorised people are within the specified area.

Our own test drive use case

As an extension of the cardless, pinless entry use case above, additional APIs can be used not only to determine whether a person is authorised to enter a secure area, but also to check whether they are wearing the correct safety equipment. The value this brings to various occupational health and safety functions is evident.

We have performed the following scenario ourselves, using a selection of APIs to provide the alert. The video above demonstrates a chef whom the API recognises using face detection. Another API is then used to determine that he is wearing the required headwear (a chef’s hat). As soon as the chef is seen in the kitchen not wearing the appropriate attire, an alert is sent to his manager to report the incident.

Technical jargon

To provide some understanding of how this scenario plays out architecturally, here is the conceptual architecture used in the solution showcased in the referenced video.

Architecture prerequisite:

·        Face Repository / Collection

Images of the faces of people in the organisation. The vendor’s solution maps facial features, e.g. the distance between the eyes, and stores this information against a specific face. This is required by the subsequent video analytics, as it needs to be able to recognise a face from various angles, distances and scenes. Associated with the faces is other metadata such as name, the date range for permission to be on site, and even extra information such as work hours.
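As an illustrative sketch only, using AWS Rekognition (one of the services named earlier) as the example vendor, populating such a face repository could look like the snippet below. The collection name, image paths and the way metadata is packed into the `ExternalImageId` field are our own assumptions, not the exact implementation used in the demo.

```python
import re

def make_external_id(name, valid_from, valid_to):
    """Encode staff metadata into the ExternalImageId field, which only
    accepts alphanumerics plus '_', '.', ':' and '-'."""
    raw = f"{name}_{valid_from}_{valid_to}"
    return re.sub(r"[^A-Za-z0-9_.:-]", "_", raw)

def build_collection(collection_id, people):
    """Create a face collection and index one face image per person.
    Requires the third-party boto3 package and AWS credentials."""
    import boto3
    client = boto3.client("rekognition")
    client.create_collection(CollectionId=collection_id)
    for image_path, external_id in people:
        with open(image_path, "rb") as f:
            # Rekognition maps the facial features and stores them
            # against external_id; searches later return this id.
            client.index_faces(
                CollectionId=collection_id,
                Image={"Bytes": f.read()},
                ExternalImageId=external_id,
                MaxFaces=1,
            )
```

For example, `build_collection("staff-faces", [("chef.jpg", make_external_id("Jake Deed", "2018-01-01", "2018-12-31"))])` would register one staff member with an on-site permission date range baked into the id.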

Architecture of the AI Process:

·        Video or Images storage

Store the video to be processed within the vendor’s cloud storage location, so it is accessible to the APIs that will subsequently be used to analyse the video/image.

·        Face Detection and Recognition API’s

Run the video/images through the Face Detection and Recognition APIs to determine where a face is detected and whether a particular face is matched from the Face Repository / Collection. This will return the timestamp and bounding box of the identified faces as output.
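A sketch of how that output might be consumed, assuming a response shaped like AWS Rekognition’s GetFaceSearch result (the field names and the similarity threshold are assumptions for illustration):

```python
def matched_faces(face_search_response, min_similarity=90.0):
    """Reduce a face-search response to (timestamp_ms, bounding_box,
    external_id) triples, keeping only confident matches."""
    hits = []
    for person in face_search_response.get("Persons", []):
        for match in person.get("FaceMatches", []):
            if match["Similarity"] >= min_similarity:
                hits.append((
                    person["Timestamp"],                      # ms into the video
                    person["Person"]["Face"]["BoundingBox"],  # relative coords
                    match["Face"]["ExternalImageId"],         # who was matched
                ))
    return hits
```

Each triple tells the next step which frame to pull and where in that frame the recognised face sits.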

·        Frame splitting

Use the face detection output and a third-party video library to extract the relevant frames from the video, to be sent off to additional APIs for further analysis. Within each frame’s timestamp, create a subset of images from the detected faces’ bounding boxes (there could be one or more faces detected in a frame). The bounding box extract is expanded to encompass the face and the area above the head, ready for the next step.
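The bounding-box expansion described above can be sketched as follows. The head-room factor and the use of OpenCV as the third-party video library are illustrative choices, not necessarily what the demo used:

```python
def expand_box(box, frame_w, frame_h, head_room=0.6):
    """Convert a relative bounding box ({'Left','Top','Width','Height'},
    values 0..1) to pixel coords, stretched upward so the crop includes
    the area above the head (where a hat would be)."""
    left = int(box["Left"] * frame_w)
    top = int(box["Top"] * frame_h)
    w = int(box["Width"] * frame_w)
    h = int(box["Height"] * frame_h)
    extra = int(h * head_room)
    top = max(0, top - extra)          # clamp at the top of the frame
    return left, top, w, h + extra     # height grows by the head room

def extract_crop(video_path, timestamp_ms, box):
    """Seek to the detected face's timestamp and crop the expanded box
    from that frame (requires the third-party OpenCV package)."""
    import cv2
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, timestamp_ms)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    h, w = frame.shape[:2]
    left, top, bw, bh = expand_box(box, w, h)
    return frame[top:top + bh, left:left + bw]
```

The resulting crops are what gets submitted to the object detection step.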

·        Object Detection API’s

Run object detection over the extracted subset of images from the frame. In our scenario, we’re looking to detect whether the person is wearing the required kitchen attire (a chef’s hat) or not. We can use this output, in combination with the person detected, to send an appropriate alert.

·        Messaging Service

Once it has been detected that a person is not wearing the appropriate attire within the kitchen, an alert mechanism can be triggered to notify management or other persons via e-mail, SMS or other mediums. In our video, an alert is received via SMS on the manager’s phone.
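The alerting step might be sketched as below, with AWS SNS shown as one possible SMS medium; the message wording and phone-number handling are illustrative assumptions:

```python
def attire_alert(person, location, timestamp_ms):
    """Build the human-readable incident message sent to the manager."""
    return (f"OH&S alert: {person} detected in the {location} at "
            f"{timestamp_ms / 1000:.1f}s without the required headwear.")

def send_sms(message, phone_number):
    """Deliver the alert via SMS using AWS SNS, one possible medium
    (requires boto3 and AWS credentials; e-mail works the same way)."""
    import boto3
    boto3.client("sns").publish(PhoneNumber=phone_number, Message=message)
```

For example, `send_sms(attire_alert("Jake", "kitchen", 12500), "+61400000000")` would text the manager about an incident 12.5 seconds into the footage.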

Below we have highlighted the components of the Architecture in a diagram:

Conclusion

These are just a couple of examples of how we can interact with such powerful functionality; all available in the cloud. It really does open the door to a plethora of different ways we can interact with videos and images and automate responses. Moreover, it’s an illustration of how we can analyse what is occurring in our data, extracted from a new medium – which adds an exciting new dynamic!

Video and image analytics opens up immense possibilities to not only further analyse but to automate tasks within your organisation. Leveraging this capability, the exposé team can apply our experience to your organisation, enabling you to harness some of the most advanced cloud services being produced by the big vendors. As we mentioned earlier, this is a space that will only continue to evolve and improve with more possibilities in the near future.

Do not hesitate to call us to see how we may be able to help.

 

Contributors to this solution and blog entry:

Jake Deed – https://www.linkedin.com/in/jakedeed/

Cameron Wells – https://www.linkedin.com/in/camerongwells/

Etienne Oosthuysen – https://www.linkedin.com/in/etienneo/

Chris Antonello – https://www.linkedin.com/in/christopher-antonello-51a0b592/

 

Branch locations and success criteria using predictive analytics – our Agribusiness case study

Exposé designed and developed a solution that uncovered whether there were strong relationships between the known characteristics of branches and their success, in order to determine new locations to consider and the services they should offer.

See our case study here: exposé case study – Agribusiness – Branch locations and success criteria using predictive analytics

European GDPR and its impact on Australian organisations. We give you the low-down from an analytic tool perspective.

What is the European GDPR and how will it impact Australian organisations?  We give you the low-down from an analytic tool perspective.

GDPR (the General Data Protection Regulation) is the European privacy and data protection law that comes into effect on the 25th of May 2018. This surely doesn’t affect Australian companies, right? Wrong!

The thing is, whilst the new regulation governs data protection and privacy for all EU citizens, it also addresses personal data outside of the EU. The impact will be far-reaching, including for Australian businesses, as all businesses concerned with the gathering and analysis of consumer data could be affected.

What the law says

According to the Office of the Australian Information Commissioner (OAIC), Australian businesses of any size may need to comply. In addition, all Australian businesses must comply with the Australian Privacy Act 1988.

Are these two laws complementary? Some of the common requirements that businesses must adhere to include:

  • Implementing a privacy-by-design approach to compliance
  • An ability to demonstrate compliance with privacy principles and obligations
  • Adopting transparent information handling practices
  • Appropriate notification in the case of any data breach
  • Conducting privacy impact assessments

But some GDPR requirements are not part of the Australian Privacy Act, such as the “right to be forgotten”.

What now?

We would suggest that Australian businesses firstly establish whether they need to comply with GDPR.  If they do, then they should take prompt steps to ensure their data practices comply. Businesses should already comply with the Australian Privacy Act, but also consider rolling out additional measures required under GDPR which are not inconsistent with the Privacy Act.

Who is affected

In a nutshell, the GDPR applies to any data processing activities undertaken by an Australian business of any size that:

  • Has a presence in the EU
  • Has a website that targets EU customers, or mentions customers or users in the EU
  • Tracks individuals in the EU for analysis (for example, to predict personal preferences, behaviours and attitudes)

Refer to the following link for more information: https://www.oaic.gov.au/media-and-speeches/news/general-data-protection-regulation-guidance-for-australian-businesses

Do analytic tools comply?

Once a need for your organisation to comply has been established, it is worth ascertaining whether the actual tools you use for analytics comply, specifically regarding the last bullet point above (tracking and analysing individuals).

In the next section of this article, we look at two common players in the analytics space, Power BI and Qlik, through the lens of GDPR (and, by extension, the Australian Privacy Act).

The scope of GDPR is intended to apply to the processing of personal data irrespective of the technology used. Because Power BI and Qlik may be used to process personal data, there are certain requirements within the GDPR that compel users of these technologies to pay close attention:

  • Article 7 states that consent must be demonstrable and “freely given” if the basis for data processing is consent. The data subject must also have the right to withdraw consent at any time
  • Articles 15 to 17 cover the rights to access, rectification and erasure. This means that mechanisms must allow data subjects to request access to their personal data and receive information on the processing of that data. They must be able to rectify personal data if it is incorrect. Data subjects must also be able to request the erasure of their personal data (i.e. the “right to be forgotten”)
  • Articles 24 to 30 require maintenance of audit trails and documentary evidence to demonstrate accountability and compliance with the GDPR
  • Article 25 requires businesses to implement the necessary privacy controls, safeguards, and data protection principles so that privacy is built in by design
  • Articles 25, 29 and 32 require strict access control to personal data, for example through role-based access and segregation of duties

Microsoft Power BI

Power BI can be viewed through the lens of GDPR (and the Australian Privacy Act, for that matter) via four pillars in the Microsoft Trust Centre. With specific reference to GDPR, Microsoft states, “We’ve spent a lot of time with GDPR and like to think we’ve been thoughtful about its intent and meaning”. Microsoft has released a whitepaper to give the reader a basic understanding of the GDPR and how it relates to Power BI. But meeting GDPR compliance will likely involve a variety of different tools, approaches and requirements.

Security

Power BI is built using the Security Development Lifecycle. Through Azure Active Directory, Power BI is protected from unauthorised access by simplifying the management of users and groups, which enables you to assign and revoke privileges easily.

Privacy

The Microsoft Trust Centre clearly states that “you are the owner of your data” and that it is not mined for advertising. http://servicetrust.microsoft.com/ViewPage/TrustDocuments?command=Download&downloadType=Document&downloadId=5bd4c466-277b-4726-b9e0-f816ac12872d&docTab=6d000410-c9e9-11e7-9a91-892aae8839ad_FAQ_and_White_Papers

From the Power BI white paper, “We use your data only for purposes that are consistent with providing the services to which you subscribe. If a government approaches us for access to your data, we redirect the inquiry to you, the customer, whenever possible. We have challenged, and will challenge in court, any invalid legal demand that prohibits disclosure of a government request for customer data.” https://powerbi.microsoft.com/en-us/blog/power-bi-gdpr-whitepaper-is-now-available/  

Compliance

Microsoft complies with leading data protection and privacy laws applicable to Cloud services, and this is verified by third parties.

Transparency

Microsoft provides clear explanations on:

  • location of stored data
  • the security of data
  • who can access it and under what circumstances

Qlik

The BI vendor, Qlik, released a statement that declares “With more stringent rules and significant penalties, GDPR compels businesses to use trusted vendors. Qlik is committed to our compliance responsibilities – within our organization and in delivering products and services that empower our customers and partners in their compliance efforts.” – https://www.qlik.com/us/gdpr

Qlik has released an FAQ document as a GDPR-compliant vendor, stating that it has various measures in place to protect personal data and comply with data protection/privacy laws, including GDPR:

  • Legal measures to ensure the lawful transfer of data
  • Records of data processing activities (Article 30)
  • Ensuring Privacy-By-Design and Privacy-By-Default
  • Data retention and access rules
  • Data protection training and policies

For more information, please view the links below:

https://www.qlik.com/us/-/media/files/resource-library/global-us/direct/datasheets/ds-gdpr-qlik-organization-and-services-en.pdf?la=en

Conclusion

The two vendors discussed are clear in their commitment to ensuring their security arrangements can comply with GDPR. This does not mean that other major players (Tableau, Google, etc.) do not have the same initiatives in flight; we have simply focused on Microsoft and Qlik.

Whilst there is no ‘magic button’ available to ensure all regulations are miraculously met, it is possible, regardless of vendor:

  • To ensure security policies can meet GDPR compliance
  • To design with privacy in mind. Even though platforms may support “privacy by design”, your specific solution must still be proactively designed; you cannot simply rely on the vendor
  • To conduct an appropriate solution audit aligned to GDPR (or the Australian Privacy Act) as a good final step

GDPR can indeed be a tricky landscape to navigate – if in doubt, check it out.

We can certainly assist in guiding you through the process from a Data and Analytics perspective.

A Power BI Cheat Sheet – demystifying its concepts, variants and licencing

Power BI has truly evolved over the past few years, from an add-on in Excel to a true organisation-wide BI platform capable of scaling to meet the demands of large organisations, both in terms of data volumes and the number of users. Power BI now has multiple flavours and a much more complicated licencing model. So, in this article, we demystify this complexity by describing each flavour of Power BI and its associated pricing. We summarise it all at the end with some scenarios, and in a single cheat sheet for you to use.

Desktop, Cloud, On-premise, Pro, Premium, Embedded – what does all of this mean?

I thought it best to separate the “why” (i.e. why you use Power BI – development or consumption), the “what” (i.e. what you can do given your licence variant) and the “how much” (i.e. how much it is going to cost you), as combining these concepts often leads to confusion: there isn’t necessarily an easy map between why, what and how much.

Let’s first look at the “why”

“Why” deals with the workload performed with Power BI based on its deployment – i.e. why do you use Power BI? Is it for development or for consumption? This is very much related to the deployment platform (i.e. Desktop, Cloud, On-Premise or Embedded).

The term “consumption”, for the purposes of this article, could range from a narrow meaning (i.e. the consumption of Power BI content only) to a broad meaning (i.e. the consumption of, collaboration over, and management of Power BI content – I refer to these users as “self-serve creators”).

Why – workload/ deployment matrix

Now let’s overlay the “why” with “what”

In the table above, I not only dealt with the “why”, but also introduced the variants of Power BI, namely Desktop, Free, Pro, On-Premise and Embedded. Variants relate to the licence under which the user operates, and the licence determines what a user can do.

Confused? Stay with me…all will become clearer.

What – deployment/ licence variant matrix

Lastly let’s look at the “how much”

The Power BI journey (mostly) starts with development in Desktop, then proceeds to a deployed environment where content is consumed (with or without self-serve). Let’s close the loop on understanding the flavours of Power BI by looking at what this means from a licencing cost perspective.

Disclaimer: The pricing supplied in the following table is based on US, Australian, New Zealand and Hong Kong dollars. These values are by no means quotes, but are merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.

How much – licence variant/ cost matrix

https://www.microsoft.com/en-Us/sql-server/sql-server-2017-pricing

https://powerbi.microsoft.com/en-us/calculator/

https://azure.microsoft.com/en-us/pricing/calculator/

**Other ways to embed Power BI content are via REST APIs (authenticated), SharePoint Online (via Pro licencing) and Publish to Web (unauthenticated), but that is a level of detail for another day. For the purposes of this article, we focus on Power BI Embedded as the only embedded option.

Pro is pervasive

Even if you deploy to the cloud and intend to make content available only to pure consumers of the content (non-self-serve users), whether in PowerBI.com or as embedded visuals, you will still need at least one Pro licence to manage your content. The more visual content creators (self-serve creators) you have, the more Pro licences you will need. It is worth considering the mix between Pro and Premium licences: both Pro and Premium users can consume shared content, but only Pro users can create shared content (via self-service), so the mix should be determined by a cost-versus-capacity ratio (as discussed below).

A little bit more about Premium

Premium allows users to consume shared content only; it does not allow for any self-service capabilities. Premium licences are not per user but are instead based on planned capacity: you pay for a dedicated node to serve your users. Consider Premium licencing for organisations with large numbers of consumers (non-self-serve) that also require dedicated compute to handle capacity. The organisation would still require one or more Pro licences for content management and any self-serve workload.

Premium licencing is scaled as Premium 1, 2 or 3, dependent on the number of users and the required capacity. You can scale out your capacity by adding more nodes as P1, P2 or P3, or scale up from P1 to P2, and from P2 to P3.

Premium capacity levels

The mix between Pro and Premium

Given that Pro users can do more than Premium users, and given that you will need to buy one or more Pro licences anyway, why would you not simply use Pro rather than Premium? There are two reasons:

  • There is a tipping point where Pro becomes more expensive than Premium, and
  • With Pro licences you use a shared pool of Azure resources, so Pro is not as performant as Premium, which uses dedicated resources; there is therefore a second tipping point where your capacity requirements won’t be sufficiently served by Pro.
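The cost tipping point can be computed directly from this article’s AU$ figures (Pro at $12.70 per user per month, a P1 node at $6,350 per month); the capacity of roughly 1,450 high-activity consumers per P1 node is an illustrative assumption taken from scenario 1, not an official Microsoft figure:

```python
import math

def premium_tipping_point(pro_per_user=12.70, node_cost=6350.0):
    """Smallest consumer count at which one Premium node costs no more
    than licensing everyone on Pro (computed in cents to avoid float
    rounding error)."""
    pro_c, node_c = round(pro_per_user * 100), round(node_cost * 100)
    return -(-node_c // pro_c)  # ceiling division

def monthly_cost(creators, consumers, pro_per_user=12.70,
                 node_cost=6350.0, users_per_node=1450):
    """Pro licences for the self-serve creators plus enough P1 nodes to
    serve the consumers (the per-node capacity is an assumption)."""
    nodes = math.ceil(consumers / users_per_node) if consumers else 0
    return creators * pro_per_user + nodes * node_cost
```

At these prices, the Premium tipping point lands at 500 users, and 50 creators plus one P1 node for 1,450 consumers comes to A$6,985 per month.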

The diagram below shows the user and capacity tipping points (discussed further in scenario 1 below):

Capacity planning Premium 1 vs Pro: Users/ Cost/ Capacity

Put this all together

Right, you now understand the “why”, “what” and “how much” – let’s put it all together through examples (I will use Australian $ only for illustrative purposes). Please note that there are various ways to achieve the scenarios below and this is not a comprehensive discussion of all the options.

Scenario 1

A large organisation has 10 Power BI developers; their Power BI rollout planning suggests that they will grow to 50 self-service creators and 1,450 additional high-activity consumers in 12 months, and to 125 self-serve creators and 5,000 high-activity consumers in 48 months:

Initially, they will require

10 x Power BI Desktop licences = $0 x 10 = $0

500 x Power BI Pro licences to cover both self-serve users and consumers = $12.70 x 500 = $6,350

Total – A$6,350.00pm

Once they exceed 500 users, they can revert to

50 x Power BI Pro licences to cover self-serve users = $12.70 x 50 = $635

1 x P1 node to cover the next tranche of high activity consumers = $6,350

Total – A$6,985.00pm

Thereafter

Add Power BI Pro licences as required up to their planned 125 = $12.70 x 125 = $1,588

Add 1 additional P1 node at 1,450 users, and again at 2,900 users, and again at 4,250 users = $25,400 for 4 x P1 nodes

Total after 4 years at 5000 high activity consumers and 125 self-serve creators – A$26,988.00pm

Scenario 2

A small organisation with 1 Power BI developer, 5 additional self-service creators and 10 additional consumers of visual content, with no custom applications/ websites.

1 x Free version of Power BI Desktop: 1 x $0

15 x Pro licences as both visual creators and mere consumers will take part in shared content: 15 x $12.70

Total – A$190.50pm

Scenario 3

A small ISV organisation with 3 Power BI developers wants to embed Power BI content in an application that it sells. The application must be up 24 x 7 and does not require a very high volume of concurrent users, but licencing cannot be on a per-user basis.

3 x Free version of Power BI Desktop: 3 x $0

1 x Pro licence acting as the master of the shared content: 1 x $12.70

A1 Node pricing: 1 x $937

Total – A$949.70pm

Scenario 4

A medium-sized organisation with 5 Power BI developers wants to embed Power BI content in an internal portal such as SharePoint, which is used by potentially 250 users. They also have 10 self-service creators and 25 consumers of Power BI content through the Power BI portal.

5 x Free version of Power BI Desktop: 5 x $0

26 x Pro licences acting as 1 master of the shared content and 25 consumers: 26 x $12.70 = $330.20

A1 Node pricing: 1 x $937

Total – A$1,267.20pm

Power BI – licence variant, workload, deployment & cost cheat sheet

All pricing is shown in Australian $

Disclaimer: The pricing supplied in the following table is by no means a quote, but is merely taken from the various calculators and pricing references supplied by Microsoft as at the date of first publication of this article.

Licence variant, workload, deployment & cost cheat sheet

Networks Asset Data Mart – our Energy Infrastructure Provider case study


Exposé designed and developed a solution that saw an increasingly temperamental Networks Asset Analytical solution move to the Exposé developed Enterprise Analytics Platform.

The solution now:

• Allows staff to focus on business-critical tasks by utilising the data created by the system.
• Reduces support costs due to the improved system stability.
• Utilises the IT resources for other projects that improve business productivity.

exposé case study – Energy Infrastructure Provider – Networks Asset Data Mart

See another case study here

An Internet of Value – Blockchain, beyond the hype and why CxO’s must take note

A Blockchain, in its simplest form, is a distributed database system in which there is no single master (primary) database, but many databases that are all considered primary. All parties participate in populating entries into their respective databases and receive the entries of the other participants.

But how does this apply to your business, and is this going to profoundly change how the world works? Let’s look at an analogy: imagine I create a song and generate a file of my recording in mp3 format on a USB stick. I can give two of my friends a copy of this; they can do the same, and so on. With thousands of eventual copies going around, it will be impossible to establish which is the real version I own and which I would ideally want to use in exchange for royalties. By the way, if I ever did create and record a song, I doubt very much that I would garner thousands of fans. I am just not Dave Grohl 😊

This is where Blockchain comes in. It is a shared ledger that is used to record any transaction and track the movement of any asset whether tangible, intangible or digital (such as my mp3). It is immutable, so participants cannot tamper with entries, and it is distributed, so all participants share and validate all the entries.

Blockchain will allow “my fans” 😊 to enter into a contract with me directly. As they stream the song, payment goes directly from their wallet into mine. The information about what was listened to and what I was paid, is verified by all the databases in the network and cannot be changed. There are no middlemen (like a central streaming service, or a record label), so the contract (a digital smart contract) is between those that listen to my song and me directly.

It is important to mention at this point that Blockchain is not Bitcoin, or any other cryptocurrency, although it did start life as the technology that underpins cryptocurrencies. This article, the first in a series of three, looks beyond its use in cryptocurrencies and instead highlights use cases to show CxOs why it is so important to take note of Blockchain and to start controlled proofs of concept (POCs) and research and development (R&D) in this technology now. We look at some examples across a wide range of industries, and use a courier-based use case to delve deeper into what Blockchain could mean for organisations using the Internet of Things (IoT).

Sport

Dope testing and cheating have been quite topical lately, with large portions of the Russian contingent banned from the Rio Olympics in 2016, and again from the Winter Games in South Korea in 2018, for systemic manipulation of tests. Blockchain will make test results immutable and open the results up to all who participate in the data cycle. Even if an athlete changes sports, that data will be available to participating sporting organisations. http://www.vocaleurope.eu/how-technology-can-transform-the-sports-domain-blockchain-2-0-will-boost-anti-doping-fight-sports-investments-and-e-sports/

Health

Some countries are planning health data exchanges with the aim of addressing a lack of transparency and improving trust in patient privacy as well as fostering better collaboration. Blockchain will provide practitioners and providers with better access to health, patient and research information. Adoption of Blockchain will lead to closer collaboration and better treatment and therapies, sooner.

Blockchain in healthcare is real and imminent. This study from IBM shows how pervasive Blockchain is expected to become with 16% of 200 responding health executives aiming to implement a Blockchain solution shortly. https://www.ibm.com/blogs/think/2017/02/Blockchain-healthcare/

Banking

Australia’s Commonwealth Bank collaborated with Brighann Cotton and Wells Fargo to undertake the world’s first global trade transaction on Blockchain between independent banks – an unbroken digital thread that ran between a product’s origin and its final destination, capturing efficiencies by digitising the process and automating actions based on data. https://www.commbank.com.au/guidance/business/why-blockchain-could-revolutionise-the-shipping-industry-201708.html

CommBank is taking this a few steps further, with an appointed head of Blockchain and a whopping 25 proofs of concept over the past five years, including peer-to-peer transfer of funds offshore within minutes rather than days, and the issuing of smart contracts. http://www.innovationaus.com/2017/12/CBA-outlines-a-blockchain-future

Insurance

Customers and insurers will be able to manage claims better, transparently and securely. Claim records, which are tamper-proof once written to the chain, will streamline the claim process and minimise claimant fraud, such as multiple claims for the same incident.

With smart contracts, payments can be triggered as soon as certain minimum conditions are met. Many smart contract rules could also ascertain when a claim is fraudulent, automatically denying it. https://www2.deloitte.com/content/dam/Deloitte/ch/Documents/innovation/ch-en-innovation-deloitte-Blockchain-app-in-insurance.pdf

Courier Delivery

Couriers deliver millions of items each day, very often crossing vast geographical distances and across multiple sovereign boundaries with unique laws and processes.

These businesses, which often make heavy use of IoT devices, will benefit hugely from Blockchain through an improved ability to track every aspect of a package’s delivery cycle and minimised fraud.

There were 20 billion connected IoT devices in 2017, a number projected to grow to 75 billion by 2025. https://www.statista.com/statistics/471264/iot-number-of-connected-devices-worldwide/

The current centralised approach to the insertion and storage of IoT data (see the image below) simply won’t be able to cope with volume demands, and transactional contracts will have to rely on multiple third parties. Also, managing data security can be very complex, because data will flow across many administrative boundaries with different policies and intents.

In contrast, the decentralised, peer-to-peer Blockchain approach to the insertion and storage of IoT data eliminates issues with volume demand (the data is stored across a potentially unlimited number of databases). There is no single point of failure that can bring the whole IoT network to a halt (computation and storage are shared, and there is no single primary). It supports tamper-proofing (all participating databases validate a transaction, which is then shared and becomes immutable), which means increased security against rogue participants such as IoT device spoofers and impersonators. (Spoofing can occur when security is breached through a poorly secured device on a shared IoT network: if that device is hackable, the whole network is compromised, as it will trust the intruder who entered through the easily hacked device.)

Delving deeper into our Courier Delivery use case – Blockchain and IoT, creating an Internet of Value

In a courier parcel delivery ecosystem, the movement of parcels is tracked at every step of the delivery process via IoT devices that read a barcode, or another form of identification that can be picked up by a sensor – from the original warehouse to a vehicle, a plane, another warehouse, and finally your home.

By using Blockchain, each sensor participates in the chain and records “possession” of the delivery item (and so also its location). Each time the item is read by a new sensor, the new location is broadcast, inserted, then shared and agreed on by the remaining participants on the Blockchain. Each block is subsequently a transaction that is unchangeable once inserted into the blockchain.

Each Blockchain entry (i.e. the barcode, the location of the package and a date-time stamp) is encrypted into the Blockchain. The “possession” steps are tracked no matter who is involved in the package delivery process (from the origin, which could be delivery to an Aus Post outlet, to an Aus Post vehicle to the airport, to QANTAS en route to the US, to a DHL distribution centre at a US airport, and finally to a DHL delivery vehicle en route to the destination address). This enhances trust in the system, as there is no need to adhere and interface with a single primary system, and package tracking is put on steroids. If you have ever sent anything abroad, you will know that granular tracking effectively ends at the border. This won’t be the case with Blockchain. https://www.draglet.com/Blockchain-applications/smart-contracts/use-cases
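To make the immutability claim concrete, here is a toy, single-machine sketch of the hash chaining that makes such tracking entries tamper-evident; a real Blockchain adds distributed consensus and encryption on top, and the barcode and locations below are illustrative:

```python
import hashlib
import json

def make_block(prev_hash, barcode, location, timestamp):
    """One tracking entry; its hash covers the previous block's hash,
    which is what chains the entries together."""
    block = {"prev": prev_hash, "barcode": barcode,
             "location": location, "ts": timestamp}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def valid_chain(chain):
    """Recompute every hash: a tampered entry (or a broken link) fails,
    so every participant can spot the change independently."""
    prev = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if body["prev"] != prev or block["hash"] != recomputed:
            return False
        prev = block["hash"]
    return True

# Each sensor scan appends a block; the genesis block links to all zeros.
chain = [make_block("0" * 64, "PKG-001", "Aus Post outlet, Adelaide", 1)]
chain.append(make_block(chain[-1]["hash"], "PKG-001", "QANTAS flight to LAX", 2))
chain.append(make_block(chain[-1]["hash"], "PKG-001", "DHL hub, Los Angeles", 3))
```

Editing any earlier block changes its hash, which breaks the link stored in every later block, so a quietly altered “possession” record is immediately detectable by all participants.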

Conclusion

It must be noted that Blockchain technology has not been around for very long and is rapidly evolving. Widespread commercialisation beyond cryptocurrencies is still in its infancy. But all indications are that it will be a hugely disruptive technology.

The many examples of important players taking this technology seriously move Blockchain beyond hype.

CxOs may ask why they should invest in something they cannot yet fully understand, but a very similar question was probably asked in the 90s about the internet. The learning curve will no doubt be steep, but that makes investing in targeted R&D and POCs early all the more important, so that they are not caught off guard once commercialisation starts increasing.

In the next article, Blockchain: lifting the lid on confusing concepts, we will delve a little deeper and describe the concepts in more depth.