Big Data Is about Improving the Quality of Decision Making

‘Big Data’ is frequently the driving force behind the data-driven insights and emerging technologies we see today, whether that is Artificial Intelligence or the Internet of Things.

That we are generating data at an exponential rate is hardly news.  The challenge remains how to extract value from that data in a timely manner, so that the resulting insight can be used to make better decisions.  Better could mean faster or more accurate; ideally it means both.

Competitive advantage is achieved in a variety of ways: more accurate analysis drives better decisions, while speed lets you outflank the competition through better customer service or quicker product launches.

Obtaining Data Is Often Not the Difficult Part

One of the challenges when implementing Big Data is deciding which data to ingest and store.  Broadly speaking, two alternative approaches can be adopted:

  1. Gather as much as possible, index only key metadata (e.g. host, source, date & time) and worry about building analytical models once you have a decent range of data to look at.
  2. Analyse data sources in depth, assessing their quality and trying to determine in advance the value of relationships between different data sets.

The first approach is based on the ‘you don’t know what you don’t know’ theory, favoured by, for example, Splunk.  Be mindful of the cost, both in terms of software licences and of hosting and storage (on premise, fully cloud or some kind of hybrid).  If your data is of poor quality (inconsistent, incomplete, etc.) you may end up building data models that do not reflect true operations and, from that, making poor decisions.  However, you can get started quickly and deliver value in days or weeks rather than months.
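As a rough illustration of this ingest-first approach (a minimal sketch only; the class and field names are our own assumptions, not any particular product's API), the idea is to store raw events untouched and index just the key metadata, deferring any schema or model work until later:

```python
import time


class RawEventStore:
    """Minimal sketch of 'gather everything, index only key metadata'.

    Raw payloads are kept as-is; only host, source and timestamp are
    indexed, so analytical models can be built later, once enough
    data has accumulated.
    """

    def __init__(self):
        self.events = []   # raw payloads, stored untouched
        self.index = {}    # (field, value) -> list of event positions

    def ingest(self, host, source, payload, timestamp=None):
        """Store the raw payload; index only host, source and time."""
        pos = len(self.events)
        self.events.append(payload)
        meta = {"host": host, "source": source,
                "ts": timestamp if timestamp is not None else time.time()}
        for field in ("host", "source"):
            self.index.setdefault((field, meta[field]), []).append(pos)
        return pos

    def search(self, field, value):
        """Retrieve raw events by indexed metadata only."""
        return [self.events[i] for i in self.index.get((field, value), [])]


store = RawEventStore()
store.ingest("web01", "nginx", "192.0.2.1 - GET /home 200")
store.ingest("db01", "postgres", "ERROR: connection refused")
store.ingest("web01", "app", '{"user": 42, "action": "login"}')

print(store.search("host", "web01"))
```

The trade-off described above is visible here: searches on host or source are cheap, but anything inside the payload requires scanning the raw events, which is where the storage and processing costs accumulate.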

Taking the more analytical approach can reduce your storage and processing needs, but it can equally consume a lot of time with little to show for it.  Although hindsight is a wonderful thing, building analytical models from limited data sets can lead to limited benefits.
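The second, analyse-in-depth approach typically starts with profiling each candidate data source before ingesting it. A crude sketch of such a quality profile (the field names and sample data here are purely illustrative assumptions) might measure missing values and type consistency per field:

```python
from collections import Counter


def profile_records(records, fields):
    """Crude pre-ingestion data-quality profile: for each field,
    report the missing-value rate and the mix of value types,
    so inconsistent or sparse sources can be spotted in advance.
    """
    report = {}
    for field in fields:
        values = [record.get(field) for record in records]
        missing = sum(v is None for v in values)
        types = Counter(type(v).__name__ for v in values if v is not None)
        report[field] = {
            "missing_rate": missing / len(records),
            "types": dict(types),
        }
    return report


sample = [
    {"host": "web01", "latency_ms": 120},
    {"host": "web02", "latency_ms": "fast"},  # inconsistent type
    {"host": None, "latency_ms": 95},         # missing value
]
print(profile_records(sample, ["host", "latency_ms"]))
```

A profile like this makes the quality problems mentioned earlier (inconsistent or missing data) visible before any models are built, at the cost of the upfront analysis time noted above.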

The Challenges Are Both Technical and Non-Technical

Developing the data architecture and the analytical capability that will drive business benefit faces significant challenges, and it is often the non-technical ones that have the greater impact:

  1. Privacy.  Notwithstanding the introduction of GDPR in Europe to give individuals visibility and control over who holds their data and what they do with it, a personal and business ‘data trail’ follows us everywhere.
  2. Security.  Major corporate data breaches are reported with increasing frequency.  The impact of a financial data breach tends to be inconvenient but rarely life-threatening; with IoT solutions widely deployed and AI making operational decisions, a successful attack could be catastrophic.
  3. Discrimination.  ‘Garbage in, garbage out’ remains as apt as ever: AI fed with incorrect or biased data inevitably produces incorrect, and potentially discriminatory, decisions.

In Summary

For businesses, the ability to leverage Big Data is likely to become increasingly critical in the coming years.  To get started with your project, or simply to discuss your ideas, call us on 0113 242 3795.