Data is the new oil – or is it?

Data can only be ‘the new oil’ when it has value – and value is determined by data quality and business purpose. How can you influence the outcome to ensure you strike it rich?

Data alone is meaningless, so to claim that ‘data is the new oil’ is a bit misleading. Data value shines through when you put a business purpose behind it, and ensure the quality is good enough. And next-generation technologies help you to analyse, manage, and make sense of the insights locked within:

Big data technologies ingest and distil your data. These valuable insights help to enhance business processes, improve the user experience, and create new value for your business.

Artificial intelligence (AI) and machine learning technologies analyse complex data sets to power applications such as voice and facial recognition. By augmenting the results with algorithms that learn from their environment, you deliver a more tailored outcome to each user.

Internet of Things technologies make it possible to proactively monitor data. Rather than wait on a report, real-time alerts generated by these devices notify you the moment you need to act.

Cloud computing services, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Data Centre as a Service (DCaaS), provide the resources needed to help you manage and act on your data at a commercially justifiable price point.

But before you even think about the data migration…

Check that the data used to power these next-gen technologies is fit for purpose. Organisations waste between 10% and 30% of their revenue on handling data quality issues. Whether the data is inaccurate, incomplete, or out of date, data that isn’t fit for purpose leads to poor decision-making, increases reputational risk, and causes missed opportunities. In direct and indirect costs, poor data costs an average of $9.7 million.

So, what factors affect data quality? And what can you do to boost your data’s value?

Only quality data gives accurate insights

People and technologies need excellent quality data to create an accurate outcome. But there are several factors that can influence your results, including:

Volume of data

Every day in 2020, we sent and received 306.4 billion emails, watched 5 billion videos on YouTube, and published 500 million updates on Twitter. That year, the world produced an unimaginable 64.2 zettabytes of data. But only 2% was saved and retained.

When left unmanaged, huge data volumes put your business at risk. Data managed in a system or database is subject to inbuilt controls that govern its use and safety. Left unmanaged, and therefore ungoverned, data is easy to lose, vulnerable to attack, and easy to overlook.

To retain oversight of your most precious asset, identify what data you have, where it is, and who has access to it (we offer a Landscape Analysis service). With visibility of your data estate, you can put appropriate controls in place to protect sensitive, personal, and confidential data. It also helps you fulfil your legal obligation under Article 17 of the GDPR (the ‘right to erasure’) to delete data that is no longer necessary.

Legacy data

Whenever you set up a new system, application, or platform (even a spreadsheet!), it starts with beautifully clean data and the best of intentions. But then work gets in the way. For example, your sales team skips over contact fields in your CRM system because they’re jumping from call to call. Marketing gets frustrated at the lack of data and sets up a new entry without deleting the old one. In the latest release, your technical team uses a variation on the product spelling. And then a well-meaning individual adds an update to the wrong field.

Without continuous, real-time maintenance, data in old systems is compromised because its accuracy starts to decay. Yet, 3 in 5 businesses don’t know how much bad data is costing them, because they don’t measure the impact.

To maintain quality data, it’s essential to cleanse what you have – particularly before a data migration. It takes time and effort, but pays dividends in the long-term because your data is:

  • Valid: data adheres to the guidelines/constraints that support your business purpose.
  • Complete: no required fields or records are missing.
  • Consistent: where systems contain the same data, there are no discrepancies.
  • Uniform: different systems use the same format to record data.
  • Accurate: your data matches the actual value.
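These dimensions can be checked programmatically. Below is a minimal sketch of rule-based quality checks on a contact record; the field names and rules are illustrative assumptions, not a prescribed schema:

```python
# Minimal sketch of rule-based data quality checks.
# Field names and rules are illustrative assumptions, not a real schema.
import re

def check_record(record):
    """Return a list of quality issues found in one contact record."""
    issues = []
    # Valid: values adhere to constraints that support the business purpose.
    if not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", record.get("email", "")):
        issues.append("invalid email")
    # Complete: required fields are present and populated.
    for field in ("first_name", "last_name", "job_title"):
        if not record.get(field):
            issues.append(f"missing {field}")
    # Uniform: dates recorded in one agreed format (ISO 8601 here).
    if not re.match(r"^\d{4}-\d{2}-\d{2}$", record.get("last_contacted", "")):
        issues.append("non-ISO date")
    return issues

record = {"email": "a.smith@example.com", "first_name": "Anna",
          "last_name": "", "job_title": "CTO", "last_contacted": "12/03/2021"}
print(check_record(record))  # → ['missing last_name', 'non-ISO date']
```

Running checks like these against a sample of your data gives a quick, repeatable measure of where quality is slipping.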

Nuance of language

Many cloud service providers flippantly refer to “seamless integration”. During a data migration, you “simply” plug in the APIs that allow data to flow freely. But integration is complex. Problems stem from the diverse ways we refer to and record data.

For example, System A and System B may both have fields for job titles. But System A refers to it as ‘Job’, while System B refers to it as ‘Job Title’. Connect the systems together, and the field names don’t exactly match, so your data can’t flow and automatically populate. Now think about all the other tools you use in your business and the field names they use. ‘Title’, ‘Job role’, ‘Role’, ‘Job description’, ‘Profession’, ‘Occupation’, ‘Position’, ‘Post’ – very quickly it becomes a mess.
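One common way to tame this mess is to map every known synonym onto a single canonical field name before data flows between systems. A minimal sketch (the synonym list is an illustrative assumption):

```python
# Sketch: map synonymous source field names onto one canonical schema
# so data from different systems lands in the same destination field.
# The synonym list is an illustrative assumption.
FIELD_SYNONYMS = {
    "job_title": {"job", "job title", "title", "job role", "role",
                  "job description", "profession", "occupation",
                  "position", "post"},
}

def to_canonical(record):
    """Rename known synonym fields to their canonical names."""
    out = {}
    for key, value in record.items():
        for canonical, synonyms in FIELD_SYNONYMS.items():
            if key.strip().lower() in synonyms:
                key = canonical
                break
        out[key] = value
    return out

print(to_canonical({"Job": "Engineer"}))        # → {'job_title': 'Engineer'}
print(to_canonical({"Job Title": "Engineer"}))  # → {'job_title': 'Engineer'}
```

The hard part isn’t the code, it’s compiling the synonym list – which is exactly the field-mapping exercise described below.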

Successful integrations (or data migrations) start with a solid plan. Identify and map all the fields in your systems – including custom fields – so data reaches the right destination. Always test a sample first, because if something goes wrong it’s easier to fix 10 bad entries compared to 10,000. Once you’re happy the integration works, make appropriate backups, just in case you need to roll back. And if in doubt, always ask for help!
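The “test a sample first” step can be sketched as a trial run that migrates a small batch, validates it at the destination, and rolls back automatically if anything fails. This is an illustrative sketch: migrate_one and validate_one stand in for real migration and validation logic.

```python
# Sketch of a "test a sample first" migration run: migrate a small batch,
# validate it at the destination, and roll back if any record fails.
# migrate_one and validate_one are hypothetical stand-ins for real logic.

def migrate_one(record, destination):
    destination.append(dict(record))

def validate_one(record):
    return bool(record.get("email"))  # illustrative check

def trial_migration(source, destination, sample_size=10):
    """Migrate a sample; roll back and report if any record fails validation."""
    sample = source[:sample_size]
    checkpoint = len(destination)          # remember state for rollback
    for record in sample:
        migrate_one(record, destination)
    failed = [r for r in destination[checkpoint:] if not validate_one(r)]
    if failed:
        del destination[checkpoint:]       # roll back the sample
        return False, failed
    return True, []

src = [{"email": "a@example.com"}, {"email": ""}]
dest = []
ok, failed = trial_migration(src, dest)
print(ok, len(dest))  # → False 0  (sample rolled back)
```

Only once the sample passes do you migrate the full data set – with backups in place, as above.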

Unconscious bias

Technology is great because it does what you’ve programmed it to do without argument. But what if bias infects the data used to program the tech? You’ll end up with the wrong outcome.

In the realm of AI and machine learning, unconscious bias plagues data sets. Think about your legacy data. What if historically men had been more successful job applicants than women? If you use this data set to program AI technology that screens CVs, it will learn to discount women. It’s not that the incoming data is ‘wrong’; we’ve simply not considered or accounted for the bias when creating the tech.
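One simple way to surface this kind of bias before training is to compare selection rates across groups in the historical data, for example against the widely used ‘four-fifths rule’. A sketch, with illustrative data and threshold:

```python
# Sketch: a simple disparate-impact check on historical hiring data before
# using it to train a screening model. The data and the 0.8 threshold
# (the common "four-fifths rule") are illustrative.

def selection_rates(records):
    """Selection rate (hired / applied) per group."""
    rates = {}
    for group in sorted({r["group"] for r in records}):
        subset = [r for r in records if r["group"] == group]
        rates[group] = sum(r["hired"] for r in subset) / len(subset)
    return rates

def disparate_impact(records, threshold=0.8):
    """Flag groups whose rate falls below threshold x the highest rate."""
    rates = selection_rates(records)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

history = ([{"group": "men", "hired": 1}] * 60
           + [{"group": "men", "hired": 0}] * 40
           + [{"group": "women", "hired": 1}] * 20
           + [{"group": "women", "hired": 0}] * 80)
print(disparate_impact(history))  # → {'men': False, 'women': True}
```

A flagged group doesn’t prove the data is ‘wrong’ – but it tells you the bias must be accounted for before the model is built.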

Apple has some eye-opening examples of unconscious bias affecting their product development. The AI-powered facial recognition technology in the iPhone X couldn’t identify Chinese people because it was only tested on white faces. And its health application failed to track women’s reproductive health because it was only assessed on men.

These are lessons we must learn from to ensure the outcomes from our data are accurate. We must consciously limit prejudice through greater diversity within development teams and testing audiences. And once we have an outcome, we must monitor it to spot instances where something isn’t right, so we can investigate the problem and fix the issue.

“No organisation needs, wants, or will pay for perfect quality data”

– Practical Data Migration, by Johny Morris

The pursuit of perfection is a distraction from your core business. Instead, aim to deliver appropriate data, of the right quality, to the right place, at the right time. To discover more about what ‘appropriate data’ and ‘the right quality’ mean in practice, download our white paper Cloud Migration for Digital Transformation.

Alternatively, why not get in touch to see how Euler can help?

We ensure your customer data is accurate and accessible to the people and technologies that need it, to fuel sales, build loyalty and transform customer experiences. From data management and integration to business intelligence and analytics, campaign orchestration and more, we can help you get the most from your data.



Related Articles

WHITEPAPER – Cloud Migration for Digital Transformation

18 Jul 2022

Cloud migration can be challenging, but done correctly, it can be the key to your digital transformation…

How much is poor quality data costing you?

2 Nov 2021

Experts think that handling data quality issues typically costs organisations between 10% and 30% of revenue (Experian 2021). 

WHITEPAPER – Preparing for a major data initiative – The importance of a Landscape Analysis

19 Oct 2021

In this whitepaper Rob Jones explains exactly why analysing your data landscape prior to starting your project…