UK Air Traffic Control Chaos – A Hard Data Governance Lesson

If data doesn’t conform to a standard or set of criteria, the implications can be huge. We explore why the UK’s air traffic control service might not have had the right level of data governance in place.

Problems with UK air traffic control over the August Bank Holiday weekend have recently been all over the news. Thousands of flights were delayed, leaving passengers stranded overseas, many having to sleep overnight in airport lounges (and often on the floor).

The problem appears to have originated with an incorrectly formatted piece of data submitted to air traffic control systems, likely by an airline. The BBC reported that NATS (National Air Traffic Services) chief executive Martin Rolfe said: ‘Initial investigations into the problem show it relates to some of the flight data we received’. The Times expanded on this, quoting: ‘Our systems responded by suspending automatic processing to ensure that no incorrect safety related information could be presented to an air traffic controller or impact the rest of the air traffic system’.

It would seem, then, that while NATS’s systems successfully prevented any safety issues, the fiasco was caused by a failure of data governance. Specifically, data that doesn’t conform to a specific standard or set of criteria resulted in enormous disruption, impacting thousands of passengers and incurring an enormous financial cost. With a myriad of different airlines inputting data into NATS’s systems, it’s vital for organisations to implement clear data quality and data governance rules.

Data governance and data quality rules can address six key factors concerning your data:

  1. Accuracy – ensures your data is reliable and trustworthy
  2. Consistency – data held across multiple systems must be the same to avoid discrepancies
  3. Completeness – all essential fields are present and populated
  4. Uniqueness – duplicate data shouldn’t exist within a single system, such as name variations (Dave vs David)
  5. Validity – data must be validated against a specific set of criteria before storage
  6. Timeliness – up-to-date data is essential for use in decision making
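To make these factors concrete, here is a minimal sketch of what rule-based validation might look like in practice. All field names and rules below are illustrative assumptions, not NATS’s actual schema or criteria:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical reference set for a validity check (illustrative only).
VALID_AIRCRAFT_TYPES = {"A320", "B738", "B77W"}

def validate_flight_record(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []

    # Completeness: all essential fields must be present and non-empty.
    for field in ("callsign", "aircraft_type", "departure", "destination", "filed_at"):
        if not record.get(field):
            errors.append(f"missing field: {field}")

    # Validity: values must match a defined set of criteria before storage.
    aircraft = record.get("aircraft_type")
    if aircraft and aircraft not in VALID_AIRCRAFT_TYPES:
        errors.append(f"unknown aircraft type: {aircraft}")

    # Timeliness: reject stale submissions (here, older than 24 hours).
    filed = record.get("filed_at")
    if filed and datetime.now(timezone.utc) - filed > timedelta(hours=24):
        errors.append("record is stale (filed more than 24 hours ago)")

    return errors
```

Uniqueness and consistency checks would extend the same pattern, comparing each incoming record against those already stored. The key design choice is that every record is checked against explicit rules before it can enter downstream systems, so a single malformed submission is rejected rather than propagated.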

The key is to have the right framework in place to ensure that your data meets the above needs. Euler specialise in deploying Data Quality Review (DQR) services which address these six key factors, providing a workflow that allows an issue or rule to be recognised, scoped, designed and implemented within your data.

To do this, we rely on data quality technology, such as a DQR database available on a range of platforms. Finally, we help you appoint and empower the right people within your organisation to manage internal governance.

Data governance and powerful data quality solutions are the key to ensuring that erroneous data doesn’t derail your operations, disgruntle your customers, or saddle you with an additional financial burden. Click here to learn more about our Data Quality services.
