The role of innovation 

The beauty of innovation lies in its non-linear nature. Instead of a traditional painting, innovation nowadays presents itself as an abstract watercolour highlighting the elegance of opportunity, risk, and reward.

With machine learning and artificial intelligence (AI) seeping into almost every industry sector, from improving the diagnosis of cancers to enhancing safety around industrial machinery, an important consideration comes into play: where do companies draw the line between innovating and respecting regulation?

Against the backdrop of a challenging macroeconomic climate and continued political instability, industry faces a data-driven world unlike anything seen before.

Generally speaking, data protection laws vary widely across the globe, ranging from the General Data Protection Regulation (GDPR) [1] to the California Consumer Privacy Act (CCPA) [2] to more recent laws such as Brazil’s Lei Geral de Proteção de Dados (LGPD), which came into force in September 2020 [3].

Within the UK and Europe, for example, the GDPR has been a globally recognized staple since its introduction in May 2018, when it replaced the 1995 Data Protection Directive [4].

Legislation of this nature exists to ensure companies process data lawfully without impeding the rights and freedoms of individuals. On the flip side, it presents a mix of opportunities and challenges for organizations wishing to take their first step on the AI ladder or to scale their technologies beyond their current framework. This is where, within the UK for example, the Data Protection Impact Assessment (DPIA) comes in.

The Data Protection Impact Assessment 

The Data Protection Impact Assessment is a process for companies to identify, analyze, and minimize any data protection risks that might stem from a project. Flexible and scalable in nature, it can be adapted for a variety of sectors and for companies of all shapes and sizes.

The DPIA is a key part of demonstrating accountability under the GDPR, so businesses must apply sufficient rigor against their desired strategic outcomes to ensure privacy risks are fully understood.

The Information Commissioner’s Office (ICO) provides templates for guidance in structuring the assessment, but it’s perfectly fine for companies to design their own bespoke version for more specific alignment. What’s paramount is that companies continually review and iterate their DPIA as the project itself scales and iterates. [5]


Bringing in AI-based technologies is where things become complex, as they typically fall under the bracket of high-risk processing. Risk is a central aspect of a DPIA, but with no universal definition applied within the GDPR, the level of subjectivity is vast.

In summary, the ICO defines risk in the following way: “….the risk to the rights and freedoms of natural persons, of varying likelihood and severity, may result from data processing which could lead to physical, material or non-material damage, in particular: where the data processing may give rise to discrimination, identity theft or fraud, financial loss, damage to the reputation, loss of confidentiality…” [6]

In this definition, it’s imperative to recognize that the risk of harm can be intangible, arising through social and economic means.
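To make the likelihood-and-severity framing above a little more concrete, the sketch below shows one hypothetical way a team might capture DPIA risks in a simple register, using a likelihood × severity score to prioritize mitigations. The scales, threshold, and field names are illustrative assumptions of mine, not values prescribed by the ICO or the GDPR; a spreadsheet would serve the same purpose.

```python
from dataclasses import dataclass
from enum import IntEnum


class Likelihood(IntEnum):
    """Illustrative 1-3 scale; the ICO does not mandate specific values."""
    REMOTE = 1
    POSSIBLE = 2
    PROBABLE = 3


class Severity(IntEnum):
    """Illustrative 1-3 scale spanning physical, material, and non-material harm."""
    MINIMAL = 1
    SIGNIFICANT = 2
    SEVERE = 3


@dataclass
class DPIARisk:
    """One entry in a hypothetical DPIA risk register."""
    description: str        # e.g. "Model output could enable discrimination"
    likelihood: Likelihood
    severity: Severity
    mitigation: str         # planned measure to reduce or eliminate the risk

    @property
    def score(self) -> int:
        # Simple likelihood x severity product; any scoring scheme could be used.
        return int(self.likelihood) * int(self.severity)

    @property
    def high_risk(self) -> bool:
        # Assumed threshold for flagging risks that warrant escalation or review.
        return self.score >= 6


register = [
    DPIARisk("Training data could reveal identities via re-identification",
             Likelihood.POSSIBLE, Severity.SEVERE,
             "Pseudonymize and minimize the training dataset"),
    DPIARisk("Automated decisions could produce discriminatory outcomes",
             Likelihood.POSSIBLE, Severity.SIGNIFICANT,
             "Add human-in-the-loop review and bias testing"),
]

# Surface the highest-scoring risks first so mitigations can be prioritized.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    flag = "HIGH" if risk.high_risk else "manage"
    print(f"[{flag}] score={risk.score}: {risk.description} -> {risk.mitigation}")
```

Whatever the format, the point is the same: make likelihood and severity explicit for each risk so mitigations can be prioritized and revisited as the project iterates.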

The modern data economy and generative AI

DPIA aside, the post-COVID-19 data economy has created a fresh wave of consumer reflection on how companies handle and utilize personal data.

According to Salesforce’s 2022 State of the Connected Customer report, 79% of consumers say they’d be more inclined to trust a company with their data if the company explained clearly how it would be used. [7]

Combine this with the 74% of consumers who are concerned that companies collect more data than they need [7], and the science of balancing education and choice will only become more challenging as the wave of AI adoption builds.

In addition, the rise of generative AI since the introduction of ChatGPT (Chat Generative Pre-trained Transformer) in November 2022 has spawned a variety of use cases across its 100 million (and growing) user base [8], creating a myriad of legal and ethical considerations that leave policymakers and practitioners scratching their heads for answers.

Despite rapid uptake and consumers’ magnetic attraction toward trusting generative AI, GPT-4, the most recent model underpinning OpenAI’s ChatGPT, has been found to hallucinate. [9]

Whilst the chatbot’s output may appear reasonable on the surface, the information could in fact be factually wrong, completely fabricated, or taken entirely out of context. With 73% of consumers trusting content written by generative AI [10], this puts into perspective the importance of balancing adoption and risk, especially while the legalities and risks are not yet fully understood.

In summary, think of the above as an intriguing cake mix: a packet of consumer education, a healthy jug of consumer choice, a splash of strategic objectives, a large bowl of trust, and a large serving spoon of risk. How would the cake look once it came out of the oven?


Conclusion 

Ultimately, finding a balance between innovation and regulation isn’t straightforward. If I can offer any advice on balancing the two, it’s this:

  • Involve your stakeholders in the development of your data strategy from the start. Yes, right from the start. The more you involve your stakeholders, the higher the chance of buy-in. After all, a return on trust is every bit as important as a return on investment.
  • Engage with organizations such as the ICO. Although it’s a legal requirement to complete a DPIA for any high-risk processing, it’s OK to pick up the phone and ask for advice. After all, as much as they want you to work within the rules, they want you to innovate too!

To round off, the biggest advantage in the data space is the power of networking: being able to speak to practitioners in the same industries who have experienced both the challenges and the opportunities. DPIAs and human-in-the-loop systems are great, but complementing them with real-world advice is key to balancing innovation and regulation.

Bibliography

[1] GDPR: Smith, J. (2017). The History of the General Data Protection Regulation. [online] European Data Protection Supervisor. Available at: https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en.

[2] California Consumer Privacy Act: State of California Department of Justice (2023). California Consumer Privacy Act (CCPA). [online] State of California, Department of Justice, Office of the Attorney General. Available at: https://oag.ca.gov/privacy/ccpa.

[3] Brazil’s Privacy Law: Shreya (2021). Data Privacy Laws Around the World: A Quick Look. [online] CookieYes. Available at: https://www.cookieyes.com/blog/data-privacy-laws-in-the-world/#:~:text=There%20are%20many%20data%20privacy [Accessed 17 Jul. 2023].

[4] GDPR History: Smith, J. (2017). The History of the General Data Protection Regulation. [online] European Data Protection Supervisor. Available at: https://edps.europa.eu/data-protection/data-protection/legislation/history-general-data-protection-regulation_en.

[5] ICO DPIA: ico.org.uk (2023). What is a DPIA? [online] Available at: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/data-protection-impact-assessments-dpias/what-is-a-dpia/#what1 [Accessed 5 Jul. 2023].

[6] ICO High Risk Processing: ico.org.uk (2023). Examples of processing ‘likely to result in high risk’. [online] Available at: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/accountability-and-governance/data-protection-impact-assessments-dpias/examples-of-processing-likely-to-result-in-high-risk/ [Accessed 5 Jul. 2023].

[7] Salesforce State of the Connected Customer Report (Fifth Edition): Insights from nearly 17,000 consumers and business buyers on the new customer engagement landscape. (n.d.). Available at: https://www.salesforce.com/content/dam/web/en_us/www/documents/research/salesforce-state-of-the-connected-customer-fifth-ed.pdf.

[8] ChatGPT User Stats: Ruby, D. (2023). ChatGPT Statistics for 2023: Comprehensive Facts and Data. [online] demandsage. Available at: https://www.demandsage.com/chatgpt-statistics/.

[9] GPT-4 Technical Report: OpenAI (2023). GPT-4 Technical Report. [online] Available at: https://cdn.openai.com/papers/gpt-4.pdf.

[10] Capgemini Generative AI Report: Why Consumers Love Generative AI. (n.d.). Available at: https://prod.ucwe.capgemini.com/wp-content/uploads/2023/06/GENERATIVE-AI_Final_WEB_060723.pdf [Accessed 18 Jul. 2023].