Can corporate social/digital responsibility safeguard data use?

Research Title: Data responsibility, corporate social responsibility, and corporate digital responsibility

Research Authors: Joanna van der Merwe and Ziad Al Achkar

Research Publisher: Cambridge University Press, Data & Policy

Research Publication date: February 2022

The hypothesis that corporate social responsibility (CSR) or corporate digital responsibility (CDR) can safeguard data appears to be a theory that is all carrot and no stick. In the truest definition of business success, what is the bottom-line benefit of CSR and CDR to profits? Why would corporations invest in safeguarding data and enforcing government regulations at the potential expense of their profit? And at such a critical juncture of human history, where every individual generates novel data every day – with their informed consent or otherwise – can (and should) such a responsibility rest in the realm of self-regulation by corporations that inherently profit off that data?

In this article, the authors explore the history and development of CSR and CDR, their common enforcement mechanisms, their purposes, and their relation to data responsibility while positing the need for a broader societal and comprehensive approach to genuinely implement responsible data use.

An inherent flaw of CSR and CDR has been their reliance on self-regulation and societal pressure, i.e., soft measures rather than government-enforced laws and regulations, to govern their appropriate application. We need look no further than environmental protection to see how optional, as opposed to mandatory, regulations can retard progress and allow private corporations to circumvent safeguarding the public good.

As a researcher and peacebuilder, I have grappled firsthand with the challenge of balancing the pressing need for rigorous research to craft viable and effective programming with protecting the dignity, rights, and well-being of individuals. The authors highlight the progress made to date by the humanitarian field in establishing and enforcing standards for data protection and use. In a chapter of the new book Wicked Problems: The Ethics of Action for Peace, Rights, and Justice, released by Oxford University Press, Liz Hume and I argue that the peacebuilding field requires minimum ethical research practices that establish global best practices and centralize the protection of vulnerable populations, including informed consent, institutional review boards and ethics committees, and data protection and privacy rights.

With great power comes great responsibility. Anyone, including a corporation, with the capacity to wield power over another requires incentives to protect and mechanisms to enforce those protections. A system with both the carrot and the stick helps establish standards and norms that keep us all safely in our lanes, make clear where the guardrails are, and determine when the harms far outweigh the potential benefits – to both individuals and society. Whether intentional or not, there are far too many examples of abuses of power committed for the sake of profit at others’ expense.

The authors make a compelling case for greater attention to and regulation of the digital realm. We cannot rely on the private sector alone to ‘do the right thing’ when it comes to protecting individuals’ data and privacy – we all have a major role to play. The digital realm is currently severely underregulated, and developing just and equitable rules for data use will require focused and concerted effort from purveyors, users, and generators of data. Pursuing a CSR or CDR strategy alone will not be sufficient to implement or enforce responsible data use – a fundamental right of all people.

AfP Blog Author: Jessica Baumgardner-Zuzik, Deputy Executive Director - Research & Finance