The European Union’s (EU) forthcoming Corporate Sustainability Due Diligence Directive is a step in the right direction, but a number of improvements are needed to ensure technology companies do not escape accountability for their role in human rights or environmental abuses, says analysis by the Business and Human Rights Resource Centre (BHRRC).
The risks posed by tech firms to human rights and the environment range from the use of forced labour and conflict minerals in supply chains, to the deployment of discriminatory and opaque algorithms in employment decisions, and the invasion of privacy via predictive analytics or technologies such as facial recognition.
The proposed directive is the first attempt to mandate comprehensive human rights and environmental due diligence in the EU, and will force companies to identify, prevent and mitigate any actual or potential risks that arise throughout their operations or value chains.
The BHRRC has said the directive is likely to have broad global implications through the “Brussels Effect” – whereby multinational corporations will often adopt European regulatory standards in order to simplify their operations and supply chains, even when they are not compelled to do so. This means “it is critical to ensure the directive is sufficiently well designed to improve responsible practice” throughout the tech sector, said the organisation.
Although a number of international frameworks already exist to control the behaviour of multinationals – including the United Nations Guiding Principles on Business and Human Rights (UNGPs), the updated Organisation for Economic Co-operation and Development (OECD) Guidelines for Multinational Enterprises, and the International Labour Organization’s Tripartite Declaration of Principles Concerning Multinational Enterprises and Social Policy – they are all voluntary and non-binding.
Although the directive uses the UNGPs as its foundation, and will include additional administrative penalties or civil liability where companies fail to meet their obligations, the BHRRC said a number of changes are needed to effectively transform the tech sector’s response to human rights abuses.
It said the sector has long evaded accountability and, all too often, the burden of proof about its abuses rests on the victims, rather than the companies perpetrating them. “The burden of proof should be with the company to demonstrate it has acted lawfully on due diligence, rather than on the victim to show that the company has not,” said the BHRRC.
To support greater human rights and environmental protection in the industry, as well as shift the balance in favour of victims, the BHRRC has identified a number of key areas where the directive can be strengthened.
The first is to widen the scope of companies and sectors caught by the regulation, because under the current draft, many high-risk tech companies, including those that provide surveillance or facial recognition software, would fall outside the directive’s ambit.
“Unfortunately, neither technology nor digital industries are included in the list of ‘high-impact sectors’, which is a critical oversight,” said the BHRRC’s analysis. “The directive will be far more effective if these sectors are included, which would ensure substantially more tech companies are required to perform at least some human rights and environmental due diligence.
“However, even for those sectors currently included in the directive’s ‘high-impact sectors’, companies are only required to identify and address their severe impacts ‘relevant to the respective sector’, rather than to undertake a broad, risk-based approach to due diligence contemplated by the UNGPs.”
Another issue is that the directive does not encompass tech companies’ full value chain because the due diligence obligations are currently limited to “established business relationships”.
The BHRRC said this “allows for impunity in the face of harmful supply chains” because it does not take account of the fact that business relationships in the tech sector, despite their often transient and sporadic nature, can have major human rights implications.
“For instance, a major tech company may be contracted to develop code at different points that will form part of a comprehensive worker surveillance tool, impacting gig and service workers, with implications for their welfare,” it said. “But the developer company may well not define its relationship with the buyer – which ultimately produces the comprehensive worker surveillance tool – as an ‘established business relationship’.
“Similarly, technology may be sold to a government through a single contract while the company continues to upgrade or troubleshoot said technology, without this qualifying as an ‘established business relationship’ under the directive. In line with the UNGPs, the directive should focus on the principle of severity of risk, rather than the longevity of a business relationship to guide the due diligence requirement.”
Following on from this, the BHRRC said the directive also needs to transition from characterising stakeholder engagement as an optional element in the process of identifying and addressing human rights risks, towards making it “unequivocally required”.
It added that human rights defenders, vulnerable or marginalised groups, and technical experts should all be explicitly included as key stakeholders, given the relevance of their experience and knowledge of the tech sector’s negative rights impacts.
Other areas of improvement suggested by the BHRRC include amending the complaints procedure so that a wider range of actors can use it, and removing the broad range of exceptions and mitigating circumstances that “allow reckless tech companies to sidestep their responsibilities”.
As it stands, for example, tech companies will not be liable for damages or harms caused by the activities of an indirect partner with whom they have an “established business relationship”, as long as they have taken contractual measures to cascade compliance through their value chains.
“For tech companies, which have constantly changing impacts, this could function as a way to avoid consequences through superficial inclusion of contractual clauses and third-party verification, leaving harmed individuals and groups without redress,” said the BHRRC.
In August 2021, Amnesty International claimed that major venture capital (VC) firms and accelerator programmes involved in funding and developing technology businesses had failed to implement adequate human rights due diligence processes, meaning their investments could be contributing to abuses around the world.
Of the 50 VC firms and three accelerators surveyed, only one – Atomico – had due diligence processes in place that could potentially meet the standards set out by the UNGPs.
“Our research has revealed that the vast majority of the world’s most influential venture capitalist firms operate with little to no consideration of the human rights impact of their decisions,” said Michael Kleinman, Silicon Valley director of Amnesty Tech, at the time. “The stakes could not be higher – these investment titans hold the purse strings for the technologies of tomorrow, and with it, the future shape of our societies.”
The EU is also taking forward a separate directive to improve gig economy working conditions, which, if passed, would reclassify millions of people working for platforms such as Uber, Deliveroo and Amazon Mechanical Turk as workers, rather than self-employed, thus entitling them to a much wider range of rights and workplace protections.