Assessing the role and value of Big Tech – a sector with great growth potential, but also one dogged by privacy and security issues – can cause emotions to flare, debate to heat up and questions to arise. How can we determine whether it is an influence for good or ill from a sustainability perspective?
This article is part of our Investment Symposium Series, in which we present thinking on the big issues. For this series, we draw on our annual symposium. This is a core event where investment professionals at BNP Paribas Asset Management zoom in on the themes shaping the future. It is also a venue for high-level external speakers to cast a new light on the challenges of our time, testing our convictions and diversifying our thinking.
Here, Berenice Lasfargues reports on the comments by keynote speaker David Kaye from the University of California Irvine. Kaye is a professor of law, a former UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, and independent chair of the Board of the Global Network Initiative, in which BNP Paribas Asset Management actively participates.
Big Tech – two sides of the coin
On the one hand, technology provides open access, making information and content available to marginalised populations and empowering those who don’t have a voice. Tech is also spawning innovative solutions in healthcare and other industries. In that sense, tech could be said to contribute to social equality.
At the same time, Big Tech – the handful of global giants dominating the sector – has been tainted from an environmental, social and governance (ESG) perspective: some see it as an enabler of, for example, election tampering and invasive surveillance. There are other concerns – about market domination, favouring proprietary apps or functionalities over those of rivals, or even the large demand on power grids from their datacentres.
The power of leading tech companies can be seen in retail and the might of ecommerce giants, and in the linkages between the advertising and social media industries. Private sector tech innovations have driven economic, social and political change and disruption in areas ranging from telecommunications and digital security to food delivery and supply chain management.
When assessing Big Tech, professor Kaye cited struggles, trade-offs and principles. To a certain extent, we believe these mirror three key investor considerations: risk, opportunity and responsibility.
The struggles of complexity and trade-offs
Assessing the power of tech giants is a struggle for two reasons.
One is complexity. Complex software networks and sophisticated, sensitive hardware all require a significant leap of translation before policymakers and investors can understand them.
The second is trade-offs involving issues of public life, privacy, public health and national security. While governments and the public have identified online disinformation and hate speech as sources of public harm, any effort to address them can impinge on freedom of expression and access to information. Social media and other digital tools can be harmful to children. However, they can also expand children’s ability to learn, think critically and become fully cognisant adults.
It is clear that we need a framework for understanding how to make appropriate regulatory choices or, for that matter, satisfactory investment choices.
The right regulatory framework for investment
The UN Guiding Principles on Business and Human Rights were developed to address the obligations of governments – and companies – in the context of corporate impacts on human rights. 
The principles say that states must protect against human rights abuse, including by businesses. This means taking appropriate steps to prevent, investigate, punish and redress abuse. They state that companies have a responsibility to prevent or mitigate human rights harms, and have the capacity and the tools to do so.
Investors have used the principles as a framework to evaluate responsible human rights-related corporate behaviour, including an assessment of Big Tech.
Human rights due diligence
Professor Kaye highlighted that for a company, responsible human rights-related behaviour involves establishing policies that promote the prevention and mitigation of harms. It also involves incorporating ‘human rights due diligence’ at all stages, from product development to the end-customer.
This is not much different from policies on the environment or the climate: many governments require companies to undertake environmental impact assessments. So why not human rights impact assessments as well?
We believe investors can be an essential part of efforts to bring companies around to the idea that human rights due diligence policies can be valuable tools for responsible business activity.
Adopting a multi-stakeholder approach
So are we – as the public, as investors, as policymakers – willing to take steps to ensure that Big Tech acts responsibly? 
Advancing responsible company behaviour means that, for example, in the context of content moderation, hate speech and disinformation, companies are encouraged to apply the principles of freedom of expression in a transparent way that also provides remedies. Professor Kaye argued that society has the tools to ensure that Big Tech operates in line with the public interest. However, there is a risk that, over time, taking firm action will become harder.
Notes and references
See, for example, How 5G can be a force for social equality
See, for example, Investigation of Competition in Digital Markets
See, for example, Data Centres and Data Transmission Network
Under our Responsible Business Conduct policy, we include these principles in our due diligence and screening process. Read about our policy here
At BNP Paribas Asset Management, we believe stewardship entails direct corporate engagement and proxy voting, working with policymakers on key issues relating to sustainable finance and investment, and taking part in key investor ESG networks. Read more here