Blockchain/AI/IoT | May 13, 2024

Why VC investors need a 1.5°C goal for responsible tech

Paul Fehlinger and Johannes Lenhard


Like the 1.5°C goal for climate, responsible tech needs a clear North Star to guide investments and innovations. To build an ecosystem of responsible technology and long-term value, venture capitalists and limited partners need to align on processes, incentives and alternative investment models that go beyond the high-risk “move fast and break things” mentality of the 2010s.

Powerful emerging technologies like artificial intelligence, quantum computing, and extended reality offer tremendous potential for economic and societal progress. However, they also come with high levels of risk and uncertainty if deployed irresponsibly. Recent scandals in the technology and venture capital sphere, from the implosion of FTX to the downfall of WeWork, have laid bare alarming deficiencies, especially in the governance of rapidly scaling startups.

Broader societal concerns are also mounting regarding market consolidation among tech behemoths and the negative impacts of certain tech-driven business models on democracy, mental health, and socio-economic stability. Geopolitical competitive forces risk further accelerating the development of transformative technologies like AI without due ethical consideration. 

With over $45 billion invested in European startups in 2023, and over $350 billion in the US alone, the sheer scale of venture capital flowing into the next generation of technologies is staggering. Decisions about how this money is deployed, and what is expected of the business models it finances, will define our digital future.

Governing emerging technologies remains exceedingly challenging. David Collingridge described this "dilemma of control": it is nearly impossible to properly assess the risks and impacts of a technology until it becomes deeply embedded across society, at which point it is very difficult to change course. This is what happened with social media, driven by giants such as Facebook and Twitter, and later Snap and TikTok. Only now, 12 years after Facebook's IPO, prompted by whistleblower accounts, congressional hearings and court cases, are we seeing more clearly the negative impacts Facebook's products have had on users' mental health and even on democracy.

Attempting to control a technology's trajectory at the stage Facebook has reached has proven frustrating, even for regulators and governments. While heavy-handed bans on innovation would be unwise, the inherent uncertainty of pioneering novel technologies means ethical considerations often become an afterthought to boundless growth. This "move fast and break things" mentality still fails to adequately consider broader societal impacts, despite the techlash movement and countless fines.

Lessons from health and climate 

Historically, ethical standards have emerged in other fields, such as medicine. Since the ancient Hippocratic oath, medical ethics has evolved through eras of discovery, debate, and defined red lines such as human cloning and gene editing. Similarly, in climate policy, it took four decades of research for the UN Intergovernmental Panel on Climate Change to establish the 1.5 degrees Celsius global warming threshold. This North Star metric now guides investment and innovation through frameworks like science-based targets and regulations that penalize negative externalities and incentivize climate-positive innovation. For the first time, innovators and investors can earn more money with responsible climate technologies than with traditional polluting technologies.

But the accelerated pace of technologies like AI and quantum computing means responsible governance frameworks cannot wait for decades. Their rapid development necessitates responsive governance now – and not of the old kind. Ideally, we start at the earliest stage of a company's life cycle, before any damage is done and while the uncertain goals and impacts of new companies can still be more easily reined in and safeguarded.

Following the money up this chain takes us to venture capital investors (VCs) and their limited partners (LPs): the asset owners, such as pension funds, state funds, endowments and foundations, whose money VCs manage. VCs actively determine which emerging technologies and business models are able to scale through their selective provision of funding to startups. Meanwhile, limited partners wield tremendous influence over venture capital strategy and priorities through their capital allocation and the legal terms set in limited partnership agreements.

The standards, metrics, and incentives established by these two groups can have dramatic multiplier effects across the entire technology industry. The two groups not only influence which industries, sectors, business models and technologies receive funding; they can also influence how companies are built and scaled. It is here where we believe the biggest, quickest and most direct impact can be had to establish a framework of responsible technology. 

Responsible investing in tech

We propose a three-pronged approach to responsible investment in technological innovation, from north-star metrics and concrete practices to a new flywheel for capital allocation.

First, we need to define a north-star metric and a set of sub-factors contributing to this metric. In simple terms: what are good digital technologies and what are we optimizing for? What is the goal, beyond mere return maximization? To evaluate emerging technologies’ risks and uncertainties, establishing a ‘goal’ akin to the 1.5 degrees concept can align global action and stimulate a new generation of impact capitalism and entrepreneurship, as we see in the climate sector.  

Second, we need to understand how VC investors and the companies they invest in can translate such key metrics into action and everyday practice. An integration across functions – a full-body workout – similar to responsible business or ESG integration, needs to be tailored to the material issues in technologies like AI, quantum, and biotech. Both investment and company practices need to be aligned. 

Finally, while responsible investments in technologies must rest on lucrative business cases for returns on investment, we also see a need for alternative investment models, including innovative alternative exit models for venture investment, that incentivize patient, sustainable growth beyond the pressure to return 7-10x within seven years. Developing attractive alternative models for capital allocation can help move beyond the high-risk, hypergrowth "move fast and break things" mentality of the 2010s.

Retrofitting VC along these lines will be a long-term undertaking; starting a new flywheel at the same time will help accelerate the redirection of the tech ecosystem.

A new chapter

The window for constructive action is closing rapidly. Unlike in past epochs, the governance of AI, quantum, robotics, and other disruptive technologies cannot wait multiple generations or centuries. Existential risks are at stake. Choices made today will irrevocably shape our collective societal and economic future. This marks a pivotal turning point as emerging technologies transform productivity, social fabrics, and human-machine and human-nature relations.

Venture capitalists, limited partners, policymakers, entrepreneurs, corporations, and civil society must urgently come together to define the next steps of this agenda before it is too late. We believe in the power of the ecosystem to turn itself around. But it needs to happen fast – and it might involve disrupting itself. In theory, this is something VC should be good at. Let's prove it, for the good of society.

Paul Fehlinger is an affiliate at Harvard University and director of policy, governance innovation and impact at the Project Liberty Institute.

Johannes Lenhard is an affiliate at Cambridge University and co-founder and co-executive director of Venture ESG.