Public Interest Technology: Closing the Digital Trust Gap

Roland Alston, Appian
November 11, 2021


The benefits of digital transformation are potentially limitless, with nearly endless ways to keep us connected with technology that optimizes efficiency, cost savings and competitive advantage.

But here’s the thing: how do you mitigate the potential ethical risks and unintended consequences of mainstreaming emerging technology when, according to a recent study by global analytics firm FICO, almost two-thirds (65%) of senior executives can’t explain how specific artificial intelligence (AI) model decisions or predictions are made, and 73% struggle to get executive support for prioritizing AI ethics?

Which is a good segue to the timely theme of this year’s World Usability Day: Design of Our Online World: Trust, Ethics, and Integrity. This global event kicks off on November 11, 2021, with communities in technology, industry, government and more exploring ways to make technology easier to access and simpler to use.

The COVID-19 crisis accelerated the already blistering pace of digital transformation that now touches virtually every aspect of our lives. So, the challenge now is: 

  • How do we create trust in what analyst firm Gartner calls the age of hyperautomation?
  • What are the ethical implications of how we design technology?
  • How do we design for accessibility and inclusion to ensure that everyone can use our technology?

This is where Public Interest Technology (PIT) comes in. To get the most out of technology, it’s not enough to “build it and they will come.” Better, argue leading voices in a growing movement of public interest technologists, to take a human-centered approach to innovation, with strategies that invest in research, education, and ways of deploying technology that protect and benefit society. Which gets at something that matters a lot these days: mitigating poorly designed use cases that undermine consumer trust and amplify calls for technology regulation.

"Companies that are explicitly incorporating the public interest tech frame are beginning to see real value generated both in terms of their profits, but also in terms of the trust that they're generating with the critical communities that they're working with, showing up for, and engaging with through business relationships," says Michelle Shevin, a leading voice in the public interest technology (PIT) movement and a senior program manager of the PIT Catalyst Fund at the Ford Foundation.

"So, we think public interest tech is really core to a long-term ecosystem of building accountability and trust across constituencies and across customer relationships," says Shevin. “We think it's really core to business growth.”

“For example, as a business leader, you don't want to get 20 steps down the road of designing and deploying facial recognition technology that could soon be regulated out of existence because nobody thought to put guard rails in place, because nobody was consulting with the communities that might be most impacted by the technology."

If that sentiment makes you feel like rolling your eyes, don't. Global research from McKinsey shows we can improve usability by innovating around the three dimensions that people say matter most:

  • Improve trust in digital services by increasing privacy and security. About 44% of consumers surveyed don’t fully trust digital services.
  • Improve the user experience in digital channels by refining user interfaces (UX/UI). About 56% of dissatisfied users conveyed discomfort with digital UX/UI or lack of information about products and services.
  • Improve consumer experience by making all products and services digitally available. About 43% of consumers say they prefer digital for its convenience and availability.

In other words, people are conflicted about the acceleration of digital transformation and what it all means for the post-COVID world. But here’s the good news. Taking a human-centered approach to digital innovation can change how people look at technology—if it’s done with intention. Which brings us to PIT thought leader Michelle Shevin, who breaks down the PIT movement and how it prioritizes public trust, speeds tech adoption, drives tech equity, and anticipates regulation in the age of hyperautomation. The questions and answers have been edited for clarity and brevity.

We Need a New Kind of Technologist


The public interest technology (PIT) movement seems to be gaining momentum. You talked about that in a recent Fast Company article you wrote. You basically argued we need more technologists trained to understand the ethical, legal, and political ramifications of the technology they create. Talk about that and also about what you’ve called the public interest law framing of PIT.


Public interest law is a useful mental model to keep in mind. In the 1950s and ’60s, there was a great need for legal expertise that also understood the demands of the civil rights movement. At that time, funders started making big investments in legal defense funds, institutions like the ACLU, university law clinics, and pro-bono infrastructure at private law firms, for example.

Because public interest law is now a thriving field, it’s hard to imagine that this important infrastructure for the pursuit of justice needed to be intentionally built, but it did. And in that continued pursuit of justice, we see a big moment now to invest in public interest tech. This is partially about responding to the escalating damage of the move-fast-and-break-things era that has characterized much of digital transformation thus far.

So, it's become clear to stakeholders in philanthropy that we need a new framework to make sure technology is developed, deployed, designed, regulated, and used in a way that protects consumer rights and improves people's lives. So, PIT aims to be that new framework. And it's really a growing field that's made up of a different kind of technologist who tends to be more interdisciplinary and intersectional. 


In what way?


They might not be your classic computer scientist or programmer or developer. They could be journalists, they could be artists, they could be advocates and yes, they could also be coders and programmers. And it's this different kind of technologist that really expects and demands, you know, that technologies be created and used responsibly. And so you may have heard of the terms ‘responsible tech’ or ‘ethical tech’ or ‘tech for good’. Well, PIT resonates with all of those frames. It’s kind of a big tent. It's a movement, it's a field and it really is a home for technologists to call out where technology can better deliver services and contribute to solving big problems. 


So, PIT is really different from traditional computer science or data science expertise.


Increasingly, mainstream computer science and data science will recognize this, but traditionally they have not. Public interest tech centers the recognition that historically marginalized groups are most often harmed by technology. Data is shaped by structural inequality, and computing is not immune to power dynamics. Public interest technologists are people who have the knowledge and experience to make technology that advances values like justice and equity in addition to, for example, optimizing for the bottom line of the business: profit margin, cost savings, or efficiency. PIT is an extremely diverse field, as you might imagine. It spans job categories and is intentionally diverse in terms of people’s backgrounds. And because of that, it’s a group that better represents public interests in the U.S. and abroad.

People and Technology - in That Order


So, when you talk about equitable tech, you’re talking about making sure the technologies we develop really don't harm people or make the digital divide even worse than what it already is. But beyond social impact and doing the right thing, what’s the business case for PIT, and why should business leaders care about it? 


So, the private sector is a critical piece of the PIT ecosystem, and also a place where we think there's enormous potential for deepening commitments, expanding our notions of what accountability looks like in the private sector. And of course, a lot of technology development happens through private industry, which makes the private sector this really critical node. Simply put, companies that center public interest values in the way they hire talent, approach technology, and maintain technical systems will create stronger products and services. 


So, if I’m a business leader who cares about improving usability and driving innovation with a public interest mindset, what’s the playbook for doing that?


First, it’s essential to understand what PIT is, and how values like equity and transparency and accountability get prioritized in a business framework, right? So, we've seen anecdotally and at scale how transformative adopting a PIT mindset can be for businesses. You can look at companies like Twitter, and you’ll see that over the past year they've been hiring more and more public interest technologists.

We've also seen companies embrace PIT and generate real value not just in terms of profits, but also in terms of the trust they're generating with critical communities they're working with. We see these companies engaging with communities through business relationships. For businesses, this is core to building an ecosystem and winning trust across constituencies and across customer relationships.


But business leaders can be skeptical of terms like public interest technology. It can come across as just another way to say we need more tech regulation.


That’s an important distinction, because regulation is inevitable and important but has a reputation as a hard hammer that can inhibit innovation. But as a field and a set of approaches that work across sectors to center public interest values like equity and transparency and justice in the way we design and apply technology, public interest tech is core to driving your business growth and sustainability.

So, we talk about PIT as a way to be anticipatory of forthcoming regulations and standards and important changes in the environment as it relates to regulatory hurdles and compliance.

Sometimes It's Okay to Pump the Brakes on Innovation


You’ve said that one of the roles of public interest technologists is to help us build and deploy technology that’s more human-centric. What did you mean by that?


Yes, when everyone's excited about the next big innovation, PIT sometimes encourages us to actually pump the brakes and more closely examine how tech trends can and do harm marginalized communities, and how we can prevent that from happening. 

I read a recent FICO survey revealing that many executives are poorly equipped to address the ethical implications of using AI systems. For example, when asked about the standards and processes in place to govern AI usage, only 38% said their companies had data bias detection and mitigation steps in place. And just 6% said that they tried to ensure their development teams are diverse.

So, that’s a really disappointing snapshot of where we're at when it comes to IT governance and of course, hyperautomation governance. We've seen time and time again, how algorithmic bias has already caused real-world harm for marginalized communities, right? We've seen false arrests, increasing surveillance, increasing marginalization of folks who don't have access to systems that may require them to be machine readable or visible to AI systems. So that's really where public interest tech comes in.


Pragmatically speaking, what’s your advice for businesses that want to prioritize trust, diversity, equity, and inclusion as part of their digital transformation strategy?


So, my pragmatic advice is to consider the consequences of the technology you’re building and make every effort to build and use technology in a way that avoids harming marginalized communities. Consider adopting and incorporating frameworks that help center the values of trust and accountability the same way we prioritize cost savings, efficiency, profit, and speed.

PS: (This blog was originally published here.)