Thought Leadership | Strategy | Governance

Rethinking Civic Tech: Bridging Gaps with AI

Last year, I wrote about Building Tech That Bridges, Not Widens, the Gap. It was a quiet plea for our tech designers and builders to create things that actually move disadvantaged people closer to opportunity, rather than pushing them further out of view.

Here I build on that reflection with a lens specifically on civic tech: where it came from, what it achieved, how it slowed, and why AI has dragged it back to the table in a new and more complicated form. Because if “ethical tech” is going to be more than a slogan, we need to ask a simple, stubborn question over and over again:

“Who do we serve?”

And just as importantly:

“Can the people we claim to serve actually speak for themselves, whether in, through, or against the systems we are building?”

How Civic Tech Started (And What It Got Right)

Civic tech emerged in the late 2000s and early 2010s as a broad, messy, hopeful movement: people using digital tools to strengthen the relationship between citizens and their governments.

It was not just about building apps or fancy web pages. It was about:

  • opening up budgets and contracts,
  • tracking public projects,
  • reporting potholes or broken streetlights,
  • making it easier to contact your representative,
  • and visualising data so ordinary people could see where money and power were actually flowing.

In the UK and the US, for example, organisations like mySociety built tools such as FixMyStreet and TheyWorkForYou, while Code for America seeded volunteer “brigades” and digital services that helped people access food stamps or clear old criminal records.

In Kenya and Nigeria, you saw tools like Ushahidi (born in Kenya) crowdmapping election violence and crises, and BudgIT in Nigeria helping citizens track budgets and monitor public projects through platforms like Tracka. So the early wave of civic tech made invisible systems visible (budgets, spending, project locations), and gave citizens new entry points to participate or complain. What’s more, it proved that small teams, with modest funding, could unlock real accountability value.

It was not perfect, though. It typically skewed towards urban, connected users, and many projects were more exciting to funders than to the everyday people who were meant to use them. BUT something important was happening: we were experimenting with what digital power for citizens could look like.

Then the Funding Slowed, and the Shine Wore Off

Fast forward a decade, and the narrative got more complicated. Civic tech was once described as being in its “unruly teenage years”: past the initial excitement, and wrestling with sustainability, politics, and scale.

Since around 2019, here are a few things we experienced and learned:

  • Funding fatigue: In places like Nigeria, as foreign aid receded, civic tech organisations faced an existential funding crisis. Many relied on grants to survive, and as donors shift to other priorities (or to “bigger tech” agendas), the ecosystem is under real strain.
  • Government pushback and co-option: Some governments started running their own “civic tech portals” while simultaneously shrinking civic space, raising real questions about what counts as genuine participation.
  • Who actually benefits? Research from mySociety and others showed us that the users of many civic tech platforms tend to be already educated, connected, and empowered, and not the people most excluded from decision-making: the very people we were trying to bridge to.

At the same time, even the flagship organisations had to adjust. Code for America, for instance, has described how resources for civic tech have slowed, even as the need for human-centred digital government has grown. In short: the movement matured, and with maturity came questions we can no longer ignore.

AI, Agents, and the Return of “Tech for Good”

Enter the new wave: AI, automation, and AI “agents.”

Suddenly, we are back at it: governments and civic actors are experimenting once again. With this wave we are starting to see:

  • Chatbots helping residents navigate public services, find information, or report issues more easily. Local experiments include an AI chatbot in South Africa designed to help people report broken infrastructure (like water pipes) that affects service delivery in villages.
  • International agencies like UNDP actively exploring how AI can make public services more proactive, targeted, and responsive — while warning that AI can also deepen digital divides if it’s not designed with vulnerable communities in mind.

This new wave also brings uncertainty, arriving just as global debates on AI ethics, data governance, and surveillance are exploding:

  • AI-driven surveillance systems and predictive policing raise serious concerns about privacy, dignity, and human rights.
  • Researchers from the Global South are pointing out how AI systems often extract data from their communities without ownership, control, or benefit flowing back to them.

So we are in a seemingly strange situation:

Civic tech is being rebooted by AI, just as we are realising how easily tech can be used to control, surveil, and extract – not liberate.

Which brings us back to the question:
Who do we serve?

Back to Basics: Designing for Everyday Lives, Not Just Elegant Demos

If the history narrated above can be described as civic tech 1.0, and if it was about “open government,” then I believe civic tech 2.0 — especially in a world of AI — needs to be about everyday dignity.

No more dashboards for donors or cool prototypes for conferences. Instead, can we build real tools that change how it actually feels to be a person in an under-resourced community trying to navigate daily life?

Some concrete directions we could go in:

  1. Design for the most ordinary, stubborn problems 

Whether it’s an informal worker or SME owner trying to find out where to safely trade; a farmer needing hyper-local weather and price information to plan; or a parent trying to quickly access accurate health information in their own language — these are civic problems too.

We already have partial examples:

  • BudgIT’s Tracka helping communities monitor local projects and push contractors and officials to complete them.
  • Chatbots in South Africa being tested to help residents report service delivery failures quickly, without needing to navigate complex bureaucracies.
  • Civic tools in Asia and the Pacific that use mobile and messaging apps to bring citizens into local decision-making processes where internet access is patchy and devices are basic.
These tools matter because they reduce the distance between problem and response, and save time, money, and frustration for people with very little of all three. Our new civic tech tools need to make power and process legible in people’s own languages and realities.

The prompt for our civic tech creators here is simple:

Start with the queues, the rumours, the forms people hate, the information they can’t find — and solve that.

  2. Build feedback in as a right, not a feature

If data is being extracted from communities, those same communities should have:

  • visibility into how their data is used,
  • a say in how systems are designed,
  • and the ability to correct, resist, or withdraw.

Practitioners in our climes have been clear: AI ethics conversations that ignore lived realities in Africa, Asia, Latin America, and the Caribbean are incomplete at best, and harmful at worst.

For civic tech, this means we need to:

  • Co-design tools with the people we claim to serve, rather than just consulting them at the end.
  • Create low-tech feedback channels (SMS, voice, radio call-ins, WhatsApp groups) alongside digital platforms, so people can challenge and shape systems in real time.
  • Be explicit about consent and control: where does the data go, who else sees it, what recourse exists if the system misuses it?

Note: This isn’t just an ethical add-on. It’s a design advantage. Tools built with people tend to be used, defended, and improved by those same people.

  3. Think intersectionally: civic tech is not only “for techies”

One of the painful lessons of the first civic tech wave is that a lot of projects lived in a narrow corridor: policy wonks + coders + donors.

The next wave needs a much broader cast of non-traditional collaborators like:

  • community organisers and savings groups,
  • cooperatives, unions, and informal worker associations,
  • disability rights advocates and gender justice organisers,
  • artists, storytellers, designers, radio hosts,
  • local linguists and cultural custodians,
  • and social workers, extension workers, health workers, and teachers.

The OECD pointed out last year that emerging technologies for participation only work when they are embedded in real civic cultures and institutions. As designers and implementers of civic tech 1.0, we know that tech alone doesn’t transform governance; people, relationships, and organising do.

So if you are a tech creator or AI innovator, some practical prompts:

  • Whose work does my tool make easier? (nurses, market women, teachers, paralegals, youth organisers?)
  • Who already holds trust in the community, and how can they be co-owners of this tool?
  • Am I designing for a grant report, or for someone’s typical afternoon?

If you are not a technologist — you’re a community builder, researcher, artist, educator, spiritual leader — your work absolutely intersects with civic tech:

  • You know where misinformation travels and why.
  • You understand who is left out of town halls, budget meetings, or digital portals.
  • You see the patterns of quiet harm before they show up in data.

Civic tech without you will repeat the mistakes of the last decade.

Ethical Tech: Not Just “Do No Harm”, But “Who Do We Serve?”

Current AI ethics frameworks typically focus on bias, safety, and transparency. All of that is important. But for those of us in and from under-resourced contexts, there is an important and often overlooked layer: power and purpose. For this reason, researchers working on ethical AI from a Global South lens are pushing for frameworks that start from communities’ own priorities, not just from Western institutions’ risk lists.

So the civic tech + AI question in our contexts cannot be limited to safety and fairness. It must also be:

  • Does this system shift power or concentrate it?
  • Whose labour, data, and time are we extracting — and who benefits?
  • Can the communities affected speak back to it and reshape it?

“Ethical tech” that doesn’t change material realities for those at the sharpest edges of inequality will remain, at best, a branding exercise.

A Call to the New Civic Tech Builders (and Their Unlikely Allies)

We are in a moment where:

  • Civic tech is searching for its second act,
  • AI and agents are rushing into the public sector,
  • and communities are still fighting for basic things: water that flows, clinics that work, schools that teach, streets that are safe.

This is not the time for either naïve optimism or comfortable cynicism. It is a time for intersectional, grounded collaboration. If you are building tech, ask yourself regularly:

Who do we serve?
Can they see themselves in this system?
Can they shape it?
Can they say no?

If you are working in community, governance, philanthropy, media or movement spaces, consider this an invitation:

Your work already intersects with civic tech.
You are not a “user persona”; you are a co-architect.

The tools we build now — especially with AI — will either quietly deepen the gap or deliberately bridge it. My hope is that we choose the latter, together.

Yop Rwang Pam
