Causality Lost
W.B. Yeats: "The best lack all conviction, while the worst are full of passionate intensity."
Before we start I have two things to say. The first and most important is that this article is dedicated to my longtime friend and occasional colleague, who passed away very recently and completely unexpectedly. He was a fantastic software developer, chronically curious, with a mind that was somehow both full and wide open at the same time. I will miss him, his intelligence, his ability to overlook my oh-so-many flaws, and his ideas, which came in two flavors: Brilliant and Batshit. The second thing is that this article felt like it needed a playlist. Here it is. I hope you enjoy... well, all of it, I suppose.
Once upon a time–in a galaxy far, far away–causality was the bedrock of civilization—a shared understanding that actions had predictable consequences. Or at least a willingness to accept the causal relationships between two temporally distinct but causally coupled events. Trust in institutions like the media, the political system, and the judiciary ensured that societies could function cohesively, even in times of disagreement. But today, this foundation has crumbled. Truth itself has become a contested space, leaving us adrift in chaos. Everyone is invited to, or feels that they are expected to, "do their own research," to cobble together a subjective reality based on Machiavellian fantasies in a post-truth world.
For us as software developers, this era–in the author's usual humble opinion–presents unique risks. Our reliance on logic, often considered a strength, can blind us to the complex, chaotic forces shaping the world—and our work. Worse still, it may lead us to unwittingly participate in perpetuating this chaos.
This isn’t merely an abstract philosophical problem.[1] It’s a tangible crisis that has infiltrated democracies around the world, from the rise of authoritarianism to the dissolution of shared truths. To understand how we got here, we must examine the deliberate erosion of trust in institutions and the insidious ways it reshapes society—and, inevitably, the systems we create.
Veterans of The Truth Wars
In a 2019 essay for Foreign Affairs, journalist and author Georg Diez chronicled the breakdown of rational discourse in Germany. Following far-right riots in the city of Chemnitz, where mobs attacked refugees and chanted neo-Nazi slogans, the facts of the incident were well-documented. Videos showed individuals being chased through the streets, and eyewitness accounts corroborated these scenes of violence.
Yet Hans-Georg Maassen, the head of Germany’s domestic intelligence agency at the time, dismissed the evidence. He publicly questioned the authenticity of the videos, suggesting they were fabricated to manipulate public opinion. Maassen, as Diez noted, had no evidence to support these claims—only the rhetorical tools to sow doubt. His denial was echoed by other politicians, who shifted the conversation from the events in Chemnitz to debates about media reliability.
This deliberate obfuscation is emblematic of the "war on truth" that defines our age. Diez described it as a "post-truth hall of mirrors," where facts become fungible and accountability evaporates. Without a shared reality, there can be no responsibility—only endless debate over what is "real."
Meanwhile, Nina Jankowicz, in a 2024 Foreign Affairs essay, highlights the alarming spread of disinformation in the United States. She describes how foreign operatives, domestic actors, and social media platforms have together created a polluted information ecosystem. Platforms amplify falsehoods for profit, while domestic politicians weaponize disinformation to stoke division. This virulent mix has paralyzed governments and eroded trust, leaving democracies vulnerable.
Disinformation is not a new phenomenon. During the Cold War, propaganda campaigns were systematically deployed to undermine enemy states, often with significant success. Today’s disinformation campaigns, however, have an unprecedented advantage: the internet. Social media platforms have transformed propaganda from a state-driven effort into an open marketplace of falsehoods. The democratization of deception means that anyone with a keyboard can shape reality, often with devastating consequences.
Case Studies From the Deluge
The intersection of technology and disinformation is fraught with an endless supply of ethical challenges. We are all aware–hopefully?–of the infamous case of Cambridge Analytica, which exploited Facebook user data to manipulate voter behavior during the 2016 U.S. presidential election. The scandal exposed how digital platforms, designed to connect people, could be weaponized to polarize societies. Algorithms optimized for engagement unwittingly amplified divisive content, creating echo chambers that hardened ideological divides.
Twitter offers another illuminating example. Before its acquisition by Elon Musk in 2022, the platform made strides in combating disinformation by implementing transparency measures and fact-checking systems. Yet even these efforts were imperfect, as the rapid dissemination of false information often outpaced attempts to correct it. After Musk’s takeover, these measures were rolled back, illustrating how corporate priorities can override societal responsibility.
On the flip side, there are success stories. During the 2020 U.S. presidential election, the Cybersecurity and Infrastructure Security Agency (CISA) worked with tech companies to identify and dismantle foreign disinformation campaigns. Christopher Krebs, the agency’s director at the time, declared the election "the most secure in U.S. history." Yet his dismissal by then-President Trump highlighted the precariousness of such efforts in a politicized environment.
The Unknown Knowns And So On
Philosopher Slavoj Žižek’s concept of "Unknown Knowns" provides a useful framework for understanding these challenges. Unknown Knowns refer to the assumptions and biases we unconsciously adopt—the invisible frameworks that shape our decisions. Like fish unaware of water, we operate within systems of thought that we rarely question.
Software developers are especially susceptible to this. We pride ourselves on logical thinking, yet our work is often guided by unexamined premises. Consider the algorithms we write: designed to be "neutral," they frequently reinforce existing inequities because they reflect the biases baked into their training data. Or think about the way we optimize for efficiency, often overlooking the social costs of our designs[2].
This is particularly dangerous in the context of AI and machine learning. These systems, often heralded as paragons of objectivity, are deeply shaped by the data they’re trained on. If the data reflects societal biases, the outputs will too—sometimes in ways that are difficult to detect or challenge. The result is a feedback loop where systemic inequities are amplified under the guise of "neutral" technology. Since some research has been published on bot usage on major social media platforms, I will return to this subject in a future article.
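Purely as an illustration of that feedback loop, here is a toy simulation, a sketch and nothing more: the group labels, the initial 55/45 split, and the squared weighting are invented for the example and are not drawn from any real platform. The point is only that a ranking rule which rewards past engagement superlinearly will grow a small initial skew all on its own, with no one ever writing "prefer A" anywhere in the code.

```python
# A toy, purely illustrative simulation of a "neutral" engagement feedback loop.
# The 55/45 split and the squared weighting are assumptions for this sketch,
# not a description of any real recommendation system.
import random
from collections import Counter

random.seed(42)

# Content pool starts with a mild imbalance: 55 items from group A, 45 from B.
engagement = Counter({"A": 55, "B": 45})

for generation in range(10):
    # The ranking step favours whatever already performed well, superlinearly
    # (weights proportional to the square of past engagement).
    weights = [engagement["A"] ** 2, engagement["B"] ** 2]
    shown = random.choices(["A", "B"], weights=weights, k=100)
    engagement.update(shown)  # today's impressions become tomorrow's "signal"
    share_a = engagement["A"] / sum(engagement.values())
    print(f"generation {generation}: share of A = {share_a:.2f}")
```

Run it and watch the share of A creep upward generation after generation; nothing in the code expresses a preference, yet the initial skew is steadily amplified.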
Causality Dies In Darkness
While developers wrestle with these epistemic blind spots, democracies face their own existential threats. Diez’s account of German politics shows how authoritarian tendencies exploit chaos to consolidate power. By denying documented events, figures like Maassen not only erode public trust but also normalize the idea that reality is subjective.
Jankowicz paints a similar picture in the United States, where disinformation has been weaponized to undermine public trust. She recounts her time leading the Disinformation Governance Board, a short-lived attempt to coordinate anti-disinformation efforts. Despite its limited scope, the board became a lightning rod for partisan attacks, driven by baseless claims that it was a "Ministry of Truth." These attacks, amplified by social media, ultimately led to the board’s disbandment, marking a victory for those seeking to undermine efforts to counter disinformation.
Disinformation is not just a political weapon; it is a tool of control. Autocrats and extremists alike use it to sow confusion, divide communities, and delegitimize opposition. When truth becomes subjective, power is concentrated in the hands of those who can manipulate narratives most effectively[3].
Chaos Breeds Systemic Debt
The antidote to chaos is not certainty; it is responsibility. For software developers, this is a profound shift in mindset. We often view our work as solving abstract problems, operating in a domain where logic and technical elegance take precedence. Yet the systems we build are far from neutral. They are shaped, consciously or not, by the same societal forces that erode trust and by the biases we carry as individuals. To counter this, we must step beyond the confines of code and algorithms and see ourselves as active participants in shaping the broader ecosystem.
This shift requires a redefinition of what it means to be a developer. We are not mere builders of tools but architects of experiences, gatekeepers of fairness, and stewards of accountability. A search engine does not just index information—it determines whose voices are amplified. A recommendation algorithm doesn’t simply connect users to content—it decides which ideas gain traction and which are buried. These choices, often hidden in the machinery of our systems, have profound ethical and societal implications. Yes, I’m sure the edgelords have a lot of fun, but that posture is also the refuge of the coward, since everything is just a joke. Except the jokes aren’t funny. In fact they aren’t even on the same continent as funny. We should expect much better than that from ourselves, our colleagues, and not least our managers.
Responsibility begins with active doubt. It requires developers to adopt a posture of constant questioning—of the data we rely on, the assumptions we make, and the outcomes we prioritize. When integrating datasets, do we pause to consider the biases inherent in the sources? When optimizing algorithms for efficiency, do we stop to evaluate the societal costs of those optimizations? Active doubt compels us to resist the allure of simplicity and grapple with the messy, complex realities that our systems influence.
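To make that kind of active doubt a little more concrete, here is a minimal sketch of one question we could ask a dataset before training on it: does the representation of groups in the data match the population we claim to serve? The column name, group labels, and baseline shares below are hypothetical, and a real audit would involve far more than a single comparison.

```python
# A minimal sketch of auditing group representation in a dataset before using it.
# Column name, group labels and baseline shares are hypothetical placeholders.
import pandas as pd

def representation_report(df: pd.DataFrame, column: str, baseline: dict) -> pd.DataFrame:
    """Compare observed group shares in `column` against an expected baseline."""
    observed = df[column].value_counts(normalize=True)
    rows = []
    for group, expected in baseline.items():
        got = float(observed.get(group, 0.0))
        rows.append({
            "group": group,
            "expected_share": expected,
            "observed_share": round(got, 3),
            "gap": round(got - expected, 3),
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    # Toy data: 80% of the records come from "urban" sources, 20% from "rural",
    # while the population we claim to serve is closer to 60/40.
    data = pd.DataFrame({"region": ["urban"] * 80 + ["rural"] * 20})
    print(representation_report(data, "region", {"urban": 0.6, "rural": 0.4}))
```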
But responsibility doesn’t end with questioning. It demands proactive action. Developers must go beyond avoiding harm to intentionally build systems that promote accountability, fairness, and trust. This might mean designing transparency into algorithms so that their decisions can be scrutinized and explained. It might involve prioritizing inclusivity by testing for unintended biases or ensuring that marginalized communities are represented in the datasets we use. Responsibility is not a passive state; it is an active, ongoing process of engaging with the ethical dimensions of our work.
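And as one small example of what "testing for unintended biases" could look like in practice, the sketch below compares a model's positive-outcome rate across two groups, a simple demographic parity check. The predictions, group labels, and the 0.1 threshold are made up for illustration; which fairness metric is appropriate depends entirely on the system and the people it affects.

```python
# A small sketch of one possible bias test: compare the rate of positive
# predictions across groups (a demographic parity check). The data and the
# 0.1 threshold are invented for illustration, not an established standard.
from typing import Sequence

def positive_rate(preds: Sequence[int], groups: Sequence[str], group: str) -> float:
    """Share of positive predictions among members of one group."""
    subset = [p for p, g in zip(preds, groups) if g == group]
    return sum(subset) / len(subset) if subset else 0.0

def demographic_parity_gap(preds: Sequence[int], groups: Sequence[str]):
    """Largest difference in positive-prediction rates between any two groups."""
    rates = {g: positive_rate(preds, groups, g) for g in set(groups)}
    return max(rates.values()) - min(rates.values()), rates

if __name__ == "__main__":
    preds = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
    groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
    gap, rates = demographic_parity_gap(preds, groups)
    print(rates, f"gap={gap:.2f}")
    if gap > 0.1:  # illustrative threshold only
        print("warning: positive-outcome rates differ noticeably across groups")
```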
Moreover, responsibility requires collaboration. No developer works in isolation[4], and no system exists in a vacuum. The problems we face—bias, disinformation, inequity—are too large and too complex for any one person or team to solve alone. Addressing them demands cooperation across disciplines, with input from ethicists, sociologists, and impacted communities. It also requires transparency within our organizations: creating spaces where questioning assumptions and challenging decisions is encouraged, not penalized.
Perhaps most critically, responsibility is about envisioning and striving for a better future. What kind of society do we want our systems to support? What values should they reflect? These are not questions of technical feasibility but of moral clarity, and answering them requires courage. It means acknowledging our role not just as technologists but as citizens and custodians of a shared world. In the face of chaos, our greatest strength lies not in our ability to impose order but in our willingness to embrace responsibility and act with purpose.
So What Did We Learn On The Show Today, Craig?
Other democracies have taken varied approaches to counter disinformation. Nordic countries, for instance, emphasize media literacy education from an early age, equipping citizens to critically evaluate information. Australia has implemented robust legislation to hold tech companies accountable for harmful content, offering a model for regulatory frameworks.
These efforts underscore a crucial point: combating disinformation requires a multi-pronged approach. Governments, tech companies, and civil society must work together to create resilient information ecosystems. Developers have a critical role to play in this process by designing systems that prioritize transparency, inclusivity, and fairness.
Here’s the punchline: reality doesn’t debug itself.[5] And if we treat our work as neutral, we risk becoming pawns—or worse—in a system that thrives on chaos.
Now, you will have to excuse me. It is in my nature to joke even–and perhaps especially–when discussing hard problems like the not-slow-enough collapse of democracy as we know it. Unsurprisingly, beneath the jokes lies a solemn truth: the stakes are astronomical. Democracies falter, institutions erode, and we need to accept our responsibility to be a force for good, because the systems we create have real consequences. Our actions matter. If we don’t challenge the chaos, we will–wittingly or unwittingly–perpetuate it.
So, start with doubt. Doubt your choices. Doubt your assumptions. Doubt the idea that you’re immune to the forces shaping this chaotic age. And above all, take responsibility—not just for your work but for the world it helps create.
Because in this fight for truth, developers have a role to play. And the first step is seeing the water around us.
Footnotes
1. SURPRISE
2. If you’ve never pondered the water around you, congratulations—you’re either a fish or a developer who just pushed to production.
3. Turns out, ‘move fast and break things’ works just as well for democracy as it does for tech—except no one’s pushing a hotfix for that.
4. Although there’s always that one developer you wish would.
5. Debugging is like therapy: you have to admit the problem exists before you can fix it.