Mechanical Sympathy, Engineering Empathy
"You don't have to be an engineer to be a racing driver, but you do have to have Mechanical Sympathy" - Jackie Stewart
[ This text is very much the outcome of an internal discussion earlier today around context engineering and our tendency to ignore our own need for quality context when asked to solve problems. The fact that we can - occasionally - power through and deliver results in subpar conditions shouldn’t be the yardstick by which we measure other engineers, nor the tools we use. Instead, we should actively strive to ensure that we can provide clear and unambiguous context. We, after all, know how much it sucks not to have it. ]
The phrase mechanical sympathy was first made famous by Jackie Stewart, one of Formula 1’s great drivers. He used it to describe a driver’s ability to feel the car, to understand what it was trying to tell them, and to treat it with a kind of respect. Drivers who lacked this sympathy would over-rev engines, abuse brakes, and thrash gearboxes into early retirement. The sympathetic driver, on the other hand, could sense the car’s limits, work within them, and, paradoxically, go faster and further. It wasn’t simply mechanical knowledge, nor was it indulgent gentleness; it was awareness. It was harmony between human and machine, an understanding that performance and longevity came from cooperation rather than force. Being, to put it simply, simpatico across boundaries.
That idea eventually found its way into computing. Martin Thompson and others began using mechanical sympathy to describe software that worked with the underlying realities of hardware rather than against them. Writing code that is cache-friendly, that respects memory bandwidth, that avoids thrashing the instruction pipeline - these are not esoteric tricks but ways of listening to what the machine is telling us. To build systems that ignore these constraints is to grind the gears and ride the brakes until everything runs hot and slow. To build systems with mechanical sympathy is to recognize that hardware is not an infinite abstraction but a living environment with contours and limits.
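The access-pattern idea can be sketched in a few lines of Python. Both functions below compute the same sum, but the first walks the data in the order it is laid out, while the second strides across rows on every access - the pattern that, in a language with contiguous arrays, defeats cache prefetching. Python lists are not truly contiguous in memory, so treat this as an illustration of the pattern rather than a benchmark.

```python
# Illustrative sketch: cache-friendly vs. cache-hostile traversal order.
# Both loops produce the same result; only the access pattern differs.

def sum_row_major(grid):
    total = 0
    for row in grid:           # sequential: visit each row in layout order
        for value in row:
            total += value
    return total

def sum_column_major(grid):
    total = 0
    rows, cols = len(grid), len(grid[0])
    for c in range(cols):      # strided: touch one element of every row,
        for r in range(rows):  # then start over - poor spatial locality
            total += grid[r][c]
    return total

# 100x100 grid whose cells enumerate 0..9999
grid = [[r * 100 + c for c in range(100)] for r in range(100)]
assert sum_row_major(grid) == sum_column_major(grid)
```

In a cache-aware language such as C, where a 2D array really is one contiguous block, the row-major version can be dramatically faster for large data even though both loops do identical arithmetic - that gap is mechanical sympathy made measurable.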
But the landscape of computing is shifting. Increasingly, the “machine” is no longer a passive substrate we program against but an active partner we collaborate with. AI models, developer assistants, code generators, automated testers - all of these are agentic tools, capable of interpreting intent, producing artifacts, and even negotiating trade-offs on our behalf. Treating them as dumb engines misses the point. Just as we needed mechanical sympathy to unlock performance from cars and CPUs, we now need a comparable awareness of how to work with tools that act, decide, and respond in ways that feel almost human.
That’s where engineering empathy comes in. If machines require respect to reach their potential, people require it even more - and now our tools are joining that circle. Documentation, requirements, specifications, and development environments are all written for human minds, but they are increasingly also written for machines that parse, generate, and refine them. An ambiguous requirement doesn’t just frustrate a developer; it derails an AI assistant. A poorly structured knowledge base doesn’t just waste a colleague’s time; it cripples the retrieval pipeline of a code-completion model. If drivers could only perform as well as their cars allowed, and engineers can only perform as well as their context allows, then agentic tools can only perform as well as the clarity and respect we extend to them.
Engineering empathy, then, is about recognizing that humans and their tools are part of a shared system. Writing documentation with empathy means considering not just what a colleague will need in six months, but also how an automated agent will consume and interpret it. Creating requirements with empathy means removing ambiguity and contradiction so that both human engineers and machine assistants can act decisively. Choosing tools with empathy means evaluating whether they enable clear thinking and sustainable practices rather than whether they produce flashy but fragile results.
The effects are not just about efficiency, though those certainly follow. They are about creating conditions where people and their tools can complement one another rather than collide. When empathy informs how we structure our processes, mistakes become rarer, frustration diminishes, and the systems we build reflect a coherence that scales across humans and machines alike. Sympathetic drivers extended the life of their cars and earned better results; empathetic engineering cultures extend the creative lifespan of their people while enabling agentic tools to reach their real potential.
Technology has always tempted us to think in absolutes: the machine as pure logic, the human as pure judgment. But those lines blur more every day. Our systems are built not only on silicon and syntax but on human interpretation and on machine assistance. Mechanical sympathy showed us that to master performance we must align with the physical truths of our hardware. Engineering empathy shows us that to master engineering itself we must align with the cognitive truths of our teams and the operational truths of our tools. Neither sympathy nor empathy is indulgence. Both are forms of respect, rooted in the understanding that working with the grain produces better results than working against it.
In the end, the lesson is simple. Machines deserve our sympathy, people deserve our empathy, and the agentic tools that bridge the two deserve thoughtful partnership. When we grant all three the conditions they need to work with us rather than against us, the systems we design become not only more efficient but also more sustainable, more humane, and ultimately more powerful.