AI and Change: What Are We Really Talking About?

Government and public administration carry a particular burden: they are the guardians of public value in a time of acceleration. Not only as regulator, but also as commissioner, user, and referee. AI touches the very fibres of the rule of law—equality, proportionality, transparency—and tempts us towards convenience. Governing here requires character: the ability to carry speed without losing the compass.
Legitimacy begins with intent. The model is not central; the central question is which public values we want to optimise, which people are affected, and who intervenes when it starts to chafe. That requires clear boundaries around what remains fundamentally human, and making visible how judgements are formed. In this way, doing justice gains a rhythm that can be followed.
Beneath that lies the infrastructure that makes public space possible. Open standards, shared data spaces, independent assessment, and real options to stop or switch give technology a proper foundation. Procurement becomes a carrier of values when explainability, verifiability, and reversibility are not an appendix but the starting point. Healthy countervailing power—including towards one’s own systems—is what demonstrates authority.
Governing is relational work. Central government, provinces, municipalities, and implementing organisations form a single fabric in which mistakes can spread, but so can learning. Fixed moments of shared reflection make results, exceptions, and moral friction visible. Not to point fingers, but to carry responsibility and, where necessary, to stop.
Public value is not created behind closed doors. Citizens must be able to understand, question, and contradict what systems do. That requires more than a register; it requires intelligible language, timely involvement, and a place where redress can genuinely begin. Democracy thus becomes a practice, not merely a procedure.
Professionals, public leaders, and supervisors need a new kind of literacy—one that does not require mastering every detail of the code, but does demand sensitivity to assumptions, data quality, bias, and model drift. Judgement is practised through case-based work and dissent, so that the human measure becomes a skill rather than a slogan.
Between experimentation and precaution lies the art of discernment. Where risks are small, innovation can be given room within clear guardrails. Where dignity or legal standing is at stake, slowing down deserves priority and countervailing forces are organised explicitly. In this way, speed gains a foundation that remains human.
What remains is an invitation to have the conversations where it truly rubs: about power, about pace, about responsibility that cannot be delegated to machines. Do we make visible where we accelerate, and name honestly where we slow down? In those choices, public administration shows its stature—transformation from within, with society as co-designer.
AI and Who I Am
AI is reshaping the day-to-day work of professionals—and the inner life of every person. It is not only tasks that shift; self-image and craft, the experience of time, and the way we carry responsibility move with it. We now work in a field where people, machines, and meaning continuously redraw one another—on the screen, in the team, and in the undercurrent of what we consider good and right.
For the professional, the rhythm changes. Routine may become lighter, but judgement becomes heavier. Models offer suggestions; you sign for the consequences. That both clarifies and tempts: it feels good when something thinks along with you, and unsettling when it starts to decide with you. The core of the work shifts towards shaping the relationship—between speed and care, between convenience and meaning, between what the model can do and what you are morally willing to do.

Skills take on a different cut. Alongside domain expertise comes model literacy: recognising assumptions, sensing data quality, questioning outputs, spotting drift. It is less about code and more about an intuition for context. At the same time, impoverishment lurks: if the system proposes more and more, you exercise your own muscle less often. Deskilling is not a technical risk but an identity risk: who are you when your tools appear to take over the thinking?
Autonomy becomes relational. AI becomes a colleague, a mirror, and sometimes an invisible foreman. Dashboards steer priorities, recommendations colour attention, and the pressure to comply is subtle but real. The question “am I allowed to deviate?” becomes more important than “am I allowed to use it?” Ownership then means being able to say why you do or do not go along—and carrying what that requires in time, explanation, and courage.
The emotional layer shifts as well. Professionals may feel relief (“finally, help”), shame (“can I still do it myself?”), mistrust (“based on what?”), and sometimes grief for a craft that is disappearing. That palette belongs to transformation from within. It calls for teams that do not smooth tension away, but make it workable—by normalising conversations about power, meaning, and boundaries.
The relationship with clients, students, or citizens shifts with it. Interaction may become more consistent, but also more impersonal when the basis of a judgement can no longer be explained. Trust remains human work. Explanation in plain language, clear routes for remedy when things go wrong, and explicitly marking decisions that remain fundamentally human form the backbone of legitimacy.
Boundaries are being redrawn. The data we leave behind as “professional exhaust” feeds systems that then assess us in return. Privacy and dignity are not side issues, but conditions for being able to think freely. The right to pause, to doubt, and to be incomplete—a measure of professional “opacity”—keeps work human.
Health lives in rhythm. Speed is seductive; attention is finite. Brief pauses for reflection, assumptions made explicit, and moments of dissent are not a luxury but hygiene. That is how technology becomes a foundation rather than a wildfire, and quality remains a virtue rather than a chart.
For the individual behind the professional, it ultimately comes down to compass. AI makes visible where we choose convenience over meaning, where we want to outsource responsibility, and where we find the courage to set boundaries. The question is not whether you use AI, but how you relate to it: curious and critical, open yet bounded, willing to learn—and willing to refuse.
Perhaps it starts here: dare to say out loud which parts of your work you never want to automate fully. Be clear about the values you want to protect when things need to move fast. And seek the conversation where it rubs—within yourself, with colleagues, and with those you serve. In that weave, AI finds its place as a tool with character, and human judgement remains a public good.


