René de Baaij

From System to Meaning

Culture, leadership, and AI intersect precisely where inner world and outer world meet. In this theme, we explore culture development as a psychodynamic and digital question: not only what people say “the culture” is, but what unconsciously plays out in relationships, power, and loyalties — and in the ways digital systems and AI co-shape daily work.

Through a psychodynamic lens, we look at the undercurrent behind behavioural norms: shame and pride, fear of deviating, hunger for recognition, rivalry, and hidden loyalties. These dynamics show up in meeting language, decision-making, and informal codes — but also in the place AI is given. Technology quickly becomes a carrier of desire and fear: AI as norm (“this is how we do things here”), as scapegoat (“the system wants this”), or as weapon (“the data proves we are right”). In this way, AI becomes part of the social and psychological system — and thus part of culture.

Data and AI choices are therefore never neutral. What we measure and reward, which models we trust, who gains access to which insights — these are choices about what is considered valuable, who counts, and who fades from view. Culture development then calls for organisations that go beyond simply wanting to be “data-driven,” and are able to read what data and AI do culturally and dynamically: which norms harden, which voices fall silent, which forms of humanity disappear into standardisation — and where new space emerges instead.

Culture development in a HUMAN–AI context is therefore not an implementation project, but a process of transformation from the inside out. Teams and leaders learn to recognise patterns, tolerate tension, and make explicit how they want to connect humanity, responsibility, and technology. AI becomes both mirror and conversation starter: what does this system reveal about our assumptions, blind spots, and power structures? Culture development in the age of AI thus becomes, above all, the courage to face the undercurrent — in people and in systems — and from there to collaborate in a mature and humane way.

Culture development intervention

DBVP starts from the conviction that culture development is not a question of values or communication, but a matter of transformation from the inside out. Not: “how do we get the right words on the wall?” but: “how do we develop the capacity to act maturely as a system — with people and technology — even under pressure?”

We consistently work with three interlocking lenses. Psychodynamically, we look beyond visible behaviour to underlying drivers, fears, loyalties, and defence mechanisms. What makes an organisation deflect with jokes instead of speaking, moralise instead of inquire, control instead of trust? We take transference and countertransference seriously: how do leaders, teams, works councils, staff, and “the outside world” become carriers of projections — and how is technology drawn into this? Tension is not noise, but information: here lies something that cannot yet be tolerated or spoken.

Systemically, we always connect culture to the ordering of the work. Position, mandate, role clarity, and formal and informal power determine what becomes “normal.” That is why we work with the question: what does the task require, and how should the organisational design support that behaviour? We move simultaneously across multiple levels — teams, leadership, critical processes, and governance — so that culture does not remain stuck in intentions, but becomes visible in decision-making, collaboration, and daily routines.

We explicitly include HUMAN–AI as a socio-technical reality. Culture is partly written into dashboards, definitions, data fields, and algorithms: they determine what is visible, what appears “logical,” and what counts. We examine which assumptions are built into systems, when technology is a support and when it becomes a shield, and how moral responsibility remains with people, even when a system presents an outcome. AI thus becomes both a work tool and a cultural mirror.

How we intervene is consistent: we start from the purpose — what is this culture meant to serve? We work with real situations where friction is tangible, not with abstract values. We design trajectories in which culture, organisational design, and AI come together, and create a learning climate that is warm enough for honesty and solid enough for real shift. Ownership remains with the organisation: we are mirror, guide, and challenger — not the owner of the culture. In this way, culture development becomes not a campaign, but an ongoing movement of inner and systemic shift, visible in behavioural norms and in the way technology is used.

Methodological considerations

At DBVP, we approach culture development as working on real patterns of interaction in real work: right in the midst of tension, performance pressure, and digitalisation. Methodically, we start from purpose and connect it to transformation from the inside out: culture changes sustainably when the undercurrent, roles, context, structure, and systems move together. We treat culture as a relational phenomenon — expectations, projections, and organisational history are always part of it — and therefore integrate the personal, interpersonal, and systemic levels. We do not treat AI as an “additional theme,” but as a factor that co-shapes norms, visibility, decision-making, and moral considerations, and thus directly influences culture.

Typical methods and techniques

We begin with a deep exploration and clear contracting. In conversations with key stakeholders and teams, we explore the purpose, the dominant cultural codes, and the places where friction occurs. We formulate a shared development task that is directly linked to the strategic reality — so that this is not about “working together more nicely,” but about mature culture precisely where it is challenging.

We then work with living cases. Teams and leaders bring in situations in which culture becomes visible — conflict, mistakes, feedback that does not happen, decision-making that freezes, exclusion that happens “neatly,” or the tension around AI-driven assessments. We slow down around key moments: what happened beneath the surface, in relationships, in the ordering, and in the data? From psychodynamic and systemic hypotheses, we open up alternative courses of action that are sound both humanly and organisationally.

Where appropriate, we use group sessions as a practice field. In labs and workshops, the group itself becomes the material: silences, coalitions, norm-setting, and defence become visible and discussable. We practise tension-filled conversations, addressing issues, setting boundaries, and speaking up — and sometimes deliberately introduce AI outputs as a “third voice” to learn to distinguish: what is data, what is norm, what is responsibility?

We explicitly attend to parallel processes and role reflection. Patterns from the organisation often reappear in teams, in leadership, and in the relationship with DBVP; this is precisely where the learning material lies. Through role consultation, we shift from “what do you think?” to “what does your role, in this system, require to carry culture in a mature way — and where do you set boundaries?”

In addition, we carry out HUMAN–AI-specific interventions, always linked to real choices. Together, we analyse how systems shape culture: what do we measure, what do we reward, who falls outside the data, and which assumptions are embedded in definitions and models? We explore cases in which the system “says” something different from professional or moral judgement, and design ground rules: how do we work with AI without outsourcing responsibility?

Finally, where possible, we work in the moment. By observing critical meetings, holding short debriefs, and running short-cycle learning loops, we bring culture development back to the place where it needs to land. We prefer small, consistent shifts in the daily rhythm over grand declarations made during an off-site retreat.

Throughout, the principle holds: DBVP brings sharpness, language, and a holding environment; the organisation brings its stories, choices, and courage. Culture development thus becomes an ongoing process of maturation — visible in patterns of interaction, in decision-making, and in the way you work with AI.