René de Baaij

From System to Meaning

Team development is about how a group of professionals learns to truly work together — not only in terms of tasks, but also relationally and reflectively. In this theme, we approach teams and teaming as psychodynamic and digital phenomena: what is visible in meetings, decision-making, and results is always connected to what is happening in the undercurrent and to the ways AI systems co-organise the work.

A psychodynamic perspective means recognising how unconscious patterns co-determine decisions. Who automatically steps into the role of rescuer, critic, or mediator? Who withdraws as soon as things become tense? Which topics remain neatly out of view? Behind such patterns often lie earlier experiences with leadership, organisations, or failure, which are replayed within the team. Conflicts, silences, or subtle power games are then not disturbances, but information: here the team touches something it does not (yet) dare to feel or discuss.

At the same time, teams increasingly operate in a field where AI co-determines information, rhythms, and choices. Access to data and tools becomes influence. Dashboards steer attention. AI suggests solutions and priorities, and may even appear to offer judgements or scores. Teaming thus also becomes learning to collaborate with technology — and to remain critical of the assumptions embedded within it.

Team development in a HUMAN–AI context therefore requires teams to build sufficient safety to tolerate and discuss tension, uncertainty, and difference; to make their unspoken rules explicit, including the rules around data and tools (“here we trust the system, here we don’t”); and to consciously design how team and AI complement each other: what belongs to human judgement, what can be delegated to systems, and who ultimately remains responsible.

This theme invites teams to see themselves and their digital environment as a single whole. Team development then becomes transformation from the inside out: recognising patterns, creating role clarity, and using AI in ways that strengthen craftsmanship, responsibility, and trust.

Intervention

DBVP views team development as more than “better collaboration” or “clear agreements.” A team is a living system in which task, relationships, history, power, and technology continuously influence one another. Team development is therefore, for us, simultaneously a psychodynamic, systemic, and HUMAN–AI challenge.

Psychodynamically, we work with the undercurrent present in every team: who naturally becomes the spokesperson, who takes care, who explains everything away, who disengages, who subtly sabotages? Behind these movements often lie earlier experiences of rejection, success, failure, or lack of safety. We do not see cynicism, jokes “about nothing,” sluggishness, or conflict as disturbances, but as signals. We work with the tension rather than around it, because that is precisely where the learning potential lies.

Systemically, we never consider the team in isolation from its context. Its position in the organisation, the task, the mandate, and the dependencies shape the dynamics. We therefore look at role clarity and decision-making, at the connection with other teams, and at the relationship with leadership and governance. We intervene not only on behaviour, but on coherence: is the task right, does the composition fit, are boundaries clear, and is it evident who is truly responsible for what?

We add HUMAN–AI as a socio-technical reality. Much team work is driven by data, systems, and AI tools: planning, dashboards, analyses, risk models. Who has access to which information effectively determines influence, and the system often sets the pace and focus. We examine how technology shapes role distribution and attention. We use AI both as a tool to clarify patterns and scenarios and as a mirror: what do your tools say about what is considered important in this team?

How we intervene is consistent. We start from the team’s task: what are you here for, within the larger whole? We then work with concrete cases where tension is present — real decisions, real mistakes, real friction — so that development does not become abstract. We connect work and learning moments: the practice field is the daily work itself. We create a holding environment that is safe enough for honesty and sharp enough not to shy away from what is painful. And we continually place ownership back where it belongs: with the team; DBVP is a guide, mirror, and challenger. In this way, team development becomes not “a nice day together,” but a process of transformation from the inside out, visible in task maturity, mutual honesty, and mature engagement with AI and systems.

Methodological considerations

At DBVP, we see team development as working with a living system of task, relationships, history, and technology. The starting point is always the task: without a shared purpose, “collaboration” quickly becomes a hollow catch-all term. We work from transformation from the inside out: patterns only change sustainably when emotions, loyalties, roles, and structure are all brought into view. Psychodynamically, we look at the undercurrent — who carries tension, who rescues, who fights, who freezes — and treat conflict and silence as information. Systemically, we examine position and context: task, mandate, boundaries, dependencies, and the relationship with leadership and other teams. And we explicitly include AI and systems as part of the team field: dashboards and tools partly determine who has influence, what appears to be “true,” and which pace becomes dominant.

Typical methods and techniques

We begin where the team already is: in the work. We observe regular meetings and reflect back what happens, including what is not being said. We work with concrete cases — decisions, mistakes, complaints, successes — and make visible how role patterns form in real time: who speaks, who remains silent, who names the unnameable, who diverts attention away from tension.

We then make the undercurrent explicit and workable. We normalise themes such as trust, shame, guilt, rivalry, and loyalty as part of team dynamics, and explore recurring reflexes: avoiding, rescuing, fighting, escalating complaints upward, becoming cynical. Teams practise “listening beneath the words”: what is being avoided, and what price does the team pay for that?

We then bring in systemic ordering and role clarity. We clarify the team’s task, mandate, and boundaries within the organisation, and work out who is truly responsible for what — making implicit expectations explicit. Often, the biggest breakthroughs lie in small structural interventions: who attends which meetings, how decisions are made, which escalation routes are legitimate, and where the real dependencies lie.

In parallel, we carry out HUMAN–AI–specific interventions. We facilitate team conversations about systems: which data steer us, what are we missing, who sees which information, and what does that do to influence and recognition? We examine situations in which AI or system outputs clash with professional or moral judgement, and make explicit how the team wants to shape the human–system division of tasks: what do we consciously keep with ourselves, and how do we safeguard responsibility?

Finally, we embed learning in the line through short learning loops. After significant moments — crisis, escalation, success — we facilitate immediate reflection, so insights do not evaporate. We practise tension-filled conversations in the real work, not only in role play. And we involve the leader as part of the team field: not as an external “boss,” but as an actor who co-shapes the system.

Throughout, the principle holds: DBVP brings framework, sharpness, and safety; the team brings itself, with all its friction and potential. Team development thus becomes an ongoing process of looking together, choosing together, and practising together — visible in task maturity, mutual honesty, and mature engagement with AI.