Philosophy
DBVP operates precisely where people, organisations, and the socio-technological system intersect. Where strategy, structure, and dashboards meet, the illusion easily arises that steering is mainly about the right buttons and the right language. But organisations are living systems. What is visible in plans, KPIs, governance, and processes is continually shaped by what plays out beneath the surface: fear and ambition, loyalty and power, hope and shame. If you don’t take that undercurrent into account, change quickly becomes cosmetic—logical on paper, unworkable in practice.
That’s why we don’t look for a quick fix, but for transformation from within. Not as a slogan, but as a way of working—in leadership, culture, teams, and professionalism. We take both a psychodynamic and a systemic perspective: we take the inner logic seriously (defences, projections, transference, loyalties) and we always read behaviour in relation to the whole (role and mandate, history, the field of forces, governance and oversight). We intervene in the midst of the real work—where it gets tense, where patterns repeat, where conversations are avoided or escalate, and where decision-making congeals. That is precisely where something can shift: from reactive to conscious, from control to ownership, from friction to mature collaboration.
In that same field of forces, AI takes a place you can’t dismiss as mere “tooling”. For us, HUMAN–AI is not a separate theme, but a lens that runs through everything. Data and AI increasingly determine who sees what, who is allowed to know what, who decides, and what comes to feel “logical” or “true”. Technology therefore also carries power, norms, and blind spots. AI amplifies what is already present: clarity and false certainty, speed and impoverishment, inclusion and exclusion. If you add AI without reading the undercurrent, you stack a new layer of rationality on top of old tensions; if you do take it in, technology can become both a mirror and an accelerator of mature choices.
Our position is clear: guide, mirror, and challenger. We bring language, structure, sharpness, and a holding environment—and we ask questions that aren’t always comfortable. Ownership stays where it belongs: with the organisation, its leaders, its teams, and the people themselves. Transformation from within cannot be rolled out; it demands the courage to face yourself and your system, especially when technology invites you to speed up and numb out.
That is the core of DBVP: slowing down together in the right places, so that people, the organisation, and AI can move into a next, more mature order—with less noise, and with steering that fits your intent and your responsibility.

Roles
In my work with DBVP, I move fluidly between roles, depending on the question, the developmental phase, and the field of tension an organisation is in. The core remains the same throughout: a psychodynamic and systemic perspective, in a world in which AI and technology play a full and integral role.
Whatever role I take on, one thing remains constant: I stand beside you, not above you — as mirror, guide, challenger, and at times as a programmatic anchor. Always with the same purpose: transformation from the inside out, in your leadership, in the organisation, and in the way you live and work together with technology.
At times, I work as an advisor and analyst, closely with boards, executive teams, or supervisory boards. In that role, I examine what is truly going on: not only what is being said, but especially what is being avoided. I connect structure, roles, and decision-making with the data and AI landscape and with the undercurrent of loyalty, fear, pride, and power. The result is a sharp diagnosis, reflected back in a way that opens the conversation rather than shutting it down.
In other trajectories, I act as a counsellor or executive coach for one or a small number of key individuals. In those cases, coaching and counselling intersect: we work on inner ordering (biography, patterns, moral questions) and on concrete situations of today and tomorrow. I am not a therapist, but an engaged, sharp conversational partner who helps realign role, inner leadership, and context — including the question: what do you leave to systems and AI, and where do you yourself need to remain attentive?
When the pressure for change is higher, my role shifts toward change manager or interventionist. Together with leadership, I design how the movement can take root: which conversations are needed where, which teams will carry it, and which patterns need to be seen first. Not steering through a blueprint, but through a programmatic change logic with rhythm, learning loops, confrontation, and anchoring. I bring structure and coherence; the organisation chooses and carries.
In larger trajectories, I sometimes explicitly take on programme management: not classical project control, but safeguarding the substantive line and the systemic perspective. I prevent fragmentation by connecting interventions: leadership to organisational inquiry, team interventions to strategy, AI projects to culture and ethics. In this way, a single narrative emerges: what is now, what can come later, and what we consciously leave aside.
In addition, I regularly work as an auditor or in a role close to oversight. In that capacity, I assess more sharply and independently: does the way of organising still fit the purpose? Are governance, culture, and technology in balance? Is the organisation psychologically safe and data- and AI-mature? Always with the mandate to probe what lies beneath the reports and presentations.
All these roles come together in a few clear areas of intervention: coaching & counselling (one-to-one and small groups), organisational inquiry & advice (sharpening perception and interpretation together), interventions and programmatic change (designing and guiding the movement at the heart of the work), and leadership programmes (where personal development, system, culture, organisational questions, and the place of AI converge).
