Internal AI guidelines for youth work

AI is used in many organisations, but often without shared guidelines, transparency or reflection. With OKJA Basel, we are developing a process that shows how internal AI guidelines can take shape: participatory, grounded in practice, and with room for ambivalence.

Many organisations are currently facing a similar challenge: AI is already in use, but often without clear guidelines. At the same time, there are uncertainties regarding the handling of sensitive data, transparency and specific applications. Can I use ChatGPT for emails? What about when communicating with clients? What about sensitive data? Do I have to declare my use of AI? And isn’t everyone already using AI anyway?

This situation was also evident at the umbrella organisation for open child and youth work in Basel-Stadt (OKJA). Together with a dedicated working group, we are supporting the development of internal AI guidelines.

The aim of the process is to develop clear, practical and well-considered guidelines for working with AI. These are intended to provide guidance without restricting work, whilst at the same time creating space for discussion and critical evaluation.

Why internal AI guidelines?

Waiting for instructions «from above» does not work. AI is already in widespread use, often informally and without a shared understanding. If this use remains hidden, not only is there a lack of critical reflection, but uncertainty quickly arises as to what is permitted and what is not.

This uncertainty can foster a climate of reticence or even mistrust within organisations. Employees do not know whether they are allowed to openly discuss their practices. This is precisely where internal guidelines come in: they make existing practices visible, provide a shared framework and enable open dialogue.

Furthermore, AI technologies are developing at a rapid pace, whilst the legal framework is lagging behind. Waiting for clear regulatory guidelines is therefore not a sensible strategy. Organisations are called upon to develop their own, context-specific solutions.

The role of the Dezentrum in the process

As a general rule, the most important step is simply to get such a process off the ground. This discussion can and should take place even without external support.

If desired, we at Dezentrum can contribute our experience from similar projects. This includes both expert analysis of current developments and support in shaping the process itself. We help to develop a meaningful structure, provide tried-and-tested frameworks for guidelines, and contribute examples and resources from research and practice. At the same time, we ensure that the process is tailored to the specific needs of the organisation.

What sets good AI guidelines apart

Good AI guidelines are, first and foremost, easy to understand. They are written in clear, simple language and can be quickly grasped and applied in day-to-day work. Complex or technocratic wording is of little help. At the same time, the guidelines should have a lasting impact. This means that they are not geared towards individual tools or short-term trends, but towards overarching principles. This ensures they remain relevant even as technologies evolve.

Applicability is key. Guidelines must be based on specific situations that actually occur in everyday life. In the case of OKJA, the connection to young people’s lived experiences is central: the perspectives and realities of young people form an important reference point for dealing with AI.

Process design: the case of OKJA Basel

The process begins with a joint assessment of the current situation. The first step involves understanding needs and uncertainties, establishing a shared knowledge base, and clarifying the process and objectives. At this stage, the format of the final output is defined, such as guidelines and, where appropriate, supplementary materials.

This is followed by a phase of iterative development. The working group collaboratively develops content, discusses specific use cases and gradually produces a first draft. This process is deliberately designed to be dialogical and draws on practical experience.

In a final step, the guidelines are refined, formulated and finalised. At the same time, it is clarified how implementation might look in everyday practice. The process therefore does not end with the document, but includes the question of implementation.

Dezentrum partner Lukas Hess is overseeing the process at OKJA Basel

A distinctive feature: the disclaimer

One particularly intriguing idea to emerge from the OKJA process is the planned «disclaimer». In it, the working group explicitly addresses a fundamental tension: on the one hand, there is an awareness that AI is linked to significant societal problems – such as the concentration of power, unresolved copyright issues, or its use in military contexts.

On the other hand, in youth work, the connection to young people’s everyday lives is crucial. If the young people we work with use AI as a matter of course, this technology cannot simply be ignored.

The disclaimer captures precisely this ambivalence. It combines a critical stance towards AI with a pragmatic approach in practice. Rather than resolving the tension, it makes it visible and uses it as a starting point for reflective action. This is precisely where the particular strength of this approach lies.