
How AI can strengthen ATSEP continuation training
At our last ATSEP seminar, Sebastian Wagner opened with a story.
He took the room to South Africa. A major airport was in full operation when the flight plan management system suddenly broke down. Screens froze. New clearances stopped flowing. Departures began to stack up. Arrivals became uncertain. Within minutes, Africa’s largest hub was struggling and the whole system was under pressure.
Then an experienced ATSEP stepped in.
Gerald knew the system, the weak points and the logic behind it. More importantly, he knew how to stay calm. Together with his team, he checked logs, isolated corrupted data paths and worked through the problem in a structured way. What could have turned into a full day of disruption was reduced to just a few hours.
That story set the tone for the session. The message from Sebastian Wagner and Ulf Herbig was clear: resilience is not built only into systems. It is built into people through training. Technology can fail, but well-trained engineers like Gerald turn disruption into safe recovery.
Why ATSEP continuation training is critical
ATSEPs work with systems that are critical for safety and operations. At the same time, systems evolve, configurations change and expectations grow. The regulations are also clear: continuation training, emergency training and conversion training are all required.
The challenge is that some of the most serious situations are also the hardest to train. Rare failures do not happen often enough to build routine. Cross-domain events are difficult to recreate. Traditional simulator time is valuable but limited, and simulators may not exist for these scenarios at all.
That is why Wagner and Herbig focused on a practical question: how can ATSEP continuation training become more relevant, more realistic and more useful?
Building trust in AI for ATSEP training
Before discussing solutions, they asked the audience whether AI could help scale or personalise refresher and emergency training, and whether they would trust AI-generated outcomes.
The answers showed both interest and caution. Many saw clear potential, but trust was the key condition.
That was an important point throughout the session. In aviation, trust does not come from technology alone. It comes from transparency, accountability and human oversight.
Herbig explained that AI is already moving into a regulated space. The EU AI Act makes clear that the risk depends on the use case. A tool that helps generate scenarios is very different from a tool that would assess competence on its own.
For training organisations, that matters. AI can support training, but it must be designed in the right way, with humans still in control.
Practical AI use cases in ATSEP training
The most valuable part of the session was how concrete it became.
Wagner and Herbig showed how AI could support emergency training by creating rare and complex scenarios that are difficult to build manually. In data processing, that could mean corrupt flight plan data, duplicate tracks or no AFTN/AMHS connection at all. In communications, it could mean interference or receiver problems in complex traffic situations. In cross-domain cases, it could mean power failures, cooling issues or chain reactions across several systems.
The value is not only in creating a scenario once. It is in creating variation. AI could help generate different versions of the same core problem and adapt the difficulty to the trainee's experience level and how they respond.
That makes training more dynamic and closer to operational reality.
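To make the idea of variation and adaptive difficulty concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the failure catalogue, the function names and the simple difficulty rule are illustrative assumptions, not anything presented at the seminar or used by any real training system.

```python
import random

# Hypothetical catalogue of core failure modes, loosely based on the
# examples mentioned in the session (data processing, communications,
# cross-domain events). A real system would draw on a richer model.
CORE_PROBLEMS = {
    "data_processing": ["corrupt flight plan data", "duplicate tracks", "AFTN/AMHS link down"],
    "communications": ["receiver interference", "receiver failure in dense traffic"],
    "cross_domain": ["power failure", "cooling failure", "cascading system outage"],
}

def generate_scenario(domain: str, difficulty: int, rng: random.Random) -> dict:
    """Build one variation of a core problem.

    Higher difficulty stacks extra complications from other domains,
    so the same base failure can be trained in many different forms.
    """
    base = rng.choice(CORE_PROBLEMS[domain])
    pool = [p for d in CORE_PROBLEMS for p in CORE_PROBLEMS[d] if p != base]
    complications = rng.sample(pool, k=min(difficulty - 1, 3))
    return {
        "domain": domain,
        "base_failure": base,
        "complications": complications,
        "difficulty": difficulty,
    }

def adapt_difficulty(current: int, trainee_solved: bool) -> int:
    """Step difficulty up after a success, down after a failure, within 1..4."""
    return min(current + 1, 4) if trainee_solved else max(current - 1, 1)
```

Running `generate_scenario("data_processing", 3, random.Random())` twice would yield two different variations of the same core problem, and `adapt_difficulty` nudges the next exercise up or down. The point is not this toy logic itself but the pattern: separate the core failure from its variations, and let performance feed back into scenario selection, with the instructor still deciding what to brief and debrief.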
They also highlighted the important human side of ATSEP work. Communication under pressure, decision making and cooperation across roles are all part of handling real incidents. AI can help create scenarios for that too, but they believe that the instructor will continue to play the central role in briefing, guidance and debriefing.
Why human instructors still matter in ATSEP training
One point came back again and again: AI should support instructors, not replace them.
That is where the real opportunity lies. AI can help create realistic situations, increase variation and improve accessibility. But instructors contribute contextual understanding, professional experience and the ability to make balanced assessments based on the overall situation. They understand not only what happened, but why it mattered.
That balance is essential if AI is to strengthen training in a trustworthy way.
What AI means for the future of ATSEP training
This was not a session about technology for technology’s sake. It was a session about preparedness.
The story from South Africa stayed with the audience for a reason. When systems fail, the difference is often made by trained people who know what to do and how to stay calm.
That is why continuation training matters. And that is why AI is worth exploring.
Not because it is fashionable, but because, used in the right way, it can help make ATSEP training more relevant, more realistic and more accessible.
And in the end, that is what builds resilience.
Want to learn more about how AI could support your ATSEP continuation training? Contact us to continue the conversation.