

Water management has made great strides in digitalisation and automation in recent years. Sensors record water levels, models forecast discharges – everything works perfectly as long as conditions remain normal. However, historical data rarely reflects extreme events such as heavy rainfall. Measurement campaigns are costly, and the results are often not publicly available. Local authorities are thus navigating growing climate risks in the dark – and bearing the consequences.
Generative AI models can close this data gap. They learn real-world distributions and generate synthetic time series that plausibly represent even rare extremes. Research findings show that such data significantly improves the accuracy of forecasting models; some models trained exclusively on AI-generated data match the quality achieved with real measurement series. For cities, this means more reliable preparedness.
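The core loop behind such systems is: learn a distribution from scarce observations, then sample from it to obtain arbitrarily long synthetic series in which rare extremes can appear. The following is a minimal toy sketch of that idea in Python; the rainfall numbers are invented for illustration, and a simple lognormal fit stands in for the deep generative models (e.g. GANs or diffusion models) that real research systems use.

```python
# Toy sketch of "learn a distribution, then sample synthetic series".
# A lognormal fit is a deliberate stand-in for a deep generative model;
# all observation values below are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

# Scarce "observed" daily rainfall record (mm), e.g. ten wet days.
observed = np.array([0.5, 1.2, 3.4, 0.8, 2.1, 5.6, 0.3, 1.9, 4.2, 2.7])

# "Learn" the distribution: estimate lognormal parameters from the log
# of the observations (method of moments on the log scale).
log_obs = np.log(observed)
mu, sigma = log_obs.mean(), log_obs.std(ddof=1)

# Generate a synthetic series far longer than the short record, so that
# extremes beyond anything ever measured can show up in the sample.
synthetic = rng.lognormal(mu, sigma, size=10_000)

print(f"observed max:  {observed.max():.1f} mm")
print(f"synthetic max: {synthetic.max():.1f} mm")
```

With 10,000 draws, the synthetic maximum reaches well beyond the short observed record, which is exactly the property that lets forecasting models see plausible extremes that the measurement history never captured.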
At the DFKI in Kaiserslautern, Prof. Andreas Dengel, as Managing Director and Head of the ‘Smart Data & Knowledge Services’ division, is driving these approaches forward in collaboration with the other research divisions based there. For example, researchers are testing AI systems directly on the city’s wastewater network and establishing partnerships such as the recently launched Transferlab with the Federal Institute of Hydrology (BfG).
This not only improves hydrological forecasting and water-quality monitoring, but also gives policymakers and administrators concrete options for action.
AI does not replace engineers, but it does make up for missing data. It makes digital twins more resilient to climate stress. The real question is a political one: should cities learn reactively from damage – or plan proactively using simulated scenarios? Synthetic data better equips local authorities, even on tight budgets. In the face of climate change, this is not an option, but a duty.
Executive Director, DFKI Kaiserslautern