07. January 2026
AIFRI Viewer
Visualizing results from the research project AI for Rail Inspection
tl;dr: There is a viewer for our results from the AIFRI project. It might take a minute to start the container and see some data.
Railway infrastructure safety is critical for millions of passengers and for daily freight operations. Traditional rail inspection methods rely on automatic assessment (AA) algorithms with fixed moving windows. The AIFRI research project explores how neural networks can improve defect detection. This interactive viewer demonstrates the differences between the traditional and the AI-based approach.
What is AIFRI?
AIFRI stands for AI for Railway Infrastructure and is a collaborative research project led by DZFS in partnership with BAM, TU-Berlin, VRANA, Zedas and DB Infra Go. The goal, especially from my side, is to develop neural-network-based methods for detecting rail artifacts and defects in ultrasonic testing (UT) and eddy current testing (ET) data from regular inspection runs by DB.
This is supposed to improve DB's inspection work, which is currently done through a combination of manual review and hierarchical algorithms analysing B-Data patterns. Advancing the detection and classification of artifact and defect patterns can significantly reduce the workload and thereby improve quality. The current method offers limited flexibility in detecting irregular or novel defect patterns and demands a large amount of manual work. Due to the large volume of data and the repetitive nature of the tasks, new automated analyses are required.
Common testing data from the field typically contains no, or only isolated, occurrences of defects, which is not enough to train a neural network sufficiently. Because of this, BAM provides simulated defects, which were successfully integrated into the modelling pipeline. Both field data and simulations can be explored in the AIFRI viewer.
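As a rough illustration of how simulated defects can compensate for their scarcity in field data, here is a hypothetical Python sketch that mixes field and simulated samples into one training set. The function name, its parameters and the sampling strategy are illustrative assumptions, not the actual AIFRI pipeline:

```python
# Hypothetical sketch: mix scarce field defects with simulated defects into one
# training set. Names, ratios and the sampling strategy are illustrative only.
import random

def build_training_set(field_samples, simulated_samples, sim_ratio=0.5):
    """Combine field and simulated (data, label) pairs.

    sim_ratio sets the fraction of simulated defects in the result, to
    compensate for the near-absence of defects in regular field data.
    """
    n_sim = int(len(field_samples) * sim_ratio / (1.0 - sim_ratio))
    picked = random.sample(simulated_samples, min(n_sim, len(simulated_samples)))
    mixed = list(field_samples) + picked
    random.shuffle(mixed)
    return mixed
```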
Comparing approaches
- Traditional AA: Analyses the gate data distribution in fixed moving windows and provides a hierarchical decision
- AIFRI AI: Assigns the most likely defect class to each pixel column detected as an artifact
- Flexibility: The AI can adapt to irregular patterns in Gate and Pixel data that are not covered by predefined rules
- Granularity: Pixel-level classification vs. window-based assessment (see the sketch after this list)
- Trade-offs: Research model still has room for improvement vs. established AA methods
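To make the difference in granularity concrete, here is a minimal, purely illustrative Python sketch. The synthetic array, the 0.6 threshold, the 50-column window and the classify_column stub are all assumptions for demonstration and do not reflect the actual AA rules or the AIFRI network:

```python
# Illustrative contrast of the two assessment granularities described above.
import numpy as np

rng = np.random.default_rng(0)
b_scan = rng.random((64, 1000))     # rows = depth gates, cols = positions along the rail
b_scan[:, 400:420] += 1.5           # injected "defect-like" energy burst

# Window-based assessment: one verdict per fixed moving window
window = 50
aa_flags = [b_scan[:, i:i + window].mean() > 0.6
            for i in range(0, b_scan.shape[1], window)]

# Column-wise classification: one (mock) class per pixel column
def classify_column(col):
    return "defect" if col.mean() > 0.6 else "no_finding"

column_classes = [classify_column(b_scan[:, i]) for i in range(b_scan.shape[1])]

print(f"windows flagged: {sum(aa_flags)} of {len(aa_flags)}")
print(f"columns classified as defect: {column_classes.count('defect')}")
```

The window-based verdict only tells you that something is inside a 50-column stretch, while the column-wise classification localises the finding to individual measurement positions.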
Key Features of the AIFRI Viewer
- Interactive Data Exploration: Browse real UT inspection data from the mobilithek database
- Approach Comparison: View traditional AA vs. AI predictions simultaneously
- Detailed Visualisation: Examine ultrasonic testing data at measurement level, with a resolution of 3 mm
- Open Data Integration: Direct access to publicly available rail inspection data
How It Works
- Select an inspection file from the mobilithek file list
- Download the DICOM file containing the ultrasonic testing data (a minimal loading sketch follows this list)
- View the prediction overview showing all detected artifacts along the rail grouped by meter
- Compare assessments: how do both methods assess the same data, and what is their output?
- Examine the raw ultrasonic data and compare patterns and classifications
- Explore simulations made for training by BAM
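As a rough idea of what steps 2 and 3 involve programmatically, the following hedged sketch loads a downloaded inspection file with pydicom and plots its pixel data. The file name is a placeholder, and the exact DICOM layout of the mobilithek UT files may differ from what is assumed here:

```python
# Hypothetical sketch: load a downloaded inspection file and plot the raw UT data.
import pydicom
import matplotlib.pyplot as plt

ds = pydicom.dcmread("inspection_run.dcm")  # placeholder file name
ut_data = ds.pixel_array                    # assumed 2D array: depth gates x positions

plt.imshow(ut_data, aspect="auto", cmap="gray")
plt.xlabel("position along the rail (measurements, ~3 mm spacing)")
plt.ylabel("depth / gate index")
plt.title("Raw ultrasonic testing data")
plt.show()
```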
Future Development
- [ ] Support for additional ET data from mobilithek
- [ ] Mobile-friendly design
- [ ] Performance optimisations for larger datasets
- [ ] Optimisation for shorter loading times on Google Cloud
- [ ] Open for feedback and collaboration