Making big, messy data behave - with the help of HPC
11 November 2025
Paper published in the MDPI journal Mathematics: "Scalable QR Factorisation of Ill-Conditioned Tall-and-Skinny Matrices on Distributed GPU Systems"
Our researchers Nenad Mijić, Abhiram Kaushik, Dario Zivkovic, and Davor Davidovic have developed a new way to handle one of the trickiest problems in scientific computing: working with huge, numerically unstable datasets that normally break standard mathematical methods.
Many modern technologies - like climate simulations, medical imaging, AI, and engineering - depend on something called QR factorization, a mathematical tool for solving large systems of equations. But when the data is extremely unbalanced, or “ill-conditioned,” even powerful computers struggle to produce accurate results.
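To make that more concrete, here is a minimal sketch (not from the paper) of what a QR factorization does, using NumPy's built-in routine: a tall-and-skinny matrix A is split into an orthonormal factor Q and a small upper-triangular factor R with A = QR, which turns problems like least-squares fitting into easy triangular solves.

```python
import numpy as np

rng = np.random.default_rng(42)

# A "tall-and-skinny" matrix: far more rows (measurements) than columns (unknowns).
A = rng.standard_normal((10000, 20))

# QR factorization: A = Q @ R, with Q orthonormal (Q.T @ Q = I) and R upper triangular.
Q, R = np.linalg.qr(A)

# A least-squares fit A x ~= b then reduces to one small triangular solve.
b = rng.standard_normal(10000)
x = np.linalg.solve(R, Q.T @ b)
```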
This new research introduces an improved algorithm designed specifically for supercomputers with many GPUs working in parallel. The method builds on a fast technique called CholeskyQR and reinforces it with extra stability checks so it doesn’t fall apart when the data is difficult.
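As a rough illustration of the idea (a simplified, single-process sketch, not the paper's distributed GPU implementation), plain CholeskyQR forms the small Gram matrix A^T A, Cholesky-factorises it, and recovers Q from the result; repeating the step once, known as CholeskyQR2, is one standard way to restore the orthogonality that a single pass loses on difficult data. The paper's algorithm adds further safeguards for ill-conditioned matrices that are not shown here.

```python
import numpy as np

def cholesky_qr(A):
    """One pass of CholeskyQR (simplified, single process)."""
    G = A.T @ A                       # small n x n Gram matrix; on a distributed system
                                      # this is the only part needing global communication
    R = np.linalg.cholesky(G).T       # upper-triangular R with G = R.T @ R
    Q = np.linalg.solve(R.T, A.T).T   # Q = A @ inv(R), without forming inv(R) explicitly
    return Q, R

def cholesky_qr2(A):
    """CholeskyQR2: repeat the factorisation once to improve orthogonality."""
    Q1, R1 = cholesky_qr(A)
    Q, R2 = cholesky_qr(Q1)
    return Q, R2 @ R1                 # A = Q @ (R2 @ R1)

# Tall-and-skinny test matrix.
A = np.random.default_rng(0).standard_normal((100000, 50))
Q, R = cholesky_qr2(A)
print(np.linalg.norm(Q.T @ Q - np.eye(50)))   # orthogonality error (tiny for well-conditioned A)
print(np.linalg.norm(Q @ R - A))              # reconstruction error
```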
The results are impressive:
- It stays accurate even with extremely unstable datasets.
- It runs up to 12× faster than widely used scientific libraries.
- It scales extremely well across many GPUs at once.
In simple terms:
The team found a way to make supercomputers crunch huge, messy datasets much more efficiently - unlocking better performance for simulations, modelling, and scientific discoveries across many fields.
Published in the MDPI journal Mathematics.
📄 Read the full paper here: https://doi.org/10.3390/math13223608
This research was supported by the Hrvatska zaklada za znanost (Croatian Science Foundation) through the project HybridScale, and by the ERC grants GlueSatLight and YoctoLHC. Special thanks to SRCE for providing access to the Supek supercomputer.