Research

How can we reconstruct high-quality images from measurement data?
How can we uncover biochemical reaction mechanisms from experiments?
And how can we quantify the uncertainty in the predictions of mathematical models such as neural networks?

These questions arise across a wide range of applications—from medical imaging, remote sensing, and data assimilation to computational chemistry, biology, and neuroscience—and drive my research. I am particularly interested in exploring the intersection of numerical analysis, inverse problems & uncertainty quantification, and hierarchical Bayesian learning. I focus on analyzing and developing foundational computational tools that leverage prior knowledge to improve the accuracy, efficiency, and reliability of solving inverse problems and time-dependent partial differential equations (PDEs). I am also interested in quantifying the confidence of mathematical models (including neural networks) and computational predictions.

Please find more details on my research, including publications, below. See also Google Scholar, ResearchGate, or ORCID.

Hierarchical Bayesian learning

Hierarchical Bayesian learning is a statistical framework for learning from data when information is uncertain, incomplete, or comes from multiple sources. By modeling parameters at different levels and allowing them to share information, it provides reliable estimates together with measures of uncertainty. This makes it particularly useful in areas such as machine learning, imaging, and the natural sciences.

My research develops robust and efficient hierarchical Bayesian methods for recovering signals and images from indirect and noisy measurements. I work on sparse Bayesian learning techniques with applications in denoising, deblurring, magnetic resonance imaging (MRI), and synthetic aperture radar (SAR).
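
As a schematic illustration of the kind of hierarchy involved (a generic textbook form, not the specific model of any one paper below): for a linear measurement model, a conditionally Gaussian prior on the unknown is combined with a sparsity-promoting hyperprior on its componentwise variances,

\[
y = A x + e, \quad e \sim \mathcal{N}(0, \sigma^2 I), \qquad
x \mid \theta \sim \mathcal{N}\bigl(0, \operatorname{diag}(\theta)\bigr), \qquad
\theta_i \sim \operatorname{Gamma}(\alpha, \beta).
\]

Components whose hypervariances \(\theta_i\) are driven toward zero are effectively pruned, which is what promotes sparsity; inferring both \(x\) and \(\theta\) yields a reconstruction together with its uncertainty. Generalized gamma hyperpriors and sparsity in a transform domain (for instance, on gradients or wavelet coefficients) fit the same template.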

Selected publications:

  • J. Glaubitz and Y. Marzouk. Efficient sampling for sparse Bayesian learning using hierarchical prior normalization. (arXiv:2505.23753 [math.NA], 2025)

  • J. Glaubitz, A. Gelb, and G. Song. Generalized sparse Bayesian learning and application to image reconstruction. SIAM-ASA J Uncertain Quantif 11(1), 262–284 (2023). (DOI: 10.1137/22M147236X)

  • J. Glaubitz and A. Gelb. Leveraging joint sparsity in hierarchical Bayesian learning. SIAM-ASA J Uncertain Quantif 12(2) (2024). (DOI: 10.1137/23M156255X)

Uncertainty quantification

Uncertainty quantification (UQ) focuses on understanding how reliable model-based predictions are when data and models are imperfect. By identifying and quantifying sources of uncertainty, UQ helps researchers assess how much confidence to place in their results and make better-informed decisions. It plays an important role in fields ranging from engineering and environmental science to medicine.

My research develops Bayesian methods for uncertainty quantification in inverse problems, time-dependent partial differential equations, and machine learning, with a particular emphasis on hierarchical models. By incorporating prior knowledge—such as sparsity and piecewise smoothness—these methods improve both reconstruction quality and uncertainty assessment. The resulting techniques support reliable decision-making.
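
As a minimal, self-contained sketch of what such an uncertainty estimate looks like in the simplest setting (a toy linear-Gaussian problem with standard conjugate formulas; illustrative only, not code from any of the papers below):

```python
# Toy linear-Gaussian inverse problem: reconstruct a signal from blurred,
# noisy data and report pointwise 95% credible intervals (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n = 50
idx = np.arange(n)
# Hypothetical forward operator A: a normalized Gaussian blur.
A = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

x_true = np.where(idx < n // 2, 1.0, -1.0)   # piecewise-constant ground truth
sigma = 0.05                                 # noise standard deviation
y = A @ x_true + sigma * rng.normal(size=n)  # noisy, indirect measurements

# Gaussian prior x ~ N(0, gamma^2 I) gives a Gaussian posterior in closed form.
gamma = 1.0
post_cov = np.linalg.inv(A.T @ A / sigma**2 + np.eye(n) / gamma**2)
post_mean = post_cov @ (A.T @ y) / sigma**2

# Pointwise 95% credible intervals quantify confidence in each reconstructed value.
std = np.sqrt(np.diag(post_cov))
lower, upper = post_mean - 1.96 * std, post_mean + 1.96 * std
print(f"Credible intervals cover {np.mean((lower <= x_true) & (x_true <= upper)):.0%} of the true signal.")
```

Hierarchical and sparsity-promoting priors replace the fixed Gaussian prior in this sketch, and sampling or variational methods replace the closed-form solve, but the goal is the same: a reconstruction together with a quantitative measure of how much it can be trusted.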

Selected publications:

  • J. Glaubitz and Y. Marzouk. Efficient sampling for sparse Bayesian learning using hierarchical prior normalization. (arXiv:2505.23753 [math.NA], 2025)

  • M. Le Provost, J. Glaubitz, and Y. Marzouk. Preserving linear invariants in ensemble filtering methods. (arXiv:2404.14328 [stat.CO], 2024)

  • P. Öffner, J. Glaubitz, and H. Ranocha. Stability of correction procedure via reconstruction with summation-by-parts operators for Burgers’ equation using a polynomial chaos approach. ESAIM: M2AN, 52.6, 2215–2245 (2018). (DOI: 10.1051/m2an/2018072)

Numerical methods for conservation laws

Hyperbolic conservation laws are mathematical models for transport and wave phenomena such as fluid flow, traffic dynamics, and wave propagation. Their solutions often develop sharp features, such as jump discontinuities and steep gradients, which make accurate and stable numerical simulation challenging.

My recent research develops stable and high-accuracy numerical methods for solving such equations using summation-by-parts (SBP) operators with non-polynomial approximation spaces. While classical SBP operators are built on polynomial approximation spaces, polynomials are not always the most effective choice. I have established a general SBP framework that extends beyond polynomials to trigonometric, exponential, and radial basis function spaces.
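
For orientation, the defining property in its standard first-derivative form (stated here only schematically): an SBP operator can be written as

\[
D = P^{-1} Q, \qquad Q + Q^{T} = B,
\]

where \(P\) is a symmetric positive definite quadrature (mass) matrix and \(B\) is a boundary matrix, typically \(B = \operatorname{diag}(-1, 0, \dots, 0, 1)\) on an interval. Consequently,

\[
u^{T} P (D v) + (D u)^{T} P v = u^{T} B v,
\]

a discrete analogue of integration by parts, \(\int_a^b u v' \, dx + \int_a^b u' v \, dx = u(b)v(b) - u(a)v(a)\), which is the key ingredient for provable energy stability. In the function-space setting, \(D\) is additionally required to differentiate all elements of the chosen approximation space exactly, which is how trigonometric, exponential, and radial basis function spaces enter the framework.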

Selected publications:

  • J. Glaubitz, J. Nordström, and P. Öffner. Summation-by-parts operators for general function spaces. SIAM J Numer Anal 61(2), 733–754 (2023). (DOI: 10.1137/22M1470141)

  • J. Glaubitz, J. Nordström, and P. Öffner. Energy-stable global radial basis function methods on summation-by-parts form. J Sci Comput 98, article 30 (2024). (DOI: 10.1007/s10915-023-02427-8)

  • J. Glaubitz, J. Nordström, and P. Öffner. An optimization-based construction procedure for function space-based summation-by-parts operators on arbitrary grids. J Sci Comput 105, article 83 (2025). (DOI: 10.1007/s10915-025-03062-1)

Publications

Books (2)

  1. J. Glaubitz. Shock capturing and high-order methods for hyperbolic conservation laws. Dissertation, Logos Verlag Berlin, 2020. (DOI: 10.30819/5084)

  2. J. Glaubitz, D. Rademacher, T. Sonar. Lernbuch Analysis 1 - Das Wichtigste ausführlich für Bachelor und Lehramt (teaching book in analysis). Springer, 2019. (DOI: 10.1007/978-3-658-26937-1)

Preprints (4)

  1. J. Glaubitz, J. Lampert, A. R. Winters, and J. Nordström. Towards provable energy-stable overset grid methods using sub-cell summation-by-parts operators. 2025, arXiv:2509.21442 [math.NA]

  2. J. Glaubitz, T. Li, J. Ryan, and R. Stuhlmacher. The Bayesian SIAC filter. 2025, arXiv:2509.14771 [math.NA]

  3. J. Lindbloom, M. Pasha, J. Glaubitz, and Y. Marzouk. Priorconditioned sparsity-promoting projection methods for deterministic and Bayesian linear inverse problems. 2025, arXiv:2505.01827 [math.NA]

  4. M. Le Provost, J. Glaubitz, and Y. Marzouk. Preserving linear invariants in ensemble filtering methods. 2024, arXiv:2404.14328 [stat.CO]

Refereed Journal Articles (26)

  1. J. Glaubitz, J. Nordström, P. Öffner. An optimization-based construction procedure for function space-based summation-by-parts operators on arbitrary grids. J Sci Comput 105, article 83 (2025). (DOI: 10.1007/s10915-025-03062-1)

  2. J. Glaubitz, H. Ranocha, A. Winters, M. Schlottke-Lakemper, P. Öffner, G. Gassner. Generalized upwind summation-by-parts operators and their application to nodal discontinuous Galerkin methods. J Comput Phys, 113841 (2025). (DOI: 10.1016/j.jcp.2025.113841)

  3. J. Lindbloom, J. Glaubitz, and A. Gelb. Efficient sparsity-promoting MAP estimation for Bayesian linear inverse problems. Inverse Probl, 41, 025001 (2025). (DOI: 10.1088/1361-6420/ada17f)

  4. H. Ranocha, A. Winters, M. Schlottke-Lakemper, P. Öffner, J. Glaubitz, G. Gassner. On the robustness of high-order upwind summation-by-parts methods for nonlinear conservation laws. J Comput Phys, 520, 113471 (2025). (DOI: 10.1016/j.jcp.2024.113471)

  5. J. Glaubitz, A. Gelb. Leveraging joint sparsity in hierarchical Bayesian learning. SIAM-ASA J Uncertain Quantif 12(2) (2024). (DOI: 10.1137/23M156255X)

  6. J. Glaubitz, S.-C. Klein, J. Nordström, P. Öffner. Summation-by-parts operators for general function spaces: The second derivative. J Comput Phys, 504, 112889 (2024). (DOI: 10.1016/j.jcp.2024.112889)

  7. J. Glaubitz, J. Nordström, P. Öffner. Energy-stable global radial basis function methods on summation-by-parts form. J Sci Comput 98, article 30 (2024). (DOI: 10.1007/s10915-023-02427-8)

  8. J. Glaubitz, S.-C. Klein, J. Nordström, P. Öffner. Multi-dimensional summation-by-parts operators for general function spaces: Theory and construction. J Comput Phys 491, 112370 (2023). (DOI: 10.1016/j.jcp.2023.112370)

  9. J. Glaubitz. Construction and application of provable positive and exact cubature formulas. IMA J Numer Anal, 43(3), 1616–1652 (2023). (DOI: 10.1093/imanum/drac017)

  10. Y. Xiao, J. Glaubitz. Sequential image recovery using joint hierarchical Bayesian learning. J Sci Comput 96, Article 4 (2023). (DOI: 10.1007/s10915-023-02234-1)

  11. J. Glaubitz, J. Nordström, P. Öffner. Summation-by-parts operators for general function spaces. SIAM J Numer Anal 61(2), 733–754 (2023). (DOI: 10.1137/22M1470141)

  12. J. Glaubitz, A. Gelb, G. Song. Generalized sparse Bayesian learning and application to image reconstruction. SIAM-ASA J Uncertain Quantif 11(1), 262–284 (2023). (DOI: 10.1137/22M147236X)

  13. J. Glaubitz, J. Reeger. Towards stability results for global radial basis function based quadrature formulas. BIT Numer Math 63, 6 (2023). (DOI: 10.1007/s10543-023-00956-0)

  14. Y. Xiao, J. Glaubitz, A. Gelb, G. Song. Sequential image recovery from noisy and under-sampled Fourier data. J Sci Comput 91 (3), 79 (2022). (DOI: 10.1007/s10915-022-01850-7)

  15. J. Glaubitz. Stable high-order cubature formulas for experimental data. J Comput Phys 447, 110693 (2021). (DOI: 10.1016/j.jcp.2021.110693)

  16. J. Glaubitz, E. Le Meledo, P. Öffner. Towards stable radial basis function methods for linear advection problems. Comput Math Appl 85, 84–97 (2021). (DOI: 10.1016/j.camwa.2021.01.012)

  17. J. Glaubitz, A. Gelb. Stabilizing radial basis function methods for conservation laws using weakly enforced boundary conditions. J Sci Comput 87, 40 (2021). (DOI: 10.1007/s10915-021-01453-8)

  18. J. Glaubitz. Stable high-order quadrature rules for scattered data and general weight functions. SIAM J Numer Anal 58, 2144 (2020). (DOI: 10.1137/19M1257901)

  19. J. Glaubitz, P. Öffner. Stable discretisations of high-order discontinuous Galerkin methods on equidistant and scattered points. Appl Numer Math 151, 98–118 (2020). (DOI: 10.1016/j.apnum.2019.12.020)

  20. P. Öffner, J. Glaubitz, H. Ranocha. Analysis of artificial dissipation of explicit and implicit time-integration methods. Int J Numer Anal Model, 17.3, 332–349 (2020). (URL: http://www.math.ualberta.ca/ijnam/Volume-17-2020/No-3-20/2020-03-03.pdf)

  21. J. Glaubitz. Shock capturing by Bernstein polynomials for scalar conservation laws. Appl Math Comput 363, 124593 (2019). (DOI: 10.1016/j.amc.2019.124593)

  22. J. Glaubitz, A. Gelb. High order edge sensors with l1 regularization for enhanced discontinuous Galerkin methods. SIAM J Sci Comput, 41(2), A1304–A1330 (2019). (DOI: 10.1137/18M1195280)

  23. J. Glaubitz, A.C. Nogueira Jr., J.L.S. Almeida, R.F. Cantao, C.A.C. Silva. Smooth and compactly supported viscous sub-cell shock capturing for discontinuous Galerkin methods. J Sci Comput, 79, 249–272 (2019). (DOI: 10.1007/s10915-018-0850-3)

  24. P. Öffner, J. Glaubitz, H. Ranocha. Stability of correction procedure via reconstruction with summation-by-parts operators for Burgers’ equation using a polynomial chaos approach. ESAIM: M2AN, 52.6, 2215–2245 (2018). (DOI: 10.1051/m2an/2018072)

  25. H. Ranocha, J. Glaubitz, P. Öffner, T. Sonar. Stability of artificial dissipation and modal filtering for flux reconstruction schemes using summation-by-parts operators. Appl Numer Math, 128, 1–23 (2018). (DOI: 10.1016/j.apnum.2018.01.019)

  26. J. Glaubitz, P. Öffner, T. Sonar. Application of modal filtering to a spectral difference method. Math Comput, 87.309, 175–207 (2018). (DOI: 10.1090/mcom/3257)

Refereed Conference Proceedings (2)

  1. J. Glaubitz, A. Gelb. Using l1-regularization for shock capturing in discontinuous Galerkin methods. ICOSAHOM 2020+1. Springer Nature, Vol. 137, p. 337 (2023). (DOI: 10.1007/978-3-031-20432-6_21)

  2. J. Glaubitz, P. Öffner, H. Ranocha, T. Sonar. Artificial viscosity for correction procedure via reconstruction using summation-by-parts operators. XVI International Conference on Hyperbolic Problems: Theory, Numerics, Applications. Springer, Cham, 363–375 (2016). (DOI: 10.1007/978-3-319-91548-7_28)

Software (16)

  1. J. Glaubitz, J. Lampert, A. R. Winters, and J. Nordström. Reproducibility repository for "Towards provable energy-stable overset grid methods using sub-cell summation-by-parts operators". GitHub (2025). (https://github.com/JoshuaLampert/2025_overset_grid_sub-cell)

  2. J. Glaubitz, T. Li, J. Ryan, and R. Stuhlmacher. Reproducibility repository for "The Bayesian SIAC filter". GitHub (2025). (https://github.com/RomanStuhlmacher/paper-2025-Bayesian-SIAC-Filter)

  3. J. Glaubitz and Y. Marzouk. Reproducibility repository for "Efficient sampling for sparse Bayesian learning using hierarchical prior normalization". GitHub (2025). (https://github.com/jglaubitz/paper-2025-SBL-priorNormalization)

  4. J. Lindbloom, M. Pasha, J. Glaubitz, and Y. Marzouk. Reproducibility repository for "Priorconditioned sparsity-promoting projection methods for deterministic and Bayesian linear inverse problems". GitHub (2025). (https://github.com/mpasha3/IRLS_prec_GSBL)

  5. J. Glaubitz, J. Nordström, and P. Öffner. Reproducibility repository for "An optimization-based construction procedure for function space-based summation-by-parts operators on arbitrary grids". GitHub (2025). (https://github.com/phioeffn/SBP-Construction)

  6. M. Le Provost, J. Glaubitz, and Y. Marzouk. Reproducibility repository for "Preserving linear invariants in ensemble filtering methods". GitHub (2024). (https://github.com/mleprovost/Paper-Linear-Invariants-Ensemble-Filters)

  7. J. Glaubitz, H. Ranocha, A.R. Winters, M. Schlottke-Lakemper, P. Öffner, G.J. Gassner. Reproducibility repository for "Generalized upwind summation-by-parts operators and their application to nodal discontinuous Galerkin methods". Zenodo (2024). (DOI: 10.5281/zenodo.11661785)

  8. H. Ranocha, A.R. Winters, M. Schlottke-Lakemper, P. Öffner, J. Glaubitz, G.J. Gassner. Reproducibility repository for "On the robustness of high-order upwind summation-by-parts methods for nonlinear conservation laws". Zenodo (2023). (DOI: 10.5281/zenodo.10200102)

  9. J. Glaubitz. jglaubitz/2ndDerivativeFSBP. GitHub (2023). (https://github.com/jglaubitz/2ndDerivativeFSBP)

  10. J. Glaubitz. jglaubitz/LeveragingJointSparsity. GitHub (2023). (https://github.com/jglaubitz/LeveragingJointSparsity)

  11. J. Glaubitz. jglaubitz/FSBP. GitHub (2022). (https://github.com/jglaubitz/FSBP)

  12. J. Glaubitz. jglaubitz/generalizedSBL. GitHub (2022). (https://github.com/jglaubitz/generalizedSBL)

  13. J. Glaubitz. jglaubitz/stableCFs (v2.0). Zenodo (2021). (DOI: 10.5281/zenodo.5392394)

  14. J. Glaubitz. jglaubitz/positive CFs (v1.0). Zenodo (2021). (DOI: 10.5281/zenodo.5164000)

  15. J. Glaubitz. jglaubitz/stability RBF CFs (v1.0). Zenodo (2021). (DOI: 10.5281/zenodo.5086347)

  16. J. Glaubitz. jglaubitz/weakRBF (v1.0). Zenodo (2020). (DOI: 10.5281/zenodo.4310328)