Technical University of Munich - Munich
qiauil.github.io - in/qiang-liu-3a21b92b3
Stars
For trading. Please star.
PDE-Transformer is a neural network architecture designed to efficiently process and predict the evolution of physical systems described by partial differential equations.
Official implementation of "Guiding diffusion models to reconstruct flow fields from sparse data"
Code release of "Transolver: A Fast Transformer Solver for PDEs on General Geometries" (ICML 2024 Spotlight). https://arxiv.org/abs/2402.02366
Official implementation of Diffusion Graph Networks (DGNs)
Dataset handling for physics-based deep learning tasks
Customizable cover page for an MkDocs site.
Use Jupyter Notebooks in MkDocs
Intuitive scientific computing with dimension types for JAX, PyTorch, TensorFlow & NumPy
Approximating Wasserstein distances with PyTorch
A modern file manager that helps users organize their files and folders.
Library for Jacobian descent with PyTorch. It enables the optimization of neural networks with multiple losses (e.g. multi-task learning).
Code for the paper "Poseidon: Efficient Foundation Models for PDEs"
Efficient Differentiable n-d PDE Solvers in JAX.
[NeurIPS 2024] A benchmark suite for autoregressive neural emulation of PDEs (≥46 PDEs in 1D, 2D, 3D; Differentiable Physics; Unrolled Training; Rollout Metrics)
A python package for automated loading of physics datasets.
Visualisation for 4D Volumes (Space + Time) written in WebGPU and Rust
Convolutional Differential Operators for Physics-based Deep Learning Study
[ICLR 2025 Spotlight] Official implementation of the Conflict-Free Inverse Gradients Method
A beautiful, simple, clean, and responsive Jekyll theme for academics