TorchLaplace’s documentation
TorchLaplace is open-source software for differentiable Laplace reconstructions, enabling modelling of observations at arbitrary time points with O(1) complexity. The library provides Inverse Laplace Transform (ILT) algorithms implemented in PyTorch. Backpropagation through differential equation (DE) solutions in the Laplace domain is supported, using the Riemann stereographic projection for a better global representation of the complex Laplace domain. For using DE representations in the Laplace domain in deep learning applications, see reference [1].
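As a quick taste of the ILT side of the library, the sketch below inverts a known Laplace transform back into the time domain. It is a minimal example under stated assumptions: the `torchlaplace.inverse_laplace` module, its `Fourier` class, the `ilt_reconstruction_terms` argument, and the `decoder(fs, t)` call signature should be verified against the API reference before relying on them.

```python
# Minimal sketch (assumed API, verify against the API reference):
# invert F(s) = 1 / (s^2 + 1), the Laplace transform of sin(t), back to the time domain.
import torch
from torchlaplace.inverse_laplace import Fourier  # assumed module/class name


def fs(s):
    # Laplace-domain representation evaluated at the complex query points s
    return 1 / (s ** 2 + 1)


t = torch.linspace(0.01, 10.0, 1000)            # time points to reconstruct at
decoder = Fourier(ilt_reconstruction_terms=33)  # assumed constructor argument
f_hat = decoder(fs, t)                          # approximates sin(t) at the times in t
```

Because the ILT is implemented in PyTorch, gradients can flow through `f_hat` into any learnable parameters used to produce the Laplace-domain representation `fs`.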
Useful links: Install | Source Repository | Issues & Ideas
Getting started
New to TorchLaplace? Check out the getting started guides. They contain an introduction to TorchLaplace's main concepts and links to additional tutorials.
User guide
The user guide provides in-depth information on the key concepts of TorchLaplace with useful background information and explanation.
API reference
The reference guide contains a detailed description of the TorchLaplace API. The reference describes how the methods work and which parameters can be used. It assumes that you have an understanding of the key concepts.
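Before reading the full reference, the following hedged sketch shows how the pieces are typically wired together: a small `nn.Module` maps Riemann-sphere coordinates of the Laplace query points, concatenated with a latent vector `p`, to a Laplace representation, and the top-level `laplace_reconstruct` function inverts that representation at the requested times. The module name `SphereSurfaceFunc`, its input/output shape contract, and the `recon_dim` keyword are illustrative assumptions; verify them against the `laplace_reconstruct` entry in this reference.

```python
# Hedged sketch of the main entry point (names and shape contract are assumptions to verify).
import math
import torch
from torch import nn
from torchlaplace import laplace_reconstruct


class SphereSurfaceFunc(nn.Module):
    """Hypothetical example module: maps sphere coordinates of the s-query points,
    concatenated with a latent vector p, to (theta, phi) sphere coordinates of F(s)."""

    def __init__(self, s_dim, output_dim, latent_dim, hidden_units=64):
        super().__init__()
        self.s_dim, self.output_dim, self.latent_dim = s_dim, output_dim, latent_dim
        self.net = nn.Sequential(
            nn.Linear(s_dim * 2 + latent_dim, hidden_units),
            nn.Tanh(),
            nn.Linear(hidden_units, s_dim * 2 * output_dim),
        )

    def forward(self, i):
        # i: (..., 2 * s_dim + latent_dim)  ->  theta, phi: (-1, output_dim, s_dim)
        out = self.net(i.view(-1, 2 * self.s_dim + self.latent_dim)).view(
            -1, 2 * self.output_dim, self.s_dim
        )
        theta = torch.tanh(out[:, : self.output_dim, :]) * math.pi        # range (-pi, pi)
        phi = torch.tanh(out[:, self.output_dim :, :]) * math.pi / 2.0    # range (-pi/2, pi/2)
        return theta, phi


batch, latent_dim, output_dim, s_terms = 16, 2, 1, 33
rep_func = SphereSurfaceFunc(s_terms, output_dim, latent_dim)
p = torch.randn(batch, latent_dim)                    # e.g. produced by a trajectory encoder
t = torch.linspace(0.1, 10.0, 100).repeat(batch, 1)   # times to reconstruct, per batch element
pred = laplace_reconstruct(rep_func, p, t, recon_dim=output_dim)  # -> (batch, 100, output_dim)
```

Because the whole pipeline is differentiable, `pred` can be compared against observed trajectories and the loss backpropagated into both `rep_func` and whatever encoder produced `p`.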
Developer guide
Saw a typo in the documentation? Want to improve existing functionalities? The contributing guidelines will guide you through the process of improving TorchLaplace.
Contents
References
For using DE representations in the Laplace domain, leveraging the stereographic projection, and other applications, see:
[1] Samuel Holt, Zhaozhi Qian, and Mihaela van der Schaar. "Neural Laplace: Learning diverse classes of differential equations in the Laplace domain." International Conference on Machine Learning. 2022. arXiv
—
If you found this library useful in your research, please consider citing it:
@inproceedings{holt2022neural,
  title={Neural Laplace: Learning diverse classes of differential equations in the Laplace domain},
  author={Holt, Samuel I and Qian, Zhaozhi and van der Schaar, Mihaela},
  booktitle={International Conference on Machine Learning},
  pages={8811--8832},
  year={2022},
  organization={PMLR}
}