Neural Ordinary Differential Equations (Neural ODEs) struggle to model systems with long-range dependencies or discontinuities, which are common in engineering and biological settings. Even with alternative solver choices, numerical instability persists for stiff ODEs and for ODEs with piecewise forcing functions. This work introduces Neural Laplace, a unified framework for learning diverse classes of differential equations that represents history dependencies and discontinuities efficiently in the Laplace domain through complex exponentials. By leveraging the stereographic projection of the Riemann sphere, Neural Laplace induces a smoother learning surface in this domain. Experimental results show superior performance in modeling and extrapolating the trajectories of diverse differential equations, including those with complex history dependencies and abrupt changes.
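To make the geometric idea concrete, the following is a minimal sketch of the standard stereographic projection between the complex plane and the unit (Riemann) sphere, where the north pole represents the point at infinity. The function names and the exact parameterization here are illustrative assumptions, not the paper's implementation; they only demonstrate the bijection that lets an unbounded Laplace variable s be handled as a bounded point on the sphere.

```python
import numpy as np

def to_sphere(s: complex) -> np.ndarray:
    """Stereographic projection of s onto the unit sphere.
    The north pole (0, 0, 1) corresponds to s = infinity, so every
    finite s maps to a bounded point on the sphere's surface."""
    x, y = s.real, s.imag
    d = x * x + y * y + 1.0
    return np.array([2.0 * x / d, 2.0 * y / d, (x * x + y * y - 1.0) / d])

def from_sphere(p: np.ndarray) -> complex:
    """Inverse stereographic projection back to the complex plane."""
    X, Y, Z = p
    return complex(X / (1.0 - Z), Y / (1.0 - Z))

s = 3.0 - 4.0j
p = to_sphere(s)
# The image lies on the unit sphere, and the round trip recovers s exactly.
assert abs(np.linalg.norm(p) - 1.0) < 1e-12
assert abs(from_sphere(p) - s) < 1e-12
```

Working on the sphere keeps coordinates bounded even as |s| grows, which is the intuition behind the smoother learning surface claimed above.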