
Publication

NeST: The neuro-symbolic transpiler

Viktor Pfanschilling; Hikaru Shindo; Devendra Singh Dhami; Kristian Kersting
In: International Journal of Approximate Reasoning (IJAR), Vol. 179, Pages 1-22, Sciencedirect.com, 2025.

Abstract

Tractable Probabilistic Models such as Sum-Product Networks are a powerful class of models that offer a rich choice of fast probabilistic queries. However, they are limited in the distributions they can represent; for example, they cannot define distributions using loops or recursion. To move towards more complex distributions, we introduce a novel neuro-symbolic programming language, the Sum-Product Loop Language (SPLL), along with the Neuro-Symbolic Transpiler (NeST). SPLL aims to generate inference code that closely resembles Tractable Probabilistic Models. NeST is the first neuro-symbolic transpiler, i.e., a compiler from one high-level language to another. It generates inference code from SPLL and also natively supports other computing platforms. This way, SPLL can seamlessly interface with, for example, pretrained (neural) models in PyTorch or Julia. The result is a language that can run probabilistic inference on more general distributions, reason over neural network outputs, and provide gradients for training.
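For readers unfamiliar with why such models admit fast queries, the following minimal Python sketch (not taken from the paper; the network structure, weights, and function names are purely illustrative) evaluates a tiny sum-product network over two binary variables. Marginalizing a variable only requires setting its leaf value to 1, so a marginal costs the same single bottom-up pass as a joint likelihood.

```python
# Illustrative toy sum-product network over two binary variables X1, X2.
# Sum nodes are mixtures, product nodes combine independent scopes,
# leaves are Bernoulli distributions over single variables.

def bernoulli_leaf(p, x):
    """Likelihood of a Bernoulli leaf; x=None marginalizes the variable out."""
    if x is None:
        return 1.0          # summing the leaf over both states gives 1
    return p if x == 1 else 1.0 - p

def spn_likelihood(x1, x2):
    """Sum node mixing two product nodes of independent leaves."""
    comp_a = bernoulli_leaf(0.8, x1) * bernoulli_leaf(0.3, x2)
    comp_b = bernoulli_leaf(0.2, x1) * bernoulli_leaf(0.9, x2)
    return 0.6 * comp_a + 0.4 * comp_b

joint    = spn_likelihood(1, 0)      # P(X1=1, X2=0) = 0.344
marginal = spn_likelihood(1, None)   # P(X1=1)       = 0.56, same single pass

print(joint, marginal)
```

What SPLL adds, per the abstract, is the ability to express distributions that such fixed networks cannot, e.g. ones defined via loops or recursion, while NeST transpiles the program into inference code on a target platform such as PyTorch or Julia.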
