Citation
Karan Sikka, Andrew Silberfarb, John Byrnes, Indranil Sur, Ed Chow, Ajay Divakaran, and Richard Rohwer. Deep Adaptive Semantic Logic (DASL): Compiling Declarative Knowledge into Deep Neural Networks. arXiv:2003.07344, 2020.
Abstract
We introduce Deep Adaptive Semantic Logic (DASL), a novel framework for automating the generation of deep neural networks that incorporates user-provided formal knowledge to improve learning from data. We provide formal semantics demonstrating that our knowledge representation captures all of first-order logic and that finite sampling from infinite domains converges to correct truth values. DASL's representation improves on prior neural-symbolic work by avoiding vanishing gradients, allowing deeper logical structure, and enabling richer interactions between the knowledge and learning components. We illustrate DASL through a toy problem in which we add structure to an image classification task and demonstrate that knowledge of that structure reduces data requirements by a factor of 1000. We then evaluate DASL on a visual relationship detection task and demonstrate that the addition of commonsense knowledge improves performance by 10.7% in a data-scarce setting.
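The abstract stays at a high level, so the sketch below only illustrates the general neural-symbolic idea it refers to: a declarative rule is relaxed into a differentiable term and optimized jointly with the ordinary data loss. This is a generic PyTorch illustration, not DASL's actual compiler, semantics, or loss; the Classifier, the product-style relaxation of implication, and the rule weight are all assumptions made for demonstration.

    # Generic neural-symbolic sketch (not DASL's actual method): the rule
    #   forall x: Cat(x) -> Animal(x)
    # is softened into a differentiable penalty over network outputs.
    import torch
    import torch.nn as nn

    class Classifier(nn.Module):
        """Tiny classifier producing soft truth values for two predicates."""
        def __init__(self, in_dim=64):
            super().__init__()
            self.backbone = nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU())
            self.cat_head = nn.Linear(32, 1)      # soft truth of Cat(x)
            self.animal_head = nn.Linear(32, 1)   # soft truth of Animal(x)

        def forward(self, x):
            h = self.backbone(x)
            return torch.sigmoid(self.cat_head(h)), torch.sigmoid(self.animal_head(h))

    def implication(a, b):
        """Reichenbach (product) relaxation of a -> b; equals 1 when satisfied."""
        return 1.0 - a + a * b

    model = Classifier()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    # Fake mini-batch: features plus labels for Cat only (Animal labels scarce).
    x = torch.randn(16, 64)
    cat_labels = torch.randint(0, 2, (16, 1)).float()

    optimizer.zero_grad()
    cat_prob, animal_prob = model(x)
    data_loss = bce(cat_prob, cat_labels)

    # Knowledge term: penalize violations of Cat(x) -> Animal(x) on the batch,
    # approximating the universally quantified rule by averaging over samples.
    rule_truth = implication(cat_prob, animal_prob)
    knowledge_loss = (1.0 - rule_truth).mean()

    loss = data_loss + 0.5 * knowledge_loss   # 0.5 is an arbitrary rule weight
    loss.backward()
    optimizer.step()

In this toy setup the rule lets the unlabeled Animal head receive gradient signal whenever the Cat head is confident, which is the spirit of the abstract's claim that declarative structure can substitute for labeled data.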