
Comparing differentiable logics for learning with logical constraints

Published 4 Jul 2024 in cs.LO (arXiv:2407.03847v2)

Abstract: Extensive research on formal verification of machine learning systems indicates that learning from data alone often fails to capture underlying background knowledge, such as specifications implicitly available in the data. Various neural network verifiers have been developed to ensure that a machine-learnt model satisfies correctness and safety properties; however, they typically assume a trained network with fixed weights. A promising approach for creating machine learning models that inherently satisfy constraints after training is to encode background knowledge as explicit logical constraints that guide the learning process via so-called differentiable logics. In this paper, we experimentally compare and evaluate various logics from the literature, present our findings, and highlight open problems for future work. We evaluate differentiable logics with respect to their suitability in training, and use a neural network verifier to check their ability to establish formal guarantees. The complete source code for our experiments is available as an easy-to-use framework for training with differentiable logics at https://github.com/tflinkow/comparing-differentiable-logics.
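The abstract describes encoding background knowledge as logical constraints that guide training via differentiable logics. As a minimal sketch (not the paper's code), two common families from this literature can be illustrated: a DL2-style translation, which maps a constraint to a non-negative loss that is zero exactly when the constraint holds, and a fuzzy-logic translation, which maps formulas to truth values in [0, 1]. The function names below are illustrative, not taken from the paper's framework.

```python
# Sketch of two differentiable-logic translations (illustrative only).

def dl2_leq(a: float, b: float) -> float:
    """DL2-style loss for the atom a <= b: zero iff satisfied,
    and growing linearly with the violation otherwise."""
    return max(a - b, 0.0)

def dl2_and(*losses: float) -> float:
    """DL2-style conjunction: sum of losses, zero iff every
    conjunct is satisfied."""
    return sum(losses)

def prod_and(a: float, b: float) -> float:
    """Product-logic conjunction of fuzzy truth values in [0, 1]."""
    return a * b

def prod_implies(a: float, b: float) -> float:
    """Reichenbach implication 1 - a + a*b, a common fuzzy
    translation of 'a implies b'."""
    return 1.0 - a + a * b

# Example: penalise a network output y for violating y <= 0.9
# while also requiring y >= 0.1, DL2-style.
y = 0.95
constraint_loss = dl2_and(dl2_leq(y, 0.9), dl2_leq(0.1, y))
```

In training, such a constraint loss would typically be added (with a weighting factor) to the ordinary data loss, so that gradient descent pushes the network toward satisfying the encoded background knowledge.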

Citations (2)
