
Problems with information theoretic approaches to causal learning

Published 24 Oct 2021 in cs.IT and math.IT (arXiv:2110.12497v1)

Abstract: The language of information theory is favored in both causal reasoning and machine learning frameworks. But is there a better language than this? In this study, we demonstrate the pitfalls of information-theoretic estimation using first-order statistics on (short) sequences for causal learning. We recommend data compression based approaches for causality testing, since these make far fewer assumptions about the data than information-theoretic measures and are more robust to finite data length effects. We conclude with a discussion of the challenges posed in modeling the effect of conditioning a process $X$ on another process $Y$ in causal machine learning. Specifically, conditioning can increase 'confusion', which can be difficult to model with classical information theory. A conscious causal agent creates new choices, decisions and meaning, which poses huge challenges for AI.
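The pitfall of first-order statistics mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's own estimator: it uses a plug-in (frequency-based) entropy estimate and, as an assumed stand-in for the compression-based measures the authors advocate, the general-purpose `zlib` compressor. A strictly alternating sequence is fully predictable, yet its first-order entropy is maximal, because symbol frequencies alone carry no temporal structure; a compressor detects the regularity immediately.

```python
import math
import zlib

def plugin_entropy(seq):
    """First-order (plug-in) Shannon entropy estimate, in bits per symbol."""
    n = len(seq)
    counts = {}
    for s in seq:
        counts[s] = counts.get(s, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_rate(seq):
    """Compressed size in bits per symbol: a crude proxy for algorithmic complexity."""
    data = bytes(seq)
    return 8 * len(zlib.compress(data, 9)) / len(seq)

# Deterministic alternation: first-order statistics see only 50/50 symbol
# frequencies, so the plug-in estimate reports maximal entropy (1 bit/symbol).
periodic = [0, 1] * 5000

print(plugin_entropy(periodic))    # 1.0 -- looks maximally "random"
print(compression_rate(periodic))  # well below 1 bit/symbol -- structure found
```

The gap between the two numbers is the point: any causality test built on first-order entropy estimates would treat the periodic sequence as pure noise, while a compression-based complexity measure does not.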

Authors (1)
