====== And, Or, and the Two KL Projections ======

> I discuss the difference between minimizing the KL-divergence with respect to the first and second argument, and will conclude that they correspond to AND and OR operations on distributions, respectively.
//Cite as: Ortega, P.A. "And, Or, and the Two KL Projections", Tech Note 1, DAIOS, 2024.//
  
Oftentimes I see people wondering about the meaning of the two KL-projections:

{{ ::forward-reverse-kl.png?800 |}}
  
Personally, I find this explanation somewhat vague and unsatisfying. I've looked at the difference from various angles, including [[https://en.wikipedia.org/wiki/Information_geometry|Information Geometry]]. However, I was never truly happy with how the material was presented in the literature. I think the simplified explanation below, which took me years to get to, encapsulates the core insight without relying on the complexities of information geometry. I hope you agree!
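As a quick numeric illustration of the asymmetry (a hypothetical example with made-up distributions, not taken from the figure above):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Two made-up distributions over a 3-element set.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])

forward = kl(p, q)  # D(p || q)
reverse = kl(q, p)  # D(q || p): in general different from D(p || q)
print(forward, reverse)
```

The two values differ, which is why minimizing over the first versus the second argument yields genuinely different projections.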
  
  
Let's say we have $N$ distributions $q_1, q_2, \ldots, q_N$ over a finite set $\mathcal{X}$.
Given a set of positive weights $w_1, w_2, \ldots, w_N$ that sum up to one, their
//linear mixture// is
  
$$
p(x) = \sum_{i=1}^{N} w_i \, q_i(x).
$$
  
The //linear mixture// expresses $N$ mutually exclusive hypotheses $q_i(x)$ that
could be true with probabilities $w_i$. That is, either $q_1$ **or** $q_2$ **or**
... **or** $q_N$ is true, with probability $w_1$, $w_2$, ..., $w_N$ respectively,
Thus, it turns out that sequential predictions can be regarded as an alternation
between OR and AND operations, first to express our uncertainty over the hypotheses,
and second to incorporate new evidence, respectively. What class of algorithms does this remind you of?
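The alternation can be sketched in code. Below is a minimal toy example (the two biased-coin hypotheses and the observation sequence are made up for illustration): each step first forms the linear mixture prediction (OR), then conditions the weights on the observed symbol via Bayes' rule (AND):

```python
import numpy as np

# Two hypothetical hypotheses over a binary symbol x in {0, 1}.
q = np.array([[0.9, 0.1],   # hypothesis 1: P(x=0) = 0.9
              [0.2, 0.8]])  # hypothesis 2: P(x=0) = 0.2
w = np.array([0.5, 0.5])    # prior weights over the hypotheses

for x in [0, 0, 1, 0]:            # an arbitrary observation sequence
    pred = w @ q                  # OR: linear mixture prediction over x
    w = w * q[:, x] / pred[x]     # AND: Bayesian update on the evidence x
    print(f"observed {x}, posterior weights {np.round(w, 3)}")
```

After mostly observing $x = 0$, the weights concentrate on the first hypothesis, as one would expect.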