Questions tagged [entropy]
For questions involving some concept of entropy (Boltzmann, von Neumann, Shannon, Rényi, and others)
338 questions
6
votes
2
answers
798
views
Which conjectures have been solved (or partially solved) using entropy methods?
I am studying how entropy-based arguments (in the sense of Shannon, Boltzmann, or Perelman's geometric entropy) can be used to prove or guide the resolution of major mathematical conjectures.
Some ...
3
votes
0
answers
715
views
Can Tao’s maximum-entropy heuristic for smooth numbers be made rigorous to recover Hildebrand–Tenenbaum asymptotics?
Let $\Psi(x,y)$ be the number of $y$-smooth integers $\le x$.
It is classical (de Bruijn, Hildebrand–Tenenbaum) that $\Psi(x,y) \sim x \rho(u)$ with $u=\log x/\log y$, and more refined saddle-point ...
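As a sanity check on the heuristic (not on the rigor question itself), here is a minimal numerical sketch: it tabulates the Dickman function $\rho$ from the delay identity $u\rho(u)=\int_{u-1}^{u}\rho(t)\,\mathrm dt$ and compares $x\rho(u)$ with a brute-force count of $y$-smooth integers. All function names and parameters are my own.

```python
# Numerical sketch (no rigor claimed) of Psi(x, y) ~ x * rho(u).
import numpy as np

def dickman_rho(u_max=6.0, h=1e-4):
    """Tabulate rho on a grid of step h via u*rho(u) = int_{u-1}^{u} rho(t) dt."""
    n = int(u_max / h) + 1
    one = int(1.0 / h)
    rho = np.ones(n)          # rho(u) = 1 on [0, 1]
    I = 1.0                   # running integral of rho over a unit window
    for k in range(one + 1, n):
        rho[k] = I / (k * h)
        I += 0.5 * h * (rho[k] + rho[k - 1])              # gain right panel
        I -= 0.5 * h * (rho[k - one] + rho[k - one - 1])  # lose left panel
    return rho, h

def largest_prime_factor(n):
    lpf, p = 1, 2
    while p * p <= n:
        while n % p == 0:
            lpf, n = p, n // p
        p += 1
    return max(lpf, n)

x, y = 10**5, 30
psi = 1 + sum(largest_prime_factor(n) <= y for n in range(2, x + 1))
rho, h = dickman_rho()
u = np.log(x) / np.log(y)
print(psi, x * rho[round(u / h)])   # the two counts should be comparable
```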
2
votes
1
answer
127
views
Maximum of packing number for symmetric convex sets
Let $B$ denote the unit Euclidean ball centered at $0 \in \mathbb{R}^n$. Given a set $K \subset \mathbb{R}^n$ let us denote by $P_K(x)$ the maximal number of points $y_i \in K \cap (x + B)$ such that ...
3
votes
0
answers
126
views
Kolmogorov $\varepsilon$-entropy of an ellipsoid
I am trying to understand Kolmogorov (metric) entropy in a concrete geometric setting.
Let $(X,d)$ be a metric space and $K \subset X$ a compact set. For $\varepsilon > 0$, define the Kolmogorov $\...
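For what it's worth, the volumetric lower bound in this setting is short. If $\mathcal E=\{x:\sum_j x_j^2/a_j^2\le 1\}$ has semi-axes $a_1\ge a_2\ge\cdots$, project any $\varepsilon$-cover onto the span of the coordinates with $a_j>\varepsilon$ and compare volumes there:
$$
\log N(\mathcal E,\varepsilon)\;\ge\;\sum_{j\,:\,a_j>\varepsilon}\log\frac{a_j}{\varepsilon}.
$$
Upper bounds of the same shape, with $a_j/\varepsilon$ replaced by $C\,a_j/\varepsilon$, are classical (Kolmogorov–Tikhomirov-type results), though I am quoting the form from memory.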
2
votes
1
answer
332
views
Detecting dependence of one random variable on the other through a function
This is a non-specialist question about probability theory.
I have two random variables $X$ and $Y$, and I want to measure to what extent there is a function $F$ such that $Y = F(X)$.
I asked an AI ...
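One standard proxy here (my suggestion, not something from the question) is the correlation ratio $\eta^2=\operatorname{Var}(\mathbb E[Y\mid X])/\operatorname{Var}(Y)$, which equals $1$ exactly when $Y$ is a (measurable) function of $X$ and $0$ when $\mathbb E[Y\mid X]$ is constant. A minimal binning estimator, with hypothetical names:

```python
# Correlation ratio eta^2 = Var(E[Y|X]) / Var(Y), estimated by binning X.
# ~1: Y is (nearly) a function of X; ~0: E[Y|X] is (nearly) constant.
import numpy as np

def correlation_ratio(x, y, bins=30):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    within = sum((idx == b).mean() * y[idx == b].var()
                 for b in range(bins) if (idx == b).any())
    return 1.0 - within / y.var()   # law of total variance

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 20000)
print(correlation_ratio(x, np.sin(3 * x)))            # ~1: functional
print(correlation_ratio(x, rng.normal(size=x.size)))  # ~0: independent noise
```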
2
votes
0
answers
186
views
Can conditioning eliminate VC dimension dependence in empirical process bounds?
I'm analyzing the following function class:
$$
\mathcal{F} = \left\{ (x, z, y) \mapsto \mathbb{1}\{ y \leq z\alpha + x^\top\beta \} : \alpha \in \mathbb{R}, \beta \in \mathbb{R}^d \right\}.
$$
Let $\...
2
votes
0
answers
151
views
I am looking for "something like" an entry-wise matrix 1/2-norm. Has such a thing been studied? Where should I look?
Suppose $A$ is an $n\times n$ square matrix with real or complex entries $A_{ij}$. Now define a quantity $Z(A)$ associated with the matrix by
$$
Z(A)=\sum_i \sum_j |A_{ij}|^{1/2}.
$$
What is this ...
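Two quick observations, easy to verify numerically: $Z$ is subadditive (since $t\mapsto\sqrt t$ is) but only $1/2$-homogeneous, so it is not a norm; on the other hand $Z(A)^2$ is exactly the entrywise $\ell_{1/2}$ quasi-norm $\|A\|_{1/2}=\big(\sum_{i,j}|A_{ij}|^{1/2}\big)^2$, which is where a literature search might start. A trivial snippet:

```python
import numpy as np

def Z(A):
    # Z(A) = sum_ij |A_ij|^(1/2); Z(A)**2 is the entrywise l_{1/2} quasi-norm
    return np.sqrt(np.abs(A)).sum()

A = np.array([[1.0, -4.0], [0.0, 9.0]])
print(Z(A))                # 1 + 2 + 0 + 3 = 6
print(Z(4 * A), 2 * Z(A))  # Z(cA) = sqrt(c) Z(A): degree-1/2 homogeneous
```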
0
votes
0
answers
94
views
Covering number for Lipschitz functions
Let $\mathcal{F}$ be the class of 1-Lipschitz functions $f:\mathbb{R} \to \mathbb{R}$ satisfying that $f(0) = 0$. Define a norm
$$
\|f\| = \max_{i\in I} \lambda_i\|f\|_{L^\infty(-R_i,R_i)},
$$
where $...
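For calibration: in the unweighted single-interval case the answer is the classical Kolmogorov–Tikhomirov bound, $\log N\big(\varepsilon,\{f\ \text{1-Lipschitz on }[-R,R],\,f(0)=0\},\|\cdot\|_{L^\infty}\big)\asymp R/\varepsilon$. For the weighted norm above one might guess a sum of such terms over the scales $(\lambda_i,R_i)$, but I have not checked that, and the interaction between scales is presumably the substance of the question.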
0
votes
0
answers
109
views
Uniform entropy bounds for unions of VC subgraph classes
I'm working with VC subgraph classes of functions, say $\mathcal{F}$ and $\mathcal{G}$, which are both uniformly bounded and admit envelopes $F$ and $G$, respectively. I came across a useful lemma (...
10
votes
0
answers
327
views
The discrete uncertainty principle and entropy
If $G$ is a finite abelian group, $f : G\to {\bf C}$ is a function, and $\hat f$ is its Fourier transform, then
$$|{\rm supp}(f)| \cdot |{\rm supp}(\hat f)| \ge |G|.\tag 1$$
This is the discrete ...
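Inequality (1) is easy to stress-test numerically on $G=\mathbb Z/N$ with the DFT; here is a throwaway check (supports computed up to floating-point tolerance):

```python
import numpy as np

N = 12
rng = np.random.default_rng(1)
for _ in range(1000):
    f = np.zeros(N, dtype=complex)
    S = rng.choice(N, size=rng.integers(1, N + 1), replace=False)
    f[S] = rng.normal(size=S.size) + 1j * rng.normal(size=S.size)
    fhat = np.fft.fft(f)
    supp_f = np.count_nonzero(np.abs(f) > 1e-9)
    supp_fhat = np.count_nonzero(np.abs(fhat) > 1e-9)
    assert supp_f * supp_fhat >= N    # |supp f| * |supp f^| >= |G|
```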
4
votes
1
answer
278
views
Is metric entropy characterized by dense subsets?
Let $X\subseteq Y$ be a dense subset of a metric space $(Y,\rho)$. Let $\varepsilon>0$, $A\subseteq Y$ and let $N(A,\varepsilon)$ denote the external $\varepsilon$-covering number of $A$; i.e. the ...
2
votes
1
answer
160
views
Entropy in the context of quantum discord
In this paper of Ollivier & Żurek, the notion of quantum discord is introduced. I already asked one question related to this paper in this discussion. However, I realized that one (possibly small) ...
0
votes
0
answers
106
views
Is there a relation between the topological entropy of a map $T$ and the Artin–Mazur zeta function of $T$? (Same for Weil, Riemann, Ruelle, ... zetas?)
Given a map $T$ from a (say compact) topological space to itself, one may consider the topological entropy of $T$ (it is related to Kolmogorov–Sinai, or metric, entropy in the case of metric spaces)...
5
votes
2
answers
220
views
Does this notion of "differential" entropy exist?
In Leinster's Entropy and Diversity it's shown that the Shannon Entropy $H$ of finite random variables is characterized up to a constant factor by the so-called chain rule (and continuous dependence ...
20
votes
1
answer
611
views
Intuitive/combinatorial proof for Boppana entropy inequality $H(x^2)\ge\phi xH(x)$, i.e. $\binom{\phi p n}{\phi p^2 n} \leq \binom{n}{p^2n}$
Let $H(x)$ on $[0,1]$ denote the binary entropy function (the base of the logarithm does not matter for this whole discussion). Let $\phi:= \frac{1+\sqrt 5}{2}$ denote the golden ratio $\approx 1.618$....
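A quick numerical look (no proof content) at the inequality and its equality case: at $x=1/\phi$ one has $x^2=1-x$, so $H(x^2)=H(x)$ and both sides agree; elsewhere the gap appears strictly positive.

```python
import numpy as np

phi = (1 + 5**0.5) / 2
x = np.linspace(1e-6, 1 - 1e-6, 10**6)

def H(t):  # binary entropy; base of the log is irrelevant for the inequality
    return -t * np.log(t) - (1 - t) * np.log(1 - t)

gap = H(x**2) - phi * x * H(x)
print(gap.min())  # >= 0 up to float error; ~0 near x = 1/phi (equality case)
```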
0
votes
1
answer
113
views
KL Divergence Order For mixture distribution
Suppose $KL(p,q) \geq KL(r,q)$ for densities $p,q,r$ with the same support. Does this imply $KL(p, tp+(1-t)q) \geq KL(r, tr+(1-t)q)$ for $t\in [0,1]$? It is a different version of the question of KL ...
3
votes
1
answer
223
views
KL divergence order for convex combination
Consider probability density functions $p$, $q$, and $r$ with the same support. For $t \in (0,1)$, does $D(p,q) > D(p,r)$ imply $$D(p,tp+(1-t)q)\geq D(p,tp+(1-t)r)?$$
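For both this and the preceding question, a crude random search over small simplices is a cheap first step (finding nothing proves nothing, of course); a sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def D(p, q):  # KL divergence for strictly positive probability vectors
    return float(np.sum(p * np.log(p / q)))

for _ in range(20000):
    p, q, r = (rng.dirichlet(np.ones(4)) for _ in range(3))
    t = rng.uniform(0, 1)
    if D(p, q) > D(p, r) and D(p, t*p + (1-t)*q) < D(p, t*p + (1-t)*r) - 1e-10:
        print("candidate counterexample:", p, q, r, t)
        break
```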
7
votes
1
answer
386
views
What is the name of the quantity $\sum_{i=1}^n(p_i-q_i)\log p_i$?
Let $P=\{p_1,\dots,p_n\}$ and $Q=\{q_1,\dots,q_n\}$ be discrete probability distributions. It is well known that $D_\text{KL}(P\|Q)=\sum_{i=1}^np_i(\log p_i-\log q_i)$ is the Kullback–Leibler divergence ...
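One decomposition that may help with naming: splitting the sum and using $\sum_i q_i\log p_i=-H(Q)-D_\text{KL}(Q\|P)$ gives
$$
\sum_{i=1}^n(p_i-q_i)\log p_i\;=\;D_\text{KL}(Q\|P)+H(Q)-H(P),
$$
i.e. a reverse KL divergence plus an entropy difference; I don't know a standard name for the combination.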
3
votes
0
answers
109
views
Combinatorial/probabilistic interpretation of a quantity of union closed family
Let $\mathcal{F}\subseteq2^{[n]}$ be a union-closed family of sets. For a set $S\subseteq[n]$ (not necessarily belonging to $\mathcal{F}$), define $w_{\mathcal{F}}(S)$ to be the number of subsets of $S$ which ...
1
vote
0
answers
98
views
Convergence of iterated average Bayesian posterior to high entropy distribution
Setup
Assume $p_Y \in \Delta^n$ is a probability vector obtained by $p_Y=L_{Y|X}p_X$, where $L_{Y|X} \in \mathbb{R}^{n \times m}$ is an arbitrary likelihood (i.e., a column-stochastic matrix) and $p_X \...
2
votes
1
answer
277
views
Is Boltzmann entropy well-defined for arbitrary probability density function?
$\newcommand{\bR}{\mathbb{R}}\newcommand{\diff}{\mathop{}\!\mathrm{d}}$ We define a continuous function $\varphi : \bR_+ \to \bR$ by
$$
\varphi (s) :=
\begin{cases}
0 &\text{if} \quad s =0 , \\
s \...
19
votes
2
answers
2k
views
Probability vector $p$ majorizes its normalized entropy vector $\small \frac{-p\log p}{H(p)}$
I guess the following inequality
$$ \sum_{i=1}^n g \left (\frac{-p_i \log p_i}{H(\boldsymbol{p})} \right ) \le \sum_{i=1}^n g (p_i)$$
holds for any continuous convex function $g$ and any probability ...
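Since both vectors sum to $1$, the claim for all continuous convex $g$ is (by Hardy–Littlewood–Pólya) the same as the majorization in the title, which is cheap to stress-test:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(10000):
    p = rng.dirichlet(np.ones(rng.integers(2, 8)))
    H = -(p * np.log(p)).sum()
    v = -p * np.log(p) / H            # normalized entropy vector, sums to 1
    a = np.sort(p)[::-1].cumsum()     # partial sums in decreasing order
    b = np.sort(v)[::-1].cumsum()
    assert (a >= b - 1e-12).all()     # does p majorize v?
```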
2
votes
0
answers
297
views
Ultraviolet divergences of entanglement entropy in QFT
I've often read that entanglement entropy in quantum field theory is ill-defined because local algebras are generally of type III, which implies that a trace doesn't exist. For a normal state $\omega_{...
3
votes
1
answer
147
views
$L^{1}$-convergence to steady states for an advection-diffusion equation on the half real line
I consider the following problem on the half real line
$$
\begin{cases}
u_t = u_{xx} + u_x, & \quad t > 0, \, x > 0, \\[2mm]
-u_x|_{x=0} = u|_{x=0}, & \quad t > 0, \, x = 0, \\[2mm]
u|...
2
votes
0
answers
176
views
Continuity of entropies, replica trick and Hausdorff moment problem
I could not find a really appropriate title for my question (happy to revise) but let me explain.
Suppose $p(x|c)$ is a probability density function over $x \in [0,1]$ which depends continuously on ...
3
votes
0
answers
142
views
The topological entropy of potential space filling curves on the unit interval
By a potential space filling curve we mean a continuous function $f:[0,1]\to [0,1]$ such that there is a continuous surjective function $g:[0,1]\to [0,1]^2$ with $f=\pi_1 \circ g$ where $\pi_1$...
0
votes
0
answers
223
views
Shub Conjecture and polynomial entropy
The Shub conjecture on the topological entropy $h(f)$ of a self-map $f$ of a manifold $M$ says that the topological entropy is greater than or equal to the log of the maximum absolute value of the ...
6
votes
4
answers
2k
views
Bounding a binomial coefficient using the binary entropy function
I'm reading that recent paper on a new bound for diagonal Ramsey and am stuck at the attached "Fact 12.1", which is "standard".
Could anyone please point me to a source for this ...
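The standard fact usually meant by this is: for $0<\alpha<1$ with $\alpha n$ an integer,
$$
\binom{n}{\alpha n}\le 2^{nH(\alpha)},
$$
with $H$ the binary entropy to base $2$, proved in one line from $1=(\alpha+(1-\alpha))^n\ge\binom{n}{\alpha n}\,\alpha^{\alpha n}(1-\alpha)^{(1-\alpha)n}=\binom{n}{\alpha n}\,2^{-nH(\alpha)}$; a textbook source is Cover & Thomas, Elements of Information Theory. Whether this matches Fact 12.1 verbatim I cannot confirm from the excerpt.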
1
vote
0
answers
95
views
Maximize differential entropy of probability distribution without fixing second moment
(Crossposted from Mathematics Stackexchange)
In Christopher Bishop's book "Pattern Recognition and Machine Learning", pages 53-54 specifically, he uses Lagrange multipliers to find the ...
1
vote
1
answer
229
views
Trying to get intuition into why cross entropy will always be greater than or equal to the entropy
I understand what entropy measures, and cross entropy is the same except that it uses another distribution $q$ to compare against $p$. Is it because the log function is concave down, so the predictions ...
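The mechanism is exactly Gibbs' inequality:
$$
H(p,q)-H(p)=\sum_i p_i\log\frac{p_i}{q_i}=D_\text{KL}(p\|q)\ge 0,
$$
where nonnegativity follows from concavity of $\log$ via Jensen: $\sum_i p_i\log\frac{q_i}{p_i}\le\log\sum_i p_i\frac{q_i}{p_i}=\log 1=0$, with equality iff $q=p$.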
0
votes
0
answers
219
views
About the monotonicity of the exponential entropy
This question was previously posted on MSE at About the monotonicity of the exponential entropy.
In the paper The Unifying Frameworks of Information Measures the conditional exponential entropy (see ...
11
votes
1
answer
817
views
Entropy arguments used by Jean Bourgain
My question comes from understanding a probabilistic inequality in Bourgain's paper on the Erdős similarity problem: Construction of sets of positive measure not containing an affine image of a given ...
4
votes
1
answer
346
views
Maximum entropy probability distribution with fixed interval and variance?
What is the maximum entropy probability distribution if the support is a fixed interval (e.g. $[-1,1]$) with an already known variance?
If we know the support is a fixed interval, then the maximum ...
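Assuming (as symmetry suggests) a mean-zero maximizer, stationarity of the Lagrangian $\int(-p\log p)+\lambda_0\int p+\lambda_2\int x^2p$ gives $\log p(x)=-1+\lambda_0+\lambda_2x^2$, i.e.
$$
p(x)\;\propto\;e^{\lambda_2 x^2}\quad\text{on }[-1,1],
$$
a truncated Gaussian with $\lambda_2<0$ when $\sigma^2<1/3$ (the variance of the uniform law on $[-1,1]$), the uniform density at $\sigma^2=1/3$, and an "inverted" Gaussian with mass pushed toward $\pm1$ when $1/3<\sigma^2<1$.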
0
votes
1
answer
417
views
Reference request: log Sobolev inequality for uniform measure (uniform distribution over discrete set)
Suppose that $N \in \mathbb N_+$ is fixed and denote by $\mu = (\mu_0,\ldots,\mu_N)$ the uniform distribution on the set $\{0,1,\ldots,N\}$ (i.e., $\mu_n = \frac{1}{N+1}$ for each $0\leq n\leq N$). I ...
2
votes
0
answers
150
views
Information inequality for Renyi divergences
Let $X^1 \ldots X^n$ be random variables on $\mathbb{R}^d$ with an arbitrary joint probability distribution $\mu$ on $\mathbb{R}^{n \times d}$. Let $\nu = \nu^1 \times \ldots \times \nu^n$ be a ...
1
vote
1
answer
316
views
Inequalities involving entropy: quantum discord and mutual information
My question is inspired by the following paper of Ollivier and Żurek, but for this question to be self-contained I will recall all the necessary definitions: for a quantum state $\rho$ we define the ...
1
vote
1
answer
187
views
Is the Boltzmann entropy continuous in the supremum norm?
We define $U : [0, +\infty) \to [0, +\infty)$ by $U(0) := 0$ and $U (s) := s \log s$ for $s >0$. Then $U$ is strictly convex. Let $D$ be the set of all bounded non-negative continuous functions $\...
1
vote
0
answers
181
views
Estimating the entropy of the solution to an SDE
Forgive me for the poorly researched question. I'm currently working on a computer science project involving training a neural stochastic differential equation, and I've run into a problem while ...
2
votes
1
answer
190
views
Does every proximal dynamical system have zero topological entropy?
A dynamical system is proximal if $$\:\forall (x,y) \in X \times X, \: \liminf_{n \rightarrow \infty} d(f^{n}(x),f^{n}(y)) = 0 $$ (where $X$ is a compact metric space with metric $d$). Is it true that ...
0
votes
1
answer
119
views
Can we lower bound this entropy by $\int_{\mathbb R^d} \rho^k (x) \, \mathrm d x$ and $\int_{\mathbb R^d} |x|^2\rho (x) \, \mathrm d x$?
We define $U : [0, \infty) \to [0, \infty)$ by $U(0) := 1$ and $U (s) := s \log s + (1-s)$ for $s >0$. Then $U$ is strictly convex. The minimum of $U$ is $0$ and is attained at $s=1$. Let $\mathcal ...
1
vote
2
answers
393
views
Is the Boltzmann entropy lower semi-continuous in the weak topology induced by $C_b (\mathbb R^d)$?
For Lebesgue-absolutely continuous probability measures $\rho\ll \mathcal{L}^d$ in the whole space $\mathbb{R}^d$ with finite second moments (i.e. $\rho\in \mathcal{P}^2_{ac}(\mathbb{R}^d)$), let
$$
\...
3
votes
1
answer
201
views
Let $\mu : [0, T] \to \mathcal P_2^a (\mathbb R^d), t \mapsto \mu_t$ be absolutely continuous. Is $t \mapsto \mathcal H (\mu_t)$ continuous?
We endow the space $\mathcal P_2^a (\mathbb R^d)$ of absolutely continuous probability measures with finite second moment with the Wasserstein distance $W_2$. Let $\mathcal H (\mu)$ be the relative ...
1
vote
1
answer
148
views
Does convergence of Radon transforms of a sequence of probability distributions imply convergence of the distributions themselves?
Let $P_1,P_2,\ldots $ be a sequence of absolutely continuous probability measures on $\mathbb R^n$, and let
$f_j:\mathbb R^n\to\mathbb R$ be their PDFs. Assume that $\operatorname{E}P_j = 0$ and $\...
2
votes
1
answer
250
views
Maximal entropy distribution on three variables knowing its marginals on any two
Observation 0: Given a finite set $X$, the probability distribution on $X$ with highest entropy is the uniform one. This is well known.
Observation 1: Given two finite sets $X,Y$ and two probability ...
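One way to experiment with this: the maximum-entropy joint with prescribed two-variable marginals has the exponential-family form $p(x,y,z)\propto a(x,y)\,b(y,z)\,c(x,z)$, and cycling iterative proportional fitting (in the spirit of Csiszár's iterated $I$-projections) converges to it whenever the marginals are jointly feasible. A minimal sketch; the array shapes and names are mine:

```python
import numpy as np

rng = np.random.default_rng(0)
q = rng.dirichlet(np.ones(8)).reshape(2, 2, 2)   # a joint, used only for its marginals
M_xy, M_yz, M_xz = q.sum(2), q.sum(0), q.sum(1)  # the three pairwise marginals

p = np.full((2, 2, 2), 1 / 8)                    # start from the uniform distribution
for _ in range(500):                             # IPF: cycle the three constraints
    p *= (M_xy / p.sum(2))[:, :, None]
    p *= (M_yz / p.sum(0))[None, :, :]
    p *= (M_xz / p.sum(1))[:, None, :]
print(np.abs(p.sum(2) - M_xy).max())             # ~0 once converged
```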
0
votes
0
answers
119
views
Does there exist an established name for the exponential of surprisal (i.e. the reciprocal of probability)?
There are several different names that I know of for the exponential of the entropy, of which "diversity" and "perplexity" are fairly well-established. Tom Leinster has a very ...
3
votes
1
answer
488
views
Relative entropy equality for a sequence of Bernoulli random variables
We are given two joint probability distributions, $p$ and $q$, of $n$ Bernoulli random variables $X_1, X_2, \ldots, X_n$.
We denote by $p(x_k\mid x^{k-1})$ the probability $\mathbb{P}_p(X_k=x_k\mid ...
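Presumably the starting point is the chain rule for relative entropy: with the notation above,
$$
D(p\|q)\;=\;\sum_{k=1}^n\mathbb E_{X^{k-1}\sim p}\Big[D\big(p(\cdot\mid X^{k-1})\,\big\|\,q(\cdot\mid X^{k-1})\big)\Big],
$$
obtained by writing $\log\frac{p(x^n)}{q(x^n)}$ as a telescoping sum of conditional log-ratios and taking the $p$-expectation.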
3
votes
0
answers
158
views
Differential entropy of random Gibbs measure
There is a question I have been wondering about for a while, which I have thus far not been able to resolve. The problem revolves around random Gibbs measures. I am not very well-versed in the more ...
2
votes
1
answer
214
views
Morse–Hedlund/Coven–Hedlund theorem for non-Abelian groups
There is a well-known theorem by Coven and Hedlund, in Sequences with minimal block growth, stating that the complexity function of an aperiodic sequence/configuration $\omega\in \mathcal{A}^{\mathbb{Z}...
3
votes
1
answer
254
views
Bound on an integral representing a difference of two relative entropies
Let $ f : [0,1] \to \mathbb{R} $ be a function satisfying: 1.) $ |f(x)| \leqslant a $ for some $ a < 1 $, and 2.) $ \int_0^1 f(x) {\mathrm d}x = 0 $. I would like to know whether the following ...
2
votes
2
answers
1k
views
Defining a measure of uniformity for measurable subsets of $[0,1]^2$ w.r.t dimension $\alpha\in[0,2]$
Let $(X,d)$ be a metric space. For a set $A\subseteq X$, let $H^{\alpha}$ be the $\alpha$-dimensional Hausdorff measure on $A$, where $\alpha\in[0,2]$ and $\text{dim}_{\text{H}}(A)$ is the Hausdorff ...