
Questions tagged [entropy]

For questions involving some concept of entropy (Boltzmann, von Neumann, Shannon, Rényi, and others)

6 votes
2 answers
798 views

I am studying how entropy-based arguments (in the sense of Shannon, Boltzmann, or Perelman's geometric entropy) can be used to prove or guide the resolution of major mathematical conjectures. Some ...
3 votes
0 answers
715 views

Let $\Psi(x,y)$ be the number of $y$-smooth integers $\le x$. It is classical (de Bruijn, Hildebrand-Tenenbaum) that $\Psi(x,y) \sim x \rho(u)$ with $u=\log x/\log y$, and more refined saddle-point ...
UpperBound2025 Bound
2 votes
1 answer
127 views

Let $B$ denote the unit Euclidean ball centered at $0 \in \mathbb{R}^n$. Given a set $K \subset \mathbb{R}^n$ let us denote by $P_K(x)$ the maximal number of points $y_i \in K \cap (x + B)$ such that ...
Drew Brady
3 votes
0 answers
126 views

I am trying to understand Kolmogorov (metric) entropy in a concrete geometric setting. Let $(X,d)$ be a metric space and $K \subset X$ a compact set. For $\varepsilon > 0$, define the Kolmogorov $\...
user197284
2 votes
1 answer
332 views

This is a non-specialist question about probability theory. I have two random variables $X$ and $Y$, and I want to measure to what extent is there a function $F$ such that $Y$ is $F(X)$. I asked an AI ...
მამუკა ჯიბლაძე
2 votes
0 answers
186 views

I'm analyzing the following function class: $$ \mathcal{F} = \left\{ (x, z, y) \mapsto \mathbb{1}\{ y \leq z\alpha + x^\top\beta \} : \alpha \in \mathbb{R}, \beta \in \mathbb{R}^d \right\}. $$ Let $\...
Stan • 55
2 votes
0 answers
151 views

Let $A$ be an $n\times n$ matrix with real or complex entries $A_{ij}$. Now define a quantity $Z(A)$ associated with the matrix by $$ Z(A)=\sum_i \sum_j |A_{ij}|^{1/2}. $$ What is this ...
Christopher Fuchs
0 votes
0 answers
94 views

Let $\mathcal{F}$ be the class of 1-Lipschitz functions $f:\mathbb{R} \to \mathbb{R}$ satisfying that $f(0) = 0$. Define a norm $$ \|f\| = \max_{i\in I} \lambda_i\|f\|_{L^\infty(-R_i,R_i)}, $$ where $...
user58955 • 640
0 votes
0 answers
109 views

I'm working with VC subgraph classes of functions, say $\mathcal{F}$ and $\mathcal{G}$, which are both uniformly bounded and admit envelopes $F$ and $G$, respectively. I came across a useful lemma (...
Stan • 55
10 votes
0 answers
327 views

If $G$ is a finite abelian group, $f : G\to {\bf C}$ is a function, and $\hat f$ is its Fourier transform, then $$|{\rm supp}(f)| \cdot |{\rm supp}(\hat f)| \ge |G|.\tag 1$$ This is the discrete ...
Marcel K. Goh
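The discrete uncertainty principle (1) can be checked numerically with a naive DFT on a cyclic group; a minimal stdlib-only sketch (function names are mine), using the indicator of the subgroup $\{0,4\}$ of $\mathbb{Z}_8$, an equality case since $2 \cdot 4 = 8 = |G|$:

```python
import cmath

def dft(f):
    """Naive discrete Fourier transform on the cyclic group Z_n:
    fhat[k] = sum_x f[x] * exp(-2*pi*i*k*x / n)."""
    n = len(f)
    return [sum(f[x] * cmath.exp(-2j * cmath.pi * k * x / n) for x in range(n))
            for k in range(n)]

def support_size(v, tol=1e-9):
    """Number of entries of v that are nonzero up to a tolerance."""
    return sum(1 for c in v if abs(c) > tol)

# Indicator of the subgroup {0, 4} of Z_8; its transform is supported
# on the even frequencies, so |supp(f)| * |supp(fhat)| = 2 * 4 = |G|.
f = [1, 0, 0, 0, 1, 0, 0, 0]
fhat = dft(f)
assert support_size(f) * support_size(fhat) >= len(f)
```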
4 votes
1 answer
278 views

Let $X\subseteq Y$ be a dense subset of a metric space $(Y,\rho)$. Let $\varepsilon>0$, $A\subseteq Y$ and let $N(A,\varepsilon)$ denote the external $\varepsilon$-covering number of $A$; i.e. the ...
Mathematical-Semi_N00b
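For intuition about external covering numbers, here is a minimal one-dimensional sketch (names are mine): the external $\varepsilon$-covering number of a finite set of reals, computed by a greedy sweep, which is optimal on the line since ball centers may lie anywhere in the ambient space.

```python
def covering_number_1d(points, eps):
    """External epsilon-covering number of a finite subset of R: the
    minimal number of closed balls of radius eps, with centers anywhere
    in R, needed to cover the set. A greedy left-to-right sweep is
    optimal in one dimension."""
    pts = sorted(points)
    count = 0
    i = 0
    while i < len(pts):
        count += 1
        # A ball centered at pts[i] + eps covers [pts[i], pts[i] + 2*eps].
        right = pts[i] + 2 * eps
        while i < len(pts) and pts[i] <= right:
            i += 1
    return count
```

For example, `covering_number_1d([0.0, 0.9, 2.0, 2.5], 0.5)` returns 2: one radius-$1/2$ ball covers $\{0, 0.9\}$ and another covers $\{2, 2.5\}$.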
2 votes
1 answer
160 views

In this paper of Ollivier & Żurek, the notion of quantum discord is introduced. I already asked one question related to this paper in this discussion. However, I realized that one (possibly small) ...
truebaran • 9,634
0 votes
0 answers
106 views

Given a map $T$ from a (say compact) topological space to itself, one may consider the topological entropy of $T$ (it is related to the Kolmogorov–Sinai, or metric, entropy in the case of metric spaces)...
Alexander Chervov
5 votes
2 answers
220 views

In Leinster's Entropy and Diversity it's shown that the Shannon Entropy $H$ of finite random variables is characterized up to a constant factor by the so-called chain rule (and continuous dependence ...
Michael Bächtold
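The chain rule in question can be checked numerically. A small sketch in my own notation: composing a distribution $w = (w_1,\dots,w_n)$ with distributions $p_1,\dots,p_n$ gives $H(w \circ (p_1,\dots,p_n)) = H(w) + \sum_i w_i H(p_i)$.

```python
import math

def H(p):
    """Shannon entropy -sum p_i log p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Chain rule: the entropy of the composite distribution splits into the
# entropy of the weights plus the weighted entropies of the parts.
w = [0.5, 0.5]
p1, p2 = [0.2, 0.8], [0.3, 0.3, 0.4]
composite = [w[0] * x for x in p1] + [w[1] * x for x in p2]
lhs = H(composite)
rhs = H(w) + w[0] * H(p1) + w[1] * H(p2)
assert abs(lhs - rhs) < 1e-12
```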
20 votes
1 answer
611 views

Let $H(x)$ on $[0,1]$ denote the binary entropy function (the base of the logarithm does not matter for this whole discussion). Let $\phi:= \frac{1+\sqrt 5}{2}$ denote the golden ratio $\approx 1.618$....
D.R. • 1,255
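For concreteness, a minimal sketch of the binary entropy function mentioned above (name and base-2 default are mine; as the question notes, the base only rescales $H$ by a constant):

```python
import math

def binary_entropy(x, base=2.0):
    """Binary entropy H(x) = -x*log(x) - (1-x)*log(1-x) on [0, 1].

    H(0) = H(1) = 0 by the convention 0*log(0) = 0; changing the base
    of the logarithm rescales H by a constant factor.
    """
    if x in (0.0, 1.0):
        return 0.0
    return -(x * math.log(x, base) + (1 - x) * math.log(1 - x, base))
```

With base 2, $H$ peaks at $H(1/2) = 1$ and is symmetric about $x = 1/2$.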
0 votes
1 answer
113 views

Suppose $KL(p,q) \geq KL(r,q)$ for densities $p,q,r$ with the same support. Does this imply $KL(p, tp+(1-t)q) \geq KL(r, tr+(1-t)q)$ for $t\in [0,1]$? It is a different version of the question of KL ...
Krrrr • 35
3 votes
1 answer
223 views

Consider probability density functions $p$, $q$, and $r$ with the same support. For $t \in (0,1)$, does $D(p,q) > D(p,r)$ imply $$D(p,tp+(1-t)q)\geq D(p,tp+(1-t)r)?$$
Krrrr • 35
7 votes
1 answer
386 views

Let $P=\{p_1,\dots,p_n\}$ and $Q=\{q_1,\dots,q_n\}$ be discrete probability distributions. It is well known that $D_\text{KL}(P\|Q)=\sum_{i=1}^np_i(\log p_i-\log q_i)$ is the Kullback–Leibler divergence ...
Veronica Phan
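The displayed formula for $D_\text{KL}(P\|Q)$ translates directly into code; a minimal stdlib-only sketch (function name is mine), using the standard conventions for zero probabilities:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(P||Q) = sum_i p_i (log p_i - log q_i).

    Terms with p_i = 0 contribute 0 by convention; q_i = 0 with p_i > 0
    makes the divergence infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue
        if qi == 0:
            return math.inf
        total += pi * (math.log(pi) - math.log(qi))
    return total
```

Note $D_\text{KL}$ is not symmetric in $P$ and $Q$, and it vanishes exactly when $P = Q$.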
3 votes
0 answers
109 views

Let $\mathcal{F}\subseteq2^{[n]}$ be a union-closed family of sets. For a set $S\subseteq[n]$ (not necessarily belonging to $\mathcal{F}$), define $w_{\mathcal{F}}(S)$ to be the number of subsets of $S$ which ...
Veronica Phan
1 vote
0 answers
98 views

Setup: Assume $p_Y \in \Delta^n$ is a probability vector obtained by $p_Y=L_{Y|X}p_X$, where $L_{Y|X} \in \mathbb{R}^{n \times m}$ is an arbitrary likelihood (i.e., a column-stochastic matrix) and $p_X \...
backboltz37
2 votes
1 answer
277 views

$\newcommand{\bR}{\mathbb{R}}\newcommand{\diff}{\mathop{}\!\mathrm{d}}$ We define a continuous function $\varphi : \bR_+ \to \bR$ by $$ \varphi (s) := \begin{cases} 0 &\text{if} \quad s =0 , \\ s \...
Akira • 1,163
19 votes
2 answers
2k views

I conjecture that the following inequality $$ \sum_{i=1}^n g \left (\frac{-p_i \log p_i}{H(\boldsymbol{p})} \right ) \le \sum_{i=1}^n g (p_i)$$ holds for any continuous convex function $g$ and any probability ...
Amir • 413
2 votes
0 answers
297 views

I've often read that entanglement entropy in quantum field theory is ill-defined because local algebras are generally of type III, which implies that a trace doesn't exist. For a normal state $\omega_{...
Gabriel Palau
3 votes
1 answer
147 views

I consider the following problem on the half real line $$ \begin{cases} u_t = u_{xx} + u_x, & \quad t > 0, \, x > 0, \\[2mm] -u_x|_{x=0} = u|_{x=0}, & \quad t > 0, \, x = 0, \\[2mm] u|...
Garou Garou
2 votes
0 answers
176 views

I could not find a really appropriate title for my question (happy to revise) but let me explain. Suppose $p(x|c)$ is a probability density function over $x \in [0,1]$ which depends continuously on ...
nervxxx • 231
3 votes
0 answers
142 views

By a potential space-filling curve we mean a continuous function $f:[0,1]\to [0,1]$ such that there is a continuous surjective function $g:[0,1]\to [0,1]^2$ with $f=\pi_1 \circ g$, where $\pi_1$...
Ali Taghavi
0 votes
0 answers
223 views

The Shub conjecture on the topological entropy $h(f)$ of a self-map $f$ of a manifold $M$ says that the topological entropy is greater than or equal to the log of the maximum of the absolute values of the ...
Ali Taghavi
6 votes
4 answers
2k views

I'm reading that recent paper on a new bound for diagonal Ramsey and am stuck at the attached "Fact 12.1", which is "standard". Could anyone please point me to a source for this ...
Lawrence Paulson
1 vote
0 answers
95 views

(Crossposted from Mathematics Stackexchange) In Christopher Bishop's book "Pattern Recognition and Machine Learning", pages 53-54 specifically, he uses Lagrange multipliers to find the ...
Hippopotoman
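For readers without the book at hand, the computation being referenced is the standard one (my sketch, not a quote from Bishop): maximize the entropy $H = -\sum_{i=1}^M p_i \ln p_i$ of a distribution over $M$ states subject to $\sum_i p_i = 1$ via the Lagrangian

$$\widetilde{H} = -\sum_{i=1}^{M} p_i \ln p_i + \lambda \left( \sum_{i=1}^{M} p_i - 1 \right), \qquad \frac{\partial \widetilde{H}}{\partial p_i} = -\ln p_i - 1 + \lambda = 0,$$

so every $p_i = e^{\lambda - 1}$ equals the same constant, and the normalization constraint forces $p_i = 1/M$: the uniform distribution maximizes the entropy.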
1 vote
1 answer
229 views

I understand what entropy measures, and cross entropy is the same except that it uses another distribution $q$ to compare against $p$. Is it because the log function is concave down so the predictions ...
Chris Blodgett
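The comparison being asked about is Gibbs' inequality: the cross entropy $H(p,q)$ is always at least the entropy $H(p) = H(p,p)$, with equality iff $q = p$. A minimal sketch (names are mine):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i log q_i (natural log)."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.6, 0.4]
q = [0.5, 0.5]
# Gibbs' inequality: coding p with the wrong distribution q costs at
# least as much as coding it with p itself.
assert cross_entropy(p, q) >= cross_entropy(p, p)
```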
0 votes
0 answers
219 views

This question was previously posted on MSE at About the monotonicity of the exponential entropy. In the paper The Unifying Frameworks of Information Measures the conditional exponential entropy (see ...
Upax • 127
11 votes
1 answer
817 views

My question comes from understanding a probabilistic inequality in Bourgain's paper on the Erdős similarity problem: Construction of sets of positive measure not containing an affine image of a given ...
Tutukeainie
4 votes
1 answer
346 views

What is the maximum entropy probability distribution if the support is a fixed interval (e.g. $[-1,1]$) with an already known variance? If we know the support is a fixed interval, then the maximum ...
Sarah Rune
0 votes
1 answer
417 views

Suppose that $N \in \mathbb N_+$ is fixed and denote by $\mu = (\mu_0,\ldots,\mu_N)$ the uniform distribution on the set $\{0,1,\ldots,N\}$ (i.e., $\mu_n = \frac{1}{N+1}$ for each $0\leq n\leq N$). I ...
Fei Cao • 742
2 votes
0 answers
150 views

Let $X^1 \ldots X^n$ be random variables on $\mathbb{R}^d$ with an arbitrary joint probability distribution $\mu$ on $\mathbb{R}^{n \times d}$. Let $\nu = \nu^1 \times \ldots \times \nu^n$ be a ...
MatrixGeek1234
1 vote
1 answer
316 views

My question is inspired by the following paper of Ollivier and Żurek, but for this question to be self-contained I will recall all the necessary definitions: for a quantum state $\rho$ we define the ...
truebaran • 9,634
1 vote
1 answer
187 views

We define $U : [0, +\infty) \to [0, +\infty)$ by $U(0) := 0$ and $U (s) := s \log s$ for $s >0$. Then $U$ is strictly convex. Let $D$ be the set of all bounded non-negative continuous functions $\...
Akira • 1,163
1 vote
0 answers
181 views

Forgive me for the poorly researched question. I'm currently working on a computer science project involving training a neural stochastic differential equation, and I've run into a problem while ...
user3002473
2 votes
1 answer
190 views

A dynamical system is proximal if $$\:\forall (x,y) \in X \times X, \: \liminf_{n \rightarrow \infty} d(f^{n}(x),f^{n}(y)) = 0 $$ (where $X$ is a compact metric space with metric $d$). Is it true that ...
Matej Moravik
0 votes
1 answer
119 views

We define $U : [0, \infty) \to [0, \infty)$ by $U(0) := 1$ and $U (s) := s \log s + (1-s)$ for $s >0$. Then $U$ is strictly convex. The minimum of $U$ is $0$ and is attained at $s=1$. Let $\mathcal ...
Akira • 1,163
1 vote
2 answers
393 views

For Lebesgue-absolutely continuous probability measures $\rho\ll \mathcal{L}^d$ in the whole space $\mathbb{R}^d$ with finite second moments (i.e. $\rho\in \mathcal{P}^2_{ac}(\mathbb{R}^d)$), let $$ \...
Akira • 1,163
3 votes
1 answer
201 views

We endow the space $\mathcal P_2^a (\mathbb R^d)$ of absolutely continuous probability measures with finite second moment with the Wasserstein distance $W_2$. Let $\mathcal H (\mu)$ be the relative ...
Akira • 1,163
1 vote
1 answer
148 views

Let $P_1,P_2,\ldots $ be a sequence of absolutely continuous probability measures on $\mathbb R^n$, and let $f_j:\mathbb R^n\to\mathbb R$ be their PDFs. Assume that $\operatorname{E}P_j = 0$ and $\...
Misha • 13
2 votes
1 answer
250 views

Observation 0: Given a finite set $X$, the probability distribution on $X$ with highest entropy is the uniform one. This is well known. Observation 1: Given two finite sets $X,Y$ and two probability ...
Gro-Tsen • 38.8k
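Observation 0 is easy to check numerically: on an $n$-element set the uniform distribution attains the maximum entropy $\log n$. A minimal sketch (names are mine):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# On a 4-element set, the uniform distribution attains the maximum log(4);
# any other distribution has strictly smaller entropy.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
assert shannon_entropy(uniform) >= shannon_entropy(skewed)
```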
0 votes
0 answers
119 views

There are several different names that I know of for the exponential of the entropy of which "diversity" and "perplexity" are fairly well-established. Tom Leinster has a very ...
Mike Battaglia
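The "effective number of outcomes" reading of the exponential of entropy is easy to illustrate; a minimal sketch (function name is mine): for the uniform distribution on $n$ outcomes, $\exp(H)$ is exactly $n$.

```python
import math

def perplexity(p):
    """Exponential of the Shannon entropy, exp(H(p)), with natural log.
    For the uniform distribution on n outcomes this equals n, which
    motivates names like 'diversity' or 'effective number of outcomes'."""
    return math.exp(-sum(x * math.log(x) for x in p if x > 0))
```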
3 votes
1 answer
488 views

We are given two joint probability distributions, $p$ and $q$, of $n$ Bernoulli random variables $X_1, X_2, \ldots, X_n$. We denote by $p(x_k\mid x^{k-1})$ the probability $\mathbb{P}_p(X_k=x_k\mid ...
Penelope Benenati
3 votes
0 answers
158 views

There is a question I have been wondering about for a while, which I have thus far not been able to resolve. The problem revolves around random Gibbs measures. I am not very well-versed in the more ...
Jesse van Rhijn
2 votes
1 answer
214 views

There is a well-known theorem by Coven and Hedlund, in Sequences with minimal block growth, stating that the complexity function of an aperiodic sequence/configuration $\omega\in \mathcal{A}^{\mathbb{Z}...
Keen-ameteur
3 votes
1 answer
254 views

Let $ f : [0,1] \to \mathbb{R} $ be a function satisfying: 1.) $ |f(x)| \leqslant a $ for some $ a < 1 $, and 2.) $ \int_0^1 f(x) {\mathrm d}x = 0 $. I would like to know whether the following ...
aleph • 503
2 votes
2 answers
1k views

Let $(X,d)$ be a metric space. If set $A\subseteq X$, let $H^{\alpha}$ be the $\alpha$-dimensional Hausdorff measure on $A$, where $\alpha\in[0,2]$ and $\text{dim}_{\text{H}}(A)$ is the Hausdorff ...
Arbuja • 135
