Questions tagged [entropy]
A mathematical quantity designed to measure the amount of randomness of a random variable.
695 questions
0 votes · 0 answers · 20 views
Differential entropy for comparing distributions
I want to use differential entropy to compare the outcomes of Bayesian updating (multidimensional probability distributions) for different datasets. My parameters are different physical parameters, i.e. ...
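If each updated posterior is summarized by a multivariate Gaussian approximation, its differential entropy has a closed form; a minimal sketch under that assumption (the sample posteriors below are made up):

```python
import numpy as np

def gaussian_differential_entropy(cov):
    """h(N(mu, cov)) = 0.5 * log((2*pi*e)^k * det(cov)), in nats."""
    k = cov.shape[0]
    sign, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical posterior samples from two Bayesian updates
rng = np.random.default_rng(1)
post_a = rng.normal(size=(5000, 3))         # dataset A
post_b = 0.5 * rng.normal(size=(5000, 3))   # dataset B: tighter posterior
for name, s in [("A", post_a), ("B", post_b)]:
    print(name, gaussian_differential_entropy(np.cov(s, rowvar=False)))
```

The lower entropy for dataset B reflects its more concentrated posterior, which is the kind of comparison the question describes.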
1 vote · 0 answers · 24 views
How should I go about completely decorrelating a digital signal?
So I'm working on real-time signal compression, and I need to come up with the best convolution to minimize the entropy of the incoming data (which I will then compress), which I understand is achieved by ...
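One standard route is linear prediction: the residual of a good linear predictor is (approximately) decorrelated, which is what makes it compress well. A toy sketch, assuming an AR(1)-like input (my choice, not the asker's data):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical correlated input: an AR(1) process
x = np.zeros(10000)
for t in range(1, x.size):
    x[t] = 0.95 * x[t - 1] + rng.normal()

# Order-1 linear prediction: the residual e[t] = x[t] - a*x[t-1]
# is approximately white when a matches the lag-1 autocorrelation.
a = np.corrcoef(x[:-1], x[1:])[0, 1]
e = x[1:] - a * x[:-1]

print("lag-1 corr before:", np.corrcoef(x[:-1], x[1:])[0, 1])
print("lag-1 corr after: ", np.corrcoef(e[:-1], e[1:])[0, 1])
```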
0 votes · 0 answers · 18 views
Interpretation of time series spectral entropy values with respect to forecastability by a general neural network
I recently started using spectral entropy to analyze (already windowed) time series. I'm having difficulty interpreting the results: the entropy of the last 25% of a series is 0.19, and the ...
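Definitions vary between libraries, but a common one treats the normalized power spectrum as a probability distribution; values near 0 mean power is concentrated in a few frequencies (more regular), values near 1 mean noise-like. A minimal sketch:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum, in [0, 1]."""
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]                        # drop empty bins before the log
    return -np.sum(p * np.log(p)) / np.log(psd.size)

t = np.arange(1000) / 100.0
print(spectral_entropy(np.sin(2 * np.pi * t)))                       # low: pure tone
print(spectral_entropy(np.random.default_rng(0).normal(size=1000)))  # near 1: noise
```

On this scale a value of 0.19 sits toward the structured end, but the comparison is only meaningful against the same definition and windowing.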
2 votes · 0 answers · 41 views
Shannon source coding theorem and differential entropy
Loosely speaking, Shannon's source coding theorem says that there is an encoder with rate at least $H(X)$ such that $n$ repetitions of the source can be mapped to at least $nH(X)$ bits of binary ...
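For reference, a standard precise form of the statement (in the style of Cover and Thomas) is:
$$
\begin{aligned}
&\text{Achievability: for any } \varepsilon>0 \text{ and } n \text{ large enough, there is a fixed-length code of}\\
&\quad n\,(H(X)+\varepsilon) \text{ bits whose error probability is less than } \varepsilon;\\
&\text{Converse: any fixed-length code using fewer than } n\,(H(X)-\varepsilon) \text{ bits has error}\\
&\quad \text{probability bounded away from } 0.
\end{aligned}
$$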
1 vote · 1 answer · 60 views
Chain rule for conditional entropy
A textbook I am reading states that $$H(X,Y)=H(X)+H(Y|X)$$ where $H(X,Y)$ is the joint entropy of random variables $X,Y$, $H(X)$ is the entropy of $X$, and $H(Y|X)$ is the conditional entropy. It then states ...
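The identity is easy to check numerically on a small joint pmf; a minimal sketch (the pmf is arbitrary):

```python
import numpy as np

def H(q):
    """Shannon entropy (bits) of a pmf given as an array."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# A small joint pmf p(x, y): rows are x, columns are y.
p = np.array([[0.30, 0.10],
              [0.20, 0.40]])

px = p.sum(axis=1)                  # marginal of X
H_joint = H(p.ravel())              # H(X, Y)
H_X = H(px)                         # H(X)
# H(Y|X) = sum_x p(x) * H(Y | X = x)
H_Y_given_X = sum(px[i] * H(p[i] / px[i]) for i in range(len(px)))

print(H_joint, H_X + H_Y_given_X)   # equal, per the chain rule
```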
1 vote · 1 answer · 44 views
A supposedly simple question about entropy
Let's say there is an urn which contains balls of different colors. There is a well-known formula to calculate the entropy of the balls in the urn:
$H = - \sum P_i\cdot\log(P_i)$
where $P_i = \frac{M_i}{N}$, where $...
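A minimal sketch of this formula, with made-up color counts $M_i$:

```python
import numpy as np

counts = np.array([5, 3, 2])        # M_i: balls of each color (hypothetical)
p = counts / counts.sum()           # P_i = M_i / N
entropy = -np.sum(p * np.log2(p))   # H = -sum P_i log P_i, in bits
print(entropy)                      # ~1.485 bits
```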
0 votes · 1 answer · 94 views
Check if my time series is forecastable using Shannon entropy
According to this answer: https://datascience.stackexchange.com/a/95232/141037, it is possible to assess the forecastability of a time series using the Shannon entropy: the lower the Shannon entropy ...
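The linked answer's exact recipe isn't reproduced here; a closely related, temporally aware variant is normalized permutation entropy, where lower values suggest more structure to forecast. A sketch:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3):
    """Normalized permutation entropy in [0, 1]; lower suggests more structure."""
    patterns = {}
    for i in range(len(x) - order + 1):
        pat = tuple(np.argsort(x[i:i + order]))   # ordinal pattern of the window
        patterns[pat] = patterns.get(pat, 0) + 1
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

rng = np.random.default_rng(0)
t = np.arange(2000)
print(permutation_entropy(np.sin(0.05 * t)))        # low: predictable
print(permutation_entropy(rng.normal(size=2000)))   # near 1: unpredictable
```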
2 votes · 0 answers · 22 views
How to quantify "clumpiness" of a time series of physical activity?
Let's assume you are measuring people's physical activity levels over a day, using accelerometry for example. The goal is to quantify the "clumpiness" of the activity patterns. It ...
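One established candidate (among others, such as sample entropy) is the Goh-Barabási burstiness coefficient of the gaps between activity bouts. A sketch that leaves out the bout extraction from raw accelerometry:

```python
import numpy as np

def burstiness(inter_event_times):
    """Goh-Barabasi burstiness B = (sigma - mu) / (sigma + mu):
    -1 = perfectly regular, 0 = Poisson-like, +1 = maximally clumpy."""
    mu, sigma = np.mean(inter_event_times), np.std(inter_event_times)
    return (sigma - mu) / (sigma + mu)

rng = np.random.default_rng(0)
regular = np.full(500, 10.0)              # a bout every 10 minutes
clumpy = rng.exponential(1.0, 500) ** 3   # heavy-tailed gaps (hypothetical)
print(burstiness(regular), burstiness(clumpy))
```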
1 vote · 1 answer · 44 views
Minimum entropy decomposition of probability distributions
Say you want to decompose a probability distribution (a PDF) into a mixture of distributions in such a way as to minimize the mean entropy of the component distributions. I have an idea that this is ...
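As a starting point only: fitting a Gaussian mixture by EM maximizes likelihood rather than minimizing mean component entropy, but it lets you evaluate the objective in question on a fitted decomposition. A sketch with toy data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def mean_component_entropy(gmm):
    """Weight-averaged differential entropy of the Gaussian components."""
    k = gmm.means_.shape[1]
    ents = [0.5 * (k * np.log(2 * np.pi * np.e) + np.linalg.slogdet(c)[1])
            for c in gmm.covariances_]
    return float(np.dot(gmm.weights_, ents))

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])[:, None]
for n in (1, 2, 3):
    gmm = GaussianMixture(n_components=n, random_state=0).fit(x)
    print(n, mean_component_entropy(gmm))   # drops once the two modes split
```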
0 votes · 0 answers · 19 views
How to use and understand entropy for pattern detection?
I have two images from the erp_data and noerp_data matrices. In erp_data we can see a pattern (a sigmoid); in noerp_data we see no pattern. ERP is event-related potential, if you are curious.
My goal is to ...
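A crude first cut is to compare the Shannon entropy of the value histograms of the two matrices; patterned data often concentrates its values. The sigmoid and noise matrices below are stand-ins for erp_data and noerp_data:

```python
import numpy as np

def value_entropy(m, bins=32):
    """Shannon entropy (bits) of the histogram of matrix values."""
    hist, _ = np.histogram(m, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
t = np.linspace(-6, 6, 100)
erp_like = np.tile(1 / (1 + np.exp(-t)), (100, 1))   # sigmoid pattern in rows
noise_like = rng.uniform(size=(100, 100))            # no pattern
print(value_entropy(erp_like), value_entropy(noise_like))
```

This ignores spatial arrangement entirely; pattern-aware measures (e.g. two-dimensional sample entropy) would be the next step.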
1 vote · 0 answers · 16 views
Effect on entropy when we scale Bernoulli plus Gaussian
Question: given $X\sim\text{Bernoulli}(\alpha)$, $Y\sim\mathcal{N}(0,1)$, and non-random positive constants $C,\epsilon>0$, let $H(\cdot)$ denote the differential entropy. Is it true that
$$
H((C+\...
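The conjecture can at least be probed numerically: $CX+Y$ is a two-component Gaussian mixture with a known density, so its differential entropy can be computed by quadrature. A sketch with an arbitrary $\alpha$ and grid:

```python
import numpy as np
from scipy.stats import norm

def h_mixture(C, alpha=0.3):
    """Differential entropy (nats) of C*X + Y with X~Bern(alpha), Y~N(0,1),
    via numerical integration of -p log p over a wide grid."""
    z = np.linspace(-10.0, C + 10.0, 20001)
    p = (1 - alpha) * norm.pdf(z) + alpha * norm.pdf(z - C)
    return -np.trapezoid(p * np.log(p), z)   # np.trapz on NumPy < 2.0

for C in (1.0, 1.5, 2.0):
    print(C, h_mixture(C))
```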
0 votes · 0 answers · 75 views
In Stata or SAS, how do I run a stacked regression with entropy balancing?
My objective is to run a stacked regression (DiD) with entropy balancing in a setting with multiple treatment timings.
Initially, my data consisted of firm-year panel data. I then stacked this ...
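In Stata the usual tool is Hainmueller's ebalance package; the underlying optimization itself is small enough to sketch directly. A Python toy version via the convex dual (the data are made up):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def entropy_balance(X_control, target_means):
    """Weights over control units that match target covariate means exactly
    while staying as close to uniform as possible (max-entropy / min-KL)."""
    def dual(lam):                      # convex dual of the balancing program
        return logsumexp(X_control @ lam) - lam @ target_means
    lam = minimize(dual, np.zeros(X_control.shape[1]), method="BFGS").x
    z = X_control @ lam
    w = np.exp(z - z.max())             # softmax weights, numerically stable
    return w / w.sum()

rng = np.random.default_rng(0)
Xc = rng.normal(size=(500, 2))          # control covariates (toy data)
m = np.array([0.3, -0.2])               # treated-group means to match
w = entropy_balance(Xc, m)
print(w @ Xc)                           # ~ [0.3, -0.2]
```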
2 votes · 1 answer · 42 views
Bivariate random variable and transformation
Let $X=(X_1,X_2)$ and $Y=(Y_1,Y_2)$ be non-negative, absolutely continuous random vectors, and suppose $\phi(X_j)=Y_j$, $j=1,2$, are one-to-one transformations; then $$H[Y;\phi(t_1),\phi(t_2)]=H(X;t_1,t_2)-E[\log ...
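For orientation, the classical univariate analogue of this kind of identity is the change-of-variables rule for differential entropy,
$$
h\bigl(\phi(X)\bigr) = h(X) + E\bigl[\log\lvert\phi'(X)\rvert\bigr],
$$
which for component-wise one-to-one maps of a vector adds the term for each coordinate. The excerpt's $H[\,\cdot\,;\cdot,\cdot]$ appears to be a generalized entropy, so take this only as the analogous classical statement, not the result being asked about.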
0 votes · 0 answers · 28 views
Entropy of a set of strings based on a sample
Say I have an enormous set of $N$-character-long strings. Far too many to enumerate or store in memory, but far fewer than the theoretical $26^N$ possible strings. I can draw samples from this set, but ...
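A plug-in estimate with the Miller-Madow bias correction is a common first attempt, though it remains biased low when most of the set is never sampled, which is exactly the regime described. A sketch on synthetic draws:

```python
import numpy as np
from collections import Counter

def entropy_estimate(samples):
    """Plug-in Shannon entropy (bits) with the Miller-Madow correction.
    Still underestimates badly when most strings were never sampled."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    n = counts.sum()
    p = counts / n
    plug_in = -np.sum(p * np.log2(p))
    return plug_in + (len(counts) - 1) / (2 * n * np.log(2))

rng = np.random.default_rng(0)
# Hypothetical sample: draws from a huge unknown set of 5-char strings
sample = ["".join(rng.choice(list("abcdefgh"), 5)) for _ in range(10000)]
print(entropy_estimate(sample))   # true value here: 5*log2(8) = 15 bits
```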
1 vote · 0 answers · 27 views
Integral of differential entropy over functions
Suppose there is some function:
\begin{equation}
f(t) = p(x)
\end{equation}
where $p(x)$ is a PDF over $x$ at $t$. Some examples would be linear regression with error bounds or a Gaussian process (...
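If $p(x)$ at each $t$ is approximately Gaussian, as in GP regression or linear regression with error bands, the pointwise differential entropy is $\frac{1}{2}\log(2\pi e\,\sigma^2(t))$, and it can be integrated over $t$ numerically. A sketch with a made-up $\sigma(t)$:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 501)
sigma = 0.1 + 0.05 * np.sin(2 * np.pi * t)       # hypothetical predictive std
h_t = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # pointwise differential entropy
total = np.trapezoid(h_t, t)                     # np.trapz on NumPy < 2.0
print(total)
```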