$$
\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i.
$$
The law of large numbers tells us that the sample average will converge to the expected value $\mu$ as $n$ goes to infinity. The CLT states that as $n$ gets larger, the distribution of $\bar{X}_n$ gets arbitrarily close to the normal distribution with a mean of $1$ and a variance of $2/n$ (the PT distribution has a mean of $1$ and a variance of $2$).
Let us quickly check that this is true! Let's say that $n = 1000$, and with some quick Python magic:
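Something along these lines should do, assuming NumPy and using the fact that the PT distribution is a $\chi^2$ distribution with one degree of freedom (a sketch of the check, not necessarily the exact snippet):

<code python>
import numpy as np

rng = np.random.default_rng(42)

n = 1000              # draws per "experiment", as in the text
n_experiments = 10_000

# The PT distribution is chi^2 with one degree of freedom:
# mean 1, variance 2.
samples = rng.chisquare(df=1, size=(n_experiments, n))

# Sample mean of each experiment; by the CLT these should be
# approximately N(1, 2/n).
means = samples.mean(axis=1)

print(f"mean of the sample means:     {means.mean():.4f}  (CLT: 1)")
print(f"variance of the sample means: {means.var(ddof=1):.6f}  (CLT: {2 / n})")
</code>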
Mic drop?
Now! How can we use this information to determine how much $y$ should vary? And what does //vary// even mean here? Vary-ance, maybe. If $y$ is PT-distributed, recall that the variance of a random variable $X$ with mean $\mu$ is

$$
\text{Var}(X) = E[(X - \mu)^2].
$$
So maybe what we want is to check that the variance of the $B$ distribution is (close to) $2$? We can also draw a bunch of values from the distribution and check that the variance of the mean of $n$ draws is indeed equal to $2/n$, as the CLT predicts.
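A quick sketch of both checks, standing in $\chi^2_1$ draws for the actual $B$ values (an assumption for illustration; the real values would come from the data):

<code python>
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the B values: plain chi^2(1) draws instead of real data.
b = rng.chisquare(df=1, size=100_000)
print(f"variance of the draws: {b.var(ddof=1):.4f}  (PT: 2)")

# CLT check: the mean of n draws should have variance close to 2/n.
n = 1000
means = rng.chisquare(df=1, size=(5_000, n)).mean(axis=1)
print(f"variance of the means: {means.var(ddof=1):.6f}  (CLT: {2 / n})")
</code>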