Questions tagged [statistics]
Mathematical statistics is the study of statistics from a mathematical standpoint, using probability theory and other branches of mathematics such as linear algebra and analysis.
37,756 questions
-1 votes · 0 answers · 11 views
Calculating reliability statistics: combined incidents per thousand units [closed]
I would like to combine warranty information across different products into a combined warranty metric (incidents per thousand units). Is this a case of multiplying the individual rates together?
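A minimal sketch of the usual way such rates are pooled, assuming per-product incident and unit counts are available: a combined incidents-per-thousand-units figure is a ratio of summed counts, not a product of the per-product rates. The product names and figures below are hypothetical.

```python
# Pooling incident rates across products by aggregating counts
# rather than multiplying the per-product rates.
# Names and counts are made up for illustration.

products = {
    # name: (warranty incidents, units shipped)
    "A": (12, 4_000),
    "B": (30, 25_000),
    "C": (5, 1_500),
}

total_incidents = sum(inc for inc, _ in products.values())
total_units = sum(units for _, units in products.values())

# Combined incidents per thousand units (IPTU): a ratio of sums,
# equivalent to a unit-weighted average of the per-product rates.
combined_iptu = 1000 * total_incidents / total_units
print(f"Combined IPTU: {combined_iptu:.2f}")
```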
0 votes · 0 answers · 108 views
"Central limit theorem" for symmetric random variables with no finite mean
Let $(X_n)_{n\in\mathbb N}$ be a sequence of independent random variables with the same distribution. The common distribution $\mu$ is symmetric, that is, $\mu((-\infty,x])=\mu([-x,\infty))$ ...
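For a concrete reference point, the standard Cauchy distribution fits these hypotheses (symmetric, no finite mean), and the average of $n$ i.i.d. standard Cauchy variables is again standard Cauchy, so no $\sqrt{n}$ normalization can yield a normal limit. A small simulation sketch:

```python
# The sample mean of i.i.d. standard Cauchy variables is itself
# standard Cauchy for every n, so its spread never shrinks.
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 100, 10_000):
    means = rng.standard_cauchy((2_000, n)).mean(axis=1)
    # The interquartile range of a standard Cauchy is exactly 2,
    # and it does not decrease as n grows.
    q25, q75 = np.percentile(means, [25, 75])
    print(f"n={n:>6}: IQR of sample means = {q75 - q25:.2f}")
```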
1 vote · 0 answers · 59 views
Asymptotic variance of the MLE
Let $X_1,X_2,\cdots,X_n$ be an i.i.d. sample from a distribution $p_{\theta}$. Let $\hat{\theta}_{MLE}$ denote the MLE of $\theta$. Then, under some regularity conditions,
$$\sqrt{n}\,(\hat{\theta}_{MLE} - \theta) \xrightarrow{d} N\left(0, I(\theta)^{-1}\right) \ldots$$
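A quick simulation sketch of this statement for a model where everything is explicit, the Exponential distribution with rate $\theta$: there the MLE is $1/\bar X$ and $I(\theta)=1/\theta^2$, so the limiting variance should be $\theta^2$.

```python
# Check sqrt(n)*(MLE - theta) ~ N(0, 1/I(theta)) by simulation
# for Exponential(rate=theta), where the MLE is 1/xbar and
# the Fisher information is I(theta) = 1/theta^2.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 5_000, 2_000

samples = rng.exponential(scale=1 / theta, size=(reps, n))
mle = 1 / samples.mean(axis=1)          # closed-form MLE for the rate
z = np.sqrt(n) * (mle - theta)

print(f"empirical var of sqrt(n)(mle - theta): {z.var():.3f}")
print(f"theoretical 1/I(theta) = theta^2    : {theta ** 2:.3f}")
```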
2 votes · 0 answers · 36 views
An MLE Asymptotic Normality problem with i.n.i.d. data
Suppose $n \in \mathbb{N}$. Suppose $s_0 > 1$ and $\xi_j \sim N(0, j^{-s_0} + n^{-1})$, $j = 1, 2, 3, \ldots, n$. Let $\hat{s}_n$ be the maximum likelihood estimator of $s_0$. Is $\hat{s}_n$ ...
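A numerical sketch of this setup, with the sample size, true value, and search bounds below chosen arbitrarily; a simulation can only probe, not settle, consistency or asymptotic normality.

```python
# Simulate xi_j ~ N(0, j^{-s0} + 1/n) and compute the MLE of s
# by numerically maximizing the i.n.i.d. Gaussian log-likelihood.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
s0, n = 1.5, 10_000                     # illustrative choices
j = np.arange(1, n + 1, dtype=float)
xi = rng.normal(0.0, np.sqrt(j ** -s0 + 1.0 / n))

def neg_log_lik(s):
    # Per-observation variance under candidate s; the constant
    # 0.5*log(2*pi) term is dropped as it does not affect the argmax.
    v = j ** -s + 1.0 / n
    return 0.5 * np.sum(np.log(v) + xi ** 2 / v)

# Bounds are an arbitrary restriction consistent with s0 > 1.
res = minimize_scalar(neg_log_lik, bounds=(1.01, 10.0), method="bounded")
print(f"true s0 = {s0}, numerical MLE = {res.x:.3f}")
```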
0 votes · 1 answer · 45 views
Can mutual information be defined between a random variable and a conditional distribution?
Quantities like mutual information $I$, entropy $H$, etc. are typically defined as taking random variables as input. However, they are actually just functions of probability distributions - e.g. the ...
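For reference, the observation in this excerpt can be made explicit: mutual information depends on the random variables only through their joint distribution,
$$I(X;Y)=\sum_{x,y} p_{X,Y}(x,y)\log\frac{p_{X,Y}(x,y)}{p_X(x)\,p_Y(y)}=D_{\mathrm{KL}}\!\left(p_{X,Y}\,\big\|\,p_X\otimes p_Y\right),$$
so it is indeed a functional of distributions rather than of the variables themselves.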
0 votes · 0 answers · 22 views
Log-linear model for a two-way contingency table
I have some confusion about part (c) of the problem.
Our null hypothesis is $$H_0:\pi_{1j}=\pi_{2j}=\pi_{3j}=\pi_{4j}\quad\forall j.$$
Should our log-linear model be $$\log \mu_{ij}=u+u_i^{T}+u_j^{R}$$ or $$\log \mu_{ij}=u+u_j^{R},$$
where $u_j^{R}$ ...
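A notational reference point (assuming $T$ indexes rows and $R$ indexes columns, as the subscripts suggest): the two candidate models differ only in the row main effect $u_i^{T}$, and dropping it constrains the expected counts to be equal across rows within every column, i.e. $\mu_{1j}=\mu_{2j}=\mu_{3j}=\mu_{4j}$ for all $j$.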
0 votes · 0 answers · 68 views
Where can I find more information about 'specific information'?
I am looking to borrow a concept from information theory: namely, the mutual information between two random variables $X, Y$ when one of the random variables is fixed, e.g. $Y = y$:
$$I(X ; Y = y)$$ ...
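For reference, one commonly used pointwise quantity of this kind (DeWeese and Meister, 1999, compare two such definitions) is
$$I(X;Y=y)=\sum_x p(x\mid y)\log\frac{p(x\mid y)}{p(x)}=D_{\mathrm{KL}}\!\left(p_{X\mid Y=y}\,\big\|\,p_X\right),$$
whose average over $p(y)$ recovers the ordinary mutual information $I(X;Y)$.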
2 votes · 1 answer · 90 views
A statistics problem dealing with normal random variables: my answer is close to the book's answer, but is it close enough?
Below is a problem I worked. I believe my answer is right; however, it is different from the book's answer. Is the difference simply round-off error?
Problem:
The length of time required for the periodic ...