Univariate continuous distributions

Theory
The distribution of a continuous random variable is described by its probability density function (pdf) and cumulative distribution function (cdf).

Probability density function
A pdf f(x) must be non-negative, f(x) \ge 0 for all x, and must integrate to 1 over the range of X:

\int_{-\infty}^{\infty} f(x)\,dx = 1

Any mathematical function g(x) which is non-negative, positive on at least one interval of values of x, and has a finite integral can be made into a pdf by rescaling: f(x) = C g(x), where

C = \left( \int g(x)\,dx \right)^{-1}

C is called the normalising constant, and g(x) is called the density core of f(x).
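As a sketch of the normalising step, the constant C can be found numerically. The kernel g(x) = x(1 - x) on [0, 1] below is an illustrative choice (not from the notes); its normalised form happens to be the Beta(2, 2) density 6x(1 - x).

```python
def trapezoid(fn, a, b, n=100_000):
    """Trapezoidal-rule approximation to the integral of fn over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (fn(a) + fn(b))
    for i in range(1, n):
        total += fn(a + i * h)
    return total * h

def g(x):
    # density core (kernel): non-negative with a finite integral
    return x * (1 - x)

C = 1.0 / trapezoid(g, 0.0, 1.0)   # normalising constant, here ~6

def f(x):
    # f(x) = C g(x) is now a pdf: non-negative and integrates to 1
    return C * g(x)

print(C)                        # ~6.0
print(trapezoid(f, 0.0, 1.0))   # ~1.0
```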

Cumulative distribution function
The pdf and cdf are related by

F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt

and, conversely, f(x) = F'(x).
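The pdf–cdf relationship can be checked numerically. The density f(x) = 6x(1 - x) on [0, 1] here is an illustrative choice (the Beta(2, 2) density, not from the notes); its cdf has the closed form F(x) = 3x^2 - 2x^3.

```python
def f(x):
    # illustrative pdf (Beta(2, 2) density): 6x(1 - x) on [0, 1]
    return 6.0 * x * (1.0 - x)

def F_numeric(x, n=10_000):
    # cdf obtained by integrating the pdf from the lower end of the
    # range (0 here) up to x, using the trapezoidal rule
    h = x / n
    total = 0.5 * (f(0.0) + f(x))
    for i in range(1, n):
        total += f(i * h)
    return total * h

def F_exact(x):
    # closed form of the cdf for this density: F(x) = 3x^2 - 2x^3
    return 3.0 * x * x - 2.0 * x ** 3

for x in (0.25, 0.5, 0.9):
    print(F_numeric(x), F_exact(x))  # the two should agree closely
```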

Moments of probability distributions
The mean and variance are special cases of moments of distributions

Expectation
The expectation E(X) is the average, or typical, value of X.

Mean
\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx
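As a sketch, the mean can be computed by numerical integration. The density f(x) = 6x(1 - x) on [0, 1] is again an illustrative choice (Beta(2, 2), not from the notes); by symmetry about 0.5 its mean is 0.5.

```python
def f(x):
    # illustrative pdf (Beta(2, 2) density): 6x(1 - x) on [0, 1]
    return 6.0 * x * (1.0 - x)

def trapezoid(fn, a, b, n=100_000):
    # trapezoidal-rule approximation to the integral of fn over [a, b]
    h = (b - a) / n
    total = 0.5 * (fn(a) + fn(b))
    for i in range(1, n):
        total += fn(a + i * h)
    return total * h

# E(X) = integral of x f(x) dx over the range of X
mu = trapezoid(lambda x: x * f(x), 0.0, 1.0)
print(mu)  # ~0.5, since the density is symmetric about 0.5
```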

Raw moments
The mean is also known as the first moment of a distribution. The r-th raw moment is

E(X^r) = \int_{-\infty}^{\infty} x^r f(x)\,dx

so the second and third moments are E(X^2) and E(X^3).
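A sketch of the r-th raw moment as an integral, again using the illustrative Beta(2, 2) density 6x(1 - x) on [0, 1] (not from the notes); for it the exact first three moments are 0.5, 0.3 and 0.2.

```python
def f(x):
    # illustrative pdf (Beta(2, 2) density): 6x(1 - x) on [0, 1]
    return 6.0 * x * (1.0 - x)

def raw_moment(r, n=100_000):
    # E(X^r) = integral of x^r f(x) over [0, 1] (trapezoidal rule);
    # both endpoint integrand values are 0 for this density
    h = 1.0 / n
    total = 0.0
    for i in range(1, n):
        x = i * h
        total += x ** r * f(x)
    return total * h

print(raw_moment(1))  # ~0.5  (the mean)
print(raw_moment(2))  # ~0.3  E(X^2)
print(raw_moment(3))  # ~0.2  E(X^3)
```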

Central moments
Central moments are taken about the mean; the r-th central moment is

\mu_r = E[(X - \mu)^r]

Variance is the second central moment of a distribution:

\mu_2 = V(X) = E[(X - \mu)^2] = E(X^2) - \mu^2

V(X) is the expected squared deviation of X from its mean.
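As a sketch, the variance can be computed both directly as the second central moment and via the shortcut E(X^2) - mu^2; for the illustrative Beta(2, 2) density 6x(1 - x) (not from the notes) both give 1/20 = 0.05.

```python
def f(x):
    # illustrative pdf (Beta(2, 2) density): 6x(1 - x) on [0, 1]
    return 6.0 * x * (1.0 - x)

def integrate(fn, n=100_000):
    # trapezoidal rule over [0, 1], the range of this example
    h = 1.0 / n
    total = 0.5 * (fn(0.0) + fn(1.0))
    for i in range(1, n):
        total += fn(i * h)
    return total * h

mu = integrate(lambda x: x * f(x))                       # E(X)
ex2 = integrate(lambda x: x * x * f(x))                  # E(X^2)
var_central = integrate(lambda x: (x - mu) ** 2 * f(x))  # mu_2 directly
var_shortcut = ex2 - mu ** 2                             # E(X^2) - mu^2
print(var_central, var_shortcut)  # both ~0.05
```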

Moment generating function
The moment generating function (mgf) gives a single formula from which the moments for all r can be obtained:

M_X(t) = E(e^{tX}), \qquad E(X^r) = \left. \frac{d^r M_X(t)}{dt^r} \right|_{t=0}

Note: linearity of expectation, E(X + Y) = E(X) + E(Y), holds for any finite sum of random variables; for infinite sums it requires additional conditions (e.g. absolute convergence).
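A sketch of recovering moments from an mgf: the exponential distribution M(lambda) has mgf lambda/(lambda - t) for t < lambda, so its r-th derivative at 0 gives E(X^r) = r!/lambda^r. Here finite differences stand in for the derivatives; lambda = 2 is an arbitrary example value.

```python
lam = 2.0  # example rate for an exponential distribution M(lambda)

def M(t):
    # mgf of the exponential distribution: lam / (lam - t), for t < lam
    return lam / (lam - t)

h = 1e-4
# approximate the first two derivatives of the mgf at t = 0
m1 = (M(h) - M(-h)) / (2.0 * h)                # ~E(X)   = 1/lam = 0.5
m2 = (M(h) - 2.0 * M(0.0) + M(-h)) / h ** 2    # ~E(X^2) = 2/lam^2 = 0.5
print(m1, m2)
```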

Quantiles
The quantile function is the inverse of the cumulative distribution function: Q(p) = F^{-1}(p), the value x such that F(x) = p.
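A sketch of inverting a cdf numerically by bisection, using the illustrative cdf F(x) = 3x^2 - 2x^3 of the Beta(2, 2) example (not from the notes); by symmetry its median is 0.5.

```python
def F(x):
    # cdf of the illustrative Beta(2, 2) density: F(x) = 3x^2 - 2x^3
    return 3.0 * x * x - 2.0 * x ** 3

def quantile(p, lo=0.0, hi=1.0, tol=1e-12):
    # invert the (increasing) cdf by bisection: find x with F(x) = p
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(quantile(0.5))  # the median; ~0.5 for this symmetric density
```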

Chi-squared distribution
is written as \chi^2(\nu)

Its single parameter \nu is the degrees of freedom.

The exponential distribution M(1/2) is a special case of the \chi^2(\nu) family: \chi^2(2) = M(1/2).
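This identity can be verified by evaluating both densities directly; the chi-squared density formula used below is the standard one, x^{nu/2 - 1} e^{-x/2} / (2^{nu/2} Gamma(nu/2)).

```python
import math

def chi2_pdf(x, nu):
    # chi-squared density: x^{nu/2 - 1} e^{-x/2} / (2^{nu/2} Gamma(nu/2))
    return (x ** (nu / 2.0 - 1.0) * math.exp(-x / 2.0)
            / (2.0 ** (nu / 2.0) * math.gamma(nu / 2.0)))

def exp_pdf(x, lam):
    # exponential density M(lambda): lam e^{-lam x}
    return lam * math.exp(-lam * x)

# chi^2 with 2 degrees of freedom coincides with M(1/2)
for x in (0.5, 1.0, 2.0, 5.0):
    print(chi2_pdf(x, 2), exp_pdf(x, 0.5))  # pairs should match
```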

t-distribution
Student's t distribution, t(\nu)

Cauchy
A special case of the t-distribution is the Cauchy distribution, the t-distribution with \nu = 1.

None of the moments of the Cauchy distribution exist (not even the mean), hence some general results, such as the central limit theorem, do not apply.
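The two claims above can be sketched numerically: the t(1) density matches the Cauchy density 1/(pi(1 + x^2)), and the truncated integral of |x| times that density, which has the closed form ln(1 + T^2)/pi, grows without bound as the truncation point T increases, so E|X| (and hence the mean) does not exist.

```python
import math

def cauchy_pdf(x):
    # Cauchy density: 1 / (pi (1 + x^2))
    return 1.0 / (math.pi * (1.0 + x * x))

def t_pdf(x, nu):
    # Student's t density with nu degrees of freedom
    c = (math.gamma((nu + 1) / 2.0)
         / (math.sqrt(nu * math.pi) * math.gamma(nu / 2.0)))
    return c * (1.0 + x * x / nu) ** (-(nu + 1) / 2.0)

# t(1) coincides with the Cauchy distribution
for x in (-3.0, 0.0, 1.5):
    print(t_pdf(x, 1), cauchy_pdf(x))  # pairs should match

def truncated_abs_mean(T):
    # integral of |x| cauchy_pdf(x) over [-T, T] = ln(1 + T^2) / pi;
    # it diverges as T -> infinity, so the mean does not exist
    return math.log(1.0 + T * T) / math.pi

vals = [truncated_abs_mean(T) for T in (10.0, 1e3, 1e6)]
print(vals)  # keeps growing with T rather than settling
```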