Statistical decision theory (Bayesian)

Decision theory addresses questions regarding the unknown parameter $\theta$, such as 'what is the single best estimate of $\theta$?' and 'what range of values is $\theta$ most likely to be in?'

A decision problem involves a set of possible decisions $D$, with a single decision denoted

$$ d \in D $$

Loss functions
$Loss(d_1,\theta)$

so $Loss(d_1,\theta)$ measures the loss incurred by making decision $d_1 \in D$ when the parameter takes the value $\theta$

There are several loss functions, and the most appropriate one depends on the context.

Loss functions for point estimation are usually of the form $L(d,\theta) = g(d - \theta)$, for some function $g$.

absolute loss
The absolute loss is $L(d,\theta) = |d - \theta|$. Under absolute loss, the best estimate of $\theta$ is the posterior median.
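As a quick numerical check, here is a sketch on a grid using a hypothetical Beta(2, 5)-shaped posterior (the density shape is an assumption for illustration): the decision minimising expected absolute loss coincides with the posterior median.

```python
import numpy as np

# Hypothetical discrete posterior on a grid: Beta(2, 5)-shaped density.
theta = np.linspace(0, 1, 501)
post = theta * (1 - theta) ** 4
post /= post.sum()

# Expected absolute loss E[|d - theta| | x] for each candidate decision d.
exp_loss = np.array([np.sum(post * np.abs(d - theta)) for d in theta])
best = theta[np.argmin(exp_loss)]

# Posterior median read off the cumulative distribution.
median = theta[np.searchsorted(np.cumsum(post), 0.5)]
print(best, median)  # agree up to the grid resolution
```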

quadratic loss
The quadratic loss is $L(d,\theta) = (d - \theta)^2$. It penalises estimation errors more severely as the error increases, and it treats positive and negative differences symmetrically. Under quadratic loss, the best estimate of $\theta$ is the posterior mean.
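The same grid sketch (again assuming a hypothetical Beta(2, 5)-shaped posterior) confirms that minimising expected quadratic loss recovers the posterior mean.

```python
import numpy as np

# Hypothetical discrete posterior on a grid: Beta(2, 5)-shaped density.
theta = np.linspace(0, 1, 501)
post = theta * (1 - theta) ** 4
post /= post.sum()

# Expected quadratic loss E[(d - theta)^2 | x] for each candidate decision d.
exp_loss = np.array([np.sum(post * (d - theta) ** 2) for d in theta])
best = theta[np.argmin(exp_loss)]

mean = np.sum(theta * post)  # posterior mean
print(best, mean)  # Beta(2, 5) has mean 2/7, approximately 0.286
```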

0-1 loss function
The 0–1 loss penalises missing the target by a little as severely as missing it by a lot; it also treats positive and negative differences symmetrically. With a tolerance $\alpha$, $L(d,\theta) = 0$ if $|d - \theta| \le \alpha$ and $1$ otherwise.

Under 0–1 loss, for small $\alpha$, the best estimate of $\theta$ is the posterior mode.
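A grid sketch for the 0–1 loss (same hypothetical Beta(2, 5)-shaped posterior, with an assumed small tolerance $\alpha$): minimising the expected loss amounts to maximising the posterior mass within $\alpha$ of $d$, which for small $\alpha$ lands at the posterior mode.

```python
import numpy as np

# Hypothetical discrete posterior on a grid: Beta(2, 5)-shaped density.
theta = np.linspace(0, 1, 501)
post = theta * (1 - theta) ** 4  # mode at theta = 0.2
post /= post.sum()

alpha = 0.01  # small tolerance band (illustrative choice)
# Expected 0-1 loss: the posterior probability that |d - theta| exceeds alpha.
exp_loss = np.array([np.sum(post[np.abs(d - theta) > alpha]) for d in theta])
best = theta[np.argmin(exp_loss)]

mode = theta[np.argmax(post)]
print(best, mode)
```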


A sensible loss function should satisfy:
 * 1) $Loss(d,\theta)$ should be 0 when $d = \theta$
 * 2) $Loss(d,\theta)$ should be $> 0$ when $d \neq \theta$
 * 3) $Loss(d,\theta)$ should be non-decreasing as $d$ moves away from $\theta$

Decision
The best decision is the value of $d$ which minimises the posterior expected loss

$$E[Loss(d,\theta) \mid x]$$
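The same recipe works for any loss function, not just the three standard ones. As a sketch, consider a hypothetical asymmetric linear loss in which overestimating $\theta$ costs twice as much as underestimating it by the same amount (both the loss and the Beta(2, 5)-shaped posterior are illustrative assumptions); the minimiser is then pulled below the posterior median, hedging against the costlier error.

```python
import numpy as np

# Hypothetical discrete posterior on a grid: Beta(2, 5)-shaped density.
theta = np.linspace(0, 1, 501)
post = theta * (1 - theta) ** 4
post /= post.sum()

def loss(d, t):
    # Overestimation (d > theta) costs twice as much as underestimation.
    return np.where(d > t, 2.0 * (d - t), t - d)

# Posterior expected loss for each candidate decision d.
exp_loss = np.array([np.sum(post * loss(d, theta)) for d in theta])
best = theta[np.argmin(exp_loss)]

median = theta[np.searchsorted(np.cumsum(post), 0.5)]
print(best, median)  # best sits below the median
```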

= Bayes factors =

The Bayes factor is the ratio of the posterior odds to the prior odds of two competing models.

The Bayes factor can also be calculated when the models being compared are the same except for different parameter values.
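As a sketch of that case: when each hypothesis fixes the parameter at a single point, the Bayes factor reduces to a plain likelihood ratio. The data (7 successes in 10 trials) and the two candidate values of $\theta$ below are hypothetical.

```python
from math import comb

# Hypothetical data: 7 successes in 10 Bernoulli trials.
n, k = 10, 7

def binom_lik(theta):
    # Binomial likelihood of the observed data at a fixed theta.
    return comb(n, k) * theta**k * (1 - theta) ** (n - k)

# Bayes factor for H1: theta = 0.7 against H0: theta = 0.5.
# With point hypotheses, this is just the ratio of likelihoods.
bf = binom_lik(0.7) / binom_lik(0.5)
print(bf)  # about 2.3: mild evidence favouring theta = 0.7
```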

likelihood principle
The likelihood principle states that likelihoods which contain identical information regarding $\theta$ should yield identical inferences. This principle always holds in Bayesian inference.

The likelihood principle also holds in those aspects of classical inference based purely on the likelihood.
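The classic illustration: 9 successes and 3 failures can arise from a binomial design (fix $n = 12$ trials) or a negative binomial design (sample until the 3rd failure). The two likelihoods differ only by a constant factor, so any Bayesian posterior is identical under either design. A sketch on a grid with a flat prior (the prior choice is illustrative):

```python
import numpy as np

theta = np.linspace(0.001, 0.999, 999)
prior = np.ones_like(theta)  # flat prior, for illustration

# Same data (9 successes, 3 failures), two sampling designs:
lik_binomial = 220 * theta**9 * (1 - theta) ** 3  # fixed n = 12: C(12, 9) = 220
lik_negbinom = 55 * theta**9 * (1 - theta) ** 3   # stop at 3rd failure: C(11, 9) = 55

post_b = prior * lik_binomial
post_b /= post_b.sum()
post_nb = prior * lik_negbinom
post_nb /= post_nb.sum()

# The constants cancel on normalisation: identical posteriors.
print(np.allclose(post_b, post_nb))  # True
```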

sufficiency
Data are uninformative about a parameter $\theta$ if they do not change beliefs about $\theta$, i.e. if the posterior distribution equals the prior.
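Relatedly, a sufficient statistic captures everything the data say about $\theta$: two datasets sharing the same sufficient statistic yield identical posteriors. A sketch for i.i.d. Bernoulli data, where the number of successes is sufficient (the flat prior and the particular sequences are illustrative assumptions):

```python
import numpy as np

theta = np.linspace(0.001, 0.999, 999)
prior = np.ones_like(theta)  # flat prior, for illustration

def bernoulli_posterior(data):
    # The Bernoulli likelihood depends on the data only through the
    # number of successes: the sufficient statistic.
    k, n = sum(data), len(data)
    post = prior * theta**k * (1 - theta) ** (n - k)
    return post / post.sum()

# Two different sequences with the same sufficient statistic (3 of 5 successes).
p1 = bernoulli_posterior([1, 1, 1, 0, 0])
p2 = bernoulli_posterior([0, 1, 0, 1, 1])
print(np.allclose(p1, p2))  # True: identical posteriors
```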