
Inside-outside algorithm

The inside-outside algorithm is a way of re-estimating production probabilities in a probabilistic context-free grammar. It was introduced by James K. Baker in 1979 as a generalization of the forward-backward algorithm for parameter estimation on hidden Markov models to stochastic context-free grammars. It is used to compute expectations, for example as part of the expectation-maximization algorithm (an unsupervised learning algorithm).

Inside and outside probabilities

The inside probability \beta_j(p,q) is the total probability of generating the words w_p \cdots w_q, given the root nonterminal N^j and a grammar G:

    \beta_j(p,q) = P(w_{pq} \mid N^j_{pq}, G)
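
In practice the inside table is filled bottom-up with a CYK-style dynamic program. The following Python sketch only illustrates that recurrence for a grammar in Chomsky normal form; the function name inside_probabilities and the rule-dictionary arguments lexical_rules and binary_rules are hypothetical, not part of the original article.

from collections import defaultdict

def inside_probabilities(words, lexical_rules, binary_rules):
    """Fill the inside table beta[(A, p, q)] for a PCFG in Chomsky normal form.

    lexical_rules maps (nonterminal, word) -> probability, binary_rules maps
    (parent, left, right) -> probability.  Word positions p, q are 1-based,
    matching the beta_j(p, q) notation above.
    """
    n = len(words)
    beta = defaultdict(float)

    # Base case: a single word w_p is generated directly by a lexical rule.
    for p, word in enumerate(words, start=1):
        for (nt, w), prob in lexical_rules.items():
            if w == word:
                beta[nt, p, p] += prob

    # Recursion: a span p..q is generated by a binary rule whose children
    # cover two adjacent sub-spans, summed over all split points d.
    for span in range(2, n + 1):
        for p in range(1, n - span + 2):
            q = p + span - 1
            for (parent, left, right), prob in binary_rules.items():
                for d in range(p, q):
                    beta[parent, p, q] += (
                        prob * beta[left, p, d] * beta[right, d + 1, q]
                    )
    return beta

# Toy grammar: S -> N VP, VP -> V N, N -> "dogs" | "cats", V -> "chase".
lexical = {("N", "dogs"): 0.5, ("N", "cats"): 0.5, ("V", "chase"): 1.0}
binary = {("S", "N", "VP"): 1.0, ("VP", "V", "N"): 1.0}
beta = inside_probabilities("dogs chase cats".split(), lexical, binary)
print(beta["S", 1, 3])  # 0.25 = total probability of the sentence given root S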




The outside probability \alpha_j(p,q) is the total probability of beginning with the start symbol N^1 and generating the nonterminal N^j_{pq} and all the words outside w_p \cdots w_q, given a grammar G:

    \alpha_j(p,q) = P(w_{1(p-1)}, N^j_{pq}, w_{(q+1)m} \mid G)
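
The outside table can then be filled top-down from the completed inside table. The sketch below continues the hypothetical example above (same rule-dictionary format and 1-based indexing); the function name outside_probabilities is again illustrative.

from collections import defaultdict

def outside_probabilities(n, start_symbol, binary_rules, beta):
    """Fill the outside table alpha[(A, p, q)] given the inside table beta.

    n is the sentence length and start_symbol is the root nonterminal N^1.
    """
    alpha = defaultdict(float)
    alpha[start_symbol, 1, n] = 1.0  # the root covers the whole sentence

    # Work from the widest parent spans downward: a child's outside probability
    # is its parent's outside probability times the rule probability times the
    # sibling's inside probability, summed over all parents and siblings.
    for span in range(n, 1, -1):
        for p in range(1, n - span + 2):
            q = p + span - 1
            for (parent, left, right), prob in binary_rules.items():
                parent_out = alpha[parent, p, q]
                if parent_out == 0.0:
                    continue
                for d in range(p, q):  # left covers p..d, right covers d+1..q
                    alpha[left, p, d] += parent_out * prob * beta[right, d + 1, q]
                    alpha[right, d + 1, q] += parent_out * prob * beta[left, p, d]
    return alpha

alpha = outside_probabilities(3, "S", binary, beta)
print(alpha["N", 1, 1])  # 0.5 = probability of everything outside the span "dogs"

These two tables supply the expectations used in EM re-estimation: the product \alpha_j(p,q) \beta_j(p,q), divided by the total sentence probability, is the expected number of times nonterminal N^j covers the span from p to q.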