Update: A visitor pointed out that I made a mistake in terminology:
I just want to note that the algorithm you present is not a Metropolis-Hastings algorithm but a random-walk Metropolis algorithm, since it uses a symmetric proposal distribution (normal). Thus the acceptance ratio a=[fpdf( epsilon_tilde ) / fpdf( previous_epsilon )] does not depend on the proposal distribution. In MH you would have a=[(fpdf( epsilon_tilde ) *q(epsilon))/ (fpdf( previous_epsilon )*q(epsilon_tilde))] which is called the MH ratio.
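To make the visitor's distinction concrete, here is a hypothetical sketch (my own addition, not from the original post) of the general MH acceptance ratio, where qpdf(y, x) stands for the proposal density q(y | x):

```matlab
% Hypothetical sketch of the general MH acceptance ratio; the names
% mh_ratio, qpdf, prev, and prop are illustrative only.
mh_ratio = @(fpdf, qpdf, prev, prop) ...
    ( fpdf(prop) * qpdf(prev, prop) ) / ( fpdf(prev) * qpdf(prop, prev) );

% With a symmetric random-walk proposal, q(y | x) depends only on y - x,
% so qpdf(prev, prop) equals qpdf(prop, prev) and the ratio reduces to
% fpdf(prop) / fpdf(prev) -- the ratio used in the code below.
symmetric_q = @(y, x) normpdf(y - x);
```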
It is generally quite easy to draw random numbers from uniform and normal distributions, as well as from distributions that are simple transformations of uniform and normal distributions. The Metropolis-Hastings algorithm (MH) produces draws from more complex distributions, and in fact does not even require that the target distribution be known precisely. If f denotes the density from which you wish to draw, and g(x) = c * f(x) for some constant c > 0, then it suffices to know g in order to use MH. This is especially convenient when the normalizing constant is numerically difficult to calculate, as is the case in many Bayesian applications.
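As an illustration of this point, here is a made-up Bayesian sketch (the data and model are hypothetical, not from the original post): the posterior density of a normal mean mu is proportional to likelihood times prior, and that product can be handed to MH without ever computing the normalizing constant.

```matlab
% Made-up example: a normal likelihood with known unit variance and a
% standard-normal prior on the mean mu. The product below equals the
% posterior density of mu up to a constant, which is all MH requires.
y = [0.8, 1.3, 0.4];                                  % hypothetical data
unnormalized_posterior = @(mu) prod( normpdf(y, mu, 1) ) * normpdf(mu);
```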
It is important to note that the algorithm only converges to the target distribution; the earliest draws still depend on the starting value. Thus, you must allow a sufficient “burn-in” period and discard those initial draws. A general rule of thumb is that the more the distribution deviates from a standard normal, the more burn-in you will need: 50 iterations is generally okay for simple distributions, and 500 is very safe for more complicated ones. The following code is an implementation of MH in Matlab:
function [ epsilon ] = MetropolisHastings( epsilon_0, num_iterations, fpdf )
%METROPOLISHASTINGS Use the Metropolis-Hastings algorithm to obtain draws
%from a given density via approximation.
%
% Parameters:
%
%   epsilon_0      = the initial value of the vector
%   num_iterations = the number of iterations to run
%   fpdf           = the density from which we wish to draw
%
% Example:
%
%   epsilon = MetropolisHastings( 0, 1000, @(x)( (x > 0) * exp(-x) ) );

% pre-generate the random variables
normal_randoms  = randn(num_iterations,1);
uniform_randoms = rand(num_iterations,1);

epsilon = zeros(num_iterations,1);
previous_epsilon = epsilon_0;

% for each random element
for i = 1:num_iterations

    % add a normally distributed noise
    epsilon_tilde = previous_epsilon + normal_randoms(i);

    % if f(e~) > f(e_i-1)
    if fpdf( epsilon_tilde ) > fpdf( previous_epsilon )
        epsilon(i) = epsilon_tilde;
    else
        % with probability f(e~)/f(e_i-1)
        if uniform_randoms(i) <= fpdf( epsilon_tilde ) / fpdf( previous_epsilon )
            epsilon(i) = epsilon_tilde;
        else
            epsilon(i) = previous_epsilon;
        end
    end

    % update the previous draw
    previous_epsilon = epsilon(i);
end

end
For example, it is easy to draw from a normal distribution truncated at ±1.5:
fpdf = @(x) double(abs(x) <= 1.5) .* normpdf( x );
guess = 0;
num_iterations = 500;
random_draws = MetropolisHastings(guess, num_iterations, fpdf);
use_draws = random_draws(51:end); % discard the first 50 draws as burn-in
Recall that even though the function passed in is not a density, because it does not integrate to one, it is correct up to a constant factor, so it can be passed to MH.
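As a quick sanity check (my own addition, not part of the original code), the draws from the truncated-normal example should never fall outside [-1.5, 1.5] and should be roughly symmetric about zero, since any proposal outside the support has density zero and is always rejected:

```matlab
% Illustrative check on the truncated-normal example above.
fpdf  = @(x) double(abs(x) <= 1.5) .* normpdf( x );
draws = MetropolisHastings(0, 5000, fpdf);
draws = draws(501:end);                       % discard burn-in
fprintf('mean  = %.3f (expect near 0)\n', mean(draws));
fprintf('range = [%.3f, %.3f]\n', min(draws), max(draws));
hist(draws, 50);                              % shape should match the kernel
```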
Note: The previous article is one of a series of topic summaries I am writing to introduce various topics that are not explained particularly well by online resources such as Wikipedia. I’m tagging all of these posts as “Wikipedia.” Please feel free to adapt these summaries for any use, with citation.