Lecture I: Introduction to Monte Carlo Methods, Integration and Probability Distributions

Morten Hjorth-Jensen

1 Department of Physics and Center of Mathematics for Applications, University of Oslo, N-0316 Oslo, Norway
2 Department of Physics and Astronomy, Michigan State University, East Lansing, Michigan, USA

First National Winter School in eScience, January 28 - February 2

Lecture I, January 28 2007

Outline

1. Introduction to Monte Carlo Methods
2. Probability Distribution Functions
3. Monte Carlo Integration



'Alea iacta est', the die is cast!

Plan for the lectures

1. January 28: Introduction to Monte Carlo methods, probability distributions and Monte Carlo integration.
2. January 29: Random numbers, Markov chains, diffusion and the Metropolis algorithm.
3. January 30: Applications in sociology, simulations of phase transitions in physics and quantum physics.
4. All material is taken from my text on Computational Physics, see http://www.uio.no/studier/emner/matnat/fys/FYS3150/h06/undervisningsmateriale/LectureNotes/.



What is Monte Carlo?

1. Monte Carlo methods are nowadays widely used, from the evaluation of multi-dimensional integrals to ab initio problems in chemistry, physics, medicine, biology, or even Dow-Jones forecasting. Computational finance is one of the novel areas where Monte Carlo methods have found new applications, with financial engineering as an emerging field.

2. Numerical methods known as Monte Carlo methods can be loosely described as statistical simulation methods, where statistical simulation is defined quite generally as any method that utilizes sequences of random numbers to perform the simulation.


Monte Carlo Keywords

- Consider it a numerical experiment
- Find a probability distribution function (PDF)
- Be able to generate random variables following a given PDF
- A sampling rule for accepting a move
- Compute the standard deviation and other expectation values
- Techniques for reducing the error


The Plethora of Applications; from the Sciences to Social Studies

1. Quantum physics and chemistry: variational, diffusion and path-integral Monte Carlo
2. Simulations of phase transitions, classical ones and quantal ones such as superfluidity (quantum liquids)
3. Lattice Quantum Chromodynamics (QCD), the only way to test the fundamental forces of Nature (own dedicated high-performance computing machine)
4. Reconstruction of particle-collision paths at, for example, CERN
5. Solution of stochastic differential equations
6. Dow-Jones forecasting and financial engineering
7. Modelling electoral patterns
8. Ecological evolution models, percolation, forest fires, earthquakes, and so forth


Selected Texts

- C. P. Robert and G. Casella, Monte Carlo Statistical Methods, Springer, 2nd edition, 2004.
- M. E. J. Newman and G. T. Barkema, Monte Carlo Methods in Statistical Physics, Oxford University Press, 1999.
- P. Glasserman, Monte Carlo Methods in Financial Engineering, Springer, 2003.
- B. L. Hammond, W. A. Lester, Jr., and P. J. Reynolds, Monte Carlo Methods in Ab Initio Quantum Chemistry, World Scientific, 1994.
- G. S. Fishman, Monte Carlo: Concepts, Algorithms and Applications, Springer, 1996.


Important Application: Monte Carlo Integration

Consider
$$ I = \int_0^1 f(x)\,dx \approx \sum_{i=1}^{N} \omega_i f(x_i), $$
where the $\omega_i$ are the weights determined by the specific integration method (like Simpson's or Taylor's methods) with $x_i$ the given mesh points. To give you a feeling of how we are to evaluate the above integral using Monte Carlo, we employ here the crudest possible approach. Later on we will present more refined approaches. This crude approach consists in setting all weights equal to one, $\omega_i = 1$. Recall also that $dx = h = (b-a)/N$, where $b = 1$, $a = 0$ in our case, and $h$ is the step size. We can then rewrite the above integral as
$$ I = \int_0^1 f(x)\,dx \approx \frac{1}{N}\sum_{i=1}^{N} f(x_i). $$
Introduce the concept of the average of the function $f$ for a given probability distribution function $p(x)$ as
$$ E[f] = \langle f \rangle = \frac{1}{N}\sum_{i=1}^{N} f(x_i)\,p(x_i), $$
and identify $p(x)$ with the uniform distribution, viz $p(x) = 1$ when $x \in [0,1]$ and zero for all other values of $x$.


Monte Carlo Integration

The integral is then the average of $f$ over the interval $x \in [0,1]$,
$$ I = \int_0^1 f(x)\,dx \approx E[f] = \langle f \rangle. $$
In addition to the average value $E[f]$, the other important quantity in a Monte Carlo calculation is the variance $\sigma^2$ and the standard deviation $\sigma$. We define first the variance of the integral with $f$ for a uniform distribution in the interval $x \in [0,1]$ to be
$$ \sigma_f^2 = \frac{1}{N}\sum_{i=1}^{N}\left(f(x_i) - \langle f \rangle\right)^2 p(x_i), $$
and inserting the uniform distribution this yields
$$ \sigma_f^2 = \frac{1}{N}\sum_{i=1}^{N} f(x_i)^2 - \left(\frac{1}{N}\sum_{i=1}^{N} f(x_i)\right)^2, $$
or
$$ \sigma_f^2 = E[f^2] - \left(E[f]\right)^2, $$
which is nothing but a measure of the extent to which $f$ deviates from its average over the region of integration.


But what do we gain by Monte Carlo Integration?

The trapezoidal rule carries a truncation error $O(h^2)$, with $h$ the step length. In general, quadrature rules such as Newton-Cotes have a truncation error which goes like $\sim O(h^k)$, with $k \ge 1$. Recalling that the step size is defined as $h = (b-a)/N$, we have an error which goes like $\sim N^{-k}$.

Monte Carlo integration is more efficient in higher dimensions. Assume that our integration volume is a hypercube with side $L$ and dimension $d$. This cube hence contains $N = (L/h)^d$ points, and therefore the error in the result scales as $N^{-k/d}$ for the traditional methods. The error in Monte Carlo integration is, however, independent of $d$ and scales as $\sigma \sim 1/\sqrt{N}$, always! Comparing this with traditional methods shows that Monte Carlo integration is more efficient than an order-$k$ algorithm when $d > 2k$.


Why Monte Carlo Integration?

An example from quantum mechanics: most problems of interest in, e.g., atomic, molecular, nuclear and solid state physics consist of a large number of interacting electrons and ions or nucleons. The total number of particles $N$ is usually sufficiently large that an exact solution cannot be found. Typically, the expectation value of a chosen Hamiltonian for a system of $N$ particles is
$$ \langle H \rangle = \frac{\int dR_1\,dR_2 \ldots dR_N\,\Psi^*(R_1,R_2,\ldots,R_N)\,H(R_1,R_2,\ldots,R_N)\,\Psi(R_1,R_2,\ldots,R_N)}{\int dR_1\,dR_2 \ldots dR_N\,\Psi^*(R_1,R_2,\ldots,R_N)\,\Psi(R_1,R_2,\ldots,R_N)}, $$
an in general intractable problem. This integral is actually the starting point in a Variational Monte Carlo calculation.

Gaussian quadrature: Forget it! Given 10 particles and 10 mesh points for each degree of freedom and an ideal 1 Tflops machine (all operations take the same time), how long will it take to compute the above integral? Lifetime of the universe: $T \approx 4.7 \times 10^{17}$ s.


The Dimensionality Curse

As an example from the nuclear many-body problem, we have Schrödinger's equation as a differential equation
$$ \hat{H}\Psi(r_1,\ldots,r_A,\alpha_1,\ldots,\alpha_A) = E\Psi(r_1,\ldots,r_A,\alpha_1,\ldots,\alpha_A), $$
where $r_1,\ldots,r_A$ are the coordinates and $\alpha_1,\ldots,\alpha_A$ are sets of relevant quantum numbers such as spin and isospin for a system of $A$ nucleons ($A = N + Z$, $N$ being the number of neutrons and $Z$ the number of protons).


More on Dimensionality

There are
$$ 2^A \times \binom{A}{Z} $$
coupled second-order differential equations in $3A$ dimensions. For a nucleus like $^{10}$Be this number is 215040. This is a truly challenging many-body problem.


Another classic: Radioactive Decay

Assume that at the time $t = 0$ we have $N(0)$ nuclei of type X which can decay radioactively. At a time $t > 0$ we are left with $N(t)$ nuclei. With a transition probability $\omega$, which expresses the probability that the system will make a transition to another state during a time step of one second, we have the following first-order differential equation
$$ dN(t) = -\omega N(t)\,dt, $$
whose solution is
$$ N(t) = N(0)e^{-\omega t}, $$
where we have defined the mean lifetime $\tau$ of X as
$$ \tau = \frac{1}{\omega}. $$


Radioactive Decay

- Probability for a decay of a particle during a time step $\Delta t$ is $\Delta N(t) = -\lambda N(t)\Delta t$
- $\lambda$ is inversely proportional to the lifetime
- Choose the number of particles $N(t = 0) = N_0$
- Make a loop over the number of time steps, with maximum time bigger than the number of particles $N_0$
- At every time step there is a probability $\lambda$ for decay; compare this probability with a random number $x$
- If $x \le \lambda$, reduce the number of particles by one, i.e., $N = N - 1$; if not, keep the same number of particles until the next time step
- Increase by one the time step (the external loop)


Radioactive Decay idum=-1; // initialise random number generator // loop over monte carlo cycles // One monte carlo loop is one sample for (cycles = 1; cycles