### Random multiplicative processes

Introduction

Examples of random multiplicative processes include the distributions of incomes, rainfall, and fragment sizes in rock crushing processes. Consider the latter, for which we begin with a rock of size $w$. We strike the rock with a hammer and generate two fragments whose sizes are $pw$ and $qw$, where $q = 1 - p$. In the next step the possible sizes of the fragments are $p^2 w$, $pqw$, $qpw$, and $q^2 w$. What is the distribution of the fragments after $N$ blows of the hammer?

To answer this question, consider a binary sequence of $N$ factors in which the numbers $x_1$ and $x_2$ appear independently with probabilities $p$ and $q$, respectively, and let $\Pi$ denote the product of the $N$ factors. What is $\langle \Pi \rangle$, the mean value of $\Pi$? To compute $\langle \Pi \rangle$, we define $P(n)$ as the probability that the product of $N$ independent factors of $x_1$ and $x_2$ has the value $x_1^n x_2^{N-n}$. This probability is the number of sequences in which $x_1$ appears $n$ times multiplied by the probability of choosing a specific sequence with $x_1$ appearing $n$ times:

$$P(n) = \frac{N!}{n!\,(N-n)!}\, p^n q^{N-n}.$$
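As a numerical sketch (the class name and structure here are illustrative, not the textbook's code), $P(n)$ can be evaluated without overflowing factorials by building the binomial coefficient multiplicatively:

```java
// Illustrative sketch: evaluate P(n) = N!/(n!(N-n)!) p^n q^(N-n).
public class BinomialProbability {
    // Binomial coefficient built multiplicatively to avoid factorial overflow.
    static double binomial(int N, int n) {
        double c = 1.0;
        for (int i = 1; i <= n; i++) {
            c *= (N - n + i) / (double) i;
        }
        return c;
    }

    static double probability(int N, int n, double p) {
        double q = 1.0 - p;
        return binomial(N, n) * Math.pow(p, n) * Math.pow(q, N - n);
    }

    public static void main(String[] args) {
        int N = 10;
        double p = 0.5, sum = 0.0;
        for (int n = 0; n <= N; n++) {
            sum += probability(N, n, p);
        }
        System.out.println("sum of P(n) = " + sum); // should be 1 within roundoff
    }
}
```

Summing $P(n)$ over all $n$ provides a quick consistency check, since the probabilities must be normalized.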

The mean value of the product is given by

$$\langle \Pi \rangle = \sum_{n=0}^{N} P(n)\, x_1^n x_2^{N-n} = (p x_1 + q x_2)^N.$$
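This identity can be checked numerically by evaluating the sum term by term and comparing it with the closed form. A minimal sketch (the parameter values are illustrative only):

```java
// Sketch: verify that sum_{n=0}^{N} P(n) x1^n x2^(N-n) equals (p*x1 + q*x2)^N.
public class MeanProductCheck {
    static double binomial(int N, int n) {
        double c = 1.0;
        for (int i = 1; i <= n; i++) c *= (N - n + i) / (double) i;
        return c;
    }

    // Left-hand side: explicit sum over the binomial distribution.
    static double meanBySum(int N, double x1, double x2, double p) {
        double q = 1.0 - p, sum = 0.0;
        for (int n = 0; n <= N; n++) {
            sum += binomial(N, n) * Math.pow(p, n) * Math.pow(q, N - n)
                   * Math.pow(x1, n) * Math.pow(x2, N - n);
        }
        return sum;
    }

    // Right-hand side: the closed form (p x1 + q x2)^N.
    static double meanClosedForm(int N, double x1, double x2, double p) {
        return Math.pow(p * x1 + (1.0 - p) * x2, N);
    }

    public static void main(String[] args) {
        int N = 10;
        System.out.println(meanBySum(N, 1.5, 0.75, 0.4));      // explicit sum
        System.out.println(meanClosedForm(N, 1.5, 0.75, 0.4)); // closed form
    }
}
```

The agreement is exact (up to roundoff) because the sum is just the binomial expansion of $(p x_1 + q x_2)^N$.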

The most probable event is one in which the product contains $Np$ factors of $x_1$ and $Nq$ factors of $x_2$. Hence, the most probable value of the product is

$$\Pi_{mp} = \left(x_1^p x_2^q\right)^N.$$
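Because the per-factor geometric mean $x_1^p x_2^q$ never exceeds the per-factor arithmetic mean $p x_1 + q x_2$, the two values can differ dramatically for large $N$. A short sketch comparing them (the parameter values are chosen for illustration and are not from the text):

```java
// Sketch: compare the most probable value Π_mp = (x1^p x2^q)^N
// with the mean <Π> = (p*x1 + q*x2)^N. Parameters are illustrative only.
public class MostProbableVsMean {
    static double mostProbable(int N, double x1, double x2, double p) {
        double q = 1.0 - p;
        return Math.pow(Math.pow(x1, p) * Math.pow(x2, q), N);
    }

    static double mean(int N, double x1, double x2, double p) {
        return Math.pow(p * x1 + (1.0 - p) * x2, N);
    }

    public static void main(String[] args) {
        int N = 20;
        double x1 = 1.5, x2 = 0.5, p = 0.5;
        // Here the per-factor arithmetic mean is 1, so <Π> = 1 for all N,
        // while the per-factor geometric mean sqrt(0.75) < 1 makes Π_mp decay.
        System.out.println("Pi_mp = " + mostProbable(N, x1, x2, p));
        System.out.println("<Pi>  = " + mean(N, x1, x2, p));
    }
}
```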

Problems

1. The average value of the sum of random variables is a good approximation to the most probable value of the sum. Is there a similar relation for a random multiplicative process? First consider $x_1 = 2$, $x_2 = 1/2$, and $p = q = 1/2$. Determine $\langle \Pi \rangle$ and $\Pi_{mp}$.
2. Use ProductProcessApp to estimate $\langle \Pi \rangle$ and $\Pi_{mp}$ for the same parameters as in Problem 1. Do your estimated values converge more or less uniformly to the exact values as the number of measurements becomes large? Do a similar simulation for $N = 20$. Compare your results with a similar simulation of a random walk and discuss the importance of extreme events for random multiplicative processes.
3. *The average value of a product of random variables is governed by rare events in the tail of the distribution. However, the most probable events are likely to dominate a simulation of a multiplicative process. As the number of trials increases, more of the rare events are sampled, and we expect the observed averages to fluctuate greatly. As the number of trials is increased still further, the rare events are sampled more accurately, and the observed averages eventually converge to their true values. Redner has estimated that the minimum number of trials for this crossover to occur is given by

where $T^*$ is the number of trials and $m$ is the order of the moment of the distribution that we wish to estimate. How does the estimate of $T^*$ compare with the results you observe in the simulation?
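For experimenting with these problems, a direct Monte Carlo measurement can be sketched as follows. This is an assumed minimal stand-in, not the ProductProcessApp class itself: each trial forms $\Pi$ from $N$ random factors, and we accumulate both the arithmetic mean of $\Pi$ and the mean of $\log \Pi$.

```java
import java.util.Random;

// Minimal Monte Carlo sketch (an assumed stand-in, not ProductProcessApp):
// each trial draws N factors, x1 with probability p else x2, and forms Π.
public class ProductProcessSketch {
    // Returns {estimate of <Π>, exp(<log Π>)} over the given number of trials.
    static double[] estimate(int N, int trials, double x1, double x2,
                             double p, long seed) {
        Random rng = new Random(seed);
        double sum = 0.0, sumLog = 0.0;
        for (int t = 0; t < trials; t++) {
            double product = 1.0;
            for (int i = 0; i < N; i++) {
                product *= (rng.nextDouble() < p) ? x1 : x2;
            }
            sum += product;                // accumulates the arithmetic mean
            sumLog += Math.log(product);   // accumulates the typical (log) value
        }
        return new double[] { sum / trials, Math.exp(sumLog / trials) };
    }

    public static void main(String[] args) {
        double[] r = estimate(20, 100000, 2.0, 0.5, 0.5, 1L);
        System.out.println("<Pi> estimate: " + r[0]);
        System.out.println("exact <Pi>:    " + Math.pow(1.25, 20));
        System.out.println("typical value: " + r[1]); // compare with Pi_mp
    }
}
```

Because $\langle \Pi \rangle$ is dominated by rare sequences rich in the larger factor, the first estimate typically fluctuates strongly from run to run, while $\exp\langle \log \Pi \rangle$ converges quickly toward $\Pi_{mp}$, which is the behavior Problems 2 and 3 ask you to examine.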

References

• S. Redner, Random multiplicative processes: An elementary tutorial, Am. J. Phys. 58, 267–273 (1990).

Java Classes

• ProductProcessApp

Updated 28 December 2009.