The probability that a system in equilibrium with a heat bath at temperature T has energy E is given by
P(E, β) = g(E) e^(-βE)/Z,
where Z is the partition function and g(E) is the number of states with energy E. If g(E) is known, we can calculate the mean energy and other thermodynamic quantities at any temperature from the relation
⟨E⟩ = (1/Z) ∑_E E g(E) e^(-βE).
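As an illustration, this sum can be evaluated directly once g(E) is known. The following sketch uses the exactly enumerable 2 × 2 Ising lattice with periodic boundary conditions (N = 4 spins, 16 configurations), for which g(-8) = 2, g(0) = 12, and g(8) = 2; the function name is ours, not part of any standard library.

```python
import math

# Exact density of states for the 2x2 Ising model with periodic
# boundary conditions (N = 4 spins, 2^4 = 16 configurations).
g = {-8: 2, 0: 12, 8: 2}

def mean_energy(g, beta):
    """<E> = (1/Z) sum_E E g(E) exp(-beta*E)."""
    Z = sum(gE * math.exp(-beta * E) for E, gE in g.items())
    return sum(E * gE * math.exp(-beta * E) for E, gE in g.items()) / Z

# At infinite temperature (beta = 0) every state is equally likely,
# so <E> = (1/2^N) sum_E E g(E) = 0 by the up-down symmetry of g(E).
print(mean_energy(g, 0.0))
# At low temperature the two ground states (E = -8) dominate, so
# <E> approaches -8.
print(mean_energy(g, 2.0))
```

For larger systems the Boltzmann factors overflow double precision, which is one reason the logarithm of g(E) is used in practice, as discussed below.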
Hence, the quantity g(E) is of considerable interest.
In the following we discuss an algorithm for directly computing g(E). To be concrete, we apply the algorithm to the two-dimensional Ising model. In this model the energy is a discrete variable, and hence the quantity we wish to compute is the number of spin configurations with the same energy. If we were to apply the algorithm to a system for which the energy is continuous, we would compute the number of states with energy between E and E + ΔE, that is, g(E)ΔE. In this case, g(E) would be the density of states. It is common to refer to g(E) as the density of states even when we really mean the number of states.
Suppose that we were to try to compute g(E) by doing a random walk in energy space, flipping spins at random and accepting all configurations that we obtain in this way. The histogram of the energy, H(E), the number of visits to each possible energy E of the system, would converge to g(E) if the walk visited all possible configurations. In practice, it would be impossible to realize such a long random walk given the extremely large number of configurations. For example, the Ising model on an L = 10 square lattice has 2^100 ≅ 1.3 × 10^30 spin configurations.
The main difficulty of doing a simple random walk to determine g(E) is that the walk would spend most of its time visiting the same energy values over and over again and would not reach the values of E that are less probable. The idea of the Wang-Landau algorithm is to do a random walk in energy space by flipping single spins at random and accepting the changes with a probability that is proportional to the reciprocal of the density of states. In this way energy values that would be visited often using a simple random walk would be visited less often because they have a larger density of states. There is only one problem -- we don't know the density of states. We will see that the Wang-Landau algorithm estimates the density of states at the same time that it does a random walk in phase space.
The probability of accepting a trial change from a configuration with energy E1 to one with energy E2 is

p(E1 → E2) = min(g(E1)/g(E2), 1).

This equation implies that if g(E2) ≤ g(E1), the state with energy E2 is always accepted; otherwise, it is accepted with probability g(E1)/g(E2). That is, the state with energy E2 is accepted if a random number r ≤ g(E1)/g(E2).
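This acceptance rule can be sketched in a few lines of Python. Here g is a dictionary mapping each energy to the current density-of-states estimate; the function name is ours, chosen for illustration.

```python
import random

def accept(g, E1, E2):
    """Accept a trial move from energy E1 to E2 with probability
    min(g(E1)/g(E2), 1)."""
    if g[E2] <= g[E1]:
        return True  # smaller (or equal) density of states: always accept
    # Otherwise accept only if a random number r <= g(E1)/g(E2).
    return random.random() < g[E1] / g[E2]

# Example: a move toward the less-populated energy is always taken,
# while the reverse move succeeds about 10% of the time here.
g = {0: 1.0, 4: 10.0}
print(accept(g, 4, 0))   # always True
```

A move that lowers the density of states is thus never rejected, which is what drives the walk toward rarely visited energies.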
Each time an energy E is visited, we multiply the current estimate of its density of states by a modification factor f > 1:

g(E) → f g(E).

We also update the existing entry for H(E) in the energy histogram:
H(E) → H(E) + 1.

Because g(E) becomes very large, in practice we must work with the logarithm of the density of states, so that ln g(E) will fit into double precision numbers. Therefore, each update of the density of states is implemented as
ln g(E) → ln g(E) + ln f,

and the ratio of the densities of states is computed as exp[ln g(E1) − ln g(E2)].
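A minimal sketch of these log-space updates, assuming the common initial choice f = e (so ln f = 1); the dictionaries and function names are ours:

```python
import math

ln_f = 1.0   # modification factor f = e, a common initial choice, so ln f = 1
ln_g = {}    # running estimate of ln g(E)
H = {}       # energy histogram H(E)

def visit(E):
    """Record a visit to energy E:
    ln g(E) -> ln g(E) + ln f  and  H(E) -> H(E) + 1."""
    ln_g[E] = ln_g.get(E, 0.0) + ln_f
    H[E] = H.get(E, 0) + 1

def ratio(E1, E2):
    """g(E1)/g(E2) computed as exp[ln g(E1) - ln g(E2)];
    unvisited energies default to ln g = 0, i.e. g = 1."""
    return math.exp(ln_g.get(E1, 0.0) - ln_g.get(E2, 0.0))
```

In a full simulation one would also reduce ln f (for example, ln f → ln f/2) each time the histogram H(E) becomes sufficiently flat, but that refinement schedule is not shown here.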
At the end of the simulation, the algorithm provides only a relative density of states. To determine the normalized density of states we can either use the fact that the total number of states for the Ising model is
∑_E g(E) = 2^N,
or that the number of ground states (for which E = -2N) is 2. The latter normalization guarantees the accuracy of the density of states at low energies which is important in the calculation of thermodynamic quantities at low temperature. If we apply the former condition, we cannot guarantee the accuracy of g(E) for energies at or near the ground state, because the rescaling factor is dominated by the maximum density of states. We can use one of these two normalization conditions to obtain the absolute density of states, and use the other normalization condition to check the accuracy of our result.
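The ground-state normalization amounts to a constant shift of ln g(E). A short sketch (the helper name is ours), checked against the exactly known density of states of the 2 × 2 lattice, for which ∑_E g(E) = 2^4 = 16:

```python
import math

def normalize_by_ground_state(ln_g, E_ground, n_ground=2):
    """Shift the relative ln g(E) so that g(E_ground) = n_ground.
    For the Ising model the ground state E = -2N is doubly degenerate."""
    shift = math.log(n_ground) - ln_g[E_ground]
    return {E: lg + shift for E, lg in ln_g.items()}

# Relative ln g for the 2x2 lattice (g = 2, 12, 2 up to an arbitrary
# multiplicative constant), as a Wang-Landau run might produce.
ln_g_rel = {-8: 5.0, 0: 5.0 + math.log(6), 8: 5.0}
ln_g_abs = normalize_by_ground_state(ln_g_rel, E_ground=-8)

# Cross-check with the other condition: sum_E g(E) = 2^N = 16.
total = sum(math.exp(lg) for lg in ln_g_abs.values())
print(round(total, 6))   # → 16.0
```

Here the second condition serves as the consistency check suggested in the text.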
C ≅ C0 ln L.

Is your data for C consistent with this form? The constant C0 is approximately 0.4995.
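To compute C from the normalized density of states, one can use the standard fluctuation relation C = β²(⟨E²⟩ − ⟨E⟩²)/N per spin (with Boltzmann's constant set to 1); this relation is standard statistical mechanics rather than part of the algorithm above. A sketch, again using the exact 2 × 2 density of states for illustration:

```python
import math

def specific_heat(g, beta, N):
    """Specific heat per spin from the density of states:
    C = beta^2 (<E^2> - <E>^2) / N, with k = 1."""
    Z  = sum(gE * math.exp(-beta * E) for E, gE in g.items())
    E1 = sum(E * gE * math.exp(-beta * E) for E, gE in g.items()) / Z
    E2 = sum(E * E * gE * math.exp(-beta * E) for E, gE in g.items()) / Z
    return beta**2 * (E2 - E1**2) / N

# Exact density of states for the 2x2 Ising model (N = 4 spins).
g = {-8: 2, 0: 12, 8: 2}
print(specific_heat(g, beta=0.5, N=4))
```

Evaluating C(T) for a sequence of lattice sizes L at the critical temperature, with g(E) obtained from the Wang-Landau runs, would then let you test the ln L form quoted above.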
Updated 6 February 2010.