Successive Over Relaxation (SOR) Method

A better rate of convergence is achieved by the Successive Over Relaxation (SOR) method. In this approach, the old field and the new field obtained from a Gauss-Seidel update are mixed via a relaxation parameter $\omega$:

\begin{displaymath}
U_{i,j} = U^{old}_{i,j} + \omega \left( U^{new}_{i,j} - U^{old}_{i,j} \right)
\end{displaymath} (21)

This method accelerates convergence by scaling the change proposed by Gauss-Seidel. If $\omega > 1$ (over-relaxation), the change is amplified; if $\omega < 1$ (under-relaxation), it is damped; and if $\omega = 1$, the Gauss-Seidel method is recovered. For the iteration to converge, $\omega$ must lie in the range $0 < \omega < 2$.
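As an illustration, the update of Eq. (21) can be sketched for the 2D Laplace equation, where the Gauss-Seidel candidate $U^{new}_{i,j}$ is the average of the four neighbouring points. This is a minimal sketch: the grid size, boundary values, stopping criterion, and function names below are illustrative assumptions, not part of the original text.

```python
import numpy as np

def sor_sweep(U, omega):
    """One in-place SOR sweep for the 2D Laplace equation on a uniform
    grid with fixed (Dirichlet) boundary values.

    Implements Eq. (21): each interior point is first given its
    Gauss-Seidel candidate value (the average of its four neighbours,
    using already-updated values where available), then mixed with the
    old value via the relaxation parameter omega."""
    n, m = U.shape
    for i in range(1, n - 1):
        for j in range(1, m - 1):
            u_old = U[i, j]
            u_new = 0.25 * (U[i + 1, j] + U[i - 1, j]
                            + U[i, j + 1] + U[i, j - 1])
            U[i, j] = u_old + omega * (u_new - u_old)

def solve_laplace(U, omega=1.5, tol=1e-8, max_iter=10_000):
    """Iterate SOR sweeps until the largest pointwise change falls
    below tol (an assumed convergence criterion; others are possible).
    Returns the number of sweeps performed."""
    for k in range(1, max_iter + 1):
        prev = U.copy()
        sor_sweep(U, omega)
        if np.max(np.abs(U - prev)) < tol:
            return k
    return max_iter
```

On a small test grid (boundary value 1 on one edge, 0 elsewhere), over-relaxation with $\omega = 1.5$ needs noticeably fewer sweeps than the $\omega = 1$ (Gauss-Seidel) case to meet the same tolerance.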

The parameter $\omega$ should vary with the iteration number: it should be small during the first few iterations, when the guessed field may be far from the solution; close to 1 for many intermediate iterations; and finally larger than 1 to accelerate convergence in the later stages of the iteration. The choice of the best value of $\omega$ is discussed in Numerical Recipes in the section on SOR; this choice falls somewhat within the folklore of solving elliptic equations.
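For the model Laplace problem on a square grid, a commonly quoted estimate for the asymptotically optimal relaxation parameter is $\omega_{opt} = 2 / (1 + \sqrt{1 - \rho_{Jacobi}^2})$, where $\rho_{Jacobi} = \cos(\pi/J)$ is the spectral radius of the Jacobi iteration on a $J \times J$ grid. A minimal sketch (the function name is an assumption for illustration):

```python
import math

def omega_optimal(J):
    """Estimate of the asymptotically optimal SOR parameter for the
    2D Laplace model problem on a J x J uniform grid:
        rho_Jacobi = cos(pi / J)
        omega_opt  = 2 / (1 + sqrt(1 - rho_Jacobi**2))
    The result always lies in the convergent range 1 < omega < 2."""
    rho = math.cos(math.pi / J)
    return 2.0 / (1.0 + math.sqrt(1.0 - rho * rho))
```

Note that $\omega_{opt}$ approaches 2 as the grid is refined, which is why a fixed, hand-tuned $\omega$ becomes increasingly inadequate on large grids.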

2015-01-07