
Conditions for the convergence of SOR

The conditions for the convergence of the successive over-relaxation method are found via Kahan's Theorem (Equation 3.10). Note that, since $ D-wL$ is lower triangular with diagonal $ D$ and $ D^{-1}U$ is strictly upper triangular,
$\displaystyle \det(H)$ $\displaystyle =$ $\displaystyle \det\left((D-wL)^{-1}\right)\det\left((1-w)D+wU\right)$
  $\displaystyle =$ $\displaystyle \det\left(D^{-1}\right)\det\left((1-w)D+wU\right)$
  $\displaystyle =$ $\displaystyle \det\left((1-w)I+wD^{-1}U\right)=\det\left((1-w)I\right)=(1-w)^n$ (3.9)

Since $ \det(H)=\prod_i \lambda_i$ (where the $ \lambda_i$ are the eigenvalues of $ H$), and the spectral radius is defined by $ \rho(H)=\max_i\vert\lambda_i\vert$,

$\displaystyle \left\vert\prod_i \lambda_i\right\vert = \vert\det(H)\vert=\vert 1-w\vert^n \leq \rho(H)^n  \Rightarrow \rho(H)\geq\vert w-1\vert$ (3.10)
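Continuing the snippet above (same test matrix and splitting), the eigenvalues of $ H$ can be computed directly to check this bound for a few sample values of $ w$; this is only an illustrative check, not part of the proof.

for w in (0.5, 1.0, 1.5, 1.9, 2.5):
    H = np.linalg.solve(D - w * L, (1.0 - w) * D + w * U)
    rho = np.max(np.abs(np.linalg.eigvals(H)))   # spectral radius rho(H)
    # Kahan's bound: rho(H) >= |w - 1| for every w
    print(f"w={w}: rho(H)={rho:.4f} >= |w-1|={abs(w - 1.0):.4f}")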

Now the Fundamental Theorem of Linear Iterative Methods (see the final section) tells us the iteration converges if and only if $ \rho(H)<1$; since $ \rho(H)\geq\vert w-1\vert$, a necessary condition for convergence is

$\displaystyle 1>\vert w-1\vert \Rightarrow 0<w<2$
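To see this condition in action, a minimal SOR sweep can be written straight from the splitting above; the sketch below reuses the test matrix $ A$ and an arbitrary right-hand side $ b$ chosen only for illustration, converging for a value of $ w$ inside $ (0,2)$ and blowing up for one outside it.

def sor_solve(A, b, w, iters=50):
    """Plain SOR sweeps: x_{k+1} = (D - wL)^{-1} [ ((1-w)D + wU) x_k + w b ]."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, k=-1)
    U = -np.triu(A, k=1)
    x = np.zeros_like(b)
    for _ in range(iters):
        x = np.linalg.solve(D - w * L, ((1.0 - w) * D + w * U) @ x + w * b)
    return x

b = np.array([1.0, 2.0, 3.0])                           # illustrative right-hand side
print(np.linalg.norm(A @ sor_solve(A, b, 1.2) - b))     # small residual: converges
print(np.linalg.norm(A @ sor_solve(A, b, 2.5) - b))     # huge residual: diverges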

Obviously $ w=1$ recovers Gauss-Seidel, and under-relaxation ($ w<1$) gains us nothing here, so we consider $ 1<w<2$. It is now our task to find which value of $ w$ in this region is most efficient.
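As a numerical preview of that task, sweeping $ w$ across the interval and recording $ \rho(H)$ for the same test matrix locates the value with the smallest spectral radius, i.e. the fastest asymptotic convergence for this particular matrix.

ws = np.linspace(1.0, 1.99, 100)
rhos = []
for w in ws:
    H = np.linalg.solve(D - w * L, (1.0 - w) * D + w * U)
    rhos.append(np.max(np.abs(np.linalg.eigvals(H))))
best = ws[int(np.argmin(rhos))]
print(f"spectral radius is smallest near w = {best:.3f}")  # the fastest-converging choice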
Timothy Jones 2006-02-24