Numerical Methods: Final Exam Content 2023/1

October 24, 2023

Table of Contents

Linear algebraic equations (cont.)

Jacobi Iteration Method

From $Ax = B$:

$$\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \begin{Bmatrix} x_1 \\ x_2 \\ x_3 \end{Bmatrix} = \begin{Bmatrix} b_1 \\ b_2 \\ b_3 \end{Bmatrix}$$

$$a_{11}{\color{red}x_1} + a_{12}x_2 + a_{13}x_3 = b_1 \\ x_1 = \dfrac{b_1 - a_{12}x_2 - a_{13}x_3}{a_{11}} \quad\therefore\quad x_1^{k+1} = \dfrac{b_1 - a_{12}x_2^k - a_{13}x_3^k}{a_{11}}$$

$$a_{21}x_1 + a_{22}{\color{red}x_2} + a_{23}x_3 = b_2 \\ x_2 = \dfrac{b_2 - a_{21}x_1 - a_{23}x_3}{a_{22}} \quad\therefore\quad x_2^{k+1} = \dfrac{b_2 - a_{21}x_1^k - a_{23}x_3^k}{a_{22}}$$

$$a_{31}x_1 + a_{32}x_2 + a_{33}{\color{red}x_3} = b_3 \\ x_3 = \dfrac{b_3 - a_{31}x_1 - a_{32}x_2}{a_{33}} \quad\therefore\quad x_3^{k+1} = \dfrac{b_3 - a_{31}x_1^k - a_{32}x_2^k}{a_{33}}$$

The error of every $x_i$ must be checked against the error threshold; iterate until all of them fall below it:

$$\text{error} = \left| \dfrac{x_i^{k+1} - x_i^{k}}{x_i^{k+1}} \times 100\% \right|$$
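A minimal NumPy sketch of the iteration above, using the percent-error stopping rule (the function name `jacobi` and the threshold value are illustrative; `A` is assumed diagonally dominant so the iteration converges):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-4, max_iter=100):
    """Jacobi iteration: every x_i^{k+1} is built only from iteration-k values."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_new = np.empty_like(x)
        for i in range(len(b)):
            s = A[i] @ x - A[i, i] * x[i]      # sum of a_ij * x_j^k for j != i
            x_new[i] = (b[i] - s) / A[i, i]
        # stop once every x_i passes the percent-error criterion above
        if np.all(np.abs((x_new - x) / x_new) * 100 < tol):
            return x_new
        x = x_new
    return x
```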

Gauss-Seidel Iteration Method

Same as the Jacobi iteration method, except that whenever an updated value $x^{k+1}$ has already been computed in the current sweep, it is used immediately.

Example.

$$\begin{aligned} x_1^{k+1} & = 100 + x_2^k \\ x_2^{k+1} & = 400 + \dfrac{x_1^k}{4} + \dfrac{x_3^k}{2} \\ x_3^{k+1} & = 100 + \dfrac{x_2^k}{2} \end{aligned}$$

Sweeping $x_1 \Rightarrow x_2 \Rightarrow x_3$:

$$\begin{aligned} & \text{1. } {\color{red}x_1^{k+1}} = 100 + x_2^k \\ & \text{2. } {\color{blue}x_2^{k+1}} = 400 + \dfrac{\color{red}x_1^{k+1}}{4} + \dfrac{x_3^k}{2} \\ & \text{3. } x_3^{k+1} = 100 + \dfrac{\color{blue}x_2^{k+1}}{2} \end{aligned}$$

Sweeping $x_3 \Rightarrow x_2 \Rightarrow x_1$:

$$\begin{aligned} & \text{1. } {\color{red}x_3^{k+1}} = 100 + \dfrac{x_2^k}{2} \\ & \text{2. } {\color{blue}x_2^{k+1}} = 400 + \dfrac{x_1^k}{4} + \dfrac{\color{red}x_3^{k+1}}{2} \\ & \text{3. } x_1^{k+1} = 100 + {\color{blue}x_2^{k+1}} \end{aligned}$$
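The same sketch adapted to Gauss-Seidel (hypothetical name `gauss_seidel`): updating `x` in place means each row automatically sees the $x^{k+1}$ values computed earlier in the sweep:

```python
import numpy as np

def gauss_seidel(A, b, x0, tol=1e-4, max_iter=100):
    """Gauss-Seidel: reuse each x_j^{k+1} as soon as it has been computed."""
    x = np.array(x0, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(len(b)):
            s = A[i] @ x - A[i, i] * x[i]      # mixes updated (j < i) and old (j > i) values
            x[i] = (b[i] - s) / A[i, i]
        if np.all(np.abs((x - x_old) / x) * 100 < tol):
            return x
    return x
```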

Conjugate Gradient Method

$\lfloor M \rfloor$ denotes the transpose of the matrix $[M]$.

$$\begin{aligned} & \text{Initial value} \\ \{R\}^0 & = [A]\{X\}^0 - \{B\} \\ \{D\}^0 & = -\{R\}^0 \\ & \text{Each iteration (starting at } k = 0) \\ \lambda_k & = - \dfrac{\lfloor D \rfloor^k\{R\}^k}{\lfloor D \rfloor^k [A] \{D\}^k} \\ \{X\}^{k+1} & = \{X\}^k + \lambda_k \{D\}^k \\ \{R\}^{k+1} & = [A]\{X\}^{k+1} - \{B\} \\ \text{Error} & = \sqrt{\lfloor R \rfloor^{k+1} \{R\}^{k+1}} \\ \alpha_k & = \dfrac{\lfloor R \rfloor^{k+1}[A]\{D\}^k}{\lfloor D \rfloor^k[A]\{D\}^k} \\ \{D\}^{k+1} & = -\{R\}^{k+1} + \alpha_k\{D\}^k \end{aligned}$$
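A direct transcription of the steps above into NumPy (assuming $[A]$ is symmetric positive definite, which the method requires); each line is labeled with the formula it implements:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-8, max_iter=100):
    x = np.array(x0, dtype=float)
    r = A @ x - b                        # {R}^0 = [A]{X}^0 - {B}
    d = -r                               # {D}^0 = -{R}^0
    for _ in range(max_iter):
        Ad = A @ d
        lam = -(d @ r) / (d @ Ad)        # lambda_k
        x = x + lam * d                  # {X}^{k+1} = {X}^k + lambda_k {D}^k
        r = A @ x - b                    # {R}^{k+1}
        if np.sqrt(r @ r) < tol:         # Error = sqrt(<R, R>)
            break
        alpha = (r @ Ad) / (d @ Ad)      # alpha_k
        d = -r + alpha * d               # {D}^{k+1}
    return x
```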

Interpolation

Estimating unknown values that lie between known data points.

| Method | Formula forms |
| --- | --- |
| Newton divided-difference | Linear, Quadratic, Polynomial |
| Lagrange interpolation | Linear, Quadratic, Polynomial |
| Spline interpolation | Linear, Quadratic, Cubic |

Newton divided-difference

  • Linear

Formula: $f(x) = c_0 + c_1(x - x_0)$

where $c_0 = f(x_0)$ and $c_1 = \dfrac{f(x_1) - f(x_0)}{x_1 - x_0}$

  • Quadratic

Formula: $f(x) = c_0 + c_1(x - x_0) + c_2(x - x_0)(x - x_1)$

where

$$\begin{aligned} & c_0 = f(x_0) \\ & c_1 = \dfrac{f(x_1) - f(x_0)}{x_1 - x_0} \\ & c_2 = \dfrac{\dfrac{f(x_2) - f(x_1)}{x_2 - x_1} - \dfrac{f(x_1) - f(x_0)}{x_1 - x_0}}{x_2 - x_0} \end{aligned}$$
  • Polynomial
General form: $f(x) = c_0 + c_1(x - x_0) + c_2(x - x_0)(x - x_1) + \ldots + c_n(x - x_0)(x - x_1)\cdots(x - x_{n-1})$, where each $c_k$ is the $k$-th divided difference $f[x_0, \ldots, x_k]$, built recursively just like $c_1$ and $c_2$ above.
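A sketch of both steps, computing the divided-difference coefficients and evaluating the polynomial (the names `newton_coeffs` and `newton_eval` are illustrative); on three points, `newton_coeffs` reproduces the $c_0, c_1, c_2$ above:

```python
import numpy as np

def newton_coeffs(x, y):
    """Divided-difference coefficients c_0..c_n via the in-place triangle scheme."""
    c = np.array(y, dtype=float)
    n = len(x)
    for j in range(1, n):
        # after pass j, c[i] holds f[x_{i-j}, ..., x_i]
        for i in range(n - 1, j - 1, -1):
            c[i] = (c[i] - c[i - 1]) / (x[i] - x[i - j])
    return c                              # c[k] = f[x_0, ..., x_k]

def newton_eval(x, c, t):
    """Evaluate f(t) = c_0 + c_1 (t - x_0) + ... using Horner's rule."""
    result = c[-1]
    for k in range(len(c) - 2, -1, -1):
        result = result * (t - x[k]) + c[k]
    return result
```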

Lagrange interpolation

  • Linear

Formula: $f(x) = L_0f(x_0) + L_1f(x_1)$

where

$$L_0 = \dfrac{x_1 - x}{x_1 - x_0}, \qquad L_1 = \dfrac{x_0 - x}{x_0 - x_1}$$
  • Quadratic

Formula: $f(x) = L_0f(x_0) + L_1f(x_1) + L_2f(x_2)$

where

$$L_0 = \dfrac{(x_1 - x)(x_2 - x)}{(x_1 - x_0)(x_2 - x_0)}, \qquad L_1 = \dfrac{(x_0 - x)(x_2 - x)}{(x_0 - x_1)(x_2 - x_1)}, \qquad L_2 = \dfrac{(x_0 - x)(x_1 - x)}{(x_0 - x_2)(x_1 - x_2)}$$
  • Polynomial

Formula: $f(x) = L_0f(x_0) + L_1f(x_1) + \ldots + L_nf(x_n)$

where

$$L_a = \dfrac{\displaystyle\prod_{i=0,\ i \neq a}^{n}{(x_i - x)}}{\displaystyle\prod_{i=0,\ i \neq a}^{n}{(x_i - x_a)}}$$
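A small sketch that evaluates this sum directly, keeping the same $(x_i - x)$ sign convention as the formulas above (the name `lagrange_eval` is illustrative):

```python
def lagrange_eval(x, y, t):
    """Evaluate the Lagrange polynomial through the points (x_i, y_i) at t."""
    total = 0.0
    n = len(x)
    for a in range(n):
        L = 1.0
        for i in range(n):
            if i != a:
                L *= (x[i] - t) / (x[i] - x[a])   # factor (x_i - x)/(x_i - x_a)
        total += L * y[a]
    return total
```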

Spline

The function is split into separate pieces, one per interval between data points.

  • Linear
$$f_1(x) = f(x_0) + \dfrac{f(x_1) - f(x_0)}{x_1 - x_0}(x - x_0); \quad x_0 \le x \le x_1$$

$$f_2(x) = f(x_1) + \dfrac{f(x_2) - f(x_1)}{x_2 - x_1}(x - x_1); \quad x_1 \le x \le x_2$$

$$f_3(x) = f(x_2) + \dfrac{f(x_3) - f(x_2)}{x_3 - x_2}(x - x_2); \quad x_2 \le x \le x_3$$
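This piecewise-linear interpolant is exactly what `numpy.interp` evaluates, so a quick check is possible (the data points below are made up):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])   # knots x_0..x_3 (made-up data)
y = np.array([1.0, 3.0, 2.0, 5.0])
# np.interp evaluates the piecewise-linear f_i(x) above at any query point
print(np.interp(1.5, x, y))          # 2.5, halfway between f(x_1) and f(x_2)
```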
  • Quadratic

From the equation $f_i(x) = a_ix^2 + b_ix + c_i$

Four rules are used to determine the equations (see the sketch after the list).

Interior knots are the points $x$ that are not at the edges; end knots (end points) are the points $x$ at the edges (the first and last points).

  1. The values of adjacent segment functions are equal at each interior knot:

    $$\begin{aligned} & \text{At } x_i \\ f(x_i) & = a_ix_i^2 + b_ix_i + c_i \\ f(x_i) & = a_{i+1}x_i^2 + b_{i+1}x_i + c_{i+1} \end{aligned}$$
  2. The first and last functions pass through the end knots:

    $$\begin{aligned} f(x_0) & = a_1x_0^2 + b_1x_0 + c_1 \\ f(x_n) & = a_nx_n^2 + b_nx_n + c_n \end{aligned}$$
  3. The first derivatives (slopes) of adjacent functions are equal at each interior knot:

    $$f'(x) = 2ax + b \\ \text{At } x_i: \quad 2a_ix_i + b_i = 2a_{i+1}x_i + b_{i+1}$$
  4. Set $a_1 = 0$, which makes the first segment linear.
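A sketch that assembles the resulting $3n \times 3n$ linear system from these four rules and solves it (the helper name `quadratic_spline_coeffs` is illustrative; segments are 0-indexed in the code):

```python
import numpy as np

def quadratic_spline_coeffs(x, y):
    """Solve for the [a_i, b_i, c_i] of each segment f_i(x) = a_i x^2 + b_i x + c_i.
    x, y are the n+1 knots; there are n segments and 3n unknowns."""
    n = len(x) - 1
    A = np.zeros((3 * n, 3 * n))
    r = np.zeros(3 * n)
    row = 0
    # Rule 1: adjacent segments agree at each interior knot x_1 .. x_{n-1}
    for i in range(1, n):
        for seg in (i - 1, i):            # segment i-1 ends at x_i, segment i starts there
            A[row, 3*seg:3*seg+3] = [x[i]**2, x[i], 1.0]
            r[row] = y[i]
            row += 1
    # Rule 2: first and last segments pass through the end knots
    A[row, 0:3] = [x[0]**2, x[0], 1.0]; r[row] = y[0]; row += 1
    A[row, 3*(n-1):3*n] = [x[n]**2, x[n], 1.0]; r[row] = y[n]; row += 1
    # Rule 3: slopes match at interior knots: 2 a_i x + b_i = 2 a_{i+1} x + b_{i+1}
    for i in range(1, n):
        A[row, 3*(i-1):3*(i-1)+2] = [2*x[i], 1.0]
        A[row, 3*i:3*i+2] = [-2*x[i], -1.0]
        row += 1
    # Rule 4: a_1 = 0 (first segment is linear)
    A[row, 0] = 1.0
    return np.linalg.solve(A, r).reshape(n, 3)    # one [a, b, c] row per segment
```

The four rules supply $2(n-1) + 2 + (n-1) + 1 = 3n$ equations, exactly matching the $3n$ unknowns.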

  • Cubic

From the equation $f_i(x) = a_ix^3 + b_ix^2 + c_ix + d_i$, five rules determine the coefficients (a library-based sketch follows the list):

  1. The values of adjacent segment functions are equal at each interior knot.
  2. The first and last functions pass through the end knots.
  3. The first derivatives (slopes) of adjacent functions are equal at each interior knot.
  4. The second derivatives of adjacent functions are equal at each interior knot.
  5. The second derivatives at the end knots are equal to 0.
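Rules 4 and 5 together define the natural cubic spline, which `scipy.interpolate.CubicSpline` implements directly via `bc_type='natural'` (the data below is made up):

```python
import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([0.0, 1.0, 2.0, 3.0])      # made-up knots
y = np.array([1.0, 3.0, 2.0, 5.0])
# bc_type='natural' enforces rule 5: second derivative = 0 at the end knots
cs = CubicSpline(x, y, bc_type='natural')
print(cs(1.5), cs(1.5, 1))              # value and first derivative at x = 1.5
```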

Extrapolation

Simple Regression

$$g(x) = a_0 + a_1x + a_2x^2 + \ldots + a_mx^m$$
$$\begin{bmatrix} n & \displaystyle\sum_{i=1}^n x_i & \displaystyle\sum_{i=1}^n x_i^2 & \cdots & \displaystyle\sum_{i=1}^n x_i^m \\ \displaystyle\sum_{i=1}^n x_i & \displaystyle\sum_{i=1}^n x_i^2 & \displaystyle\sum_{i=1}^n x_i^3 & \cdots & \displaystyle\sum_{i=1}^n x_i^{m+1} \\ \displaystyle\sum_{i=1}^n x_i^2 & \displaystyle\sum_{i=1}^n x_i^3 & \displaystyle\sum_{i=1}^n x_i^4 & \cdots & \displaystyle\sum_{i=1}^n x_i^{m+2} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \displaystyle\sum_{i=1}^n x_i^m & \displaystyle\sum_{i=1}^n x_i^{m+1} & \displaystyle\sum_{i=1}^n x_i^{m+2} & \cdots & \displaystyle\sum_{i=1}^n x_i^{2m} \end{bmatrix} \begin{Bmatrix} a_0 \\ a_1 \\ a_2 \\ \vdots \\ a_m \end{Bmatrix} = \begin{Bmatrix} \displaystyle\sum_{i=1}^n y_i \\ \displaystyle\sum_{i=1}^n x_iy_i \\ \displaystyle\sum_{i=1}^n x_i^2y_i \\ \vdots \\ \displaystyle\sum_{i=1}^n x_i^my_i \end{Bmatrix}$$
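A sketch that builds this normal-equations system directly (hypothetical name `poly_regression`); the result should agree with `np.polyfit(x, y, m)[::-1]`, since `polyfit` returns coefficients from the highest power down:

```python
import numpy as np

def poly_regression(x, y, m):
    """Solve the (m+1) x (m+1) normal equations above for a_0..a_m."""
    # entry (row, col) is sum of x_i^(row+col); the (0, 0) entry x^0 summed gives n
    A = np.array([[np.sum(x**(row + col)) for col in range(m + 1)]
                  for row in range(m + 1)])
    rhs = np.array([np.sum(x**row * y) for row in range(m + 1)])
    return np.linalg.solve(A, rhs)
```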

Multiple Regression

$$g(x) = a_0 + a_1x_1 + a_2x_2 + \ldots + a_kx_k$$
$$\begin{bmatrix} n & \displaystyle\sum_{i=1}^n x_{1i} & \displaystyle\sum_{i=1}^n x_{2i} & \cdots & \displaystyle\sum_{i=1}^n x_{ki} \\ \displaystyle\sum_{i=1}^n x_{1i} & \displaystyle\sum_{i=1}^n x_{1i}x_{1i} & \displaystyle\sum_{i=1}^n x_{1i}x_{2i} & \cdots & \displaystyle\sum_{i=1}^n x_{1i}x_{ki} \\ \displaystyle\sum_{i=1}^n x_{2i} & \displaystyle\sum_{i=1}^n x_{2i}x_{1i} & \displaystyle\sum_{i=1}^n x_{2i}x_{2i} & \cdots & \displaystyle\sum_{i=1}^n x_{2i}x_{ki} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \displaystyle\sum_{i=1}^n x_{ki} & \displaystyle\sum_{i=1}^n x_{ki}x_{1i} & \displaystyle\sum_{i=1}^n x_{ki}x_{2i} & \cdots & \displaystyle\sum_{i=1}^n x_{ki}x_{ki} \end{bmatrix} \begin{Bmatrix} a_0 \\ a_1 \\ a_2 \\ \vdots \\ a_k \end{Bmatrix} = \begin{Bmatrix} \displaystyle\sum_{i=1}^n y_i \\ \displaystyle\sum_{i=1}^n x_{1i}y_i \\ \displaystyle\sum_{i=1}^n x_{2i}y_i \\ \vdots \\ \displaystyle\sum_{i=1}^n x_{ki}y_i \end{Bmatrix}$$
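In practice the same least-squares fit can be computed without forming the normal equations, e.g. with `np.linalg.lstsq` on the design matrix (a sketch; `X` is assumed to hold one column per variable $x_1 \ldots x_k$):

```python
import numpy as np

def multiple_regression(X, y):
    """Least-squares fit of g(x) = a_0 + a_1 x_1 + ... + a_k x_k."""
    A = np.column_stack([np.ones(len(y)), X])   # leading ones column for a_0
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs                               # [a_0, a_1, ..., a_k]
```

`lstsq` solves the same system as the matrix above but is numerically more stable for large $k$.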

Integration

Single Trapezoidal Rule

$$I = \dfrac{b - a}{2} \left[ f(x_0) + f(x_1) \right]$$

Composite Trapezoidal Rule

$$\begin{aligned} h & = (b - a)/n \\ I & = \dfrac{h}{2} \left( f(x_0) + f(x_n) + 2\sum_{i=1}^{n-1} f(x_i) \right) \end{aligned}$$
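A one-function NumPy sketch (assuming `f` accepts arrays):

```python
import numpy as np

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n equal segments."""
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    return h / 2 * (f(x[0]) + f(x[-1]) + 2 * np.sum(f(x[1:-1])))
```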

Simpson's Rule

$$\begin{aligned} h & = \dfrac{b - a}{2} \\ I & = \dfrac{h}{3} \left[ f(x_0) + 4f(x_1) + f(x_2) \right] \end{aligned}$$

Composite Simpson's Rule

$$\begin{aligned} h & = \dfrac{b - a}{n} \quad (n \text{ even}) \\ I & = \dfrac{h}{3} \left[ f(x_0) + f(x_n) + 4\sum_{i=1,3,5}^{n-1} f(x_i) + 2\sum_{i=2,4,6}^{n-2} f(x_i) \right] \end{aligned}$$
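The matching sketch, with the odd/even index sums written as array slices (again assuming a vectorized `f`):

```python
import numpy as np

def simpson(f, a, b, n):
    """Composite Simpson's rule; n must be even."""
    if n % 2:
        raise ValueError("n must be even")
    x = np.linspace(a, b, n + 1)
    h = (b - a) / n
    return h / 3 * (f(x[0]) + f(x[-1])
                    + 4 * np.sum(f(x[1:-1:2]))    # odd indices 1, 3, ..., n-1
                    + 2 * np.sum(f(x[2:-1:2])))   # even indices 2, 4, ..., n-2
```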

Differentiation

Approximate value

$$\dfrac{\Delta y}{\Delta x} = \dfrac{f(x_0 + \Delta x) - f(x_0)}{\Delta x}$$

Exact value

$$\dfrac{dy}{dx} = \lim_{\Delta x \to 0}{\dfrac{f(x_0 + \Delta x) - f(x_0)}{\Delta x}}$$

Formula summary

  • Forward divided-difference

| Derivative | Formula | Error |
| --- | --- | --- |
| First | $f'(x_i) = \dfrac{f(x_{i+1}) - f(x_i)}{h}$ | $O(h)$ |
| First | $f'(x_i) = \dfrac{-f(x_{i+2}) + 4f(x_{i+1}) - 3f(x_{i})}{2h}$ | $O(h^2)$ |
| Second | $f''(x_i) = \dfrac{f(x_{i+2}) - 2f(x_{i+1}) + f(x_{i})}{h^2}$ | $O(h)$ |
| Second | $f''(x_i) = \dfrac{-f(x_{i+3}) + 4f(x_{i+2}) - 5f(x_{i+1}) + 2f(x_{i})}{h^2}$ | $O(h^2)$ |

First derivative

First forward divided-difference

From the Taylor series:

$$f(x) = f(x_0) + (x - x_0)f'(x_0) + \dfrac{(x - x_0)^2}{2!}f''(x_0) + \dfrac{(x - x_0)^3}{3!}f'''(x_0) + \ldots$$

Substitute $x_{i+1}$ for $x$, $x_i$ for $x_0$, and $h$ for $x - x_0$:

$$f(x_{i+1}) = f(x_i) + hf'(x_i) + \dfrac{h^2}{2!}f''(x_i) + \ldots$$

Rearrange to solve for $f'(x_i)$:

$$f(x_{i+1}) - f(x_i) - \dfrac{h^2}{2!}f''(x_i) - \ldots = hf'(x_i) \\ f'(x_i) = \dfrac{f(x_{i+1}) - f(x_i) - \dfrac{h^2}{2!}f''(x_i) - \ldots}{h} \\ f'(x_i) = \dfrac{f(x_{i+1}) - f(x_i)}{h} - \color{red}{\dfrac{h}{2!}f''(x_i) - \ldots}$$

Collapse the truncated terms into the order-of-accuracy term $O(h)$:

$$f'(x_i) = \dfrac{f(x_{i+1}) - f(x_i)}{h} + \color{red}{O(h)}$$

First backward divided-difference

$$f'(x_i) = \dfrac{f(x_i) - f(x_{i-1})}{h} + \color{red}{O(h)}$$

First central divided-difference

$$f'(x_i) = \dfrac{f(x_{i+1}) - f(x_{i-1})}{2h} + \color{red}{O(h^2)}$$
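A quick numerical check of the three first-derivative formulas on $f = \sin$ at $x_0 = 1$ (the step $h = 0.1$ is made up); the central difference should show a markedly smaller error, reflecting its $O(h^2)$ accuracy:

```python
import numpy as np

f, df = np.sin, np.cos                      # test function with a known derivative
x0, h = 1.0, 0.1

forward  = (f(x0 + h) - f(x0)) / h          # O(h)
backward = (f(x0) - f(x0 - h)) / h          # O(h)
central  = (f(x0 + h) - f(x0 - h)) / (2*h)  # O(h^2)

for name, est in [("forward", forward), ("backward", backward), ("central", central)]:
    print(f"{name:8s} error = {abs(est - df(x0)):.2e}")
```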