Compare commits

...

10 Commits

Author SHA1 Message Date
Deniz  ea901c03ab  update eigen.py (fixed: matrix and vector imported twice from linalg)  2025-09-17 14:09:49 +03:00
e0d0eb1fc9  Merge pull request 'broken links corrected' (#5) from ougur/numethods:main into main (Reviewed-on: #5)  2025-09-17 13:31:14 +03:00
9fe1855fc6 typo 2025-09-17 13:30:00 +03:00
094778f67e more typo 2025-09-17 13:28:33 +03:00
61b567ec19 broken links corrected 2025-09-17 13:11:55 +03:00
0cfbdbe02d Update tutorials/README.md 2025-09-17 13:08:57 +03:00
94759a9d71  Merge pull request 'main: some modification and typos' (#4) from ougur/numethods:main into main (Reviewed-on: #4)  2025-09-17 13:07:02 +03:00
cf0b91f94a some typos for math in tutorial1 2025-09-17 12:30:58 +03:00
f73789bca6 created README in tutorials (by modifying .gitignore - beginning ignored files are removed.) 2025-09-17 12:26:42 +03:00
f56e48cebd README is modified 2025-09-17 12:20:41 +03:00
8 changed files with 62 additions and 97 deletions

35
.gitignore vendored

@@ -1,38 +1,3 @@
 # ---> JupyterNotebooks
 # gitignore template for Jupyter Notebooks
 # website: http://jupyter.org/

README.md

@@ -10,55 +10,9 @@ A lightweight, from-scratch, object-oriented Python package implementing classic
 - Lightweight, no dependencies.
 - Consistent object-oriented API (.solve() etc).
----
 ## Tutorial Series
-This package comes with a set of Jupyter notebooks designed as a structured tutorial series in **numerical methods**, both mathematically rigorous and hands-on with code.
+This package comes with a set of Jupyter notebooks designed as a structured tutorial series in **numerical methods**, both mathematically rigorous and hands-on with code. See [Tutorials](./tutorials/README.md).
-### Core Tutorials
-1. [Tutorial 1: Vectors and Matrices](tutorials/tutorial1_vectors.ipynb)
-   - Definitions of vectors and matrices.
-   - Vector operations: addition, scalar multiplication, dot product, norms.
-   - Matrix operations: addition, multiplication, transpose, inverse.
-   - Matrix and vector norms.
-   - Examples with `numethods.linalg`.
-2. [Tutorial 2: Linear Systems of Equations](tutorials/tutorial2_linear_systems.ipynb)
-   - Gaussian elimination and Gauss-Jordan.
-   - LU decomposition.
-   - Cholesky decomposition.
-   - Iterative methods: Jacobi and Gauss-Seidel.
-   - Examples with `numethods.solvers`.
-3. [Tutorial 3: Orthogonalization and QR Factorization](tutorials/tutorial3_orthogonalization.ipynb)
-   - Inner products and orthogonality.
-   - Gram-Schmidt process (classical and modified).
-   - Householder reflections.
-   - QR decomposition and applications.
-   - Examples with `numethods.orthogonal`.
-4. [Tutorial 4: Root-Finding Methods](tutorials/tutorial4_root_finding.ipynb)
-   - Bisection method.
-   - Fixed-point iteration.
-   - Newton's method.
-   - Secant method.
-   - Convergence analysis and error behavior.
-   - Trace outputs for iteration history.
-   - Examples with `numethods.roots`.
-- [Polynomial Regression Demo](tutorials/polynomial_regression.ipynb)
-  - Step-by-step example of polynomial regression.
-  - Shows how to fit polynomials of different degrees to data.
-  - Visualizes fitted curves against the original data.
----
 ## Features


@@ -3,7 +3,6 @@ from .linalg import Matrix, Vector
 from .orthogonal import QRHouseholder
 from .solvers import LUDecomposition
 from .exceptions import NonSquareMatrixError, ConvergenceError
-from .linalg import Matrix, Vector
 import math

47
tutorials/README.md Normal file

@@ -0,0 +1,47 @@
# Tutorial Series
This package comes with a set of Jupyter notebooks designed as a structured tutorial series in **numerical methods**, both mathematically rigorous and hands-on with code.
## Core Tutorials
1. [Tutorial 1: Vectors and Matrices](./tutorial1_vectors_matrices.ipynb)
   - Definitions of vectors and matrices.
   - Vector operations: addition, scalar multiplication, dot product, norms.
   - Matrix operations: addition, multiplication, transpose, inverse.
   - Matrix and vector norms.
   - Examples with `numethods.linalg`.
2. [Tutorial 2: Linear Systems of Equations](./tutorial2_linear_systems.ipynb)
   - Gaussian elimination and Gauss-Jordan.
   - LU decomposition.
   - Cholesky decomposition.
   - Iterative methods: Jacobi and Gauss-Seidel.
   - Examples with `numethods.solvers`.
3. [Tutorial 3: Orthogonalization and QR Factorization](./tutorial3_orthogonalization.ipynb)
   - Inner products and orthogonality.
   - Gram-Schmidt process (classical and modified).
   - Householder reflections.
   - QR decomposition and applications.
   - Examples with `numethods.orthogonal`.
4. [Tutorial 4: Root-Finding Methods](./tutorial4_root_finding.ipynb)
   - Bisection method.
   - Fixed-point iteration.
   - Newton's method.
   - Secant method.
   - Convergence analysis and error behavior.
   - Trace outputs for iteration history.
   - Examples with `numethods.roots`.
- [Polynomial Regression Demo](./polynomial_regression.ipynb)
  - Step-by-step example of polynomial regression.
  - Shows how to fit polynomials of different degrees to data.
  - Visualizes fitted curves against the original data.
---


@@ -142,7 +142,7 @@
"## 5. Basic operations\n", "## 5. Basic operations\n",
"\n", "\n",
"### 5.1 Vector addition and subtraction\n", "### 5.1 Vector addition and subtraction\n",
"For $ u, v \\in \\mathbb{R}^n $:\n", "For $u, v \\in \\mathbb{R}^n$:\n",
"\n", "\n",
"$$\n", "$$\n",
"u + v = \\begin{bmatrix} u_1 + v_1 \\\\ u_2 + v_2 \\\\ \\vdots \\\\ u_n + v_n \\end{bmatrix},\n", "u + v = \\begin{bmatrix} u_1 + v_1 \\\\ u_2 + v_2 \\\\ \\vdots \\\\ u_n + v_n \\end{bmatrix},\n",
@@ -180,7 +180,7 @@
"metadata": {}, "metadata": {},
"source": [ "source": [
"### 5.2 Scalar multiplication\n", "### 5.2 Scalar multiplication\n",
"For $ \\alpha \\in \\mathbb{R}, v \\in \\mathbb{R}^n $:\n", "For $\\alpha \\in \\mathbb{R}, v \\in \\mathbb{R}^n$:\n",
"\n", "\n",
"$$\n", "$$\n",
"\\alpha v = \\begin{bmatrix} \\alpha v_1 \\\\ \\alpha v_2 \\\\ \\vdots \\\\ \\alpha v_n \\end{bmatrix}.\n", "\\alpha v = \\begin{bmatrix} \\alpha v_1 \\\\ \\alpha v_2 \\\\ \\vdots \\\\ \\alpha v_n \\end{bmatrix}.\n",
@@ -215,7 +215,7 @@
"metadata": {}, "metadata": {},
"source": [ "source": [
"### 5.3 Matrix addition and subtraction\n", "### 5.3 Matrix addition and subtraction\n",
"For $ A, B \\in \\mathbb{R}^{m \\times n} $:\n", "For $A, B \\in \\mathbb{R}^{m \\times n}$:\n",
"\n", "\n",
"$$\n", "$$\n",
"A + B = [ a_{ij} + b_{ij} ], \\quad\n", "A + B = [ a_{ij} + b_{ij} ], \\quad\n",
@@ -254,7 +254,7 @@
"metadata": {}, "metadata": {},
"source": [ "source": [
"### 5.4 Matrix-Vector multiplication\n", "### 5.4 Matrix-Vector multiplication\n",
"For $ A \\in \\mathbb{R}^{m \\times n}, v \\in \\mathbb{R}^n $:\n", "For $A \\in \\mathbb{R}^{m \\times n}, v \\in \\mathbb{R}^n$:\n",
"\n", "\n",
"$$\n", "$$\n",
"(Av)_i = \\sum_{j=1}^n a_{ij} v_j.\n", "(Av)_i = \\sum_{j=1}^n a_{ij} v_j.\n",
@@ -290,7 +290,7 @@
"metadata": {}, "metadata": {},
"source": [ "source": [
"### 5.5 Matrix-Matrix multiplication\n", "### 5.5 Matrix-Matrix multiplication\n",
"For $ A \\in \\mathbb{R}^{m \\times n}, B \\in \\mathbb{R}^{n \\times p} $:\n", "For $A \\in \\mathbb{R}^{m \\times n}, B \\in \\mathbb{R}^{n \\times p}$:\n",
"\n", "\n",
"$$\n", "$$\n",
"(AB)_{ij} = \\sum_{k=1}^n a_{ik} b_{kj}.\n", "(AB)_{ij} = \\sum_{k=1}^n a_{ik} b_{kj}.\n",
@@ -327,7 +327,7 @@
"metadata": {}, "metadata": {},
"source": [ "source": [
"### 5.6 Transpose\n", "### 5.6 Transpose\n",
"For $ A \\in \\mathbb{R}^{m \\times n} $:\n", "For $A \\in \\mathbb{R}^{m \\times n}$:\n",
"\n", "\n",
"$$\n", "$$\n",
"A^T_{ij} = A_{ji}.\n", "A^T_{ij} = A_{ji}.\n",


@@ -15,7 +15,7 @@
"## Motivation\n", "## Motivation\n",
"Why we care about solving Ax=b? in numerical methods (e.g., arises in ODEs, PDEs, optimization, physics).\n", "Why we care about solving Ax=b? in numerical methods (e.g., arises in ODEs, PDEs, optimization, physics).\n",
"\n", "\n",
"Exact solution: $ x=A^{-1}b $, but computing $ A^{-1} $ explicitly is costly/unstable.\n", "Exact solution: $x = A^{-1}b$, but computing $A^{-1}$ explicitly is costly/unstable.\n",
"\n", "\n",
"Numerical algorithms instead use factorizations or iterative schemes.\n", "Numerical algorithms instead use factorizations or iterative schemes.\n",
"\n", "\n",


@@ -94,7 +94,7 @@
"q_1 = \\frac{a_1}{\\|a_1\\|}\n", "q_1 = \\frac{a_1}{\\|a_1\\|}\n",
"$$\n", "$$\n",
"$$\n", "$$\n",
"q_k = \\frac{a_k - \\sum_{j=1}^{k-1} (q_j \\cdot a_k) q_j}{\\left\\|a_k - \\sum_{j=1}^{k-1} (q_j \\cdot a_k) q_j\\right\\|}\n", "q_k = \\frac{a_k - \\sum_{j=1}^{k-1} (q_j \\cdot a_k) q_j}{\\left\\|a_k - \\sum_{j=1}^{k-1} (q_j \\cdot a_k) q_j\\right\\|}, \\qquad k = 2, \\ldots, n\n",
"$$\n", "$$\n",
"\n", "\n",
"Matrix form:\n", "Matrix form:\n",
@@ -236,17 +236,17 @@
"\n", "\n",
"We want to solve\n", "We want to solve\n",
"\n", "\n",
"$$ \\min_x \\|Ax - b\\|_2. $$\n", "$$ \\min_x \\Vert Ax - b \\Vert_2^2. $$\n",
"\n", "\n",
"If $A = QR$, then\n", "If $A = QR$, then\n",
"\n", "\n",
"$$ \\min_x \\|Ax - b\\|_2 = \\min_x \\|QRx - b\\|_2. $$\n", "$$ \\min_x \\Vert Ax - b \\Vert_2^2 = \\min_x \\Vert QRx - b \\Vert_2^2. $$\n",
"\n", "\n",
"Since $Q$ has orthonormal columns:\n", "Since $Q$ has orthonormal columns, and the normal equations boils down to\n",
"\n", "\n",
"$$ R x = Q^T b. $$\n", "$$ R x = Q^T b, $$\n",
"\n", "\n",
"So we can solve using back-substitution.\n" "we can therefore solve for $x$ by using back-substitution.\n"
] ]
}, },
{ {
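The corrected cells describe the standard QR least-squares recipe. A compact plain-Python sketch of that recipe, using the classical Gram-Schmidt formula above and back-substitution on $Rx = Q^Tb$ (columns of $A$ assumed linearly independent; not the package's `QRHouseholder`):

```python
# Classical Gram-Schmidt QR, then least squares via R x = Q^T b and back-substitution.
# Illustrative sketch only; columns of A are assumed linearly independent.
import math

def qr_cgs(A):
    """Return (Qt, R): Qt[k] is the orthonormal column q_k, R is n-by-n upper triangular."""
    m, n = len(A), len(A[0])
    cols = [[A[i][j] for i in range(m)] for j in range(n)]   # columns a_1..a_n
    Qt, R = [], [[0.0] * n for _ in range(n)]
    for k, a in enumerate(cols):
        v = a[:]
        for j in range(k):                                   # subtract projections on q_1..q_{k-1}
            R[j][k] = sum(Qt[j][i] * a[i] for i in range(m))
            v = [v_i - R[j][k] * q_i for v_i, q_i in zip(v, Qt[j])]
        R[k][k] = math.sqrt(sum(v_i * v_i for v_i in v))
        Qt.append([v_i / R[k][k] for v_i in v])
    return Qt, R

def lstsq_qr(A, b):
    """Minimise ||Ax - b||_2 by solving R x = Q^T b with back-substitution."""
    Qt, R = qr_cgs(A)
    n = len(R)
    rhs = [sum(q_i * b_i for q_i, b_i in zip(q, b)) for q in Qt]   # Q^T b
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (rhs[i] - sum(R[i][j] * x[j] for j in range(i + 1, n))) / R[i][i]
    return x

# Overdetermined fit of y ~ c0 + c1*t through (0,1), (1,2), (2,2).
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [1.0, 2.0, 2.0]
print(lstsq_qr(A, b))   # ~ [1.1667, 0.5]
```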


@@ -55,7 +55,7 @@
"source": [ "source": [
"## 2. Bisection Method\n", "## 2. Bisection Method\n",
"\n", "\n",
"**Assumption (Intermediate Value Theorem):** If f is continuous on ([a,b]) and (f(a),f(b) < 0),\n", "**Assumption (Intermediate Value Theorem):** If f is continuous on $[a,b]$ and $f(a)f(b) < 0$,\n",
"then there exists $x^\\star$ in (a,b) with $f(x^\\star)=0$.\n", "then there exists $x^\\star$ in (a,b) with $f(x^\\star)=0$.\n",
"\n", "\n",
"- Assumes $f$ is continuous on $[a,b]$ with $f(a)f(b)<0$.\n", "- Assumes $f$ is continuous on $[a,b]$ with $f(a)f(b)<0$.\n",