GitHub Repository: Ok-landscape/computational-pipeline
Path: blob/main/notebooks/published/arnoldi_iteration/arnoldi_iteration_posts.txt
# Social Media Posts: Arnoldi Iteration

---

## 1. Twitter/X (280 chars max)

🔢 Arnoldi Iteration: Build an orthonormal basis for the Krylov subspace K_k(A,b) = span{b, Ab, A²b,...}

Result? Eigenvalues of the Hessenberg matrix H_k converge to extreme eigenvalues of A!

Foundation of GMRES & eigenvalue solvers.

#Python #NumPy #LinearAlgebra #Math

---

## 2. Bluesky (300 chars max)

Implemented the Arnoldi iteration algorithm in Python today.

It constructs an orthonormal basis Q_k for the Krylov subspace and produces an upper Hessenberg matrix H_k satisfying:

AQ_k = Q_{k+1}H̃_k

The Ritz values (eigenvalues of H_k) rapidly approximate the extreme eigenvalues of A.

---

## 3. Threads (500 chars max)

Ever wondered how iterative eigenvalue solvers work?

The Arnoldi iteration is the answer! Starting from just a matrix A and vector b, it builds:

• An orthonormal Krylov basis: span{b, Ab, A²b,...}
• An upper Hessenberg matrix H_k

The magic: eigenvalues of the small matrix H_k approximate eigenvalues of the huge matrix A!

In my test with a 100×100 matrix, the top 5 eigenvalues converged in just ~30 iterations. This powers GMRES, the go-to solver for large linear systems.

---

## 4. Mastodon (500 chars max)

Exploring the Arnoldi iteration - the workhorse behind GMRES and implicitly restarted Arnoldi methods.

Key insight: Given A ∈ ℝⁿˣⁿ and b ∈ ℝⁿ, construct orthonormal Q_k spanning K_k(A,b) via modified Gram-Schmidt.

The Arnoldi relation AQ_k = Q_{k+1}H̃_k gives us an upper Hessenberg H_k whose eigenvalues (Ritz values) converge to extreme eigenvalues of A.

Complexity: O(k²n) for orthogonalization, O(kn) storage. Sweet spot for large sparse systems!

#NumericalLinearAlgebra #Python #ScientificComputing

---

## 5. Reddit

**Title:** Implemented Arnoldi Iteration from scratch - here's how eigenvalue approximation actually works

**Body:**

Hey r/learnpython!

I just built an implementation of the **Arnoldi iteration** and wanted to share what I learned.

### What is it?

The Arnoldi iteration approximates the eigenvalues of a big matrix A without ever factoring the full matrix. Instead, it:

1. Starts with a random vector b
2. Builds the sequence b, Ab, A²b, A³b,... (these span a Krylov subspace)
3. Orthogonalizes these vectors (via modified Gram-Schmidt)
4. Produces a small "summary" matrix H_k

The cool part? The eigenvalues of the small matrix H_k approximate the eigenvalues of the big matrix A!
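
Here's a minimal NumPy sketch of those four steps (my own stripped-down version, not the notebook's exact code):

```python
import numpy as np

def arnoldi(A, b, k):
    """Arnoldi iteration: build orthonormal Q (n x (k+1)) and upper
    Hessenberg H ((k+1) x k) so that A @ Q[:, :k] == Q @ H."""
    n = b.shape[0]
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / np.linalg.norm(b)      # normalized starting vector
    for j in range(k):
        v = A @ Q[:, j]                  # next Krylov direction
        for i in range(j + 1):           # modified Gram-Schmidt sweep
            H[i, j] = Q[:, i] @ v
            v = v - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:          # breakdown: invariant subspace found
            return Q[:, :j + 1], H[:j + 1, :j]
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H
```

The Ritz values are then just `np.linalg.eigvals(H[:-1, :])` (the square k×k block), and they home in on A's extreme eigenvalues as k grows.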

### Why it matters

This is the foundation of:
- **GMRES** - solving Ax = b for huge sparse systems
- **Eigenvalue algorithms** - finding eigenvalues of matrices too big to decompose directly

### What I found

Testing on a 100×100 matrix with clustered eigenvalues:
- Top eigenvalues converged in ~30 iterations
- Orthogonality preserved to machine precision (~10⁻¹⁵)
- Arnoldi relation AQ_k = Q_{k+1}H̃_k verified numerically

The visualization shows Ritz value convergence, the characteristic Hessenberg structure (zeros below subdiagonal), and eigenvalue distribution matching.
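
For a quick sanity check of the convergence claim outside the notebook, SciPy's `eigs` (a wrapper around ARPACK's implicitly restarted Arnoldi) pulls out the dominant eigenvalues of a made-up 100×100 matrix with a known spectrum (this test matrix is my own, not the one from the notebook):

```python
import numpy as np
from scipy.sparse.linalg import eigs

rng = np.random.default_rng(42)
# Known spectrum: 5 well-separated large eigenvalues plus 95 small clustered
# ones, hidden behind a random orthogonal change of basis
spectrum = np.concatenate([[100.0, 90.0, 80.0, 70.0, 60.0],
                           rng.uniform(0.0, 10.0, 95)])
Qo, _ = np.linalg.qr(rng.standard_normal((100, 100)))  # random orthogonal matrix
A = Qo @ np.diag(spectrum) @ Qo.T

# ARPACK's implicitly restarted Arnoldi finds the 5 largest-magnitude eigenvalues
ritz, _ = eigs(A, k=5, which='LM')
print(np.sort(ritz.real)[::-1])  # close to [100, 90, 80, 70, 60]
```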

### View the full notebook

You can run and modify this notebook directly in your browser:
https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/arnoldi_iteration.ipynb

Happy to answer questions about the implementation!

---

## 6. Facebook (500 chars max)

Ever wonder how computers find eigenvalues of massive matrices? They don't compute them directly - that's way too expensive!

Instead, algorithms like the Arnoldi iteration build a small "summary" matrix whose eigenvalues approximate the real ones.

I implemented this in Python and watched eigenvalues converge in real time. After just 30 iterations on a 100×100 matrix, the approximations matched to 10+ decimal places!

Krylov methods like this power everything from search-ranking computations to large scientific simulations.

Check out the interactive notebook: https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/arnoldi_iteration.ipynb

---

## 7. LinkedIn (1000 chars max)

**Implementing the Arnoldi Iteration: A Foundation of Modern Numerical Computing**

Just completed an implementation of the Arnoldi iteration algorithm, which serves as the backbone for solving large-scale linear systems and eigenvalue problems in scientific computing.

**Technical Approach:**

The algorithm constructs an orthonormal basis for the Krylov subspace K_k(A,b) = span{b, Ab, A²b,...,Aᵏ⁻¹b} using modified Gram-Schmidt orthogonalization. This produces:

• Orthonormal matrix Q_k ∈ ℝⁿˣᵏ
• Upper Hessenberg matrix H_k satisfying AQ_k = Q_{k+1}H̃_k

**Key Results:**

• Ritz values (eigenvalues of H_k) converged to true eigenvalues within 30 iterations
• Maintained orthogonality to machine precision (~10⁻¹⁵)
• Verified the Arnoldi relation to numerical precision

**Applications:**

This method underpins GMRES (solving Ax = b), implicitly restarted Arnoldi for eigenproblems, model order reduction, and exponential integrators for PDEs.

**Skills demonstrated:** Python, NumPy, numerical linear algebra, algorithm implementation, scientific visualization with Matplotlib.

View the complete implementation: https://cocalc.com/github/Ok-landscape/computational-pipeline/blob/main/notebooks/published/arnoldi_iteration.ipynb

---

## 8. Instagram (500 chars max)

Building eigenvalue solvers from scratch ✨

The Arnoldi iteration transforms finding eigenvalues of huge matrices into a tractable problem.

How? It builds a small matrix H_k whose eigenvalues approximate those of the original matrix A.

Swipe to see:
📊 Ritz value convergence → eigenvalues emerging
🔢 Hessenberg matrix structure → the zeros below the subdiagonal
📈 Distribution matching → approximation quality

The largest eigenvalues converge fastest - exactly what we need for most applications!

#NumericalMethods
#LinearAlgebra
#Python
#DataScience
#Mathematics
#CodingLife
#ScienceVisualization