Intuitive linear algebra for readers who want to see what matrices mean, not just compute with them.

Visual-first
Mermaid diagrams, geometric intuition, tables, and worked examples turn abstract linear algebra into something you can picture.
This build is designed to feel closer to a digital textbook than a plain markdown dump. The site has persistent navigation, chapter-by-chapter flow, book typography, local search, math rendering, and Mermaid diagrams rendered inline.
The core teaching approach stays consistent throughout.
Authors: John Olafenwa and GPT-5.4
This book was created for personal learning by John Olafenwa, using the OpenAI Codex App powered by GPT-5.4.
For a citation-ready version, see the Attribution page.
Part I
Build the language of matrices from intuition, shape, operations, and systems.
Why matrices appear everywhere, and how to think of them as tables, machines, and maps.
Rows, columns, dimensions, vectors, and the different ways a matrix can be read.
Addition, scaling, multiplication, transpose, and the meaning behind the rules.
Gaussian elimination, pivots, augmented matrices, and the logic of solving many equations at once.
Part II
Move from calculation into geometry, structure, invertibility, and the hidden shape inside matrices.
Stretching, rotating, shearing, and understanding matrices through movement in space.
Signed area, signed volume, orientation, invertibility, and why determinants measure scaling.
Undoing transformations, solving efficiently, and breaking matrices into simpler pieces.
Span, independence, column space, null space, dimension, and rank.
Part III
Study invariant directions, approximation, and the decompositions that organize complex linear behavior.
Dot products, projections, orthonormal bases, and fitting imperfect data.
Invariant directions, scaling factors, and why some directions matter more than others.
Repeated matrix action, powers of matrices, and discrete-time systems.
Energy, curvature, ellipses, principal axes, and why symmetry simplifies everything.
Rotate, stretch, rotate again: the geometry and power of the singular value decomposition (SVD).
Part IV
Apply matrix thinking to networks, data, continuous systems, and real computational constraints.
Graphs, transitions, walks, steady states, and long-run behavior.
Datasets, features, image grids, embeddings, covariance intuition, and practical modeling ideas.
Coupled systems, matrix exponentials, and continuous-time dynamics.
Floating-point arithmetic, conditioning, stability, and the computational reality of matrix problems.
A synthesis chapter with concept links, pitfalls, and directions for further study.
Reference
Practice support and quick-reference material for revision and review.
Worked practice that shows how to attack representative matrix problems step by step.
A notation guide and translation map between algebraic, geometric, and applied viewpoints.