Path: blob/main/frontier/08-lattices-post-quantum/sage/08b-shortest-vector-problem.ipynb
The Shortest Vector Problem
Module 08 | 08-lattices-post-quantum
SVP definition, good vs bad bases, successive minima, Minkowski bounds, brute-force enumeration
Objectives
By the end of this notebook you will be able to:
Define the Shortest Vector Problem (SVP) and its approximate variant, $\gamma$-SVP.
Explain why SVP is easy with a good basis but computationally hard in general.
Compute successive minima for small lattices.
Apply Minkowski's theorem and the Gaussian heuristic to estimate shortest vector lengths.
Connect SVP hardness to the security of post-quantum cryptosystems like Kyber and Dilithium.
Prerequisites
Completion of Lattices and Bases (08a) — you should be comfortable with lattice bases, integer linear combinations, and the idea that different bases can generate the same lattice.
Basic linear algebra: norms, determinants, Gram-Schmidt orthogonalization.
From Bases to Hard Problems
Bridge from 08a: In the previous notebook, we saw that a lattice can have many different bases. Some bases are "good" — short, nearly orthogonal vectors — and some are "bad" — long, nearly parallel vectors. We even saw that a unimodular transformation can turn a nice basis into a terrible one while keeping the lattice unchanged.
This raises a natural question: if I hand you a bad basis, can you find a short lattice vector? This is essentially the Shortest Vector Problem, and it turns out to be one of the hardest problems in computational mathematics.
Motivating Question: I give you a lattice basis in 500 dimensions. Somewhere in that lattice is a vector of length 1. Can you find it? The best known algorithms (lattice sieving, running in time about $2^{0.292n}$) would take roughly $2^{146}$ operations for $n = 500$ — utterly beyond the reach of any computer, classical or quantum. This hardness is exactly what makes post-quantum cryptography possible.
1. SVP: The Definition
The Shortest Vector Problem (SVP): Given a basis $B = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$ of a lattice $\mathcal{L}$, find a nonzero vector $\mathbf{v} \in \mathcal{L}$ such that
$$\|\mathbf{v}\| = \min_{\mathbf{w} \in \mathcal{L} \setminus \{\mathbf{0}\}} \|\mathbf{w}\|.$$
We write $\lambda_1(\mathcal{L})$ for the length of this shortest nonzero vector (using the Euclidean norm).
The key subtlety: the problem gives you a basis, not the lattice itself. The lattice is infinite, and you only see it through the window of whatever basis you are given. A good basis makes $\lambda_1$ obvious; a bad basis hides it completely.
2. SVP Is Easy with a Good Basis
If the basis vectors are orthogonal (or nearly so), the shortest lattice vector is simply the shortest basis vector. Let us verify this.
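One way to see this concretely is a tiny brute-force check in plain Python (the basis here is a hypothetical example; the notebook's Sage cell may use different numbers):

```python
# Minimal sketch (assumed example basis): with a nearly orthogonal basis,
# the shortest lattice vector is simply the shortest basis vector.
from itertools import product
from math import hypot

good_basis = [(2, 0), (0, 3)]  # hypothetical orthogonal basis

def enumerate_short(basis, bound=5):
    """Brute-force the shortest nonzero vector over small integer coefficients."""
    best = None
    for c1, c2 in product(range(-bound, bound + 1), repeat=2):
        if (c1, c2) == (0, 0):
            continue  # the zero vector does not count
        v = (c1 * basis[0][0] + c2 * basis[1][0],
             c1 * basis[0][1] + c2 * basis[1][1])
        if best is None or hypot(*v) < hypot(*best):
            best = v
    return best

print(enumerate_short(good_basis))  # a vector of length 2: +/-(2, 0)
```

With the orthogonal basis $(2,0), (0,3)$, the search confirms that the shortest lattice vector is (up to sign) the shorter basis vector itself.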
3. SVP Is Hard with a Bad Basis
Now let us take the same lattice but present it with a "bad" basis — one where the vectors are long and nearly parallel. Can you spot the shortest vector just by looking at the basis?
Checkpoint: Before running the next cell, look at the bad basis vectors below. Can you guess what $\lambda_1$ is? (Hint: it is the same lattice as above!)
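For intuition, here is how a bad basis can be manufactured — a plain-Python sketch with made-up numbers (the notebook's actual bad basis may differ): multiplying a good basis by a unimodular matrix keeps the lattice but wrecks the basis.

```python
# Sketch with hypothetical values: a unimodular transform (integer matrix
# with determinant +/-1) changes the basis but not the lattice it generates.
good = [[2, 0], [0, 3]]   # assumed "good" basis
U = [[7, 5], [4, 3]]      # det = 7*3 - 5*4 = 1, so U is unimodular

# bad = U * good: each new row is an integer combination of the old rows.
bad = [[sum(U[i][k] * good[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
print(bad)  # [[14, 15], [8, 9]] -- long, nearly parallel, same lattice
```

Neither row of the bad basis reveals the short vector $(2, 0)$, even though it is still in the lattice: $(2,0) = 3 \cdot (14,15) - 5 \cdot (8,9)$.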
Misconception Alert: "SVP is easy — just try all lattice vectors!"
This is tempting but fatally flawed. The number of lattice points within a ball of radius $R$ grows roughly like $R^n$ in dimension $n$. In dimension $n = 100$, even restricting each coefficient to $\{-1, 0, 1\}$ gives on the order of $3^{100} \approx 10^{47}$ candidates. No computer can enumerate this. SVP is easy in 2D (we can draw the picture!) but becomes astronomically hard as the dimension grows.
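The explosion is easy to quantify. Even with coefficients restricted to $\{-1, 0, 1\}$ there are $3^n - 1$ nonzero candidates:

```python
# Sketch: count naive search candidates with coefficients in {-1, 0, 1}.
# The count is 3^n - 1, which explodes long before n = 100.
for n in (2, 10, 50, 100, 500):
    print(n, 3**n - 1)
```

At $n = 100$ the count already exceeds $10^{47}$; at $n = 500$ it is beyond any physical computation.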
4. Successive Minima
The shortest vector length is just the first of a sequence of important quantities.
Definition (Successive Minima). For an $n$-dimensional lattice $\mathcal{L}$, the $i$-th successive minimum $\lambda_i(\mathcal{L})$ is the smallest radius $r$ such that the ball of radius $r$ centered at the origin contains $i$ linearly independent lattice vectors:
$$\lambda_i(\mathcal{L}) = \min\{ r > 0 : \dim \operatorname{span}(\mathcal{L} \cap \overline{B}(\mathbf{0}, r)) \geq i \}.$$
In particular:
$\lambda_1$ = length of the shortest nonzero lattice vector
$\lambda_2$ = length of the shortest lattice vector linearly independent from the $\lambda_1$-vector
Let us compute these for a concrete lattice.
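A minimal plain-Python sketch (with an assumed toy basis — the notebook's Sage cell may use a different lattice): enumerate small lattice vectors, take the shortest for $\lambda_1$, then the shortest one linearly independent of it for $\lambda_2$.

```python
# Sketch (assumed example basis): compute lambda_1 and lambda_2 by brute force.
from itertools import product

B = [(1, 0), (0, 4)]  # hypothetical lattice: lambda_1 = 1, lambda_2 = 4

# Collect all nonzero lattice vectors with small coefficients, sorted by norm.
vecs = sorted(
    ((c1 * B[0][0] + c2 * B[1][0], c1 * B[0][1] + c2 * B[1][1])
     for c1, c2 in product(range(-5, 6), repeat=2)
     if (c1, c2) != (0, 0)),
    key=lambda v: v[0]**2 + v[1]**2,
)

v1 = vecs[0]
lam1_sq = v1[0]**2 + v1[1]**2
# lambda_2: shortest vector linearly independent from v1
# (in 2D, independent means the cross product is nonzero)
lam2_sq = next(v[0]**2 + v[1]**2 for v in vecs
               if v1[0] * v[1] - v1[1] * v[0] != 0)
print(lam1_sq, lam2_sq)  # squared minima: 1 and 16, i.e. lambda_1 = 1, lambda_2 = 4
```

For this basis, $\lambda_1 = 1$ (achieved by $(\pm 1, 0)$) and $\lambda_2 = 4$ (achieved by $(0, \pm 4)$).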
5. Approximate SVP: the $\gamma$-SVP Problem
Exact SVP is NP-hard (under randomized reductions). But even approximate SVP is believed to be hard!
$\gamma$-SVP: Given a basis of a lattice $\mathcal{L}$, find a nonzero vector $\mathbf{v} \in \mathcal{L}$ with
$$\|\mathbf{v}\| \leq \gamma \cdot \lambda_1(\mathcal{L}).$$
The factor $\gamma \geq 1$ is the approximation factor:
$\gamma = 1$: exact SVP (hardest)
$\gamma = 2^{O(n)}$: what LLL achieves (polynomial time, but exponential approximation factor)
$\gamma = \mathrm{poly}(n)$: believed to be hard! This is the regime that cryptography relies on.
Crypto Foreshadowing: The LLL algorithm (next notebook, 08c) approximately solves SVP with $\gamma = 2^{O(n)}$. This is good enough to break some old cryptosystems (knapsack-based schemes, for instance), but not good enough to break Kyber or Dilithium, which only require hardness for $\mathrm{poly}(n)$ approximation.
6. How Short Can the Shortest Vector Be?
We need bounds on to understand what "short" means for a given lattice.
Minkowski's Theorem
Theorem (Minkowski, 1896). For any $n$-dimensional lattice $\mathcal{L}$,
$$\lambda_1(\mathcal{L}) \leq \sqrt{n} \cdot \det(\mathcal{L})^{1/n}.$$
More precisely, $\lambda_1(\mathcal{L}) \leq \sqrt{\gamma_n} \cdot \det(\mathcal{L})^{1/n}$, where $\gamma_n$ is Hermite's constant — the smallest value such that the bound holds for every $n$-dimensional lattice.
Known values of $\gamma_n$: $\gamma_2 = 2/\sqrt{3}$, $\gamma_3 = 2^{1/3}$, $\gamma_4 = \sqrt{2}$, $\gamma_8 = 2$, ...
For large $n$: $\gamma_n \approx \frac{n}{2\pi e}$ up to a small constant factor (asymptotically, $\gamma_n$ grows linearly in $n$).
The Gaussian Heuristic
For a "random" lattice in dimension $n$, the expected shortest vector length is approximately:
$$\lambda_1(\mathcal{L}) \approx \sqrt{\frac{n}{2\pi e}} \cdot \det(\mathcal{L})^{1/n}.$$
This is the Gaussian heuristic. It tells us that in high dimensions, the shortest vector length grows like $\sqrt{n/(2\pi e)}$ times the $n$-th root of the determinant.
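The two estimates are easy to compare numerically — a sketch with hypothetical values ($n = 2$, $\det = 6$, e.g. the lattice spanned by $(2,0)$ and $(0,3)$):

```python
# Sketch: compare the Minkowski bound sqrt(n) * det^(1/n) with the
# Gaussian heuristic sqrt(n / (2*pi*e)) * det^(1/n) on a toy lattice.
from math import sqrt, pi, e

n, det = 2, 6.0  # hypothetical example values

minkowski = sqrt(n) * det ** (1 / n)            # upper bound on lambda_1
gaussian = sqrt(n / (2 * pi * e)) * det ** (1 / n)  # "typical" lambda_1
print(round(minkowski, 3), round(gaussian, 3))
```

Here the Minkowski bound is $\sqrt{12} \approx 3.46$, comfortably above the true $\lambda_1 = 2$ of that toy lattice; the Gaussian heuristic ($\approx 0.84$) undershoots badly because it describes random high-dimensional lattices, not tiny structured examples.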
7. Brute-Force SVP in 2D
In low dimensions we can solve SVP exactly by enumerating all lattice points within a radius bound and keeping the shortest. Let us implement this and see why it is completely infeasible in high dimensions.
Checkpoint: Before running the next cell, think about this: if we search all integer combinations $x_1 \mathbf{b}_1 + \cdots + x_n \mathbf{b}_n$ with $|x_i| \leq k$, how many vectors do we check? How does this count grow with the dimension $n$?
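A minimal 2D brute-force solver in plain Python (the notebook's own `brute_force_svp` cell may differ in details): it checks all $(2k+1)^2 - 1$ nonzero coefficient pairs.

```python
# Sketch of a 2D brute-force SVP solver: enumerate integer coefficients
# in [-bound, bound] and keep the shortest nonzero lattice vector.
from itertools import product
from math import sqrt

def brute_force_svp(basis, bound=30):
    """Return (lambda_1, shortest_vector) for a 2D integer basis."""
    best_v, best_sq = None, None
    for c1, c2 in product(range(-bound, bound + 1), repeat=2):
        if (c1, c2) == (0, 0):
            continue  # skip the zero vector
        v = (c1 * basis[0][0] + c2 * basis[1][0],
             c1 * basis[0][1] + c2 * basis[1][1])
        sq = v[0] ** 2 + v[1] ** 2
        if best_sq is None or sq < best_sq:
            best_sq, best_v = sq, v
    return sqrt(best_sq), best_v

# Hypothetical "bad" basis (14, 15), (8, 9): long, nearly parallel vectors
# hiding a short lattice vector of length 2.
print(brute_force_svp([(14, 15), (8, 9)]))
```

On this hypothetical bad basis the solver recovers $\lambda_1 = 2$ with shortest vector $\pm(2, 0)$ — found at coefficients $(3, -5)$, which no amount of staring at the basis would suggest.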
8. Connection to Cryptography
The hardness of SVP (and its approximate variant) is the foundation of lattice-based cryptography.
Here is the key chain of ideas:
SVP is hard in high dimensions. No known algorithm solves exact SVP in polynomial time. The best algorithms (lattice sieving) run in time $2^{0.292n + o(n)}$.
Approximate SVP is also hard. For polynomial approximation factors $\gamma = \mathrm{poly}(n)$, no efficient algorithm is known — and known quantum algorithms offer no meaningful speedup here.
LWE reduces to approximate SVP. The Learning With Errors problem (notebook 08d) has a worst-case to average-case reduction: if you can solve random LWE instances, you can solve approximate SVP on any lattice.
Kyber and Dilithium are built on LWE/MLWE. These NIST post-quantum standards derive their security from the assumed hardness of structured variants of SVP.
| Algorithm | Based on | Security relies on |
|---|---|---|
| Kyber (ML-KEM) | Module-LWE | Approximate SVP in module lattices |
| Dilithium (ML-DSA) | Module-LWE + SIS | Approximate SVP + Short Integer Solution |
Crypto Foreshadowing: In the next notebook (08c), we will implement the LLL algorithm, which solves SVP with approximation factor $2^{(n-1)/2}$. This exponential factor is too large to threaten Kyber (which only needs hardness for $\mathrm{poly}(n)$ factors), but it is enough to break some classical lattice schemes. The gap between what LLL achieves and what cryptography needs is precisely where security lives.
Exercises
Exercise 1: Finding and (Fully Worked)
Problem: For the lattice $\mathcal{L}$ with the basis given below, find $\lambda_1(\mathcal{L})$ and $\lambda_2(\mathcal{L})$.
Solution: We enumerate lattice points and sort by norm.
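A sketch of the enumeration-and-sort approach with a stand-in basis (the notebook's actual matrix was not recoverable here, so $(3,1), (1,3)$ is assumed purely for illustration):

```python
# Worked sketch with an ASSUMED basis: enumerate small lattice points,
# sort by norm, and read off lambda_1 and lambda_2.
from itertools import product

B = [(3, 1), (1, 3)]  # hypothetical stand-in basis

pts = sorted(
    ((c1 * B[0][0] + c2 * B[1][0], c1 * B[0][1] + c2 * B[1][1])
     for c1, c2 in product(range(-4, 5), repeat=2) if (c1, c2) != (0, 0)),
    key=lambda v: v[0]**2 + v[1]**2,
)

v1 = pts[0]
lam1 = (v1[0]**2 + v1[1]**2) ** 0.5
# lambda_2: first vector in the sorted list not parallel to v1
v2 = next(v for v in pts if v1[0] * v[1] - v1[1] * v[0] != 0)
lam2 = (v2[0]**2 + v2[1]**2) ** 0.5
print(lam1, lam2)
```

For this stand-in basis, $\lambda_1 = 2\sqrt{2} \approx 2.83$ (achieved by $\pm(2, -2) = \pm(\mathbf{b}_1 - \mathbf{b}_2)$) and $\lambda_2 = \sqrt{10} \approx 3.16$ (a basis vector itself).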
Exercise 2: Minkowski Bound Check (Guided)
Problem: For the lattice $\mathcal{L}$ with the basis given below:
1. Compute $\det(\mathcal{L})$.
2. Compute the Minkowski bound $\sqrt{n} \cdot \det(\mathcal{L})^{1/n}$.
3. Compute the Gaussian heuristic estimate.
4. Find the actual $\lambda_1$ using `brute_force_svp` (defined earlier). Verify that $\lambda_1 \leq$ the Minkowski bound.
Scaffolding: The structure is provided below. Fill in the lines marked `# YOUR CODE`.
Exercise 3: Bad Basis Challenge (Independent)
Problem: I give you this basis for a 2D lattice:
The basis vectors have norms around 1690 and 2727. But the shortest lattice vector is much shorter than either basis vector.
Your tasks:
1. Use `brute_force_svp` to find $\lambda_1$ and the shortest vector.
2. Compare the shortest vector length to the basis vector lengths. What is the ratio?
3. Use SageMath's built-in `.LLL()` method on the basis and examine the first row. Is it the shortest vector?
4. Reflect: why is this problem trivial in 2D but terrifying in 500D?
Summary
| Concept | Key idea |
|---|---|
| SVP definition | Given a lattice basis $B$, find the shortest nonzero vector in $\mathcal{L}(B)$. Its length is $\lambda_1(\mathcal{L})$. |
| Basis quality matters | With a good (nearly orthogonal) basis, the shortest vector is a basis vector. With a bad basis, it is hidden and hard to find. |
| Successive minima | $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$ capture the sequence of shortest linearly independent vectors in the lattice |
| Minkowski's theorem | Upper bound: $\lambda_1 \leq \sqrt{n} \cdot \det(\mathcal{L})^{1/n}$. The Gaussian heuristic predicts the typical value. |
| Brute-force enumeration | Works in 2D but is infeasible in high dimensions, where the search space grows as $(2k+1)^n$ |
| Approximate SVP | $\gamma$-SVP is also believed hard for polynomial $\gamma$. This is the foundation of Kyber, Dilithium, and all lattice-based post-quantum crypto. |
Next: The LLL Algorithm (08c) — a polynomial-time algorithm that approximately solves SVP with factor $2^{(n-1)/2}$.