---
title: |
  Problem Set 1 -- Linear algebra and representations **Solutions**
author: George McNinch
date: 2024-01-29
---
$F$ denotes an algebraically closed field of characteristic 0. If you like, you can suppose that $F = \mathbb{C}$ is the field of complex numbers.
Let $V$ be a finite dimensional vector space over the field $F$. Suppose that $\phi,\psi: V \to V$ are linear maps. Let $\lambda \in F$ be an eigenvalue of $\phi$ and write $W$ for the $\lambda$-eigenspace of $\phi$; i.e. $W = \{v \in V \mid \phi(v) = \lambda v\}$. If $\phi\psi = \psi\phi$, show that $W$ is invariant under $\psi$ -- i.e. show that $\psi(W) \subseteq W$.
::: {.solution}
Let $w \in W$. We must show that $x = \psi(w) \in W$. To do this, we must establish that $x = \psi(w)$ is a $\lambda$-eigenvector for $\phi$.
We have
$$\begin{aligned}
\phi(x) = \phi(\psi(w)) &= \psi(\phi(w)) && \text{since $\phi \circ \psi = \psi \circ \phi$} \\
&= \psi(\lambda w) && \text{since $w$ is a $\lambda$-eigenvector} \\
&= \lambda \psi(w) = \lambda x && \text{since $\psi$ is linear.}
\end{aligned}$$
This completes the proof.
:::
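As a quick numerical illustration (not part of the proof, and assuming $F = \mathbb{C}$), here is a small Python/NumPy sketch. The matrices `phi` and `psi` below are a hypothetical example -- `psi` is chosen as a polynomial in `phi`, so the two commute -- and the sketch checks that `psi` maps the $\lambda$-eigenspace of `phi` into itself.

```python
import numpy as np

# Hypothetical example: psi is a polynomial in phi, so phi and psi commute.
phi = np.diag([2.0, 2.0, 5.0])
psi = 3 * np.linalg.matrix_power(phi, 2) + phi + np.eye(3)
assert np.allclose(phi @ psi, psi @ phi)

lam = 2.0
eigvals, eigvecs = np.linalg.eig(phi)
W = eigvecs[:, np.isclose(eigvals, lam)]    # columns form a basis of the lambda-eigenspace W

# Check psi(W) ⊆ W: each psi(w) should lie in the column span of W.
for w in W.T:
    x = psi @ w
    coeffs, *_ = np.linalg.lstsq(W, x, rcond=None)
    assert np.allclose(W @ coeffs, x)
print("psi maps the lambda-eigenspace W of phi into itself")
```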
Let $n \in \mathbb{N}$ be a non-zero natural number, and let $V$ be an $n$ dimensional $F$-vector space with a given basis $e_1, e_2, \cdots, e_n$.
Consider the linear transformation $T: V \to V$ given by the rule $T e_i = e_{i+1 \pmod{n}}$. In other words
$$T e_i = \begin{cases} e_{i+1} & i < n \\ e_1 & i = n. \end{cases}$$
a. Show that $T$ is invertible and that $T^n = \operatorname{id}_V$.
::: {.solution}
To check that $T^n = \operatorname{id}_V$, we check that $T^n(e_i) = e_i$ for $1 \le i \le n$.
From the definition, it follows by induction on the natural number $m$ that $T^m(e_i) = e_{i+m \pmod{n}}$. Thus $T^n(e_i) = e_{i+n \pmod{n}} = e_i$. Since this holds for every $i$, conclude $T^n = \operatorname{id}_V$.
Now $T$ is invertible, since its inverse is given by $T^{n-1}$.
:::
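As a sanity check (assuming $F = \mathbb{C}$ and the hypothetical choice $n = 5$), the matrix of $T$ in the basis $e_1, \dots, e_n$ is a cyclic-shift matrix, and a short NumPy computation confirms that $T^n = \operatorname{id}_V$ and that $T^{n-1}$ is the inverse of $T$.

```python
import numpy as np

n = 5  # hypothetical choice; any non-zero natural number works
# Matrix of T in the basis e_1, ..., e_n:  T e_i = e_{i+1 (mod n)}
T = np.zeros((n, n))
for i in range(n):
    T[(i + 1) % n, i] = 1

assert np.allclose(np.linalg.matrix_power(T, n), np.eye(n))          # T^n = id_V
assert np.allclose(T @ np.linalg.matrix_power(T, n - 1), np.eye(n))  # T . T^{n-1} = id_V
print("T^n = I, and T^{n-1} is the inverse of T")
```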
b. Consider the vector $v_0 = \sum_{i=1}^n e_i$. Show that $v_0$ is a $1$-eigenvector for $T$.
::: {.solution}
We compute
$$\begin{aligned}
T(v_0) = T\left(\sum_{i=1}^n e_i\right) &= \sum_{i=1}^n T(e_i) = \sum_{i=1}^n e_{i+1 \pmod{n}} \\
&= \sum_{j=2}^{n+1} e_{j \pmod{n}} && \text{(let $j = i+1$)} \\
&= \sum_{j=1}^{n} e_{j \pmod{n}} = v_0.
\end{aligned}$$
Thus $T(v_0) = v_0$, so indeed $v_0$ is a $1$-eigenvector.
:::
Let $\zeta \in F$ be a primitive $n$-th root of unity (e.g. if you assume $F = \mathbb{C}$, you may as well take $\zeta = e^{2\pi i/n}$).
c. Let $v_1 = \sum_{i=1}^n \zeta^i e_i$. Show that $v_1$ is a $\zeta^{-1}$-eigenvector for $T$.
::: {.solution}
We compute
$$\begin{aligned}
T(v_1) = T\left(\sum_{i=1}^n \zeta^i e_i\right) &= \sum_{i=1}^n \zeta^i T(e_i) = \sum_{i=1}^n \zeta^i e_{i+1 \pmod{n}} \\
&= \sum_{j=2}^{n+1} \zeta^{j-1} e_{j \pmod{n}} && \text{(let $j = i+1$)} \\
&= \zeta^{-1} \sum_{j=2}^{n+1} \zeta^{j} e_{j \pmod{n}} \\
&= \zeta^{-1} \sum_{j=1}^{n} \zeta^{j} e_{j \pmod{n}} = \zeta^{-1} v_1 && \text{(since $\zeta^{j} = \zeta^{j \pmod{n}}$ for all $j$)}
\end{aligned}$$
Thus $T(v_1) = \zeta^{-1} v_1$, so indeed $v_1$ is a $\zeta^{-1}$-eigenvector.
:::
d. More generally, let $0 \le j < n$ and let $v_j = \sum_{i=1}^n \zeta^{ij} e_i$. Show that $v_j$ is a $\zeta^{-j}$-eigenvector for $T$.
::: {.solution}
The calculation in the solution to part (c) is valid for any $n$-th root of unity $\zeta$. Applying this calculation to the $n$-th root of unity $\zeta^j$ shows that $v_j$ is a $\zeta^{-j}$-eigenvector for $T$, as required.
:::
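A hedged numerical check of parts (b)--(d) (assuming $F = \mathbb{C}$, $\zeta = e^{2\pi i/n}$, and the hypothetical value $n = 5$): each vector $v_j = \sum_{i=1}^n \zeta^{ij} e_i$ should satisfy $T v_j = \zeta^{-j} v_j$.

```python
import numpy as np

n = 5                                # hypothetical choice
zeta = np.exp(2j * np.pi / n)        # a primitive n-th root of unity
T = np.zeros((n, n), dtype=complex)
for i in range(n):
    T[(i + 1) % n, i] = 1            # T e_i = e_{i+1 (mod n)}

for j in range(n):
    # v_j = sum_{i=1}^n zeta^{ij} e_i
    v_j = np.array([zeta ** (i * j) for i in range(1, n + 1)])
    assert np.allclose(T @ v_j, zeta ** (-j) * v_j)
print("T v_j = zeta^{-j} v_j for j = 0, ..., n-1")
```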
e. Conclude that $v_0, v_1, \cdots, v_{n-1}$ is a basis of $V$ consisting of eigenvectors for $T$, so that $T$ is diagonalizable.
Hint: You need to use the fact that eigenvectors for distinct eigenvalues are linearly independent.
What is the matrix of $T$ in this basis?
::: {.solution}
Since $\zeta$ is a primitive $n$-th root of unity, the eigenvalues $\zeta^{-j}$ for $0 \le j < n$ are pairwise distinct. Since eigenvectors for distinct eigenvalues are linearly independent, conclude that the vectors $B = \{v_0, v_1, \cdots, v_{n-1}\}$ are linearly independent. Since there are $n$ vectors in $B$ and since $\dim V = n$, conclude that $B$ is a basis for $V$.
The matrix of $T$ in the basis $B$ is given by
$$[T]_B = \begin{pmatrix}
1 & 0 & 0 & \cdots & 0 \\
0 & \zeta^{-1} & 0 & \cdots & 0 \\
0 & 0 & \zeta^{-2} & \cdots & 0 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
0 & 0 & 0 & \cdots & \zeta^{-n+1}
\end{pmatrix}$$
(This form explains why an $n \times n$ matrix $M$ is diagonalizable iff $F^n$ has a basis of eigenvectors for $M$.)
:::
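To illustrate part (e) numerically (again assuming $F = \mathbb{C}$, $\zeta = e^{2\pi i/n}$, and the hypothetical choice $n = 5$), one can form the matrix $P$ whose columns are $v_0, \dots, v_{n-1}$ and check that $P^{-1} T P$ is the diagonal matrix $[T]_B$ displayed above.

```python
import numpy as np

n = 5                                # hypothetical choice
zeta = np.exp(2j * np.pi / n)
T = np.zeros((n, n), dtype=complex)
for i in range(n):
    T[(i + 1) % n, i] = 1

# Columns of P are the eigenvectors v_0, v_1, ..., v_{n-1}
P = np.array([[zeta ** (i * j) for j in range(n)] for i in range(1, n + 1)])

D = np.linalg.inv(P) @ T @ P
expected = np.diag([zeta ** (-j) for j in range(n)])
assert np.allclose(D, expected)      # [T]_B = diag(1, zeta^{-1}, ..., zeta^{-(n-1)})
print("P^{-1} T P is the diagonal matrix [T]_B")
```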
Let $G = \mathbb{Z}/3\mathbb{Z}$ be the additive group of order $3$, and let $\zeta$ be a primitive 3rd root of unity in $F$.
To define a representation $\rho: G \to \operatorname{GL}_n(F)$, it is enough to find a matrix $M \in \operatorname{GL}_n(F)$ with $M^3 = 1$; in turn, $M$ determines a representation $\rho$ by the rule $\rho(i + 3\mathbb{Z}) = M^i$.
Consider the representation $\rho_1: G \to \operatorname{GL}_3(F)$ given by the matrix
$$\rho_1(1 + 3\mathbb{Z}) = M_1 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \zeta & 0 \\ 0 & 0 & \zeta^2 \end{pmatrix}$$
and consider the representation $\rho_2: G \to \operatorname{GL}_3(F)$ given by the matrix
$$\rho_2(1 + 3\mathbb{Z}) = M_2 = \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.$$
Show that the representations $\rho_1$ and $\rho_2$ are equivalent (alternative terminology: are isomorphic). In other words, find a linear bijection $\Phi: F^3 \to F^3$ with the property that $\Phi(\rho_2(g)v) = \rho_1(g)\Phi(v)$ for every $g \in G$ and $v \in F^3$.
Hint: First find a basis of $F^3$ consisting of eigenvectors for the matrix $M_2$.
::: {.solution}
The matrix $M_1$ is diagonal, which is to say that the standard basis vectors
$$e_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad e_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$
are eigenvectors for $M_1$ with respective eigenvalues $1, \zeta, \zeta^2$.
By the work in problem 2, we see that
$$v_1 = e_1 + e_2 + e_3, \quad v_2 = e_1 + \zeta e_2 + \zeta^2 e_3, \quad v_3 = e_1 + \zeta^2 e_2 + \zeta e_3$$
are eigenvectors for $M_2$ with respective eigenvalues $1, \zeta^2, \zeta$.
Now let $\Phi: F^3 \to F^3$ be the linear transformation for which $\Phi(e_1) = v_1$, $\Phi(e_2) = v_3$, $\Phi(e_3) = v_2$.
We claim that $\Phi$ defines an isomorphism of $G$-representations $(\rho_1, F^3) \xrightarrow{\sim} (\rho_2, F^3)$.
We must check that $\Phi(\rho_1(g)v) = \rho_2(g)\Phi(v)$ for all $g \in G$ and all $v \in F^3$. (Since $\Phi$ is a bijection, this shows that $\rho_1$ and $\rho_2$ are equivalent; the inverse $\Phi^{-1}$ then satisfies $\Phi^{-1}(\rho_2(g)v) = \rho_1(g)\Phi^{-1}(v)$, the relation exactly as stated in the problem.)
Since $G$ is cyclic, it suffices to check that
$$(\clubsuit) \qquad \Phi(M_1 v) = M_2 \Phi(v) \qquad \forall v \in F^3.$$
(Indeed, $(\clubsuit)$ amounts to "checking on a generator". If $(\clubsuit)$ holds, then for every natural number $i$ a straightforward induction argument shows, for every $v \in F^3$, that
$$\Phi(\rho_1(i + 3\mathbb{Z})v) = \Phi(\rho_1(1 + 3\mathbb{Z})^i v) = \Phi(M_1^i v) = M_2^i \Phi(v) = \rho_2(1 + 3\mathbb{Z})^i \Phi(v) = \rho_2(i + 3\mathbb{Z})\Phi(v).)$$
In turn, it suffices to verify that $(\clubsuit)$ holds for the basis vectors $e_1, e_2, e_3$ of $V = F^3$.
Since $e_1$ and $v_1$ are $1$-eigenvectors for $M_1$ resp. $M_2$, we have
$$\Phi(M_1 e_1) = \Phi(e_1) = v_1 = M_2 v_1 = M_2 \Phi(e_1).$$
Since $e_2$ and $v_3$ are $\zeta$-eigenvectors for $M_1$ resp. $M_2$, we have
$$\Phi(M_1 e_2) = \Phi(\zeta e_2) = \zeta \Phi(e_2) = \zeta v_3 = M_2 v_3 = M_2 \Phi(e_2).$$
Since $e_3$ and $v_2$ are $\zeta^2$-eigenvectors for $M_1$ resp. $M_2$, we have
$$\Phi(M_1 e_3) = \Phi(\zeta^2 e_3) = \zeta^2 \Phi(e_3) = \zeta^2 v_2 = M_2 v_2 = M_2 \Phi(e_3).$$
Thus $(\clubsuit)$ holds and the proof is complete.
---
Alternatively, note that the matrix of $\Phi$ in the standard basis is given by
$$[\Phi] = \begin{pmatrix} 1 & 1 & 1 \\ 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \end{pmatrix}.$$
Now, to prove that $\Phi \circ \rho_1(g) = \rho_2(g) \circ \Phi$, it suffices to check that $M_2[\Phi] = [\Phi]M_1$, i.e. that
$$\begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix} \cdot \begin{pmatrix} 1 & 1 & 1 \\ 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \end{pmatrix} = \begin{pmatrix} 1 & 1 & 1 \\ 1 & \zeta^2 & \zeta \\ 1 & \zeta & \zeta^2 \end{pmatrix} \cdot \begin{pmatrix} 1 & 0 & 0 \\ 0 & \zeta & 0 \\ 0 & 0 & \zeta^2 \end{pmatrix}.$$
In fact, both products yield the matrix
$$\begin{pmatrix} 1 & \zeta & \zeta^2 \\ 1 & 1 & 1 \\ 1 & \zeta^2 & \zeta \end{pmatrix}.$$
:::
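A numerical confirmation of the matrix identity $M_2[\Phi] = [\Phi]M_1$ (assuming $F = \mathbb{C}$ and $\zeta = e^{2\pi i/3}$), using NumPy:

```python
import numpy as np

zeta = np.exp(2j * np.pi / 3)                   # primitive 3rd root of unity
M1 = np.diag([1, zeta, zeta ** 2])
M2 = np.array([[0, 0, 1],
               [1, 0, 0],
               [0, 1, 0]], dtype=complex)       # M2: e1 -> e2 -> e3 -> e1
Phi = np.array([[1, 1,         1],
                [1, zeta ** 2, zeta],
                [1, zeta,      zeta ** 2]])     # columns are v1, v3, v2

assert np.allclose(M2 @ Phi, Phi @ M1)          # the intertwining identity (clubsuit)
print(np.round(M2 @ Phi, 6))                    # both products equal the matrix above
```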
Let $V$ be an $n$ dimensional $F$-vector space for $n \in \mathbb{N}$.
Let $\operatorname{GL}(V)$ denote the group
$$\operatorname{GL}(V) = \{ \text{all invertible $F$-linear transformations } \phi: V \to V \}$$
where the group operation is composition of linear transformations.
Recall that $\operatorname{GL}_n(F)$ denotes the group of all invertible $n \times n$ matrices.
If $B = \{b_1, b_2, \cdots, b_n\}$ is a choice of basis, show that the assignment $\phi \mapsto [\phi]_B$ determines an isomorphism $\operatorname{GL}(V) \xrightarrow{\sim} \operatorname{GL}_n(F)$.
Here $[\phi]_B = [M_{ij}]$ denotes the matrix of $\phi$ in the basis $B$, defined by the equations
$$\phi(b_i) = \sum_{k=1}^n M_{ki} b_k.$$
::: {.solution}
Let's write $\Phi$ for the mapping $\Phi: \operatorname{GL}(V) \to \operatorname{GL}_n(F)$ defined above.
An important property -- proved in Linear Algebra -- is that for $\phi, \psi: V \to V$ we have
$$(\heartsuit) \qquad [\phi \circ \psi]_B = [\phi]_B \cdot [\psi]_B.$$
In words: "once you choose a basis, composition of linear transformations corresponds to multiplication of the corresponding matrices".
Now, since the matrix of the endomorphism $\phi: V \to V$ is equal to the identity matrix $I_n$ if and only if $\phi = \operatorname{id}_V$, $(\heartsuit)$ shows at once that a linear transformation $\phi: V \to V$ is invertible if and only if $[\phi]_B$ is an invertible matrix.
This shows that $\Phi$ takes values in $\operatorname{GL}_n(F)$, and $(\heartsuit)$ then confirms that $\Phi$ is indeed a group homomorphism.
To show that $\Phi$ is an isomorphism, we exhibit its inverse. Namely, we define a group homomorphism $\Psi: \operatorname{GL}_n(F) \to \operatorname{GL}(V)$ and check that $\Psi$ is the inverse to $\Phi$.
To define $\Psi$, we introduce the linear isomorphism $\beta: F^n \to V$ defined by the rule
$$\beta \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} = \sum_{i=1}^n a_i b_i.$$
For an invertible matrix $M$, we define $\Psi(M): V \to V$ by the rule
$$\Psi(M)(v) = \beta(M \cdot \beta^{-1} v).$$
If $M_1, M_2 \in \operatorname{GL}_n(F)$, then for every $v \in V$ we have
$$\Psi(M_1 M_2)v = \beta(M_1 M_2 \cdot \beta^{-1} v) = \beta(M_1 \beta^{-1} \beta M_2 \cdot \beta^{-1} v) = \Psi(M_1)\Psi(M_2)v.$$
This confirms that $\Psi$ is a group homomorphism.
It remains to observe that for $M \in \operatorname{GL}_n(F)$ we have $\Phi \circ \Psi(M) = M$, which amounts to the fact that $M$ is the matrix of $\Psi(M)$ in the basis $B$; and we must observe for $g \in \operatorname{GL}(V)$ that $\Psi \circ \Phi(g) = g$, which amounts to the observation that the transformation $g: V \to V$ is determined by its effect on the basis vectors $b_i$, and hence by the matrix $\Phi(g)$.
:::
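The key fact $(\heartsuit)$, and with it the homomorphism property of $\Phi$, can be illustrated numerically. The sketch below (a hypothetical example with $F = \mathbb{R}$ and $n = 4$) takes the columns of a random invertible matrix as the basis $B$ and compares $[\phi \circ \psi]_B$ with $[\phi]_B \cdot [\psi]_B$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical basis B of F^n: the columns of a (generically invertible) matrix P.
P = rng.standard_normal((n, n))
# Two linear maps phi, psi given by their standard matrices (generically invertible).
phi = rng.standard_normal((n, n)) + n * np.eye(n)
psi = rng.standard_normal((n, n)) + n * np.eye(n)

def matrix_in_basis(A, P):
    """Return [A]_B, the matrix of the map A in the basis given by the columns of P."""
    return np.linalg.inv(P) @ A @ P

# (heartsuit): [phi o psi]_B = [phi]_B . [psi]_B
lhs = matrix_in_basis(phi @ psi, P)
rhs = matrix_in_basis(phi, P) @ matrix_in_basis(psi, P)
assert np.allclose(lhs, rhs)
print("[phi o psi]_B = [phi]_B [psi]_B")
```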