Katrin Schenk
Problem 1.
Part a. Let the transformation from the coordinate system $\{x^\alpha\}$ to the system $\{x^{\bar\alpha}\}$ be
$$x^{\bar\alpha} = x^{\bar\alpha}(x^\alpha). \quad (1)$$
Define the matrices
$$\Lambda^{\bar\alpha}{}_\alpha \equiv \frac{\partial x^{\bar\alpha}}{\partial x^\alpha} \quad (2)$$
and
$$\Lambda^\alpha{}_{\bar\alpha} \equiv \frac{\partial x^\alpha}{\partial x^{\bar\alpha}}, \quad (3)$$
which satisfy $\Lambda^{\bar\alpha}{}_\alpha\, \Lambda^\alpha{}_{\bar\beta} = \delta^{\bar\alpha}{}_{\bar\beta}$. It follows that
$$\partial_{\bar\alpha} = \Lambda^\alpha{}_{\bar\alpha}\, \partial_\alpha. \quad (4)$$
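The inverse relation above can be sketched as a quick symbolic check (my own addition, not part of the original solution; it assumes sympy is available, and the variable names are mine), using the 2D Cartesian-to-polar transformation as an example:

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
# x^alpha as functions of the barred (polar) coordinates x^{bar alpha} = (r, theta)
xcart = sp.Matrix([r * sp.cos(th), r * sp.sin(th)])

L_lower = xcart.jacobian([r, th])         # Lambda^alpha_{bar alpha} = dx^alpha/dx^{bar alpha}
L_upper = L_lower.inv()                   # Lambda^{bar alpha}_alpha = dx^{bar alpha}/dx^alpha

product = sp.simplify(L_upper * L_lower)  # should be the Kronecker delta
print(product)
```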
From the definition of the connection coefficients we now obtain
$$\nabla_{\partial_{\bar\beta}}\, \partial_{\bar\alpha} = \Gamma^{\bar\tau}{}_{\bar\alpha\bar\beta}\, \partial_{\bar\tau} = \nabla_{(\Lambda^\lambda{}_{\bar\beta}\partial_\lambda)}\, \bigl(\Lambda^\mu{}_{\bar\alpha}\partial_\mu\bigr) = \Lambda^\lambda{}_{\bar\beta}\, \nabla_{\partial_\lambda}\, \bigl(\Lambda^\mu{}_{\bar\alpha}\partial_\mu\bigr)$$
$$= \Lambda^\lambda{}_{\bar\beta} \bigl( \Lambda^\mu{}_{\bar\alpha,\lambda}\, \partial_\mu + \Lambda^\mu{}_{\bar\alpha}\, \Gamma^\tau{}_{\mu\lambda}\, \partial_\tau \bigr)$$
$$= \Lambda^\lambda{}_{\bar\beta} \bigl( \Lambda^\mu{}_{\bar\alpha,\lambda}\, \Lambda^{\bar\tau}{}_\mu + \Lambda^\mu{}_{\bar\alpha}\, \Lambda^{\bar\tau}{}_\gamma\, \Gamma^\gamma{}_{\mu\lambda} \bigr)\, \partial_{\bar\tau}, \quad (5)$$
and thus
$$\Gamma^{\bar\tau}{}_{\bar\alpha\bar\beta} = \Lambda^\lambda{}_{\bar\beta}\, \Lambda^\mu{}_{\bar\alpha}\, \Lambda^{\bar\tau}{}_\gamma\, \Gamma^\gamma{}_{\mu\lambda} + \Lambda^\lambda{}_{\bar\beta}\, \Lambda^{\bar\tau}{}_\mu\, \Lambda^\mu{}_{\bar\alpha,\lambda}, \quad (6)$$
which can be rewritten as
$$\Gamma^{\bar\alpha}{}_{\bar\beta\bar\gamma} = \frac{\partial x^{\bar\alpha}}{\partial x^\alpha}\, \frac{\partial x^\beta}{\partial x^{\bar\beta}}\, \frac{\partial x^\gamma}{\partial x^{\bar\gamma}}\, \Gamma^\alpha{}_{\beta\gamma} + \frac{\partial x^{\bar\alpha}}{\partial x^\alpha}\, \frac{\partial^2 x^\alpha}{\partial x^{\bar\beta}\, \partial x^{\bar\gamma}}. \quad (7)$$
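Equation (7) can be spot-checked by machine. The sketch below (my own verification, assuming sympy; the variable names are mine) transforms the vanishing 2D Cartesian connection to polar coordinates via the inhomogeneous term of Eq. (7) and compares with the Christoffel symbols computed directly from the polar metric $ds^2 = dr^2 + r^2\, d\theta^2$:

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
xbar = [r, th]                                    # barred (polar) coordinates
xcart = [r * sp.cos(th), r * sp.sin(th)]          # x^alpha(x^{bar alpha})

J = sp.Matrix(xcart).jacobian(xbar)               # Lambda^alpha_{bar beta}
Jinv = J.inv()                                    # Lambda^{bar alpha}_alpha

# Second (inhomogeneous) term of Eq. (7); the first term vanishes because
# the Cartesian connection coefficients are zero.
Gamma = [[[sp.simplify(sum(Jinv[a, i] * sp.diff(xcart[i], xbar[b], xbar[c])
                           for i in range(2)))
           for c in range(2)] for b in range(2)] for a in range(2)]

# Independent check: Christoffel symbols from the polar metric.
g = sp.diag(1, r**2)
ginv = g.inv()
Gamma_metric = [[[sp.simplify(sp.Rational(1, 2) * sum(
                    ginv[a, d] * (sp.diff(g[d, b], xbar[c])
                                  + sp.diff(g[d, c], xbar[b])
                                  - sp.diff(g[b, c], xbar[d]))
                    for d in range(2)))
                 for c in range(2)] for b in range(2)] for a in range(2)]

print(Gamma[0][1][1])   # Gamma^r_{theta theta}; the hand calculation gives -r
```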
Part b. As an example, we will calculate one of the connection coefficients, $\Gamma^r{}_{\phi\phi}$, by transforming this connection coefficient from Cartesian coordinates [$x^i = (x, y, z)$] to spherical polar coordinates [$x^{\bar i} = (r, \theta, \phi)$] using the result of Part (a). In Cartesian coordinates the connection vanishes, so only the second (inhomogeneous) term in Eq. (7) contributes. Hence we obtain
$$\Gamma^{\bar i}{}_{\bar j\bar k} = \frac{\partial x^{\bar i}}{\partial x^i}\, \frac{\partial^2 x^i}{\partial x^{\bar j}\, \partial x^{\bar k}}. \quad (8)$$
Now using the transformation (x, y, z) = (r sin θ cos ϕ, r sin θ sin ϕ, r cos θ) we get
$$\Gamma^r{}_{\phi\phi} = \frac{\partial r}{\partial x^i}\, \frac{\partial^2 x^i}{\partial\phi\, \partial\phi} = \frac{\partial r}{\partial x}\, \frac{\partial^2 x}{\partial\phi^2} + \frac{\partial r}{\partial y}\, \frac{\partial^2 y}{\partial\phi^2}$$
$$= \frac{x}{r}\, \frac{\partial^2 x}{\partial\phi^2} + \frac{y}{r}\, \frac{\partial^2 y}{\partial\phi^2} = -\frac{x^2 + y^2}{r} = -r \sin^2\theta. \quad (9)$$
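The same result can be reproduced symbolically (my own check, not part of the original solution; it assumes sympy, and the names are mine):

```python
import sympy as sp

r, th, ph = sp.symbols('r theta phi', positive=True)
xc = [r * sp.sin(th) * sp.cos(ph),    # x
      r * sp.sin(th) * sp.sin(ph),    # y
      r * sp.cos(th)]                 # z

J = sp.Matrix(xc).jacobian([r, th, ph])
Jinv = sp.simplify(J.inv())           # row 0 holds dr/dx^i

# Gamma^r_{phi phi} = (dr/dx^i) * d^2 x^i / dphi^2, as in Eq. (9)
Gamma_r_phph = sp.simplify(sum(Jinv[0, i] * sp.diff(xc[i], ph, 2)
                               for i in range(3)))
print(Gamma_r_phph)                   # the hand calculation gives -r*sin(theta)**2
```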
As a check, let's make sure this matches the $\Gamma^r{}_{\phi\phi}$ calculated in the standard way from the metric in spherical coordinates,
$$ds^2 = dr^2 + r^2\, d\theta^2 + r^2 \sin^2\theta\, d\phi^2. \quad (10)$$
From this metric we get
$$\Gamma^r{}_{\phi\phi} = \tfrac{1}{2}\, g^{r\alpha} \bigl( g_{\alpha\phi,\phi} + g_{\alpha\phi,\phi} - g_{\phi\phi,\alpha} \bigr) = -\tfrac{1}{2}\, g^{rr}\, g_{\phi\phi,r} = -r \sin^2\theta. \quad (11)$$
The other connection coefficients are
$$\Gamma^r{}_{\theta\theta} = -r, \qquad \Gamma^\theta{}_{\phi\phi} = -\cos\theta \sin\theta, \qquad \Gamma^\theta{}_{\theta r} = \frac{1}{r}, \qquad \Gamma^\phi{}_{\phi\theta} = \cot\theta, \qquad \Gamma^\phi{}_{\phi r} = \frac{1}{r}, \quad (12)$$
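As a further check (my own addition, assuming sympy; the helper name is mine), all of the coefficients in Eq. (12) can be generated from the metric (10) with the standard formula used in Eq. (11):

```python
import sympy as sp

r, th, ph = sp.symbols('r theta phi', positive=True)
X = [r, th, ph]
g = sp.diag(1, r**2, r**2 * sp.sin(th)**2)   # metric of Eq. (10)
ginv = g.inv()

def christoffel(a, b, c):
    """Gamma^a_{bc} = (1/2) g^{ad} (g_{db,c} + g_{dc,b} - g_{bc,d})."""
    return sp.simplify(sp.Rational(1, 2) * sum(
        ginv[a, d] * (sp.diff(g[d, b], X[c]) + sp.diff(g[d, c], X[b])
                      - sp.diff(g[b, c], X[d])) for d in range(3)))

# Indices: 0 = r, 1 = theta, 2 = phi.  Compare with Eq. (12):
print(christoffel(0, 1, 1))   # Gamma^r_{theta theta}:  hand value -r
print(christoffel(1, 2, 2))   # Gamma^theta_{phi phi}:  hand value -sin(theta)cos(theta)
print(christoffel(1, 1, 0))   # Gamma^theta_{theta r}:  hand value 1/r
print(christoffel(2, 2, 1))   # Gamma^phi_{phi theta}:  hand value cot(theta)
print(christoffel(2, 2, 0))   # Gamma^phi_{phi r}:      hand value 1/r
```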
all others being zero except those related to the above coefficients by interchanging the two covariant indices. To calculate the divergence of a vector in this coordinate system we write
$$\nabla_i v^i = v^i{}_{,i} + \Gamma^i{}_{ji}\, v^j = v^r{}_{,r} + v^\theta{}_{,\theta} + v^\phi{}_{,\phi} + \Gamma^r{}_{jr}\, v^j + \Gamma^\theta{}_{j\theta}\, v^j + \Gamma^\phi{}_{j\phi}\, v^j$$
$$= v^r{}_{,r} + v^\theta{}_{,\theta} + v^\phi{}_{,\phi} + \Gamma^\theta{}_{r\theta}\, v^r + \Gamma^\phi{}_{r\phi}\, v^r + \Gamma^\phi{}_{\theta\phi}\, v^\theta$$
$$= v^r{}_{,r} + \frac{2}{r}\, v^r + v^\theta{}_{,\theta} + \cot\theta\, v^\theta + v^\phi{}_{,\phi}$$
$$= \frac{1}{r^2}\, \frac{\partial}{\partial r} \bigl( r^2 v^r \bigr) + \frac{1}{\sin\theta}\, \frac{\partial}{\partial\theta} \bigl( \sin\theta\, v^\theta \bigr) + \frac{\partial v^\phi}{\partial\phi}. \quad (13)$$
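The final step in Eq. (13) — absorbing the connection terms into derivatives of $r^2 v^r$ and $\sin\theta\, v^\theta$ — can be confirmed symbolically (my own check, assuming sympy; the function names are placeholders for the components $v^r$, $v^\theta$, $v^\phi$):

```python
import sympy as sp

r, th, ph = sp.symbols('r theta phi', positive=True)
vr = sp.Function('vr')(r, th, ph)   # placeholder for v^r
vt = sp.Function('vt')(r, th, ph)   # placeholder for v^theta
vp = sp.Function('vp')(r, th, ph)   # placeholder for v^phi

# Third line of Eq. (13): explicit connection terms
expanded = (sp.diff(vr, r) + 2 * vr / r
            + sp.diff(vt, th) + sp.cos(th) / sp.sin(th) * vt
            + sp.diff(vp, ph))

# Last line of Eq. (13): compact divergence form
compact = (sp.diff(r**2 * vr, r) / r**2
           + sp.diff(sp.sin(th) * vt, th) / sp.sin(th)
           + sp.diff(vp, ph))

print(sp.simplify(expanded - compact))   # -> 0, so the two forms agree
```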
Question: Why is this answer different from the result for $\nabla \cdot \vec v$ that you would find, say, on the back cover of Jackson?
Problem 2.
Part a. We want to prove that if $g$ is a non-degenerate, symmetric, covariant, two-index tensor on a vector space $V$, then one can always find a basis $\{\theta^{\hat\alpha}\}$ of $V^*$ such that $g = g_{\hat\alpha\hat\beta}\, \theta^{\hat\alpha} \otimes \theta^{\hat\beta}$, where $g_{\hat\alpha\hat\beta}$ is diagonal with each element being $\pm 1$. [Note that the hypothesis of symmetry was omitted from the question; it should have been included.] From the fact that $g$ is symmetric and non-degenerate, we know that it is diagonalizable and has a set of real, non-zero eigenvalues $\{\lambda_\alpha\}$, $\alpha = 1, \ldots, n$, and a corresponding set of linearly independent eigenvectors $\{\vec v_\alpha\}$ such that
$$g(\vec v_\alpha, \vec v_\beta) = \delta_{\alpha\beta}\, \lambda_\beta. \quad (14)$$
By linearity we are free to choose the normalization of these eigenvectors. If we choose
$$\vec v_{\hat\alpha} \equiv \frac{\vec v_\alpha}{\sqrt{|\lambda_\alpha|}} \quad (15)$$
then we get
$$g(\vec v_{\hat\alpha}, \vec v_{\hat\beta}) = \pm\, \delta_{\hat\alpha\hat\beta}. \quad (16)$$
From the orthonormal basis vectors $\{\vec v_{\hat\alpha}\}$ we can construct the dual basis $\{\theta^{\hat\alpha}\}$ in the usual way, from which the result follows.
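Numerically, this construction amounts to rescaling the eigenvectors of the matrix of components of $g$. A minimal sketch (my own illustration, assuming numpy; the example matrix is hypothetical):

```python
import numpy as np

# Hypothetical symmetric, non-degenerate matrix of components g_{alpha beta}:
G = np.array([[2.0, 1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 3.0]])

lam, V = np.linalg.eigh(G)             # real eigenvalues, orthonormal eigenvectors
Vhat = V / np.sqrt(np.abs(lam))        # rescale each eigenvector as in Eq. (15)

D = Vhat.T @ G @ Vhat                  # components of g in the rescaled basis
print(np.round(D, 10))                 # diagonal with entries +1 or -1
```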
Part b. Pick an arbitrary coordinate system $\{x^{\bar\alpha}\}$ in a neighborhood of $P$ with $x^{\bar\alpha}(P) = 0$. Then the basis vectors $\vec e_\alpha$ can be written as
$$\vec e_\alpha = e^{\bar\mu}{}_\alpha(x^{\bar\gamma})\, \frac{\partial}{\partial x^{\bar\mu}}.$$
Define a new coordinate system $x^\alpha$ to be linearly related to the old coordinate system, via
$$x^{\bar\mu} = e^{\bar\mu}{}_\alpha(0)\, x^\alpha,$$
where the matrix appearing in this equation is $e^{\bar\mu}{}_\alpha$ evaluated at the point $P$. It now follows that
$$\vec e_\alpha(P) = \left. \frac{\partial}{\partial x^\alpha} \right|_P.$$