
Postmodern Algebra

Section 13.1 Integral Extensions

“What you do not wish upon yourself, extend not to others.”
―Confucius
In field theory, there is a close relationship between (vector space-)finite field extensions and algebraic equations. The situation for rings is similar, but much more subtle.

Definition 13.1. Integral Element.

Let \(A\) be a ring and \(R\) an \(A\)-algebra. An element \(r \in R\) is integral over \(A\) if it satisfies an equation of the form
\begin{equation*} r^n + a_1 r^{n-1} + \cdots + a_{n-1} r + a_n = 0 \quad \text{with} \ a_i \in A \ \text{for all} \ i. \end{equation*}
Such an equation is called an equation of integral dependence for \(r\) over \(A\text{.}\)

Remark 13.2.

Integral automatically implies algebraic, but in the setting of rings the condition that there exist a monic equation of algebraic dependence is genuinely stronger. This is very different from what happens over fields, where algebraic and integral are equivalent conditions.

Example 13.3. \(\sqrt{2}\) Integral over \(\Z\).

Consider the \(\Z\)-algebra \(R = \Z[\sqrt{2}] = \{ a + b \sqrt{2} \mid a, b \in \Z \}\text{.}\) The element \(\sqrt{2}\) is integral over \(\Z.\)
Solution.
Indeed, \(\sqrt{2}\) satisfies the equation of integral dependence \(x^2-2 = 0\text{.}\)

Example 13.4. \(\frac{1}{2}\) in \(\Q\) Not Integral over \(\Z\).

Prove \(\frac{1}{2} \in \mathbb{Q}\) is not integral over \(\mathbb{Z}.\)
Solution.
If \(a_0, \ldots, a_{n-1} \in \Z\) are such that
\begin{equation*} \left( \frac{1}{2} \right)^n + a_{n-1} \left( \frac{1}{2} \right)^{n-1} + \cdots + a_0 = 0, \end{equation*}
then multiplying by \(2^n\) gives
\begin{equation*} 1 + 2a_{n-1} + \cdots + 2^n a_0 = 0, \end{equation*}
which is impossible for parity reasons (the left-hand side is odd!). Notice, in contrast, that \(\frac{1}{2}\) is algebraic over \(\Z\text{,}\) since it satisfies \(2x-1=0\text{.}\)

Definition 13.5. Integrally Closed.

Consider an inclusion of rings \(A \subseteq R\text{.}\) The integral closure of \(A\) in \(R\) is the set of elements in \(R\) that are integral over \(A\text{.}\) We say \(A\) is integrally closed in \(R\) if \(A\) is its own integral closure in \(R\text{.}\) The integral closure of a domain \(R\) in its field of fractions is usually denoted by \(\overline{R}\text{.}\)

Definition 13.6. Normal Domain.

A normal domain is a domain \(R\) that is integrally closed in its field of fractions, meaning \(R = \overline{R}.\)

Example 13.7. Integers are a Normal Domain.

The ring of integers \(\mathbb{Z}\) is integrally closed, meaning its integral closure in its fraction field \(\mathbb{Q}\) is \(\mathbb{Z}\) itself. The key idea is similar to the argument we used in Example 13.4 to show that \(\frac{1}{2}\) is not integral over \(\Z\text{.}\)
In fact, this is a special case of the fact that every UFD is normal.

Example 13.8. UFDs are Normal.

Show that every UFD is normal.
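A solution sketch, following the same clearing-denominators idea as Example 13.4 (essentially the rational root theorem): let \(R\) be a UFD with fraction field \(K\text{,}\) and suppose \(\frac{r}{s} \in K\) is integral over \(R\text{,}\) written so that \(r\) and \(s\) share no common irreducible factor. Multiplying an equation of integral dependence by \(s^n\) gives
\begin{equation*} \left( \tfrac{r}{s} \right)^n + a_{n-1} \left( \tfrac{r}{s} \right)^{n-1} + \cdots + a_0 = 0 \implies r^n = -s \left( a_{n-1} r^{n-1} + a_{n-2} r^{n-2} s + \cdots + a_0 s^{n-1} \right). \end{equation*}
Thus every irreducible factor of \(s\) divides \(r^n\text{,}\) and hence divides \(r\text{;}\) since \(r\) and \(s\) share no common irreducible factor, \(s\) must be a unit, so \(\frac{r}{s} \in R\text{.}\)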

Remark 13.9.

We cannot talk about the integral closure of a ring \(R\) without specifying in what extension; the integral closures of \(R\) in different extensions can be very different. In Example 13.3, we saw that the integral closure of \(\Z\) in \(\Z[\sqrt{2}]\) contains at least \(\Z\) and \(\sqrt{2}\text{,}\) while Example 13.7 says that the integral closure of \(\Z\) in \(\Q\) is \(\Z\text{.}\)
When \(R\) is a domain, if we ever refer to the integral closure of \(R\text{,}\) it is understood that we mean the integral closure of \(R\) in its field of fractions, \(\overline{R}\text{.}\)

Remark 13.10.

If \(R\) is an \(A\)-algebra with structure homomorphism \(\varphi \colon A \to R\text{,}\) then an element \(r \in R\) is integral over \(A\) if and only if \(r\) is integral over the subring \(\varphi(A)\subseteq R\text{,}\) so when studying integrality we might as well assume that \(\varphi\) is injective, that is, that \(A \subseteq R\text{.}\)

Proof.

  1. Let \(r\) be integral over \(A\text{,}\) with \(r^n + a_{n-1} r^{n-1} + \cdots + a_1 r + a_0 = 0\) for some \(a_i \in A\text{.}\) We claim that \(A[r] = A + Ar + \cdots + A r^{n-1}\text{.}\) Since \(A[r]\) is generated by all the powers \(r^m\) of \(r\) as an \(A\)-module, to show that any polynomial \(p(r) \in A[r]\) is in \(A + Ar + \cdots + A r^{n-1}\) it is enough to show that \(r^m \in A + Ar + \cdots + A r^{n-1}\) for all \(m\text{.}\)
Using induction on \(m\text{,}\) the base cases \(1, r, \ldots, r^{n-1} \in A + Ar + \cdots + A r^{n-1}\) are immediate, since \(r^{k}\) lies in the summand \(A r^{k}\text{.}\) For the induction step, suppose that \(m \geq n\) and that all smaller powers of \(r\) lie in \(A + Ar + \cdots + A r^{n-1}\text{.}\) We can use the equation above to rewrite \(r^m\) as
\begin{equation*} \begin{aligned}r^m & = - r^{m-n}(a_{n-1} r^{n-1} + \cdots + a_1 r + a_0) \\& = - a_{n-1} r^{m-1} - \cdots - a_1 r^{m-n+1} - a_0 r^{m-n},\end{aligned} \end{equation*}
which is an \(A\)-linear combination of powers of \(r\) of degree at most \(m-1\text{,}\) each of which lies in \(A + Ar + \cdots + A r^{n-1}\) by the induction hypothesis.
  2. Write
    \begin{equation*} A_0 := A \subseteq A_1 := A[r_1] \subseteq A_2 := A[r_1,r_2] \subseteq \cdots \subseteq A_t := A[r_1,\dots,r_t]. \end{equation*}
    Since \(r_i\) is integral over \(A\text{,}\) it is also integral over \(A_{i-1}\text{,}\) via the same monic equation that \(r_i\) satisfies over \(A\text{.}\) By part 1, we conclude that each extension \(A_{i-1} \subseteq A_i\) is module-finite. Thus the inclusion \(A \subseteq A[r_1,\dots,r_t]\) is a composition of module-finite maps, and since compositions of module-finite maps are module-finite, it is also module-finite.
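As a concrete instance of part 1, take \(A = \Z\) and \(r = \sqrt{2}\text{,}\) which satisfies \(r^2 - 2 = 0\) (so \(n = 2\)): the claim is that every power of \(\sqrt{2}\) lies in \(\Z + \Z\sqrt{2}\text{.}\) A quick SymPy check (an illustration of the statement, not part of the proof):

```python
import sympy as sp

r = sp.sqrt(2)

# Every power r^m should reduce to a + b*sqrt(2) with a, b integers,
# illustrating that Z[sqrt(2)] = Z + Z*sqrt(2) as a Z-module.
for m in range(10):
    expanded = sp.expand(r**m)
    a = expanded.coeff(r, 0)  # part independent of sqrt(2)
    b = expanded.coeff(r, 1)  # coefficient of sqrt(2)
    assert expanded == a + b * r
    assert a.is_integer and b.is_integer
```

For instance, \(\sqrt{2}^{\,5}\) simplifies to \(4\sqrt{2}\text{,}\) using the relation \(\sqrt{2}^{\,2} = 2\) to push every power back into degree at most \(1\text{.}\)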
In what follows, we will need the following elementary linear algebra fact, which is very useful in various contexts within commutative algebra; later in this class we will use it again, perhaps when you least expect it. This is a nice example of an algebra fact that holds over any ring and that we can prove by reducing to the case of fields.

Definition 13.12. Classical Adjoint.

The classical adjoint of an \(n\times n\) matrix \(B=[b_{ij}]\) is the matrix \(\mathrm{adj}(B)\) with entries
\begin{equation*} \mathrm{adj}(B)_{ij}=(-1)^{i+j} \det(\widehat{B_{{ji}}}), \end{equation*}
where \(\widehat{B_{{ji}}}\) is the matrix obtained from \(B\) by deleting its \(j\)th row and \(i\)th column.

Proof.

  1. When \(R\) is a field, this is a basic linear algebra fact. We will deduce the case of a general ring from the field case. The ring \(R\) is a \(\Z\)-algebra, so we can write \(R\) as a quotient of some polynomial ring \(\Z[X]\text{.}\) Let \(\psi:\Z[X]\to R\) be a surjection, \(a_{ij}\in \Z[X]\) be such that \(\psi(a_{ij})=b_{ij},\) and let \(A=[a_{ij}]\text{.}\) Note that
    \begin{equation*} \psi(\mathrm{adj}(A)_{ij})=\mathrm{adj}(B)_{ij} \quad \textrm{ and } \quad \psi((\mathrm{adj}(A) A)_{ij}) = (\mathrm{adj}(B) B)_{ij}, \end{equation*}
    since \(\psi\) is a ring homomorphism, and the entries are the same polynomial functions of the entries of the matrices \(A\) and \(B\text{,}\) respectively. Thus, it suffices to establish
    \begin{equation*} \mathrm{adj}(A) A = \det(A) I_{n\times n} \end{equation*}
    over \(\Z[X]\text{,}\) and we can do this entry by entry. Now, \(\Z[X]\) is an integral domain, hence a subring of a field (its fraction field). Since both sides of the equation
    \begin{equation*} \left( \mathrm{adj}(A) A \right)_{ij} = \left( \det(A) I_{n\times n}\right)_{ij} \end{equation*}
    live in \(\Z[X]\) and are equal in the fraction field (by linear algebra over fields), they are equal in \(\Z[X]\text{.}\) This holds for all \(i, j\text{,}\) and thus 1) holds.
  2. By assumption, we have \((r I_{n\times n} - B) v=0\text{,}\) so by part 1)
    \begin{equation*} \det(r I_{n\times n} - B) v=\mathrm{adj}(r I_{n\times n} - B) (r I_{n\times n} - B) v = 0. \end{equation*}
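The identity \(\mathrm{adj}(B)\,B = \det(B)\,I_{n\times n}\) is easy to sanity-check computationally. SymPy implements the classical adjoint as the adjugate method on matrices; here is a quick illustration over \(\Z\) (not part of the proof):

```python
import sympy as sp

# A sample 3x3 integer matrix; the identity holds for any square matrix
# over any commutative ring.
B = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 4]])

adjB = B.adjugate()  # SymPy's name for the classical adjoint

# The key identity: adj(B) * B = det(B) * I (and the same on the other side).
assert adjB * B == B.det() * sp.eye(3)
assert B * adjB == B.det() * sp.eye(3)
```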

Proof.

Given \(r\in R\text{,}\) we want to show that \(r\) is integral over \(A\text{.}\) The idea is to show that multiplication by \(r\text{,}\) realized as a linear transformation over \(A\text{,}\) satisfies the characteristic polynomial of that linear transformation.
Suppose that \(R = A f_1 + \cdots + A f_n\text{.}\) We may assume that \(f_1=1\text{,}\) perhaps by adding a module generator. Since every element in \(R\) is an \(A\)-linear combination of \(f_1, \ldots, f_n\text{,}\) this is in particular true for the elements \(rf_1, \ldots, r f_n\text{.}\) Thus we can find \(a_{ij} \in A\) such that
\begin{equation*} r f_i = \sum_{j=1}^n a_{ij} f_j \end{equation*}
for each \(i\text{.}\) Consider the matrix \(C=[a_{ij}]\) and the column vector \(v=(f_1,\dots,f_n)\text{.}\) We can now write the equalities above more compactly as \(r v = C v\text{.}\) By the Determinantal Technique, \(\det(r I_{n\times n} - C)v=0\text{.}\) Since we chose one of the entries of \(v\) to be \(1\text{,}\) we have in particular that \(\det(r I_{n\times n} - C)=0\text{.}\) Expanding this determinant as a polynomial in \(r\) gives a monic equation for \(r\) with coefficients in \(A\text{,}\) as desired.
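To see the proof above in action, take \(A = \Z\text{,}\) \(R = \Z[\sqrt{2}]\) with module generators \(f_1 = 1\) and \(f_2 = \sqrt{2}\text{,}\) and \(r = \sqrt{2}\text{.}\) Then \(r \cdot 1 = 0 \cdot 1 + 1 \cdot \sqrt{2}\) and \(r \cdot \sqrt{2} = 2 \cdot 1 + 0 \cdot \sqrt{2}\text{,}\) and the determinant recovers the expected equation of integral dependence. A SymPy check of this illustration:

```python
import sympy as sp

t = sp.symbols('t')  # stand-in variable for the element r = sqrt(2)

# Multiplication by sqrt(2) on the Z-module generators (1, sqrt(2)):
#   sqrt(2)*1       = 0*1 + 1*sqrt(2)
#   sqrt(2)*sqrt(2) = 2*1 + 0*sqrt(2)
C = sp.Matrix([[0, 1],
               [2, 0]])

# det(t*I - C) is a monic polynomial with integer coefficients...
char_poly = sp.expand((t * sp.eye(2) - C).det())
assert char_poly == t**2 - 2

# ...and r = sqrt(2) is a root: an equation of integral dependence over Z.
assert char_poly.subs(t, sp.sqrt(2)) == 0
```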
We are now ready to show the following important characterization of module-finite extensions, which tells us exactly what we need besides algebra-finite to force an extension to be module-finite:

Proof.

(\(\Rightarrow\)): Module-finite implies integral, and algebra-finite.
(\(\Leftarrow\)): If \(R=A[r_1,\dots,r_t]\) is integral over \(A\text{,}\) then each \(r_i\) is integral over \(A\text{,}\) and thus \(R\) is module-finite over \(A\) by the lemma above.

Proof.

Let \(R=A[\Lambda]\text{,}\) with \(\lambda\) integral over \(A\) for all \(\lambda\in\Lambda\text{.}\) Given \(r\in R\text{,}\) there is a finite subset \(L\subseteq \Lambda\) such that \(r\in A[L]\text{.}\) This \(A[L]\) is now a finitely-generated algebra generated by integral elements, and thus it must be module-finite over \(A\text{.}\) Module-finite implies integral, and thus \(A[L]\) is an integral extension of \(A\text{.}\) In particular, \(r \in A[L]\) is integral over \(A\text{.}\)

Proof.

By [provisional cross-reference: cor] the \(A\)-subalgebra of \(R\) generated by all elements in \(R\) that are integral over \(A\) is integral over \(A\text{,}\) so it is contained in the set of all elements that are integral over \(A\text{:}\) this means that
\begin{equation*} \{\text{integral elements}\} \subseteq A[\{\text{integral elements}\}] \subseteq \{\text{integral elements}\}, \end{equation*}
so equality holds throughout, and \(\{\text{integral elements}\}\) is a ring contained in \(S\) by definition.
In other words, the integral closure of \(A\) in \(R\) is a subring of \(R\) containing \(A\text{.}\)

Example 13.18. \(\Z[\sqrt{d}]\) Integral Over \(\Z\).

  1. The ring \(\mathbb{Z}[\sqrt{d}]\text{,}\) where \(d \in \mathbb{Z}\) is not a perfect square, is integral over \(\mathbb{Z}\text{.}\) Indeed, \(\sqrt{d}\) satisfies the monic polynomial \(x^2-d\text{,}\) and since the integral closure of \(\mathbb{Z}\) is a ring containing \(\mathbb{Z}\) and \(\sqrt{d}\text{,}\) and \(\mathbb{Z}[\sqrt{d}]\) is the smallest such ring, we conclude that every element in \(\Z[\sqrt{d}]\) is integral over \(\mathbb{Z}\text{.}\)
  2. Let \(R=\C[x,y] \subseteq S=\C[x,y,z]/(x^2+y^2+z^2)\text{.}\) Then we claim that \(S\) is module-finite over \(R.\) To see this we first need to realize \(R\) as a subring of \(S\text{.}\) To do that, consider the \(\C\)-algebra homomorphism
    \begin{equation*} \begin{aligned} \varphi : R & \longrightarrow S \\ x & \longmapsto x \\ y & \longmapsto y. \end{aligned} \end{equation*}
    The kernel of \(\varphi\) consists of the polynomials in \(x\) and \(y\) that are multiples of \(x^2+y^2+z^2\text{,}\) but any nonzero multiple of \(x^2+y^2+z^2\) in \(\C[x,y,z]=R[z]\) must have \(z\)-degree at least \(2\text{,}\) which implies it involves \(z\) and thus it is not in \(\C[x,y]\text{.}\) We conclude that \(\varphi\) is injective, and thus \(R \subseteq S.\)
    Now \(S\) is generated over \(R\) as an algebra by one element, \(z\text{,}\) and \(z\) satisfies the monic equation \(t^2 + (x^2+y^2) = 0\) over \(R\text{,}\) so \(S\) is integral over \(R\text{.}\) Since \(S\) is both algebra-finite and integral over \(R\text{,}\) it is module-finite over \(R\text{.}\)
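For part 1, one can also check directly that a general element \(a + b\sqrt{d}\) satisfies the monic equation \(x^2 - 2ax + (a^2 - db^2) = 0\) over \(\Z\text{.}\) A SymPy verification of this identity, together with a concrete minimal polynomial (an illustration with \(d = 2\)):

```python
import sympy as sp
from sympy import minimal_polynomial

x = sp.symbols('x')

# For s = a + b*sqrt(d), expanding (s - a)^2 = b^2 * d gives
#   s^2 - 2*a*s + (a^2 - d*b^2) = 0,
# a monic equation with coefficients in Z. Check it symbolically:
a, b, d = sp.symbols('a b d')
s = a + b * sp.sqrt(d)
assert sp.expand(s**2 - 2*a*s + (a**2 - d*b**2)) == 0

# A concrete instance: the minimal polynomial of 3 + 2*sqrt(2) over Q
# matches the formula with a = 3, b = 2, d = 2.
p = minimal_polynomial(3 + 2*sp.sqrt(2), x)
assert p == x**2 - 6*x + 1
```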
Note, however, that not all integral extensions are module-finite.

Example 13.19. Integral but Not Algebra or Module Finite.

Let \(k\) be a field, and consider the \(k[x]\)-algebra \(R\) given by
\begin{equation*} k[x] \subseteq R=k[x,x^{1/2},x^{1/3},x^{1/4},x^{1/5},\ldots]. \end{equation*}
Note that \(x^{1/n}\) satisfies the monic polynomial \(t^n - x\text{,}\) and thus it is integral over \(k[x]\text{.}\) Since \(R\) is generated by elements that are integral over \(k[x]\text{,}\) it must be an integral extension of \(k[x]\text{.}\) However, \(k[x] \subseteq R\) is not algebra-finite: any finite set of elements of \(R\) involves only finitely many of the roots \(x^{1/n}\text{,}\) so it is contained in \(k[x^{1/N}]\) for some \(N\text{,}\) which does not contain \(x^{1/(N+1)}\text{.}\) In particular, \(k[x] \subseteq R\) is also not module-finite.

Example 13.20. Tower Integral iff Each Piece Integral.

Given ring extensions \(A \subseteq B \subseteq C\text{,}\) the extensions \(A \subseteq B\) and \(B \subseteq C\) are integral if and only if \(A \subseteq C\) is integral.
Finally, here is a useful fact about integral extensions that we will use multiple times.

Proof.

Suppose that \(R\) is a field, and let \(s \in S\) be a nonzero element, which is necessarily integral over \(R\text{.}\) The ring \(R[s]\) is algebra-finite over \(R\) by construction, and integral over \(R\text{.}\) Since \(R \subseteq R[s]\) is integral and algebra-finite, it must also be module-finite. Since \(R\) is a field, this means that \(R[s]\) is a finite-dimensional vector space over \(R\text{.}\) Since \(R[s]\) is a subring of the domain \(S\text{,}\) it is itself a domain, so the multiplication map \(R[s] \xrightarrow{\, s \,} R[s]\) is injective. Notice that this is a map of finite-dimensional \(R\)-vector spaces, and thus it must also be surjective. In particular, there exists an element \(t \in R[s]\) such that \(st = 1,\) and thus \(s\) is invertible. We conclude that \(S\) must be a field.
Now suppose that \(S\) is a field, and let \(r \in R\text{.}\) Since \(r \in R \subseteq S\text{,}\) there exists an inverse \(r^{-1}\) for \(r\) in \(S\text{,}\) which must be integral over \(R\text{.}\) Given any equation of integral dependence for \(r^{-1}\) over \(R\text{,}\) say
\begin{equation*} (r^{-1})^n + a_{n-1} (r^{-1})^{n-1} + \cdots + a_0 = 0 \end{equation*}
with \(a_i \in R\text{,}\) we can multiply by \(r^{n-1}\) to obtain
\begin{equation*} r^{-1} = -a_{n-1} - \cdots -a_0 r^{n-1} \in R. \end{equation*}
Therefore, \(r\) is invertible in \(R\text{,}\) and \(R\) is a field.
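To see the rewriting trick at the end in a concrete case, consider the element \(s = 1 + \sqrt{2} \in \Q[\sqrt{2}]\text{,}\) which satisfies \(s^2 - 2s - 1 = 0\text{.}\) Multiplying by \(s^{-1}\) and solving gives
\begin{equation*} s - 2 - s^{-1} = 0 \implies s^{-1} = s - 2 = \sqrt{2} - 1, \end{equation*}
so the inverse of \(s\) is again a polynomial expression in \(s\text{,}\) exactly as in the computation above.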
Before we move on from algebra-finite and module-finite extensions, we should remark on what the situation looks like over fields. First, note that for a field extension, module-finite just means being a finite-dimensional vector space. While over a general ring the notions of algebra-finite and module-finite are quite different, for field extensions they are actually equivalent; this is Zariski's Lemma. It is a deep fact, and we will unfortunately skip its proof, since it is a key ingredient in proving a fundamental result in algebraic geometry; we will leave it for the algebraic geometry class next semester. The proof is a nice application of the Artin-Tate Lemma, which we are going to discuss shortly, together with some facts about transcendental elements.
The following corollary follows immediately from what we proved in this section:

Proof.

By Zariski’s Lemma, \(k \subseteq L\) must be module-finite, making the extension integral. Over a field, integral is the same as algebraic, but algebraically closed fields have no nontrivial algebraic extensions.