44
$\begingroup$

In general, the intersection of subgroups/subrings/subfields/sub(vector)spaces will still be subgroups/subrings/subfields/sub(vector)spaces. However, the union will (generally) not be.

Is there a "deep" reason for this?

$\endgroup$
2
  • 18
    $\begingroup$ I think you can view this phenomenon as an incarnation of forgetful functors preserving limits, but not colimits (or that forgetful functors often have left adjoints, but not right adjoints). $\endgroup$
    – Stahl
    Commented Dec 14, 2018 at 8:39
  • 5
    $\begingroup$ I suppose generally you could just "fall out of the union" if you multiply/add/whatever objects in a union that are unable to embed in one another. $\endgroup$
    – Munk
    Commented Dec 14, 2018 at 8:43

8 Answers

52
$\begingroup$

I wouldn't call it "deep", but here's an intuitive reasoning.

Intersections have elements that come from both sets, so they have the properties of both sets. If, for each of the component sets, there is some element(s) guaranteed to exist within that set, then such element(s) must necessarily exist in the intersection. For example, if $A$ and $B$ are closed under addition, then any pair of elements $x,y\in A\cap B$ is in each of $A$ and $B$, so the sum $x+y$ must be in each of $A$ and $B$, and so $x+y\in A\cap B$. This line of reasoning holds for basically any "structure" property out there, simply by virtue of the fact that all elements come from a collection of sets that simultaneously have that property.

Unions, on the other hand, have some elements from only one set or the other. In a sense, these elements only have one piece of the puzzle, i.e. they only have the properties of one set rather than both. Even if the statement of those properties is the same, like "closure under addition", the actual mechanics of those properties are different from set to set, and may not be compatible. Given $x\in A$ and $y\in B$, we have $x,y\in A\cup B$, but there's no reason to believe that $x+y \in A\cup B$. Sometimes it's simply not true, as with $\Bbb{N}\cup i\Bbb{N}$, where $i\Bbb{N} = \{ z \in \Bbb{C} \ | \ z = in \text{ for some } n\in\Bbb{N} \}$. In this case, the closure under addition which is guaranteed for each of the component sets is not compatible between them, so you get sums like $1+i$ which lie in neither set. On the other hand, sometimes you do have sets with compatible structure, such as $\Bbb{N}\cup -\Bbb{N}$ (considering $0\in\Bbb{N}$), where any sum of elements from this union still lies in the union.
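The failure can be checked concretely. Below is a Python sketch of the $\Bbb{N}\cup i\Bbb{N}$ example; the membership predicates `in_N`, `in_iN`, and `in_union` are illustrative helper names, not from any library:

```python
# Membership predicates for N and iN viewed inside C.
# (Function names are illustrative, not from any library.)
def in_N(z):
    z = complex(z)
    return z.imag == 0 and z.real >= 0 and z.real == int(z.real)

def in_iN(z):
    z = complex(z)
    return z.real == 0 and z.imag >= 0 and z.imag == int(z.imag)

def in_union(z):
    return in_N(z) or in_iN(z)

x, y = complex(1, 0), complex(0, 1)  # x is in N, y is in iN
s = x + y                            # 1 + i
print(in_union(x), in_union(y))      # True True
print(in_union(s))                   # False: 1 + i lies in neither set
```

Both summands are in the union, yet their sum escapes it, exactly as described above.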

$\endgroup$
1
  • 12
$\begingroup$ +1 for giving an example where the union does still preserve structure. $\endgroup$ Commented Dec 14, 2018 at 18:52
28
$\begingroup$

Algebraic structures are typically defined by universal statements. For example, a group is a structure $(G,\cdot,^{-1},e)$, where $\cdot$ is a binary function, $^{-1}$ is a unary function, and $e$ is a nullary function, satisfying the following axioms:

  1. $\forall x,y,z \; (x\cdot y)\cdot z = x\cdot(y\cdot z)$.
  2. $\forall x \; x \cdot x^{-1} = x^{-1} \cdot x = e$.
  3. $\forall x \; x \cdot e = e \cdot x = x$.

Universal axioms are preserved under intersection but not under union.

$\endgroup$
8
  • 9
    $\begingroup$ Can you elaborate? I don't really get it. (I'm not saying you're wrong, it's just that I don't understand) $\endgroup$ Commented Dec 14, 2018 at 11:42
  • 7
    $\begingroup$ @goblin say you have a universal statement "for all x in A, Px is true"; then if B is a subset of A, you have that "for all x in B, Px is true" too, right?; since the axioms for algebraic structures are universal statements, a subset of a given structure that closed under the operations in question will satisfy the axioms, and hence be an algebraic structure 'of the same kind'; you can check an intersection of structures (of the same kind, provided it exists) is closed under the relevant operations, and, being a subset of them all, is thus an algebraic structure of their kind $\endgroup$
    – user359302
    Commented Dec 14, 2018 at 15:13
  • 2
$\begingroup$ @goblin This is discussed in most textbooks on model theory and universal algebra. Look up "Tarski's HSP theorem" and "varieties" in such textbooks, e.g. Chang and Keisler or Hodges' Model Theory $\endgroup$ Commented Dec 15, 2018 at 16:29
  • 1
    $\begingroup$ $\DeclareMathOperator{\alg}{Alg}$But isn't this just plain wrong tho? If I have a universal algebra $A\in \alg(\Omega)$ of signature $\Omega$, and define „subalgebra“ as being closed under the operations, then the fact that intersections of subalgebras are again subalgebras has nothing to do with our algebra satisfying any equations or „universal axioms“. The fact that this also applies to „substructures obeying a set of equations“ follows from the fact that if $A$ satisfies some equations, so does every subalgebra. I feel like this answer is completely missing the point. $\endgroup$ Commented Dec 19, 2018 at 0:30
  • $\begingroup$ @Luke Why don’t you write your own, better answer? $\endgroup$ Commented Dec 19, 2018 at 1:10
28
$\begingroup$

Since no one has explained this from a categorical perspective yet, let me try to offer another point of view. Each of the types of objects you mention (groups, rings, fields, vector spaces) form a concrete category. That is, every group, ring, field, or vector space is a set equipped with the data of extra structure, and the homomorphisms between them are set maps which preserve that extra structure.

Another way we might say the above is that if $\mathcal{C}$ is the category of any of the above algebraic objects and their morphisms, we have a forgetful functor \begin{align*} U : \mathcal{C}&\to\mathsf{Set}\\ A&\mapsto UA, \end{align*} which sends each algebraic structure $A$ to its underlying set $UA$ and each homomorphism of algebraic structures $f : A\to B$ to the underlying function on sets $Uf : UA\to UB.$

In each of these situations (well, except when $\mathcal{C}$ is the category of fields), the forgetful functor has a left adjoint - the free object functor. Explicitly, this means that if you're given a group, ring, or vector space (more generally module) $A$ and a set $S,$ then there is a natural bijection $$ \{\textrm{homomorphisms of algebraic structures }f : F(S)\to A\}\cong\{\textrm{maps of sets }g : S\to UA\}, $$ where $F(S)$ denotes the free [group, ring, vector space, module...] on $S.$ This is essentially the definition of a free object: to give a homomorphism of the free group, ring, or vector space $F(S)$ on a set $S$ to another group, ring, or vector space $A,$ you need to give a map of sets $S\to UA.$ Think of $S$ as being the set of generators of $F(S),$ and the "freeness" means that there are no relations between these generators other than the relations forced by the axioms of the algebraic structure.

For example, the free vector space on a set $S$ may be described as the vector space $F(S)$ with basis $\{e_s\mid s\in S\}$ indexed by the elements of $S.$ To give a map from $F(S)$ to any other vector space $V,$ you need only specify where the basis elements $e_s$ are sent, and this is completely determined by a set map $S\to UV$ (again, $UV$ is the underlying set of the vector space $V$).

As another example, the free commutative ring on a set $S$ is the ring $\Bbb Z[x_s\mid s\in S]$ - the polynomial ring over $\Bbb Z$ with one variable for each element of $S.$

Now that I've set this up, the point is that intersections are limits in the category of sets, and that forgetful functors (or more generally, right adjoints) play nicely with limits. In particular, if $S$ and $T$ are subsets of some set $X,$ then we may consider the diagram

$$ \require{AMScd} \begin{CD} Y @>>> S\\ @VVV @VVV \\ T @>>> X, \end{CD} $$ where $Y$ is some unspecified set together with maps such that the diagram commutes. The intersection $S\cap T$ has the nice property that any set $Y$ with maps to $S$ and $T$ as in the diagram will factor uniquely through a map $Y\to S\cap T.$ This is the statement that $S\cap T$ is the limit of the diagram above (without the $Y$).
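For finite sets this limit can be computed directly. Here is a small Python sketch (the sets $X$, $S$, $T$ are chosen arbitrarily) checking that the fibre product of the two inclusions, computed pointwise in $\mathsf{Set}$, is in bijection with the intersection:

```python
# The pullback of the two inclusions S -> X <- T, computed in Set as
# {(s, t) in S x T : s = t}, is in bijection with S ∩ T.
# (X, S, T are arbitrary finite examples.)
X = set(range(10))
S = {0, 1, 2, 3, 4}
T = {3, 4, 5, 6}

pullback = {(s, t) for s in S for t in T if s == t}  # fibre product over X
print(pullback)                             # the matching pairs (3, 3) and (4, 4)
print({s for s, _ in pullback} == (S & T))  # True
```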

By abstract nonsense arguments, right adjoints (like the forgetful functor in these cases) preserve limits. (We also sometimes have even nicer results, but let me not get in too deep.) Preservation of limits means that if we have a limit of algebras $\varprojlim A_i$ over some diagram, then the underlying set of the limit is canonically isomorphic to the limit $\varprojlim UA_i$ (in the category of sets) of the underlying sets of the algebras.

So, if you have subalgebras $A_1,A_2$ of a given algebra $A,$ and you consider the limit $B$ of these inclusions as we did for sets: $$ \require{AMScd} \begin{CD} B @>>> A_1\\ @VVV @VVV \\ A_2 @>>> A, \end{CD} $$

then the underlying set of the limit $B$ is the limit of the underlying sets $U A_1$ and $UA_2,$ which is simply the intersection $UA_1\cap UA_2.$

The other punchline is that the union of two sets $S$ and $T$ (which are subsets of some ambient set $X$) is the colimit of an appropriate diagram. However, if $S$ and $T$ are the underlying sets of some algebraic objects $S = UA_1,$ $T = UA_2,$ the forgetful functor does not preserve colimits (even if $X$ is the underlying set of some large algebra $X = UA$). So, unless you are surprisingly lucky, the smallest algebra containing two given algebras $A_1$ and $A_2$ (which is a colimit) will not be the same as the smallest set containing $UA_1$ and $UA_2.$

Many others have already expressed that this is also related to the fact that products and intersections commute but the same is not true of products and unions: this is also a categorical fact! Products and intersections are both examples of limits, but unions are colimits. Limits commute with limits, but limits do not necessarily commute with colimits, unless certain nice conditions hold.

All in all, the failure of existence of an algebraic structure on a union is a combination of a number of categorical facts, which are much more general than the specific situations you mention. While proving that the forgetful functors described have the properties I claim essentially comes down to making arguments like in the other answers, I prefer this perspective because taking "unions" or "intersections" is somehow an unnatural thing to do when you have things which aren't sets - you want to combine your algebraic objects in ways that result in objects with the same algebraic structure (e.g., via taking limits and colimits or using other categorical constructions). The fact that the underlying set of a limit coincides with the limits of the underlying sets is a result of nice properties that the forgetful functor in question has.

Note: I said we didn't consider fields above, and that is because the category of fields is particularly badly behaved, because fields are rather restrictive.

$\endgroup$
9
  • 2
    $\begingroup$ Nice answer. Though I would rather say that is the property of creating limits of $U$ that plays the important role, instead of the limit preservation, don't you agree? $\endgroup$ Commented Dec 15, 2018 at 12:57
  • 1
    $\begingroup$ @GiorgioMossa That's probably more accurate, especially if you're using the nlab definition of creating limits. I did think that stressing preservation was perhaps putting the cart before the horse, but it's a subtler/more philosophical point than I was willing to make when I wrote things up :^) $\endgroup$
    – Stahl
    Commented Dec 15, 2018 at 16:57
  • 2
    $\begingroup$ I like this answer a lot, and I agree with a lot of the philosophy here (+1). However, I think the failure of this answer to explain why intersections of fields are fields is indicative of a flaw in this point of view. $\endgroup$
    – jgon
    Commented Dec 22, 2018 at 19:57
  • 1
$\begingroup$ @jgon it is unfortunate that the same formal adjoint argument doesn't work for fields, but if you interpret them within the category of rings you obtain the corresponding statement. In any case, the computation that the functors have the desired properties essentially comes down to direct computations that show up in the other arguments anyway. $\endgroup$
    – Stahl
    Commented Dec 22, 2018 at 20:02
  • 2
    $\begingroup$ @Stahl Note that the limit of fields in rings isn't always a field, taking $k_1\times k_2$ as an example, it's important that you take a limit of subfields. And I don't mean to come off as overly critical. I quite like your answer. $\endgroup$
    – jgon
    Commented Dec 22, 2018 at 21:41
9
$\begingroup$

Let $X$ be a set, and let $Y$ and $Z$ be subsets of $X$. Let $f: X^2 \to X$ be a binary function, and assume that the restrictions of $f$ to $Y^2$ and $Z^2$ take values in $Y$ and $Z$ respectively (i.e. $f|_{Y^2} \subseteq Y$ and $f|_{Z^2} \subseteq Z$).

Is it the case that $f|_{(Y \cap Z)^2} \subseteq Y \cap Z$? Yes, it is: if $a, b \in Y \cap Z$ then $f(a, b) \in Y$ because $f|_{Y^2} \subseteq Y$ and $f(a, b) \in Z$ because $f|_{Z^2} \subseteq Z$, so $f(a, b) \in Y \cap Z$.

Is it the case that $f|_{(Y \cup Z)^2} \subseteq Y \cup Z$? Not necessarily: if $a \in Y$ and $b \in Z$ then we know nothing at all about $f(a, b)$.

(Almost?) any structure we would call "algebraic" has some binary function (group multiplication, vector space addition, etc) which runs into this problem.
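A finite sanity check of this in Python, using two subgroups of $\Bbb{Z}/12$ under addition mod $12$ (the helper `closed` is an illustrative name):

```python
# Two subgroups of Z/12 under addition mod 12 (`closed` is an
# illustrative helper checking closure under the operation).
n = 12
A = {0, 4, 8}   # subgroup generated by 4
B = {0, 6}      # subgroup generated by 6

def closed(S):
    return all((a + b) % n in S for a in S for b in S)

print(closed(A), closed(B))   # True True
print(closed(A & B))          # True: the intersection is {0}
print(closed(A | B))          # False: 4 + 6 = 10 is in neither subgroup
```

Exactly as the answer says: for $a \in A$ and $b \in B$, nothing constrains $f(a, b)$, and here $4 + 6 = 10$ falls outside the union.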

$\endgroup$
9
$\begingroup$

The fact that the intersection of subgroups are themselves subgroups, the intersections of subrings are subrings, etc., is indeed an example of a more general property:

Call a set $X$ "closed under $f$", where $f$ is a function with $n$ arguments, if the domain of $f$ includes $X^n$ and if $f(x_1, x_2, \dots, x_n) \in X$ for all $(x_1, x_2, \dots, x_n) \in X^n$.

Theorem: If $X$ and $Y$ are both closed under $f$, then $Z = X \cap Y$ is closed under $f$.

Proof. Since $X$ is closed under $f$, $Z^n \subset X^n$ is part of the domain of $f$. Furthermore, since every $n$-tuple $(z_1, z_2, \dots, z_n) \in Z^n$ is in both $X^n$ and $Y^n$, and both $X$ and $Y$ are closed under $f$, it follows that $f(z_1, z_2, \dots, z_n)$ belongs to both $X$ and $Y$, and therefore to their intersection $Z$. $\square$

For example, let $(G, +, 0)$ be a group with the group operation $+$ and the zero element $0$. Consider the functions $f_+: G^2 \to G$ and $f_0: G^0 \to G$ defined by $f_+(a, b) = a + b$ and $f_0(\varepsilon) = 0$ (where $\varepsilon$ denotes the zero-element tuple, the only element of $G^0 = \{\varepsilon\}$). Clearly, the subgroups of $(G, +, 0)$ are exactly the subsets of $G$ that are closed under both $f_+$ and $f_0$. Thus, if $X$ and $Y$ are subgroups of $(G, +, 0)$, and $Z = X \cap Y$ is their intersection, then $Z$ must also be closed under both $f_+$ and $f_0$, and thus also a subgroup.

More generally, any time we can define a "subthingy" of a "thingy" $(T, \dots)$ as a subset of $T$ that is closed under one or more functions $f: T^n \to T$, it automatically follows from this definition that the intersection of two subthingies of the same thingy must itself be a subthingy. Since most definitions of substructures of an algebraic structure are indeed naturally of this form, they do have this property.
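The theorem above can be checked mechanically on small examples. Here is a Python sketch with a generic `closed_under` helper (an illustrative name) that handles both the binary operation and the nullary identity:

```python
from itertools import product

def closed_under(X, f, n):
    """Return True if X is closed under the n-ary function f."""
    return all(f(*args) in X for args in product(X, repeat=n))

# Subthingies of (Z/12, +, 0): closure under the binary sum and the
# nullary constant picking out the identity (names are illustrative).
add2 = lambda a, b: (a + b) % 12
zero = lambda: 0

X = {0, 3, 6, 9}   # subgroup generated by 3
Y = {0, 4, 8}      # subgroup generated by 4

for S in (X, Y, X & Y):
    assert closed_under(S, add2, 2) and closed_under(S, zero, 0)

print(closed_under(X | Y, add2, 2))   # False: 3 + 4 = 7 escapes the union
```

Note that the $n = 0$ case works out of the box: `product(X, repeat=0)` yields one empty tuple, so the check reduces to asking whether the constant (here $0$) lies in $X$.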


On the other hand, for unions of substructures we have no equivalent of the theorem above, and thus the union $W = X \cup Y$ of two subthingies $X$ and $Y$ of a thingy $(T, \dots)$ is not usually a subthingy.

Probably the nearest thing that we can say, somewhat trivially, is that the closure $\bar W$ of $W$ (i.e. the unique smallest subset of $T$ that includes $W$ and is closed under all the relevant functions, if one exists) will be a subthingy of $T$. This, of course, is true by definition for all $W \subseteq T$, not just those that arise as a union of two (or more) subthingies.

For example, the union of two subspaces of a vector space is not generally a subspace, because adding together two vectors from different subspaces can produce a vector that belongs to neither of the original subspaces. But the span of the union is indeed a subspace — as is the span of any arbitrary subset of the full vector space.
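This can be verified concretely in the smallest interesting case, subspaces of $(\Bbb{Z}/2)^2$. A Python sketch (helper names are illustrative):

```python
# Subspaces of (Z/2)^2: the union of the two axes is not closed under
# addition, but its span is the whole space. (Helper names illustrative.)
def add(u, v):
    return tuple((a + b) % 2 for a, b in zip(u, v))

X = {(0, 0), (1, 0)}   # the "x-axis"
Y = {(0, 0), (0, 1)}   # the "y-axis"
U = X | Y

print(add((1, 0), (0, 1)) in U)   # False: (1, 1) escapes the union

# The span: close U under addition (the only scalars are 0 and 1).
span = set(U)
while True:
    new = {add(u, v) for u in span for v in span} - span
    if not new:
        break
    span |= new

print(span == {(0, 0), (0, 1), (1, 0), (1, 1)})   # True: the whole space
```

Over $\Bbb{Z}/2$ the span is just the closure under addition, which is why the loop above suffices; over other fields one would also close under scalar multiplication.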

$\endgroup$
8
$\begingroup$

A little warning: the whole question is meaningful only if you are talking about substructures of a given structure $A$, so in what follows I will assume this.

Given the remark above, you can regard substructures in two different, but equivalent, ways:

  1. as structures whose underlying set is a subset of $A$ and such that the inclusion is a homomorphism
  2. as subsets of $A$ closed under the operations of $A$ (and then the underlying structure is the one induced by $A$).

If we take the second approach, we have that a subset $S \subseteq A$ is a substructure of $A$ if and only if for every operation $f \colon A^n \to A$ we have $$G(f) \cap (S^{n} \times A) = G(f) \cap S^{n+1}\ ,$$ where $G(f)$ denotes the graph of $f$, i.e. the set $\{(\bar a,a) \in A^{n} \times A \mid f(\bar a)=a\}$.

Using this formula, it is clear why intersections work well: if $(S_i)_i$ is a family of substructures of $A$, i.e. a family of subsets satisfying the equation above, then we have $$ \begin{align*} G(f) \cap \Bigl(\bigl(\bigcap_i S_i\bigr)^n \times A\Bigr) &= G(f) \cap \bigcap_i (S_i^n \times A)\\ &= \bigcap_i \bigl(G(f) \cap (S_i^n \times A)\bigr)\\ &= \bigcap_i \bigl(G(f) \cap S_i^{n+1}\bigr) \text{ (because the $S_i$ are substructures)}\\ &= G(f) \cap \bigl(\bigcap_i S_i\bigr)^{n+1}\ . \end{align*}$$

What makes this work is the fact that products commute with intersections: i.e. the following holds $$\bigcap_i (A^1_i \times \dots \times A_i^n) = (\bigcap_i A_i^1) \times \dots \times (\bigcap_i A_i^n)\ .$$ A similar formula does not hold for unions: in general we only have $$\bigcup_i (A_i^1 \times \dots \times A_i^n) \subseteq (\bigcup_i A_i^1) \times \dots \times (\bigcup_i A_i^n)\ ,$$ and the inclusion is typically strict.

So if you like, a deep reason why intersections of substructures work so well is that intersections commute with products.
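This commutation fact is easy to verify on finite sets; a Python sketch (the sets $A$, $B$ and the helper `square` are chosen for illustration):

```python
from itertools import product

# Finite check: Cartesian products commute with intersections but
# not with unions (A and B chosen arbitrarily).
A = {1, 2, 3}
B = {2, 3, 4}

def square(S):
    return set(product(S, S))   # the Cartesian square S x S

print(square(A & B) == square(A) & square(B))            # True
print(square(A | B) == square(A) | square(B))            # False
print((1, 4) in square(A | B) - (square(A) | square(B))) # True
```

The witness $(1, 4)$ pairs an element found only in $A$ with one found only in $B$, which is exactly the kind of "mixed" tuple that the union of squares misses.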

$\endgroup$
5
$\begingroup$

If we have a set $S$ and a binary operator $O$, then $O$ is defined on the Cartesian product of $S$ with itself. So it comes down to the fact that $(A\cap B)\times (A\cap B)=(A \times A) \cap (B \times B)$, but in general $(A\cup B)\times (A\cup B) \neq (A \times A) \cup (B \times B)$.

If we're taking the intersection of $A$ and $B$, then adding elements isn't a problem: we're adding elements that are in both $A$ and $B$, so we can use either the addition defined for $A$ or the one defined for $B$. But if we have the union of $A$ and $B$, then we have to add an element of $A$ to an element of $B$, and we can't use either previously defined addition.

$\endgroup$
0
$\begingroup$

In the intersection case, the result is again an algebraic structure, but sometimes the union doesn't make much sense. For example, consider the intersection of two lines in a plane: it is either a point, empty, or (if the lines coincide) a whole line. But if we consider the union of two lines in the plane, it is, typically, a cross, something like an $\times$, or, in the extreme case when the two given lines coincide, just a line. Similar comments can be made about the union of two planes in space: geometric intuition is still working. Geometric intuition is likely to stop working when it is consulted about the union of two subspaces of a $19$-dimensional space, say a $17$-dimensional one and an $18$-dimensional one. So in general:

  • a group cannot be written as the union of two proper subgroups
  • a real vector space cannot be written as the union of finitely many proper subspaces
  • a Banach space cannot be written as the union of even countably many proper closed subspaces

$$\vdots$$

$\endgroup$
