
Theorem 1.1.2 Let A, B, and C be subsets of a universal set X. Then the following hold:

(i) A ∪ Aᶜ = X.
(ii) A ∩ Aᶜ = ∅.
(iii) (Aᶜ)ᶜ = A.
(iv) (Distributive law) A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C).
(v) (Distributive law) A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C).
(vi) (De Morgan's law) A \ (B ∪ C) = (A \ B) ∩ (A \ C).
(vii) (De Morgan's law) A \ (B ∩ C) = (A \ B) ∪ (A \ C).
(viii) A \ B = A ∩ Bᶜ.

Proof: We prove some of the results and leave the rest for the exercises.

(i) Clearly, A ∪ Aᶜ ⊂ X since both A and Aᶜ are subsets of X. Now let x ∈ X. Then either x is an element of A or it is not an element of A. In the first case, x ∈ A and, so, x ∈ A ∪ Aᶜ. In the second case, x ∈ Aᶜ and, so, x ∈ A ∪ Aᶜ. Thus, X ⊂ A ∪ Aᶜ. Applying Theorem 1.1.1, it follows that A ∪ Aᶜ = X.

(ii) No element x can be simultaneously in A and not in A. Thus, A ∩ Aᶜ = ∅.

(iv) Let x ∈ A ∩ (B ∪ C). Then x ∈ A and x ∈ B ∪ C. Therefore, x ∈ B or x ∈ C. In the first case, since x is also in A we get x ∈ A ∩ B and, hence, x ∈ (A ∩ B) ∪ (A ∩ C). In the second case, x ∈ A ∩ C and, hence, x ∈ (A ∩ B) ∪ (A ∩ C). Thus, in all cases, x ∈ (A ∩ B) ∪ (A ∩ C). This shows A ∩ (B ∪ C) ⊂ (A ∩ B) ∪ (A ∩ C).

Now we prove the other inclusion. Let x ∈ (A ∩ B) ∪ (A ∩ C). Then x ∈ A ∩ B or x ∈ A ∩ C. In either case, x ∈ A. In the first case, x ∈ B and, hence, x ∈ B ∪ C. It follows in this case that x ∈ A ∩ (B ∪ C). In the second case, x ∈ C and, hence, x ∈ B ∪ C. Again, we conclude x ∈ A ∩ (B ∪ C). Therefore, (A ∩ B) ∪ (A ∩ C) ⊂ A ∩ (B ∪ C). Applying Theorem 1.1.1, the equality follows. □

A set whose elements are sets is often called a collection (or family) of sets and is often denoted by script letters such as 𝒜 or ℬ. Let I be a nonempty set such that to each i ∈ I corresponds a set A_i. Then the family of all sets A_i as i ranges over I is denoted by {A_i : i ∈ I}. Such a family of sets is called an indexed family and the set I is called the index set.

Consider the indexed family of sets {A_i : i ∈ I}. The union and intersection of this family as i ranges over I are defined respectively by

⋃_{i∈I} A_i = {x : x ∈ A_i for some i ∈ I}   and   ⋂_{i∈I} A_i = {x : x ∈ A_i for every i ∈ I}.

■ Example 1.1.1 The following examples illustrate the notation.

(a) Let the index set be I = ℕ and for each n ∈ ℕ let A_n = [−n, n]. Then

⋃_{n∈ℕ} A_n = ℝ   and   ⋂_{n∈ℕ} A_n = [−1, 1].
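The identities of Theorem 1.1.2 can also be checked on concrete finite sets. The following minimal Python sketch does this with the built-in set operations (| for union, & for intersection, - for difference); the particular sets X, A, B, C are arbitrary choices made here for illustration and do not come from the text.

```python
# Finite sanity check of the identities in Theorem 1.1.2.
# X, A, B, C are arbitrary illustrative choices (not from the text).

X = set(range(10))          # universal set {0, 1, ..., 9}
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}
C = {4, 6, 8}

def complement(S):
    """Complement of S relative to the universal set X."""
    return X - S

# (i), (ii), (iii): complement laws
assert A | complement(A) == X
assert A & complement(A) == set()
assert complement(complement(A)) == A

# (iv), (v): distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# (vi), (vii): De Morgan's laws, stated with set difference
assert A - (B | C) == (A - B) & (A - C)
assert A - (B & C) == (A - B) | (A - C)

# (viii): difference as intersection with the complement
assert A - B == A & complement(B)

print("All identities of Theorem 1.1.2 hold for this choice of X, A, B, C.")
```

Of course, passing for one choice of sets proves nothing; the element-chasing argument in the proof is what establishes the identities for all subsets of X.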
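The indexed union and intersection can be illustrated the same way. The sketch below first computes them for a small finite family, then tests membership for the intervals A_n = [−n, n] of Example 1.1.1(a); the family {0, 1, ..., i}, the truncation bound N, and the sample points are arbitrary choices for illustration only.

```python
# Indexed unions and intersections, mirroring the definitions above.
# The family below and the constants N, 537.25, 0.5, 1.01 are illustrative.

I = {1, 2, 3, 4}
family = {i: set(range(i + 1)) for i in I}   # A_i = {0, 1, ..., i}

union_over_I = set().union(*family.values())              # ⋃_{i∈I} A_i
intersection_over_I = set.intersection(*family.values())  # ⋂_{i∈I} A_i

print(union_over_I)         # {0, 1, 2, 3, 4}
print(intersection_over_I)  # {0, 1}

# Example 1.1.1(a): A_n = [-n, n] is an infinite set, so instead of
# enumerating elements we test membership in a truncated family n = 1,...,N.
def in_union(x, N=1000):
    """x ∈ ⋃_{n=1}^{N} [-n, n]; every real x belongs once N >= |x|."""
    return any(-n <= x <= n for n in range(1, N + 1))

def in_intersection(x, N=1000):
    """x ∈ ⋂_{n=1}^{N} [-n, n]; the binding constraint is n = 1."""
    return all(-n <= x <= n for n in range(1, N + 1))

assert in_union(537.25)            # a large x still lies in some A_n
assert in_intersection(0.5)        # |x| <= 1, so x lies in every A_n
assert not in_intersection(1.01)   # already fails for n = 1
```

The membership tests reflect the reasoning behind Example 1.1.1(a): every real x lies in A_n once n ≥ |x|, so the union over all n ∈ ℕ is ℝ, while the intersection is cut down to [−1, 1] by the smallest interval A_1.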
