User:SirMeowMeow/sandbox/Vector Space

Definition


A vector space is one of the central objects of study in linear algebra. Any set equipped with the two operations of a vector space, namely vector addition and vector scaling, satisfying the axioms below can be considered a vector space, and any element of such a space is called a vector.

To define vector scaling, a vector space must be accompanied by a field (or, more generally, a ring), typically denoted F or K.[1] This field may sometimes be called the base, ground, or underlying field, and an element of a field may be called a scalar. When the context is clear, mention of the field may be omitted.

The smallest vector space contains only the identity element under vector addition, and is known as the trivial vector space.

Abstract Definition


A vector space V over a field F is a set equipped with an abelian addition + : V × V → V and a vector scaling function · : F × V → V, such that:[2][3][4][5]

Vector Addition[a]

  ∃0 ∈ V : v + 0 = v (Identity element)
  u + v = v + u (Commutative)
  (u + v) + w = u + (v + w) (Associative)
  ∃(−v) ∈ V : v + (−v) = 0 (Inverse element)

Where u, v, w ∈ V.

Vector Scaling

  1v = v (Identity element)
  a(bv) = (ab)v (Vector scaling is compatible with field multiplication)
  (a + b)v = av + bv (Vector scaling distributes over field addition)
  a(u + v) = au + av (Vector scaling distributes over vector addition)

Where a, b ∈ F and u, v ∈ V.

Finite Definition


A finite list of n scalars from F is written (a_1, a_2, …, a_n), and every n-dimensional vector space over F is isomorphic, or linearly equivalent, to the vector space F^n.

Vector Addition


Let u, v ∈ F^n. Finite vector addition is defined as pointwise scalar addition.

u + v = (u_1 + v_1, u_2 + v_2, …, u_n + v_n)

Vector Scaling


Let a ∈ F and v ∈ F^n. Finite vector scaling is defined as pointwise scalar multiplication.

av = (av_1, av_2, …, av_n)
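As a concrete sketch, the two pointwise definitions above can be written in a few lines of Python; the helper names `vec_add` and `vec_scale` are illustrative, not from any library.

```python
# Pointwise operations on F^n, with vectors modeled as Python tuples.
# vec_add and vec_scale are illustrative helper names.

def vec_add(u, v):
    """Pointwise vector addition: (u1 + v1, ..., un + vn)."""
    return tuple(ui + vi for ui, vi in zip(u, v))

def vec_scale(a, v):
    """Pointwise vector scaling: (a*v1, ..., a*vn)."""
    return tuple(a * vi for vi in v)

u, v = (1, 2, 3), (4, 5, 6)
print(vec_add(u, v))    # (5, 7, 9)
print(vec_scale(2, u))  # (2, 4, 6)

# One of the axioms, distributivity over vector addition: a(u + v) = au + av
assert vec_scale(2, vec_add(u, v)) == vec_add(vec_scale(2, u), vec_scale(2, v))
```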

Examples of Vector Spaces


Functional Spaces


The set of polynomials over a field forms an infinite-dimensional vector space.

Bitvectors


The field of two elements {0, 1} is considered the smallest field, and is named the Galois field of two elements, denoted as GF(2) or F_2. Because the vector space F_2^n does not admit any inner product, the dot product of two identical nonzero vectors from F_2^n may be zero.[b]

(1, 1) ⋅ (1, 1) = 1 + 1 = 0
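This failure of positive-definiteness can be checked directly; `dot_f2` below is an illustrative helper name for the dot product with arithmetic mod 2.

```python
# Dot product over the two-element field GF(2): sum products, reduce mod 2.
# dot_f2 is an illustrative helper name, not a library function.

def dot_f2(u, v):
    """Dot product of two vectors in F_2^n."""
    return sum(ui * vi for ui, vi in zip(u, v)) % 2

# A nonzero vector can be "orthogonal" to itself, so the dot product
# fails positive-definiteness and cannot serve as an inner product here.
print(dot_f2((1, 1), (1, 1)))  # 0
print(dot_f2((1, 0), (1, 1)))  # 1
```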

Finite Vector Spaces


A finite vector space must consist of a finite abelian group over a finite field. Every finite field has prime-power order and is denoted F_q or GF(q), with q = p^k for a prime p; the simplest cases are the prime fields F_p.

Polynomial F[x]


The set of polynomial terms below spans F[x], and is countably infinite.

{1, x, x^2, x^3, …}

The polynomials form a vector space because F[x] is a commutative group under polynomial addition, and because polynomials are closed under scaling by elements of F.

A set of polynomials is independent when the trivial combination is the only combination equal to the zero polynomial.[c]

The derivative is a linear endomorphism defined for all polynomials; over a field of characteristic zero it is surjective but not injective. But for endomorphisms on finitely generated modules, surjectivity implies isomorphism.[d] Thus F[x] is not finitely generated, and forms an infinite-dimensional vector space.
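The derivative's behavior can be sketched with polynomials represented as coefficient lists over the rationals (a characteristic-zero field, as assumed above); `deriv` and `antideriv` are illustrative helper names.

```python
# Polynomials as coefficient lists [a0, a1, a2, ...] over the rationals.
# deriv and antideriv are illustrative helpers, not standard notation.
from fractions import Fraction

def deriv(p):
    """Formal derivative: d/dx of sum(a_k x^k) is sum(k*a_k x^(k-1))."""
    return [k * a for k, a in enumerate(p)][1:] or [0]

def antideriv(p):
    """An antiderivative with zero constant term; requires division by k+1."""
    return [Fraction(0)] + [Fraction(a, k + 1) for k, a in enumerate(p)]

# Not injective: the distinct polynomials x^2 and x^2 + 1 share a derivative.
assert deriv([0, 0, 1]) == deriv([1, 0, 1])

# Surjective over the rationals: every polynomial has an antiderivative.
assert deriv(antideriv([3, 5])) == [3, 5]
```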

Linear Combination


Let S = {v_1, v_2, …, v_n} be a subset of a vector space, and let a_1, a_2, …, a_n ∈ F. Then a linear combination of S is defined as any vector which is the sum of scaled vectors in S.[6]

v = a_1 v_1 + a_2 v_2 + ⋯ + a_n v_n

A trivial combination means every a_i = 0.
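The definition above can be sketched for vectors in F^n modeled as tuples; `linear_combination` is an illustrative helper name.

```python
# A linear combination a1*v1 + ... + an*vn of vectors in F^n (tuples).
# linear_combination is an illustrative helper name.

def linear_combination(coeffs, vectors):
    """Sum of each vector scaled by its corresponding coefficient."""
    n = len(vectors[0])
    return tuple(sum(a * v[i] for a, v in zip(coeffs, vectors))
                 for i in range(n))

# 2*(1, 0) + 3*(0, 1) = (2, 3)
print(linear_combination([2, 3], [(1, 0), (0, 1)]))  # (2, 3)

# The trivial combination (all coefficients zero) gives the zero vector.
assert linear_combination([0, 0], [(1, 0), (0, 1)]) == (0, 0)
```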

Span


Let S be a subset of a vector space, and let v_1, …, v_n ∈ S. Then the span of S is defined as the set of all combinations from S.[7][8][9]

span(S) = {a_1 v_1 + ⋯ + a_n v_n : a_i ∈ F, v_i ∈ S}

The empty vector sum is defined as the additive identity 0, and thus the span of the empty set is the trivial vector space. Alternatively we can say that the span of any set is the smallest vector space containing that set.[10][11]
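Over GF(2) the span of a finite set is finite and can be listed outright, which also exhibits the empty-sum convention; `span_f2` is an illustrative helper name.

```python
# Enumerating span(S) for a finite set S in F_2^n: coefficients range
# over {0, 1}, so every linear combination can be generated directly.
from itertools import product

def span_f2(vectors, n):
    """The set of all GF(2)-linear combinations of `vectors` in F_2^n."""
    result = set()
    for coeffs in product((0, 1), repeat=len(vectors)):
        v = tuple(sum(a * vec[i] for a, vec in zip(coeffs, vectors)) % 2
                  for i in range(n))
        result.add(v)
    return result

print(sorted(span_f2([(1, 0), (0, 1)], 2)))  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# The empty sum is the zero vector, so the span of the empty set is {0}.
assert span_f2([], 2) == {(0, 0)}
```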

Independence


Let S = {v_1, …, v_n} be a subset of a vector space. Then S is independent iff the trivial combination is the only vanishing sum.[12][13][14]

a_1 v_1 + ⋯ + a_n v_n = 0 ⟹ a_1 = a_2 = ⋯ = a_n = 0

Conversely, S is dependent if some non-trivial combination of S equals 0.
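Over the rationals, independence can be tested by row reduction: vectors are independent iff the matrix they form has rank equal to their count. The sketch below uses exact `Fraction` arithmetic to avoid floating-point error; `rank` and `independent` are illustrative helper names.

```python
# Independence test via Gaussian elimination with exact rational arithmetic.
# rank and independent are illustrative helper names.
from fractions import Fraction

def rank(rows):
    """Row rank of a matrix given as a list of equal-length rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors are independent iff none is a combination of the others."""
    return rank(vectors) == len(vectors)

print(independent([(1, 0, 0), (0, 1, 0)]))  # True
print(independent([(1, 2), (2, 4)]))        # False: (2, 4) = 2*(1, 2)
```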

Observations

  • S is independent iff the removal of any vector changes the span.
  • S is independent iff no vector in S can be expressed as a combination of the other vectors in S.

Basis and Dimension


A basis of a vector space V is any independent set whose span is exactly V,[15] and the elements of a basis are called basis vectors.[14] The dimension of V is the number of vectors in, or cardinality of, its basis, written as dim V.[16][17]

If a basis of V has finite cardinality, then V is defined as a finite-dimensional vector space; otherwise, V is an infinite-dimensional vector space.

Observations

  • Every vector space has a basis (assuming the axiom of choice).
  • All choices of basis for a vector space have the same cardinality.
  • Any independent subset of V which is not spanning can be extended to a basis.
  • Any dependent subset which spans V can be reduced to a basis by removing vectors.
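The last observation can be sketched over GF(2), where spans are finite and membership can be checked by brute force: keep each vector only if it is not already in the span of the vectors kept so far. `in_span_f2` and `sift_basis_f2` are illustrative helper names.

```python
# Sifting a spanning set down to a basis over GF(2).
from itertools import product

def in_span_f2(vecs, target):
    """Is `target` a GF(2)-linear combination of `vecs`?"""
    n = len(target)
    for coeffs in product((0, 1), repeat=len(vecs)):
        v = tuple(sum(a * w[i] for a, w in zip(coeffs, vecs)) % 2
                  for i in range(n))
        if v == target:
            return True
    return False

def sift_basis_f2(spanning):
    """Drop each vector already spanned by those kept before it."""
    basis = []
    for v in spanning:
        if not in_span_f2(basis, v):
            basis.append(v)
    return basis

# (1, 1) = (1, 0) + (0, 1) over GF(2), so it is removed.
print(sift_basis_f2([(1, 0), (0, 1), (1, 1)]))  # [(1, 0), (0, 1)]
```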

Linear Subspace


A linear subspace of a vector space V is any subset which is also a vector space under the same abelian addition and vector scaling as V.[18] The trivial vector space {0} is the smallest subspace, containing only the identity element of vector addition.

The subspaces U_1, …, U_n are independent iff the only way to write 0 as a sum u_1 + ⋯ + u_n with each u_i ∈ U_i is the trivial one with every u_i = 0. Independence implies that any pair of the subspaces intersects only in the trivial subspace, but pairwise trivial intersections do not imply independence for three or more subspaces.

  • U_1, …, U_n are independent iff the trivial combination is the unique vanishing sum.
  • U_1, …, U_n are independent iff every subspace contributes to the span of the subspace sum.

Construction of Vector Spaces


Union and Intersection of Subspaces

The intersection of any collection of subspaces of V is itself a subspace. The union of two subspaces, by contrast, is a subspace only when one of them contains the other.

Sum of Subspaces


The sum of subspaces U_1, …, U_n is the set of all vector sums with summands drawn from each corresponding subspace.[19]

U_1 + U_2 + ⋯ + U_n = {u_1 + u_2 + ⋯ + u_n : u_i ∈ U_i}

Direct Sum of Subspaces


A list of subspaces U_1, …, U_n is independent if the only vanishing sum u_1 + ⋯ + u_n = 0 with each u_i ∈ U_i is the trivial one. For a pair of subspaces, independence is equivalent to their intersection being the trivial subspace.

The direct sum of subspaces is the sum of independent subspaces, and is written:[20]

U_1 ⊕ U_2 ⊕ ⋯ ⊕ U_n
  • Every vector in the direct sum has a unique decomposition v = u_1 + u_2 + ⋯ + u_n with u_i ∈ U_i.
  • For a pair of subspaces, V = U ⊕ W iff V = U + W and U ∩ W = {0}.
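A standard counterexample shows that pairwise trivial intersections alone do not guarantee independence of three or more subspaces: the three distinct lines through the origin in F_2^2 intersect pairwise only in {0}, yet a nonzero vector from each line sums to zero.

```python
# The three one-dimensional subspaces (lines) of F_2^2, as element sets.
U1 = {(0, 0), (1, 0)}
U2 = {(0, 0), (0, 1)}
U3 = {(0, 0), (1, 1)}

# Pairwise intersections are trivial.
assert U1 & U2 == {(0, 0)}
assert U1 & U3 == {(0, 0)}
assert U2 & U3 == {(0, 0)}

# Yet a non-trivial vanishing sum exists (mod 2), so the three lines
# are not independent: (1,0) + (0,1) + (1,1) = (0,0).
s = tuple((a + b + c) % 2 for a, b, c in zip((1, 0), (0, 1), (1, 1)))
print(s)  # (0, 0)
```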

Cartesian Product of Subspaces


Let V_1, …, V_n be vector spaces over the same field. Then the cartesian product of these spaces is defined as the set of all lists whose indexed elements are drawn from their corresponding vector spaces:

V_1 × V_2 × ⋯ × V_n = {(v_1, v_2, …, v_n) : v_i ∈ V_i}
  • Addition and scaling are defined componentwise, making the product itself a vector space.

Maps on Vector Spaces


A mapping between vector spaces which preserves vector addition and scaling may be known as a linear map, linear operator, homomorphism, or linear transformation.

Let V and W be vector spaces over a ring or field F. Then a linear map T : V → W is defined as any mapping such that:

T(au + bv) = aT(u) + bT(v)

Where a, b ∈ F, and u, v ∈ V.
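A finite-dimensional linear map can be represented by a matrix, with application given by matrix-vector multiplication; the sketch below checks the defining property for one arbitrary example matrix. `apply` is an illustrative helper name.

```python
# A linear map T : F^2 -> F^2 represented by a matrix (list of rows).
# apply is an illustrative helper name for matrix-vector multiplication.

def apply(M, v):
    """Apply the matrix M to the vector v."""
    return tuple(sum(M[i][j] * v[j] for j in range(len(v)))
                 for i in range(len(M)))

T = [[2, 0],
     [1, 3]]  # an arbitrary example matrix

u, v, a, b = (1, 0), (0, 1), 5, 7

# Linearity: T(au + bv) = aT(u) + bT(v)
lhs = apply(T, tuple(a * ui + b * vi for ui, vi in zip(u, v)))
rhs = tuple(x + y for x, y in zip(apply(T, tuple(a * ui for ui in u)),
                                  apply(T, tuple(b * vi for vi in v))))
assert lhs == rhs

print(apply(T, (1, 1)))  # (2, 4)
```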

Observations

  • Linear combinations in V are mapped to linear combinations in W.
  • Group homomorphism implies identity is mapped to identity, and inverses are mapped to inverses: T(0) = 0 and T(−v) = −T(v).

Inner Product Spaces


Let V be a vector space over the field F = R or C. An inner product is a mapping ⟨·, ·⟩ : V × V → F such that:

  ⟨au + bv, w⟩ = a⟨u, w⟩ + b⟨v, w⟩ (Linearity in the first argument)
  ⟨u, v⟩ = ⟨v, u⟩* (Conjugate symmetry, where * denotes complex conjugation)
  ⟨v, v⟩ > 0 for all v ≠ 0 (Positive-definite)

A vector space for which an inner product can be defined is called an inner product space.

Examples


Dot Product


Let u, v be from the vector space F^n over a field F. Then the dot product is a map ⋅ : F^n × F^n → F such that:

u ⋅ v = u_1 v_1 + u_2 v_2 + ⋯ + u_n v_n

For vector spaces where the dot product qualifies as an inner product, it induces the Euclidean norm ‖v‖ = √(v ⋅ v), and hence the Euclidean distance.
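On R^n the dot product is an inner product, and the norm it induces recovers familiar Euclidean lengths; `dot` and `norm` are illustrative helper names.

```python
# The dot product on R^n and the Euclidean norm it induces.
import math

def dot(u, v):
    """Sum of componentwise products."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    """Euclidean norm: the square root of v . v."""
    return math.sqrt(dot(v, v))

print(dot((1, 2, 3), (4, 5, 6)))  # 32
print(norm((3, 4)))               # 5.0
```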

Affine Subset


Let U be a subspace of the vector space V, and let v ∈ V. Then an affine subset can be defined as the set:[21]

v + U = {v + u : u ∈ U}

Any affine subset v + U is defined as parallel to U.[22]

Vector Space of Affine Subsets


Let U be a subspace of the vector space V, and let v, w ∈ V and a ∈ F.

(v + U) + (w + U) = (v + w) + U
a(v + U) = (av) + U

Quotient Space


The quotient space V/U is defined as the set of affine subsets of V which are parallel to U.[23]

V/U = {v + U : v ∈ V}
  • If V is finite-dimensional, then dim(V/U) = dim V − dim U.

Quotient Map


Let U be a linear subspace of V. The quotient map π : V → V/U is defined:

π(v) = v + U
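The quotient map can be sketched in a small finite case: take V = F_2^2 and U = span{(1, 1)}, with each coset represented as the set of its elements. `pi` is an illustrative helper name for the quotient map.

```python
# The quotient map pi(v) = v + U in F_2^2, with U = span{(1, 1)}.
# Cosets are represented as frozensets of their elements.

U = {(0, 0), (1, 1)}

def pi(v):
    """Quotient map: send v to its coset v + U (arithmetic mod 2)."""
    return frozenset(tuple((vi + ui) % 2 for vi, ui in zip(v, u)) for u in U)

# Vectors differing by an element of U land in the same coset.
assert pi((1, 0)) == pi((0, 1))

# V/U partitions F_2^2 into |F_2^2| / |U| = 2 cosets.
cosets = {pi(v) for v in [(0, 0), (0, 1), (1, 0), (1, 1)]}
print(len(cosets))  # 2
```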

Notes

  1. ^ A set with these properties is known as an abelian or commutative group.
  2. ^ For the finite field of two elements F_2 we have 1 + 1 = 0.
  3. ^ The zero or trivial polynomial is the polynomial that maps all inputs to zero.
  4. ^ Stacks Project (2015). Let R be a ring. Let M be a finite R-module. Let f : M → M be a surjective R-module map. Then f is an isomorphism.

Citations

  1. ^ nLab (2020) Vector space.
  2. ^ Axler (2015) p. 12, § 1.19
  3. ^ Gallian (2012) p. 351 ch. 19: Vector Spaces.
  4. ^ Katznelson & Katznelson (2008) p. 4, § 1.2.1
  5. ^ ProofWiki (2021) Definition: Vector space.
  6. ^ Axler (2015) p. 28, § 2.3
  7. ^ Axler (2015) pp. 29-30, §§ 2.5, 2.8
  8. ^ Roman (2005) pp. 41-42, ch. 2
  9. ^ Hefferon (2020) p. 100, ch. 2, Definition 2.13
  10. ^ Axler (2015) p. 29, § 2.7
  11. ^ Halmos (1974) pp. 17, § 11
  12. ^ Axler (2015) pp. 32-33, §§ 2.17, 2.19
  13. ^ Katznelson & Katznelson (2008) p. 14, § 1.3.2
  14. ^ a b Gallian (2012) p. 353, ch. 19: Linear Independence.
  15. ^ Strang (2016) p. 168, § 3.4
  16. ^ Strang (2016) pp. 170-171, § 3.4
  17. ^ Gallian (2012) p. 355, ch 19: Linear Independence.
  18. ^ Gallian (2012) p. 352, ch 19: Subspaces.
  19. ^ Axler (2015) p. 20, § 1.36
  20. ^ Axler (2015) p. 21 § 1.40
  21. ^ Axler (2015) p. 94, § 3.79
  22. ^ Axler (2015) p. 94, § 3.81
  23. ^ Axler (2015) p. 95, § 3.83

Sources


Textbook

  • Axler, Sheldon Jay (2015). Linear Algebra Done Right (3rd ed.). Springer. ISBN 978-3-319-11079-0.
  • Gallian, Joseph A. (2012). Contemporary Abstract Algebra (8th ed.). Cengage. ISBN 978-1-133-59970-8.
  • Halmos, Paul Richard (1974) [1958]. Finite-Dimensional Vector Spaces (2nd ed.). Springer. ISBN 0-387-90093-4.
  • Hefferon, Jim (2020). Linear Algebra (4th ed.). Orthogonal Publishing. ISBN 978-1-944325-11-4.
  • Katznelson, Yitzhak; Katznelson, Yonatan R. (2008). A (Terse) Introduction to Linear Algebra. American Mathematical Society. ISBN 978-0-8218-4419-9.
  • Roman, Steven (2005). Advanced Linear Algebra (2nd ed.). Springer. ISBN 0-387-24766-1.
  • Strang, Gilbert (2016). Introduction to Linear Algebra (5th ed.). Wellesley Cambridge Press. ISBN 978-0-9802327-7-6.