Vector Space

A vector space is a collection of vectors that remains closed under both vector addition and scalar multiplication. It adheres to the principles of associativity, commutativity, and distributivity, and guarantees the presence of a zero vector and additive inverses.

What is a Vector Space?

To understand what a vector space is, imagine you have two non-empty sets:

  • A set S consisting of elements called "vectors"
  • A set R consisting of real numbers called "scalars"

The set S includes elements referred to as "vectors," but these are not necessarily vectors in the traditional sense.

For example, you can form a vector space using vectors in the plane \( \mathbb{R}^2 \) or space \( \mathbb{R}^3 \), but you can also construct a vector space consisting of matrices \( M(p, q, \mathbb{R}) \). Therefore, vector spaces are not limited to vectors alone.

In the set S, two binary operations are defined:

  • Addition
    If you take two vectors in the vector space and add them, the result is still a vector in the vector space. $$ \forall \ v,w \in S \Rightarrow v+w \in S $$
  • Scalar Multiplication
    If you take a vector in the vector space and multiply it by a number (a scalar), the result is still a vector in the vector space. $$ \forall \ v \in S \ , \ k \in R \Rightarrow k \cdot v \in S $$

These two operations are called "binary" because they take two elements as input, at least one of which is from the set S (vectors).

Addition is the internal binary operation because it uses two vectors from the set S and returns a result that still belongs to the set S. Scalar multiplication, on the other hand, is the external binary operation because it uses a vector from the set S and a scalar number k from the set R.
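The two operations can be sketched in Python, representing vectors of \( \mathbb{R}^2 \) as plain tuples (the helper names `add` and `scale` are illustrative, not part of any standard library):

```python
# Minimal sketch: vectors of R^2 as tuples. Both operations return
# another tuple in R^2, illustrating closure.

def add(v, w):
    """Internal binary operation: R^2 x R^2 -> R^2."""
    return (v[0] + w[0], v[1] + w[1])

def scale(k, v):
    """External binary operation: R x R^2 -> R^2."""
    return (k * v[0], k * v[1])

v, w = (1, 2), (1, 3)
print(add(v, w))    # (2, 5)
print(scale(3, v))  # (3, 6)
```

Both results are again pairs of real numbers, i.e. elements of \( \mathbb{R}^2 \), which is exactly what the closure conditions require.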

The set S, with the operations of addition and scalar multiplication, is defined as a vector space over R if it satisfies the following properties (in the formulas below, V denotes the set S of vectors and \( \mathbb{R} \) the scalars):

  1. Commutativity of Addition
    The order in which you add two vectors does not change the result (i.e., v + w = w + v for all vectors v and w).

    $$ v + w = w + v \quad \text{for all} \ v, w \in V $$

  2. Associativity of Addition
    If you add three vectors, it doesn't matter in which order you add them (i.e., (u + v) + w = u + (v + w) for all vectors u, v, and w).

    $$ (u + v) + w = u + (v + w) \quad \text{for all} \ u, v, w \in V $$

  3. Existence of a Zero Vector
    There is a special vector, called the zero vector, that when added to any other vector does not change the other vector (i.e., v + 0 = v for all vectors v).

    $$ \exists \ 0 \in V \ \text{such that} \ v + 0 = v \quad \text{for all} \ v \in V $$

  4. Existence of an Opposite Vector
    For every vector, there is another vector that, when added, gives the zero vector. That is, for every vector v, there exists a vector -v such that v + (-v) = 0.

    $$ \forall v \in V, \ \exists \ (-v) \in V \ \text{such that} \ v + (-v) = 0 $$

  5. Associativity of Scalar Multiplication
    If you multiply a vector by two scalars, it doesn't matter in which order you do it (i.e., a * (b * v) = (a * b) * v for all vectors v and scalars a, b).

    $$ a(bv) = (ab)v \quad \text{for all} \ v \in V \ \text{and for all} \ a, b \in \mathbb{R} $$

  6. Distributivity of Scalar Multiplication over Vector Addition
    If you add two vectors and then multiply the result by a scalar, you get the same result as if you multiply each vector by the scalar and then add the results (i.e., a * (v + w) = a * v + a * w for all vectors v, w and scalars a).

    $$ a(v + w) = av + aw \quad \text{for all} \ v, w \in V \ \text{and for all} \ a \in \mathbb{R} $$

  7. Distributivity of Scalar Multiplication over Scalar Addition
    If you add two scalars and then multiply a vector by the result, you get the same result as if you multiply the vector by each scalar and then add the results (i.e., (a + b) * v = a * v + b * v for all vectors v and scalars a, b).

    $$ (a + b)v = av + bv \quad \text{for all} \ v \in V \ \text{and for all} \ a, b \in \mathbb{R} $$

  8. Scalar Multiplication with 1
    Multiplying a vector by the scalar 1 does not change the vector (i.e., 1 * v = v for all vectors v).

    $$ 1v = v \quad \text{for all} \ v \in V $$

If a set, equipped with these two operations, satisfies all of these axioms (together with the two closure conditions above), then it is a vector space.
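The eight axioms can be spot-checked numerically. The sketch below tests them on random integer vectors of \( \mathbb{R}^2 \); note that a finite number of samples illustrates the axioms but does not prove the "for all" quantifiers (the helper names `add` and `scale` are again illustrative):

```python
import random

def add(v, w):
    return (v[0] + w[0], v[1] + w[1])

def scale(k, v):
    return (k * v[0], k * v[1])

zero = (0, 0)
for _ in range(100):
    u, v, w = [(random.randint(-9, 9), random.randint(-9, 9)) for _ in range(3)]
    a, b = random.randint(-9, 9), random.randint(-9, 9)
    assert add(v, w) == add(w, v)                                 # 1. commutativity
    assert add(add(u, v), w) == add(u, add(v, w))                 # 2. associativity
    assert add(v, zero) == v                                      # 3. zero vector
    assert add(v, scale(-1, v)) == zero                           # 4. opposite vector
    assert scale(a, scale(b, v)) == scale(a * b, v)               # 5. scalar associativity
    assert scale(a, add(v, w)) == add(scale(a, v), scale(a, w))   # 6. distributivity over vectors
    assert scale(a + b, v) == add(scale(a, v), scale(b, v))       # 7. distributivity over scalars
    assert scale(1, v) == v                                       # 8. identity scalar
print("all eight axioms hold on the samples")
```

Integer coordinates are used deliberately so that every equality check is exact, with no floating-point rounding.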

How to Verify if n Vectors Form the Basis of a Vector Space

To determine whether a set of \( n \) vectors forms the basis of a vector space, you can follow two main approaches.

  • Check that all eight vector space properties are satisfied
    This method involves verifying that the vectors satisfy the associativity and commutativity of addition, the existence of the neutral and opposite elements, the distributive properties, and the associativity of scalar multiplication, and so on.
  • Check that the \( n \) vectors generating the space are linearly independent
    This alternative method is much more efficient. The set of all linear combinations of \( n \) vectors (their span) automatically satisfies the vector space properties; to conclude that the \( n \) vectors form a basis of it, it is enough to verify that they are linearly independent, meaning that none of them can be written as a linear combination of the others.

    Given \( n \) vectors \(\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n\), they are linearly independent if and only if the only linear combination that produces the zero vector is the one in which all coefficients are zero: $$ a_1\mathbf{v}_1 + a_2\mathbf{v}_2 + \ldots + a_n\mathbf{v}_n = \mathbf{0} \Leftrightarrow a_1 = a_2 = \ldots = a_n = 0 $$ If this condition is met, the vectors are said to be linearly independent.

In other words, every vector in the space can be expressed as a unique linear combination of the basis vectors.

Therefore, by demonstrating the linear independence of the vectors, you can conclude that they form a basis of the space they generate, without having to explicitly verify all the properties.

This approach significantly simplifies the verification process, making it more practical and faster in many problems.
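In practice, the independence check reduces to a rank computation: \( n \) vectors are linearly independent if and only if the matrix having them as columns has rank \( n \). A sketch with NumPy (the helper name `are_independent` is illustrative):

```python
import numpy as np

def are_independent(vectors):
    """True iff the given vectors are linearly independent."""
    A = np.column_stack(vectors)  # one vector per column
    return np.linalg.matrix_rank(A) == len(vectors)

print(are_independent([(1, 2), (1, 3)]))  # True: they form a basis of R^2
print(are_independent([(1, 2), (2, 4)]))  # False: (2, 4) = 2 * (1, 2)
```

Rank equal to \( n \) means the homogeneous system \( a_1\mathbf{v}_1 + \ldots + a_n\mathbf{v}_n = \mathbf{0} \) admits only the trivial solution, which is exactly the definition above.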

Example

Consider the vectors \(\mathbf{v} = (1, 2)\) and \(\mathbf{w} = (1, 3)\) as a basis.

The vector space generated by these two vectors is the set of all linear combinations of \(\mathbf{v}\) and \(\mathbf{w}\).

Formally, this space is defined as:

\[
\text{span}(\{\mathbf{v}, \mathbf{w}\}) = \{ a\mathbf{v} + b\mathbf{w} \mid a, b \in \mathbb{R} \}
\]

To find the space generated by \(\mathbf{v}\) and \(\mathbf{w}\), we must consider all possible linear combinations of the two vectors.

A linear combination of \(\mathbf{v}\) and \(\mathbf{w}\) has the form:

\[
\mathbf{u} = a\mathbf{v} + b\mathbf{w} = a(1, 2) + b(1, 3) = (a + b, 2a + 3b)
\]

For example, if \(a = 1\) and \(b = 0\), the vector (1,2) is generated.

\[
\mathbf{u} = 1\mathbf{v} + 0\mathbf{w} = 1(1, 2) + 0(1, 3) = (1, 2)
\]

If \(a = 0\) and \(b = 1\), the vector (1,3) is generated.

\[
\mathbf{u} = 0\mathbf{v} + 1\mathbf{w} = 0(1, 2) + 1(1, 3) = (1, 3)
\]

If \(a = 1\) and \(b = 1\), the vector (2,5) is generated.

\[
\mathbf{u} = 1\mathbf{v} + 1\mathbf{w} = 1(1, 2) + 1(1, 3) = (1 + 1, 2 + 3) = (2, 5)
\]

If \(a = 2\) and \(b = -1\), the vector (1,1) is generated.

\[
\mathbf{u} = 2\mathbf{v} + (-1)\mathbf{w} = 2(1, 2) + (-1)(1, 3) = (2 - 1, 4 - 3) = (1, 1)
\]
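The four sample combinations above can be reproduced with a short script, using the explicit formula \( \mathbf{u} = (a + b, \ 2a + 3b) \) derived earlier (the helper name `combo` is illustrative):

```python
def combo(a, b):
    # u = a*(1, 2) + b*(1, 3) = (a + b, 2a + 3b)
    return (a + b, 2 * a + 3 * b)

for a, b in [(1, 0), (0, 1), (1, 1), (2, -1)]:
    print(f"a={a}, b={b} -> {combo(a, b)}")
# a=1, b=0 -> (1, 2)
# a=0, b=1 -> (1, 3)
# a=1, b=1 -> (2, 5)
# a=2, b=-1 -> (1, 1)
```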

In general, the vector space generated by the vectors \(\mathbf{v} = (1, 2)\) and \(\mathbf{w} = (1, 3)\) is the entire plane \( \mathbb{R}^2 \), since every vector in the plane can be expressed as a linear combination of \(\mathbf{v}\) and \(\mathbf{w}\).
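This claim can be checked directly: for an arbitrary target vector \( (x, y) \in \mathbb{R}^2 \), the system \( a + b = x \), \( 2a + 3b = y \) always has the unique solution

\[
a = 3x - y, \qquad b = y - 2x
\]

since \( a + b = (3x - y) + (y - 2x) = x \) and \( 2a + 3b = (6x - 2y) + (3y - 6x) = y \). So every vector of the plane is a linear combination of \(\mathbf{v}\) and \(\mathbf{w}\).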

Graphically, we can represent these vectors in the Cartesian plane to visualize the space they generate.

The vectors \(\mathbf{v} = (1, 2)\) and \(\mathbf{w} = (1, 3)\) are not parallel, so the space generated by them is two-dimensional, i.e. all of \( \mathbb{R}^2 \).

Proof of Linear Independence

To prove that \(\mathbf{v}\) and \(\mathbf{w}\) indeed generate the whole plane, we must show that they are linearly independent.

Two vectors \(\mathbf{v}\) and \(\mathbf{w}\) are linearly independent if the only solution to the equation \(a\mathbf{v} + b\mathbf{w} = \mathbf{0}\) is \(a = 0\) and \(b = 0\).

Consider the equation of the linear combination of the two vectors:

\[
a(1, 2) + b(1, 3) = (0, 0)
\]

This translates to a system of equations:

\[
\begin{cases}
a + b = 0 \\
2a + 3b = 0
\end{cases}
\]

Solving the first equation of the system gives us \(b = -a\).

\[
\begin{cases}
b = -a \\
2a + 3b = 0
\end{cases}
\]

Substituting \(b = -a\) into the second equation:

\[
\begin{cases}
b = -a \\
2a + 3(-a) = 0
\end{cases}
\]

\[
\begin{cases}
b = -a \\
a = 0
\end{cases}
\]

Substituting \(a = 0\) into the first equation:

\[
\begin{cases}
b = 0 \\
a = 0
\end{cases}
\]

Since the only solution is \(a = 0\) and \(b = 0\), the vectors \(\mathbf{v}\) and \(\mathbf{w}\) are linearly independent.

This confirms that the two vectors form a basis of \( \mathbb{R}^2 \).
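For two vectors of \( \mathbb{R}^2 \), the same conclusion follows from a single determinant: the homogeneous system \( a\mathbf{v} + b\mathbf{w} = \mathbf{0} \) has only the trivial solution exactly when the determinant of the matrix with \(\mathbf{v}\) and \(\mathbf{w}\) as columns is nonzero. A sketch (the helper name `det2` is illustrative):

```python
def det2(v, w):
    """Determinant of the 2x2 matrix with columns v and w."""
    return v[0] * w[1] - v[1] * w[0]

v, w = (1, 2), (1, 3)
print(det2(v, w))  # 1 -> nonzero, so v and w are linearly independent
```

Here \( \det = 1 \cdot 3 - 2 \cdot 1 = 1 \neq 0 \), which matches the result of the substitution method above.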



