Mathematics LibreTexts

About 79 results
  • https://math.libretexts.org/Courses/SUNY_Schenectady_County_Community_College/MAT_149%3A_Topics_in_Finite_Mathematics_(Holz)/05%3A_Probability/5.04%3A_Markov_Chains/5.4.01%3A_Stochastic_Matrices
    This section is devoted to one common kind of application of eigenvalues: to the study of difference equations, in particular to Markov chains. We will introduce stochastic matrices, which encode this type of difference equation, and will cover in detail the most famous example of a stochastic matrix: the Google Matrix.
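The core idea behind this entry, repeatedly applying a stochastic matrix to a probability vector until it settles on a steady state, can be sketched in NumPy. The 2-state transition matrix below is an illustrative example, not the Google matrix itself:

```python
import numpy as np

# A column-stochastic matrix: nonnegative entries, each column sums to 1.
# Hypothetical 2-state Markov chain chosen for illustration.
A = np.array([[0.7, 0.4],
              [0.3, 0.6]])

x = np.array([1.0, 0.0])   # initial probability distribution
for _ in range(50):        # repeated application converges to the steady state
    x = A @ x

# The steady state is the eigenvector of A for eigenvalue 1,
# normalized so its entries sum to 1; here it is (4/7, 3/7).
```

Because the second eigenvalue of this matrix is 0.3, fifty iterations are far more than enough for convergence to machine precision.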
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/00%3A_Front_Matter
  • https://math.libretexts.org/Courses/Mission_College/MAT_04C_Linear_Algebra_(Kravets)/05%3A_Vector_Spaces_and_Subspaces/5.03%3A_Basis_and_Dimension
    In order to show that B is a basis for V, we must prove that V = Span{v1, v2, …, vm}. If not, then there exists some vector vm+1 in V that is not contained in Span{v1, v2, …, vm}. By the increasing span criterion, Theorem 2.5.2 in Section 2.5, the set {v1, v2, …, vm, vm+1} is also linearly independent.
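The step quoted in this entry, that a vector outside Span{v1, …, vm} extends a linearly independent set to a larger one, can be checked numerically with matrix rank. The specific vectors below are illustrative, not taken from the page:

```python
import numpy as np

# Hypothetical vectors in R^3; v1 and v2 are linearly independent.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])  # not contained in Span{v1, v2}

# rank == number of vectors  <=>  the vectors are linearly independent
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

# Adjoining a vector outside the span keeps the enlarged set independent
# (the "increasing span criterion" the entry refers to).
assert np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) == 3
```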
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/02%3A_Systems_of_Linear_Equations-_Geometry/2.07%3A_Basis_and_Dimension
    This page discusses the concept of a basis for subspaces in linear algebra, emphasizing the requirements of linear independence and spanning. It covers the basis theorem, providing examples of finding bases in various dimensions, including specific cases like planes defined by equations. The text explains properties of subspaces such as the column space and null space of matrices, illustrating methods for finding bases and verifying their dimensions.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/01%3A_Systems_of_Linear_Equations-_Algebra
    This page discusses the algebraic study of linear equations, detailing methods for solving them, particularly through row reduction. It explains a systematic approach to solving equations and how to express solutions in parametric form. The content is organized into sections that build foundational knowledge on linear equations, algorithms for solutions, and solution representation.
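The row-reduction procedure this entry describes can be sketched as a small Gauss-Jordan elimination. This is a minimal illustration for square, invertible systems, not the page's own algorithm, which also handles free variables and parametric solutions:

```python
import numpy as np

def solve_by_row_reduction(A, b):
    """Solve Ax = b for a square, invertible A by Gauss-Jordan
    elimination with partial pivoting on the augmented matrix."""
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for k in range(n):
        # Pivot: swap in the row with the largest entry in column k.
        p = k + np.argmax(np.abs(M[k:, k]))
        M[[k, p]] = M[[p, k]]
        M[k] /= M[k, k]                 # scale the pivot row to get a leading 1
        for i in range(n):
            if i != k:
                M[i] -= M[i, k] * M[k]  # clear column k in every other row
    return M[:, -1]                     # last column of the RREF is the solution
```

For example, `solve_by_row_reduction(np.array([[2., 1.], [1., 3.]]), np.array([3., 5.]))` returns `[0.8, 1.4]`.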
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/05%3A_Eigenvalues_and_Eigenvectors/5.01%3A_Eigenvalues_and_Eigenvectors
    This page explains eigenvalues and eigenvectors in linear algebra, detailing their definitions, significance, and processes for finding them. It discusses how eigenvectors arise from matrix transformations and the linear independence of eigenvectors with distinct eigenvalues. The text covers specific examples, including eigenvalue analysis for particular matrices and the case where zero is an eigenvalue.
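The defining property the entry is about, that an eigenvector is only scaled by the matrix, is easy to verify numerically. The matrix below is an illustrative choice, not one from the page:

```python
import numpy as np

# A small symmetric matrix with eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose COLUMNS
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Defining property: A v = lambda v for each eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```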
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/05%3A_Eigenvalues_and_Eigenvectors/5.03%3A_Diagonalization
    This page covers diagonalizability of matrices, explaining that a matrix is diagonalizable if it can be expressed as A = CDC⁻¹ with D diagonal. It discusses the Diagonalization Theorem, eigenspaces, eigenvalues, and the significance of linear independence among eigenvectors. Multiple diagonal forms can arise, while geometric and algebraic multiplicities influence diagonalizability.
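The factorization A = CDC⁻¹ described in this entry can be reconstructed directly from the eigendecomposition. A symmetric matrix is used below because it is guaranteed diagonalizable; the particular numbers are illustrative:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric, hence diagonalizable

eigenvalues, C = np.linalg.eig(A)   # columns of C are eigenvectors
D = np.diag(eigenvalues)

# Diagonalization Theorem: A = C D C^{-1} when the eigenvectors
# are linearly independent.
assert np.allclose(A, C @ D @ np.linalg.inv(C))

# Payoff: powers are cheap, since A^k = C D^k C^{-1}.
assert np.allclose(np.linalg.matrix_power(A, 5),
                   C @ np.diag(eigenvalues**5) @ np.linalg.inv(C))
```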
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/03%3A_Linear_Transformations_and_Matrix_Algebra/3.02%3A_One-to-one_and_Onto_Transformations
    This page discusses the concepts of one-to-one and onto transformations in linear algebra, focusing on matrix transformations. It defines one-to-one as each output having at most one input and outlines examples and theorems related to this property. The text emphasizes that a transformation is onto if every output corresponds to some input.
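For matrix transformations, the properties this entry defines reduce to rank conditions, which makes them easy to test numerically. The helper names and the example matrix below are hypothetical, introduced only for illustration:

```python
import numpy as np

def is_one_to_one(A):
    # x -> Ax is one-to-one iff the columns of A are linearly
    # independent, i.e. rank(A) equals the number of columns.
    return np.linalg.matrix_rank(A) == A.shape[1]

def is_onto(A):
    # x -> Ax is onto iff the columns span the codomain,
    # i.e. rank(A) equals the number of rows.
    return np.linalg.matrix_rank(A) == A.shape[0]

# A 3x2 matrix with independent columns: one-to-one, but not onto R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
```

Its transpose shows the mirror case: a 2x3 matrix of rank 2 is onto R² but cannot be one-to-one.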
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/03%3A_Linear_Transformations_and_Matrix_Algebra/3.03%3A_Linear_Transformations
    This page covers linear transformations and their connections to matrix transformations, defining the properties necessary for linearity and providing examples of both linear and non-linear transformations. It highlights the importance of the zero vector, standard coordinate vectors, and defines transformations like rotations, dilations, and the identity transformation.
  • https://math.libretexts.org/Bookshelves/Linear_Algebra/Interactive_Linear_Algebra_(Margalit_and_Rabinoff)/06%3A_Orthogonality/6.01%3A_Dot_Products_and_Orthogonality
    This page covers the concepts of dot product, vector length, distance, and orthogonality within vector spaces. It defines the dot product mathematically in ℝⁿ and explains properties like commutativity and distributivity. Length is derived from the dot product, and the distance between points is defined as the length of the connecting vector. Unit vectors are introduced, and orthogonality is defined as having a dot product of zero.
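The definitions this entry lists, length from the dot product, distance as the length of the connecting vector, and orthogonality as a zero dot product, translate directly into NumPy. The vectors below are an illustrative pair, not from the page:

```python
import numpy as np

u = np.array([3.0, 4.0])
v = np.array([-4.0, 3.0])

# Length comes from the dot product: ||u|| = sqrt(u . u); here 5.0.
length = np.sqrt(u @ u)

# Distance between two points is the length of the connecting vector.
dist = np.linalg.norm(u - v)

# Orthogonality: dot product equal to zero.
assert u @ v == 0

# Unit vector in the direction of u has length 1.
u_hat = u / length
assert np.isclose(u_hat @ u_hat, 1.0)
```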
  • https://math.libretexts.org/Courses/Mission_College/MAT_04C_Linear_Algebra_(Kravets)/01%3A_Systems_of_Linear_Equations/1.05%3A_Vector_Equations_and_Spans
    The thing we really care about is solving systems of linear equations, not solving vector equations. The whole point of vector equations is that they give us a different, and more geometric, way of viewing systems of linear equations.
