Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares

This groundbreaking textbook combines straightforward explanations with a wealth of practical examples to offer an innovative approach to teaching linear algebra. Requiring no prior knowledge of the subject, it covers the aspects of linear algebra (vectors, matrices, and least squares) that are needed for engineering applications, discussing examples across data science, machine learning and artificial intelligence, signal and image processing, tomography, navigation, control, and finance. The numerous practical exercises throughout allow students to test their understanding and translate their knowledge into solving real-world problems, with lecture slides, additional computational exercises in Julia and MATLAB®, and data sets accompanying the book online. Suitable for both one-semester and one-quarter courses, as well as self-study, this self-contained text provides beginning students with the foundation they need to progress to more advanced study.
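The book's central computational topic, the least squares problem (minimize ‖Ax − b‖²), can be illustrated in a few lines. The sketch below uses Python with NumPy rather than the Julia/MATLAB of the book's companion exercises, and the data points are invented for illustration:

```python
import numpy as np

# Toy overdetermined problem: fit a line y ≈ c0 + c1*t to four noisy data points.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.9, 5.1, 7.0])

# Coefficient matrix A with a column of ones (intercept) and the times t.
A = np.column_stack([np.ones_like(t), t])

# Solve the least squares problem: minimize ||A x - y||^2.
x, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
print(x)  # x[0] is the fitted intercept, x[1] the fitted slope
```

The same computation is written `A \ y` in Julia or MATLAB; all three rely on a numerically stable factorization of A rather than forming the normal equations explicitly.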
Contents
Linear functions  29
Norm and distance  45
Clustering  69
Linear independence  89
Matrices  107
Matrix examples  129
Linear equations  147
Linear dynamical systems  163
Least squares data fitting  245
Least squares classification  285
Multiobjective least squares  309
Constrained least squares  339
Constrained least squares applications  357
Nonlinear least squares  381
Constrained nonlinear least squares  419
Appendix A: Notation  439