### 2240 class highlights

• Fri May 2
clicker. Share the final research presentation topic with the rest of the class (name, major(s), concentrations/minors, research project idea, and whether you prefer to go 1st, 2nd, or have no preference).
Informal evaluations while I check in about the projects, and then formal evaluations.
• Wed Apr 30
Take questions on the final research presentations
Hamburger earmuffs and the pickle matrix
clicker.
April is Mathematics Awareness Month - the theme is magic, mystery and mathematics. Have the class give me a 3x3 matrix A. Look at
h,P:=Eigenvectors(A);
MatrixInverse(P).A.P;
which (ta da) has the eigenvalues on the diagonal (when the columns of P form a basis for R^n) - the definition of diagonalizability. [We can uncover the mystery and apply this to computer graphics.]
Applications to mathematical physics, quantum chemistry...
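The MatrixInverse(P).A.P computation above can be mirrored without Maple. A minimal pure-Python sketch for a hypothetical 2x2 example (the matrix A=[[2,1],[1,2]] and its eigenvectors are chosen by hand for illustration, not taken from class):

```python
from fractions import Fraction as F

def matmul(X, Y):
    # 2x2 matrix product
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inverse2(M):
    # 2x2 inverse via the adjugate formula
    a, b = M[0]
    c, d = M[1]
    det = F(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 2]]   # hypothetical example: eigenvalues 3 and 1
P = [[1, 1], [1, -1]]  # columns are the eigenvectors (1,1) and (1,-1)
D = matmul(inverse2(P), matmul(A, P))
print(D)               # equals [[3, 0], [0, 1]]: eigenvalues on the diagonal
```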
• Mon Apr 28
big picture discussion
clicker survey questions
Discuss the final research presentations.
• Fri Apr 25 Test 3
• Wed Apr 23 Take questions. If time remains:
eigenvector clicker review
3.3 and 2.8 clickers #6-8
• Fri Apr 18
Finish Dynamical Systems and Eigenvectors
Review: Ax=b solutions when mult makes sense, when b=0, when A is invertible, when A=Matrix([[1,0],[0,1],[0,0]]) (and column space and null space). Elementary matrix.
eigenvector decomposition clickers 2 #3-5

• Wed Apr 16
Review eigenvectors and eigenvalues:
definition (algebra and geometry)
What equations have we seen
Why we use det(A-lambdaI)=0
Why we use the eigenvector decomposition versus high powers of A for long-term behavior (reliability)
Continue Dynamical Systems and Eigenvectors
Highlight predator prey, predator predator or cooperative systems (where cooperation leads to sustainability)
eigenvector decomposition clickers 2 #1 and 2

• Mon Apr 14
Dynamical Systems and Eigenvectors first example
eigenvector decomposition clickers 1
• Fri Apr 11
5.1 clicker questions
Finish Geometry of Eigenvectors and compare with Maple
>Ex4:=Matrix([[1/2,1/2],[1/2,1/2]]);
>Eigenvectors(Ex4);

Begin 5.6: Eigenvector decomposition for a diagonalizable matrix A_nxn [where the eigenvectors form a basis for all of R^n]
Foxes and Rabbits
If ___ equals 0 then we die off along the line____ [corresponding to the eigenvector____], and in all other cases we [choose one: die off or grow or hit and then stayed fixed] along the line____ [corresponding to the eigenvector____].
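The long-term behavior in the Ex4 example above can be seen by simply iterating x_{k+1} = A x_k; a short Python sketch (hypothetical starting populations, chosen only for illustration):

```python
# Iterating x_{k+1} = A x_k for the Ex4 matrix from class.
# Its eigenvalues are 1 (eigenvector (1,1)) and 0 (eigenvector (1,-1)),
# so any starting vector is projected onto the line y = x and then stays fixed.
A = [[0.5, 0.5], [0.5, 0.5]]

def step(A, x):
    # one matrix-vector multiplication
    return [A[0][0]*x[0] + A[0][1]*x[1],
            A[1][0]*x[0] + A[1][1]*x[1]]

x = [10.0, 2.0]        # hypothetical initial populations
for k in range(5):
    x = step(A, x)
    print(k + 1, x)    # [6.0, 6.0] from the first step on: hit and stay fixed
```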

• Wed Apr 9 3.3 and 2.8 clickers # 4 and 5
Continue 5.1: Review the algebra of eigenvectors and eigenvalues. [Ax=lambdax, vectors that are scaled on the same line through the origin, matrix multiplication is turned into scalar multiplication]. Solving Ax=lambdax algebraically using det(A-lambdaI)=0, then substituting each lambda into (A-lambdaI)x=0 to find a basis for the eigenspaces of A, equivalently the nullspace of (A-lambdaI).
Compute the eigenvectors of Matrix([[0,1],[1,0]]) by hand and compare with Maple's work.
Geometry of Eigenvectors and compare with Maple
>Ex1:=Matrix([[0,1],[1,0]]);
>Eigenvalues(Ex1);
>Eigenvectors(Ex1);
>Ex2:=Matrix([[0,1],[-1,0]]);
>Eigenvectors(Ex2);
>Ex3:=Matrix([[-1,0],[0,-1]]);
>Eigenvectors(Ex3);

• Mon Apr 7 3.3 and 2.8 clickers #1,2,3
Define eigenvalues and eigenvectors [Ax=lambdax, vectors that are scaled on the same line through the origin, matrix multiplication is turned into scalar multiplication].
Algebra: Show that we can solve Ax=lambdax using det(A-lambdaI)=0 and (A-lambdaI)x=0 (ie the nullspace of A-lambdaI).
Eigenvectors of Matrix([[0,0],[1,0]]); and the Eigenvectors command in Maple
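The det(A-lambdaI)=0 step above reduces, for a 2x2 matrix, to solving the quadratic lambda^2 - trace(A)*lambda + det(A) = 0. A small Python sketch using the class's matrix [[0,0],[1,0]]:

```python
# Characteristic polynomial of A = [[0,0],[1,0]] from class:
# det(A - lambda*I) = lambda^2 - (a+d)*lambda + (a*d - b*c).
# cmath handles the case of complex eigenvalues (e.g. rotation matrices).
import cmath

a, b, c, d = 0, 0, 1, 0          # entries of A
trace, det = a + d, a*d - b*c
disc = cmath.sqrt(trace**2 - 4*det)
lams = [(trace + disc) / 2, (trace - disc) / 2]
print(lams)                      # both roots equal 0: a repeated eigenvalue
```

The eigenspace for lambda=0 is then the nullspace of A itself, span{(0,1)}.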
• Fri Apr 4 2.8 using the matrix 123,456,789 and finding the Nullspace and ColumnSpace (using 2 methods - reducing the spanning equation with a vector of b1...bn, and separately by examining the pivots of the ORIGINAL matrix.)
• Wed Apr 2
Review the LaTeX Beamer slide
The relationship of row operations to the geometry of determinants - row operations can be seen as shear matrices when written as elementary matrix form, which preserve area, volume, etc...
Clicker questions #1-3
• Mon Mar 31 Past determinants clicker questions
Determinants including the 2x2 and 3x3 diagonals methods, and Laplace's expansion method (1772, expanding on Vandermonde's work) in general. [The general history dates back to the Chinese and to Leibniz.]
M:=Matrix([[a,b,c],[d,e,f],[g,h,i]]);
Determinant(M); MatrixInverse(M);
M:=Matrix([[a,b,c,d],[e,f,g,h],[i,j,k,l],[m,n,o,p]]);
Determinant(M); MatrixInverse(M);

LaTeX Beamer slides
Review the 2 determinant methods for the 123,456,789 matrix. Show that for a 4x4 matrix in Maple, only Laplace's method will work.
The connection of row operations to determinants
The determinant of A transpose and A triangular (such as in Gaussian form).
The determinant of A inverse via the determinant of the product of A and A inverse - so det A non-zero can be added into Theorem 8 in Chapter 2.
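The det(A inverse) fact above (det(A)*det(A^-1) = det(A.A^-1) = det(I) = 1) can be sanity-checked numerically; a minimal Python sketch with exact fractions, using a hypothetical 2x2 matrix:

```python
from fractions import Fraction as F

def det2(M):
    # 2x2 determinant ad - bc
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def inverse2(M):
    # 2x2 inverse via the adjugate formula
    d = F(det2(M))
    return [[M[1][1]/d, -M[0][1]/d], [-M[1][0]/d, M[0][0]/d]]

A = [[2, 1], [4, 3]]             # hypothetical example, det(A) = 2
Ainv = inverse2(A)
print(det2(A) * det2(Ainv))      # 1, so det(A^-1) = 1/det(A)
```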
• Fri Mar 28 Test 2
• Wed Mar 26 Take questions. Finish 2.3 clicker review Begin chapter 3 via mentioning google searches:
application of determinants in physics
application of determinants in economics
application of determinants in chemistry
application of determinants in computer science
Eight queens and determinants
Chapter 3 in Maple via MatrixInverse command for 2x2 and 3x3 matrices and then determinant work, including 2x2 and 3x3 diagonals methods

• Mon Mar 24 Finish 2.3 clicker review and take questions on the study guide
• Fri Mar 21
Clicker review of race track transformations
Begin Yoda (via the file yoda2.mw) with data from Kecskemeti B. Zoltan (Lucasfilm LTD) as on Tim's page
2.3 clicker review

• Wed Mar 19
Clicker review of linear transformations
Review of linear transformations of the plane, including homogeneous coordinates and the extension:
Keeping a car on a racetrack

• Mon Mar 17
Computer graphics and linear transformations (1.8, 1.9, 2.3 and 2.7):
Clicker review of linear transformations
Finish general geometric transformations on R2 [1.8, 1.9]
Computer graphics demo [2.7]
• Fri Mar 7 The University cancelled class.

• Wed Mar 5
Computer graphics and linear transformations (1.8, 1.9, 2.3 and 2.7):
Guess the transformation
general geometric transformations on R2 [1.8, 1.9]
In the process, review the unit circle

• Mon Mar 3
Clicker questions and review the Hill Cipher

Counterexamples for false statements [If A then B counterexample: A is true but the conclusion B is false]
Can a matrix equation have both 1 and infinite solutions but never be inconsistent?
Ax=0 where A varies
Ax=b where A is fixed but b varies
Condition # of matrices

Review guidelines for Problem Sets, including
• Compared to practice exercises, you have more time to work on fewer problems - Maple, interesting applications...
• Print Maple or show by-hand work
• Annotated work / explanations that show your critical reasoning
• Be careful of any additional instructions from the book or me

Computer graphics and linear transformations (1.8, 1.9, 2.3 and 2.7):
Dilation inverses
• Fri Feb 28 2.3 clicker questions
Hill Cipher: Linear transformation of uncoded message vectors to coded message vectors.
A.[uncoded vector] = [coded vector]
 A  B  C  D  E  F  G  H  I  J  K  L  M  N  O  P  Q  R  S  T  U  V  W  X  Y  Z
 1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26
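The A.[uncoded vector] = [coded vector] step can be sketched in a few lines of Python; the 2x2 key matrix here is hypothetical (not necessarily the one used in class), chosen so its determinant is invertible mod 26:

```python
# Hill cipher encoding sketch using the A=1,...,Z=26 table, with 26 standing
# in for 0 mod 26. KEY is a hypothetical key matrix with det = -1, which is
# invertible mod 26, so the cipher can be undone.
KEY = [[1, 2], [3, 5]]

def encode_pair(p, q):
    # p, q are the numbers 1..26 for two letters of the message
    c1 = (KEY[0][0]*p + KEY[0][1]*q) % 26
    c2 = (KEY[1][0]*p + KEY[1][1]*q) % 26
    return [c1 or 26, c2 or 26]   # map residue 0 back to 26 (= Z)

# "HI" -> H=8, I=9
print(encode_pair(8, 9))          # [26, 17], i.e. the coded letters Z, Q
```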

• Wed Feb 26
Review Theorem 8 in 2.3 [without linear transformations] for square matrices via clicker question. Discuss what it means for a square matrix that violates one of the statements. Discuss what it means for a matrix that is not square (all bets are off) via counterexamples.

Catalog description: A study of vectors, matrices and linear transformations, principally in two and three dimensions, including treatments of systems of linear equations, determinants, and eigenvalues.

-2.1-2.3 Applications: Hill Cipher, Condition Number and Linear Transformations (2.3, 1.8, 1.9 and 2.7)
-Chapter 3 determinants and applications
-Eigenvalues and applications (2.8, 4.9 and chap 5 selections, 7... as time allows)
-Final research sessions

Applications: Introduction to Linear Maps
The black hole matrix: maps R^2 into the plane but not onto (the range is the 0 vector).
Dilation by 2 matrix
Linear transformations in the cipher setting:
 A  B  C  D  E  F  G  H  I  J  K  L  M  N  O  P  Q  R  S  T  U  V  W  X  Y  Z
 1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26
• Mon Feb 24
Obtain A inverse via elementary row operations E_p...E_2.E_1.A = I and Gaussian reductions of [A|I]
three := Matrix([[a, b, c], [d, e, f], [g, h, i]]);
MatrixInverse(three);
scalerow2 := Matrix([[1, 0, 0], [0, 5, 0], [0, 0, 1]]);
scalerow2.three;
swaprows12 := Matrix([[0, 1, 0], [1, 0, 0], [0, 0, 1]]);
swaprows12.three;
usualrowop := Matrix([[1, 0, 0], [0, 1, 0], [-4, 0, 1]]);
usualrowop.three;
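The [A|I] reduction above can be carried out in plain Python with exact fractions; a sketch (assuming A is invertible, so a nonzero pivot always exists) whose three moves mirror scalerow2, swaprows12 and usualrowop:

```python
from fractions import Fraction as F

def inverse_via_gauss_jordan(A):
    """Row-reduce [A|I] to [I|A^-1], mirroring E_p...E_2.E_1.A = I."""
    n = len(A)
    # build the augmented matrix [A | I] with exact fractions
    M = [[F(A[i][j]) for j in range(n)] +
         [F(1 if i == j else 0) for j in range(n)] for i in range(n)]
    for col in range(n):
        # find a nonzero pivot and swap it up (a row swap, like swaprows12)
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        # scale the pivot row to make the pivot 1 (like scalerow2)
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        # eliminate the rest of the column (like usualrowop)
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f*y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

A = [[2, 1], [5, 3]]                 # hypothetical example, det = 1
print(inverse_via_gauss_jordan(A))   # [[3, -1], [-5, 2]]
```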

The rest of the clicker questions for 2.2

Theorem 8 in 2.3 [without linear transformations]: A matrix has a unique inverse, if it exists. If A is invertible, then Ax=b has the unique solution x=A^(-1)b, and the columns of A span and are l.i...
• Fri Feb 21
Repeated methodology: apply inverse, use associativity, use def of inverse to obtain the Identity, use definition of Identity to cancel it:

Inverse of a matrix.
twobytwo := Matrix([[a, b], [c, d]]);
MatrixInverse(twobytwo);
MatrixInverse(twobytwo).twobytwo;
simplify(%);
Clicker question on glossary
Test 1 corrections
three := Matrix([[a, b, c], [d, e, f], [g, h, i]]);
MatrixInverse(three);
scalerow2 := Matrix([[1, 0, 0], [0, 5, 0], [0, 0, 1]]);
scalerow2.three;

Obtain A inverse via elementary row operations and Gaussian reduction of [A|I]

• Wed Feb 19 Test 1

• Mon Feb 17
Take questions on the study guide.
clicker review of past hw questions
spans but not l.i., l.i. but doesn't span, both
1.7 clicker questions - since we didn't finish them in class, here are solutions
• [University Cancelled Class] Fri Feb 14

• Wed Feb 12
Review matrix multiplication and matrix algebra. Introduce transpose of a matrix via Wikipedia, including Arthur Cayley. Applications including least squares estimates, such as in linear regression, data given as rows (like Yoda).
2.1 clicker questions # 7-9

• Mon Feb 10
Begin Chapter 2:
Continue via 2.1 clicker questions 1-5

Matrix multiplication and the algebra of matrix multiplication: AB versus BA...
• Fri Feb 7
Take questions on 1.7. Review Maple from Wednesday
Linear combination check of adding a vector that is outside the plane containing Vector([1,2,3]), Vector([4,5,6]), Vector([7,8,9]), i.e. b3+b1-2*b2 not equal to 0: Vector([5,7,10])
M:=Matrix([[1, 4, 7, 5], [2, 5, 8, 7], [3, 6, 9, 10]]);
ReducedRowEchelonForm(M);

span2:=Matrix([[1, 4, 7, 5,b1], [2, 5, 8,7,b2], [3, 6, 9,10,b3]]);
GaussianElimination(span2);

Linearly independent check with additional vector:
li2:= Matrix([[1, 4, 7, 5,0], [2, 5, 8,7,0], [3, 6, 9,10,0]]); ReducedRowEchelonForm(li2);

Removing Redundancy
li3:= Matrix([[1, 4, 5,0], [2, 5,7,0], [3, 6,10,0]]); ReducedRowEchelonForm(li3);
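The same l.i./redundancy checks can be done without Maple; here is a sketch of a ReducedRowEchelonForm analogue in pure Python, run on the li3 matrix above:

```python
from fractions import Fraction as F

def rref(M):
    """Reduced row echelon form (Gauss-Jordan), a rough analogue of
    Maple's ReducedRowEchelonForm."""
    M = [[F(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if piv is None:
            continue            # no pivot in this column: a free variable
        M[r], M[piv] = M[piv], M[r]
        p = M[r][c]
        M[r] = [x / p for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [x - f*y for x, y in zip(M[i], M[r])]
        r += 1
    return M

# li3 from class: after removing the redundant column, every remaining
# column gets a pivot, so the vectors are linearly independent.
li3 = [[1, 4, 5, 0], [2, 5, 7, 0], [3, 6, 10, 0]]
R = rref(li3)
print(R)   # the identity matrix augmented with the zero column
```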

e1:=spacecurve({[5*t,7*t,10*t,t = 0 .. 1]},color=black,thickness = 2):
e2:=textplot3d([5,7,10,` vector [5,7,10]`], color = black):
display(a1, a2, b1, b2, c1, c2, d1, d2,e1,e2);

Clicker questions:
1.7 clicker questions # 1, 2 and 6

• Wed Feb 5
Take questions. Review the geometry of v1+tv2
Review 1.4 #31 and 33
1.7 definition of linearly independent and connection to efficiency of span
l.i. equivalences and clicker
In R^2: spans R^2 but not l.i., l.i. but does not span R^2, l.i. plus spans R^2.

Examples in R^3 via Maple Code:
Linearly independent and span checks:
li1:= Matrix([[1, 4, 7,0], [2, 5,8,0], [3, 6,9,0]]);
ReducedRowEchelonForm(li1);
span1:=Matrix([[1, 4, 7, b1], [2, 5, 8,b2], [3, 6, 9,b3]]);
GaussianElimination(span1);

Plotting - to check whether they are in the same plane:
a1:=spacecurve({[t, 2*t, 3*t, t = 0 .. 1]}, color = red, thickness = 2):
a2:=textplot3d([1, 2, 3, ` vector [1,2,3]`], color = black):
b1:=spacecurve({[4*t,5*t,6*t,t = 0 .. 1]}, color = green, thickness = 2):
b2:=textplot3d([4, 5, 6, ` vector [4,5,6]`], color = black):
c1:=spacecurve({[7*t, 8*t, 9*t, t = 0 .. 1]},color=magenta,thickness = 2):
c2:=textplot3d([7,8,9,`vector[7,8,9]`],color = black):
d1:=spacecurve({[0*t,0*t,0*t,t = 0 .. 1]},color=yellow,thickness = 2):
d2:=textplot3d([0,0,0,` vector [0,0,0]`], color = black):
display(a1, a2, b1, b2, c1, c2, d1, d2);

• Mon Feb 3
Collect hw and take questions. Review span of the columns (1.3 and 1.4) compared to the span of the solutions of a system of equations (1.5) via examples:
The algebra and geometry of a system of equations whose solutions form a plane in R^5 off the origin.
s13n15extension:=Matrix([[1,-5,b1],[3,-8,b2],[-1,2,b3]]);

Clicker question. Then discuss what happens when we correctly use GaussianElimination(s13n15extension) - write out the equation of the plane that the vectors span. Choose a vector that violates this equation to span all of R^3 instead of the plane:
M:=Matrix([[1,-5,0,b1],[3,-8,0,b2],[-1,2,1,b3]]);

Theorem 4 in 1.4.
Review that t*vector1 + vector2 is the collection of vectors that end on the line parallel to vector 1 and through the tip of vector 2
• Fri Jan 31
Coffee mixing clicker question
The matrix vector equation and the augmented matrix. Decimals (avoid them in Maple) versus fractions, and the connection of mixing to span and linear combinations. Geometry of the columns as a plane in R^4, and of the rows as 4 lines in R^2 intersecting in the point (40,60). Maple commands:
Coff:=Matrix([[.3,.4,36],[.2,.3,26],[.2,.2,20],[.3,.1,18]]);
ReducedRowEchelonForm(Coff);
Coffraction:=Matrix([[3/10,4/10,36],[2/10,3/10,26],[2/10,2/10,20],[3/10,1/10,18]]);
ReducedRowEchelonForm(Coffraction);

1.5: vector parametrization equations of homogeneous and non-homogeneous equations.

• Wed Jan 29 Collect hw and take questions
1.3 clicker questions #4 and continue the algebra and geometry of span and linear combinations.
Begin 1.4: Ax via weights from x applied to the columns of A, versus Ax via dot products of the rows of A with x; Ax=b is the same (using definition 1, linear combinations of the columns) as the augmented matrix [A|b].
The matrix vector equation and the augmented matrix.

• Mon Jan 27 Collect problem set 1. Register remaining iclickers. Take any questions. Review the language of vectors, scalar mult and addition, linear combinations and weights, vector equations and connection to 1.1 and 1.2 systems of equations and augmented matrix with 2 vectors in R^3. Introduce span.
1.3 clicker questions 1 and 2 and introduce the algebra and geometry of span and linear combinations.
• Fri Jan 24. Register the i-clickers. Collect the responses to the multiple choice questions. Take any questions. 1.3.
vectors, scalar mult and addition, linear combinations and weights, vector equations and connection to 1.1 and 1.2 systems of equations and augmented matrix. linear combination language (addition and scalar multiplication of vectors). #8 in multiple choice questions from 1.1 and 1.2

• [University Cancelled Class] Wed Jan 22
History of linear equations and the term "linear algebra", with images, including the Babylonians' 2x2 linear equations, the Chinese 3x3 column elimination method over 2000 years ago, Gauss' general method arising from geodesy and least squares methods for celestial computations, and Wilhelm Jordan's contributions.
Gauss quotation. Gauss was also involved in other linear algebra, including the history of vectors, another important "linear" object.
Take questions on the glossary / syllabus. clicker questions.
• Fri Jan 17 Take questions on Solutions to 1.1 on ASULearn, hw readings in 1.2. If you have questions on the first problem set, message Dr. Sarah on the ASULearn Forum.

Finish 1.2 examples and review Gaussian/Gauss-Jordan, Maple and geometry.
T/F: A linear system of 3 equations and 3 unknowns, where no 2 of the equations are multiples, can be inconsistent...
Reminder: You'll need your clickers on Wednesday.
Take a look at the number of solutions, the algebra and geometry arising from:
implicitplot3d({x-2*y+z=2, x+y-2*z=3, (-2)*x+y+z=1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)
implicitplot3d({x+y+z=3, x+y+z=2, x+y+z=1}, x = -4 .. 4, y = -4 .. 4, z = -4 .. 4)

Review the following vocabulary, which is also on the ASULearn glossary that Dr. Sarah is experimenting with.

New vocabulary in 1.1 and 1.2: (testing out a glossary in ASULearn)
augmented matrix
coefficients
consistent
free
Gaussian elimination / row echelon form (in Maple GaussianElimination(M))
Gauss-Jordan elimination / reduced row echelon form (in Maple ReducedRowEchelonForm(M))
homogeneous system
implicitplot
implicitplot3d
linear system
line
parametrization
pivots
plane
row operations / elementary row operations
solutions
system of linear equations
unique
hw is on the calendar page
• Wed Jan 15
Collect the hw due at the beginning of class and pass around the attendance sheet.
Mention that solutions are on ASULearn and are part of the hw for Fri
Remind the class how to get to the main calendar page (google Dr. Sarah / click on webpage / then 2240).
Give the 2 handouts to those not there on Monday.

Gauss-Jordan elimination on 3 equations in 2 unknowns.

Look at the geometry, number of missing pivots, and parametrization of x+y+z=1.

Gaussian and Gauss-Jordan or reduced row echelon form in general: section 1.2, focusing on algebraic and geometric perspectives and solving using by-hand elimination of systems of equations with 3 unknowns. Follow up with Maple commands and visualization: ReducedRowEchelonForm and GaussianElimination as well as implicitplot3d in Maple (like on the handout):
with(plots): with(LinearAlgebra):
implicitplot3d({x+2*y+3*z=3,2*x-y-4*z=1,x+y+z=2}, x=-4..4,y=-4..4,z=-4..4);
A:=Matrix([[-1,2,1,-1],[2,4,-7,-8],[4,7,-3,3]]); ReducedRowEchelonForm(A);
P:=Matrix([[1,3,4,k],[2,8,9,0],[10,10,10,5],[5,5,5,5]]); GaussianElimination(P);
Highlight:
equations with 3 unknowns with infinite solutions, one solution and no solutions in R3, and the corresponding geometry.
• Mon Jan 13
History of solving equations (1.1). Work on the introduction to linear algebra handout motivated from Evelyn Boyd Granville's favorite problem (#1-3). At the same time, begin 1.1 (and some of the words in 1.2), including geometric perspectives, by-hand algebraic Gaussian elimination and pivots, solutions, plotting and geometry, parametrization, and GaussianElimination in Maple for systems with 2 unknowns in R^2.
Evelyn Boyd Granville #3:
with(LinearAlgebra): with(plots):
implicitplot({x+y=17, 4*x+2*y=48},x=-10..10, y = 0..40);
implicitplot({x+y-17, 4*x+2*y-48},x=-10..10, y = 0..40);
EBG3:=Matrix([[1,1,17],[4,2,48]]);
GaussianElimination(EBG3);
ReducedRowEchelonForm(EBG3);
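The EBG3 system above (x + y = 17, 4x + 2y = 48) can also be solved by the same elimination in a couple of lines of Python, as a check on the Maple output:

```python
# Evelyn Boyd Granville #3: x + y = 17, 4x + 2y = 48.
# Eliminate y: subtract 2*(first equation) from the second to get 2x = 14.
from fractions import Fraction as F

x = F(48 - 2 * 17, 4 - 2)   # (48 - 2*17) / (4 - 2)
y = 17 - x
print(x, y)                 # 7 10
```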

Course intro slides
Gaussian
In addition, do #4 and #5 with unknown but constant coefficients.
Evelyn Boyd Granville #4
EBG4:=Matrix([[1,1,a],[4,2,b]]);
GaussianElimination(EBG4);

Evelyn Boyd Granville #5
EBG5:=Matrix([[1,k,0],[k,1,0]]);
GaussianElimination(EBG5);
ReducedRowEchelonForm(EBG5);

Prove using geometry of lines that the number of solutions of a system with 2 equations and 2 unknowns is 0, 1 or infinite.
Mention homework and the class webpages

pointplot
spacecurve