Numerical Matrix Analysis
Lecture Notes #16 — Review:
SVD, QR, Least Squares, Conditioning and Stability

Peter Blomgren, ⟨blomgren.peter@gmail.com⟩

Department of Mathematics and Statistics
Dynamical Systems Group
Computational Sciences Research Center
San Diego State University
San Diego, CA 92182-7720
http://terminus.sdsu.edu/

Spring 2010

Outline

1. The Big Picture
   - Model Problem and Attacks; Analysis
   - Building Blocks; Tools
2. Fundamentals
   - Basic Linear Algebra
   - The SVD; Projections
   - QR-Factorization
3. Fundamentals, ctd.
   - Linear Least Squares
   - Conditioning, Stability and Accuracy
   - Error Analysis & Stability

The Big Picture: Model Problem and Attacks

The Linear Least Squares Problem

$\min_{\tilde{x} \in \mathbb{C}^n} \|A\tilde{x} - \tilde{b}\|_2, \qquad A \in \mathbb{C}^{m \times n}, \quad m \ge n, \quad \operatorname{rank}(A) = n$

Attacks, methods, and modes:

- Normal Equations; method: brute force.
- QR-Factorization; methods: Gram-Schmidt orthogonalization, or Householder triangularization; modes: explicit $Q$, or implicit application of $Q^*\tilde{b}$.
- The SVD; method: "magic."
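
To make the three attacks concrete, here is a minimal sketch (my addition, not from the slides) that solves one least squares problem all three ways; the NumPy calls and the random test data are assumptions for illustration:

```python
# Solve min ||Ax - b||_2 by normal equations, QR, and the SVD (sketch).
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 5
A = rng.standard_normal((m, n))      # full rank with probability 1
b = rng.standard_normal(m)

# Attack 1: normal equations, A*A x = A*b ("brute force")
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# Attack 2: QR-factorization, R x = Q*b
Q, R = np.linalg.qr(A)               # reduced QR
x_qr = np.linalg.solve(R, Q.T @ b)

# Attack 3: the SVD, x = V diag(1/sigma_i) U* b
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)

print(np.allclose(x_ne, x_qr), np.allclose(x_qr, x_svd))
```

All three agree here; for ill-conditioned $A$ the normal equations square the condition number ($\kappa(A^*A) = \kappa(A)^2$), which is why the QR and SVD attacks are usually preferred.
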
The Big Picture: Analysis

Conditioning
  The inherent difficulty of the mathematical problem.
  Sensitivity to perturbations; quantified by the condition number, $\kappa$.

Stability
  The robustness of the algorithm.

  Backward Stability: $\tilde{f}(\tilde{x}) = f(\tilde{\tilde{x}})$ for some $\tilde{\tilde{x}}$ with $\frac{\|\tilde{\tilde{x}} - \tilde{x}\|}{\|\tilde{x}\|} = O(\epsilon_{\mathrm{mach}})$

  Stability: $\frac{\|\tilde{f}(\tilde{x}) - f(\tilde{\tilde{x}})\|}{\|f(\tilde{\tilde{x}})\|} = O(\epsilon_{\mathrm{mach}})$ for some $\tilde{\tilde{x}}$ with $\frac{\|\tilde{\tilde{x}} - \tilde{x}\|}{\|\tilde{x}\|} = O(\epsilon_{\mathrm{mach}})$

Accuracy
  For a backward stable algorithm, the accuracy is
  $\frac{\|\tilde{f}(\tilde{x}) - f(\tilde{x})\|}{\|f(\tilde{x})\|} = O(\kappa(\tilde{x})\,\epsilon_{\mathrm{mach}})$
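
A hedged numerical illustration (my addition): using NumPy's `solve` as a stand-in for a backward stable algorithm, the observed relative error tracks the $O(\kappa\,\epsilon_{\mathrm{mach}})$ accuracy bound; the Vandermonde test matrix is an assumption chosen only because it is ill-conditioned:

```python
# Relative error of a backward stable solve scales like kappa * eps_mach.
import numpy as np

eps_mach = np.finfo(float).eps            # ~2.2e-16 for IEEE double precision
A = np.vander(np.linspace(0, 1, 12), 12)  # ill-conditioned 12 x 12 matrix
x_true = np.ones(12)
b = A @ x_true

x_hat = np.linalg.solve(A, b)             # backward stable in practice (GEPP)
kappa = np.linalg.cond(A)                 # 2-norm condition number
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"kappa = {kappa:.2e}  rel err = {rel_err:.2e}  "
      f"kappa*eps = {kappa * eps_mach:.2e}")
```
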
The Big Picture: Building Blocks / Tools

The SVD
  Here used primarily (so far) for matrix understanding, expression of the
  condition number of a matrix, and simplification of proofs; "every matrix
  is diagonal."

Projectors
  $P^2 = P$; orthogonal if $P^* = P$. Can be formed using an orthogonal
  ($P = QQ^*$) or a non-orthogonal ($P = A(A^*A)^{-1}A^*$) basis.

Floating Point
  A source of unavoidable errors in the representation of numerical values,
  and in computations.

Norms
  Matrix and vector norms give us the fundamental measurements of size and
  distance in our vector spaces.
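A small check (my addition, not from the slides): both projector formulas above produce the same orthogonal projector onto $\operatorname{range}(A)$; the second merely starts from a non-orthogonal basis (the columns of $A$):

```python
# Verify P^2 = P and P* = P for both constructions of the projector.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))            # non-orthogonal basis for range(A)

Q, _ = np.linalg.qr(A)                     # orthonormal basis for range(A)
P1 = Q @ Q.T                               # P = Q Q*
P2 = A @ np.linalg.solve(A.T @ A, A.T)     # P = A (A*A)^{-1} A*

print(np.allclose(P1, P2))                              # same projector
print(np.allclose(P1 @ P1, P1), np.allclose(P1, P1.T))  # P^2 = P, P* = P
```
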

Basic Linear Algebra, etc.

Always a Good Idea to Review

- Fundamental matrix/vector operations, orthogonality, orthonormality,
  inner products, the angle between two vectors, the Hermitian transpose,
  linear independence, bases for a space, unitary matrices.
- Vector and matrix norms, especially the $\|\cdot\|_1$, $\|\cdot\|_2$, and
  $\|\cdot\|_\infty$ norms, and also the Frobenius norm of a matrix;
  invariance of the 2-norm under multiplication by a unitary matrix (see the
  sketch after this list).
- Vector- and matrix-norm inequalities; Cauchy-Bunyakovsky-Schwarz.
- The SVD as a tool for simplifying analysis and understanding of matrix
  properties; geometric understanding. Unlikely to show up as explicit
  questions on the midterm.
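
A quick review check (my addition): the named norms, and the invariance of the 2-norm under multiplication by a unitary (here real orthogonal) matrix; the random test data is an assumption:

```python
# Vector norms, and 2-norm invariance under a unitary matrix Q.
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(6)
A = rng.standard_normal((6, 6))
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))  # Q*Q = I

print(np.linalg.norm(x, 1), np.linalg.norm(x, 2), np.linalg.norm(x, np.inf))
print(np.isclose(np.linalg.norm(Q @ x, 2), np.linalg.norm(x, 2)))    # vector
print(np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2)))    # matrix
```
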

The SVD

$A = U \Sigma V^*$

- (For now) a theoretical tool.
- The full and reduced SVD.
- Expressing $\operatorname{range}(A)$ and $\operatorname{null}(A)$ in terms
  of the components of the SVD.
- The singular values $\leadsto$ $\operatorname{rank}(A)$, $\kappa(A)$,
  $\|A\|_2$, and $\|A\|_F$.
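
A minimal sketch (my addition) of the last bullet: the singular values deliver $\operatorname{rank}(A)$, $\kappa(A)$, $\|A\|_2$, and $\|A\|_F$ directly:

```python
# rank, condition number, and norms straight from the singular values.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 5))
s = np.linalg.svd(A, compute_uv=False)        # singular values of A

rank = int(np.sum(s > s[0] * max(A.shape) * np.finfo(float).eps))
kappa = s[0] / s[-1]                          # sigma_1 / sigma_n (full rank)
norm2 = s[0]                                  # ||A||_2 = sigma_1
normF = np.sqrt(np.sum(s**2))                 # ||A||_F = sqrt(sum sigma_i^2)

print(rank,
      np.isclose(kappa, np.linalg.cond(A)),
      np.isclose(norm2, np.linalg.norm(A, 2)),
      np.isclose(normF, np.linalg.norm(A, 'fro')))
```
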

Projectors

Definition (Projector)
  A projector is a square matrix $P$ that satisfies $P^2 = P$.

An orthogonal projector is a projector that projects onto a subspace $S_1$
along a space $S_2$, where $S_1$ and $S_2$ are orthogonal; $\Leftrightarrow$
$P^* = P$.

Figure: A vector $v$, its projection $Pv$, and $\operatorname{range}(P)$.
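
A quick illustration (my addition) of the figure's geometry: for an orthogonal projector, $Pv$ lies in $\operatorname{range}(P)$ and the error $v - Pv$ is orthogonal to it:

```python
# Pv is the orthogonal projection of v; v - Pv is orthogonal to range(P).
import numpy as np

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))
P = Q @ Q.T                               # orthogonal projector: P^2 = P = P*
v = rng.standard_normal(5)

print(np.allclose(P @ (P @ v), P @ v))    # projecting twice changes nothing
print(np.allclose(P @ (v - P @ v), 0.0))  # residual is orthogonal to range(P)
```
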

The Reduced and Full QR-Factorization

As for the SVD, we can extend the QR-factorization by "fleshing out"
$\hat{Q}$ with an additional $(m - n)$ orthonormal columns, and zero-padding
$\hat{R}$ with an additional $(m - n)$ rows of zeros:

Figure: The Reduced QR-Factorization, $A = \hat{Q}\hat{R}$.
Figure: The Full QR-Factorization, $A = QR$.

In the full QR-factorization, the columns $\tilde{q}_j$, $j > n$, are
orthogonal to $\operatorname{range}(A)$. If $\operatorname{rank}(A) = n$,
they are an orthonormal basis for $\operatorname{range}(A)^\perp =
\operatorname{null}(A^*)$, the space orthogonal to $\operatorname{range}(A)$.
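
A hedged sketch (my addition): NumPy exposes both factorizations via the `mode` argument, so the statement about the extra columns can be checked directly:

```python
# Reduced vs. full QR; columns q_j, j > n, are orthogonal to range(A).
import numpy as np

rng = np.random.default_rng(5)
m, n = 7, 3
A = rng.standard_normal((m, n))

Qhat, Rhat = np.linalg.qr(A, mode='reduced')   # Qhat: m x n,  Rhat: n x n
Q, R = np.linalg.qr(A, mode='complete')        # Q: m x m,  R: m x n (padded)

print(np.allclose(A, Qhat @ Rhat), np.allclose(A, Q @ R))
print(np.allclose(Q[:, n:].T @ A, 0.0))        # extra columns _|_ range(A)
```
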

Algorithms for the QR-Factorization (1 of 3)

- Classical Gram-Schmidt
  - Think of how we use projectors to build CGS.
  - Problem: loss of orthogonality in $Q$.
  - Problem: errors in $R \sim O(\sqrt{\epsilon_{\mathrm{mach}}})$.

Figure: $Q^*Q \ne I$, illustrating the loss of orthogonality in CGS.
Figure: The blue circles illustrate that we only reach an accuracy level of
$\sim \sqrt{\epsilon_{\mathrm{mach}}}$ for CGS.
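
A minimal classical Gram-Schmidt implementation (my addition) that makes the loss of orthogonality observable; the ill-conditioned Hilbert-type test matrix is an assumption chosen to reproduce the effect in the figures:

```python
# Classical Gram-Schmidt: coefficients come from the *original* columns.
import numpy as np

def cgs(A):
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # project against original a_j
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

n = 8
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
Q, _ = cgs(H)
print(np.linalg.norm(Q.T @ Q - np.eye(n)))   # far from 0: orthogonality lost
```
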

Algorithms for the QR-Factorization (2 of 3)

- Modified Gram-Schmidt
  - Mathematically equivalent to CGS; only a slight re-ordering of the
    operations.
  - Numerically stable — if used in "implicit mode" for least squares
    problems.

Figure: $Q^*Q \approx I$, illustrating the improved orthogonality properties
of MGS.
Figure: The red crosses illustrate that we reach an accuracy level of
$\sim \epsilon_{\mathrm{mach}}$ for MGS.
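
The modified Gram-Schmidt counterpart (my addition): the same arithmetic, but each projection is subtracted immediately from the updated columns, which is what restores orthogonality in practice:

```python
# Modified Gram-Schmidt: subtract each projection as soon as it is known.
import numpy as np

def mgs(A):
    m, n = A.shape
    Q = A.astype(float).copy()
    R = np.zeros((n, n))
    for i in range(n):
        R[i, i] = np.linalg.norm(Q[:, i])
        Q[:, i] /= R[i, i]
        for j in range(i + 1, n):
            R[i, j] = Q[:, i] @ Q[:, j]   # coefficient from updated column
            Q[:, j] -= R[i, j] * Q[:, i]
    return Q, R

n = 8
H = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
Q, _ = mgs(H)
print(np.linalg.norm(Q.T @ Q - np.eye(n)))   # orders of magnitude below CGS
```
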

Algorithms for the QR-Factorization (3 of 3)

- Householder Triangularization
  - Use of reflectors, closely related to projectors. Non-uniqueness of
    reflectors: understand the choice of reflector!
  - The basic implementation is Q-less! Understand how to get the action
    $Q^*\tilde{b}$ "implicitly," and how to add the explicit formation of
    $Q$ to the algorithm.
  - Numerically stable, and almost perfect orthogonality.

Figure: The two choices of reflector, $H^{(-)}$ and $H^{(+)}$, mapping $x$
to $-\|x\|e_1$ or $\|x\|e_1$.
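
A Q-less Householder sketch (my addition, using the usual sign choice to avoid cancellation): $R$ is built in place, the reflector vectors are stored, and $Q^*\tilde{b}$ is applied implicitly, exactly as needed for least squares:

```python
# Householder triangularization without forming Q; Q*b applied implicitly.
import numpy as np

def householder_qr(A):
    R = A.astype(float).copy()
    m, n = R.shape
    V = []                                    # reflector vectors v_k
    for k in range(n):
        x = R[k:, k].copy()
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])  # choice of reflector
        v /= np.linalg.norm(v)
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        V.append(v)
    return R, V

def apply_Qstar(V, b):
    y = b.astype(float).copy()
    for k, v in enumerate(V):                 # Q* b = H_n ... H_1 b
        y[k:] -= 2.0 * v * (v @ y[k:])
    return y

rng = np.random.default_rng(6)
A, b = rng.standard_normal((6, 4)), rng.standard_normal(6)
R, V = householder_qr(A)
x = np.linalg.solve(R[:4, :4], apply_Qstar(V, b)[:4])  # LSQ via implicit Q*b
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```
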

Least Squares Problems

- Understand the formulation and interpretation of the least squares
  problem — $\operatorname{range}(A)$, $\min\|\cdot\|_2$, $A\tilde{x} =
  P\tilde{y}$, the residual and orthogonality...
- Know how to set up a least squares problem for fitting a low degree
  polynomial to a given data set (a worked setup follows below).
- What is the solution of the least squares problem, as expressed in terms
  of the results of the QR-factorization, the Singular Value Decomposition,
  and the Normal Equations?
- Rough work comparison for the different solutions: QR $\sim$ 2(NE),
  SVD $\le$ 10(QR), but SVD $\approx$ QR when $m \gg n$.
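
A worked setup (my addition) for the polynomial-fitting bullet: the columns of the design matrix are $1, t, t^2$, and the coefficients come from the QR attack; the synthetic data set is an assumption:

```python
# Fit a degree-2 polynomial to data by linear least squares.
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 20)
y = 1.0 + 2.0 * t - 3.0 * t**2 + 1e-3 * rng.standard_normal(20)

A = np.column_stack([np.ones_like(t), t, t**2])   # m x 3 with m >> n
Q, R = np.linalg.qr(A)                            # reduced QR
c = np.linalg.solve(R, Q.T @ y)                   # coefficients (c0, c1, c2)
print(c)                                          # approximately [1, 2, -3]
```
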

Cornerstones: Conditioning, Stability and Accuracy

- Absolute and relative condition numbers; definitions; use of the Jacobian
  when $f: X \to Y$ is differentiable.
- Note that the condition number may be a function of $\tilde{x}$, i.e. a
  problem may be well-conditioned for some range of inputs, but
  ill-conditioned for other inputs.
- Building block — the condition number of a matrix: in terms of $\|A\|$,
  $\|A^{-1}\|$ ($\|A^\dagger\|$ as appropriate), and the singular values
  $\sigma$.
- The Floating Point Axioms.
- Absolute and relative errors.
- Accuracy of an algorithm.
- Stability and Backward Stability.
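
A check (my addition) of the "building block" bullet: in the 2-norm, $\kappa(A) = \|A\|\,\|A^{-1}\| = \sigma_1/\sigma_n$, with the pseudoinverse $\|A^\dagger\|$ taking over in the rectangular case:

```python
# kappa(A) from norms vs. from singular values, square and rectangular.
import numpy as np

rng = np.random.default_rng(8)

A = rng.standard_normal((5, 5))                  # square
s = np.linalg.svd(A, compute_uv=False)
k_norms = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
print(np.isclose(k_norms, s[0] / s[-1]))

B = rng.standard_normal((8, 3))                  # rectangular: use A^dagger
sB = np.linalg.svd(B, compute_uv=False)
kB = np.linalg.norm(B, 2) * np.linalg.norm(np.linalg.pinv(B), 2)
print(np.isclose(kB, sB[0] / sB[-1]))
```
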

Conditioning of a Problem
Cornerstone Concept

- The absolute and relative condition numbers (which kind is most useful in
  the context of computational science, and why?); definitions, and the
  ability to compute them for simple problems (a worked case follows below).
- Differentiability and non-differentiability (impact).
- What is a "small" condition number? A large one?
- The condition number is a measure of the inherent difficulty of the
  (mathematical) problem.
- Conditioning of basic linear algebra operations, and the condition number
  of a matrix.
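
A simple computable case (my addition): for a differentiable scalar $f$, the relative condition number is $\kappa(x) = |x\,f'(x)/f(x)|$, so e.g. $f(x) = \sqrt{x}$ is well-conditioned everywhere while $f(x) = x - 1$ is ill-conditioned near $x = 1$:

```python
# Relative condition number of a scalar function via its derivative.
import numpy as np

def rel_cond(f, fprime, x):
    return abs(x * fprime(x) / f(x))

print(rel_cond(np.sqrt, lambda x: 0.5 / np.sqrt(x), 2.0))   # 0.5 for sqrt
print(rel_cond(lambda x: x - 1.0, lambda x: 1.0, 1.0001))   # ~1e4 near x = 1
```
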

Stability
Cornerstone Concept

Floating point arithmetic
- The axioms, and their impact on computations.

Stability
- Stability is a statement about the quality of an algorithm.
- The formal and informal definitions of stability and backward stability.
- "A stable algorithm gives approximately the right answer, to approximately
  the right question."
- "A backward stable algorithm gives exactly the right answer, to
  approximately the right question."
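
A small demonstration (my addition) of the floating point model behind the axioms: each elementary operation returns the exact result up to a relative perturbation of size $\epsilon_{\mathrm{mach}}$, i.e. $\mathrm{fl}(x \ast y) = (x \ast y)(1 + \delta)$ with $|\delta| \le \epsilon_{\mathrm{mach}}$:

```python
# Machine epsilon and the relative-error model of floating point arithmetic.
import numpy as np

eps = np.finfo(float).eps                   # ~2.2e-16 in IEEE double
print(1.0 + eps / 2 == 1.0)                 # True: eps is the gap around 1.0

x, y = 0.1, 0.2
z = x + y                                   # fl(x + y) = 0.30000000000000004
print(abs(z - 0.3) / 0.3 <= 4 * eps)        # tiny relative perturbation only
```
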