Dynamics Cheat Sheet

1. Approach:
* Bisection and Interpolation Methods: These are bracketing methods. They rely on identifying an interval or bracket [a, b] where the function changes sign (i.e., f(a)·f(b) < 0). They do not require the function to be differentiable. These methods iteratively refine the bracket until the root is found within the interval.
* Newton-Raphson and Secant Methods: These are open methods. They require the function to be differentiable. These methods start with an initial approximation and use the slope of the function (the derivative) to iteratively approach the root. They do not require bracketing.

2. Initial Approximations:
* Bisection and Interpolation Methods: These methods require an initial bracket or interval that contains the root. The initial approximations are two values, a and b, such that f(a)·f(b) < 0.
* Newton-Raphson and Secant Methods: These methods require one or two initial approximations, which should be reasonably close to the root but do not need to bracket it.

3. Convergence:
* Bisection Method: Guaranteed to converge to a root within the bracket, but it converges slowly. The convergence is linear.
* Interpolation Method: Like the bisection method, it is guaranteed to converge within the bracket, but it can be faster for well-behaved functions.
* Newton-Raphson Method: It can converge rapidly (quadratically) for well-behaved functions, but it may fail or converge slowly if the initial guess is poor or if there are multiple roots nearby.
* Secant Method: It has convergence properties similar to the Newton-Raphson method but does not require the calculation of derivatives. It can be less predictable than Newton's method.

4. Function Requirements:
* Bisection and Interpolation Methods: They do not require the function to be differentiable, making them suitable for a wide range of functions, including those with discontinuities.
* Newton-Raphson and Secant Methods: They require the function to be differentiable, and the derivative must exist and be continuous in the neighborhood of the root.

In summary, the Bisection and Interpolation methods are suitable for a broader class of functions and provide a guaranteed way to find the root within a bracket, while the Newton-Raphson and Secant methods are more efficient for well-behaved, differentiable functions but do not guarantee convergence and are sensitive to the choice of initial approximations. The choice of method depends on the nature of the problem and the properties of the function being analyzed.

Newton-Raphson Method:
The Newton-Raphson method is based on the idea of approximating a function f(x) by its tangent line at a given point. The tangent line is a first-order Taylor series expansion of f(x) around the point x_n:

f(x) ≈ f(x_n) + (x - x_n)·f'(x_n)

To find the root of this approximation (f(x) = 0), we set f(x_n) + (x - x_n)·f'(x_n) = 0 and solve for x. The next approximation, x_{n+1}, is obtained by setting x = x_{n+1}. This is the formula for the Newton-Raphson method:

x_{n+1} = x_n - f(x_n) / f'(x_n)

Secant Method:
The Secant method is based on the idea of approximating the derivative of a function f(x) using finite differences. We start with two initial approximations, x_{n-1} and x_n. The derivative f'(x_n) can be approximated using a first-order finite difference:

f'(x_n) ≈ (f(x_n) - f(x_{n-1})) / (x_n - x_{n-1})

Substituting this approximation for the derivative into the Newton-Raphson formula (setting the tangent line to zero and solving for x) gives the Secant update:

x_{n+1} = x_n - f(x_n)·(x_n - x_{n-1}) / (f(x_n) - f(x_{n-1}))

The iteration is repeated until a satisfactory root is found.
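To make the two update rules concrete, here is a minimal Python sketch of both iterations. It is an illustration, not part of the original notes: the function names newton_raphson and secant, the test function f(x) = x^2 - 2, the tolerance, and the iteration cap are all assumed for the example.

```python
# Minimal sketch of the Newton-Raphson and Secant iterations.
# Test function, tolerance, and iteration limit are illustrative assumptions.

def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) from the starting guess x0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:              # satisfactory root found
            return x
        x = x - fx / df(x)             # Newton-Raphson update
    return x

def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Same idea, but the derivative is replaced by a finite-difference slope."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        # x_{n+1} = x_n - f(x_n)*(x_n - x_{n-1}) / (f(x_n) - f(x_{n-1}))
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

if __name__ == "__main__":
    f = lambda x: x**2 - 2             # root at sqrt(2), about 1.41421356
    df = lambda x: 2 * x
    print(newton_raphson(f, df, x0=1.0))
    print(secant(f, x0=1.0, x1=2.0))
```

Note that the secant call needs two starting guesses but no derivative, which mirrors the trade-off described above.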
1. Linear Least Squares Regression:
In linear regression, we aim to find the equation of a straight line, y = a·x + b, where a represents the slope and b is the intercept. To derive this equation, we minimize the sum of squared differences between the observed and predicted values. This is done by setting up two equations: one for the derivative with respect to a and the other for the derivative with respect to b. Solving these equations gives the values of a and b for the best-fitting linear equation.

2. Polynomial Least Squares Regression:
In polynomial regression, we seek the best-fitting polynomial equation, such as y = a_0 + a_1·x + a_2·x^2 + ... The derivation is similar to linear regression: we minimize the sum of squared residuals, differentiate the sum with respect to each coefficient a_i, and set these derivatives to zero to obtain the coefficients of the polynomial equation.

Golden Section Search:
Step 1: Define Parameters
* Choose a function f(x) that you want to minimize.
* Set the interval [a, b] where you suspect the minimum exists.
* Choose a tolerance level ε for how close you want to get to the minimum.
* Define two interior points x_1 = b - (b - a)/φ and x_2 = a + (b - a)/φ within the interval [a, b], where φ is the golden ratio, approximately 1.618.

Step 2: Iterative Process
* Calculate the function values f(x_1) and f(x_2).
* Compare f(x_1) and f(x_2):
  * If f(x_1) < f(x_2), set b = x_2.
  * If f(x_1) > f(x_2), set a = x_1.
* Calculate two new interior points x_1 and x_2 using the updated interval.
* Repeat the process until the interval [a, b] becomes very small, i.e., b - a < ε.

Step 3: Convergence and Result
* The Golden Section Search converges to the minimum of the function within the specified tolerance ε.
* The result is a very small interval [a, b] indicating the location of the minimum. The value f((a + b)/2) is a good approximation of the minimum value of the function. (A Python sketch of this procedure is given below.)

Power Least Squares Regression:
For a power function y = a·x^b, we take the natural logarithm of both sides to linearize the equation: ln(y) = ln(a) + b·ln(x). We can then use linear least squares regression to find ln(a) and the exponent b. Afterward, we exponentiate to obtain a.

Exponential Least Squares Regression:
For exponential regression, with the equation y = a·e^(b·x), we similarly linearize the equation by taking the natural logarithm: ln(y) = ln(a) + b·x. Using linear least squares regression, we find the logarithm of a and the coefficient b, and then exponentiate the result to obtain a.

These derivations provide the equations and steps required for performing Least Squares Regression for various types of functions, including linear, polynomial, power, and exponential functions. The specific equations vary depending on the function's form and the parameters involved.
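As a worked illustration of the normal-equation and linearization steps above, here is a short Python sketch. It assumes NumPy is available; the helper names fit_line and fit_power and the sample data are invented for the example.

```python
import numpy as np

def fit_line(x, y):
    """Linear least squares for y = a*x + b via the two normal equations."""
    n = len(x)
    sx, sy = x.sum(), y.sum()
    sxx, sxy = (x * x).sum(), (x * y).sum()
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    b = (sy - a * sx) / n                           # intercept
    return a, b

def fit_power(x, y):
    """Power fit y = a*x**b by linearizing: ln(y) = ln(a) + b*ln(x)."""
    b, ln_a = fit_line(np.log(x), np.log(y))        # slope = b, intercept = ln(a)
    return np.exp(ln_a), b

if __name__ == "__main__":
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = 2.5 * x ** 1.3                              # noiseless power-law data
    print(fit_power(x, y))                          # approximately (2.5, 1.3)
```

An exponential fit y = a·e^(b·x) works the same way, except that only y is logged: fit ln(y) against x and exponentiate the intercept.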
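Steps 1-3 of the Golden Section Search translate almost line for line into code. The following minimal Python sketch makes the same assumptions as the description (a function with a single minimum on [a, b]); the test function and tolerance are illustrative choices.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Shrink [a, b] around the minimum of f using golden-ratio interior points."""
    phi = (1 + math.sqrt(5)) / 2           # golden ratio, approximately 1.618
    x1 = b - (b - a) / phi                 # interior points (Step 1)
    x2 = a + (b - a) / phi
    while b - a > tol:                     # Step 2: iterate until the bracket is tiny
        if f(x1) < f(x2):
            b = x2                         # minimum lies in [a, x2]
        else:
            a = x1                         # minimum lies in [x1, b]
        x1 = b - (b - a) / phi             # recompute the interior points
        x2 = a + (b - a) / phi
    return (a + b) / 2                     # Step 3: midpoint approximates the minimizer

if __name__ == "__main__":
    f = lambda x: (x - 1.5) ** 2 + 0.25    # minimum at x = 1.5
    print(golden_section_search(f, 0.0, 4.0))
```

This version recomputes both interior points every pass for clarity; a slightly faster variant reuses the surviving point so that only one new function evaluation is needed per iteration.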