
Dynamic Programming: Optimal Binary Search Trees and Knapsack Problem, Study notes of Algorithms and Programming

These notes cover the concepts of dynamic programming and its applications to the knapsack problem and optimal binary search trees, including the MFKnapsack algorithm for solving the knapsack problem and the derivation of the recurrence for optimal binary search trees. They also provide an example problem and the average number of comparisons in a successful search.




Dynamic Programming, part 2 (Chapter 8)
COMP 157
Nov 5, 2007

Dynamic Programming

  • Main idea:
    • set up a recurrence relating a solution to a larger instance to solutions of smaller instances
    • solve smaller instances once
    • record solutions in a table
    • extract solution to the initial instance from the table

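As a minimal illustration of this schema (not part of the original slides), the sketch below applies it to the Fibonacci numbers: the recurrence F(n) = F(n-1) + F(n-2) relates a larger instance to smaller ones, each smaller instance is solved once, the solutions are recorded in a table, and the answer is read off the table.

```python
# Minimal illustration of the DP schema above, using Fibonacci as a stand-in problem.

def fib_dp(n: int) -> int:
    if n < 2:
        return n
    table = [0] * (n + 1)          # table of solutions to smaller instances
    table[1] = 1
    for i in range(2, n + 1):      # each smaller instance solved exactly once
        table[i] = table[i - 1] + table[i - 2]
    return table[n]                # extract solution to the initial instance

print(fib_dp(10))  # 55
```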

Optimal Binary Search Trees

Problem: Given n keys a_1, ..., a_n and probabilities p_1, ..., p_n of searching for them, find a BST with a minimum average number of comparisons in successful search.

There are

    c(n) = C(2n, n) / (n + 1)

BSTs with n nodes. c(n) is the nth Catalan number, which grows as fast as 4^n / n^1.5, so brute force is hopeless.

Aside: Binomial Coefficients

    C(n, k) = n! / (k! (n - k)!)

C(n, k) is the number of ways to choose k things from a set of n things.
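The sketch below (not from the slides) uses this binomial coefficient to evaluate the Catalan count c(n) = C(2n, n) / (n + 1) for a few values of n, which is one way to see why exhaustive search over all BSTs is hopeless.

```python
import math

def num_bsts(n: int) -> int:
    """Number of distinct BSTs on n keys: the nth Catalan number c(n) = C(2n, n) / (n + 1)."""
    return math.comb(2 * n, n) // (n + 1)

for n in (5, 10, 20, 30):
    print(n, num_bsts(n))
# c(n) grows roughly like 4**n / n**1.5; c(20) is already 6,564,120,420 trees.
```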

Average Number of Comparisons

Given keys a_1, ..., a_n and probabilities p_1, ..., p_n, let c_i be the number of comparisons to reach a_i. The average number of comparisons is

    Σ_{i=1}^{n} c_i · p_i

[Figure: an example BST on keys A, B, C, D; its average number of comparisons is 0.1·c_A + 0.2·c_B + 0.4·c_C + 0.3·c_D, where c_X is the number of comparisons needed to reach key X in the drawn tree.]
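A small sketch of this formula follows. The tree shape used here, the degenerate chain A → B → C → D, is an assumption chosen only for illustration (the slides' own example tree is not recoverable from the extract); c_i is taken to be each key's level in that chain.

```python
# Sketch of the "average number of comparisons" formula above. The chain-shaped
# tree is an assumed example, not the tree from the slides; c_i is key a_i's level.

probs  = {"A": 0.1, "B": 0.2, "C": 0.4, "D": 0.3}
levels = {"A": 1, "B": 2, "C": 3, "D": 4}   # assumed chain A -> B -> C -> D

avg = sum(p * levels[k] for k, p in probs.items())
print(round(avg, 2))   # 0.1*1 + 0.2*2 + 0.4*3 + 0.3*4 = 2.9
```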

Optimal BST Sub-problem

T_i^j is some BST of the subset of keys a_i, ..., a_j, with 1 ≤ i ≤ j ≤ n.

C[i, j] is the smallest average number of comparisons in any BST T_i^j.

To derive the recurrence, consider the possible roots a_k of T_i^j, where i ≤ k ≤ j.

Optimal BST Recurrence

    C[i, j] = min_{i ≤ k ≤ j} { p_k · 1
                                + Σ_{s=i}^{k-1} p_s · (c_s in T_i^{k-1} + 1)
                                + Σ_{s=k+1}^{j} p_s · (c_s in T_{k+1}^{j} + 1) }

  • the minimum is taken over all choices of root a_k
  • a_k is the root, so c_k = 1
  • "c_s in T_i^{k-1}" means the number of comparisons to find key a_s in the optimal T_i^{k-1}
  • the +1 terms appear because these c_s are depths within the sub-trees, one level below the root

The recurrence gives the following recursive algorithm:
  • pick a root
  • solve for the optimal BST on both sub-trees
  • choose the best root
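A sketch of that recursive algorithm is shown below (not from the slides). It expresses the cost of a candidate root through the simplified form of the recurrence derived on the next slide, C[i, k-1] + C[k+1, j] + Σ p_s; without memoization the recursion is exponential, which is exactly what the DP table fixes.

```python
# Direct sketch of the recursive algorithm: try every root a_k, solve both
# sub-trees optimally, keep the best choice. Exponential without memoization.

def obst_cost_naive(p, i, j):
    """Smallest average number of comparisons for keys a_i..a_j (1-based), p[0] unused."""
    if i > j:                      # empty sub-tree
        return 0.0
    best = float("inf")
    for k in range(i, j + 1):      # pick a root a_k
        cost = obst_cost_naive(p, i, k - 1) + obst_cost_naive(p, k + 1, j)
        best = min(best, cost)     # choose the best root
    return best + sum(p[i:j + 1])  # every key in a_i..a_j costs one extra comparison

p = [0.0, 0.1, 0.2, 0.4, 0.3]      # example probabilities for keys A, B, C, D
print(round(obst_cost_naive(p, 1, len(p) - 1), 2))   # 1.7
```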

Optimal BST Recurrence (continued)

    C[i, j] = min_{i ≤ k ≤ j} { p_k · 1
                                + Σ_{s=i}^{k-1} p_s · (c_s in T_i^{k-1} + 1)
                                + Σ_{s=k+1}^{j} p_s · (c_s in T_{k+1}^{j} + 1) }

            = min_{i ≤ k ≤ j} { Σ_{s=i}^{k-1} p_s · (c_s in T_i^{k-1}) + Σ_{s=i}^{k-1} p_s
                                + p_k
                                + Σ_{s=k+1}^{j} p_s · (c_s in T_{k+1}^{j}) + Σ_{s=k+1}^{j} p_s }

            = min_{i ≤ k ≤ j} { Σ_{s=i}^{k-1} p_s · (c_s in T_i^{k-1})
                                + Σ_{s=k+1}^{j} p_s · (c_s in T_{k+1}^{j}) } + Σ_{s=i}^{j} p_s

            = min_{i ≤ k ≤ j} { C[i, k-1] + C[k+1, j] } + Σ_{s=i}^{j} p_s
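Filling the tables bottom-up gives the dynamic programming algorithm. The sketch below is one way to implement the final recurrence in Python; it assumes the standard initial conditions C[i, i-1] = 0 and C[i, i] = p_i (the zero diagonal and the probability diagonal visible in the example tables on the next slide).

```python
# Sketch of the bottom-up algorithm implied by the final recurrence:
# C[i][j] = min over i <= k <= j of (C[i][k-1] + C[k+1][j]) + sum of p_i..p_j,
# filled diagonal by diagonal, with C[i][i-1] = 0 for empty sub-trees.

def optimal_bst(p):
    """p[1..n] are the key probabilities (p[0] unused). Returns (C, R) tables."""
    n = len(p) - 1
    C = [[0.0] * (n + 2) for _ in range(n + 2)]   # C[i][j], with C[i][i-1] = 0
    R = [[0] * (n + 2) for _ in range(n + 2)]     # R[i][j] = root of an optimal T_i^j
    for i in range(1, n + 1):
        C[i][i] = p[i]
        R[i][i] = i
    for d in range(1, n):                          # d = j - i, the diagonal index
        for i in range(1, n - d + 1):
            j = i + d
            best_cost, best_root = float("inf"), i
            for k in range(i, j + 1):              # try each root a_k
                cost = C[i][k - 1] + C[k + 1][j]
                if cost < best_cost:
                    best_cost, best_root = cost, k
            C[i][j] = best_cost + sum(p[i:j + 1])
            R[i][j] = best_root
    return C, R
```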

Example:

    key          A     B     C     D
    probability  0.1   0.2   0.4   0.3

Cost table C[i, j]:

         j=0    1     2     3     4
    i=1   0    0.1   0.4   1.1   1.7
      2         0    0.2   0.8   1.4
      3               0    0.4   1.0
      4                     0    0.3
      5                           0

Root table R[i, j]:

         j=1   2   3   4
    i=1   1    2   3   3
      2        2   3   3
      3            3   3
      4                4

Optimal BST (read off the tables): C is the root, its left sub-tree has root B with A as B's left child, and D is C's right child. The smallest average number of comparisons is C[1, 4] = 1.7.
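As a usage check (not part of the original notes), running the optimal_bst sketch from the previous recurrence slide on this example reproduces the tables, and the optimal tree can be read off the root table recursively.

```python
# Running the sketch above on the example (A: 0.1, B: 0.2, C: 0.4, D: 0.3)
# reproduces the tables: C[1][4] = 1.7 and R[1][4] = 3, i.e. key C is the root.

keys = ["", "A", "B", "C", "D"]            # 1-based, to match the tables
p = [0.0, 0.1, 0.2, 0.4, 0.3]
C, R = optimal_bst(p)
print(round(C[1][4], 2), keys[R[1][4]])    # 1.7 C

def print_tree(R, i, j, indent=0):
    """Pre-order walk of the optimal BST encoded in the root table."""
    if i > j:
        return
    k = R[i][j]
    print(" " * indent + keys[k])
    print_tree(R, i, k - 1, indent + 2)    # left sub-tree: keys a_i..a_{k-1}
    print_tree(R, k + 1, j, indent + 2)    # right sub-tree: keys a_{k+1}..a_j

print_tree(R, 1, 4)                        # C, B, A, D (pre-order)
```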

Analysis: DP for Optimal BST

Time efficiency: Θ(n³), not great, but better than exponential.

Space efficiency: Θ(n²).

The method can be extended to include unsuccessful searches.

Time can be reduced to Θ(n²) by taking advantage of monotonicity of entries in the root table, i.e., R[i, j] is always in the range between R[i, j-1] and R[i+1, j].
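The Θ(n²) improvement mentioned above is Knuth's observation. The sketch below (a variant of the earlier optimal_bst sketch, not from the slides) narrows the search for the best root to the window [R[i, j-1], R[i+1, j]]; along each diagonal these windows telescope, so the root search costs only linear time per diagonal. It also precomputes prefix sums of the probabilities so the Σ p_s term costs constant time per cell.

```python
# Sketch of the Theta(n^2) speed-up: since R[i][j-1] <= R[i][j] <= R[i+1][j],
# the candidate roots can be limited to that window instead of the whole range i..j.

def optimal_bst_fast(p):
    """Same tables as optimal_bst above, with the root search window narrowed."""
    n = len(p) - 1
    C = [[0.0] * (n + 2) for _ in range(n + 2)]
    R = [[0] * (n + 2) for _ in range(n + 2)]
    prefix = [0.0] * (n + 1)                       # prefix[j] = p_1 + ... + p_j
    for i in range(1, n + 1):
        C[i][i] = p[i]
        R[i][i] = i
        prefix[i] = prefix[i - 1] + p[i]
    for d in range(1, n):
        for i in range(1, n - d + 1):
            j = i + d
            best_cost, best_root = float("inf"), i
            for k in range(R[i][j - 1], R[i + 1][j] + 1):   # monotone window only
                cost = C[i][k - 1] + C[k + 1][j]
                if cost < best_cost:
                    best_cost, best_root = cost, k
            C[i][j] = best_cost + (prefix[j] - prefix[i - 1])
            R[i][j] = best_root
    return C, R
```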

Warshall's Algorithm