This lecture introduces the 0-1 knapsack problem and its formal description, followed by a dynamic programming solution to this optimization problem. It covers the structure of an optimal solution, recursive definition of the value of an optimal solution, bottom-up computation, and construction of an optimal solution.
Lecture 13: The Knapsack Problem
Outline of this Lecture
Introduction to the 0-1 Knapsack Problem.
A dynamic programming solution to this problem.
0-1 Knapsack Problem

Informal Description: We have n computed data files
that we want to store, and we have W bytes of storage
available.

File i has size s_i bytes and takes t_i minutes to
recompute.

We want to avoid as much recomputing as possible,
so we want to find a subset S of files to store such that

The files have combined size at most W.

The total computing time of the stored files is as
large as possible.

We cannot store parts of files; it is the whole file or
nothing.

How should we select the files?
Recall: Divide-and-Conquer
Remark: If the subproblems are not independent, i.e.
subproblems share subsubproblems, then a divide-
and-conquer algorithm repeatedly solves the common
subsubproblems.
Thus, it does more work than necessary!
Question: Any better solution?
Yes: Dynamic Programming (DP)!
The Idea of Dynamic Programming
Dynamic programming is a method for solving
optimization problems.
The idea: Compute the solutions to the subsubproblems
once and store the solutions in a table, so that they
can be reused (repeatedly) later.
Remark: We trade space for time.
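As an illustration (not part of the lecture), the following Python sketch contrasts the plain divide-and-conquer recursion with a memoized version that stores each subproblem's solution in a table. The file sizes and recomputation times here are illustrative data, not taken from the lecture text.

```python
from functools import lru_cache

# Hypothetical instance: 4 files, 1-based indexing (index 0 unused).
s = [0, 5, 4, 6, 3]    # sizes in bytes
t = [0, 10, 40, 30, 50]  # recomputation times in minutes

calls = 0

def naive(i, w):
    """Plain divide-and-conquer: best time using files 1..i, limit w."""
    global calls
    calls += 1
    if i == 0:
        return 0
    if s[i] > w:
        return naive(i - 1, w)
    return max(naive(i - 1, w), naive(i - 1, w - s[i]) + t[i])

@lru_cache(maxsize=None)
def memo(i, w):
    """Same recursion, but each (i, w) subproblem is solved only once."""
    if i == 0:
        return 0
    if s[i] > w:
        return memo(i - 1, w)
    return max(memo(i - 1, w), memo(i - 1, w - s[i]) + t[i])

print(naive(4, 10), calls)  # same optimal value; 'calls' counts repeated work
print(memo(4, 10), memo.cache_info().currsize)  # table size = distinct subproblems
```

Both functions return the same optimal value; the memoized version spends extra space on the cache so that no subproblem is ever recomputed, which is exactly the space-for-time trade.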
The Idea of Developing a DP Algorithm

Step 1: Structure: Characterize the structure of an
optimal solution.

Step 2: Recursive definition: Recursively define the
value of an optimal solution in terms of solutions to
smaller problems.

Step 3: Bottom-up computation: Compute the value
of an optimal solution in a bottom-up fashion by
using a table structure.

Step 4: Construction of optimal solution: Construct
an optimal solution from computed information.

Steps 3 and 4 may often be combined.
Remarks on the Dynamic Programming Approach
Steps 1-3 form the basis of a dynamic-programming
solution to a problem.
Step 4 can be omitted if only the value of an opti-
mal solution is required.
Developing a DP Algorithm for Knapsack

Step 2: Recursively define the value of an optimal
solution in terms of solutions to smaller problems.

Let V[i, w] denote the maximum total computing time
of a subset of the files {1, 2, ..., i} with combined
size at most w.

Initial Settings: Set

V[0, w] = 0 for 0 <= w <= W (no item),
V[i, w] = -infinity for w < 0 (illegal).

Recursive Step: Use

V[i, w] = max( V[i-1, w], V[i-1, w - s_i] + t_i )

for 1 <= i <= n, 0 <= w <= W.
Correctness of the Method for Computing V[i, w]

Lemma: For 1 <= i <= n and 0 <= w <= W,

V[i, w] = max( V[i-1, w], V[i-1, w - s_i] + t_i ).

Proof: To compute V[i, w] we note that we have only
two choices for file i:

Leave file i: The best we can do with files
{1, 2, ..., i-1} and storage limit w is V[i-1, w].

Take file i (only possible if s_i <= w): Then we gain
t_i of computing time, but have spent s_i bytes of
our storage. The best we can do with the remaining
files {1, 2, ..., i-1} and storage w - s_i is
V[i-1, w - s_i].

Totally, we get t_i + V[i-1, w - s_i].

Note that if s_i > w, then V[i-1, w - s_i] = -infinity,
so the lemma is correct in any case.
Example of the Bottom-up Computation

Let W = 10 and

i   :  1   2   3   4
s_i :  5   4   6   3
t_i : 10  40  30  50

V[i, w]:

i\w   0   1   2   3   4   5   6   7   8   9  10
 0    0   0   0   0   0   0   0   0   0   0   0
 1    0   0   0   0   0  10  10  10  10  10  10
 2    0   0   0   0  40  40  40  40  40  50  50
 3    0   0   0   0  40  40  40  40  40  50  70
 4    0   0   0  50  50  50  50  90  90  90  90

Remarks:

The final output is V[4, 10] = 90.

The method described does not tell which subset gives the
optimal solution. (It is {2, 4} in this example.)
The Dynamic Programming Algorithm

KnapSack(s, t, n, W) {
    for (w = 0 to W)
        V[0, w] = 0;
    for (i = 1 to n)
        for (w = 0 to W)
            if (s[i] <= w)
                V[i, w] = max(V[i-1, w], V[i-1, w - s[i]] + t[i]);
            else
                V[i, w] = V[i-1, w];
    return V[n, W];
}

Time complexity: Clearly, O(nW).
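The pseudocode above can be transcribed almost line for line into Python. This is a sketch: 1-based arrays are emulated by padding index 0, and the concrete file sizes and times are reconstructed example data, so treat the numbers as illustrative.

```python
def knapsack(s, t, n, W):
    """Bottom-up 0-1 knapsack: V[i][w] = best total time using
    files 1..i with at most w bytes of storage."""
    V = [[0] * (W + 1) for _ in range(n + 1)]  # row 0: no items
    for i in range(1, n + 1):
        for w in range(W + 1):
            if s[i] <= w:
                V[i][w] = max(V[i - 1][w], V[i - 1][w - s[i]] + t[i])
            else:
                V[i][w] = V[i - 1][w]
    return V

# Example instance: W = 10, sizes (5, 4, 6, 3), times (10, 40, 30, 50).
s = [0, 5, 4, 6, 3]
t = [0, 10, 40, 30, 50]
V = knapsack(s, t, 4, 10)
print(V[4][10])  # 90, the final output
```

The two nested loops make the O(nW) running time visible: each of the n * (W + 1) table entries is filled in constant time.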
Constructing the Optimal Solution

Question: How do we use the values V[i, w] to
determine the subset S of items having the maximum
computing time?

Suppose that during the computation we record, in
keep[i, w], whether file i is taken in an optimal
solution for files {1, ..., i} and storage limit w.

If keep[i, w] is 1, then i is in S. We can now repeat
this argument for keep[i-1, w - s[i]].

If keep[i, w] is 0, then i is not in S and we repeat
the argument for keep[i-1, w].

Therefore, the following partial program will output the
elements of S:

K = W;
for (i = n downto 1)
    if (keep[i, K] == 1) {
        output i;
        K = K - s[i];
    }
The Complete Algorithm for the Knapsack Problem

KnapSack(s, t, n, W) {
    for (w = 0 to W)
        V[0, w] = 0;
    for (i = 1 to n)
        for (w = 0 to W)
            if ((s[i] <= w) and (t[i] + V[i-1, w - s[i]] > V[i-1, w])) {
                V[i, w] = t[i] + V[i-1, w - s[i]];
                keep[i, w] = 1;
            } else {
                V[i, w] = V[i-1, w];
                keep[i, w] = 0;
            }
    K = W;
    for (i = n downto 1)
        if (keep[i, K] == 1) {
            output i;
            K = K - s[i];
        }
    return V[n, W];
}
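A runnable Python version of the complete algorithm, filling keep[] alongside V and then tracing back to recover the chosen files. As before, the concrete file sizes and times are reconstructed example data.

```python
def knapsack_with_solution(s, t, n, W):
    """0-1 knapsack returning both the optimal value and the subset.
    keep[i][w] = 1 iff file i is taken in the optimal solution for (i, w)."""
    V = [[0] * (W + 1) for _ in range(n + 1)]
    keep = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(W + 1):
            if s[i] <= w and t[i] + V[i - 1][w - s[i]] > V[i - 1][w]:
                V[i][w] = t[i] + V[i - 1][w - s[i]]
                keep[i][w] = 1
            else:
                V[i][w] = V[i - 1][w]
    # Trace back through keep[] from (n, W) to output the chosen files.
    K = W
    chosen = []
    for i in range(n, 0, -1):
        if keep[i][K] == 1:
            chosen.append(i)
            K -= s[i]
    return V[n][W], chosen

s = [0, 5, 4, 6, 3]
t = [0, 10, 40, 30, 50]
value, files = knapsack_with_solution(s, t, 4, 10)
print(value, sorted(files))  # 90 [2, 4]
```

Note that the keep[] table costs the same O(nW) space as V itself, and the traceback adds only O(n) time, so the asymptotic bounds are unchanged.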