Tags: algorithm, recursion, dynamic-programming, memoization, knapsack-problem

How does DP help if there are no overlapping subproblems? [0/1 knapsack]


Consider the inputs below for a typical Knapsack problem.

V = [10,8,12]
W = [2,3,7]
i =  1,2,3
C = 10

I tried recursion with memoization to solve this sample, but found no overlapping subproblems.

Signature of the recursive procedure:

knapsack(int c, int i) 

Called initially as knapsack(10,1)
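For concreteness, the memoized recursion described here might look like the following Python sketch (the `knapsack(c, i)` signature and the sample data come from the question; the `memo` dictionary and everything else are assumptions):

```python
# A minimal sketch of the memoized recursion knapsack(c, i): the best value
# achievable with remaining capacity c using items i..n, where items are
# 1-indexed as in the question.
V = [10, 8, 12]   # values
W = [2, 3, 7]     # weights
C = 10            # knapsack capacity

memo = {}

def knapsack(c, i):
    if i > len(V):                      # no items left
        return 0
    if (c, i) in memo:                  # overlapping subproblem: reuse result
        return memo[(c, i)]
    best = knapsack(c, i + 1)           # option 1: skip item i
    if W[i - 1] <= c:                   # option 2: take item i, if it fits
        best = max(best, V[i - 1] + knapsack(c - W[i - 1], i + 1))
    memo[(c, i)] = best
    return best

print(knapsack(C, 1))  # 22 (take items 1 and 3: weight 2 + 7, value 10 + 12)
```

For this particular input, every `(c, i)` state is computed only once and the memo is never hit, which is exactly the observation the question makes.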


The solution method is as explained in https://www.youtube.com/watch?v=6h6Fi6AQiRM and https://www.youtube.com/watch?v=ocZMDMZwhCY.

How does dynamic programming help reduce the time complexity for such Knapsack instances? If it does not help here, is the worst-case complexity of the DP solution the same as that of a backtracking search, i.e. 2^n? (Ignoring pruning: if pruning is applied, the complexity drops for both solutions, and again DP would be no better than the non-memoized recursion.)

**Is overlapping of subproblems really missing in the above sample, or am I missing something?**


Solution

  • The 0/1 knapsack problem has a pseudo-polynomial-time solution. The running time of that solution is O(nW) where n is the number of items to choose from, and W is the weight that the knapsack can hold. The running time is described as pseudo polynomial because W is not a function of n.

    Or is it? Given a list of n items (by weight and value), there exists a maximum weight for any one item; call it k. (That maximum weight can be determined in O(n) time.) If W is greater than or equal to kn, then the problem is trivial: all the items fit in the knapsack. Therefore, we only need to consider the cases where W < kn. So for the purposes of complexity analysis, W can be expressed as a function of n.

    Given that nW <= k n^2, the running time of the algorithm is O(k n^2).

    Now, the pedants in the audience will argue (correctly) that this is still pseudo-polynomial time, since there's no defined relationship between k and n. But any concrete statement of the problem (which lists items by weight and value) must necessarily have an explicit constant value for k in the problem statement itself.

    Enough with the theory. How does this apply to the example in the question?

    The predicted DP running time is O(k n^2) = 7 × 3 × 3 = 63 steps, whereas the predicted exponential running time (without DP) is O(2^n) = 2^3 = 8 steps.

    And there's the problem with your example. Big-O analysis describes the asymptotic behavior of algorithms for large values of n. It tells you little to nothing about the performance for small values of n. If the value of n is small enough that 2^n < k n^2, then there's no expectation that DP will improve the running time of the algorithm, or have any effect at all.

    Your challenge is to find an example where 2^n > k n^2 and you still don't have overlapping subproblems.
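One way to see the effect concretely is to count how many times the plain (non-memoized) recursion reaches each (capacity, index) state. The Python sketch below is illustrative only; the `count_states` helper and the second data set are assumptions, not part of the question or the answer above:

```python
# Count how often each (capacity, index) state is reached by the plain
# recursion, to check whether subproblems actually overlap.
from collections import Counter

def count_states(V, W, C):
    hits = Counter()
    def rec(c, i):
        hits[(c, i)] += 1
        if i == len(V):                 # no items left (0-indexed here)
            return 0
        best = rec(c, i + 1)            # skip item i
        if W[i] <= c:                   # take item i, if it fits
            best = max(best, V[i] + rec(c - W[i], i + 1))
        return best
    rec(C, 0)
    return hits

# The question's sample: n = 3, distinct weights -- no state repeats.
hits = count_states([10, 8, 12], [2, 3, 7], 10)
print(max(hits.values()))  # 1: every subproblem is solved exactly once

# A larger sample with small, repeated weights: states collide heavily.
hits = count_states([1] * 12, [1] * 12, 6)
print(max(hits.values()))  # greater than 1: overlapping subproblems appear
```

With the question's three items, every reachable state is distinct, so memoization never pays off; with many items of similar small weights, the same states recur many times, and that is where the O(nW) table wins.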