What is the Master Theorem in the context of divide-and-conquer algorithms?
+
It provides a straightforward method to determine the asymptotic growth rate of recurrence relations that arise from divide-and-conquer algorithms.
In what form can a recurrence relation be expressed for the Master Theorem to apply?
+
The recurrence must be of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is an asymptotically positive function.
What are the three cases defined by the Master Theorem?
+
Case 1: If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)). Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n). Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and the regularity condition af(n/b) ≤ cf(n) holds for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
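The case analysis above can be sketched as a small helper (a hypothetical function, not from any standard library); it assumes f(n) is a plain polynomial n^d, so the Case 3 regularity condition holds automatically:

```python
import math

def master_theorem_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the Master Theorem.

    Hypothetical sketch: only handles f(n) = n^d, for which the
    regularity condition of Case 3 is automatic.
    """
    crit = math.log(a, b)  # critical exponent log_b(a)
    if d < crit:
        return f"Case 1: Theta(n^{crit:.3g})"
    elif d == crit:  # beware float equality for non-exact logs
        return f"Case 2: Theta(n^{d:.3g} * log n)"
    else:
        return f"Case 3: Theta(n^{d:.3g})"
```

For example, `master_theorem_case(2, 2, 1)` (merge sort) reports Case 2, while `master_theorem_case(7, 2, 2)` (Strassen's matrix multiplication) reports Case 1.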
How does the Master Theorem help in analyzing algorithmic complexity?
+
It allows quick determination of the running time of many recursive algorithms without expanding the full recursion tree.
Can you give an example where the Master Theorem gives a tight bound?
+
For T(n) = 2T(n/2) + n (e.g., merge sort), log_b a = log_2 2 = 1, so f(n) = n = Θ(n^1) matches Case 2, yielding T(n) = Θ(n log n).
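This bound can be sanity-checked numerically (a sketch assuming the base case T(1) = 1 and n restricted to powers of two, so n/2 stays integral): evaluating the recurrence exactly, the ratio T(n)/(n log2 n) settles toward a constant.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Exact cost of T(n) = 2*T(n/2) + n with T(1) = 1,
    # for n a power of two.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

for k in (4, 8, 12, 16):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n)))  # ratio levels off near 1
```

Closed form for this base case: T(2^k) = 2^k (k + 1) = n log2 n + n, so the ratio is 1 + 1/log2 n, consistent with Θ(n log n).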
Are there limitations to when the Master Theorem can be applied?
+
Yes; it does not apply when f(n) and n^(log_b a) differ by less than a polynomial factor (e.g., T(n) = 2T(n/2) + n/log n), or when the recurrence lacks a uniform division into equal-size subproblems (e.g., T(n) = T(n/3) + T(2n/3) + n).
Why is the work done at each level of recursion important in the proof of the Master Theorem?
+
Because the proof sums the work f(n) performed across the Θ(log_b n) levels of the recursion tree and compares it to the cost of the leaves; whether this per-level work shrinks, stays constant, or grows with depth determines which of the three cases dominates the total cost.
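For instance, in the recursion tree of T(n) = 2T(n/2) + n, at depth i there are 2^i subproblems of size n/2^i, so every level contributes the same total work n. A minimal per-level tally (assuming n is a power of two):

```python
# Tally the total f-work at each depth of the recursion tree
# for T(n) = 2*T(n/2) + n, with n = 1024 (a power of two).
n = 1024
level_costs = []
size, count = n, 1
while size >= 1:
    level_costs.append(count * size)  # subproblem count * cost each
    count *= 2
    size //= 2
print(level_costs)  # every level contributes exactly n
```

With log2(1024) + 1 = 11 levels each contributing n, the total is n(log2 n + 1), matching the Θ(n log n) bound of Case 2.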
What is a common mistake students make when applying the Master Theorem?
+
Misidentifying which case applies due to overlooking the regularity condition in Case 3 or miscalculating log_b a.