Why is the Big O of merge sort not 2^logn?

Even after hearing about, learning, and programming merge sort several times in the past, I suddenly found myself caught in a logical knot again today. I know that the Big O of merge sort is n*logn, but

  • since there is at least 1 comparison at the last step, and
  • 2 comparisons at the second-to-last step, and
  • 4 comparisons at the step before that, and so on…
  • isn’t 1 + 2 + 4 + … + n/2 = (1 – 2^logn)/(1 – 2) = 2^logn – 1 ≈ 2^logn?
  • i.e. it seems that merge sort is O(2^logn) (the sketch right after this list tallies that series).
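
To make that accounting concrete, here is a tiny Python sketch of my own (with n arbitrarily fixed at a power of two) that tallies exactly the series above, charging each level's comparison figure only once:

    # Tally the (faulty) series 1 + 2 + 4 + ... + n/2, counting each
    # level's comparison figure only once. n is assumed to be a power of two.
    n = 1 << 10                        # n = 1024, so logn = 10
    levels = n.bit_length() - 1        # logn
    flawed_total = sum(2 ** k for k in range(levels))
    print(flawed_total)                # 1023, i.e. 2^logn - 1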

Here’s how I untangled it by writing down a breakdown of all the steps involved.

  1. Let’s say values is an int array of size n.
  2. During merge sort we break it into two halves of size n/2 and
  3. continue to perform the same operation on each half until we get arrays of size 1.
  4. At this point we merge two arrays of size 1 by comparing their elements, and we get an array of size 2.
  5. This 1-comparison merge is required n/2 times,
  6. because we have recursively broken n down into n/2 arrays of size 2.
  7. From the above, producing n/2 sorted arrays of size 2 takes n/2 comparisons and n copy operations in the worst case, i.e. n/2 + n operations in total.
  8. Taking this further, the next level takes (n/4)*2 + n operations, since each of the n/4 merges requires 2 comparisons (plus n copies), and this continues.
  9. Given the above, the total number of steps = (n/2 + n) + ((n/4)*2 + n) + …
  10. i.e. n*logn + (n/2)*logn
  11. i.e. (3n/2)*logn (see the instrumented sketch after this list).
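
To put actual numbers on that accounting, here is a minimal sketch of my own (an instrumented top-down merge sort in Python) that tallies comparisons and element copies on a real run. For n a power of two the copy count comes out at exactly n*logn, and the comparison count lands between (n/2)*logn and n*logn, which matches the level-by-level bookkeeping above.

    import math
    import random

    def merge_sort(values, counts):
        # Top-down merge sort that tallies comparisons and element copies.
        if len(values) <= 1:
            return values
        mid = len(values) // 2
        left = merge_sort(values[:mid], counts)
        right = merge_sort(values[mid:], counts)
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            counts["comparisons"] += 1     # one comparison per element placed here
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
            counts["copies"] += 1
        merged.extend(left[i:])            # leftovers are copied without comparisons
        merged.extend(right[j:])
        counts["copies"] += (len(left) - i) + (len(right) - j)
        return merged

    n = 1 << 10                            # n = 1024, logn = 10
    logn = int(math.log2(n))
    counts = {"comparisons": 0, "copies": 0}
    merge_sort(random.sample(range(n), n), counts)
    print("comparisons:", counts["comparisons"])   # between (n/2)*logn and n*logn
    print("copies:     ", counts["copies"])        # exactly n*logn = 10240
    print("(n/2)*logn =", n // 2 * logn, " n*logn =", n * logn)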

QED, and my mind is restored. The logical mistake I committed was assuming that the 1-comparison merge was required only once, the 2-comparison merge only once, and so on, when in fact

  1. the 1-comparison merge was required n/2 times,
  2. the 2-comparison merge was required n/4 times, and so on, until
  3. the n/2-comparison merge, which was required only once,
  4. i.e. n/2 + n/2 + … = (n/2) * logn comparisons in total (tallied level by level in the sketch below).
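
The same bookkeeping can be checked level by level with a short sketch of my own (again assuming n is a power of two and charging each merge at level k only the minimum 2^k comparisons): every level contributes n/2 comparisons, and there are logn levels.

    # At level k (k = 0 merges size-1 arrays) there are n / 2^(k+1) merges,
    # each charged 2^k comparisons, so every level costs n/2.
    import math

    n = 1 << 10
    logn = int(math.log2(n))
    total = sum((n // 2 ** (k + 1)) * 2 ** k for k in range(logn))
    print(total, (n // 2) * logn)      # both print 5120, i.e. (n/2)*logn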