The efficiency of quicksort is determined by adding the running time of the two recursive calls to the time spent in the partition step. The partition step of quicksort takes n - 1 comparisons, since every element other than the pivot is compared against it once. The efficiency of the recursive calls depends largely on how evenly the pivot value splits the array. In the average case, assume that the pivot splits the array into two roughly equal halves. As is common with divide-and-conquer sorts, the recursion then reaches a depth of about log(n) levels, and the partitioning work across each level totals O(n). Thus the overall quicksort algorithm has running time O(n log(n)).

The worst case occurs when the pivot value always ends up being one of the extreme values in the array. For example, this can happen on an already sorted array if the first value is selected as the pivot. In this case, the partitioning phase still requires n - 1 comparisons, as before, but quicksort does not achieve the O(log(n)) recursion depth of the dividing process. Instead of breaking an 8-element array into subarrays of size 4, 2, and 1 over three levels of recursion, each level reduces the size by only one: 7, then 6, then 5, and so on. The dividing process therefore becomes linear, and the worst-case efficiency is O(n²).

Note also that quicksort performs poorly on very small subarrays, because the overhead of the recursive calls outweighs the work of sorting them. This is often addressed by switching to a different sort, such as insertion sort, for data smaller than some magic number such as 25 or 30 elements.
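To make the discussion concrete, here is a minimal sketch of such a hybrid quicksort: the partition step compares every remaining element against the pivot (n - 1 comparisons), and ranges below a small cutoff fall back to insertion sort. The last-element (Lomuto-style) pivot choice, the helper names, and the cutoff of 25 are illustrative assumptions, not details taken from the text.

```python
CUTOFF = 25  # small-array threshold; 25-30 is typical, tune empirically


def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place; efficient for small ranges."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key


def partition(a, lo, hi):
    """Lomuto partition around a[hi]; performs hi - lo (i.e., n - 1) comparisons."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):          # each remaining element compared to the pivot once
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # place the pivot in its final position
    return i


def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 <= CUTOFF:        # small range: switch to insertion sort
        insertion_sort(a, lo, hi)
        return
    p = partition(a, lo, hi)
    quicksort(a, lo, p - 1)          # recurse on the two sides of the pivot
    quicksort(a, p + 1, hi)


if __name__ == "__main__":
    import random
    data = [random.randint(0, 999) for _ in range(200)]
    quicksort(data)
    assert data == sorted(data)
```

Because this sketch always picks the last element as the pivot, feeding it an already sorted array exercises exactly the worst case described above; randomizing the pivot or using a median-of-three choice is a common way to guard against that.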