Merge sort with O(sqrt(n)) auxiliary memory complexity (and even less). This text describes an algorithm that merges two sorted arrays into one sorted array. Its time complexity is O(n), its auxiliary memory complexity is O(sqrt(n)), and the merge is stable.
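The O(sqrt(n))-memory merge itself is intricate; as a point of reference, here is the standard stable merge it improves on, which uses O(n) auxiliary memory. This is a minimal sketch in Python; the function name `merge` is mine, not from the text.

```python
def merge(a, b):
    """Stable merge of two sorted lists, using O(n) extra memory."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        # `<=` takes from `a` on ties, preserving stability
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    # at most one of these extends with anything
    out.extend(a[i:])
    out.extend(b[j:])
    return out
```

The sqrt(n)-memory version achieves the same result by merging in blocks of size about sqrt(n), using one block's worth of scratch space instead of a whole second array.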
I cannot follow this algorithm line by line, as I struggle to visualise the recursion, but I do have a basic understanding of how merge sort works; here is an MS-Paint diagram of me solving a simple 4-element merge sort and totalling the comparisons. How does the algorithm work when there is an odd number of elements?
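On the odd-length question: the usual answer is that the split simply produces halves of size n//2 and n - n//2 (for example, 5 elements split into 2 and 3), and the merge step works fine on unequal halves. A minimal sketch, assuming Python; the helper name `split` is mine:

```python
def split(arr):
    """Split a list into two halves; with an odd length,
    the extra element goes to the right half."""
    mid = len(arr) // 2
    return arr[:mid], arr[mid:]
```

So a 5-element array recurses on a 2-element half and a 3-element half, and the recursion bottoms out at 1-element subarrays either way.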
As another example, take merge sort, a more efficient comparison-based sort. The trick is to remember that merge sort is a divide-and-conquer algorithm: the divide step splits the array in half. Quick sort is another efficient comparison-based sort.
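Quick sort follows the same divide-and-conquer pattern, but its divide step is a partition around a pivot rather than a split in half. A minimal sketch in Python (this out-of-place version is for clarity; production quicksorts partition in place):

```python
import random

def quicksort(arr):
    """Divide-and-conquer: partition around a random pivot, recurse."""
    if len(arr) <= 1:          # base case: already sorted
        return arr[:]
    pivot = random.choice(arr)
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

Here the "combine" step is trivial (concatenation), whereas in merge sort the divide step is trivial and the work happens in the combine (merge) step.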
Bottom-up natural merge sort is a variation of merge sort that can sort linked lists efficiently. Let's implement this algorithm using a queue.
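A sketch of that idea, assuming Python (a `deque` stands in for the queue, and lists stand in for linked-list runs; the function names are mine): scan the input once to enqueue each maximal non-decreasing run, then repeatedly merge the two runs at the front of the queue and enqueue the result until one run remains.

```python
from collections import deque

def merge(a, b):
    """Stable merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def natural_merge_sort(items):
    """Bottom-up natural merge sort driven by a queue of runs."""
    if not items:
        return []
    # 1. Enqueue each maximal non-decreasing run ("natural" runs).
    q, run = deque(), [items[0]]
    for x in items[1:]:
        if x >= run[-1]:
            run.append(x)
        else:
            q.append(run)
            run = [x]
    q.append(run)
    # 2. Merge the two front runs and re-enqueue until one run remains.
    while len(q) > 1:
        q.append(merge(q.popleft(), q.popleft()))
    return q[0]
```

Because it starts from existing runs rather than single elements, an already-sorted input is detected in one pass, giving O(n) best-case time.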
The problem that merge sort solves is general sorting: given an unordered array of elements that have a total ordering, produce an array containing the same elements in sorted order.
The key to an O(n log n) sort algorithm is the O(n) processing time combined with two recursive calls on lists of approximately n/2 elements. Quick sort with a random pivot has some uncertainty in its run time, but by the same style of analysis we can still argue an O(n log n) expected run time.
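That argument can be written as a recurrence (a standard derivation, filled in here since the text only alludes to it): O(n) work per level plus two half-size calls give

```latex
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + cn
\quad\Longrightarrow\quad
T(n) = cn\log_2 n + n\,T(1) = O(n \log n),
```

since unrolling the recurrence yields about \(\log_2 n\) levels, each costing \(cn\). For quick sort with a random pivot the split sizes vary, but the expected depth is still \(O(\log n)\) with \(O(n)\) work per level, so \(\mathbb{E}[T(n)] = O(n \log n)\).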
The merge sort algorithm is a recursive algorithm that divides the input array into smaller subarrays and then merges these subarrays back together in sorted order.
Merge sort is a divide and conquer algorithm, so the aim is to break the problem into smaller subproblems until we reach the simplest version of the problem. When a subarray has only one item, it is already sorted, and therefore solved; after that, we only need to merge.
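The description above maps directly onto code. A minimal sketch in Python (names are mine, not from the text): the base case is the one-item subarray, and all the real work happens in the merge.

```python
def merge_sort(arr):
    """Recursive merge sort: split in half, sort each half, merge (stable)."""
    if len(arr) <= 1:            # one item (or empty): already sorted
        return arr[:]
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # conquer the left half
    right = merge_sort(arr[mid:])  # conquer the right half
    # combine: merge the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:    # `<=` preserves stability
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

For example, `merge_sort([5, 2, 4, 1])` splits into `[5, 2]` and `[4, 1]`, each of which splits into one-item subarrays, and the sorted result is built entirely by merges on the way back up.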
One thing this article doesn't point out is that merge sort may have a better worst case than Quicksort, but it is not better on average. Especially if you're using a randomized Quicksort, your chance of hitting the worst case is essentially nil.
ANOTHER NOTE: Last year, OCR presented an algorithm based on bubble sort and mentioned binary search. The only question on merge/insertion sort was "Name two other sorting algorithms," and linear search was not mentioned at all. So it is quite likely that linear search will appear this year, and unlikely that bubble sort will appear again.