
Implement Merge Sort in Python

The divide-and-conquer method is a powerful strategy for designing asymptotically efficient algorithms. In this post, we’ll explore applications of the divide-and-conquer method and acquire valuable mathematical tools that you can use to solve the recurrences that arise when analyzing divide-and-conquer algorithms.

In the divide-and-conquer method, if the problem is small enough (the base case), you just solve it directly without recursing. Otherwise (the recursive case), you perform three characteristic steps; a short generic Python sketch follows the list below:
Divide the problem into one or more subproblems that are smaller instances of the same problem.
Conquer the subproblems by solving them recursively.
Combine the subproblem solutions to form a solution to the original problem.
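
To make these steps concrete, here is a minimal generic sketch in Python. The names is_base_case, solve_directly, split, and combine are placeholders of my own, not library functions; a concrete algorithm such as merge sort fills them in with real logic.

def divide_and_conquer(problem):
    # Base case: small enough to solve directly, no recursion needed
    if is_base_case(problem):
        return solve_directly(problem)

    # Divide: break the problem into smaller instances of the same problem
    subproblems = split(problem)

    # Conquer: solve each subproblem recursively
    solutions = [divide_and_conquer(p) for p in subproblems]

    # Combine: assemble the subproblem solutions into the final answer
    return combine(solutions)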

Sorting Using the Divide-and-Conquer Method (Merge Sort)

The merge sort algorithm closely follows the divide-and-conquer method. At each step, it sorts a subarray A[p..r], starting with the entire array A[1..n] and recursing down to smaller and smaller subarrays.

Merge sort is a divide-and-conquer algorithm that divides the unsorted list into smaller sublists, sorts each sublist, and then merges them back together to produce a sorted list. Here’s a step-by-step explanation and a Python implementation of the merge sort algorithm:

Divide the subarray A[p..r] to be sorted into two adjacent subarrays, each of about half the size. To do so, compute the midpoint q of A[p..r] (taking the average of p and r), and divide A[p..r] into subarrays A[p..q] and A[q+1..r].
Conquer by sorting each of the two subarrays A[p..q] and A[q+1..r] recursively using merge sort.
Combine by merging the two sorted subarrays A[p..q] and A[q+1..r] back into A[p..r], producing the sorted answer.

Pseudocode

MergeSort(arr):
    if length of arr <= 1:
        return arr

    // Split the input array into two halves
    mid = length of arr // 2
    left_half = arr[0 to mid - 1]
    right_half = arr[mid to end]

    // Recursively sort both halves
    left_half = MergeSort(left_half)
    right_half = MergeSort(right_half)

    // Merge the sorted halves
    merged_array = Merge(left_half, right_half)

    return merged_array

Merge(left, right):
    result = empty array
    left_index = 0
    right_index = 0

    // Compare elements in the left and right arrays and merge them in sorted order
    while left_index < length of left and right_index < length of right:
        if left[left_index] <= right[right_index]:
            append left[left_index] to result
            increment left_index by 1
        else:
            append right[right_index] to result
            increment right_index by 1

    // Append any remaining elements from left and right
    append remaining elements in left to result
    append remaining elements in right to result

    return result

Implementation in Python

def merge_sort(a):
    # Only recurse when the list has more than one element;
    # lists of length 0 or 1 are already sorted (base case)
    if len(a) > 1:
        # Divide: split the list into two halves
        mid = len(a) // 2
        left = a[:mid]
        right = a[mid:]

        # Conquer: sort each half recursively (in place)
        merge_sort(left)
        merge_sort(right)

        # Combine: merge the two sorted halves back into a
        i = j = k = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:   # <= keeps the sort stable
                a[k] = left[i]
                i += 1
            else:
                a[k] = right[j]
                j += 1
            k += 1

        # Copy any elements remaining in left
        while i < len(left):
            a[k] = left[i]
            i += 1
            k += 1

        # Copy any elements remaining in right
        while j < len(right):
            a[k] = right[j]
            j += 1
            k += 1


a = [32, 4, 456, 56, 789, 8, 94, 2]
merge_sort(a)
print("Sorted Array is:", a)

Time Complexity

The time complexity of Merge Sort is O(n log n), where ‘n’ is the number of elements to be sorted. Merge Sort consistently performs in O(n log n) time regardless of the input data, which makes it a reliable choice for general-purpose sorting.

The O(n log n) time complexity can be explained as follows:

  • Divide: The array is repeatedly halved until the subarrays have size 1 (and are therefore already sorted). Because the size is cut in half at each step, the recursion tree has O(log n) levels.
  • Conquer and merge: At each level of the recursion tree, every element participates in exactly one merge, so the total merging work per level is O(n).
  • Total: O(n) work per level across O(log n) levels gives O(n log n) overall; the recurrence after this list makes the count precise.
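
The same bound falls out of the usual recurrence: sorting n elements costs two recursive sorts of n/2 elements plus linear work for the merge. Writing c for an assumed constant covering the per-element merge cost,

\[
T(n) = 2\,T\!\left(\tfrac{n}{2}\right) + cn, \qquad T(1) = c,
\]

and unrolling the recursion through its \(\log_2 n\) levels gives

\[
T(n) = cn\log_2 n + cn = O(n \log n).
\]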

Merge Sort’s time complexity makes it efficient for large datasets, and it’s a stable sorting algorithm.

Space Complexity

The space complexity of Merge Sort is O(n). The extra memory goes to the temporary sublists created during the merge step: the two halves copied at each call together hold as many elements as the subarray being sorted, so the auxiliary memory required is proportional to the size of the input.

It’s worth noting that Merge Sort is not an in-place algorithm in the strict sense: although the implementation above writes the result back into the original list, it still copies each half into temporary lists, which can be a drawback in situations with limited memory.
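
To make the O(n) bound concrete, here is a sketch of a common variant (the names merge_sort_buffered, _sort, and _merge are illustrative, not from this post) that allocates a single auxiliary buffer once and reuses it for every merge, so the extra memory is one list of n elements plus the O(log n) recursion stack:

def merge_sort_buffered(a):
    # One auxiliary buffer of n elements, allocated once and reused by every merge
    aux = a[:]
    _sort(a, aux, 0, len(a) - 1)

def _sort(a, aux, lo, hi):
    if lo >= hi:                      # zero or one element: already sorted
        return
    mid = (lo + hi) // 2
    _sort(a, aux, lo, mid)            # sort the left half
    _sort(a, aux, mid + 1, hi)        # sort the right half
    _merge(a, aux, lo, mid, hi)       # merge the two sorted halves

def _merge(a, aux, lo, mid, hi):
    # Copy the current range into the buffer, then merge it back into a
    for k in range(lo, hi + 1):
        aux[k] = a[k]
    i, j = lo, mid + 1
    for k in range(lo, hi + 1):
        if i > mid:                   # left half exhausted
            a[k] = aux[j]; j += 1
        elif j > hi:                  # right half exhausted
            a[k] = aux[i]; i += 1
        elif aux[i] <= aux[j]:        # <= keeps the sort stable
            a[k] = aux[i]; i += 1
        else:
            a[k] = aux[j]; j += 1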

In summary, Merge Sort has a time complexity of O(n log n), making it an efficient choice for sorting large datasets, and a space complexity of O(n), which can be a consideration when working with memory-constrained systems.

