Today, I am writing my last post in this SLog, and it will be about sorting algorithms. The algorithms that I will be talking about are selection sort, quick sort, and merge sort. First, however, I will be talking about efficiency.
Efficiency
The efficiency of an algorithm is determined by how long the algorithm can be expected to run before completing, given the size of its input. We usually count the number of steps an algorithm takes to complete in the best case, the average case, and the worst case, and we classify algorithms into groups that grow at different speeds with increasing input size. A best case for a sorting algorithm is usually a list that is already sorted, so that nothing needs to be done. An average case is usually a randomly ordered list, and a worst case could be a list that is sorted backwards; in this case, the algorithm needs to go through the most steps possible in order to sort the list. The complexity of each group is denoted by $O(f(n))$, where $O(f(n))$ is a class of functions.

When we determine whether the runtime of an algorithm is within a certain class, we only look at the fastest growing part of the function. For example, given that a function such as $f(n) = n^2 + 3n + 1$ describes the worst case runtime of an algorithm, $f(n) \in O(n^2)$, because $n^2$ is the fastest growing portion of the function. Furthermore, saying that a function $g(n)$ is in $O(f(n))$ is to say that $g(n)$ will perform no worse than $cf(n)$, for some constant $c$, as the input size $n$ increases.
In this post, I will mostly be looking at the worst case runtimes of algorithms.
Now you may think that an algorithm with a worst case runtime of, say, $O(100)$ (which is constant time) will always be faster than an algorithm with a complexity of $O(n)$. However, this is not always the case. Let $f(n) = n$ (and therefore $f \in O(n)$), and let $g(n) = 100$ (and therefore $g \in O(1)$). If we look at these two functions on a graph, we see that until the algorithm with complexity $O(n)$ reaches the breakpoint, that is, when $n = 100$, the algorithm with a constant worst case runtime of 100 steps will have worse performance than the algorithm with complexity $O(n)$. Thus, we must choose the right algorithm for the job at hand. The algorithm that does poorly on large inputs may do the best on small inputs, and the algorithm that performs poorly on small inputs may scale better than other algorithms for large inputs.
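To make the breakpoint idea concrete, here is a small Python sketch (the step counts are just the $f(n) = n$ and $g(n) = 100$ functions from the example above, not measurements of real algorithms):

```python
# f(n) = n models a linear-time algorithm; g(n) = 100 models a
# constant-time algorithm that always takes 100 steps.
for n in [10, 50, 99, 100, 101, 200]:
    linear_steps = n
    constant_steps = 100
    if linear_steps < constant_steps:
        verdict = "linear algorithm wins"
    elif linear_steps == constant_steps:
        verdict = "breakpoint: both take the same number of steps"
    else:
        verdict = "constant algorithm wins"
    print(n, verdict)
```

For every input size below 100, the "slower-growing" linear algorithm actually finishes first; only past the breakpoint does the constant-time algorithm win.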
Now, I will move on to the sorting algorithms.
Selection Sort
Selection sort consists of scanning the list to find the smallest element and putting it as the first element in the list. Then, it scans the rest of the list to find the next smallest element, and so on until the list is sorted. Because this process happens $n$ times, and each scan looks at up to $n$ elements, the number of steps required to sort the list in any case is on the order of $n \times n$. Since it takes the same number of steps for any input, because it has to go through the same process whether the list is sorted or reversed, it has a complexity of $O(n^2)$.
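Here is a minimal sketch of selection sort in Python (the function name and the in-place, list-based approach are my own choices for illustration):

```python
def selection_sort(lst):
    """Sort lst in place by repeatedly selecting the smallest
    remaining element and swapping it to the front."""
    n = len(lst)
    for i in range(n):
        # Scan the unsorted portion (from index i onward)
        # for the position of the smallest element.
        smallest = i
        for j in range(i + 1, n):
            if lst[j] < lst[smallest]:
                smallest = j
        # Swap it into place; lst[:i + 1] is now sorted.
        lst[i], lst[smallest] = lst[smallest], lst[i]
```

The two nested loops over the list are where the $n \times n$ step count comes from.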
Here is an animation of the Selection Sort algorithm in action:

Selection Sort Animation from https://upload.wikimedia.org/wikipedia/commons/9/94/Selection-Sort-Animation.gif
Quick Sort
Quick sort chooses a pivot (it doesn't matter which element) and reorders the list based on each element's relation to the pivot. If an element is smaller than the pivot, it is placed before the pivot. If an element is larger than the pivot, it is placed after the pivot. Then, the algorithm is applied recursively to the two partitions that result from this operation. After every partitioning operation is performed, the pivot is in its final position.
This algorithm is called a quick sort because it is surprisingly quick. Chances are that the pivot you pick each time will fall somewhere in the middle of the list's values, which means that each partitioning step will divide the list roughly in two. Its worst case complexity is $O(n^2)$, but that case is rare, as the pivot would need to be the smallest or largest element in every partitioning operation (this has a probability of $\frac{2^{n-1}}{n!}$ for a randomly ordered list). Its average case and best case complexities are both $O(n \log n)$.
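Here is a minimal sketch of quick sort in Python (this version returns a new list and arbitrarily uses the first element as the pivot; an in-place version is also common):

```python
def quick_sort(lst):
    """Return a new list containing the elements of lst in sorted order."""
    if len(lst) <= 1:
        return lst  # a list of zero or one elements is already sorted
    pivot = lst[0]  # the choice of pivot is arbitrary
    # Partition the remaining elements around the pivot.
    smaller = [x for x in lst[1:] if x <= pivot]
    larger = [x for x in lst[1:] if x > pivot]
    # The pivot is now in its final position, between the two partitions.
    return quick_sort(smaller) + [pivot] + quick_sort(larger)
```

If the pivot happens to be the smallest or largest element every time, one partition is always empty, and the recursion degrades into the $O(n^2)$ worst case.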
Merge Sort
Merge sort is an algorithm in which the list is split into the smallest portions possible (single elements), and each element is then compared with the element adjacent to it and merged into a sorted pair. Then, the groups of sorted elements are merged with their adjacent groups, and so on. Finally, the list is rejoined, and the sort is complete.
This algorithm is actually one of the first sorts that I looked into. This sorting algorithm has a worst case, average case, and best case complexity of $O(n \log n)$. Merge sort works better than quick sort when data can only be accessed sequentially (that is, one element at a time, in sequence). It also does better than quick sort in the worst case: while their average case complexities are in the same class, $O(n \log n)$, merge sort stays in $O(n \log n)$ even in the worst case, whereas quick sort can fall into $O(n^2)$.
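Here is a minimal sketch of merge sort in Python (split into a helper that merges two sorted lists and the recursive sort itself; both names are my own):

```python
def merge(left, right):
    """Combine two already-sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One side is exhausted; the remainder of the other is already sorted.
    return result + left[i:] + right[j:]


def merge_sort(lst):
    """Return a new list containing the elements of lst in sorted order."""
    if len(lst) <= 1:
        return lst  # nothing to split or merge
    mid = len(lst) // 2
    # Sort each half, then merge the two sorted halves.
    return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))
```

The list is halved $\log n$ times, and each level of merging touches all $n$ elements, which is where the $O(n \log n)$ complexity comes from.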
Here is an animation of the Merge Sort in action:

Merge Sort Animation from https://upload.wikimedia.org/wikipedia/commons/c/cc/Merge-sort-example-300px.gif
The Future
Next week, the finals start. I hope that I’ll be ready for my exams, even though I certainly do not feel ready at the moment. CSC148 has been interesting and enjoyable, and I am looking forward to my next courses in the Computer Science program.
References
I got the data on how the algorithms work (and some of the analysis), the animations illustrating them, and the Big-O complexities for each algorithm from the following pages on Wikipedia, accessed on December 6, 2013 at around 11pm EST. The links are all permalinks to the revisions that I used as references.