Selection Sort
Selection sort, like bubble sort, is a comparison-based, in-place sorting algorithm. It is simple, and it has the notable benefit of performing very few swaps: at most (n – 1) swaps on a list of size n, among the fewest of the simple sorting algorithms. However, its running time is quadratic, making it unsuitable for long lists.
Selection sort is known for its simplicity, and it outperforms more complex algorithms in some cases, particularly when auxiliary memory is limited.
The array is split into two sections, one sorted and one unsorted. The sorted section is initially empty, whereas the unsorted section contains the whole array. At every step, the algorithm finds the minimal element in the unsorted section and appends it to the end of the sorted one. The process halts when the unsorted section becomes empty.
In each iteration i, selection sort identifies the smallest element in the unsorted portion of the list and swaps it with the i-th element. After the k-th pass, the first k items are sorted. Sorting begins from the beginning of the list: at the end of the first pass, the minimum element sits in the first position; after the second pass, the second-smallest element sits in the second position, and so on.
Example
Let us sort the letters of the word "DESIGN" into alphabetical order using selection sort.
The following figure shows the simulation of sorting the characters of "DESIGN".
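In case the figure does not render here, the passes over "DESIGN" can be traced with a short Python sketch (the helper name is illustrative, not from the original):

```python
def selection_sort_trace(s):
    """Trace selection sort over the characters of s, printing each pass."""
    a = list(s)
    for i in range(len(a) - 1):
        # Index of the smallest character in the unsorted part a[i:]
        m = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[m] = a[m], a[i]
        print(f"Pass {i + 1}: {''.join(a)}")
    return "".join(a)

selection_sort_trace("DESIGN")  # final order: DEGINS
```

The first two passes leave the list unchanged (D and E are already in place); pass 3 swaps S with G, and pass 5 swaps S with N, giving "DEGINS".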
Another simulation is shown here for clarification:
Pass | Data sequence (before the pass) | Comparisons |
1 | 74, 25, 16, 22, 10 | n – 1 = 4 |
2 | 10, 25, 16, 22, 74 | n – 2 = 3 |
3 | 10, 16, 25, 22, 74 | 2 |
4 | 10, 16, 22, 25, 74 | 1 |
– | 10, 16, 22, 25, 74 (sorted) | – |
Algorithm of Selection Sort
The algorithm for selection sort is shown below:
Algorithm SELECTION_SORT(A)
// A is an array of size n
for i ← 1 to n – 1 do
    min ← i
    for j ← i + 1 to n do
        if A[j] < A[min] then
            min ← j
        end
    end
    swap(A[i], A[min])
end
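The pseudocode above translates directly into Python (a sketch using 0-based indices, since the pseudocode counts from 1):

```python
def selection_sort(a):
    """Sort the list a in place using selection sort and return it."""
    n = len(a)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted part a[i:]
        min_idx = i
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        # One swap per pass moves the minimum into position i
        a[i], a[min_idx] = a[min_idx], a[i]
    return a

print(selection_sort([74, 25, 16, 22, 10]))  # → [10, 16, 22, 25, 74]
```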
Complexity Analysis
- Similar to bubble sort, selection sort does the same number of comparisons on every input: it iterates both loops irrespective of the input data pattern.
- Unlike bubble sort, selection sort cannot detect an already-sorted sequence, so its running time in the best, average, and worst case is O(n²). We could reach this conclusion by summing the number of comparisons, just like for bubble sort, but here we use a recurrence equation to derive the complexity.
- Let T(n) denote the running time to solve a problem of size n. In selection sort, each iteration places one element in its final position, so the problem size reduces by one.
- For a problem of size n, the inner loop iterates n times, so the recurrence equation for selection sort can be written as:
T(n) = T(n – 1) + n
Substituting n – 1 for n in the above equation: T(n – 1) = T(n – 2) + (n – 1)
Putting this value of T(n – 1) into the equation for T(n):
T(n) = T(n – 2) + (n – 1) + n
Substituting n – 2 for n: T(n – 2) = T(n – 3) + (n – 2)
Putting this value of T(n – 2) into the previous equation:
T(n) = T(n – 3) + (n – 2) + (n – 1) + n
In general, T(n – k) = T(n – k – 1) + (n – k), so after k substitutions,
T(n) = T(n – k) + (n – k + 1) + (n – k + 2) + … + (n – 1) + n
Setting k = n,
T(n) = T(0) + 1 + 2 + 3 + … + (n – 1) + n
T(0) = 0, because a problem of size zero requires no work to solve.
T(n) = 1 + 2 + 3 + … + n = n(n + 1)/2 = n²/2 + n/2
T(n) = O(n²/2 + n/2) = O(n²)
Selection sort cannot detect whether the list is already sorted, so even for a sorted list it performs the same comparisons. It is a non-adaptive algorithm and has the same running time in all cases.
Best case | Average case | Worst case |
O(n²) | O(n²) | O(n²) |
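This non-adaptive behaviour can be checked empirically: counting the comparisons the inner loop actually makes gives (n – 1) + (n – 2) + … + 1 = n(n – 1)/2 for every input of size n, sorted or not. A small sketch (the counter instrumentation is added here for illustration):

```python
def count_comparisons(a):
    """Run selection sort on a copy of a and return the comparison count."""
    a = list(a)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            comparisons += 1  # every inner-loop step is one comparison
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return comparisons

# Same count for unsorted and already-sorted input: 5 * 4 / 2 = 10
print(count_comparisons([74, 25, 16, 22, 10]))  # → 10
print(count_comparisons([10, 16, 22, 25, 74]))  # → 10
```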
Simulation
The following picture is a simulation of selection sort. Green cells have already been sorted, white cells hold unsorted data, the yellow cell marks the minimal element found so far in the unsorted part, and the blue cell marks the element currently being compared in the search for the smallest element. The simulation clearly shows how selection sort works.
Image Source: https://gfycat.com/snappymasculineamericancicada [Link]
Properties of Selection Sort
- In place – O(1) extra space: an advantage over other algorithms when auxiliary memory is limited
- Not adaptive
- Not stable, but it can be implemented as a stable sort if, rather than swapping in step 2, the minimum value is inserted into the first position (that is, all intervening items are moved down). This requires a data structure that supports efficient insertion, such as a linked list; otherwise it performs O(n²) writes.
- O(n²) comparisons
- O(n) swaps
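The stable variant described above can be sketched in Python. Note that `pop`/`insert` on a Python list shift the intervening items one by one, which is exactly the O(n²)-writes cost mentioned; a linked list would make each insertion cheap after the scan. The `key` parameter is added here for illustration so that stability is observable:

```python
def stable_selection_sort(a, key=lambda x: x):
    """Stable selection sort: insert the minimum instead of swapping."""
    n = len(a)
    for i in range(n - 1):
        # Strict < keeps the leftmost of equal keys, preserving stability
        m = i
        for j in range(i + 1, n):
            if key(a[j]) < key(a[m]):
                m = j
        # Remove the minimum and insert it at position i,
        # shifting the items in between down by one
        a.insert(i, a.pop(m))
    return a

pairs = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]
stable_selection_sort(pairs, key=lambda p: p[0])
# equal keys keep their original order: (1,'b') stays before (1,'d')
```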
Discussion and Comparison
Selection sort is a good approach for sorting large objects (records) with small keys, since it moves little data. Due to its O(n²) complexity, it is inefficient for big lists and typically performs worse than insertion sort.
Selection sort performs O(n) swaps, whereas insertion sort does O(n²) element moves in the worst case. Among the simple average-case O(n²) algorithms, it nearly always beats bubble sort and gnome sort.
Bingo Sort: a variant of selection sort that orders items by first finding the least value, then repeatedly moving all items with that value to their final locations and finding the least value for the next pass. This is more efficient than selection sort when there are many duplicate values:
- Selection sort does one pass for each item
- Bingo sort does one pass for each distinct value
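One simple way to sketch bingo sort in Python (formulations vary; this version re-scans the tail for each distinct value, which matches the "one pass per value" idea):

```python
def bingo_sort(a):
    """Sort a in place by moving every copy of the least value per pass."""
    n = len(a)
    start = 0
    while start < n:
        # Least value in the unsorted tail a[start:]
        least = min(a[start:])
        # Sweep the tail, moving every item equal to `least` to its front
        i = start
        for j in range(start, n):
            if a[j] == least:
                a[i], a[j] = a[j], a[i]
                i += 1
        start = i  # all copies of `least` are now in final position
    return a

print(bingo_sort([3, 1, 2, 1, 3]))  # → [1, 1, 2, 3, 3]
```

With many duplicates the outer loop runs once per distinct value rather than once per item, which is where the saving over plain selection sort comes from.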
Some Popular Sorting Algorithms
The following are a few popular sorting algorithms used in computer science: