Chapter 52: DSA Selection Sort Time Complexity
Let’s go through this topic very carefully and honestly, like I’m sitting next to you with a notebook, writing down every number, counting every comparison together, and showing you exactly why the time complexity is what it is (and why it behaves the same in almost every case).
1. First — Quick reminder of how Selection Sort works
Selection Sort divides the array into two parts:
- Sorted part (initially empty, grows from left)
- Unsorted part (initially the whole array)
In every pass:
- Find the minimum element in the unsorted part
- Swap it with the first element of the unsorted part
- Now the sorted part grows by one element
Very small example (n = 5):
Array: 64 25 12 22 11
Pass 1: find the min in the whole array = 11, swap with position 0 → 11 25 12 22 64 (4 comparisons)
Pass 2: find the min in the remaining [25, 12, 22, 64] = 12, swap with position 1 → 11 12 25 22 64 (3 comparisons)
Pass 3: min in [25, 22, 64] = 22, swap → 11 12 22 25 64 (2 comparisons)
Pass 4: min in [25, 64] = 25 (already in place), no swap (1 comparison)
Total comparisons = 4 + 3 + 2 + 1 = 10
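If you prefer to see this in code, here is a minimal Python sketch of the same walkthrough. The comparison counter and the per-pass printout are only for illustration; they are not part of the algorithm itself.

```python
def selection_sort(arr):
    """Sort arr in place and return the number of comparisons made."""
    n = len(arr)
    comparisons = 0
    for i in range(n - 1):                # passes 0 .. n-2
        min_index = i
        for j in range(i + 1, n):         # scan the unsorted part
            comparisons += 1
            if arr[j] < arr[min_index]:
                min_index = j             # remember where the new minimum is
        if min_index != i:                # skip the swap if already in place
            arr[i], arr[min_index] = arr[min_index], arr[i]
        print(f"Pass {i + 1}: {arr}")
    return comparisons

data = [64, 25, 12, 22, 11]
print("Total comparisons:", selection_sort(data))   # 4 + 3 + 2 + 1 = 10
```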
2. Time Complexity – All Three Cases
Selection Sort is unusual because the time complexity is almost identical in best, average, and worst case.
| Case | When does it happen? | Number of comparisons | Number of swaps | Time Complexity | Important note |
|---|---|---|---|---|---|
| Best Case | Array already sorted | n(n-1)/2 | 0 (if self-swaps are skipped) | O(n²) | Still does full scans |
| Average Case | Random order | n(n-1)/2 | at most n-1 (typically close to it) | O(n²) | Same number of comparisons |
| Worst Case | Array reverse sorted | n(n-1)/2 | at most n-1 | O(n²) | Same comparisons; swaps stay O(n) |
3. Why is it always O(n²)? (the most important point)
In Selection Sort:
- The number of comparisons is completely fixed — it does not depend on the arrangement of elements
- In every pass i (from 0 to n-2), we always perform (n - i - 1) comparisons to find the minimum of the unsorted part
So total comparisons:
(n-1) + (n-2) + (n-3) + … + 1 = n(n-1)/2
This number is exactly the same whether the array is:
- Already sorted
- Random
- Reverse sorted
Unlike Bubble Sort or Insertion Sort, Selection Sort never gets faster on sorted or nearly sorted data — it always does the full amount of work.
That’s why its best case is still O(n²).
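A quick experiment makes this concrete. The sketch below (count_comparisons is just an illustrative helper name for this chapter) runs the same counting loop on sorted, random, and reverse-sorted inputs of the same length and prints the same number, n(n-1)/2, every time.

```python
import random

def count_comparisons(arr):
    """Selection sort that only reports how many comparisons it makes."""
    a, n, comparisons = list(arr), len(arr), 0
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            comparisons += 1
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]
    return comparisons

n = 1000
inputs = {
    "already sorted": list(range(n)),
    "random order":   random.sample(range(n), n),
    "reverse sorted": list(range(n, 0, -1)),
}
for name, arr in inputs.items():
    print(name, count_comparisons(arr))   # all print 499500 = n*(n-1)//2
```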
4. Number of swaps – different in different cases
Comparisons are always ~n²/2, but swaps vary:
- Best case (already sorted): → the minimum is always already in the correct position → 0 swaps (n-1 trivial self-swaps if the implementation swaps unconditionally)
- Worst case: → at most one swap per pass → at most n-1 swaps (a reverse-sorted array actually needs only about n/2, because the later passes find the minimum already in place)
- Average case (random order): → typically close to n-1 swaps, since the minimum of the unsorted part is rarely already at its front
But since a swap is a constant-time operation (just 3 assignments), it does not change the Big-O time complexity.
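To see the swap count move while the comparison count stays fixed, here is a small illustrative sketch (count_swaps is a made-up helper for this example) that counts only genuine swaps, skipping the case where the minimum is already in place.

```python
import random

def count_swaps(arr):
    """Selection sort that counts only genuine swaps (self-swaps skipped)."""
    a, n, swaps = list(arr), len(arr), 0
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:
            a[i], a[min_index] = a[min_index], a[i]
            swaps += 1
    return swaps

n = 1000
print("sorted: ", count_swaps(list(range(n))))               # 0 swaps
print("random: ", count_swaps(random.sample(range(n), n)))   # close to n-1
print("reverse:", count_swaps(list(range(n, 0, -1))))        # about n/2
```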
5. Real numbers – how slow it actually gets
| n | Comparisons (n(n-1)/2) | Swaps (at most n-1) | Real-world feeling (modern computer) |
|---|---|---|---|
| 10 | 45 | 9 | instant |
| 100 | 4,950 | 99 | instant |
| 1,000 | ~500,000 | 999 | still very fast (< 0.01 sec) |
| 10,000 | ~50 million | 9,999 | ~0.01–0.1 sec |
| 100,000 | ~5 billion | ~100,000 | usually too slow (5–50 seconds) |
| 1,000,000 | ~500 billion | ~1 million | practically unusable |
Conclusion:
- For n ≤ 10,000 → Selection Sort is usually acceptable
- For n ≥ 100,000 → it becomes too slow in most competitive programming / interview constraints
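If you want to feel the quadratic growth yourself, a rough timing sketch like the one below helps; the absolute numbers depend entirely on your machine, so treat them as illustrative only.

```python
import random
import time

def selection_sort(a):
    n = len(a)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]

# Doubling n should roughly quadruple the running time (quadratic growth).
for n in (1_000, 2_000, 4_000, 8_000):
    data = random.sample(range(n), n)
    start = time.perf_counter()
    selection_sort(data)
    print(f"n = {n:>5}: {time.perf_counter() - start:.3f} s")
```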
6. Summary – What you should say in interviews
“Selection Sort has O(n²) time complexity in all three cases — best, average, and worst — because it always performs exactly n(n-1)/2 comparisons in total (scanning the whole unsorted part in every pass to find the minimum), regardless of the initial arrangement of the array. Unlike Bubble Sort (with the early-exit optimization) or Insertion Sort, it does not become faster on already sorted or nearly sorted data — it always does the full amount of work. The number of swaps depends on the input but is at most n-1: 0 in the best case (already sorted, with self-swaps skipped) and close to n-1 for random input. Since swaps are constant-time operations, they do not affect the Big-O complexity. Space complexity is O(1) as it is an in-place sorting algorithm.”
7. Quick Cheat Sheet – Selection Sort Time Complexity
| Question | Answer |
|---|---|
| Best-case time complexity | O(n²) |
| Average-case time complexity | O(n²) |
| Worst-case time complexity | O(n²) |
| Space complexity | O(1) |
| Stable? | No |
| Adaptive? | No (never early exit) |
| When to use | Educational purposes, very small n (< 100–200), when keeping the number of swaps/writes low matters (at most n-1 swaps) |
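The “Stable? No” row often surprises people, so here is a tiny illustration of why: the long-distance swap can jump one equal element over another.

```python
# Two equal keys tagged 'a' and 'b' so we can watch their relative order.
data = [(2, 'a'), (2, 'b'), (1, 'c')]

def selection_sort(a):
    n = len(a)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if a[j][0] < a[min_index][0]:   # compare by key only
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]

selection_sort(data)
print(data)   # [(1, 'c'), (2, 'b'), (2, 'a')] -> the equal 2s changed order
```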
Do you want to continue deeper into any part?
Common follow-up questions:
- Why is Selection Sort O(n²) even in best case (unlike Insertion/Bubble)?
- Selection Sort vs Insertion Sort — which is better in practice?
- How many swaps exactly in best / average / worst case?
- Can we make Selection Sort adaptive? (spoiler: not really)
- Mathematical proof of n(n-1)/2 comparisons
Just tell me which one you want next — I’ll explain it in the same calm, detailed, teacher style with clear examples 😊
