Chapter 45: DSA Selection Sort Time Complexity
DSA Selection Sort Time Complexity – Let’s understand this in depth, calmly and step by step, like I’m sitting next to you explaining it on paper with small examples, counting operations together, and showing clearly why the time complexity is what it is.
1. First – Quick reminder of how Selection Sort works
Selection Sort works like this:
- In every pass, it finds the smallest element in the unsorted part of the array
- Then it swaps that smallest element with the first position of the unsorted part
- After each pass, the next smallest element is placed in its correct position
Example (very small array so we can count easily):
Initial array: 64 25 12 22 11
Pass 1: minimum of the whole array is 11; swap it with position 0 → 11 25 12 22 64
Pass 2: minimum of the remaining part (25, 12, 22, 64) is 12; swap it with position 1 → 11 12 25 22 64
Pass 3: minimum of (25, 22, 64) is 22; swap it with position 2 → 11 12 22 25 64
Pass 4: minimum of (25, 64) is 25, already in the correct place; no swap needed → 11 12 22 25 64
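To make the passes concrete, here is a minimal Python sketch of the algorithm described above. The function name selection_sort is just illustrative; min_index mirrors the variable mentioned later in the space-complexity note.

```python
def selection_sort(arr):
    """Sort a list in place using selection sort and return it."""
    n = len(arr)
    for i in range(n - 1):              # one pass per position 0 .. n-2
        min_index = i                   # assume the first unsorted element is the minimum
        for j in range(i + 1, n):       # scan the rest of the unsorted part
            if arr[j] < arr[min_index]:
                min_index = j
        if min_index != i:              # skip the swap if the minimum is already in place
            arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))   # [11, 12, 22, 25, 64]
```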
2. Three Cases of Time Complexity (very important)
Like almost every sorting algorithm, we talk about:
- Best case
- Average case
- Worst case
Selection Sort is special because all three cases have the same time complexity: it performs exactly the same number of comparisons no matter how the input is ordered.
3. Detailed Time Complexity Breakdown
Number of comparisons – the main work in Selection Sort
In each pass i (0 to n-2):
- We look at the unsorted part (from index i to n-1)
- We do (n – i – 1) comparisons to find the minimum
So total number of comparisons:
Pass 1: n-1 comparisons
Pass 2: n-2 comparisons
Pass 3: n-3 comparisons
…
Pass n-1: 1 comparison
Total comparisons = (n-1) + (n-2) + … + 1 = n(n-1)/2
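As a quick sanity check, this is just the standard arithmetic-series sum:

$$(n-1) + (n-2) + \dots + 2 + 1 = \sum_{k=1}^{n-1} k = \frac{n(n-1)}{2} = \frac{n^2 - n}{2}$$

which grows like n², hence O(n²).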
This is exactly the same in every case:
- Best case (already sorted)
- Average case (random order)
- Worst case (reverse sorted)
Why? Selection Sort always scans the entire remaining unsorted portion to find the minimum — it never stops early, even if the array is already sorted.
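You can verify this empirically with a small sketch that instruments the inner loop with a counter and runs it on sorted, random, and reverse-sorted inputs (the helper name count_comparisons is just for illustration). All three inputs produce exactly n(n-1)/2 comparisons:

```python
import random

def count_comparisons(arr):
    """Run selection sort on a copy of arr and return how many element comparisons it made."""
    a = list(arr)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            comparisons += 1            # one comparison per inner-loop step
            if a[j] < a[min_index]:
                min_index = j
        a[i], a[min_index] = a[min_index], a[i]
    return comparisons

n = 10
print(count_comparisons(list(range(n))))              # 45 (already sorted)
print(count_comparisons(list(range(n, 0, -1))))       # 45 (reverse sorted)
print(count_comparisons(random.sample(range(n), n)))  # 45 (random order)
# 45 = n(n-1)/2 for n = 10, identical in all three cases
```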
Number of swaps – different in different cases
- At most one swap happens per pass, so there are at most n-1 swaps in total, in every case
- Best case (already sorted): 0 swaps, provided the implementation skips the swap when min_index == i (an implementation that always swaps still performs n-1 swaps, but they are harmless self-swaps)
- Average case (random order): still at most one swap per pass, so O(n) swaps
- Worst case: up to n-1 swaps (a reverse-sorted array actually needs only about n/2 real swaps, because each swap puts two elements into place)
But each swap is a constant-time operation and there are at most n-1 of them in total, so swaps never change the overall O(n²).
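A similar sketch, counting only real swaps (skipping the swap when min_index == i), shows how the swap count varies with the input while the comparison count does not; count_swaps is again just an illustrative helper name:

```python
import random

def count_swaps(arr):
    """Run selection sort on a copy of arr and return how many real swaps it performed."""
    a = list(arr)
    n = len(a)
    swaps = 0
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if a[j] < a[min_index]:
                min_index = j
        if min_index != i:              # only count real swaps, not self-swaps
            a[i], a[min_index] = a[min_index], a[i]
            swaps += 1
    return swaps

n = 10
print(count_swaps(list(range(n))))              # 0 (already sorted)
print(count_swaps(list(range(n, 0, -1))))       # 5 (reverse sorted: each swap places two elements)
print(count_swaps(random.sample(range(n), n)))  # somewhere between 0 and n-1
```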
4. Final Time Complexity Verdict
| Case | Comparisons | Swaps | Time Complexity | Important note |
|---|---|---|---|---|
| Best Case | n(n-1)/2 | 0 (with the min_index check) | O(n²) | No early stopping; always full scans |
| Average Case | n(n-1)/2 | at most n-1 | O(n²) | Same number of comparisons |
| Worst Case | n(n-1)/2 | at most n-1 | O(n²) | At most one swap per pass, still O(n²) |
Key sentence to say in interviews:
“Selection Sort has O(n²) time complexity in all cases — best, average, and worst — because it performs exactly n(n-1)/2 comparisons in total: in every pass it scans the entire remaining unsorted part to find the minimum, regardless of whether the array is already sorted. Unlike Bubble Sort or Insertion Sort, it does not become faster on already sorted or nearly sorted data.”
5. Quick Comparison – Selection Sort vs Other Simple Sorts
| Algorithm | Best Case | Average Case | Worst Case | Adaptive? (faster on sorted data?) |
|---|---|---|---|---|
| Bubble Sort (optimized) | O(n) | O(n²) | O(n²) | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | No – never early exit |
This is why people often say:
“Selection Sort is never faster than O(n²), even on sorted arrays — that’s its biggest weakness.”
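To see where that adaptivity comes from, here is a minimal insertion-sort sketch that counts comparisons; its inner loop can stop as soon as it meets an element that is not larger than the key, so on an already sorted array it does only about n-1 comparisons, while Selection Sort's inner loop always runs to the end:

```python
def insertion_sort_comparisons(arr):
    """Insertion sort on a copy of arr, returning the number of element comparisons it made."""
    a = list(arr)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] <= key:             # already in order: the inner loop stops early
                break
            a[j + 1] = a[j]             # shift the larger element one slot to the right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 10
print(insertion_sort_comparisons(list(range(n))))         # 9  (about n-1 on sorted input)
print(insertion_sort_comparisons(list(range(n, 0, -1))))  # 45 (n(n-1)/2 on reverse-sorted input)
```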
6. Space Complexity (bonus)
- O(1) extra space (in-place sorting)
- Only uses a few variables (min_index, temp for swap)
7. Summary – Selection Sort Time Complexity (what to say in interviews)
“Selection Sort performs n(n-1)/2 comparisons in every case — best, average, and worst — because it always scans the entire unsorted portion to find the minimum element in each pass. Therefore, its time complexity is O(n²) in all scenarios. It does not take advantage of already sorted or nearly sorted data, unlike Bubble Sort (optimized) or Insertion Sort, which can reach O(n) in the best case. Space complexity is O(1) as it is in-place.”
Common follow-up questions people ask after this:
- Why doesn’t Selection Sort have O(n) best case like Bubble Sort?
- How many swaps exactly in best / average / worst case?
- Selection Sort vs Insertion Sort – which is better in practice?
- Can we make Selection Sort adaptive? (spoiler: not really)
- Mathematical proof of n(n-1)/2 comparisons
