Chapter 48: DSA Counting Sort Time Complexity
Let's go through this topic carefully and honestly, like I'm sitting next to you with a notebook, writing down every number, counting every operation together, and explaining exactly why the time complexity is what it is.
Counting Sort is very different from comparison-based sorts like Quick Sort, Merge Sort, Bubble Sort, etc. That difference is the key to understanding its time complexity.
1. Quick Reminder – What Counting Sort Actually Does
Counting Sort does not compare elements with each other. Instead it:
- Finds the minimum and maximum value in the array
- Creates a count array (or frequency array) of size (max – min + 1)
- Counts how many times each number appears
- Uses those counts to place every number in its correct sorted position
Because it avoids comparisons completely, it can be extremely fast — but only when the range of values is small.
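The steps above can be sketched directly in Python. This is a minimal illustration, not a production implementation; the function name `counting_sort` is chosen here for clarity:

```python
def counting_sort(arr):
    # Handle the empty-array edge case.
    if not arr:
        return []

    # Step 1: find the minimum and maximum value -- O(n).
    lo, hi = min(arr), max(arr)

    # Step 2: create a count array of size k = max - min + 1 -- O(k).
    k = hi - lo + 1
    count = [0] * k

    # Step 3: count how many times each number appears -- O(n).
    for x in arr:
        count[x - lo] += 1

    # Step 4: turn counts into cumulative positions -- O(k).
    for i in range(1, k):
        count[i] += count[i - 1]

    # Step 5: place each element in its sorted position, scanning the
    # input backward so equal elements keep their order (stability) -- O(n).
    out = [0] * len(arr)
    for x in reversed(arr):
        count[x - lo] -= 1
        out[count[x - lo]] = x
    return out

print(counting_sort([5, 3, 8, 4, 2]))  # [2, 3, 4, 5, 8]
```

Note that indexing by `x - lo` instead of `x` also makes the sketch work for negative numbers, since the count array only needs to cover the actual range of values.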
2. The Three Main Phases and Their Costs
| Phase | What happens | Time Complexity |
|---|---|---|
| Find min and max | Scan the entire array once | O(n) |
| Create count/frequency array | Allocate an array of size k | O(k), where k = max − min + 1 |
| Count frequencies | Loop through all n elements | O(n) |
| Compute cumulative counts (positions) | Loop through the count array once | O(k) |
| Build the output array (place elements) | Loop through the original array once (backward) | O(n) |
So total time = O(n) + O(k) + O(n) + O(k) + O(n) = O(n + k)
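One way to sanity-check this sum is to instrument each loop and count its iterations. The sketch below (the phase names in the `ops` dictionary are mine) tallies the work of every phase and confirms that the total is 3n + 2k − 1 iterations, i.e. O(n + k):

```python
def counting_sort_ops(arr):
    """Counting sort that also tallies the loop iterations per phase."""
    ops = {"scan_min_max": 0, "init_count": 0, "tally": 0, "prefix": 0, "place": 0}

    lo = hi = arr[0]
    for x in arr:                      # O(n): find min and max in one pass
        ops["scan_min_max"] += 1
        if x < lo: lo = x
        if x > hi: hi = x

    k = hi - lo + 1
    count = []
    for _ in range(k):                 # O(k): build the count array
        ops["init_count"] += 1
        count.append(0)

    for x in arr:                      # O(n): count frequencies
        ops["tally"] += 1
        count[x - lo] += 1

    for i in range(1, k):              # O(k): cumulative positions
        ops["prefix"] += 1
        count[i] += count[i - 1]

    out = [0] * len(arr)
    for x in reversed(arr):            # O(n): place elements
        ops["place"] += 1
        count[x - lo] -= 1
        out[count[x - lo]] = x
    return out, ops

out, ops = counting_sort_ops([5, 3, 8, 4, 2])
# n = 5, k = 7, so total iterations = 3n + 2k - 1 = 15 + 13 = 28
print(out, sum(ops.values()))  # [2, 3, 4, 5, 8] 28
```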
3. Very Important – What is k?
k = range of values = (maximum value – minimum value + 1)
Examples:
- Array: [5, 3, 8, 4, 2] → min = 2, max = 8 → k = 8 − 2 + 1 = 7 → Time = O(n + 7) ≈ O(n)
- Array: marks of 100 students (0 to 100) → min = 0, max = 100 → k = 101 → Time ≈ O(n) (very fast)
- Array: 1 million random numbers from 1 to 1,000,000,000 → min ≈ 1, max ≈ 10⁹ → k ≈ 1 billion → Time = O(n + 10⁹): extremely slow, and the count array uses huge memory
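The value of k can be computed directly from the data. A tiny helper (the name `value_range` is mine, not standard terminology) makes the three examples concrete:

```python
def value_range(arr):
    """k = max - min + 1: the size of the count array counting sort needs."""
    return max(arr) - min(arr) + 1

print(value_range([5, 3, 8, 4, 2]))       # 7   -> tiny range, counting sort is ideal
print(value_range(list(range(101))))      # 101 -> still small (student marks 0-100)
print(value_range([1, 1_000_000_000]))    # 1000000000 -> impractical count array
```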
4. Time Complexity in All Cases
| Case | When it happens | Time Complexity | Memory (space) used | In practice |
|---|---|---|---|---|
| Best Case | Small range (k << n) | O(n + k) ≈ O(n) | O(k) | Extremely fast – linear time |
| Average Case | k is reasonable compared to n | O(n + k) | O(k) | Very fast if k is small |
| Worst Case | Very large range (k ≈ 10⁹ or more) | O(n + k) | O(k) | Very slow + huge memory |
Important sentence for interviews:
“Counting Sort has O(n + k) time complexity, where k = max – min + 1 (the range of values). When k is small compared to n (k << n), it becomes linear time O(n) and is one of the fastest sorting algorithms possible. But when k is very large (for example k ≈ 10⁹), it becomes impractical because both time and space become O(k) which can be enormous.”
5. Concrete Numbers – How it really behaves
Example 1 – Student marks (realistic case)
n = 1,000 students, marks between 0 and 100 → k = 101. Time = O(1,000 + 101) ≈ O(1,000): very fast.
Example 2 – Large numbers (bad case)
n = 1,000,000 elements, values between 1 and 1,000,000,000 → k ≈ 10⁹. Time = O(1,000,000 + 1,000,000,000) ≈ O(10⁹): very slow. Space for the count array ≈ 4 GB (if an int is 4 bytes), which may not even fit in memory.
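The ≈ 4 GB figure is simple arithmetic worth verifying. This assumes 4-byte counters, as in a C-style `int` array; a Python list of ints would use considerably more memory per element:

```python
# Back-of-the-envelope memory check for the count array.
k = 1_000_000_000            # range of values 1 .. 10**9
bytes_per_counter = 4        # assumption: one 4-byte int per possible value
bytes_needed = k * bytes_per_counter
print(bytes_needed / 2**30)  # ≈ 3.73 GiB, close to the 4 GB estimate above
```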
6. Comparison Table – Counting Sort vs Others
| Sorting Algorithm | Best Case | Average Case | Worst Case | Space Complexity | Stable? | When it wins |
|---|---|---|---|---|---|---|
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(n + k) | Yes | k is small (k << n) |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | General case |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | Stability needed |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Small or nearly sorted |
Key observation:
Counting Sort is one of the very few common sorting algorithms (Radix Sort, which builds on it, is another) that can achieve O(n) time, and only when the range of values is small.
7. Summary – What to say in interviews
“Counting Sort has O(n + k) time complexity, where n is the number of elements and k is the range of values (max – min + 1). When k is much smaller than n, it becomes effectively O(n) — linear time — which is faster than any comparison-based sort (whose lower bound is O(n log n)). However, when k is very large (for example k ≈ 10⁹), both time and space become impractical. Space complexity is also O(n + k) because we need a count array of size k and an output array of size n. It is stable and very fast for small-range integers (like marks 0–100, ages 0–120, digits 0–9, etc.).”
Common follow-up questions:
- Why is Counting Sort stable while Quick Sort is not?
- How to handle negative numbers in Counting Sort?
- Counting Sort vs Radix Sort – when to use which
- Mathematical proof why it’s O(n + k)
- When interviewers ask “Can we sort in linear time?” – how to answer
