Chapter 6: DSA Counting Sort

DSA Counting Sort – let’s learn this one properly, as if I’m sitting next to you explaining it step by step: real examples, drawings in words, code, and most importantly when and why you should actually use it. Because unlike QuickSort or MergeSort, Counting Sort is very special and has very clear use-cases.

What is Counting Sort?

Counting Sort is a non-comparison-based sorting algorithm.

That sentence is very important.

Most sorting algorithms we have seen so far (Bubble, Insertion, Selection, Quick, Merge, Heap…) compare elements with each other to decide order.

Counting Sort does not compare elements at all.

Instead it:

  1. Counts how many times each value appears
  2. Uses that count to place every element in its correct final position

Because it avoids comparisons, it can be extremely fast — but only under certain conditions.

Core Idea in one simple sentence

If the range of possible values is small, we can count the frequency of each value and then rebuild the sorted array using those counts.

When is Counting Sort very good?

Ideal conditions (must remember these for interviews):

  • The values are integers (or can be mapped to integers)
  • The range of values is small compared to number of elements (example: values between 0–100 but n = 1,000,000)
  • We want stable sorting (same values keep their original relative order)

Classic Example – Step by Step

Array to sort: [4, 2, 4, 1, 0, 4, 2, 3, 1]

Range of values = 0 to 4 (very small!)

Step 1: Find the range

min = 0, max = 4
range size = max - min + 1 = 5

Step 2: Create count array

count = [0] × 5 (indices 0, 1, 2, 3, 4)

Now count frequency:

  • 0 → 1 time
  • 1 → 2 times
  • 2 → 2 times
  • 3 → 1 time
  • 4 → 3 times

So count = [1, 2, 2, 1, 3]
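In code, this counting step is just a single pass over the array. A quick sketch (assuming, as in this example, that the minimum value is 0):

```python
# Step 2: count how often each value appears.
arr = [4, 2, 4, 1, 0, 4, 2, 3, 1]
k = max(arr) - min(arr) + 1   # range size = 5 here

count = [0] * k
for value in arr:
    count[value] += 1         # assumes min(arr) == 0; otherwise index by value - min(arr)

print(count)  # [1, 2, 2, 1, 3]
```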

Step 3: (Important – for stable sort) Convert count to position array

We change count[i] → number of elements ≤ i

count[0] = 1
count[1] = 1 + 2 = 3
count[2] = 3 + 2 = 5
count[3] = 5 + 1 = 6
count[4] = 6 + 3 = 9

Now count = [1, 3, 5, 6, 9]

This array now tells us, for each value i: “count[i] elements are ≤ i, so the last occurrence of i belongs at index count[i] - 1 in the final array.”
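This conversion is just a running (prefix) sum over the count array. A quick sketch:

```python
# Step 3: convert frequencies into cumulative counts.
# After the loop, count[i] = number of elements <= i.
count = [1, 2, 2, 1, 3]
for i in range(1, len(count)):
    count[i] += count[i - 1]

print(count)  # [1, 3, 5, 6, 9]
```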

Step 4: Build the output array (from right to left – for stability)

We create a result array of same size.

We go through original array from right to left:

Original: 4, 2, 4, 1, 0, 4, 2, 3, 1 (we read from the end)

  1. Take 1 → count[1] = 3 → put 1 at index 3-1=2, then decrement count[1] → result: [ , , 1, , , , , , ]
  2. Take 3 → count[3] = 6 → put 3 at index 6-1=5, then decrement count[3] → result: [ , , 1, , , 3, , , ]
  3. Take 2 → count[2] = 5 → put 2 at index 5-1=4, then decrement count[2] → result: [ , , 1, , 2, 3, , , ]
  4. Take 4 → count[4] = 9 → put 4 at index 9-1=8, then decrement count[4] → result: [ , , 1, , 2, 3, , , 4]
  5. Take 0 → count[0] = 1 → put 0 at index 1-1=0, then decrement count[0] → result: [0, , 1, , 2, 3, , , 4]
  6. Take 4 → count[4] = 8 → put 4 at index 8-1=7, then decrement count[4] → result: [0, , 1, , 2, 3, , 4, 4]
  7. Take 2 → count[2] = 4 → put 2 at index 4-1=3, then decrement count[2] → result: [0, , 1, 2, 2, 3, , 4, 4]
  8. Take 1 → count[1] = 2 → put 1 at index 2-1=1, then decrement count[1] → result: [0, 1, 1, 2, 2, 3, , 4, 4]
  9. Take 4 → count[4] = 7 → put 4 at index 7-1=6, then decrement count[4] → result: [0, 1, 1, 2, 2, 3, 4, 4, 4]

Done!

Final sorted array: [0, 1, 1, 2, 2, 3, 4, 4, 4]

Notice: all equal values kept their relative order → stable sort

Counting Sort Code (Most Common Interview Version – Stable)

Python
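Putting the four steps together, a minimal implementation might look like this (it follows exactly the walkthrough above: count, cumulative counts, then a right-to-left pass for stability):

```python
def counting_sort(arr):
    """Stable counting sort for non-negative integers."""
    if not arr:
        return []

    k = max(arr) + 1                 # range 0..max(arr)
    count = [0] * k

    # Step 2: count frequencies
    for value in arr:
        count[value] += 1

    # Step 3: cumulative counts -> count[i] = number of elements <= i
    for i in range(1, k):
        count[i] += count[i - 1]

    # Step 4: build output right-to-left so equal values keep their order
    output = [0] * len(arr)
    for value in reversed(arr):
        count[value] -= 1            # decrement first: count[value]-1 is the target index
        output[count[value]] = value

    return output

print(counting_sort([4, 2, 4, 1, 0, 4, 2, 3, 1]))
# [0, 1, 1, 2, 2, 3, 4, 4, 4]
```

Note it never compares two elements — the counts alone decide every position.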

Time & Space Complexity (Very Important)

Aspect           | Complexity | Comment
-----------------|------------|------------------------------------------
Time             | O(n + k)   | n = number of elements, k = range size
Space            | O(n + k)   | count array + output array
Comparison-based | No         | this is the key point
Stable           | Yes        | when implemented from right to left
In-place         | No         | needs extra space

When to Use Counting Sort (Interview Answer – Must Know)

Use Counting Sort when:

  • Range of values (k) is small compared to n → k << n or k ≈ n (still good)
  • Values are non-negative integers (or can be mapped easily)
  • You need stable sorting
  • You want linear time sorting

Examples of good use-cases:

  • Sorting exam marks (0–100)
  • Sorting ages of people (0–120)
  • Sorting letters in a string (a–z → k=26)
  • Radix Sort’s individual digit sorting
  • Sorting pixels by brightness (0–255)
  • Sorting small-range IDs, priorities, etc.
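As one concrete case from the list above, sorting the letters of a lowercase string (k = 26) can be sketched like this — `counting_sort_letters` is just an illustrative name:

```python
def counting_sort_letters(s):
    """Sort the lowercase letters of a string using a count array of size 26."""
    count = [0] * 26
    for ch in s:
        count[ord(ch) - ord('a')] += 1   # map 'a'..'z' to indices 0..25
    # Rebuild the string from the counts -- no comparisons needed.
    return ''.join(chr(ord('a') + i) * count[i] for i in range(26))

print(counting_sort_letters("banana"))  # "aaabnn"
```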

When NOT to use Counting Sort:

  • Range is very large (example: numbers up to 10⁹)
  • Floating point numbers
  • Very large negative numbers (without shifting)
  • When memory is very limited and k is large
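That said, a small negative range is easy to handle: shift every value by min(arr) so the smallest value maps to index 0, then shift back when writing out. A rough sketch (this simple version skips the stability machinery, which is fine for plain integers):

```python
def counting_sort_with_negatives(arr):
    """Counting sort that handles negatives by shifting values so min(arr) maps to index 0."""
    if not arr:
        return []
    lo = min(arr)
    count = [0] * (max(arr) - lo + 1)
    for value in arr:
        count[value - lo] += 1           # shift so lo -> index 0
    result = []
    for i, c in enumerate(count):
        result.extend([i + lo] * c)      # shift back when writing out
    return result

print(counting_sort_with_negatives([3, -2, 0, -2, 1]))  # [-2, -2, 0, 1, 3]
```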

Quick Comparison Table

Algorithm      | Best / Avg Time | Worst Time | Stable? | Space    | Best Use Case
---------------|-----------------|------------|---------|----------|----------------------
Counting Sort  | O(n + k)        | O(n + k)   | Yes     | O(n + k) | Small range integers
Quick Sort     | O(n log n)      | O(n²)      | No      | O(log n) | General purpose
Merge Sort     | O(n log n)      | O(n log n) | Yes     | O(n)     | When stability needed
Insertion Sort | O(n)            | O(n²)      | Yes     | O(1)     | Nearly sorted

Summary – Counting Sort in one line each

  • Non-comparison based linear time sorting
  • Counts frequency of each value
  • Uses counts to place elements in correct positions
  • Time = O(n + k) where k = range of values
  • Stable when implemented correctly
  • Very fast when range is small
  • Very common in interviews when they ask: “Can you sort in linear time?” or “Sort an array of marks 0–100 efficiently”

Do you want to continue with related topics?

Very natural next topics:

  • Radix Sort (uses Counting Sort as subroutine)
  • How to handle negative numbers in Counting Sort
  • Bucket Sort (similar idea but for floating points)
  • Common interview questions using Counting Sort idea
  • When interviewers ask “Sort in O(n) time”

Just tell me which one you want next — I’ll explain in the same detailed, friendly way! 😊
