Chapter 26: DSA Array Implementation
DSA Array Implementation — let's understand this topic properly, as if I'm sitting next to you with a notebook: explaining slowly and clearly, with real examples, memory visuals, code patterns, common mistakes, interview context, and, most importantly, how arrays are actually implemented under the hood and why that matters.
What does “Array Implementation” really mean?
When people say “array implementation” in a DSA context, they usually mean one (or more) of these three things:
- How arrays are stored in memory (the real low-level implementation)
- How to implement array-like behavior using other structures (linked list style array, dynamic array)
- Common array-based implementations of other data structures (stack, queue, hash table, etc.)
Today I’ll cover all three in detail — starting from the most fundamental one.
1. How Arrays are Actually Implemented in Memory (most important part)
In almost every modern programming language, an array is a contiguous block of memory.
That means:
- All elements are stored one after another with no gaps
- Each element takes the same amount of memory (fixed size)
- You can calculate the memory address of any element instantly using math
Visual example (suppose we have int arr[5]):
```
Index:            0      1      2      3      4
Value:            7     14     23     19      8
Memory address: 1000   1004   1008   1012   1016
```
Key facts:
- Size of each int = 4 bytes (on most 32/64-bit systems)
- Address of arr[i] = base address + i × size_of_element
- So arr[3] = 1000 + 3 × 4 = 1012
This is why:
- Random access is O(1) — just calculate address and go there
- Cache performance is excellent (sequential memory access)
- Insert/delete in middle is O(n) — you have to shift all remaining elements
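You can see this address arithmetic for yourself from Python. Here's a small sketch using the standard `array` module (which stores elements contiguously, C-style); the actual addresses printed will differ on your machine:

```python
import array

arr = array.array('i', [7, 14, 23, 19, 8])    # contiguous C-style int array

base_address, length = arr.buffer_info()       # (address of element 0, item count)
print("base address:", base_address)
print("item size:", arr.itemsize, "bytes")     # typically 4 for type code 'i'

# Address of arr[i] = base_address + i * size_of_element
for i in range(length):
    print(f"arr[{i}] lives at address {base_address + i * arr.itemsize}")
```

This is exactly the base + i × size formula from above, and it's the whole reason indexing is O(1).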
2. Static Array vs Dynamic Array (very important distinction)
Static Array (fixed size)
Example in C/C++:
```c
int arr[100];  // size fixed at compile time
```
- Size decided at compile time
- Cannot grow or shrink
- Memory for a local array is usually allocated on the stack (fast, but stack space is limited)
- Very fast, no overhead
Dynamic Array (resizable) — most common in modern languages
Examples:
- C++ → std::vector
- Java → ArrayList
- Python → list
- JavaScript → Array
- C# → List<T>
How a dynamic array grows internally (very important interview concept)
```
Initial capacity = 4

Index:   0    1    2    3
Value:  10   20   30   40              ← size = 4, capacity = 4

Now we add 50 → capacity is full, so:
  1. create a NEW array with double the size (usually 2×)
  2. copy all the old elements over
  3. add the new element

New array:

Index:   0    1    2    3    4    5    6    7
Value:  10   20   30   40   50         ← size = 5, capacity = 8
```
Amortized time complexity:
- append (push back) → O(1) amortized. A single append is O(n) when it triggers a resize, but resizes are rare: over n appends with doubling, the copies cost at most 1 + 2 + 4 + … + n < 2n element moves in total, so the average work per append is a constant.
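To make the resizing concrete, here is a minimal sketch of a doubling dynamic array (Python's built-in list already does all of this internally; the class name and growth factor here are illustrative, not a real library API):

```python
import ctypes

class DynamicArray:
    def __init__(self):
        self.size = 0                          # number of elements stored
        self.capacity = 4                      # slots currently allocated
        self.data = self._make_array(self.capacity)

    def _make_array(self, capacity):
        # A raw fixed-size block of object slots, like a static C array
        return (capacity * ctypes.py_object)()

    def append(self, value):
        if self.size == self.capacity:         # full -> grow first
            self._resize(2 * self.capacity)    # double the capacity
        self.data[self.size] = value
        self.size += 1

    def _resize(self, new_capacity):
        new_data = self._make_array(new_capacity)
        for i in range(self.size):             # copy all old elements: O(n)
            new_data[i] = self.data[i]
        self.data = new_data
        self.capacity = new_capacity

    def __getitem__(self, i):
        if not 0 <= i < self.size:
            raise IndexError("index out of range")
        return self.data[i]                    # O(1) random access
```

Append five elements and you'll see the fifth one trigger a resize from capacity 4 to capacity 8, exactly like the diagram above.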
3. Common Array-based Implementations of Other Data Structures
Many data structures are implemented using arrays because of fast random access.
a) Stack using Array
```python
class Stack:
    def __init__(self):
        self.arr = []                 # underlying dynamic array

    def push(self, val):
        self.arr.append(val)          # O(1) amortized

    def pop(self):
        if not self.arr:
            return None               # empty stack
        return self.arr.pop()         # O(1) amortized

    def top(self):
        return self.arr[-1] if self.arr else None
```
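A quick usage check for the stack above:

```python
s = Stack()
s.push(10)
s.push(20)
print(s.top())   # 20
print(s.pop())   # 20
print(s.pop())   # 10
print(s.pop())   # None (stack is empty)
```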
b) Queue using Array (Circular Queue – most efficient)
```python
class CircularQueue:
    def __init__(self, k):
        self.queue = [0] * k          # fixed-size underlying array
        self.front = 0                # index of the front element
        self.rear = -1                # index of the last element
        self.size = 0
        self.capacity = k

    def enQueue(self, value):
        if self.size == self.capacity:
            return False                                  # queue full
        self.rear = (self.rear + 1) % self.capacity       # wrap around
        self.queue[self.rear] = value
        self.size += 1
        return True

    def deQueue(self):
        if self.size == 0:
            return False                                  # queue empty
        self.front = (self.front + 1) % self.capacity     # wrap around
        self.size -= 1
        return True
```
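To see the wrap-around in action with the class above:

```python
q = CircularQueue(3)
q.enQueue(1); q.enQueue(2); q.enQueue(3)
print(q.enQueue(4))   # False: queue is full
q.deQueue()           # removes 1, front moves to index 1
print(q.enQueue(4))   # True: rear wraps around to index 0
print(q.queue)        # [4, 2, 3]: the new element reused slot 0
```

The modulo arithmetic is what lets the queue reuse freed slots at the front without ever shifting elements.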
c) Hash Table (separate chaining)
```python
class HashTable:
    def __init__(self):
        # An array of buckets; each bucket is a list of (key, value) pairs
        self.buckets = [[] for _ in range(100)]

    def put(self, key, value):
        index = hash(key) % len(self.buckets)
        bucket = self.buckets[index]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)   # key exists: update in place
                return
        bucket.append((key, value))        # new key: chain it in this bucket
```

Note the update-in-place check: without it, putting the same key twice would leave a stale duplicate in the chain.
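A matching lookup walks the same bucket. Here's a minimal sketch of a `get` method you could add to the class above (the method name and default parameter are my additions for illustration):

```python
# Inside the HashTable class above:
def get(self, key, default=None):
    index = hash(key) % len(self.buckets)   # same bucket put() used
    for k, v in self.buckets[index]:
        if k == key:
            return v                        # key found
    return default                          # key not present
```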
Summary Table – Array Implementation Facts
| Question / Property | Answer / Detail |
|---|---|
| Are array elements contiguous? | Yes — in memory |
| Random access time | O(1) |
| Insert at beginning / middle | O(n) — shift elements |
| Insert at end (dynamic array) | O(1) amortized |
| How dynamic array grows | Usually double capacity when full |
| Amortized complexity of append | O(1) |
| Worst-case single append | O(n) (during resize) |
| Most common array-based structures | Stack, Queue, Hash Table, Heap, Matrix |
Common Interview Questions about Array Implementation
- Explain how dynamic array (vector/ArrayList) grows internally
- What is amortized time complexity? Prove that append is O(1) amortized
- Why is insert at beginning O(n) in array but O(1) in linked list?
- Implement circular queue using array
- Implement stack using array
- When would you prefer array over linked list? (and vice versa)
Quick Summary – Array Implementation in one line each
- Arrays store elements in contiguous memory locations
- Random access = O(1) because of direct address calculation
- Static arrays → fixed size, fast
- Dynamic arrays → resizable, amortized O(1) append
- Insert/delete in middle → O(n) because of shifting
- Most common building block for stack, queue, hash table, heap, matrix
- Understanding memory layout is key to mastering arrays
Would you like to go deeper into any particular aspect of arrays?
Very common follow-up topics:
- Dynamic Array resizing detailed dry run with time analysis
- Circular Queue implementation step-by-step
- Array vs Linked List deep comparison
- Common array-based problems (two pointers, sliding window, prefix sum)
- How Python list / C++ vector actually works internally
Just tell me which direction you want to go next — I’ll explain it in the same detailed, friendly, teacher style with clear examples! 😊
