⏱️ Big O Notation
The first thing interviewers evaluate. Get it wrong and it's an instant red flag.
TL;DR
Big O describes how runtime/space grows with input size.
```
O(1) → O(log n) → O(n) → O(n log n) → O(n²) → O(2ⁿ) → O(n!)
 ↑                                                      ↑
Best                                                  Worst
```
The Visual Guide
```
Runtime
  ↑
  │ ╱ O(n!)
  │╱
  │ ╱ O(2ⁿ)
  │╱
  │  ╱ O(n²)
  │ ╱
  │   ╱ O(n log n)
  │  ╱
  │    ╱───── O(n)
  │   ╱
  │ ─────────  O(log n)
  │ ═════════  O(1)
  └──────────────────→ Input Size (n)
```
Complexity Breakdown
O(1) - Constant Time
Doesn't grow with input size.
```python
def get_first(arr):
    return arr[0]  # O(1) - same time for 10 or 10M elements

def is_even(n):
    return n % 2 == 0  # O(1)
```
Examples: array indexing, hash map lookup, stack push/pop
O(log n) - Logarithmic
Input halved each step. The magic of divide and conquer.
```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1   # eliminate the left half
        else:
            right = mid - 1  # eliminate the right half
    return -1

# n=1,000:     ~10 steps
# n=1,000,000: ~20 steps
```
Examples: Binary search, balanced BST, heap operations
Sorted array? → Think O(log n) with binary search
O(n) - Linear
Visit each element once.
```python
def find_max(arr):
    max_val = arr[0]
    for x in arr:        # n iterations
        if x > max_val:
            max_val = x
    return max_val

# n=1,000:     1,000 ops
# n=1,000,000: 1,000,000 ops
```
Examples: Linear search, single pass, array copy
O(n log n) - Linearithmic
Divide and conquer with linear work at each level.
```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # log n levels of splitting
    right = merge_sort(arr[mid:])
    return merge(left, right)      # O(n) merge work per level

def merge(left, right):
    # Standard O(n) merge of two sorted lists
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

# Total: n × log n
```
Examples: Merge sort, quicksort (average), heap sort
Why it matters: This is the lower bound for comparison-based sorting.
O(n²) - Quadratic
Nested loops over input.
```python
def has_duplicate_pair(arr):
    for i in range(len(arr)):             # n iterations
        for j in range(i + 1, len(arr)):  # up to n iterations each
            if arr[i] == arr[j]:
                return True
    return False

# n=1,000:  ~1,000,000 ops
# n=10,000: ~100,000,000 ops 😱
```
Examples: Bubble sort, selection sort, nested loops
O(n²) on n=10⁵ = 10¹⁰ ops ≈ timeout. Optimize to O(n log n) or O(n).
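One common way to get the duplicate check above down to O(n) is a hash set, trading O(n) extra space for the speedup (a sketch):

```python
def has_duplicate_pair(arr):
    seen = set()
    for x in arr:      # single pass: O(n)
        if x in seen:  # set membership check: O(1) on average
            return True
        seen.add(x)
    return False
```

Same True/False answer as the quadratic version, but n=10⁵ now costs ~10⁵ operations instead of 10¹⁰.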
O(2ⁿ) - Exponential
Work doubles with each element added to the input.
```python
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)  # 2 recursive calls per call

# n=30: ~2.7 million calls
# n=50: ~40 billion calls
```
Examples: Subsets, naive recursion without memoization
Fix: Use dynamic programming to bring to O(n)
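A minimal sketch of that fix: memoize the recursion (here via `functools.lru_cache`) so each distinct `n` is computed only once, giving O(n) time at the cost of O(n) space:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)  # each distinct n computed once: O(n) total
```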
O(n!) - Factorial
Every permutation of input.
```python
def permutations(arr):
    if len(arr) <= 1:
        return [arr]
    result = []
    for i in range(len(arr)):
        rest = arr[:i] + arr[i+1:]  # everything except arr[i]
        for perm in permutations(rest):
            result.append([arr[i]] + perm)
    return result

# n=10: 3,628,800 permutations
# n=15: ~1.3 trillion 😱
```
Examples: Permutations, traveling salesman (brute force)
Quick Reference Table
| Complexity | Name | Examples | n=1000 ops |
|---|---|---|---|
| O(1) | Constant | Array access, hash lookup | 1 |
| O(log n) | Logarithmic | Binary search, BST | 10 |
| O(n) | Linear | Linear scan, single loop | 1,000 |
| O(n log n) | Linearithmic | Merge sort, heap sort | 10,000 |
| O(n²) | Quadratic | Nested loops | 1,000,000 |
| O(n³) | Cubic | Triple nested loops | 1,000,000,000 |
| O(2ⁿ) | Exponential | Subsets, backtracking | 10³⁰⁰ |
| O(n!) | Factorial | Permutations | 10²⁵⁶⁷ |
Space Complexity
Don't forget about memory usage!
```python
# O(1) space - constant
def sum_array(arr):
    total = 0
    for x in arr:
        total += x
    return total

# O(n) space - linear
def double_array(arr):
    return [x * 2 for x in arr]  # new array of size n

# O(n) space - recursion stack
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)  # call stack depth = n
```
Common Space Complexity:
| Space | Examples |
|---|---|
| O(1) | In-place sort, two pointers |
| O(log n) | Balanced recursion depth |
| O(n) | Copy array, HashMap, recursion depth |
| O(n²) | 2D matrix |
How to Analyze Code
Step 1: Count the Loops
```python
# O(n)
for i in range(n):
    ...

# O(n²)
for i in range(n):
    for j in range(n):
        ...

# O(n × m)
for i in range(n):
    for j in range(m):
        ...
```
Step 2: Check for Halving
```python
# O(log n) - n halved each iteration
while n > 0:
    n = n // 2

# O(n log n) - n iterations, O(log n) work each
for i in range(n):
    binary_search(arr, target)  # O(log n)
```
Step 3: Recursion → Draw the Tree
```python
def recurse(n):
    if n <= 0:
        return
    recurse(n - 1)  # one branch  → chain of depth n: O(n) nodes
    recurse(n - 1)  # two branches → binary tree: O(2ⁿ) nodes

# Total = nodes × work per node
```
Step 4: Apply the Rules
- Drop constants: O(2n) → O(n)
- Drop lower-order terms: O(n² + n) → O(n²)
- Sequential steps add: O(n) + O(m) = O(n + m)
- Nested operations multiply: O(n) × O(m) = O(n × m)
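Applying all four rules to one hypothetical snippet (`analyze_me` is an illustrative name, not from the text above):

```python
def analyze_me(a, b):
    total = 0
    for x in a:          # O(n)
        total += x
    for x in a:          # nested loops multiply: O(n) × O(n) = O(n²)
        for y in a:
            total += x * y
    for y in b:          # separate input: O(m)
        total += y
    return total

# Sequential steps add:   O(n) + O(n²) + O(m)
# Drop the lower-order n: O(n² + m)
```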
Interview Cheat Sheet
| If you see... | Think... |
|---|---|
| Sorted array | O(log n) binary search |
| Unsorted array, find something | O(n) or use HashMap for O(1) |
| Two nested loops | Can we reduce from O(n²)? |
| Recursion with overlapping subproblems | DP to avoid O(2ⁿ) |
| "Find all subsets" | O(2ⁿ) is expected |
| "Find all permutations" | O(n!) is expected |
| Heap operations | O(log n) per operation |
| Graph with V vertices, E edges | O(V + E) for BFS/DFS |
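The last row in practice: BFS dequeues each vertex once and scans each edge once, hence O(V + E). A sketch, assuming the graph is a dict-of-lists adjacency list:

```python
from collections import deque

def bfs(graph, start):
    """graph: dict mapping vertex -> list of neighbors (adjacency list)."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()   # each vertex dequeued once: O(V)
        order.append(v)
        for w in graph[v]:    # each edge examined once: O(E)
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order
```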
Constraints Guide
What complexity can you afford based on input constraints?
| n | Max Complexity | Why |
|---|---|---|
| n ≤ 10 | O(n!) | Permutations OK |
| n ≤ 20 | O(2ⁿ) | Subsets OK |
| n ≤ 500 | O(n³) | Triple loops OK |
| n ≤ 5,000 | O(n²) | Nested loops OK |
| n ≤ 10⁶ | O(n log n) | Need efficient algo |
| n ≤ 10⁸ | O(n) | Single pass only |
| n > 10⁸ | O(log n) or O(1) | Binary search, math |
Always check constraints first! If n ≤ 20, brute force O(2ⁿ) might be intended.
Common Optimizations
| From | To | How |
|---|---|---|
| O(n²) | O(n) | HashMap, two pointers |
| O(n²) | O(n log n) | Sort first |
| O(2ⁿ) | O(n) or O(n²) | Dynamic programming |
| O(n) | O(log n) | Binary search on sorted |
| O(n) | O(1) | Math formula, constant space |
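As one instance of the O(n²) → O(n) row: the two-pointer pattern finds a pair summing to a target in a sorted array in a single pass (`pair_sum` is an illustrative name):

```python
def pair_sum(sorted_arr, target):
    """Return True if two elements sum to target. Assumes sorted input."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo < hi:  # each step moves one pointer inward: O(n) total
        s = sorted_arr[lo] + sorted_arr[hi]
        if s == target:
            return True
        if s < target:
            lo += 1   # need a larger sum
        else:
            hi -= 1   # need a smaller sum
    return False
```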
Next: Interview Best Practices → the meta-game of coding interviews.