
⏱️ Big O Notation

The first thing interviewers evaluate. Get this wrong = instant red flag.

TL;DR

Big O describes how runtime/space grows with input size.

O(1) → O(log n) → O(n) → O(n log n) → O(n²) → O(2ⁿ) → O(n!)
 ↑                                                       ↑
Best                                                   Worst

The Visual Guide

Runtime

│ ╱ O(n!)
│ ╱
│ ╱ O(2ⁿ)
│╱
│ ╱ O(n²)
│ ╱
│ ╱ O(n log n)
│ ╱
│ ╱───── O(n)
│╱
│──────── O(log n)
│════════ O(1)
└────────────────────→ Input Size (n)

Complexity Breakdown

O(1) - Constant Time

Doesn't grow with input size.

def get_first(arr):
    return arr[0]  # O(1) - same time for 10 or 10M elements

def is_even(n):
    return n % 2 == 0  # O(1)

Examples: Array access, HashMap lookup, push/pop stack
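The list and dict cases from that line can be sketched directly in Python (variable names are mine; dict lookups and list push/pop are average-case O(1)):

```python
stack = []
stack.append(42)      # push - O(1)
top = stack.pop()     # pop - O(1)

ages = {"alice": 30}  # hash map (dict)
age = ages["alice"]   # lookup - O(1) average
```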


O(log n) - Logarithmic

Input halved each step. The magic of divide and conquer.

def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1   # Eliminate left half
        else:
            right = mid - 1  # Eliminate right half
    return -1

# n=1000: ~10 steps
# n=1,000,000: ~20 steps

Examples: Binary search, balanced BST, heap operations

Interview Pattern

Sorted array? → Think O(log n) with binary search
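In Python you often don't need to hand-roll this: the standard-library `bisect` module implements the same O(log n) pattern on sorted sequences:

```python
import bisect

sorted_nums = [1, 3, 5, 7, 9]
# Index where 7 lives (or would be inserted) - O(log n)
idx = bisect.bisect_left(sorted_nums, 7)
```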


O(n) - Linear

Visit each element once.

def find_max(arr):
    max_val = arr[0]
    for x in arr:  # n iterations
        if x > max_val:
            max_val = x
    return max_val

# n=1000: 1000 ops
# n=1,000,000: 1,000,000 ops

Examples: Linear search, single pass, array copy


O(n log n) - Linearithmic

Divide and conquer with linear work at each level.

def merge_sort(arr):
    if len(arr) <= 1:
        return arr

    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # log n levels
    right = merge_sort(arr[mid:])
    return merge(left, right)      # O(n) work per level

# Total: n × log n
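The `merge` helper that `merge_sort` calls isn't shown above; a minimal sketch (the name and signature are assumptions, not from the original):

```python
def merge(left, right):
    """Interleave two sorted lists into one sorted list - O(n)."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # at most one of these is non-empty
    result.extend(right[j:])
    return result
```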

Examples: Merge sort, quicksort (average), heap sort

Why it matters: This is the lower bound for comparison-based sorting.


O(n²) - Quadratic

Nested loops over input.

def has_duplicate_pair(arr):
    for i in range(len(arr)):              # n
        for j in range(i + 1, len(arr)):   # n
            if arr[i] == arr[j]:
                return True
    return False

# n=1000: 1,000,000 ops
# n=10,000: 100,000,000 ops 😱

Examples: Bubble sort, selection sort, nested loops

Interview Alert

O(n²) on n=10⁵ = 10¹⁰ ops ≈ timeout. Optimize to O(n log n) or O(n).
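For instance, the `has_duplicate_pair` function above drops from O(n²) to O(n) with a set (a sketch; the function name is mine):

```python
def has_duplicate_fast(arr):
    seen = set()
    for x in arr:        # n iterations
        if x in seen:    # O(1) average lookup
            return True
        seen.add(x)
    return False
```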


O(2ⁿ) - Exponential

Work roughly doubles with each additional element of input.

def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n-1) + fibonacci(n-2)  # 2 calls per call

# n=30: ~2.7 million calls
# n=40: ~330 million calls

Examples: Subsets, naive recursion without memoization

Fix: Use dynamic programming to bring it down to O(n)
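A bottom-up sketch of that fix (function name is mine):

```python
def fibonacci_dp(n):
    """O(n) time, O(1) space - iterate instead of branching."""
    if n <= 1:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr
    return curr
```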


O(n!) - Factorial

Every permutation of input.

def permutations(arr):
    if len(arr) <= 1:
        return [arr]
    result = []
    for i in range(len(arr)):
        rest = arr[:i] + arr[i+1:]
        for perm in permutations(rest):
            result.append([arr[i]] + perm)
    return result

# n=10: 3,628,800 permutations
# n=15: 1.3 trillion 😱

Examples: Permutations, traveling salesman (brute force)
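In Python you rarely hand-roll this; the standard library's `itertools.permutations` generates them lazily, though iterating all of them is still O(n!):

```python
from itertools import permutations as iter_perms

perms = list(iter_perms([1, 2, 3]))
# 3! = 6 orderings, each returned as a tuple
```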


Quick Reference Table

| Complexity | Name | Examples | n=1000 ops |
| --- | --- | --- | --- |
| O(1) | Constant | Array access, hash lookup | 1 |
| O(log n) | Logarithmic | Binary search, BST | 10 |
| O(n) | Linear | Linear scan, single loop | 1,000 |
| O(n log n) | Linearithmic | Merge sort, heap sort | 10,000 |
| O(n²) | Quadratic | Nested loops | 1,000,000 |
| O(n³) | Cubic | Triple nested loops | 1,000,000,000 |
| O(2ⁿ) | Exponential | Subsets, backtracking | 10³⁰⁰ |
| O(n!) | Factorial | Permutations | 10²⁵⁶⁷ |

Space Complexity

Don't forget about memory usage!

# O(1) space - constant
def sum_array(arr):
    total = 0
    for x in arr:
        total += x
    return total

# O(n) space - linear
def double_array(arr):
    return [x * 2 for x in arr]  # New array of size n

# O(n) space - recursion stack
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n-1)  # Call stack depth = n

Common Space Complexity:

| Space | Examples |
| --- | --- |
| O(1) | In-place sort, two pointers |
| O(log n) | Balanced recursion depth |
| O(n) | Copy array, HashMap, recursion depth |
| O(n²) | 2D matrix |

How to Analyze Code

Step 1: Count the Loops

# O(n)
for i in range(n):
    ...

# O(n²)
for i in range(n):
    for j in range(n):
        ...

# O(n × m)
for i in range(n):
    for j in range(m):
        ...

Step 2: Check for Halving

# O(log n) - halving
while n > 0:
    n = n // 2

# O(n log n) - n iterations, log n work each
for i in range(n):
    binary_search(arr, target)  # O(log n)

Step 3: Recursion → Draw the Tree

def recurse(n):
    if n <= 0:
        return
    recurse(n - 1)  # One branch  → O(n) depth
    recurse(n - 1)  # Two branches → O(2ⁿ) nodes

# Total = nodes × work per node

Step 4: Apply the Rules

  1. Drop constants: O(2n) → O(n)
  2. Drop lower terms: O(n² + n) → O(n²)
  3. Parallel branches: Add O(n) + O(m)
  4. Nested operations: Multiply O(n) × O(m)
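Rules 1 and 2 applied to a concrete function (a hypothetical example of mine):

```python
def summary(arr):
    n = len(arr)
    total = 0
    for x in arr:               # O(n) pass
        total += x
    equal_pairs = 0
    for i in range(n):          # O(n²) nested pass
        for j in range(n):
            if arr[i] == arr[j]:
                equal_pairs += 1
    # O(n + n²) → drop the lower-order term → O(n²)
    return total, equal_pairs
```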

Interview Cheat Sheet

| If you see... | Think... |
| --- | --- |
| Sorted array | O(log n) binary search |
| Unsorted array, find something | O(n) or use HashMap for O(1) |
| Two nested loops | Can we reduce from O(n²)? |
| Recursion with overlapping subproblems | DP to avoid O(2ⁿ) |
| "Find all subsets" | O(2ⁿ) is expected |
| "Find all permutations" | O(n!) is expected |
| Heap operations | O(log n) per operation |
| Graph with V vertices, E edges | O(V + E) for BFS/DFS |
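The "use HashMap for O(1)" row in action: the classic two-sum, a single O(n) pass instead of O(n²) nested loops (a sketch; names are mine):

```python
def two_sum(nums, target):
    """Return indices of two numbers summing to target, or None."""
    seen = {}                        # value -> index
    for i, x in enumerate(nums):
        if target - x in seen:       # O(1) average lookup
            return seen[target - x], i
        seen[x] = i
    return None
```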

Constraints Guide

What complexity can you afford based on input constraints?

| n | Max Complexity | Why |
| --- | --- | --- |
| n ≤ 10 | O(n!) | Permutations OK |
| n ≤ 20 | O(2ⁿ) | Subsets OK |
| n ≤ 500 | O(n³) | Triple loops OK |
| n ≤ 5,000 | O(n²) | Nested loops OK |
| n ≤ 10⁶ | O(n log n) | Need efficient algo |
| n ≤ 10⁸ | O(n) | Single pass only |
| n > 10⁸ | O(log n) or O(1) | Binary search, math |

Interview Tip

Always check constraints first! If n ≤ 20, brute force O(2ⁿ) might be intended.

Common Optimizations

| From | To | How |
| --- | --- | --- |
| O(n²) | O(n) | HashMap, two pointers |
| O(n²) | O(n log n) | Sort first |
| O(2ⁿ) | O(n) or O(n²) | Dynamic programming |
| O(n) | O(log n) | Binary search on sorted |
| O(n) | O(1) | Math formula, constant space |


Next: Interview Best Practices → - The meta-game of coding interviews.