Algorithms are step-by-step procedures for solving problems. Mastering algorithms helps you write efficient code, ace technical interviews, and think systematically about complex problems.
## Why Learn Algorithms?
Understanding algorithms allows you to:
- **Write efficient code**: Choose the best approach for your problem
- **Optimize performance**: Reduce time and memory usage
- **Solve complex problems**: Break down challenges into manageable steps
- **Succeed in interviews**: Algorithm questions are a core part of technical interviews
## Big O Notation
Big O notation describes how an algorithm's runtime or space requirements grow as input size increases. It helps us compare algorithms and predict performance at scale.
### Common Complexities (Fastest to Slowest)
| Notation | Name | Example |
|---|---|---|
| O(1) | Constant | Array access by index |
| O(log n) | Logarithmic | Binary search |
| O(n) | Linear | Simple loop through array |
| O(n log n) | Linearithmic | Merge sort, Quick sort |
| O(n²) | Quadratic | Nested loops, Bubble sort |
| O(2ⁿ) | Exponential | Recursive Fibonacci |
| O(n!) | Factorial | Generating permutations |
```python
# O(1) - Constant time
def get_first(arr):
    return arr[0]  # Always one operation

# O(n) - Linear time
def find_max(arr):
    max_val = arr[0]
    for num in arr:  # Loops through n elements
        if num > max_val:
            max_val = num
    return max_val

# O(n²) - Quadratic time
def find_pairs(arr):
    pairs = []
    for i in range(len(arr)):      # n iterations
        for j in range(len(arr)):  # n iterations each
            pairs.append((arr[i], arr[j]))
    return pairs
```

## Sorting Algorithms
Sorting is one of the most fundamental operations in computer science. Different algorithms have different trade-offs in terms of speed, memory usage, and stability.
### Bubble Sort - O(n²)
Repeatedly swaps adjacent elements if they're in the wrong order. Simple but inefficient for large datasets.
```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        # Flag to exit early if no swaps occur
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr
```

### Merge Sort - O(n log n)
A divide-and-conquer algorithm that splits the array, sorts each half, then merges them. Consistently efficient but requires extra space.
```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```

### Quick Sort - O(n log n) average
Picks a pivot element and partitions the array around it. Very fast in practice but can degrade to O(n²) with poor pivot choices.
```python
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)
```

### Sorting Algorithm Comparison
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
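Heap Sort appears in the comparison table but is not implemented above. Here is a minimal sketch using Python's built-in `heapq` module; note the textbook in-place version achieves the O(1) space listed in the table, while this version copies the input for brevity:

```python
import heapq

def heap_sort(arr):
    """Heap sort via heapq: O(n log n) time."""
    heap = list(arr)        # Copy so the input is not mutated
    heapq.heapify(heap)     # Build a min-heap in O(n)
    return [heapq.heappop(heap) for _ in range(len(heap))]  # n pops, O(log n) each
```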
## Searching Algorithms
### Linear Search - O(n)
Checks each element one by one. Works on any array but is slow for large datasets.
```python
def linear_search(arr, target):
    for i, element in enumerate(arr):
        if element == target:
            return i
    return -1
```

### Binary Search - O(log n)
Repeatedly divides the search space in half. Requires a sorted array but is extremely fast.
```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

# Recursive version
def binary_search_recursive(arr, target, left, right):
    if left > right:
        return -1
    mid = (left + right) // 2
    if arr[mid] == target:
        return mid
    elif arr[mid] < target:
        return binary_search_recursive(arr, target, mid + 1, right)
    else:
        return binary_search_recursive(arr, target, left, mid - 1)
```

## Recursion
Recursion is when a function calls itself to solve smaller instances of the same problem. Every recursive solution needs a base case to stop the recursion.
### Anatomy of Recursion
```python
def factorial(n):
    # Base case: stops recursion
    if n <= 1:
        return 1
    # Recursive case: calls itself with smaller input
    return n * factorial(n - 1)

# factorial(5) = 5 * factorial(4)
#              = 5 * 4 * factorial(3)
#              = 5 * 4 * 3 * factorial(2)
#              = 5 * 4 * 3 * 2 * factorial(1)
#              = 5 * 4 * 3 * 2 * 1
#              = 120
```

### Fibonacci - Classic Example
```python
# Naive recursive - O(2^n) - Very slow!
def fib_naive(n):
    if n <= 1:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

# Memoized - O(n) - Much faster
def fib_memo(n, memo={}):
    # Note: the mutable default dict persists across calls, so it acts
    # as a shared cache (a classic Python gotcha when unintended)
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

# Iterative - O(n) time, O(1) space - Best
def fib_iterative(n):
    if n <= 1:
        return n
    a, b = 0, 1
    for _ in range(2, n + 1):
        a, b = b, a + b
    return b
```

## Dynamic Programming
Dynamic Programming (DP) solves complex problems by breaking them into overlapping subproblems, solving each once, and storing the results. It's an optimization over plain recursion.
### Two Approaches
- **Top-down (Memoization)**: Start with the main problem, recurse down, cache results
- **Bottom-up (Tabulation)**: Start with the smallest subproblems, build up to the solution
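As an aside, Python's standard library can handle the top-down caching for you: decorating a plain recursive function with `functools.lru_cache` memoizes its results automatically, turning an exponential recursion into O(n):

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # Cache every result; equivalent to a memo dict
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

# fib(100) returns immediately thanks to the cache
```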
### Example: Climbing Stairs
You can climb 1 or 2 steps at a time. How many distinct ways can you climb n steps?
```python
# Top-down with memoization
def climb_stairs_memo(n, memo={}):
    if n in memo:
        return memo[n]
    if n <= 2:
        return n
    memo[n] = climb_stairs_memo(n - 1, memo) + climb_stairs_memo(n - 2, memo)
    return memo[n]

# Bottom-up with tabulation
def climb_stairs_tab(n):
    if n <= 2:
        return n
    dp = [0] * (n + 1)
    dp[1] = 1
    dp[2] = 2
    for i in range(3, n + 1):
        dp[i] = dp[i - 1] + dp[i - 2]
    return dp[n]

# Space-optimized bottom-up
def climb_stairs_optimized(n):
    if n <= 2:
        return n
    a, b = 1, 2
    for _ in range(3, n + 1):
        a, b = b, a + b
    return b
```

### Example: Longest Common Subsequence
```python
def lcs(text1, text2):
    m, n = len(text1), len(text2)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if text1[i - 1] == text2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

# lcs("abcde", "ace") = 3  # "ace"
```

## Greedy Algorithms
Greedy algorithms make the locally optimal choice at each step, hoping to find a global optimum. They don't always give the best solution, but when they do, they're often simple and efficient.
### Example: Coin Change (Greedy)
```python
def coin_change_greedy(coins, amount):
    """
    Works for canonical coin systems (e.g. 1, 5, 10, 25).
    May not give the optimal answer for arbitrary denominations.
    """
    count = 0
    for coin in sorted(coins, reverse=True):  # Try largest coins first
        while amount >= coin:
            amount -= coin
            count += 1
    return count if amount == 0 else -1

# For coins = [25, 10, 5, 1] and amount = 41
# Result: 25 + 10 + 5 + 1 = 4 coins
```

### Example: Activity Selection
```python
def activity_selection(activities):
    """
    Select the maximum number of non-overlapping activities.
    activities: list of (start, end) tuples
    """
    if not activities:
        return []
    # Sort by end time
    sorted_activities = sorted(activities, key=lambda x: x[1])
    selected = [sorted_activities[0]]
    last_end = sorted_activities[0][1]
    for start, end in sorted_activities[1:]:
        if start >= last_end:
            selected.append((start, end))
            last_end = end
    return selected

# activities = [(1, 4), (3, 5), (0, 6), (5, 7), (8, 9), (5, 9)]
# Result: [(1, 4), (5, 7), (8, 9)] - 3 activities
```

## Common Problem-Solving Patterns
### Two Pointers
Use two pointers moving through an array to solve problems efficiently.
```python
def two_sum_sorted(arr, target):
    """Find two numbers that add up to target in a sorted array."""
    left, right = 0, len(arr) - 1
    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return [left, right]
        elif current_sum < target:
            left += 1
        else:
            right -= 1
    return []
```

### Sliding Window
Maintain a window of elements as you iterate through an array.
```python
def max_sum_subarray(arr, k):
    """Find the maximum sum of a subarray with length k."""
    if len(arr) < k:
        return None
    # Calculate sum of first window
    window_sum = sum(arr[:k])
    max_sum = window_sum
    # Slide the window: add the entering element, drop the leaving one
    for i in range(k, len(arr)):
        window_sum += arr[i] - arr[i - k]
        max_sum = max(max_sum, window_sum)
    return max_sum
```

### Fast and Slow Pointers
Useful for cycle detection and finding middle elements.
```python
def has_cycle(head):
    """Detect if a linked list has a cycle."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow == fast:
            return True
    return False

def find_middle(head):
    """Find the middle node of a linked list."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    return slow
```

### Pattern Summary
| Pattern | When to Use | Example Problems |
|---|---|---|
| Two Pointers | Sorted arrays, finding pairs | Two Sum, Container With Most Water |
| Sliding Window | Contiguous subarrays/substrings | Max Sum Subarray, Longest Substring |
| Fast/Slow | Linked lists, cycle detection | Cycle Detection, Find Middle |
| BFS/DFS | Trees, graphs, mazes | Level Order, Path Finding |
| Binary Search | Sorted data, search space | Search, Find Minimum |
| Dynamic Programming | Overlapping subproblems | Fibonacci, Knapsack, LCS |
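BFS/DFS appears in the pattern table but has no example above. Here is a minimal BFS sketch for finding a shortest path (by edge count) in an unweighted graph; the adjacency-list dict representation is an assumption for illustration:

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """Shortest path (fewest edges) in an unweighted graph.
    graph: dict mapping node -> list of neighbors."""
    queue = deque([[start]])  # Queue of paths, each ending at a frontier node
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # No path exists
```

Because BFS explores nodes level by level, the first path that reaches the goal is guaranteed to be a shortest one.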
## Next Steps
- Data Structures - Review the data structures these algorithms operate on
- Interview Prep - Practice algorithm problems with strategies