Algorithms — Glossary
Key definitions for core algorithms topics, with short notes and small code sketches where they help.
Searching Algorithms
Definition: Methods to locate an item in a collection. Two common examples:
Linear Search — Check each element one by one; works on unsorted collections. Time complexity: O(n).
Binary Search — For sorted arrays: compare target with middle element and eliminate half each step. Time complexity: O(log n).
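A minimal sketch of both searches in Python (function names are illustrative; binary search assumes the input sequence is already sorted):

```python
from typing import Optional, Sequence

def linear_search(items: Sequence[int], target: int) -> Optional[int]:
    """Return the index of target, or None if absent. O(n)."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return None

def binary_search(sorted_items: Sequence[int], target: int) -> Optional[int]:
    """Return the index of target in a sorted sequence, or None. O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return None
```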
Sorting Algorithms
Definition: Algorithms to rearrange elements into a specified order (e.g., ascending).
There are many: QuickSort, MergeSort, HeapSort, InsertionSort, CountingSort, etc. They differ in stability, best/worst/average time complexity, and space usage.
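As one illustration, a minimal MergeSort sketch (stable, O(n log n) time, O(n) extra space):

```python
def merge_sort(items: list) -> list:
    """Return a new sorted list. Stable; O(n log n) time, O(n) extra space."""
    if len(items) <= 1:
        return items[:]
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    # Merge the two sorted halves, preferring the left side on ties (stability).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```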
Hashing
Definition: Technique that maps data keys to indices in a table using a hash function to allow (average) O(1) insert/search/delete.
Common uses: sets, dictionaries, caches. Needs collision handling (chaining or open addressing).
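A minimal sketch of a hash table with separate chaining (fixed bucket count, no resizing, for brevity); in practice Python's built-in dict and set already give average O(1) hash-based operations:

```python
class ChainedHashTable:
    """Hash table using separate chaining; average O(1) insert/search/delete."""

    def __init__(self, num_buckets: int = 64):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # Hash function maps the key to one of the buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                  # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))       # collision or new key: append to chain

    def search(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                del bucket[i]
                return True
        return False
```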
Asymptotic Notations
Definition: Mathematical tools to describe algorithm growth as input size → ∞.
- Big-O (O) — upper bound on growth; typically quoted for the worst case. Example: O(n²).
- Omega (Ω) — lower bound on growth; typically quoted for the best case. Example: Ω(n).
- Theta (Θ) — tight bound (both upper and lower). Example: Θ(n log n).
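The standard formal definitions, stated in LaTeX:

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\; f(n) \le c\, g(n)
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\; f(n) \ge c\, g(n)
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```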
Algorithm Design Techniques
Definition: Strategies used to create algorithms. Important ones include:
- Greedy — make locally optimal choices hoping for global optimum (e.g., Fractional Knapsack).
- Divide & Conquer — split problem, solve subproblems, combine results (e.g., MergeSort, QuickSort).
- Dynamic Programming (DP) — like divide & conquer, but reuses solutions of overlapping subproblems via memoization/tabulation (e.g., 0-1 Knapsack).
- Backtracking — explore all possibilities with pruning (e.g., N-Queens; see the sketch after this list).
- Branch & Bound — search with bounds to prune suboptimal branches (e.g., TSP variants).
- Reduction/Transform & Conquer — convert to a known problem.
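A minimal backtracking sketch for N-Queens (counts the solutions; a placement is pruned as soon as it conflicts with an occupied column or diagonal):

```python
def n_queens(n: int) -> int:
    """Count the solutions to the N-Queens problem by backtracking."""
    cols, diag1, diag2 = set(), set(), set()

    def place(row: int) -> int:
        if row == n:
            return 1            # all rows filled: one complete solution
        count = 0
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue        # prune: conflicts with a previously placed queen
            cols.add(col)
            diag1.add(row - col)
            diag2.add(row + col)
            count += place(row + 1)
            cols.remove(col)    # undo the choice before trying the next column
            diag1.remove(row - col)
            diag2.remove(row + col)
        return count

    return place(0)

# n_queens(8) == 92
```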
Greedy Algorithms
Definition: Algorithms that take the best immediate (local) choice at each step. Works when local choices lead to global optimum.
Examples: Dijkstra's algorithm (greedy choice of the closest unvisited vertex via a priority queue; requires non-negative edge weights), Kruskal's and Prim's (MST), Huffman coding, Fractional Knapsack.
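A minimal Fractional Knapsack sketch (the greedy choice is the best value-to-weight ratio; the example call is illustrative):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight). Greedily take the best value/weight first."""
    total = 0.0
    # Sort by value-to-weight ratio, best first (the greedy choice).
    for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)      # take the whole item, or a fraction of it
        total += value * (take / weight)
        capacity -= take
    return total

# fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50) == 240.0
```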
Dynamic Programming (DP) vs Divide & Conquer
Core idea: Both break problems into subproblems. DP is applicable when subproblems overlap and optimal substructure exists — then caching (memoization or tabulation) is used to avoid repeated work.
Memoization — store results of recursive calls (top-down).
Tabulation — fill a table iteratively (bottom-up).
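A minimal Fibonacci sketch showing both approaches side by side:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    """Top-down: recursive definition, results cached (memoization)."""
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n: int) -> int:
    """Bottom-up: fill a table from the smallest subproblem upward (tabulation)."""
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

# fib_memo(10) == fib_tab(10) == 55
```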
Graph Search Algorithms
Definition: Methods to traverse or find paths in graphs. Important algorithms:
- DFS (Depth-First Search) — explore as far as possible, then backtrack. Useful for cycle detection, topological sort.
- BFS (Breadth-First Search) — explore level by level; finds shortest paths in unweighted graphs (a sketch follows this list).
- Dijkstra’s Algorithm — shortest path for nonnegative weights; uses a priority queue.
- Greedy Best-First — uses only a heuristic to choose the next node (fast, not guaranteed optimal).
- A* — combines current cost + heuristic; optimal if the heuristic is admissible.
- IDDFS — iterative deepening DFS: repeated depth-limited DFS with an increasing limit, combining BFS's completeness with DFS's low memory use.
- Bidirectional Search — search forward from start and backward from goal until the frontiers meet.
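A minimal BFS sketch on an adjacency-list graph, returning shortest distances in edge count from a source; DFS has the same structure with a stack (or recursion) in place of the queue:

```python
from collections import deque

def bfs_distances(graph: dict, source) -> dict:
    """graph: {node: [neighbours]}. Return shortest edge-count distances from source."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in graph.get(node, []):
            if nbr not in dist:          # first visit is the shortest distance
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return dist

# bfs_distances({'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}, 'A')
# -> {'A': 0, 'B': 1, 'C': 1, 'D': 2}
```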
Minimum Spanning Tree (MST)
Definition: A spanning tree of a connected undirected weighted graph with minimum possible total edge weight. A spanning tree connects all vertices with exactly V−1 edges and no cycles.
Common algorithms: Kruskal’s (sort edges, add smallest if it doesn’t form a cycle — use Disjoint Set Union), Prim’s (grow tree by adding smallest edge from tree to outside vertex), Borůvka’s (iteratively add cheapest edge from each component).
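A minimal Kruskal sketch with a small Disjoint Set Union (path compression only, for brevity):

```python
def kruskal(num_vertices: int, edges):
    """edges: list of (weight, u, v) with 0 <= u, v < num_vertices.
    Return (total_weight, chosen_edges) of a minimum spanning tree."""
    parent = list(range(num_vertices))

    def find(x):
        # Find the component root, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for weight, u, v in sorted(edges):   # smallest edge first
        ru, rv = find(u), find(v)
        if ru != rv:                     # different components: adding it makes no cycle
            parent[ru] = rv
            total += weight
            chosen.append((u, v, weight))
    return total, chosen

# kruskal(4, [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]) -> (7, [(0, 1, 1), (1, 2, 2), (2, 3, 4)])
```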
Shortest Path Algorithms
Definition: Algorithms to find minimum-cost path between nodes in graphs. Variants depend on graph properties (weights, negative edges, directed/undirected).
- DFS/BFS — DFS finds a path but not necessarily the shortest one (adequate only for trees, where paths are unique); BFS finds shortest paths in unweighted graphs.
- Dijkstra — single-source shortest path for nonnegative weights (priority queue; a sketch follows this list).
- Bellman–Ford — handles negative-weight edges, detects negative cycles (O(VE)).
- Floyd–Warshall — all-pairs shortest paths (O(V³)).
- A* — guided search using an admissible heuristic for faster optimal search.
- Johnson’s Algorithm — all-pairs shortest paths for sparse graphs using reweighting + Dijkstra.
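A minimal Dijkstra sketch with a binary-heap priority queue (non-negative weights assumed; graph given as an adjacency list of weighted edges):

```python
import heapq

def dijkstra(graph: dict, source) -> dict:
    """graph: {node: [(neighbour, weight), ...]} with non-negative weights.
    Return shortest distances from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float('inf')):
            continue                     # stale heap entry: a shorter path was already found
        for nbr, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(nbr, float('inf')):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

# dijkstra({'A': [('B', 1), ('C', 4)], 'B': [('C', 2)], 'C': []}, 'A')
# -> {'A': 0, 'B': 1, 'C': 3}
```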