Algorithms — Glossary



Key definitions and short diagrams for core algorithms topics, sourced from the uploaded PDF.


Searching Algorithms

Definition: Methods to locate an item in a collection. Two common examples:

Linear Search — Check each element one-by-one in an unsorted collection. Time complexity: O(n).

Binary Search — For sorted arrays: compare target with middle element and eliminate half each step. Time complexity: O(log n).
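The two searches above can be sketched in Python; this is a minimal illustration, not taken from the source document:

```python
def linear_search(items, target):
    """Check each element one-by-one. O(n)."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """Halve the search range each step. O(log n); requires sorted input."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([2, 5, 8, 12, 16, 23], 12))  # → 3
```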


Sorting Algorithms

Definition: Algorithms to rearrange elements into a specified order (e.g., ascending).

There are many: QuickSort, MergeSort, HeapSort, InsertionSort, CountingSort, etc. Algorithms differ by stability, best/worst/average time complexity, and space usage.


Diagram: MergeSort (Divide & Conquer) — array → split into left half and right half → sort each half → merge.
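As a concrete instance of the divide-and-conquer pattern, here is a minimal MergeSort sketch in Python (illustrative, not from the source):

```python
def merge_sort(arr):
    """Split, sort halves recursively, merge. O(n log n), stable."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps equal elements in order (stability)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])          # append whatever remains of either half
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3]))  # → [1, 2, 3, 4, 5, 7]
```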


Hashing

Definition: Technique that maps data keys to indices in a table using a hash function to allow (average) O(1) insert/search/delete.

Common uses: sets, dictionaries, caches. Needs collision handling (chaining or open addressing).
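A toy hash table using separate chaining — one of the two collision strategies named above — can be sketched as follows (the class name and bucket count are illustrative choices, not from the source):

```python
class ChainedHashTable:
    """Hash table with separate chaining; average O(1) insert/search."""
    def __init__(self, buckets=8):
        self.buckets = [[] for _ in range(buckets)]

    def _bucket(self, key):
        # The hash function maps a key to one of the buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # collision handled by chaining

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default

t = ChainedHashTable()
t.put("apple", 1)
print(t.get("apple"))  # → 1
```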

Diagram: hash table mapping — key → hash function → table (buckets).


Asymptotic Notations

Definition: Mathematical tools to describe algorithm growth as input size → ∞.

  • Big-O (O) — upper bound (worst-case). Example: O(n²).
  • Omega (Ω) — lower bound (best-case). Example: Ω(n).
  • Theta (Θ) — tight bound (both upper & lower). Example: Θ(n log n).
Diagram: basic growth sketch comparing O(n) and O(n log n).
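The three bounds above have standard formal definitions, which can be stated as:

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 :\ 0 \le f(n) \le c\, g(n) \ \text{for all } n \ge n_0

f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 \ge 0 :\ f(n) \ge c\, g(n) \ \text{for all } n \ge n_0

f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))
```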


Algorithm Design Techniques

Definition: Strategies used to create algorithms. Important ones include:

  • Greedy — make locally optimal choices hoping for global optimum (e.g., Fractional Knapsack).
  • Divide & Conquer — split problem, solve subproblems, combine results (e.g., MergeSort, QuickSort).
  • Dynamic Programming (DP) — like divide & conquer but reuse solutions of overlapping subproblems via memoization/tabulation (e.g., 0-1 Knapsack).
  • Backtracking — explore all possibilities with pruning (e.g., N-Queens).
  • Branch & Bound — search with bounds to prune suboptimal branches (e.g., TSP variants).
  • Reduction/Transform & Conquer — convert to a known problem.
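The backtracking technique can be illustrated with the N-Queens example mentioned above; this is a minimal Python sketch (counting solutions, not from the source document):

```python
def n_queens(n):
    """Count placements of n non-attacking queens via backtracking."""
    count = 0
    cols, diag_up, diag_down = set(), set(), set()

    def place(row):
        nonlocal count
        if row == n:                     # all rows filled: one valid solution
            count += 1
            return
        for col in range(n):
            if col in cols or row + col in diag_up or row - col in diag_down:
                continue                 # square is attacked: prune this branch
            cols.add(col); diag_up.add(row + col); diag_down.add(row - col)
            place(row + 1)               # explore deeper
            cols.discard(col); diag_up.discard(row + col); diag_down.discard(row - col)

    place(0)
    return count

print(n_queens(8))  # → 92
```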


Greedy Algorithms

Definition: Algorithms that take the best immediate (local) choice at each step. They work when locally optimal choices lead to a globally optimal solution.

Examples: Dijkstra's shortest-path algorithm (greedy selection of the closest unvisited vertex via a priority queue; requires non-negative edge weights), Kruskal's and Prim's (MST), Huffman coding, Fractional Knapsack.

Greedy loop: candidate pool → pick best candidate → update pool → repeat.
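The Fractional Knapsack example above follows exactly this loop — sort candidates by value density and take the best remaining one; a minimal Python sketch (illustrative, not from the source):

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Greedy by value/weight ratio."""
    total = 0.0
    # Greedy choice: consider items in order of decreasing value density.
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)     # take the whole item, or the fitting fraction
        total += value * take / weight
        capacity -= take
    return total

# Classic instance: capacity 50, items (value, weight).
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # → 240.0
```

Because fractions are allowed, the greedy choice is provably optimal here; in the 0-1 variant it is not, which is why that problem needs dynamic programming instead.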


Dynamic Programming (DP) vs Divide & Conquer

Core idea: Both break problems into subproblems. DP is applicable when subproblems overlap and optimal substructure exists — then caching (memoization or tabulation) is used to avoid repeated work.

Memoization — store results of recursive calls (top-down).

Tabulation — fill a table iteratively (bottom-up).

Diagram: DP example — a memoization/tabulation table filled in order (e.g., Fibonacci values).
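Both styles can be shown side by side on Fibonacci; a minimal Python sketch (illustrative, not from the source):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-down: plain recursion, but results are cached (memoization)."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """Bottom-up: fill a table from the base cases upward (tabulation)."""
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(10), fib_tab(10))  # → 55 55
```

Without the cache, the recursive version recomputes the same subproblems exponentially many times; both versions here run in O(n).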


Minimum Spanning Tree (MST)

Definition: A spanning tree of a connected undirected weighted graph with minimum possible total edge weight. A spanning tree connects all vertices with exactly V−1 edges and no cycles.

Common algorithms: Kruskal’s (sort edges, add smallest if it doesn’t form a cycle — use Disjoint Set Union), Prim’s (grow tree by adding smallest edge from tree to outside vertex), Borůvka’s (iteratively add cheapest edge from each component).

Kruskal idea: sort edges by weight → repeatedly pick the smallest edge that doesn't create a cycle.
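Kruskal's algorithm with a Disjoint Set Union can be sketched in a few lines; this is a minimal illustration (the edge format and example graph are assumptions, not from the source):

```python
def kruskal(n, edges):
    """edges: (weight, u, v) tuples over vertices 0..n-1.
    Returns the total weight of a minimum spanning tree."""
    parent = list(range(n))

    def find(x):                         # DSU find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):        # smallest edges first
        root_u, root_v = find(u), find(v)
        if root_u != root_v:             # different components: no cycle
            parent[root_u] = root_v      # union the components
            total += w
    return total

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(kruskal(4, edges))  # → 6
```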


Shortest Path Algorithms

Definition: Algorithms to find minimum-cost path between nodes in graphs. Variants depend on graph properties (weights, negative edges, directed/undirected).

  • DFS/BFS — BFS finds shortest paths in unweighted graphs; DFS does not guarantee shortest paths except in trees, where paths are unique.
  • Dijkstra — single-source shortest path for nonnegative weights (priority queue).
  • Bellman–Ford — handles negative-weight edges, detects negative cycles (O(VE)).
  • Floyd–Warshall — all-pairs shortest paths (O(V³)).
  • A* — guided search using admissible heuristic for faster optimal search.
  • Johnson’s Algorithm — all-pairs shortest paths for sparse graphs using reweighting + Dijkstra.
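Dijkstra's algorithm, the workhorse of this list, can be sketched with a binary heap; a minimal Python illustration (the adjacency-list format and example graph are assumptions, not from the source):

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]} with non-negative weights.
    Returns a dict of shortest distances from source."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale entry: a shorter path was found
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w          # relax the edge
                heapq.heappush(heap, (d + w, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 6)], "c": [("d", 3)]}
print(dijkstra(g, "a"))  # → {'a': 0, 'b': 1, 'c': 3, 'd': 6}
```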


Generated glossary based on the uploaded PDF. For full details, examples, and extended sections (e.g., code and proofs), see the original document.

