(concept) data-structure-and-algorithm

Purpose

To solve problems efficiently, we aim to optimize both time and space complexity, ideally down to the best bounds the problem allows.

Concept

Most programming logic can be decomposed into four actions: create, read, update, delete (CRUD). The average-case time complexity of each CRUD operation on an element, per data structure, is as follows (rows ordered roughly by popularity):

| Data Structure | Create | Read | Update | Delete |
| --- | --- | --- | --- | --- |
| Arrays | O(1) - O(n) | O(1) | O(1) | O(1) - O(n) |
| Linked Lists | O(1) | O(n) | O(n) | O(1) |
| Hash Tables | O(1) | O(1) | O(1) | O(1) |
| Trees (balanced) | O(log n) | O(log n) | O(log n) | O(log n) |
| Graphs | O(1) | O(v+e) | O(1) | O(v+e) |
| Stacks | O(1) | O(n) | O(n) | O(1) |
| Queues | O(1) | O(n) | O(n) | O(1) |
| Heaps | O(log n) | O(1) | O(log n) | O(log n) |
| Tries | O(k) | O(k) | O(k) | O(k) |

In this table, n is the number of elements in the data structure, v is the number of vertices, e is the number of edges, and k is the length of the key for tries; the tree figures assume a balanced tree. Note that these are average-case complexities; the actual cost varies with the specific implementation and usage pattern.
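As a quick illustration (a Python sketch, not part of the original note), several rows of the table map directly onto built-in types: `dict` is a hash table with average O(1) CRUD, a `list` is an array with O(1) reads but O(n) front deletes, and `heapq` gives O(1) reads of the minimum with O(log n) inserts and deletes:

```python
import heapq

# Hash table (dict): average O(1) create/read/update/delete
inventory = {}
inventory["apple"] = 3        # create: O(1)
count = inventory["apple"]    # read:   O(1)
inventory["apple"] = 5        # update: O(1)
del inventory["apple"]        # delete: O(1)

# Array (list): O(1) read by index, but O(n) delete at the front
nums = [10, 20, 30, 40]
first = nums[0]               # read:   O(1)
nums.pop(0)                   # delete: O(n) -- shifts the remaining elements left

# Heap (heapq): O(1) read of the min, O(log n) insert/delete
h = []
heapq.heappush(h, 7)          # create: O(log n)
heapq.heappush(h, 3)
smallest = h[0]               # read:   O(1) -- the min is always at index 0
heapq.heappop(h)              # delete: O(log n) -- removes the min and re-heapifies
```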

Example

  • Array: Fast index access, fixed size.
  • Linked List: Frequent inserts/deletes.
  • Hash Table: Fast lookup by key.
  • Tree: Sorted data, range queries.
  • Graph: Represent networks.
  • Stack: LIFO (e.g., recursion).
  • Queue: FIFO (e.g., tasks).
  • Heap: Get min/max fast.
  • Trie: Prefix searches.
  • Union-Find: Group tracking, dynamic connectivity.
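Union-Find appears in the list above but not in the table. A minimal sketch (a hypothetical `UnionFind` class, assuming path compression and union by size) makes each operation amortized near-O(1), which is why it is the standard choice for dynamic connectivity:

```python
class UnionFind:
    """Disjoint-set with path compression and union by size.

    With both optimizations, each find/union is amortized
    near-O(1) (inverse Ackermann) in the number of elements.
    """

    def __init__(self, n):
        self.parent = list(range(n))  # each element starts as its own root
        self.size = [1] * n

    def find(self, x):
        # Path compression: repoint nodes toward the root as we walk up.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already in the same group
        # Union by size: attach the smaller tree under the larger root.
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        return True

    def connected(self, a, b):
        return self.find(a) == self.find(b)
```

For example, after `union(0, 1)` and `union(1, 2)`, `connected(0, 2)` is true even though 0 and 2 were never united directly.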

Reference