Bug fixes and improvements (#1205)
* Add Ruby code blocks to documents
* Remove Ruby code from en/docs
* Remove "center-table" class in index.md
* Add "data-toc-label" to handle the latex heading during the build process
* Use normal JD link instead.
* Bug fixes
@@ -1,11 +1,7 @@
# Complexity Analysis
-<div class="center-table" markdown>

-</div>
!!! abstract

    Complexity analysis is like a space-time navigator in the vast universe of algorithms.
@@ -736,7 +736,7 @@ $$

-### Constant Order $O(1)$
+### Constant Order $O(1)$ {data-toc-label="Constant Order"}
Constant order is common in constants, variables, and objects that are independent of the input data size $n$.
@@ -746,7 +746,7 @@ Note that memory occupied by initializing variables or calling functions in a lo
[file]{space_complexity}-[class]{}-[func]{constant}
```
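For intuition, here is a minimal Python sketch of constant-order space (an illustration only, not the `space_complexity` source file referenced above):

```python
def constant_space(n: int) -> int:
    """Uses O(1) space: a fixed number of variables, regardless of n"""
    a = 0
    b = 10000
    for _ in range(n):
        a += 1  # the loop reuses the same variables; no memory grows with n
    return a + b
```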
-### Linear Order $O(n)$
+### Linear Order $O(n)$ {data-toc-label="Linear Order"}
Linear order is common in arrays, linked lists, stacks, queues, etc., where the number of elements is proportional to $n$:
@@ -762,7 +762,7 @@ As shown below, this function's recursive depth is $n$, meaning there are $n$ in

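As a rough sketch (hypothetical names, not the repo's `space_complexity` code), the first function below allocates a list whose size grows with $n$, and the second reaches a recursion depth of $n$, so both use $O(n)$ space:

```python
def linear_space(n: int) -> list[int]:
    """O(n) space: the list stores n elements"""
    return [0] * n

def linear_recur(n: int) -> None:
    """O(n) space: recursion depth n keeps n stack frames alive at once"""
    if n == 1:
        return
    linear_recur(n - 1)
```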
-### Quadratic Order $O(n^2)$
+### Quadratic Order $O(n^2)$ {data-toc-label="Quadratic Order"}
Quadratic order is common in matrices and graphs, where the number of elements grows quadratically with $n$:
@@ -778,7 +778,7 @@ As shown below, the recursive depth of this function is $n$, and in each recursi

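For illustration, a hedged Python sketch of both cases: an $n \times n$ matrix, and a recursion that allocates a list at every level (names are illustrative, not the repo's code):

```python
def quadratic_space(n: int) -> list[list[int]]:
    """O(n^2) space: an n x n matrix"""
    return [[0] * n for _ in range(n)]

def quadratic_recur(n: int) -> int:
    """O(n^2) space: lists of length n, n - 1, ..., 1 are all alive at the deepest call"""
    if n <= 0:
        return 0
    nums = [0] * n  # allocated at every level of the recursion
    return quadratic_recur(n - 1) + len(nums)
```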
-### Exponential Order $O(2^n)$
+### Exponential Order $O(2^n)$ {data-toc-label="Exponential Order"}
Exponential order is common in binary trees. As shown in the image below, a "full binary tree" with $n$ levels has $2^n - 1$ nodes, occupying $O(2^n)$ space:
@@ -788,7 +788,7 @@ Exponential order is common in binary trees. Observe the below image, a "full bi

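A minimal sketch of that growth (hypothetical `TreeNode`, not the repo's tree class): building such a full binary tree allocates $O(2^n)$ nodes:

```python
class TreeNode:
    """Minimal binary tree node used for this illustration"""
    def __init__(self, val: int = 0):
        self.val = val
        self.left: "TreeNode | None" = None
        self.right: "TreeNode | None" = None

def build_tree(n: int) -> "TreeNode | None":
    """O(2^n) space: a full binary tree with n levels has 2^n - 1 nodes"""
    if n == 0:
        return None
    root = TreeNode(0)
    root.left = build_tree(n - 1)
    root.right = build_tree(n - 1)
    return root
```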
-### Logarithmic Order $O(\log n)$
+### Logarithmic Order $O(\log n)$ {data-toc-label="Logarithmic Order"}
Logarithmic order is common in divide-and-conquer algorithms. For example, in merge sort, an array of length $n$ is recursively divided in half each round, forming a recursion tree of height $\log n$, using $O(\log n)$ stack frame space.
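A minimal sketch of the idea (not merge sort itself): the argument is halved on each call, so the recursion depth, and hence the stack space, is $O(\log n)$:

```python
def log_recur(n: int) -> int:
    """O(log n) space: n is halved each call, so recursion depth is about log2(n)"""
    if n <= 1:
        return 0
    return log_recur(n // 2) + 1
```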
@@ -962,7 +962,7 @@ $$

-### Constant Order $O(1)$
+### Constant Order $O(1)$ {data-toc-label="Constant Order"}
Constant order means the number of operations is independent of the input data size $n$. In the following function, although the number of operations `size` might be large, the time complexity remains $O(1)$ as it's unrelated to $n$:
@@ -970,7 +970,7 @@ Constant order means the number of operations is independent of the input data s
[file]{time_complexity}-[class]{}-[func]{constant}
```
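As a rough sketch (not necessarily identical to the `time_complexity` source referenced above), the loop below always runs a fixed number of times, so the operation count does not depend on $n$:

```python
def constant_time(n: int) -> int:
    """O(1) time: the operation count is fixed and unrelated to n"""
    count = 0
    size = 100000  # fixed workload, independent of n
    for _ in range(size):
        count += 1
    return count
```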
-### Linear Order $O(n)$
+### Linear Order $O(n)$ {data-toc-label="Linear Order"}
Linear order indicates the number of operations grows linearly with the input data size $n$. Linear order commonly appears in single-loop structures:
@@ -986,7 +986,7 @@ Operations like array traversal and linked list traversal have a time complexity
It's important to note that **the input data size $n$ should be determined based on the type of input data**. For instance, in the first example, the variable $n$ represents the input data size, while in the second example, the array length $n$ is the data size.
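To make the two cases concrete, here is a hedged sketch (illustrative names): the first loop runs $n$ times for an integer input $n$, while the second visits each of the $n$ elements of an array once:

```python
def linear_time(n: int) -> int:
    """O(n) time: the loop body executes n times"""
    count = 0
    for _ in range(n):
        count += 1
    return count

def array_traversal(nums: list[int]) -> int:
    """O(n) time, where n = len(nums): each element is visited once"""
    count = 0
    for _ in nums:
        count += 1
    return count
```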
-### Quadratic Order $O(n^2)$
+### Quadratic Order $O(n^2)$ {data-toc-label="Quadratic Order"}
Quadratic order means the number of operations grows quadratically with the input data size $n$. Quadratic order typically appears in nested loops, where both the outer and inner loops have a time complexity of $O(n)$, resulting in an overall complexity of $O(n^2)$:
@@ -1004,7 +1004,7 @@ For instance, in bubble sort, the outer loop runs $n - 1$ times, and the inner l
[file]{time_complexity}-[class]{}-[func]{bubble_sort}
```
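A minimal bubble sort sketch (it may differ in detail from the repo's `bubble_sort`): the outer loop runs $n - 1$ times and the inner loop runs $n - 1, n - 2, \dots, 1$ times, giving $O(n^2)$ comparisons overall:

```python
def bubble_sort(nums: list[int]) -> None:
    """O(n^2) time: about n(n-1)/2 adjacent comparisons in total"""
    for i in range(len(nums) - 1, 0, -1):
        for j in range(i):
            if nums[j] > nums[j + 1]:
                # swap adjacent elements that are out of order
                nums[j], nums[j + 1] = nums[j + 1], nums[j]
```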
-### Exponential Order $O(2^n)$
+### Exponential Order $O(2^n)$ {data-toc-label="Exponential Order"}
Biological "cell division" is a classic example of exponential order growth: starting with one cell, it becomes two after one division, four after two divisions, and so on, resulting in $2^n$ cells after $n$ divisions.
@@ -1024,7 +1024,7 @@ In practice, exponential order often appears in recursive functions. For example
Exponential order growth is extremely rapid and is commonly seen in exhaustive search methods (brute force, backtracking, etc.). For large-scale problems, exponential order is unacceptable, often requiring dynamic programming or greedy algorithms as solutions.
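A minimal recursive sketch of that doubling behavior (illustrative, not necessarily the repo's version): each call spawns two further calls, so the call tree has on the order of $2^n$ nodes:

```python
def exp_recur(n: int) -> int:
    """O(2^n) time: the number of calls roughly doubles with each extra level"""
    if n == 1:
        return 1
    return exp_recur(n - 1) + exp_recur(n - 1) + 1
```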
-### Logarithmic Order $O(\log n)$
+### Logarithmic Order $O(\log n)$ {data-toc-label="Logarithmic Order"}
In contrast to exponential order, logarithmic order reflects situations where "the size is halved each round." Given an input data size $n$, since the size is halved each round, the number of iterations is $\log_2 n$, the inverse function of $2^n$.
@@ -1054,7 +1054,7 @@ Logarithmic order is typical in algorithms based on the divide-and-conquer strat
This means the base $m$ can be changed without affecting the complexity. Therefore, we often omit the base $m$ and simply denote logarithmic order as $O(\log n)$.
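For intuition, a minimal sketch of "halving each round" (illustrative only): the loop runs until $n$ reaches 1, i.e. about $\log_2 n$ iterations:

```python
def logarithmic_time(n: int) -> int:
    """O(log n) time: n is halved each iteration"""
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count
```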
-### Linear-Logarithmic Order $O(n \log n)$
+### Linear-Logarithmic Order $O(n \log n)$ {data-toc-label="Linear-Logarithmic Order"}
Linear-logarithmic order often appears in nested loops, with the complexities of the two loops being $O(\log n)$ and $O(n)$ respectively. The related code is as follows:
@@ -1068,7 +1068,7 @@ The image below demonstrates how linear-logarithmic order is generated. Each lev
Mainstream sorting algorithms typically have a time complexity of $O(n \log n)$, such as quicksort, mergesort, and heapsort.
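A hedged sketch of the nested-loop form described above (not mergesort itself): the outer loop halves $m$ (about $\log n$ rounds) while the inner loop does $n$ units of work per round, for $O(n \log n)$ in total:

```python
def linear_log_time(n: int) -> int:
    """O(n log n) time: about log2(n) rounds of n operations each"""
    count = 0
    m = n
    while m > 1:
        for _ in range(n):  # n units of work per round
            count += 1
        m //= 2  # halving m gives about log2(n) rounds
    return count
```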
-### Factorial Order $O(n!)$
+### Factorial Order $O(n!)$ {data-toc-label="Factorial Order"}
Factorial order corresponds to the mathematical problem of "full permutation." Given $n$ distinct elements, the total number of possible permutations is:
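As a rough illustration of the recursion tree behind full permutations (hypothetical function name): the first level branches into $n$ calls, the second into $n - 1$, and so on, so the total number of calls grows as $O(n!)$:

```python
def factorial_recur(n: int) -> int:
    """O(n!) time: counts the n! leaf calls of a permutation-style recursion"""
    if n == 0:
        return 1
    count = 0
    for _ in range(n):  # branch into n subcalls at this level
        count += factorial_recur(n - 1)
    return count
```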