mirror of https://github.com/krahets/hello-algo.git
synced 2025-11-02 12:58:42 +08:00
translation: Capitalize all the headers, list headers and figure captions (#1206)
* Capitalize all the headers, list headers and figure captions
* Fix the term "LRU"
* Fix the names of source code link in avl_tree.md
* Capitalize only first letter for nav trees in mkdocs.yml
* Update code comments
* Update linked_list.md
* Update linked_list.md
@@ -1,12 +1,12 @@
-# Arrays
+# Array

An "array" is a linear data structure that operates as a lineup of similar items, stored together in a computer's memory in contiguous spaces. It's like a sequence that maintains organized storage. Each item in this lineup has its unique 'spot' known as an "index". Please refer to the figure below to observe how arrays work and grasp these key terms.

-
+

-## Common Operations on Arrays
+## Common operations on arrays

-### Initializing Arrays
+### Initializing arrays

Arrays can be initialized in two ways depending on the needs: either without initial values or with specified initial values. When initial values are not specified, most programming languages will set the array elements to $0$:

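The two initialization styles can be sketched in Python (illustrative only, not the book's bundled source; Python lists stand in for fixed-length arrays here):

```python
# Initialization without initial values: elements default to 0
arr: list[int] = [0] * 5

# Initialization with specified initial values
nums: list[int] = [1, 3, 2, 5, 4]

print(arr)   # [0, 0, 0, 0, 0]
print(nums)  # [1, 3, 2, 5, 4]
```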
@@ -119,11 +119,11 @@ Arrays can be initialized in two ways depending on the needs: either without ini
var nums = [_]i32{ 1, 3, 2, 5, 4 };
```

-### Accessing Elements
+### Accessing elements

Elements in an array are stored in contiguous memory spaces, making it simpler to compute each element's memory address. The formula shown in the figure below aids in determining an element's memory address, utilizing the array's memory address (specifically, the first element's address) and the element's index. This computation streamlines direct access to the desired element.

-
+

As observed in the above illustration, array indexing conventionally begins at $0$. While this might appear counterintuitive, considering counting usually starts at $1$, within the address calculation formula, **an index is essentially an offset from the memory address**. For the first element's address, this offset is $0$, validating its index as $0$.

@@ -133,11 +133,11 @@ Accessing elements in an array is highly efficient, allowing us to randomly acce
[file]{array}-[class]{}-[func]{random_access}
```

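Because an index is just an offset, access at any index takes constant time. A minimal Python sketch consistent with the description (the function name mirrors the `random_access` include above, but this is an illustrative approximation, not the book's listing):

```python
import random

def random_access(nums: list[int]) -> int:
    """Return an element at a random index; the access itself is O(1)."""
    # Any index in [0, len(nums)) maps directly to a fixed memory offset
    random_index = random.randint(0, len(nums) - 1)
    return nums[random_index]

nums = [1, 3, 2, 5, 4]
assert random_access(nums) in nums
```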
-### Inserting Elements
+### Inserting elements

Array elements are tightly packed in memory, with no space available to accommodate additional data between them. Illustrated in the figure below, inserting an element in the middle of an array requires shifting all subsequent elements back by one position to create room for the new element.

-
+

It's important to note that due to the fixed length of an array, inserting an element will unavoidably result in the loss of the last element in the array. Solutions to address this issue will be explored in the "List" chapter.

@@ -145,11 +145,11 @@ It's important to note that due to the fixed length of an el
[file]{array}-[class]{}-[func]{insert}
```

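The shifting described above can be sketched as follows (an illustrative Python version, not the book's source include; note how the last element is overwritten and lost):

```python
def insert(nums: list[int], num: int, index: int) -> None:
    """Insert num at index, shifting later elements back; the last element is lost."""
    # Move elements [index, n-2] back by one position, starting from the end
    for i in range(len(nums) - 1, index, -1):
        nums[i] = nums[i - 1]
    nums[index] = num

nums = [1, 3, 2, 5, 4]
insert(nums, 6, 3)
print(nums)  # [1, 3, 2, 6, 5] — the former last element 4 is gone
```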
-### Deleting Elements
+### Deleting elements

Similarly, as depicted in the figure below, to delete an element at index $i$, all elements following index $i$ must be moved forward by one position.

-
+

Please note that after deletion, the former last element becomes "meaningless," hence requiring no specific modification.

@@ -159,11 +159,11 @@ Please note that after deletion, the former last element becomes "meaningless,"

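The forward shift can be sketched like this (illustrative Python, not the book's source include; the stale tail value is left in place on purpose):

```python
def remove(nums: list[int], index: int) -> None:
    """Delete the element at index by shifting later elements forward."""
    for i in range(index, len(nums) - 1):
        nums[i] = nums[i + 1]
    # nums[-1] still holds a stale copy; it is now "meaningless"

nums = [1, 3, 2, 5, 4]
remove(nums, 2)
print(nums)  # [1, 3, 5, 4, 4]
```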
In summary, the insertion and deletion operations in arrays present the following disadvantages:

-- **High Time Complexity**: Both insertion and deletion in an array have an average time complexity of $O(n)$, where $n$ is the length of the array.
-- **Loss of Elements**: Due to the fixed length of arrays, elements that exceed the array's capacity are lost during insertion.
-- **Waste of Memory**: Initializing a longer array and utilizing only the front part results in "meaningless" end elements during insertion, leading to some wasted memory space.
+- **High time complexity**: Both insertion and deletion in an array have an average time complexity of $O(n)$, where $n$ is the length of the array.
+- **Loss of elements**: Due to the fixed length of arrays, elements that exceed the array's capacity are lost during insertion.
+- **Waste of memory**: Initializing a longer array and utilizing only the front part results in "meaningless" end elements during insertion, leading to some wasted memory space.

-### Traversing Arrays
+### Traversing arrays

In most programming languages, we can traverse an array either by using indices or by directly iterating over each element:

@@ -171,7 +171,7 @@ In most programming languages, we can traverse an array either by using indices
[file]{array}-[class]{}-[func]{traverse}
```

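Both traversal styles in one illustrative Python sketch (not the book's source include):

```python
def traverse(nums: list[int]) -> int:
    """Sum the array twice: once by index, once by direct iteration."""
    count = 0
    for i in range(len(nums)):  # traverse by index
        count += nums[i]
    for num in nums:            # traverse elements directly
        count += num
    return count

assert traverse([1, 3, 2, 5, 4]) == 30
```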
-### Finding Elements
+### Finding elements

Locating a specific element within an array involves iterating through the array, checking each element to determine if it matches the desired value.

@@ -181,7 +181,7 @@ Because arrays are linear data structures, this operation is commonly referred t
[file]{array}-[class]{}-[func]{find}
```

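Linear search over an array, as an illustrative Python sketch (not the book's source include):

```python
def find(nums: list[int], target: int) -> int:
    """Linear search: return the index of target, or -1 if absent."""
    for i in range(len(nums)):
        if nums[i] == target:
            return i
    return -1

nums = [1, 3, 2, 5, 4]
assert find(nums, 2) == 2
assert find(nums, 6) == -1
```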
-### Expanding Arrays
+### Expanding arrays

In complex system environments, ensuring the availability of memory space after an array for safe capacity extension becomes challenging. Consequently, in most programming languages, **the length of an array is immutable**.

@@ -191,26 +191,26 @@ To expand an array, it's necessary to create a larger array and then copy the e
[file]{array}-[class]{}-[func]{extend}
```

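Expansion means allocating a larger array and copying everything over, which the following illustrative Python sketch shows (not the book's source include):

```python
def extend(nums: list[int], enlarge: int) -> list[int]:
    """Return a new array of length len(nums) + enlarge with the data copied over."""
    res = [0] * (len(nums) + enlarge)   # allocate the larger array
    for i in range(len(nums)):          # copy each element, O(n)
        res[i] = nums[i]
    return res

nums = [1, 3, 2, 5, 4]
nums = extend(nums, 3)
print(nums)  # [1, 3, 2, 5, 4, 0, 0, 0]
```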
-## Advantages and Limitations of Arrays
+## Advantages and limitations of arrays

Arrays are stored in contiguous memory spaces and consist of elements of the same type. This approach provides substantial prior information that systems can leverage to optimize the efficiency of data structure operations.

-- **High Space Efficiency**: Arrays allocate a contiguous block of memory for data, eliminating the need for additional structural overhead.
-- **Support for Random Access**: Arrays allow $O(1)$ time access to any element.
-- **Cache Locality**: When accessing array elements, the computer not only loads them but also caches the surrounding data, utilizing high-speed cache to enchance subsequent operation speeds.
+- **High space efficiency**: Arrays allocate a contiguous block of memory for data, eliminating the need for additional structural overhead.
+- **Support for random access**: Arrays allow $O(1)$ time access to any element.
+- **Cache locality**: When accessing array elements, the computer not only loads them but also caches the surrounding data, utilizing the high-speed cache to enhance the speed of subsequent operations.

However, continuous space storage is a double-edged sword, with the following limitations:

-- **Low Efficiency in Insertion and Deletion**: As arrays accumulate many elements, inserting or deleting elements requires shifting a large number of elements.
-- **Fixed Length**: The length of an array is fixed after initialization. Expanding an array requires copying all data to a new array, incurring significant costs.
-- **Space Wastage**: If the allocated array size exceeds the what is necessary, the extra space is wasted.
+- **Low efficiency in insertion and deletion**: As arrays accumulate many elements, inserting or deleting elements requires shifting a large number of elements.
+- **Fixed length**: The length of an array is fixed after initialization. Expanding an array requires copying all data to a new array, incurring significant costs.
+- **Space wastage**: If the allocated array size exceeds what is necessary, the extra space is wasted.

-## Typical Applications of Arrays
+## Typical applications of arrays

Arrays are fundamental and widely used data structures. They find frequent application in various algorithms and serve in the implementation of complex data structures.

-- **Random Access**: Arrays are ideal for storing data when random sampling is required. By generating a random sequence based on indices, we can achieve random sampling efficiently.
-- **Sorting and Searching**: Arrays are the most commonly used data structure for sorting and searching algorithms. Techniques like quick sort, merge sort, binary search, etc., are primarily operate on arrays.
-- **Lookup Tables**: Arrays serve as efficient lookup tables for quick element or relationship retrieval. For instance, mapping characters to ASCII codes becomes seamless by using the ASCII code values as indices and storing corresponding elements in the array.
-- **Machine Learning**: Within the domain of neural networks, arrays play a pivotal role in executing crucial linear algebra operations involving vectors, matrices, and tensors. Arrays serve as the primary and most extensively used data structure in neural network programming.
-- **Data Structure Implementation**: Arrays serve as the building blocks for implementing various data structures like stacks, queues, hash tables, heaps, graphs, etc. For instance, the adjacency matrix representation of a graph is essentially a two-dimensional array.
+- **Random access**: Arrays are ideal for storing data when random sampling is required. By generating a random sequence based on indices, we can achieve random sampling efficiently.
+- **Sorting and searching**: Arrays are the most commonly used data structure for sorting and searching algorithms. Techniques like quick sort, merge sort, binary search, etc., primarily operate on arrays.
+- **Lookup tables**: Arrays serve as efficient lookup tables for quick element or relationship retrieval. For instance, mapping characters to ASCII codes becomes seamless by using the ASCII code values as indices and storing corresponding elements in the array.
+- **Machine learning**: Within the domain of neural networks, arrays play a pivotal role in executing crucial linear algebra operations involving vectors, matrices, and tensors. Arrays serve as the primary and most extensively used data structure in neural network programming.
+- **Data structure implementation**: Arrays serve as the building blocks for implementing various data structures like stacks, queues, hash tables, heaps, graphs, etc. For instance, the adjacency matrix representation of a graph is essentially a two-dimensional array.

@@ -1,6 +1,6 @@
-# Arrays and Linked Lists
+# Arrays and linked lists

-
+

!!! abstract

@@ -1,4 +1,4 @@
-# Linked Lists
+# Linked list

Memory space is a shared resource among all programs. In a complex system environment, available memory can be dispersed throughout the memory space. We understand that the memory allocated for an array must be continuous. However, for very large arrays, finding a sufficiently large contiguous memory space might be challenging. This is where the flexible advantage of linked lists becomes evident.

@@ -6,7 +6,7 @@ A "linked list" is a linear data structure in which each element is a node objec
The design of linked lists allows for their nodes to be distributed across memory locations without requiring contiguous memory addresses.

-
+

As shown in the figure, we see that the basic building block of a linked list is the "node" object. Each node comprises two key components: the node's "value" and a "reference" to the next node.

@@ -20,7 +20,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
```python title=""
class ListNode:
-    """Linked List Node Class"""
+    """Linked list node class"""
    def __init__(self, val: int):
        self.val: int = val  # Node value
        self.next: ListNode | None = None  # Reference to the next node
@@ -29,7 +29,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "C++"

```cpp title=""
-/* Linked List Node Structure */
+/* Linked list node structure */
struct ListNode {
    int val;        // Node value
    ListNode *next; // Pointer to the next node
@@ -40,7 +40,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "Java"

```java title=""
-/* Linked List Node Class */
+/* Linked list node class */
class ListNode {
    int val;       // Node value
    ListNode next; // Reference to the next node
@@ -51,7 +51,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "C#"

```csharp title=""
-/* Linked List Node Class */
+/* Linked list node class */
class ListNode(int x) { // Constructor
    int val = x;        // Node value
    ListNode? next;     // Reference to the next node
@@ -61,7 +61,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "Go"

```go title=""
-/* Linked List Node Structure */
+/* Linked list node structure */
type ListNode struct {
    Val  int       // Node value
    Next *ListNode // Pointer to the next node
|
||||
=== "Swift"

```swift title=""
-/* Linked List Node Class */
+/* Linked list node class */
class ListNode {
    var val: Int        // Node value
    var next: ListNode? // Reference to the next node
@@ -93,7 +93,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "JS"

```javascript title=""
-/* Linked List Node Class */
+/* Linked list node class */
class ListNode {
    constructor(val, next) {
        this.val = (val === undefined ? 0 : val); // Node value
@@ -105,7 +105,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "TS"

```typescript title=""
-/* Linked List Node Class */
+/* Linked list node class */
class ListNode {
    val: number;
    next: ListNode | null;
@@ -119,7 +119,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "Dart"

```dart title=""
-/* 链表节点类 */
+/* Linked list node class */
class ListNode {
    int val;        // Node value
    ListNode? next; // Reference to the next node
@@ -132,7 +132,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
|
||||
```rust title=""
use std::rc::Rc;
use std::cell::RefCell;
-/* Linked List Node Class */
+/* Linked list node class */
#[derive(Debug)]
struct ListNode {
    val: i32, // Node value
@@ -143,7 +143,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "C"

```c title=""
-/* Linked List Node Structure */
+/* Linked list node structure */
typedef struct ListNode {
    int val;               // Node value
    struct ListNode *next; // Pointer to the next node
@@ -168,7 +168,7 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
=== "Zig"

```zig title=""
-// Linked List Node Class
+// Linked list node class
pub fn ListNode(comptime T: type) type {
    return struct {
        const Self = @This();
@@ -185,9 +185,9 @@ As the code below illustrates, a `ListNode` in a linked list, besides holding a
}
```

-## Common Operations on Linked Lists
+## Common operations on linked lists

-### Initializing a Linked List
+### Initializing a linked list

Constructing a linked list is a two-step process: first, initializing each node object, and second, forming the reference links between the nodes. After initialization, we can traverse all nodes sequentially from the head node by following the `next` reference.

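The two-step construction reads as follows in an illustrative Python sketch (not the book's source include):

```python
class ListNode:
    """Linked list node class"""
    def __init__(self, val: int):
        self.val: int = val
        self.next: "ListNode | None" = None

# Step 1: initialize each node object
n0, n1, n2, n3, n4 = (ListNode(v) for v in (1, 3, 2, 5, 4))
# Step 2: form the reference links: 1 -> 3 -> 2 -> 5 -> 4
n0.next, n1.next, n2.next, n3.next = n1, n2, n3, n4

# Traverse from the head node via next references
vals = []
node: "ListNode | None" = n0
while node:
    vals.append(node.val)
    node = node.next
print(vals)  # [1, 3, 2, 5, 4]
```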
@@ -404,31 +404,31 @@ Constructing a linked list is a two-step process: first, initializing each node

The array as a whole is a variable, for instance, the array `nums` includes elements like `nums[0]`, `nums[1]`, and so on, whereas a linked list is made up of several distinct node objects. **We typically refer to a linked list by its head node**, for example, the linked list in the previous code snippet is referred to as `n0`.

-### Inserting a Node
+### Inserting nodes

Inserting a node into a linked list is very easy. As shown in the figure, let's assume we aim to insert a new node `P` between two adjacent nodes `n0` and `n1`. **This can be achieved by simply modifying two node references (pointers)**, with a time complexity of $O(1)$.

By comparison, inserting an element into an array has a time complexity of $O(n)$, which becomes less efficient when dealing with large data volumes.

-
+

```src
[file]{linked_list}-[class]{}-[func]{insert}
```

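The two reference changes can be sketched in Python (illustrative, not the book's source include):

```python
class ListNode:
    def __init__(self, val: int):
        self.val = val
        self.next: "ListNode | None" = None

def insert(n0: ListNode, P: ListNode) -> None:
    """Insert node P right after n0: only two references change, O(1)."""
    P.next = n0.next   # P now points at the old successor n1
    n0.next = P        # n0 now points at P

n0, n1 = ListNode(1), ListNode(2)
n0.next = n1
insert(n0, ListNode(9))
assert (n0.val, n0.next.val, n0.next.next.val) == (1, 9, 2)
```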
-### Deleting a Node
+### Deleting nodes

As shown in the figure, deleting a node from a linked list is also very easy, **involving only the modification of a single node's reference (pointer)**.

It's important to note that even though node `P` continues to point to `n1` after being deleted, it becomes inaccessible during linked list traversal. This effectively means that `P` is no longer a part of the linked list.

-
+

```src
[file]{linked_list}-[class]{}-[func]{remove}
```

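Deletion changes a single reference, as this illustrative Python sketch shows (not the book's source include):

```python
class ListNode:
    def __init__(self, val: int):
        self.val = val
        self.next: "ListNode | None" = None

def remove(n0: ListNode) -> None:
    """Remove the node right after n0: one reference changes, O(1)."""
    if n0.next is None:
        return
    n0.next = n0.next.next  # bypass the deleted node

n0, P, n1 = ListNode(1), ListNode(9), ListNode(2)
n0.next, P.next = P, n1
remove(n0)
assert n0.next is n1  # P still points at n1, but is unreachable from the head
```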
-### Accessing Nodes
+### Accessing nodes

**Accessing nodes in a linked list is less efficient**. As previously mentioned, any element in an array can be accessed in $O(1)$ time. In contrast, with a linked list, the program involves starting from the head node and sequentially traversing through the nodes until the desired node is found. In other words, to access the $i$-th node in a linked list, the program must iterate through $i - 1$ nodes, resulting in a time complexity of $O(n)$.

@@ -436,7 +436,7 @@ It's important to note that even though node `P` continues to point to `n1` afte
[file]{linked_list}-[class]{}-[func]{access}
```

-### Finding Nodes
+### Finding nodes

Traverse the linked list to locate a node whose value matches `target`, and then output the index of that node within the linked list. This procedure is also an example of linear search. The corresponding code is provided below:

@@ -444,11 +444,11 @@ Traverse the linked list to locate a node whose value matches `target`, and then
[file]{linked_list}-[class]{}-[func]{find}
```

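Both node access and linear search walk the chain from the head, as in this illustrative Python sketch (not the book's source includes):

```python
class ListNode:
    def __init__(self, val: int):
        self.val = val
        self.next: "ListNode | None" = None

def access(head: "ListNode | None", index: int) -> "ListNode | None":
    """Walk index steps from the head: O(n) time."""
    for _ in range(index):
        if head is None:
            return None
        head = head.next
    return head

def find(head: "ListNode | None", target: int) -> int:
    """Linear search: index of the first node with value target, or -1."""
    index = 0
    while head:
        if head.val == target:
            return index
        head = head.next
        index += 1
    return -1

# Build the list 1 -> 3 -> 2
head = ListNode(1); head.next = ListNode(3); head.next.next = ListNode(2)
assert access(head, 2).val == 2
assert find(head, 3) == 1
assert find(head, 9) == -1
```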
-## Arrays vs. Linked Lists
+## Arrays vs. linked lists

The table below summarizes the characteristics of arrays and linked lists, and it also compares their efficiencies in various operations. Because they utilize opposing storage strategies, their respective properties and operational efficiencies exhibit distinct contrasts.

-<p align="center"> Table <id> Efficiency Comparison of Arrays and Linked Lists </p>
+<p align="center"> Table <id> Efficiency comparison of arrays and linked lists </p>

| | Arrays | Linked Lists |
| ------------------ | ------------------------------------------------ | ----------------------- |
@@ -459,19 +459,19 @@ The table below summarizes the characteristics of arrays and linked lists, and i
| Adding Elements | $O(n)$ | $O(1)$ |
| Deleting Elements | $O(n)$ | $O(1)$ |

-## Common Types of Linked Lists
+## Common types of linked lists

As shown in the figure, there are three common types of linked lists.

-- **Singly Linked List**: This is the standard linked list described earlier. Nodes in a singly linked list include a value and a reference to the next node. The first node is known as the head node, and the last node, which points to null (`None`), is the tail node.
-- **Circular Linked List**: This is formed when the tail node of a singly linked list points back to the head node, creating a loop. In a circular linked list, any node can function as the head node.
-- **Doubly Linked List**: In contrast to a singly linked list, a doubly linked list maintains references in two directions. Each node contains references (pointer) to both its successor (the next node) and predecessor (the previous node). Although doubly linked lists offer more flexibility for traversing in either direction, they also consume more memory space.
+- **Singly linked list**: This is the standard linked list described earlier. Nodes in a singly linked list include a value and a reference to the next node. The first node is known as the head node, and the last node, which points to null (`None`), is the tail node.
+- **Circular linked list**: This is formed when the tail node of a singly linked list points back to the head node, creating a loop. In a circular linked list, any node can function as the head node.
+- **Doubly linked list**: In contrast to a singly linked list, a doubly linked list maintains references in two directions. Each node contains references (pointers) to both its successor (the next node) and predecessor (the previous node). Although doubly linked lists offer more flexibility for traversing in either direction, they also consume more memory space.

=== "Python"

```python title=""
class ListNode:
-    """Bidirectional linked list node class""""
+    """Bidirectional linked list node class"""
    def __init__(self, val: int):
        self.val: int = val  # Node value
        self.next: ListNode | None = None  # Reference to the successor node
@@ -664,23 +664,23 @@ As shown in the figure, there are three common types of linked lists.
}
```

-
+

-## Typical Applications of Linked Lists
+## Typical applications of linked lists

Singly linked lists are frequently utilized in implementing stacks, queues, hash tables, and graphs.

-- **Stacks and Queues**: In singly linked lists, if insertions and deletions occur at the same end, it behaves like a stack (last-in-first-out). Conversely, if insertions are at one end and deletions at the other, it functions like a queue (first-in-first-out).
-- **Hash Tables**: Linked lists are used in chaining, a popular method for resolving hash collisions. Here, all collided elements are grouped into a linked list.
+- **Stacks and queues**: In singly linked lists, if insertions and deletions occur at the same end, it behaves like a stack (last-in-first-out). Conversely, if insertions are at one end and deletions at the other, it functions like a queue (first-in-first-out).
+- **Hash tables**: Linked lists are used in chaining, a popular method for resolving hash collisions. Here, all collided elements are grouped into a linked list.
- **Graphs**: Adjacency lists, a standard method for graph representation, associate each graph vertex with a linked list. This list contains elements that represent vertices connected to the corresponding vertex.

Doubly linked lists are ideal for scenarios requiring rapid access to preceding and succeeding elements.

-- **Advanced Data Structures**: In structures like red-black trees and B-trees, accessing a node's parent is essential. This is achieved by incorporating a reference to the parent node in each node, akin to a doubly linked list.
-- **Browser History**: In web browsers, doubly linked lists facilitate navigating the history of visited pages when users click forward or back.
-- **LRU Algorithm**: Doubly linked lists are apt for Least Recently Used (LRU) cache eviction algorithms, enabling swift identification of the least recently used data and facilitating fast node addition and removal.
+- **Advanced data structures**: In structures like red-black trees and B-trees, accessing a node's parent is essential. This is achieved by incorporating a reference to the parent node in each node, akin to a doubly linked list.
+- **Browser history**: In web browsers, doubly linked lists facilitate navigating the history of visited pages when users click forward or back.
+- **LRU algorithm**: Doubly linked lists are apt for Least Recently Used (LRU) cache eviction algorithms, enabling swift identification of the least recently used data and facilitating fast node addition and removal.

Circular linked lists are ideal for applications that require periodic operations, such as resource scheduling in operating systems.

-- **Round-Robin Scheduling Algorithm**: In operating systems, the round-robin scheduling algorithm is a common CPU scheduling method, requiring cycling through a group of processes. Each process is assigned a time slice, and upon expiration, the CPU rotates to the next process. This cyclical operation can be efficiently realized using a circular linked list, allowing for a fair and time-shared system among all processes.
-- **Data Buffers**: Circular linked lists are also used in data buffers, like in audio and video players, where the data stream is divided into multiple buffer blocks arranged in a circular fashion for seamless playback.
+- **Round-robin scheduling algorithm**: In operating systems, the round-robin scheduling algorithm is a common CPU scheduling method, requiring cycling through a group of processes. Each process is assigned a time slice, and upon expiration, the CPU rotates to the next process. This cyclical operation can be efficiently realized using a circular linked list, allowing for a fair and time-shared system among all processes.
+- **Data buffers**: Circular linked lists are also used in data buffers, like in audio and video players, where the data stream is divided into multiple buffer blocks arranged in a circular fashion for seamless playback.

@@ -11,9 +11,9 @@ To solve this problem, we can implement lists using a "dynamic array." It inheri

In fact, **many programming languages' standard libraries implement lists using dynamic arrays**, such as Python's `list`, Java's `ArrayList`, C++'s `vector`, and C#'s `List`. In the following discussion, we will consider "list" and "dynamic array" as synonymous concepts.

-## Common List Operations
+## Common list operations

-### Initializing a List
+### Initializing a list

We typically use two initialization methods: "without initial values" and "with initial values".

@@ -141,7 +141,7 @@ We typically use two initialization methods: "without initial values" and "with
try nums.appendSlice(&[_]i32{ 1, 3, 2, 5, 4 });
```

-### Accessing Elements
+### Accessing elements

Lists are essentially arrays, thus they can access and update elements in $O(1)$ time, which is very efficient.

@@ -266,7 +266,7 @@ Lists are essentially arrays, thus they can access and update elements in $O(1)$
nums.items[1] = 0; // Update the element at index 1 to 0
```

-### Inserting and Removing Elements
+### Inserting and removing elements

Compared to arrays, lists offer more flexibility in adding and removing elements. While adding elements to the end of a list is an $O(1)$ operation, the efficiency of inserting and removing elements elsewhere in the list remains the same as in arrays, with a time complexity of $O(n)$.

@@ -502,7 +502,7 @@ Compared to arrays, lists offer more flexibility in adding and removing elements
_ = nums.orderedRemove(3); // Remove the element at index 3
```

-### Iterating the List
+### Iterating the list

Similar to arrays, lists can be iterated either by using indices or by directly iterating through each element.

@@ -691,7 +691,7 @@ Similar to arrays, lists can be iterated either by using indices or by directly
}
```

-### Concatenating Lists
+### Concatenating lists

Given a new list `nums1`, we can append it to the end of the original list.

@@ -798,7 +798,7 @@ Given a new list `nums1`, we can append it to the end of the original list.
try nums.insertSlice(nums.items.len, nums1.items); // Concatenate nums1 to the end of nums
```

-### Sorting the List
+### Sorting the list

Once the list is sorted, we can employ algorithms commonly used in array-related algorithm problems, such as "binary search" and "two-pointer" algorithms.

@@ -891,15 +891,15 @@ Once the list is sorted, we can employ algorithms commonly used in array-related
std.sort.sort(i32, nums.items, {}, comptime std.sort.asc(i32));
```

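The list operations covered above, condensed into one Python example (standard `list` methods, shown here as a quick reference rather than the book's per-language listings):

```python
nums: list[int] = [1, 3, 2, 5, 4]

num = nums[1]      # access: O(1)
nums[1] = 0        # update: O(1)
nums.append(6)     # append at the end: amortized O(1)
nums.insert(3, 8)  # insert mid-list: O(n), later elements shift back
nums.pop(3)        # remove mid-list: O(n), later elements shift forward

total = 0
for x in nums:     # iterate directly over elements
    total += x

nums += [7, 9]     # concatenate another list at the end
nums.sort()        # sort in ascending order
print(nums)  # [0, 1, 2, 4, 5, 6, 7, 9]
```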
## List Implementation
|
||||
## List implementation
|
||||
|
||||
Many programming languages come with built-in lists, including Java, C++, Python, etc. Their implementations tend to be intricate, featuring carefully considered settings for various parameters, like initial capacity and expansion factors. Readers who are curious can delve into the source code for further learning.

To enhance our understanding of how lists work, we will attempt to implement a simplified version of a list, focusing on three crucial design aspects:

- **Initial Capacity**: Choose a reasonable initial capacity for the array. In this example, we choose 10 as the initial capacity.
- **Size Recording**: Declare a variable `size` to record the current number of elements in the list, updating in real-time with element insertion and deletion. With this variable, we can locate the end of the list and determine whether expansion is needed.
- **Expansion Mechanism**: If the list reaches full capacity upon an element insertion, an expansion process is required. This involves creating a larger array based on the expansion factor, and then transferring all elements from the current array to the new one. In this example, we stipulate that the array size should double with each expansion.
- **Initial capacity**: Choose a reasonable initial capacity for the array. In this example, we choose 10 as the initial capacity.
- **Size recording**: Declare a variable `size` to record the current number of elements in the list, updating in real-time with element insertion and deletion. With this variable, we can locate the end of the list and determine whether expansion is needed.
- **Expansion mechanism**: If the list reaches full capacity upon an element insertion, an expansion process is required. This involves creating a larger array based on the expansion factor, and then transferring all elements from the current array to the new one. In this example, we stipulate that the array size should double with each expansion.

```src
[file]{my_list}-[class]{my_list}-[func]{}
```
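The three design aspects above can be sketched in a few lines of Python. This is an illustrative simplification, not the repository's actual `my_list` source; the class and member names here are made up for the sketch.

```python
class MyList:
    """Simplified dynamic list: initial capacity of 10, a `size` counter,
    and a doubling expansion mechanism (illustrative sketch only)."""

    def __init__(self):
        self._capacity = 10             # initial capacity
        self._arr = [0] * self._capacity
        self._size = 0                  # current number of elements
        self._extend_ratio = 2          # double the capacity on each expansion

    def size(self) -> int:
        return self._size

    def append(self, num: int):
        # Expand first if the array is already full
        if self._size == self._capacity:
            self._extend()
        self._arr[self._size] = num
        self._size += 1

    def _extend(self):
        # Create a larger array and transfer all elements to it
        self._capacity *= self._extend_ratio
        new_arr = [0] * self._capacity
        for i in range(self._size):
            new_arr[i] = self._arr[i]
        self._arr = new_arr

    def get(self, index: int) -> int:
        if not 0 <= index < self._size:
            raise IndexError("index out of bounds")
        return self._arr[index]
```

Appending an 11th element triggers the first expansion, after which the underlying array holds 20 slots.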
@ -1,14 +1,14 @@

# Memory and Cache *
# Memory and cache *

In the first two sections of this chapter, we explored arrays and linked lists, two fundamental and important data structures, representing "continuous storage" and "dispersed storage" respectively.

In fact, **the physical structure largely determines the efficiency of a program's use of memory and cache**, which in turn affects the overall performance of the algorithm.

## Computer Storage Devices
## Computer storage devices

There are three types of storage devices in computers: "hard disk," "random-access memory (RAM)," and "cache memory." The following table shows their different roles and performance characteristics in computer systems.

<p align="center"> Table <id> Computer Storage Devices </p>
<p align="center"> Table <id> Computer storage devices </p>

| | Hard Disk | Memory | Cache |
| ---------- | -------------------------------------------------------------- | ------------------------------------------------------------------------ | ----------------------------------------------------------------------------------------------- |
@ -23,7 +23,7 @@ We can imagine the computer storage system as a pyramid structure shown in the f

- **Hard disks are difficult to replace with memory**. Firstly, data in memory is lost after power off, making it unsuitable for long-term data storage; secondly, the cost of memory is dozens of times that of hard disks, making it difficult to popularize in the consumer market.
- **It is difficult for caches to have both large capacity and high speed**. As the capacity of L1, L2, L3 caches gradually increases, their physical size becomes larger, increasing the physical distance from the CPU core, leading to increased data transfer time and higher element access latency. Under current technology, a multi-level cache structure is the best balance between capacity, speed, and cost.



!!! note
@ -33,9 +33,9 @@ Overall, **hard disks are used for long-term storage of large amounts of data, m

As shown in the figure below, during program execution, data is read from the hard disk into memory for CPU computation. The cache can be considered a part of the CPU, **smartly loading data from memory** to provide fast data access to the CPU, significantly enhancing program execution efficiency and reducing reliance on slower memory.



## Memory Efficiency of Data Structures
## Memory efficiency of data structures

In terms of memory space utilization, arrays and linked lists have their advantages and limitations.
@ -43,7 +43,7 @@ On one hand, **memory is limited and cannot be shared by multiple programs**, so

On the other hand, during program execution, **as memory is repeatedly allocated and released, the degree of fragmentation of free memory becomes higher**, leading to reduced memory utilization efficiency. Arrays, due to their continuous storage method, are relatively less likely to cause memory fragmentation. In contrast, the elements of a linked list are dispersedly stored, and frequent insertion and deletion operations make memory fragmentation more likely.

## Cache Efficiency of Data Structures
## Cache efficiency of data structures

Although caches are much smaller in space capacity than memory, they are much faster and play a crucial role in program execution speed. Since the cache's capacity is limited and can only store a small part of frequently accessed data, when the CPU tries to access data not in the cache, a "cache miss" occurs, forcing the CPU to load the needed data from slower memory.
@ -51,17 +51,17 @@ Clearly, **the fewer the cache misses, the higher the CPU's data read-write effi

To achieve higher efficiency, caches adopt the following data loading mechanisms.

- **Cache Lines**: Caches don't store and load data byte by byte but in units of cache lines. Compared to byte-by-byte transfer, the transmission of cache lines is more efficient.
- **Prefetch Mechanism**: Processors try to predict data access patterns (such as sequential access, fixed stride jumping access, etc.) and load data into the cache according to specific patterns to improve the hit rate.
- **Spatial Locality**: If data is accessed, data nearby is likely to be accessed in the near future. Therefore, when loading certain data, the cache also loads nearby data to improve the hit rate.
- **Temporal Locality**: If data is accessed, it's likely to be accessed again in the near future. Caches use this principle to retain recently accessed data to improve the hit rate.
- **Cache lines**: Caches don't store and load data byte by byte but in units of cache lines. Compared to byte-by-byte transfer, the transmission of cache lines is more efficient.
- **Prefetch mechanism**: Processors try to predict data access patterns (such as sequential access, fixed stride jumping access, etc.) and load data into the cache according to specific patterns to improve the hit rate.
- **Spatial locality**: If data is accessed, data nearby is likely to be accessed in the near future. Therefore, when loading certain data, the cache also loads nearby data to improve the hit rate.
- **Temporal locality**: If data is accessed, it's likely to be accessed again in the near future. Caches use this principle to retain recently accessed data to improve the hit rate.

In fact, **arrays and linked lists have different cache utilization efficiencies**, mainly reflected in the following aspects.

- **Occupied Space**: Linked list elements occupy more space than array elements, resulting in less effective data volume in the cache.
- **Cache Lines**: Linked list data is scattered throughout memory, and since caches load "by line," the proportion of loading invalid data is higher.
- **Prefetch Mechanism**: The data access pattern of arrays is more "predictable" than that of linked lists, meaning the system is more likely to guess which data will be loaded next.
- **Spatial Locality**: Arrays are stored in concentrated memory spaces, so the data near the loaded data is more likely to be accessed next.
- **Occupied space**: Linked list elements occupy more space than array elements, resulting in less effective data volume in the cache.
- **Cache lines**: Linked list data is scattered throughout memory, and since caches load "by line," the proportion of loading invalid data is higher.
- **Prefetch mechanism**: The data access pattern of arrays is more "predictable" than that of linked lists, meaning the system is more likely to guess which data will be loaded next.
- **Spatial locality**: Arrays are stored in concentrated memory spaces, so the data near the loaded data is more likely to be accessed next.

Overall, **arrays have a higher cache hit rate and are generally more efficient in operation than linked lists**. This makes data structures based on arrays more popular in solving algorithmic problems.
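The contrast drawn in this hunk can be felt in a small experiment: summing the same values through a Python list (contiguous storage of references) and through a hand-built linked list. This is a rough sketch for intuition only; CPython's boxed integers and interpreter overhead blur the raw cache effects, and absolute timings vary by machine.

```python
import time

class ListNode:
    """Minimal singly linked list node (illustrative)."""
    def __init__(self, val: int):
        self.val = val
        self.next = None

def sum_array(arr) -> int:
    total = 0
    for x in arr:        # sequential access: cache-friendly
        total += x
    return total

def sum_linked_list(head) -> int:
    total = 0
    node = head
    while node:          # pointer chasing: scattered memory accesses
        total += node.val
        node = node.next
    return total

n = 100_000
arr = list(range(n))
head = ListNode(0)
cur = head
for i in range(1, n):
    cur.next = ListNode(i)
    cur = cur.next

# Both traversals compute the same result; on most machines the array
# traversal tends to be faster, partly due to better cache locality.
t0 = time.perf_counter(); s1 = sum_array(arr)
t1 = time.perf_counter(); s2 = sum_linked_list(head)
t2 = time.perf_counter()
assert s1 == s2 == n * (n - 1) // 2
```

Only the relative timings are meaningful here; the assertion checks correctness, not speed.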
@ -1,6 +1,6 @@

# Summary

### Key Review
### Key review

- Arrays and linked lists are two basic data structures, representing two storage methods in computer memory: contiguous space storage and non-contiguous space storage. Their characteristics complement each other.
- Arrays support random access and use less memory; however, they are inefficient in inserting and deleting elements and have a fixed length after initialization.
@ -29,7 +29,7 @@ Linked lists consist of nodes connected by references (pointers), and each node

In contrast, array elements must be of the same type, allowing the calculation of offsets to access the corresponding element positions. For example, an array containing both int and long types, with single elements occupying 4 bytes and 8 bytes respectively, cannot use the following formula to calculate offsets, as the array contains elements of two different lengths.

```shell
# Element memory address = Array memory address + Element length * Element index
# Element memory address = array memory address + element length * element index
```
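To make the formula concrete, here is a tiny sketch of the offset arithmetic it describes; the addresses and element size used are hypothetical values chosen for illustration.

```python
def element_address(array_address: int, element_length: int, index: int) -> int:
    """Address of element `index`, valid only when every element
    occupies the same number of bytes."""
    return array_address + element_length * index

# Hypothetical 4-byte int array starting at address 1000:
# element 0 sits at 1000, element 3 at 1000 + 4 * 3 = 1012.
```

With mixed 4-byte and 8-byte elements, no single `element_length` exists, which is exactly why the formula breaks down.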

**Q**: After deleting a node, is it necessary to set `P.next` to `None`?