Algorithm - Insertion Sort
2. Shift
- At the start of each pass, compare the value stored in the temporary variable with the value immediately to the left of the empty position.
- If the left value is greater than the temporary variable, move the left value into the empty position.
(This looks like pushing values to the side, which is why it's called a shift.)
- Keep comparing and shifting this way. When the empty position reaches index 0, or the value to the left is no greater than the temporary variable, place the temporary variable in the empty position and the pass is complete.
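The pass/shift steps above can be sketched in Python like this (a minimal sketch; the function name and test data are illustrative, not from the original):

```python
def insertion_sort(values):
    """In-place insertion sort following the pass/shift steps above."""
    for i in range(1, len(values)):
        temp = values[i]   # lift the current value out, leaving an empty position
        hole = i
        # Shift larger neighbors right until the hole reaches index 0
        # or the value to the left is no greater than temp.
        while hole > 0 and values[hole - 1] > temp:
            values[hole] = values[hole - 1]
            hole -= 1
        values[hole] = temp  # drop the lifted value into the empty position

data = [5, 2, 4, 1, 3]
insertion_sort(data)
print(data)  # → [1, 2, 3, 4, 5]
```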
Breaking down the above process into worst-case analysis:
At first glance, since there are two loops, you can predict O(N^2).
For a more precise calculation, using an array of 5 elements as an example:
Pass 1: 1 comparison and 1 shift,
Pass 2: 2 comparisons and 2 shifts,
...
Pass 4: 4 comparisons and 4 shifts,
Adding them all up gives 20 operations. In general, that is 2 x (1 + 2 + ... + (N - 1)) = N(N - 1) ≈ N^2 operations, so Insertion Sort is roughly O(N^2).
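We can verify the 20-operation worst case by instrumenting the sort to count comparisons and shifts (a sketch; the counter function is illustrative). A reverse-sorted array of 5 elements is the worst case:

```python
def count_operations(values):
    """Insertion sort that also counts comparisons and shifts."""
    comparisons = shifts = 0
    for i in range(1, len(values)):
        temp = values[i]
        hole = i
        while hole > 0:
            comparisons += 1
            if values[hole - 1] > temp:
                values[hole] = values[hole - 1]  # shift the larger value right
                shifts += 1
                hole -= 1
            else:
                break
        values[hole] = temp
    return comparisons, shifts

# Worst case for N = 5: a reverse-sorted array.
comps, shifts = count_operations([5, 4, 3, 2, 1])
print(comps + shifts)  # → 20
```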
Therefore, Insertion Sort falls into the category of slow sorting algorithms, just like Bubble Sort and Selection Sort introduced earlier.
However, worst-case speed is not the only measure of an algorithm.
In practice, Insertion Sort runs about 1.3-2x faster than Selection Sort on average.
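One rough way to see this gap, assuming we measure cost by comparison counts rather than wall-clock time (a sketch, not a benchmark), is that Selection Sort always performs N(N - 1)/2 comparisons, while Insertion Sort on random input can stop each pass early:

```python
import random

def insertion_sort_comparisons(values):
    """Comparisons made by insertion sort (early exit included)."""
    count = 0
    for i in range(1, len(values)):
        temp, hole = values[i], i
        while hole > 0:
            count += 1
            if values[hole - 1] > temp:
                values[hole] = values[hole - 1]
                hole -= 1
            else:
                break  # early exit: the rest of the left side is already smaller
        values[hole] = temp
    return count

def selection_sort_comparisons(values):
    """Comparisons made by selection sort: always N(N - 1)/2."""
    count = 0
    for i in range(len(values) - 1):
        smallest = i
        for j in range(i + 1, len(values)):
            count += 1
            if values[j] < values[smallest]:
                smallest = j
        values[i], values[smallest] = values[smallest], values[i]
    return count

data = [random.randrange(1000) for _ in range(200)]
ins = insertion_sort_comparisons(data[:])
sel = selection_sort_comparisons(data[:])
print(ins <= sel)  # → True (insertion sort never needs more comparisons)
```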
The interesting thing is that no matter how favorable the input, Bubble Sort and Selection Sort compare every value on every pass, so even their best-case running time is O(N^2).
Insertion Sort, however, ends a pass as soon as the value to the left is smaller than the temporary variable, so on already-sorted input each pass finishes after a single comparison.
That comes to roughly 2N operations in total, an O(N) time complexity.
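The best case is easy to demonstrate by counting comparisons on already-sorted input (a sketch; the counter function is illustrative):

```python
def count_comparisons(values):
    """Insertion sort, counting only comparisons."""
    comparisons = 0
    for i in range(1, len(values)):
        temp = values[i]
        hole = i
        while hole > 0:
            comparisons += 1
            if values[hole - 1] > temp:
                values[hole] = values[hole - 1]
                hole -= 1
            else:
                break  # left value is smaller: pass ends after one comparison
        values[hole] = temp
    return comparisons

# Already-sorted input: each of the N - 1 passes stops after one comparison.
print(count_comparisons([1, 2, 3, 4, 5]))  # → 4
```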
Therefore, judging which algorithm is faster in a given situation also matters.
That is the key point to take away as we wrap up.