Insertion Sort
⚠️ This post was written back when I was in high school and is no longer maintained.
1. Pseudocode
Here are both the iterative and recursive versions of insertion sort. They follow the same logic; the only difference is whether the passes over the array are driven by a loop or by recursion.
-- Iterative version --
algorithm InsertionSort(array, n):
    for j := 2 to n do
        key := array[j]
        i := j - 1
        while i > 0 and array[i] > key do
            array[i+1] := array[i]    -- shift the larger element one slot to the right
            i := i - 1
        array[i+1] := key
-- Recursive version --
algorithm InsertionRec(A):                 --> A == array
    InsertionRecAux(A, A.length)           --> start with the last element as the key

algorithm InsertionRecAux(A, keyIndex):
    if keyIndex <= 1 then
        return
    InsertionRecAux(A, keyIndex - 1)       --> first sort the prefix A[1..keyIndex-1]
    Insert(A, keyIndex)                    --> then insert the key into it

algorithm Insert(A, keyIndex):
    key := A[keyIndex]
    i := keyIndex - 1
    while i > 0 and A[i] > key do
        A[i+1] := A[i]                     --> shift the larger element one slot to the right
        i := i - 1
    A[i+1] := key
Remarks on the recursive version:
- During Insert(), we assume A is already sorted up to keyIndex - 1. That holds because InsertionRecAux() keeps recursing until it hits the base case (keyIndex <= 1), and only then does the call stack unwind, inserting one key at a time.
- The base case is keyIndex <= 1 because a single-element array is already sorted; the first actual insertion therefore happens at keyIndex = 2.
- The recursion goes down first; the actual insertions happen as the stack returns.
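To make these remarks concrete, here is a minimal Python sketch of the recursive version (not part of the original post; it mirrors InsertionRec/InsertionRecAux/Insert above, but uses 0-based indexing and my own function names):

# Python sketch of the recursive pseudocode above (0-based indexing).
def insertion_rec(a):
    _insertion_rec_aux(a, len(a))

def _insertion_rec_aux(a, key_count):
    # Base case: a prefix of length <= 1 is already sorted.
    if key_count <= 1:
        return
    _insertion_rec_aux(a, key_count - 1)   # first sort a[0..key_count-2]
    _insert(a, key_count - 1)              # then insert a[key_count-1] into it

def _insert(a, key_index):
    key = a[key_index]
    i = key_index - 1
    while i >= 0 and a[i] > key:
        a[i + 1] = a[i]                    # shift larger elements one slot right
        i -= 1
    a[i + 1] = key

# usage
nums = [5, 2, 4, 6, 1, 3]
insertion_rec(nums)
print(nums)   # [1, 2, 3, 4, 5, 6]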
2. Time Complexity
(1) Upper Bound
\(T(n) = \text{for} + \text{for}\cdot\text{while} \approx c_0\cdot n + c_1\cdot\textstyle\sum_{j=2}^{n}(j-1)\), hence \(T(n) \in O(n^2)\)
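Evaluating the sum makes the quadratic term explicit:

\(\sum_{j=2}^{n}(j-1) = \sum_{k=1}^{n-1} k = \frac{n(n-1)}{2}\)

so \(T(n) \approx c_0\, n + c_1\,\frac{n(n-1)}{2} \in O(n^2)\).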
(2) Θ and O Notation
The worst-case running time of insertion sort can be stated as \(O(n^2)\) or, more precisely, \(\Theta(n^2)\). The running time of insertion sort on an arbitrary input, however, is in \(O(n^2)\) but NOT in \(\Theta(n^2)\), because not every input forces quadratic work.
Recall the definitions of \(\Theta\) and \(O\):

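For convenience, the standard definitions (restated here; they are not reproduced in the original post):

\(f(n) \in O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 : 0 \le f(n) \le c\, g(n) \text{ for all } n \ge n_0\)

\(f(n) \in \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 > 0 : 0 \le c_1\, g(n) \le f(n) \le c_2\, g(n) \text{ for all } n \ge n_0\)

In words: \(O\) only promises an upper bound, while \(\Theta\) pins the growth rate down from both sides. Since insertion sort runs in linear time on already-sorted inputs, \(n^2\) is a valid upper bound over all inputs but not a tight two-sided bound.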
The best-case running time of insertion sort is \(\Theta(n)\): on an already-sorted array the inner while loop never executes, so each element costs only a constant amount of work.
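As a quick empirical check (not part of the original post; the instrumented counter is a sketch), the snippet below counts how many times the inner while-loop body runs for a sorted and a reverse-sorted input. A sorted input does no shifts at all (best case, \(\Theta(n)\)), while a reverse-sorted input does \(n(n-1)/2\) shifts (worst case, \(\Theta(n^2)\)).

# Count inner-loop shifts performed by insertion sort on a given input.
def count_shifts(array):
    shifts = 0
    for j in range(1, len(array)):
        key = array[j]
        i = j - 1
        while i >= 0 and array[i] > key:
            array[i + 1] = array[i]
            i -= 1
            shifts += 1
        array[i + 1] = key
    return shifts

n = 100
print(count_shifts(list(range(n))))         # sorted input: 0 shifts
print(count_shifts(list(range(n, 0, -1))))  # reverse-sorted: n*(n-1)//2 = 4950 shifts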
Worst-case In-Depth Analysis

Here, \(I(n)\) is the time to insert the \(n\)-th element into a sorted array of length \(n-1\). In the worst case we shift every element, which takes linear time, so \(I(n) = c_1 n + c_2\) for some constants \(c_1\) and \(c_2\). That leads to this recurrence for \(T(n)\):

\(T(n) = T(n-1) + I(n) = T(n-1) + c_1 n + c_2, \qquad T(1) = c_3\)

(writing \(c_3\) for the constant cost of the single-element base case). Using the substitution method, expanding the recurrence \(i\) times, we get:

\(T(n) = T(n-i) + c_1 \sum_{k=n-i+1}^{n} k + i\, c_2\)

We hit the base case when \(n - i = 1\), i.e., \(i = n - 1\). Substituting gives:

\(T(n) = c_3 + c_1\left(\frac{n(n+1)}{2} - 1\right) + (n-1)\, c_2\)

Hence, \(T(n) \in \Theta(n^2)\).
Tip: Unlike merge sort, which splits the array in half at each level, the recursive version of insertion sort peels off one element per call: it keeps recursing until only one element remains (which is already sorted), and only then starts inserting elements on the way back up the call stack.
3. Space Complexity
Insertion sort is an in-place sorting algorithm, meaning it sorts the array using only a constant amount of extra memory. All operations—comparisons, shifts, and insertions—are performed within the original array, so the space complexity is:
\(O(1)\) auxiliary space.
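A minimal sketch of what "in place" means in practice (not from the original post; it re-uses the iterative algorithm only so the snippet is self-contained): the caller's list itself is sorted, and the only extra storage is a few scalar variables.

# In-place: the caller's list is mutated; only `key`, `i`, and `j` are extra.
def insertion_sort_in_place(array):
    for j in range(1, len(array)):
        key = array[j]
        i = j - 1
        while i >= 0 and array[i] > key:
            array[i + 1] = array[i]
            i -= 1
        array[i + 1] = key

data = [5, 2, 4, 6, 1, 3]
insertion_sort_in_place(data)
print(data)   # [1, 2, 3, 4, 5, 6] -- the same list object, sorted in place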
4. Code
# python:
def InsertionSort(array, n):
    # Start at index 1: array[0] on its own is already "sorted".
    for j in range(1, n):
        key = array[j]
        i = j - 1
        # Shift elements larger than key one slot to the right.
        while i >= 0 and array[i] > key:
            array[i+1] = array[i]
            i -= 1
        array[i+1] = key

if __name__ == "__main__":
    array = [4234, 234234, 23, 234, -134, 4123, 41, 578, -342, 55, 0.2323, 0.325]
    print("\nBefore sorting:", array)
    InsertionSort(array, len(array))
    print("\nAfter sorting:", array)
// Scala:
object InsertionSortAlgo {
  def InsertionSort(array: Array[Double], n: Int): Array[Double] = {
    // Start at index 1: array(0) on its own is already "sorted".
    for (j <- 1 until n) {
      val key: Double = array(j)
      var i: Int = j - 1
      // Shift elements larger than key one slot to the right.
      while (i >= 0 && array(i) > key) {
        array(i + 1) = array(i)
        i -= 1
      }
      array(i + 1) = key
    }
    array
  }

  def main(args: Array[String]): Unit = {
    val array = Array[Double](4234, 234234, 23, 234, -134, 4123, 41, 578, -342, 55, 0.2323, 0.325)
    println("\nBefore sorting:")
    println(array.mkString(" "))
    println("\nAfter sorting:")
    println(InsertionSort(array, array.length).mkString(" "))
  }
}