mindstalk: (thoughtful)
Of the standard fast sorting algorithms, heap and quick are in-place but unstable, while merge is stable (or can be) but takes O(n) extra space. You can decorate the input with the original sequencing to break ties and make any sort stable, but that also takes O(n) extra space. Insertion sort is stable and in-place, but takes O(n^2) time; if you insert into a tree it becomes faster, but takes space again (as you're not just copying the data, but creating two pointers per item, too).
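For concreteness, here's a minimal Python sketch of that decoration trick (the names are mine, and heapq just stands in for whatever unstable sort you already have): pairing each element with its original index means ties compare on position, so the output comes out stable, at the cost of the O(n) index array.

```python
import heapq

def stable_sort_via_decoration(items, key=lambda x: x):
    # Decorate: (key, original index, item) -- the O(n) extra space.
    decorated = [(key(x), i, x) for i, x in enumerate(items)]
    heapq.heapify(decorated)          # an unstable in-place heap...
    result = []
    while decorated:                  # ...but equal keys now fall back to
        result.append(heapq.heappop(decorated)[2])  # the index, so it's stable
    return result

print(stable_sort_via_decoration([(1, 'b'), (0, 'z'), (1, 'a')],
                                 key=lambda p: p[0]))
# [(0, 'z'), (1, 'b'), (1, 'a')] -- equal keys keep their input order
```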

If I understand radix sort right, it has worst-case O(n*log(n)) performance (if there are n distinct entries, they take at least log(n) bits to represent, and radix sort is O(n*numbits)). The LSD version is stable, but I think takes extra space; MSD can be made stable, but takes extra space for that.
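A hedged sketch of LSD radix sort on non-negative integers (base 256, one byte per pass), just to show where the stability and the space come from: each counting-sort pass scans left to right, so equal digits keep their order, but the output buffer is the O(n) extra space.

```python
def lsd_radix_sort(nums):
    nums = list(nums)
    if not nums:
        return nums
    maxval = max(nums)
    shift = 0
    while (maxval >> shift) > 0:
        # Count occurrences of each byte value at this position.
        counts = [0] * 256
        for x in nums:
            counts[(x >> shift) & 0xFF] += 1
        # Prefix sums give the starting slot for each byte value.
        pos = [0] * 256
        for d in range(1, 256):
            pos[d] = pos[d - 1] + counts[d - 1]
        out = [0] * len(nums)          # the O(n) auxiliary buffer
        for x in nums:                 # left-to-right pass keeps it stable
            d = (x >> shift) & 0xFF
            out[pos[d]] = x
            pos[d] += 1
        nums = out
        shift += 8
    return nums

print(lsd_radix_sort([170, 45, 75, 90, 2, 802, 24, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```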

So, it seems like a pattern! Except Wikipedia intimates the existence of in-place stable mergesorts in O(n * log n * log n) -- okay, that still takes extra time -- or in just O(n log n) but with high constant factors. And there's block sort, allegedly an in-place, stable, worst-case O(n log n), best-case O(n) algorithm. Which sounds like the Ultimate Sorting Algorithm, doing everything you'd want. Why haven't I heard of it more? Apart from its seeming pretty complicated.
