Unconventional Sorting Algorithms

89 points | 2 years ago | codingkaiser.blog
nayuki 2 years ago

My favorite is still slowsort. https://en.wikipedia.org/wiki/Slowsort

> It is a reluctant algorithm based on the principle of multiply and surrender. It was published in 1986 by Andrei Broder and Jorge Stolfi in their paper Pessimal Algorithms and Simplexity Analysis.

https://www.mipmip.org/tidbits/pasa.pdf

> The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
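
For the curious, a minimal Python sketch of slowsort as the paper defines it: slowsort each half, move the larger of the two halves' maxima to the end, then slowsort everything but that last element.

    def slowsort(a, i=0, j=None):
        if j is None:
            j = len(a) - 1
        if i >= j:
            return
        m = (i + j) // 2
        slowsort(a, i, m)            # multiply: first subproblem
        slowsort(a, m + 1, j)        # multiply: second subproblem
        if a[m] > a[j]:
            a[m], a[j] = a[j], a[m]  # surrender: the maximum lands at a[j]
        slowsort(a, i, j - 1)        # keep multiplying on the rest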

cratermoon 2 years ago
robotresearcher 2 years ago

SleepSort is a subject of the posted article.

Sohcahtoa82 2 years ago

I like Intelligent Design sort:

> The probability of the original input list being in the exact order it's in is 1/(n!). There is such a small likelihood of this that it's clearly absurd to say that this happened by chance, so it must have been consciously put in that order by an intelligent Sorter. Therefore it's safe to assume that it's already optimally sorted in some way that transcends our naïve mortal understanding of "ascending order". Any attempt to change that order to conform to our own preconceptions would actually make it less sorted.

https://www.dangermouse.net/esoteric/intelligentdesignsort.h...
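
The implied implementation is correspondingly brief; a sketch, taking the description at its word:

    def intelligent_design_sort(a):
        # already optimally sorted by the intelligent Sorter; any change
        # would only make it less sorted
        return a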

kjeetgill 2 years ago

Ohhhhhh... fascinating! Is there a good metric for how well sorted a list is? Maybe count, for every pair of elements, how many are in the correct order?

Given a random shuffle of a list, what are the chances that another shuffle would be more sorted? There's a life lesson in here, I know it...
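
That pairwise idea is the standard one: count inversions, i.e. pairs in the wrong order. Zero means sorted, and a uniformly random shuffle averages n(n-1)/4 inversions, so by symmetry another shuffle beats the first roughly half the time (ignoring ties). A quadratic sketch:

    def inversions(a):
        n = len(a)
        return sum(a[i] > a[j] for i in range(n) for j in range(i + 1, n))

    print(inversions([2, 1, 3]))  # 1
    print(inversions([3, 2, 1]))  # 3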

michaelmior 2 years ago

Quantum bogosort is probably my personal favorite. Randomly permute the input using a quantum source of entropy. If the input is not sorted, destroy the universe. The only universe remaining is one where the input was already sorted. Runs in O(n) time.

contravariant 2 years ago

Even without destroying the universe you can sort your list with a quantum source of entropy; just use:

    def cosmicraysort(a):
        # wait for cosmic rays to flip the right bits in memory
        while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
            pass
flipflip 2 years ago

You can just kill the observer. Quantum immortality sort:

    def qisort(a):
        qshuffle(a)              # quantum shuffle: one branch per permutation
        if not is_sorted(a):
            kill_observer()
Only in a universe in which the list is sorted does the user survive. There might be no such universe, but then the user is no longer waiting on the result.
Sharlin 2 years ago

(the problem of destroying the universe left as an exercise to the reader)

wahern 2 years ago

For the operation of this algorithm, is there any meaningful difference between destroying the universe and destroying the observer(s)? https://en.wikipedia.org/wiki/Quantum_suicide_and_immortalit...

EDIT: If you allow for the objects to spontaneously sort themselves, the runtime is O(1).

drdeca 2 years ago

Surely O(n) for checking whether it is sorted?

contravariant 2 years ago

+1

btheshoe 2 years ago

(the problem of destroying which universe is also left as an exercise to the reader)

cevi 2 years ago

Unfortunately, randomly permuting the input takes O(n log n) steps (you need at least this much time just to read in sufficiently many random bits). Perhaps a clever parallel architecture could reduce the runtime?

michaelmior 2 years ago

This is not true if you can generate a random number in constant time, which is probably the more practical of the constraints to satisfy when implementing this algorithm.

forinti 2 years ago

My contribution: Power Sort.

Start with p = 0 and add 2^x for each element x. Then successively subtract the largest remaining power of 2 to read your integers out in order.

Con: p will be very big.

Pro: you don't need ifs!
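
A sketch, assuming distinct non-negative integers (the function name is mine):

    def power_sort(a):
        p = 0
        for x in a:
            p += 2 ** x               # no ifs, just addition
        out = []
        while p:
            top = p.bit_length() - 1  # largest remaining power of 2
            out.append(top)
            p -= 2 ** top
        return out[::-1]              # subtraction yields descending order

    print(power_sort([5, 1, 3]))      # [1, 3, 5]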

Paedor 2 years ago

Fun fact, when you consider the bits required to contain p, this is equivalent to bucket sort with a bucket size of 1.

Snild 2 years ago

Ooh, that's a fun one.

Another con: you'd better hope your input contains no duplicates. :)

dbaupp 2 years ago

Duplicates could also be handled by using (L+1)^x rather than 2^x, where L is the length of the input: even if all elements are identical, one still ends up with a unique value p = L * (L+1)^x.

For an efficient implementation, one might want to round L+1 up to the nearest power of 2 to get crucial micro-optimisations based on instructions for bit scanning.

(I think this ends up being a very complicated phrasing of a counting sort.)
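
A sketch of that variant: digit x of p in base L+1 counts the occurrences of x, which is why it amounts to a counting sort.

    def power_sort_dups(a):
        base = len(a) + 1
        p = 0
        for x in a:
            p += base ** x            # digit x counts copies of x; at most L < base
        out, v = [], 0
        while p:
            out.extend([v] * (p % base))
            p //= base
            v += 1
        return out

    print(power_sort_dups([2, 0, 2, 1]))  # [0, 1, 2, 2]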

forinti 2 years ago

Easily solvable with a hash.

I have thought this through (I'm ashamed to admit).

dragontamer 2 years ago

Bogosort is always a fun one, because I think it's one of the better introductions to NP-complete problems.

If you imagine that there's no "easy" algorithm to return a sorted list (but sorted lists do exist), then the only methodology possible is to try all combinations. And you do this either systematically or randomly.

Random NP-complete algorithms, such as WalkSAT, are quite good in practice. Systematic NP-complete algorithms, such as DPLL-SAT, are more "obvious" in how they work, and are sufficient for smaller sizes.
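
A sketch of both flavours, with random restarts on the WalkSAT side and exhaustive enumeration on the DPLL side:

    import itertools, random

    def is_sorted(a):
        return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

    def random_bogosort(a):           # random search, in the spirit of WalkSAT
        while not is_sorted(a):
            random.shuffle(a)
        return a

    def systematic_bogosort(a):       # exhaustive search, in the spirit of DPLL
        for p in itertools.permutations(a):
            if is_sorted(p):
                return list(p)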

scarmig 2 years ago

    # many-worlds bogosort: shuffle once, survive only in the branches
    # where the shuffle happened to work
    def everett_sort(a):
        shuffle(a)
        if is_sorted(a):
            return a
        else:
            suicide()
boothby 2 years ago

I recently found a fun linear-time systolic sorting algorithm. I'd describe the serial implementation as unconventionally obvious. It's like insertion sort, without all the bother of swapping.

  def index_sort(a):
    n = len(a)
    output = [None] * n
    for i in range(n): #can run in parallel, damn the GIL
      index = 0
      for j in range(n):
        if a[j] < a[i]:
          index += 1
        elif a[j] == a[i] and j < i:
          index += 1
      output[index] = a[i]
    return output
Snild 2 years ago

> linear-time

But... there are two nested `range(n)` for loops. Typo?

boothby 2 years ago

That's the serial implementation, which is clearly O(n^2). The systolic algorithm runs in linear time on specialty hardware (similar to a tensor core, which can do comparisons in a square matrix, and can perform row-sums). Or, if we ignore communication overheads, it can run in O(n) time with O(n) parallel workers.
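
A sketch of that matrix view, with numpy's broadcasting standing in for the specialty hardware; the comparison matrix and its row sums are exactly the ranks the serial loop computes:

    import numpy as np

    def index_sort_parallel(a):
        a = np.asarray(a)
        idx = np.arange(len(a))
        less = a[:, None] > a[None, :]     # all pairwise comparisons at once
        ties = (a[:, None] == a[None, :]) & (idx[None, :] < idx[:, None])
        ranks = (less | ties).sum(axis=1)  # row sums give each element's index
        out = np.empty_like(a)
        out[ranks] = a                     # scatter each element into place
        return out

    print(index_sort_parallel([3, 1, 2, 1]))  # [1 1 2 3]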

lifthrasiir 2 years ago

    import ctypes

    # CPython stores a small int's payload inside the object itself; this
    # overwrites each object's value in place with its list position
    def mutation_sort(a):
        for i, v in enumerate(a):
            ctypes.c_int.from_address(id(v) + 24).value = i # for x86-64, adjust the offset for your platform
        return a

    def is_sorted(a):
        prev = None
        for v in a:
            if prev is not None and prev > v: return False
            prev = v
        return True

    print(is_sorted(mutation_sort([3000, 1000, 7000, 8000, 2000])))  # True
fjfaase 2 years ago

Usage sort: sorting a sequence of numbered books while you pick books out of the sequence when you need them and insert them back afterwards (moving the books in between). What is the best strategy for where to put the books back? For a discussion see: https://www.iwriteiam.nl/Dpuzzle.html#usagesort

nickcw 2 years ago

Here is my offering

https://gist.github.com/ncw/5419af0e255d2fb62b98

It's a Go channel based quicksort. It sorts a channel of ints using O(n) goroutines! Interestingly, it doesn't need to know how many ints are in the channel at the start.

Not a practical sort method but fun to see the Go concurrency primitives in use.
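
A rough Python analogue of the idea (not the gist's code): queues stand in for channels, a sentinel stands in for close(), one thread per partition, and it likewise never needs the length up front.

    import threading, queue

    END = object()  # sentinel playing the role of closing a Go channel

    def qsort_chan(inp, out):
        pivot = inp.get()
        if pivot is END:
            out.put(END)
            return
        lo_in, lo_out = queue.Queue(), queue.Queue()
        hi_in, hi_out = queue.Queue(), queue.Queue()
        threading.Thread(target=qsort_chan, args=(lo_in, lo_out)).start()
        threading.Thread(target=qsort_chan, args=(hi_in, hi_out)).start()
        while (v := inp.get()) is not END:     # partition as values arrive
            (lo_in if v < pivot else hi_in).put(v)
        lo_in.put(END)
        hi_in.put(END)
        while (v := lo_out.get()) is not END:  # emit: smaller, pivot, larger
            out.put(v)
        out.put(pivot)
        while (v := hi_out.get()) is not END:
            out.put(v)
        out.put(END)

    src, dst = queue.Queue(), queue.Queue()
    threading.Thread(target=qsort_chan, args=(src, dst)).start()
    for x in [3, 1, 4, 1, 5, 9, 2, 6]:
        src.put(x)
    src.put(END)
    result = []
    while (v := dst.get()) is not END:
        result.append(v)
    print(result)  # [1, 1, 2, 3, 4, 5, 6, 9]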

danpalmer 2 years ago

I like BTP-Sort. It’s O(n).

Really easy: you see it, then you say it, then it’s sorted.

https://www.btp.police.uk/police-forces/british-transport-po... (this is a famous national campaign on the British railways over the past few years)

unnouinceput 2 years ago

I know this article is meant as a joke, but the first variant of bogosort, which is actually randomsort, will be the fastest algorithm once we start using quantum computing.

Take an array, allocate n == array-length qubits for it, and implement a parallel version of randomsort (no while; while is serialization!) and you will have the fastest sorting algorithm.

r0f1 2 years ago

What about this one? This one was mentioned a couple of weeks back and is really funny.

https://news.ycombinator.com/item?id=28758106

forgotmypw17 2 years ago
axiosgunnar 2 years ago

Slightly off-topic: the article was fine except for one nitpick...

I wonder what the author, "hilariously" making one sorting function print "<val> sent to gulag" and calling it "stalin-sort", would think of "hitler-sort" which prints "<val> burned in oven"?

Please don't normalize mass-murderers, even for memes.

axiosgunnar 2 years ago

If you downvote, you must write what is wrong with that statement.

karaterobot 2 years ago

It's an obvious joke, the point of which is that the Stalin Sort is destructive and doesn't actually get you what you want, and you'd never want to use it.

googlryas 2 years ago

I think OP gets the joke and is just saying it is in poor taste. Is it particularly different than their example of Hitler-sort burning stuff in an oven on failure? Or Columbine-sort, where the 15 coolest numbers get shot and then the rest of the numbers can get sorted.

karaterobot 2 years ago

But if the point is to mock the concept, how would it be considered normalizing anything?

googlryas 2 years ago

I disagree that the point was to mock stalinism. It used Stalinism to joke about this type of sorting, but that doesn't mean it was mocking Stalinism.

NotAnOtter 2 years ago

Isn't 'sleep sort' linear with respect to array size?

O(n) sort achieved?

dfox 2 years ago

It is if you ignore how the underlying OS implements multitasking, and sleep() in particular. The kernel-side implementation usually involves some kind of priority queue of sleeping threads. So taken as a whole, sleep sort is just a convoluted implementation of heap sort that outsources the actual sorting to the OS.
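
For concreteness, a minimal threaded sleep sort, assuming small non-negative numbers; the actual ordering happens in the scheduler's timer queue, per the comment above:

    import threading, time

    def sleep_sort(a):
        out, lock = [], threading.Lock()
        def emit(x):
            time.sleep(0.01 * x)     # the OS timer queue does the sorting
            with lock:
                out.append(x)
        threads = [threading.Thread(target=emit, args=(x,)) for x in a]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return out

    print(sleep_sort([3, 1, 4, 1, 5]))  # [1, 1, 3, 4, 5], scheduler willing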

dmurray 2 years ago

The kernel could use a different implementation of multitasking, e.g. keep the threads in a circular doubly linked list instead of a priority queue. This wouldn't be optimal for a general-purpose OS, but works for our custom O(n) sorting machine.

dfox 2 years ago

Then the resulting complexity would be O(n^2). With some kind of heap-like priority queue, as used in a typical general-purpose OS, you get O(n log n) or something close to that.

thehappypm 2 years ago

I really like sleep sort because it trades (fundamentally) cycles and memory for time.

In theory you could do this in hardware. Hardware timers are pretty trivial; you could imagine a hardware chip that accepts a number, then writes the value to the next free slot in an array when the time has elapsed. As timers go off, the array fills up. Obviously you need to handle collisions and whatnot, but nothing in this screams that you need to go O(n^2).

Sharlin 2 years ago

You need n separate programmable timers, though. If you only have a constant number of timers, then you’re back to implementing something approaching a priority queue to schedule the events (possibly in hardware if you want).

boothby 2 years ago

It's perversely linear with respect to the largest element in the array, if that's greater than the runtime of the underlying heapsort. Good luck with negative entries.

Response to dead comment:

> Couldn't this be solved by scaling everything by the largest entry? Of course that requires first a pass to find the largest entry, but that's not as expensive.

One presumes that we don't have infinite precision timers, so you should only scale it down so the smallest gap between elements is a few clock cycles (unless you're satisfied with breaking the fiction that sleepsort is not heapsort). This seems like a mildly interesting question. My intuition is that finding the smallest abs(a[i]-a[j]) with i != j should be just as hard as sorting the list on a classical computer, but it seems like a quantum computer might find the smallest gap faster.

kruxigt 2 years ago

Couldn't this be solved by scaling everything by the largest entry? Of course that requires first a pass to find the largest entry, but that's not as expensive.

throw3849 2 years ago

Australia sorting algorithm:

Stay at your index!!!

DeathArrow 2 years ago

    for i = 1 to n do
        for j = 1 to n do
            if A[i] < A[j] then
                swap A[i] and A[j]
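
The same loop in runnable Python: this is the "I can't believe it can sort" algorithm, and despite both loops running over the whole array, it does sort ascending.

    def naive_sort(a):
        n = len(a)
        for i in range(n):
            for j in range(n):
                if a[i] < a[j]:
                    a[i], a[j] = a[j], a[i]
        return a

    print(naive_sort([5, 2, 9, 1]))  # [1, 2, 5, 9]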

unnouinceput 2 years ago

Fastest swap implementation, in assembler:

    PUSH A[j]; PUSH A[i]; POP A[j]; POP A[i]

fauscist1984 2 years ago

NIH admits Fauci lied about funding Wuhan gain-of-function experiments

https://www.washingtonexaminer.com/opinion/nih-admits-fauci-...

WJW 2 years ago

Not related to sorting algorithms.

edgyquant 2 years ago

This has been posted in multiple threads today

t34zesrg 2 years ago

NIH admits Fauci lied to Congress and funded COVID-19:

https://www.nationalreview.com/news/nih-admits-to-funding-ga...