Programming

Is there a sorting algorithm worse than Bogosort (a.k.a. Monkey Sort)?

lottogame 2020. 5. 29. 07:54

Is there a sorting algorithm worse than Bogosort (a.k.a. Monkey Sort)? [closed]


This morning my co-workers took me back to my university days with a discussion of sorting algorithms. We reminisced about favorites like StupidSort, and one of us was sure we had once seen a sorting algorithm that was O(n!). That got me looking around for the "worst" sorting algorithms I could find.

We postulated that a completely random sort would be pretty bad (i.e. randomize the elements - are they in order? No? Randomize them again), and when I looked around I found that it is apparently called BogoSort, or Monkey Sort, or sometimes just Random Sort.

Monkey Sort appears to have a worst-case performance of O(∞), a best-case performance of O(n), and an average performance of O(n·n!).

Are there any named algorithms with an average performance worse than O(n·n!)? Or that are just generally sillier than Monkey Sort?
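For reference, the Bogosort being discussed fits in a few lines of Python (the function names here are my own):

```python
import random

def is_sorted(a):
    """True if a is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogosort(a):
    """Shuffle until sorted: best case O(n), average O(n * n!), worst case unbounded."""
    a = list(a)
    while not is_sorted(a):
        random.shuffle(a)
    return a
```

For three elements the expected number of shuffles is only 3! = 6, so `bogosort([3, 1, 2])` returns almost instantly; the same function on 20 elements would take centuries on average.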


From David Morgan-Mar's Esoteric Algorithms page: Intelligent Design Sort

Introduction

Intelligent Design Sort is a sorting algorithm based on the theory of intelligent design.

Description of the Algorithm

The probability of the original input list being in the exact order it is in is 1/(n!). There is such a small likelihood of this that it is clearly absurd to say this happened by chance, so it must have been consciously put in that order by an intelligent Sorter. Therefore it is safe to assume that it is already optimally sorted in some way that transcends our naive mortal understanding of "ascending order". Any attempt to change that order to conform to our own preconceptions would actually make it less sorted.

Analysis

This algorithm is constant in time and sorts the list in place, requiring no additional memory at all. In fact, it doesn't even require any of that suspicious technological computer stuff. Praise the Sorter!

Feedback

Gary Rogers writes:

Making the sort constant in time denies the power of The Sorter. The Sorter exists outside of time, and thus the sort is timeless. Requiring time to validate the sort diminishes the role of the Sorter. Thus... this particular sort is flawed, and cannot be attributed to "The Sorter".

Heresy!


Many years ago, I invented MiracleSort (but never actually implemented it):

Start with an array in memory.
loop:
    Check to see whether it's sorted.
    Yes? We're done.
    No? Wait a while and check again.
end loop

Eventually, alpha particles flipping bits in the memory chips should result in a successful sort.

For greater reliability, copy the array to a shielded location, and check potentially sorted arrays against the original.

So how do you check the potentially sorted array against the original? You just sort each array and check whether they match. MiracleSort is the obvious algorithm to use for this step.

EDIT: Strictly speaking, this is not an algorithm, since it is not guaranteed to terminate. Does "not an algorithm" qualify as "a worse algorithm"?
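As a toy illustration, the cosmic-ray step can be simulated with explicit random bit flips; the bit width, the one-flip-per-check model, and the iteration cap below are all my own assumptions, not part of the original idea:

```python
import random

def is_sorted(a):
    """True if a is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def miracle_sort(a, bits=3, max_checks=1_000_000):
    """Wait for 'alpha particles' (random bit flips) to sort the array.
    Returns (succeeded, array); gives up after max_checks checks."""
    a = list(a)
    for _ in range(max_checks):
        if is_sorted(a):
            return True, a
        # one alpha particle strikes: flip a random bit of a random element
        i = random.randrange(len(a))
        a[i] ^= 1 << random.randrange(bits)
    return False, a
```

Note that, true to the joke, the "sorted" result need not be a permutation of the input: the particles change the values themselves.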


Quantum Bogosort

A sorting algorithm that assumes the many-worlds interpretation of quantum mechanics is correct:

  1. Check that the list is sorted. If not, destroy the universe.

At the conclusion of the algorithm, the list will be sorted in the only universe left standing. This algorithm takes worst-case O(N) and average-case O(1) time. In fact, the average number of comparisons performed is 2: there is a 50% chance the universe will be destroyed on the second element, a 25% chance it will be destroyed on the third, and so on.
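On classical hardware the many-worlds branching can only be faked; the sketch below enumerates every "universe" (one per permutation) and returns the list from the sole survivor (function name is mine):

```python
from itertools import permutations

def quantum_bogosort(a):
    """Branch into one universe per permutation of a; every universe whose
    list is unsorted is 'destroyed'. Return the survivor's list."""
    for universe in permutations(a):
        if all(universe[i] <= universe[i + 1] for i in range(len(universe) - 1)):
            return list(universe)  # the only universe not destroyed
```

Classically this degenerates to O(n·n!) work; only a true many-worlds machine gets the advertised O(N) worst case.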


I'm surprised no one has mentioned sleepsort yet... or have I just not noticed it? Anyway:

#!/bin/bash
function f() {
    sleep "$1"
    echo "$1"
}
while [ -n "$1" ]
do
    f "$1" &
    shift
done
wait

Example usage:

./sleepsort.sh 5 3 6 3 6 3 1 4 7
./sleepsort.sh 8864569 7

In terms of performance it is terrible (especially the second example). Waiting almost 3.5 months to sort 2 numbers is pretty bad.
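The same trick ports to Python with one thread per value; the time unit is scaled down to 50 ms per unit (my own assumption) so the demo doesn't take months:

```python
import threading
import time

def sleep_sort(values, unit=0.05):
    """Start one thread per value; each sleeps value*unit seconds and then
    appends its value. Smaller values wake first, so the output is sorted."""
    out = []
    lock = threading.Lock()

    def worker(v):
        time.sleep(v * unit)
        with lock:
            out.append(v)

    threads = [threading.Thread(target=worker, args=(v,)) for v in values]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return out
```

As with the shell version, values that are close together can race, so this is a party trick rather than a sort.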


Jingle Sort, as described here.

You give each value in your list to a different child for Christmas. The children, horrible human beings that they are, will compare the value of their gifts and sort themselves accordingly.


I had a lecturer who once suggested generating a random array, checking whether it was sorted, and then checking whether the data was the same as the array to be sorted.

Best case O(N) (first time, baby!); worst case O(Never).


As long as you keep the algorithm meaningful in any way, O(n!) is the worst upper bound you can achieve.

Since checking each permutation of a set to see whether it is sorted takes n! steps, you can't get any worse than that.
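That n! bound can be demonstrated directly with a "try every permutation" sort that also counts how many permutations it had to check (a sketch; names are mine):

```python
from itertools import permutations
from math import factorial

def permutation_sort(a):
    """Try permutations one by one until a sorted one appears.
    Returns (sorted list, number of permutations checked); the worst
    case checks all n! of them."""
    checked = 0
    for p in permutations(a):
        checked += 1
        if all(p[i] <= p[i + 1] for i in range(len(p) - 1)):
            return list(p), checked
```

For a reverse-sorted 3-element input, itertools happens to emit the sorted permutation last, so all 3! = 6 candidates get checked.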

If you're doing more steps than that then the algorithm has no real useful purpose. Not to mention the following simple sorting algorithm with O(infinity):

list = someList
while (list not sorted):
    doNothing

You should do some research into the exciting field of Pessimal Algorithms and Simplexity Analysis. These authors work on the problem of developing a sort with a pessimal best-case (your bogosort's best case is Omega(n), while slowsort (see paper) has a non-polynomial best-case time complexity).


Bogobogosort. Yes, it's a thing. To Bogobogosort, you Bogosort the first element. Check to see whether that one element is sorted. Being one element, it will be. Then you add the second element, and Bogosort those two until they're sorted. Then you add one more element, then Bogosort. Continue adding elements and Bogosorting until you have finally done every element. This was designed never to succeed with any sizable list before the heat death of the universe.
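A sketch of that growing-prefix scheme in Python (this version omits the restart-from-scratch refinement described elsewhere in the thread; names are mine):

```python
import random

def is_sorted(a):
    """True if a is in non-decreasing order."""
    return all(a[i] <= a[i + 1] for i in range(len(a) - 1))

def bogobogosort(a):
    """Bogosort a growing prefix: once the first k elements are sorted,
    extend to k+1 elements and bogosort that prefix, until done."""
    a = list(a)
    for k in range(2, len(a) + 1):
        prefix = a[:k]
        while not is_sorted(prefix):
            random.shuffle(prefix)
        a[:k] = prefix
    return a
```

Even without the restart rule, the final prefix alone is a full bogosort of n elements, so the expected cost is already super-exponential.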


Here are two sorts my roommate and I came up with in college:

1) Check the order
2) Maybe a miracle happened; go to 1

and

1) Check if it is in order; if not,
2) Put each element into a packet and bounce it off a distant server back to yourself. Some of those packets will return in a different order, so go to 1


There is a sort called bogobogosort. First, it checks the first 2 elements and bogosorts them. Next it checks the first 3, bogosorts them, and so on. Should the list be out of order at any time, it restarts by bogosorting the first 2 again. Regular bogosort has an average complexity of O(N!); this algorithm has an average complexity of O(N! · 1! · 2! · 3! ... N!).

Edit: To give you an idea of how large this number is, for 20 elements this algorithm takes an average of 3.930093*10^158 years, well above the proposed heat death of the universe (if it happens) of 10^100 years, whereas merge sort takes around 0.0000004 seconds, bubble sort 0.0000016 seconds, and bogosort takes 308 years, 139 days, 19 hours, 35 minutes, 22.306 seconds, assuming a year is 365.242 days and a computer does 250,000,000 32-bit integer operations per second.

Edit 2: This algorithm is not as slow as the "algorithm" miracle sort, which probably, like this sort, will get the computer sucked into a black hole before it successfully sorts 20 elements. But if it did succeed, I would estimate an average complexity of 2^(32·N) (32 being the number of bits in a 32-bit integer, N the number of elements) times a number <= 10^40 years, since gravity speeds up the chips' alpha-particle flipping and there are 2^N states; that is about 2^640 * 10^40, or roughly 5.783*10^216.762162762 years. Though if the list started out sorted, its complexity would only be O(N), faster than merge sort, which is N log N even in the worst case.

Edit 3: This algorithm is actually slower than miracle sort as the size gets very big, say 1000, since my algorithm would have a run time of 2.83*10^1175546 years, while the miracle sort algorithm would have a run time of 1.156*10^9657 years.


There's always the Bogobogosort (Bogoception!). It performs Bogosort on increasingly large subsets of the list, and then starts all over again if the list is ever not sorted.

for (int n=1; n < length(list); ++n) {
  while (!isInOrder(list, 0, n)) {
    shuffle(list, 0, n);
  }
  if (!isInOrder(list, 0, n+1)) { n=0; }   /* any disorder? start all over */
}

1. Put your items to be sorted on index cards.
2. Throw them into the air on a windy day, a mile from your house. (Revised: throw them into a bonfire and confirm they are completely destroyed.)
3. Check your kitchen floor for the correct ordering.
4. Repeat if it's not the correct order.

Best case scenario is O(∞)

Edit above based on astute observation by KennyTM.


The "what would you like it to be?" sort

  1. Note the system time.
  2. Sort using Quicksort (or anything else reasonably sensible), omitting the very last swap.
  3. Note the system time.
  4. Calculate the required time. Extended precision arithmetic is a requirement.
  5. Wait the required time.
  6. Perform the last swap.

Not only can it implement any conceivable O(x) value short of infinity, the time taken is provably correct (if you can wait that long).
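A rough sketch of the idea, with sleep-padding standing in for "omitting the very last swap" (the function name and that shortcut are my own simplifications):

```python
import time

def sort_in_exactly(values, target_seconds):
    """Sort quickly, then stall until the requested wall-clock time has
    passed, so the sort 'takes' any complexity you like."""
    t0 = time.perf_counter()
    result = sorted(values)          # anything reasonably sensible
    remaining = target_seconds - (time.perf_counter() - t0)
    if remaining > 0:
        time.sleep(remaining)        # wait out the required time
    return result
```

`sort_in_exactly(data, c * len(data) ** 3)` gives you a bespoke O(n^3) sort, provably, if you can wait that long.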


Nothing can be worse than infinity.


Bozo sort is a related algorithm that checks if the list is sorted and, if not, swaps two items at random. It has the same best and worst case performances, but I would intuitively expect the average case to be longer than Bogosort. It's hard to find (or produce) any data on performance of this algorithm.
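Bozo sort is simple enough to sketch (the function name is mine):

```python
import random

def bozosort(a):
    """Check if sorted; if not, swap two random positions and repeat."""
    a = list(a)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        i, j = random.randrange(len(a)), random.randrange(len(a))
        a[i], a[j] = a[j], a[i]
    return a
```

Unlike bogosort's full reshuffle, each step here moves only two elements, which is why its average case is plausibly longer: a single random swap is far less likely to fix a nearly-scrambled list than a fresh permutation draw is to be sorted.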


Segments of π

Assume π contains all possible finite number combinations. See math.stackexchange question

  1. Determine the number of digits needed from the size of the array.
  2. Use segments of π places as indexes to determine how to re-order the array. If a segment exceeds the size boundaries for this array, adjust the π decimal offset and start over.
  3. Check if the re-ordered array is sorted. If it is woot, else adjust the offset and start over.

A worst case performance of O(∞) might not even make it an algorithm according to some.

An algorithm is just a series of steps, and you can always do worse by tweaking it a little to get the desired output in more steps than it was previously taking. One could purposely build knowledge of the number of steps taken into the algorithm and make it terminate and produce the correct output only after X steps have been done. That X could very well be on the order of O(n^2) or O(n^(n!)) or whatever you want. That would effectively increase its best-case as well as average-case bounds.

But your worst-case scenario cannot be topped :)


My favorite slow sorting algorithm is the stooge sort:

/* helper: exchange two longs (used by both listings below) */
void swap(long *a, long *b) { long t = *a; *a = *b; *b = t; }

void stooges(long *begin, long *end) {
   if( (end-begin) <= 1 ) return;
   if( begin[0] > end[-1] ) swap(begin, end-1);  /* put the outer pair in ascending order */
   if( (end-begin) > 2 ) {                       /* recurse on overlapping 2/3 slices */
      int one_third = (end-begin)/3;
      stooges(begin, end-one_third);
      stooges(begin+one_third, end);
      stooges(begin, end-one_third);
   }
}

The worst case complexity is O(n^(log(3) / log(1.5))) = O(n^2.7095...).

Another slow sorting algorithm is actually named slowsort!

void slow(long *start, long *end) {
   if( (end-start) <= 1 ) return;
   long *middle = start + (end-start)/2;
   slow(start, middle);                               /* recursively sort each half... */
   slow(middle, end);
   if( middle[-1] > end[-1] ) swap(middle-1, end-1);  /* ...move the maximum to the end... */
   slow(start, end-1);                                /* ...then slowsort all but the last */
}

This one takes O(n ^ (log n)) in the best case... even slower than stoogesort.


Recursive Bogosort (probably still O(n!)):

RecursiveBogosort(list):
    if (list not sorted):
        list1 = first half of list
        list2 = second half of list
        RecursiveBogosort(list1)
        RecursiveBogosort(list2)
        list = list1 + list2
        while (list not sorted):
            shuffle(list)

This page is an interesting read on the topic: http://home.tiac.net/~cri_d/cri/2001/badsort.html

My personal favorite is Tom Duff's sillysort:

/*
 * The time complexity of this thing is O(n^(a log n))
 * for some constant a. This is a multiply and surrender
 * algorithm: one that continues multiplying subproblems
 * as long as possible until their solution can no longer
 * be postponed.
 */
void sillysort(int a[], int i, int j){
        int t, m;
        for(;i!=j;--j){
                m=(i+j)/2;
                sillysort(a, i, m);
                sillysort(a, m+1, j);
                if(a[m]>a[j]){ t=a[m]; a[m]=a[j]; a[j]=t; }
        }
}

Double bogosort

Bogosort twice and compare the results (just to be sure it is sorted); if they don't match, do it again.


You could make any sort algorithm slower by running your "is it sorted" step randomly. Something like:

  1. Create an array of booleans the same size as the array you're sorting. Set them all to false.
  2. Run an iteration of bogosort
  3. Pick two random elements.
  4. If the two elements are sorted in relation to each other (i < j && array[i] < array[j]), mark the indexes of both on the boolean array to true. Otherwise, start over.
  5. Check if all of the booleans in the array are true. If not, go back to 3.
  6. Done.
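The steps above can be sketched as follows (names are mine; note that, as described, the spot-check verification is probabilistic and can occasionally bless a list that isn't actually sorted, which is part of the joke):

```python
import random

def paranoid_bogosort(a):
    """Bogosort, then 'verify' with random pairwise spot checks: accept
    only once every index has been vouched for by some in-order pair."""
    a = list(a)
    if len(a) < 2:
        return a
    while True:
        random.shuffle(a)                       # step 2: one bogosort iteration
        confirmed = [False] * len(a)            # step 1: all false
        while not all(confirmed):               # step 5
            i, j = sorted(random.sample(range(len(a)), 2))  # step 3
            if a[i] > a[j]:                     # step 4: out of order -> start over
                break
            confirmed[i] = confirmed[j] = True
        else:
            return a                            # step 6
```

The inner loop is a coupon-collector problem on the index flags, so even the "verification" alone costs O(n log n) expected checks per shuffle.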

Yes, SimpleSort. In theory it runs in O(-1); however, this is equivalent to O(...9999), which is in turn equivalent to O(∞ - 1), which as it happens is also equivalent to O(∞). Here is my sample implementation:

/* element sizes are unneeded, they are assumed */
void
simplesort (const void* begin, const void* end)
{
  for (;;);
}

One I was just working on involves picking two random points, and if they are in the wrong order, reversing the entire subrange between them. I found the algorithm on http://richardhartersworld.com/cri_d/cri/2001/badsort.html, which says that the average case is probably somewhere around O(n^3) or O(n^2 log n) (he's not really sure).

I think it might be possible to do it more efficiently, because I think it might be possible to do the reversal operation in O(1) time.

Actually, I just realized that this might not work out the way I said, because the data structure I had in mind would put accessing random elements at O(log n) and determining whether a subrange needs reversing at O(n).


Randomsubsetsort.

Given an array of n elements, choose each element with probability 1/n, randomize these elements, and check if the array is sorted. Repeat until sorted.

Expected time is left as an exercise for the reader.
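A sketch of Randomsubsetsort, under the assumption that the chosen elements are shuffled among their own positions (names are mine):

```python
import random

def random_subset_sort(a):
    """Pick each index with probability 1/n, shuffle the chosen elements
    among their own positions, and repeat until the array is sorted."""
    a = list(a)
    n = len(a)
    while any(a[i] > a[i + 1] for i in range(n - 1)):
        idx = [i for i in range(n) if random.random() < 1 / n]
        vals = [a[i] for i in idx]
        random.shuffle(vals)
        for i, v in zip(idx, vals):
            a[i] = v
    return a
```

Since the expected subset size is just 1, most rounds change nothing at all, which is a promising start on a truly bad expected time.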

Reference URL: https://stackoverflow.com/questions/2609857/are-there-any-worse-sorting-algorithms-than-bogosort-a-k-a-monkey-sort
