Hi guys,
I am really sorry for going off-topic, and I hope you will not kill me, but this is a vitally important problem for me that needs to be solved within the next 12 hours.
I have to create a stable algorithm for sorting n numbers from the interval [1, n^2] with time complexity O(n).
Can someone please give me a hint? I would be very, very thankful!
Thanks in advance! D.
On Sat, 2008-12-13 at 19:24 +0100, David Hláčik wrote:
Hi guys,
I am really sorry for going off-topic, and I hope you will not kill me, but this is a vitally important problem for me that needs to be solved within the next 12 hours.
I have to create a stable algorithm for sorting n numbers from the interval [1, n^2] with time complexity O(n).
Can someone please give me a hint? I would be very, very thankful!
Merge sort? Insertion sort? Selection sort? timsort?
Well, I need something with linear complexity, O(n), which I have to prove. Merge sort, insertion sort, and selection sort do not have O(n) complexity.
I believe something like radix sort, counting sort, or bucket sort is the way to go, although I am not sure.
Thanks!
D.
David Hláčik wrote:
Well, I need something with linear complexity, O(n), which I have to prove. Merge sort, insertion sort, and selection sort do not have O(n) complexity.
I believe something like radix sort, counting sort, or bucket sort is the way to go, although I am not sure.
I'm not especially inclined to do someone's homework for them. But since you asked for a hint: my mathematical intuition suggests starting by mapping the data into an array of n buckets, where each bucket is determined by the integer part of the square root of each number in the dataset.
-- Benjamin Franz
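A minimal sketch of that square-root bucketing hint, in Python (my own illustration, not code from the thread; the function name is made up, and whether the overall bound really comes out to O(n) depends on how the individual buckets are then sorted, which the hint leaves open):

import math

def sqrt_bucket_sort(numbers):
    "Group values from [1, n^2] into buckets keyed by the integer part of their square root."
    n = len(numbers)
    buckets = [[] for i in xrange(n + 1)]     # int(sqrt(x)) lies in [1, n] for x in [1, n^2]
    for x in numbers:
        buckets[int(math.sqrt(x))].append(x)  # appending preserves input order (stable)
    result = []
    for b in buckets:
        result.extend(sorted(b))              # each bucket still needs to be sorted internally
    return result

print sqrt_bucket_sort([10, 3, 60, 3, 55, 16, 49, 2])   # -> [2, 3, 3, 10, 16, 49, 55, 60]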
David Hláčik wrote:
Well, I need something with linear complexity, O(n), which I have to prove. Merge sort, insertion sort, and selection sort do not have O(n) complexity.
I believe something like radix sort, counting sort, or bucket sort is the way to go, although I am not sure.
Radix is O(n ln n), so that's a no-go for you. I'm not familiar with the other names. However, what you need is what is called the "generation counter" algorithm (that's its name, afaik).
Basically, count the number of appearances of every number in your set. If you have a set a priori bounded from above and below --- which you do, [1, n^2] --- you first allocate an array of integers of length n^2. Set all elements to zero, go through your set that is to be sorted, and whenever you find a number k go and increment the k-th element of your generation array. You need one for-loop to go through your set, and after that another for loop to print the output from the generation set --- go through the whole set from 1 to n^2, and if the value of k-th element is nonzero, print number k appropriate number of times.
For example, if you have the set {1,8,5,3,5,9,6,2,2,4,3,5} and you are told that items are always in the interval [1,10], you form a generation set:
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |  <- labels
| 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |  0 |  <- initial values
and you fill it by counting the number of appearances for every element of your set (first for-loop):
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |  <- labels
| 1 | 2 | 2 | 1 | 3 | 1 | 0 | 1 | 1 |  0 |  <- final values
Then you go through it and print each "label" its "field value" number of times (second for-loop):
{1,2,2,3,3,4,5,5,5,6,8,9}.
The algorithm basically consists of two consecutive for-loops, which should make its complexity O(n), afaik. However, I'll leave the proof to you. ;-)
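A direct transcription of those two loops into Python (my own sketch; the function name and the explicit upper-bound argument are mine, not part of the post), using the example set above:

def generation_count_sort(numbers, upper):
    "Sort positive integers known to lie in [1, upper] by counting occurrences."
    counts = [0] * (upper + 1)          # one counter per possible value, all zero
    for k in numbers:                   # first for-loop: count the appearances
        counts[k] += 1
    result = []
    for k in xrange(1, upper + 1):      # second for-loop: emit each label counts[k] times
        result.extend([k] * counts[k])  # note: this loop visits all `upper` counters
    return result

print generation_count_sort([1, 8, 5, 3, 5, 9, 6, 2, 2, 4, 3, 5], 10)
# -> [1, 2, 2, 3, 3, 4, 5, 5, 5, 6, 8, 9]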
Btw, it is a well-known fact that given a set of n elements and no other information, the best sorting algorithm is going to be at least O(n ln n) or worse. However, if you are given some other information about your set (like the extrema of its image or some such info), you can improve the sorting speed, O(n) being the linear, best-case scenario. The generation counter algorithm is a textbook example of this.
Or you can wait for quantum computers to be invented. They have more powerful algorithms, due to the use of qubits instead of bits... ;-)
HTH, :-) Marko
Marko Vojinovic wrote: [...]
Basically, count the number of appearances of every number in your set. If you have a set a priori bounded from above and below --- which you do, [1, n^2] --- you first allocate an array of integers of length n^2.
By definition, your proposed algorithm is O(n^2), not O(n).
;)
On Sunday 14 December 2008 03:33, Jerry Franz wrote:
Marko Vojinovic wrote: [...]
Basically, count the number of appearances of every number in your set. If you have a set a priori bounded from above and below --- which you do, [1, n^2] --- you first allocate an array of integers of length n^2.
By definition, your proposed algorithm is O(n^2), not O(n).
;)
Oh, you mean because the upper bound is n^2, right? Sure, of course, this particular case is O(n^2). Your proposal in your other post with the square roots would probably improve that in this case.
However, I was just giving the OP a hint in the general direction of a typical O(n) algorithm, didn't have an intention to provide a full working solution for his specific case. It's his homework, not mine. ;-)
:-) Marko
Marko Vojinovic wrote:
On Sunday 14 December 2008 03:33, Jerry Franz wrote:
[...] By definition, your proposed algorithm is O(n^2), not O(n).
;)
Oh, you mean because the upper bound is n^2, right? Sure, of course, this particular case is O(n^2). Your proposal in your other post with the square roots would probably improve that in this case.
However, I was just giving the OP a hint in the general direction of a typical O(n) algorithm, didn't have an intention to provide a full working solution for his specific case. It's his homework, not mine. ;-)
Fair enough.
Jerry Franz wrote:
Marko Vojinovic wrote: [...]
Basically, count the number of appearances of every number in your set. If you have a set a priori bounded from above and below --- which you do, [1, n^2] --- you first allocate an array of integers of length n^2.
By definition, your proposed algorithm is O(n^2), not O(n).
No it isn't, it's O(n) in time. O(n^2) in memory but that wasn't the question, right?
Nicolas Thierry-Mieg wrote:
Jerry Franz wrote:
Marko Vojinovic wrote: [...]
Basically, count the number of appearances of every number in your set. If you have a set a priori bounded from above and below --- which you do, [1, n^2] --- you first allocate an array of integers of length n^2.
By definition, your proposed algorithm is O(n^2), not O(n).
No it isn't, it's O(n) in time. O(n^2) in memory but that wasn't the question, right?
Look closer at it.
"[...] you first allocate an array of integers of length n^2. Set all elements to zero," [...] go through the whole set from 1 to n^2, and if the value of k-th element is nonzero, print number k appropriate number of times. [...]"
O(n^2) operations are required. It is O(n^2) both in time and memory as described.
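To make that concrete (the numbers below are mine, purely illustrative): with the range bound at n^2, the counting pass and the output pass scale very differently.

n = 1000                  # number of inputs
upper = n * n             # range bound from the problem statement, [1, n^2]
counting_steps = n        # one counter increment per input value
scanning_steps = upper    # the output pass visits every one of the n^2 counters
print counting_steps, scanning_steps   # 1000 vs. 1000000 -- the scan dominates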
Jerry Franz wrote:
O(n^2) operations are required. It is O(n^2) both in time and memory as described.
Oh, I see. Thanks for explaining!
Thank you guys for the help and support! My homework is done and waiting to be graded.
Here it comes: bucket sort with time complexity O(n), i.e. linear complexity.
#! /usr/bin/python

def sort(numbers):
    "sort n positive integers in O(n) provided that they are all from interval [1, n^2]"
    N = len(numbers)    # get size of test numbers

    buckets_mod = [[] for i in xrange(N)]
    buckets_sorted = [[] for i in xrange(N+1)]

    # group numbers to buckets (list of numbers) with common modulus
    for n in numbers:
        buckets_mod[n % N].append(n)
    print "buckets_mod: %s" % buckets_mod

    # check numbers in buckets
    for l in buckets_mod:
        for n in l:
            # place number into bucket number grouped by result of division
            buckets_sorted[n / N].append(n)
    print "buckets_sorted: %s" % buckets_sorted

    # search through sorted buckets and return list of sorted numbers
    return [n for l in buckets_sorted for n in l]
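A quick sanity check (the test data here is mine, not part of David's post):

numbers = [9, 1, 16, 4, 7, 3, 3, 12]   # n = 8 values drawn from [1, 64]
print sort(numbers)                    # -> [1, 3, 3, 4, 7, 9, 12, 16]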
Regards,
David
On Sat, Dec 13, 2008 at 10:24 AM, David Hláčik david@hlacik.eu wrote:
Hi guys,
I am really sorry for going off-topic, and I hope you will not kill me, but this is a vitally important problem for me that needs to be solved within the next 12 hours.
I have to create a stable algorithm for sorting n numbers from the interval [1, n^2] with time complexity O(n).
Can someone please give me a hint? I would be very, very thankful!
Knuth.
The Art of Computer Programming, Volume 3: Sorting and Searching (2nd Edition) by Donald E. Knuth