I am writing a demo class in Java to analyze the following sorting algorithms:
- InsertionSort
- SelectionSort
- BubbleSort
- MergeSort
- QuickSort
which I have implemented as static methods in another class named Sort.
I want to compare the best, average and worst case of each algorithm by relating the measured runtime to the analytical complexity in Big-O notation.
In the demo class I only want to measure the time (in nanoseconds) each algorithm needs to sort an int array of various lengths, with the element order arranged for the best, average and worst case.
//Best-Case
int[] arrbc0 = {1};
int[] arrbc1 = {1, 2};
int[] arrbc2 = {1, 2, 3};
int[] arrbc3 = {1, 2, 3, 4, 5};
int[] arrbc4 = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
int[] arrbc5 = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15};
//Average-Case
int[] arrac1 = {1, 2};
int[] arrac2 = {3, 1, 2};
int[] arrac3 = {4, 2, 3, 1, 5};
int[] arrac4 = {9, 1, 10, 6, 2, 4, 8, 3, 7, 5};
int[] arrac5 = {13, 12, 1, 15, 5, 6, 7, 2, 14, 10, 3, 8, 4, 9, 11};
//Worst-Case
int[] arrwc1 = {2, 1};
int[] arrwc2 = {3, 2, 1};
int[] arrwc3 = {5, 4, 3, 2, 1};
int[] arrwc4 = {10, 9, 8, 7, 6, 5, 4, 3, 2, 1};
int[] arrwc5 = {15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1};
//InsertionSort:
isNanoTime(arrbc0); //first load
isNanoTime(arrbc1);
isNanoTime(arrbc2);
//...
public static void isNanoTime(int[] arr) {
    long a1 = System.nanoTime();
    Sort.insertionSort(arr);
    long a2 = System.nanoTime() - a1; // elapsed time in nanoseconds
    System.out.println(a2);
}
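One caveat: after the first call the array passed in is already sorted, so calling isNanoTime twice on the same reference no longer measures the original order. A possible refinement (only a sketch; the method name and the run count are chosen here for illustration) copies the input for every run and prints the median of several measurements:
public static void isNanoTimeMedian(int[] arr, int runs) {
    long[] times = new long[runs];
    for (int r = 0; r < runs; r++) {
        // copy, so every run sorts the array in its original order
        int[] copy = java.util.Arrays.copyOf(arr, arr.length);
        long start = System.nanoTime();
        Sort.insertionSort(copy);
        times[r] = System.nanoTime() - start;
    }
    java.util.Arrays.sort(times);
    System.out.println(times[runs / 2]); // the median is less noisy than a single run
}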
Now I have some questions:
- Can I use these arrays for the best, average and worst case of all of these algorithms, or does e.g. MergeSort have a different worst-case order?
- Is there an easy way to shuffle ("unsort") the arrays after they have been sorted once?
- Is this a reasonable way to examine the time complexity at all, or does someone have a better idea?
- It can (sort of) demonstrate time complexity, but I don't think it can determine it. Execution time and time complexity are related, but different animals. – hatchet, Jan 2, 2015
- This is going to be a tough one for many reasons, not least of which is stackoverflow.com/questions/504103/… – NPE, Jan 2, 2015
3 Answers
- Your arrays are far too short: any modern CPU will sort them in next to no time, even in your worst case.
- To see meaningful differences in runtime depending on how shuffled the input is, you need a fixed input size that is large enough to give measurable times (probably on the order of seconds).
- You probably need to generate a set of thousands of random arrays and add a few specific arrays to it (sorted, reverse-sorted, ...). Then run each algorithm on every array in the set and measure the time needed to sort it. That way you get a nice distribution graph for each algorithm, from which you can read off its behavior (bubble sort blows up while heapsort stays fairly stable, ...). The worst input for one algorithm is not necessarily the worst for another, hence the set. A rough sketch of such a loop follows below.
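A rough sketch of what such a benchmark loop could look like (Sort.insertionSort is taken from the question; the size, the sample count and the class name are arbitrary choices that need tuning):
import java.util.Arrays;
import java.util.Random;

public class SortBenchmark {
    public static void main(String[] args) {
        int n = 10_000;       // fixed input size, large enough for measurable times
        int samples = 100;    // number of random inputs; increase for a smoother distribution
        Random rnd = new Random();

        long[] times = new long[samples];
        for (int s = 0; s < samples; s++) {
            int[] arr = rnd.ints(n).toArray();    // one random input
            long start = System.nanoTime();
            Sort.insertionSort(arr);
            times[s] = System.nanoTime() - start;
        }
        Arrays.sort(times);
        System.out.println("insertion sort, median ns: " + times[samples / 2]);
        // Repeat the same loop for the other algorithms (ideally on copies of the
        // same inputs) and add a few hand-made inputs (sorted, reverse-sorted, ...)
        // to the set to catch algorithm-specific worst cases.
    }
}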
@MBo @Jean Logeart
What do you think of this:
//Main:
for (int n = 100_000; n <= 1_000_000; n += 100_000) {
    // e.g. the average case of insertion sort:
    int[] arr = randomArray(n);
    insertionSortWithRuntime(arr);
}
/**
 * For best cases: an already sorted array.
 * @param n the length of the array to create.
 * @return an ascending array of length n.
 */
public static int[] sortedArray(int n) {
    int[] arr = new int[n];
    for (int i = 0; i < n; i++) {
        arr[i] = i;
    }
    return arr;
}
/**
 * For average cases: an array of random numbers.
 * @param n the length of the array to create.
 * @return an array of length n filled with random values.
 */
public static int[] randomArray(int n) {
    int[] arr = new int[n];
    for (int i = 0; i < n; i++) {
        arr[i] = (int) (Math.random() * 9 + 1); // random values from 1 to 9
    }
    return arr;
}
/**
 * For worst cases: a reverse-sorted array.
 * @param n the length of the array to create.
 * @return a descending array of length n.
 */
public static int[] reversedSortedArray(int n) {
    int[] arr = new int[n];
    int value = n - 1;
    for (int i = 0; i < n; i++) {
        arr[i] = value;
        value--;
    }
    return arr;
}
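insertionSortWithRuntime is not shown here; assuming it mirrors isNanoTime from the question, it could look roughly like this:
public static void insertionSortWithRuntime(int[] arr) {
    long start = System.nanoTime();
    Sort.insertionSort(arr);
    long elapsed = System.nanoTime() - start;
    // print length and nanoseconds so the output is easy to plot
    System.out.println(arr.length + ";" + elapsed);
}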
Is this roughly what you had in mind?
Such arrays can demonstrate the worst and best cases for InsertionSort and BubbleSort. Typical implementations of MergeSort and SelectionSort do essentially the same amount of work regardless of the input order. The worst case for the simplest implementation of QuickSort (first or last element as pivot) is a sorted (or reverse-sorted) array.
Wiki page with useful table
Note that these arrays are too short to notice any difference in running times. Use arrays with 10^3–10^6 elements (for the slow and the fast algorithms respectively). Look at the Fisher-Yates shuffle to get a random permutation; a minimal sketch follows below.
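For reference, a minimal Fisher-Yates shuffle (the method name here is just an example):
public static void shuffle(int[] arr) {
    java.util.Random rnd = new java.util.Random();
    // walk backwards and swap each element with a random one at or before it
    for (int i = arr.length - 1; i > 0; i--) {
        int j = rnd.nextInt(i + 1); // 0 <= j <= i
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }
}
For boxed lists (e.g. List<Integer>) the built-in java.util.Collections.shuffle does the same job.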