I have a simple integer sorting problem at hand, and to solve it I am planning to write a variation of bubble sort. It seems to be working fine, but I am not sure about its complexity in big-O. What could it be?
public class TempBubbleSort
{
    static Integer[] myArray = {10, 9, 8, 7, 6, 5, 4, 3, 2, 1};
    static int counter = 0;

    public static void main(String[] args)
    {
        for (int anchor = 0; anchor < myArray.length; anchor++)
        {
            for (int compare = anchor + 1; compare < myArray.length; compare++)
            {
                counter++;
                // sort ascending
                if (myArray[anchor] > myArray[compare])
                {
                    int tmp = myArray[compare];
                    myArray[compare] = myArray[anchor];
                    myArray[anchor] = tmp;
                }
            }
        }
        System.out.println("Comparison Count : " + counter);
        for (int i : myArray)
            System.out.println(i);
    }
}
Output when you run the above is:
Comparison Count : 45
1
2
3
4
5
6
7
8
9
10
1 Answer
Strictly answering your question about the big-O run-time complexity:
public class TempBubbleSort
{
    static Integer[] myArray = {10, 9, 8, 7, 6, 5, 4, 3, 2, 1};
    static int counter = 0;

    public static void main(String[] args)
    {
        // **This outer loop is O(n): it touches every position in your array.
        for (int anchor = 0; anchor < myArray.length; anchor++)
        {
            // **This inner loop runs (n - anchor - 1) times, which averages out to about
            // n/2 iterations. Since big-O ignores constant factors, we treat it as O(n).
            for (int compare = anchor + 1; compare < myArray.length; compare++)
            {
                // **Everything in here is a handful of constant-time operations,
                // so we can safely ignore it.
                counter++;
                // sort ascending
                if (myArray[anchor] > myArray[compare])
                {
                    int tmp = myArray[compare];
                    myArray[compare] = myArray[anchor];
                    myArray[anchor] = tmp;
                }
            }
        }
        // **So the brunt of your work is O(n) * O(n), or O(n^2): for every element in
        // your array you touch (almost) every other element, so you can look at it as
        // doing roughly n units of work, n times. The exact comparison count is
        // n*(n-1)/2, which is 45 for n = 10 -- matching your output.
        System.out.println("Comparison Count : " + counter);
        // **In case you're curious, this loop is also O(n).
        for (int i : myArray)
            System.out.println(i);
        // **That brings the grand total to O(n^2 + n). Again, big-O notation only cares
        // about what happens as n continues to grow, and since n^2 grows faster than n,
        // the second term becomes insignificant as n gets very large, so the overall
        // total is O(n^2). In fact, for any polynomial time complexity, you can safely
        // drop every term except the one of largest degree.
    }
}
In closing, it's usually easy to tell the time complexity of simple iterative functions like this from the number of nested for loops, which makes a good rule of thumb for a quick approximate guess in a pinch.
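As a quick sanity check on that n*(n-1)/2 count, here is a small standalone sketch (the class and method names are my own, not from your code) that runs the same nested-loop structure for a few input sizes and compares the counter against the closed form:

```java
public class ComparisonCount
{
    // Count the comparisons performed by the same nested-loop structure
    // as the sort above, for an array of size n (element values don't matter).
    static long countComparisons(int n)
    {
        long counter = 0;
        for (int anchor = 0; anchor < n; anchor++)
        {
            for (int compare = anchor + 1; compare < n; compare++)
            {
                counter++;
            }
        }
        return counter;
    }

    public static void main(String[] args)
    {
        for (int n : new int[]{10, 100, 1000})
        {
            // Closed form: n choose 2 = n*(n-1)/2
            long expected = (long) n * (n - 1) / 2;
            System.out.println(n + " -> " + countComparisons(n)
                    + " (expected " + expected + ")");
        }
    }
}
```

For n = 10 this prints 45, matching your output; going from n = 10 to n = 100 multiplies the count by roughly 100, which is exactly the quadratic growth that O(n^2) describes.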