Comparing Algorithms - Big O

From TRCCompSci - AQA Computer Science

Big O notation is a measure of how much time or memory is needed to execute an algorithm. It describes the worst case scenario, so that you get the maximum time and memory usage. It uses n as the number of items being processed.

Time complexities:

Constant complexity - O(1)

An algorithm with constant complexity takes the same amount of time regardless of how many items you process: a items would take the same amount of time to compute as b items.
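
A minimal sketch of a constant-time operation (the function name and lists are just illustrative): indexing a Python list takes the same time whatever the length of the list.

 def get_first(items):
     # Indexing a list takes the same time regardless of len(items),
     # so this function is O(1).
     return items[0]

 print(get_first([5, 2, 9]))               # same cost...
 print(get_first(list(range(1000000))))    # ...as this much larger list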

Linear complexity - O(n)

An algorithm with linear complexity takes more time, at a constant gradient, as you give it more items to process. The larger the value of n, the longer it takes, with the time being n times larger than when n = 1. For example, planting 2n seeds in a field would take roughly twice as long as planting n seeds.
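
A minimal sketch of linear time, assuming a simple Python list: a loop that touches every item exactly once grows in time directly with n.

 def count_matches(items, target):
     # Visits every item exactly once, so the time grows in
     # proportion to len(items): O(n).
     count = 0
     for item in items:
         if item == target:
             count += 1
     return count

 print(count_matches([3, 1, 3, 7, 3], 3))  # 3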

Logarithmic complexity - O(log n)

An algorithm with logarithmic complexity is better than linear complexity for searching, as the growth in time slows down as n increases. For example, when searching a sorted list, if the target is not in the middle the algorithm discards half of the list and searches only the remaining half, so doubling the size of the list only adds one extra step.
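
Binary search is the standard example of this halving behaviour; the sketch below assumes the input list is already sorted.

 def binary_search(sorted_items, target):
     # Each pass halves the remaining range, so at most about
     # log2(n) + 1 passes are needed: O(log n).
     low, high = 0, len(sorted_items) - 1
     while low <= high:
         mid = (low + high) // 2
         if sorted_items[mid] == target:
             return mid
         elif sorted_items[mid] < target:
             low = mid + 1
         else:
             high = mid - 1
     return -1  # not found

 print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3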

Linearithmic complexity - O(n log n)

As the input n grows, the time taken to process it grows slightly faster than the input itself, because of the extra log n factor. A sorting algorithm such as merge sort is a typical example.
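
A minimal sketch of merge sort: the list is halved about log n times, and each level of halving does O(n) work to merge, giving O(n log n) overall.

 def merge_sort(items):
     # Splitting happens log2(n) times; merging at each level
     # touches all n items, so the total work is O(n log n).
     if len(items) <= 1:
         return items
     mid = len(items) // 2
     left = merge_sort(items[:mid])
     right = merge_sort(items[mid:])
     merged = []
     i = j = 0
     while i < len(left) and j < len(right):
         if left[i] <= right[j]:
             merged.append(left[i])
             i += 1
         else:
             merged.append(right[j])
             j += 1
     merged.extend(left[i:])
     merged.extend(right[j:])
     return merged

 print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]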

Polynomial complexity - O(n^k)

The time taken to process the input increases much faster than the size of the input n. An example would be shaking hands: between two people only one handshake is needed, but as the number of people n increases, many more handshakes are required between them. The number of handshakes is 1/2(n^2 - n), which is O(n^2).
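
A minimal sketch of the handshake count: the nested loop below pairs every person with every later person, performing n(n - 1)/2 iterations, which is O(n^2).

 def count_handshakes(people):
     # Every person shakes hands with every later person:
     # n(n - 1)/2 handshakes in total, which is O(n^2).
     handshakes = 0
     for i in range(len(people)):
         for j in range(i + 1, len(people)):
             handshakes += 1
     return handshakes

 print(count_handshakes(["A", "B", "C", "D"]))  # 6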

Exponential complexity - O(k^n)
Factorial complexity - O(n!)