Asymptotic notation is a hardware-independent notation used to describe the time and space complexity of an algorithm. In other words, it is a standardized way of measuring how much memory an algorithm uses, or how long it runs, as a function of its input size.
The following are the asymptotic rates of growth from best to worst; a few short code sketches illustrating several of these rates follow the list:
constant growth - O(1)
Runtime is constant and does not grow with n
logarithmic growth - O(log n)
Runtime grows in proportion to the logarithm of n
linear growth - O(n)
Runtime grows directly in proportion to n
superlinear growth - O(n log n)
Runtime grows in proportion to n multiplied by log n
polynomial growth - O(n^c)
Runtime grows faster than all of the previous rates, as a fixed power of n
exponential growth - O(c^n)
Runtime grows even faster than polynomial growth, since n appears in the exponent
factorial growth - O(n!)
Runtime grows the fastest and quickly becomes unusable even for relatively small values of n
(source: Soumyadeep Debnath, Analysis of Algorithms | Big-O analysis)
These rates are visualized below, with the x-axis representing input size and the y-axis representing complexity. (source: Wikipedia, Computational Complexity of Mathematical Operations)
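To make a few of these rates concrete, here is a minimal Python sketch, assuming n is the length of the input list items; the function names and bodies are illustrative examples rather than standard routines:

# Minimal illustrations of common growth rates; n = len(items).

def get_first(items):
    # O(1): constant work, independent of n
    return items[0]

def binary_search(items, target):
    # O(log n): halves the remaining search range on every step
    # (assumes items is sorted in ascending order)
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def find_max(items):
    # O(n): touches every element exactly once
    best = items[0]
    for x in items:
        if x > best:
            best = x
    return best

def has_duplicate(items):
    # O(n^2): compares every pair of elements in the worst case
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False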
Big-O refers to the upper bound on the time or space complexity of an algorithm, meaning its worst-case scenario. An easy way to think of it is that the actual runtime could be better than Big-O, but it will never be worse.
Big-Omega refers to the lower bound on the time or space complexity of an algorithm, meaning its best-case scenario. The actual runtime could be worse than Big-Omega, but it will never be better.
Big-Theta refers to the tight bound on the time or space complexity of an algorithm. Another way to think of it is as the intersection of Big-O and Big-Omega: the runtime is guaranteed to grow at a given rate, such as n log n.
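As a quick illustration of how these bounds differ, consider the two hypothetical Python functions sketched below. Linear search finishes immediately if the target is the first element (best case Omega(1)) but may scan the whole list (worst case O(n)), so its best and worst cases do not coincide; summing a list always touches every element, so its runtime is Theta(n).

def linear_search(items, target):
    # Best case Omega(1): the target is the first element.
    # Worst case O(n): the target is last or not present at all.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def total(items):
    # Always performs one addition per element, so the runtime is
    # Theta(n): the O(n) upper bound and Omega(n) lower bound coincide.
    result = 0
    for x in items:
        result += x
    return result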