Saturday, October 8, 2022

Oh No...The Big O

Developing software is the means by which we tackle some of our most pressing real-world problems through pathways of automated solutions.


The goal, then, of the industrious developer worthy of such a task isn't simply to automate these solutions, but to express and quantify their inner workings as concisely and to the point as possible.


And here, my friends, in the name of quantitative aim and pristine precision, algorithm analysis techniques make their famed debut, helping us understand the behavior of an algorithmic process as its inputs grow or shrink, contingent on its implementation.


Further, since our most prized innovators and makers of all the world's fancy have leanings toward nomenclature mania, the glorious world of computer science has coined a name for such a feat - BIG O ANALYSIS - the quantification of how long a program takes to complete as a function of the inputs it processes and the design of the algorithm itself.


Simply put, in the grand scheme of things, Big O analysis is our best means yet of forecasting an approximation of an algorithm's runtime, allowing the seasoned and savvy developer to determine, among all the candidate algorithmic solutions, which is best suited to a particular task.


This process requires that we know not only the steps involved in an algorithm's design, but also how efficiently it consumes resources as it runs - giving rise to what's commonly known as an algorithm's rate of growth.


Rates of growth tell us how the work an algorithm performs changes as the problem size grows or shrinks - whether the runtime doubles when the input doubles, or grows linearly, quadratically, or even exponentially as the application is run on larger inputs - or in other words, they determine the bounding curves of the program's runtime.
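To make that tangible, here's a minimal Python sketch of my own (linear_scan and pairwise_compare are illustrative stand-ins, not anything from this post) that counts operations as the input size doubles:

# A rough sketch of how operation counts reveal a rate of growth.
# linear_scan and pairwise_compare are hypothetical stand-ins for real algorithms.

def linear_scan(items):
    # Touch each element once - roughly n operations.
    ops = 0
    for _ in items:
        ops += 1
    return ops

def pairwise_compare(items):
    # Compare every element with every other - roughly n^2 operations.
    ops = 0
    for _ in items:
        for _ in items:
            ops += 1
    return ops

for n in (100, 200, 400, 800):
    data = list(range(n))
    print(n, linear_scan(data), pairwise_compare(data))
    # Doubling n doubles the linear count but quadruples the quadratic one.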


And since Big O runtimes signify the mathematical worst-case scenario of an algorithm, they can be expressed in mathematical notation as a function of the input size n.


For example, we might say that f(n) = O(g(n)) whenever

0 <= f(n) <= c*g(n) for all n >= n0, for some constant c > 0 and initial input size n0 >= 1


Symbolizing that the O (Big O), upper bound, or worst-case curve c*g(n) sits at or above the function f(n) itself, starting from some initial input size greater than 0...for wouldn't it prove a challenge to run an application with 0 inputs or even 0 times?... OMNITEKK affirms such a quandary.
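As a rough illustration - the helper check_upper_bound and the particular f, g, c, and n0 below are hypothetical choices of mine, not the post's - one might spot-check that inequality numerically in Python:

# Spot-check the Big O inequality 0 <= f(n) <= c * g(n) for every sampled n >= n0.
# check_upper_bound, f, g, c, and n0 are illustrative choices for this sketch.

def check_upper_bound(f, g, c, n0, limit=10_000):
    return all(0 <= f(n) <= c * g(n) for n in range(n0, limit))

f = lambda n: 3 * n**2 + 10 * n   # the running time we want to bound
g = lambda n: n**2                # the candidate upper-bound shape

print(check_upper_bound(f, g, c=4, n0=10))  # True: 3n^2 + 10n <= 4n^2 once n >= 10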


Likewise, we can't always consider or prepare for only the worst case scenario, now can we?...


Not to worry, friends - our mathematical symbologists have devised a few other quantifications to express runtime bounds as well - namely OMEGA, signifying the best case or lower bound of an algorithm's growth rate, and THETA, denoting a rate of growth pinned somewhere between Big O and OMEGA, which may be written asymptotically as -


OMEGA Ω => f(n) = Ω(g(n)) whenever 0 <= c*g(n) <= f(n) for all n >= n0

THETA θ => f(n) = θ(g(n)) whenever c1*g(n) <= f(n) <= c2*g(n) for all n >= n0


We say that THETA is both OMEGA and BIG O, since its runtime rate of growth is sandwiched between an upper bound and a lower bound of runtime values.


For instance, if f(n) = O(n²) and f(n) = Ω(n²), then f(n) = θ(n²), since the upper and lower bounding curves coincide, and it is that matching, tighter bound that tells us the most about the runtime values we can expect.
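A minimal numeric check of that sandwich, again with illustrative functions and constants of my own choosing:

# Check the Theta sandwich c1 * g(n) <= f(n) <= c2 * g(n) for every sampled n >= n0.
# f, g, c1, c2, and n0 below are illustrative, not from the original post.

f = lambda n: 2 * n**2 + n
g = lambda n: n**2

c1, c2, n0 = 2, 3, 1

sandwiched = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print(sandwiched)  # True - f(n) is Theta(n^2)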


As such, we prove an upper bound, average bound, or lower bound growth rate by finding at least one initial input value n0 and one constant c such that, for every input size greater than or equal to n0, the corresponding bound holds true.

 

We might then prove that f(n) = O(n) using the following example


f(n) = 100n + 5 = O(n)


by simply applying the asymptotic formula


0 <= f(n) <= c*n for all n >= n0

=> 100n + 5 <= 100n + 5n = 105n = c*n

for all n >= 1, with c = 105 and n0 = 1.
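And a quick sanity check of that very bound, as a hedged little Python sketch rather than a formal proof:

# A quick numeric check that 100n + 5 <= 105n for every sampled n >= 1,
# which is exactly the inequality behind f(n) = O(n) with c = 105 and n0 = 1.

def f(n):
    return 100 * n + 5

assert all(f(n) <= 105 * n for n in range(1, 100_000))
print("100n + 5 <= 105n holds for every sampled n >= 1")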


With that, we are equipped to determine the bounds of our application runtimes.


Similar methods of deduction can be used to find both OMEGA and THETA as well.


Likewise, we tend to prove average bounds by proving both the upper and lower bounds of an application, and we usually represent the average case by the worst case, since both the average and best possible bounds are encapsulated within it - and we are usually concerned with the worst growth rates anyway...with the exception of amortized growth rates, which prove asymptotic bounds over clusters of program operations as opposed to analyzing each operation disjointly.
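For a taste of what amortized analysis looks at, here's a small sketch using the classic doubling array - an illustration of mine, not an example from the post - where an occasional expensive append still averages out to constant cost:

# Amortized cost sketch: appending into an array that doubles when full.
# An individual append occasionally costs O(n) (a resize copies everything),
# yet the total cost over n appends stays O(n) - about O(1) per append on average.

def total_append_cost(n):
    capacity, size, cost = 1, 0, 0
    for _ in range(n):
        if size == capacity:   # full: copy every existing element into a bigger array
            cost += size
            capacity *= 2
        cost += 1              # write the new element
        size += 1
    return cost

for n in (1_000, 10_000, 100_000):
    print(n, total_append_cost(n) / n)   # the per-append ratio stays below 3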


It should be noted that such an expedition is a rarity, so outside of select specialty applications, we usually deal only with BIG O runtime in analyzing algorithm bounds.


And there you have it folks - OMNITEKK's rendition of Oh No...The Big O.


We hope you've enjoyed it, and until our next IT adventures, my friends, OMNITEKK says be well.


