Time Complexity of Algorithms

  • The total time a program requires to run to completion is referred to as its time complexity. The time complexity of an algorithm is most typically expressed using big O notation.
  • The most popular way to evaluate time complexity is to count how many elementary operations the algorithm performs.
  • Because an algorithm's performance may change with different types of input data, we normally use the worst-case time complexity: the most time the algorithm can take over all inputs of a given size, as the sketch below illustrates.
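  • As a concrete illustration of best case versus worst case, here is a minimal linear search sketch (the function name and signature are our own, not from the original article):
/* Linear search: return the index of target in list, or -1 if absent.
 * Best case: target sits at index 0, so only one comparison is made.
 * Worst case: target is absent, so all n elements are checked. */
int linear_search(int list[], int n, int target)
{
    for (int i = 0; i < n; i++)
        if (list[i] == target)
            return i;
    return -1;
}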

Calculating Time Complexity

  • Let's move on to the next major question in time complexity: how to calculate it. It can be confusing at first, but we'll explain it as simply as possible.
  • Big O notation is now the most widely used measure for expressing time complexity.
  • As N approaches infinity, all constant factors are dropped, allowing the running time to be expressed as a function of N alone. In general, you can think about it this way:
statement;
  • Above we have a single statement. Its time complexity is constant: the running time of the statement does not change in relation to N.
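  • For instance, a single array access is a constant-time operation, no matter how large the array is (an illustrative sketch; the names are ours):
/* Constant time: the running time does not depend on the size of list. */
int first_element(int list[])
{
    return list[0];    /* one array access, O(1) for any N */
}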
for (i = 0; i < N; i++)
{
    statement;
}
  • The given algorithm will have a linear time complexity. The loop's running time is directly proportional to N. The running time doubles when N doubles.
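  • A typical concrete case of such a loop is summing an array (a minimal sketch with names of our choosing):
/* Linear time: one pass over all n elements. */
int sum_array(int list[], int n)
{
    int total = 0;
    for (int i = 0; i < n; i++)
        total += list[i];    /* the loop body runs exactly n times */
    return total;
}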
for (i = 0; i < N; i++)
{
    for (j = 0; j < N; j++)
    {
        statement;
    }
}
  • This time the above code's time complexity is quadratic. The total running time of the two loops is proportional to N squared: when N doubles, the running time increases by a factor of four.
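  • A concrete instance of this nested pattern is comparing every pair of elements, for example to count duplicate pairs (a sketch; the function name is ours):
/* Quadratic time: the inner loop runs for each outer iteration,
 * giving about n * n / 2 comparisons in total, which is O(n^2). */
int count_duplicate_pairs(int list[], int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (list[i] == list[j])
                count++;
    return count;
}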
while (low <= high)
{
    mid = (low + high) / 2;
    if (target < list[mid])
        high = mid - 1;
    else if (target > list[mid])
        low = mid + 1;
    else
        break;
}
  • This is the core loop of binary search: an algorithm that repeatedly splits a sorted list of integers in half while looking for a certain value (we will study this in detail later).
  • This algorithm's time complexity is logarithmic: its running time is determined by the number of times N can be divided by 2 (N is high - low here).
  • This is the case because the algorithm divides the working area in half on every iteration. A complete function built around this loop is sketched below.
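  • For completeness, here is one way the loop above can be wrapped into a full function (a sketch with a signature of our choosing; it assumes list is sorted in ascending order):
/* Binary search over a sorted array: return the index of target, or -1. */
int binary_search(int list[], int n, int target)
{
    int low = 0, high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;   /* same as (low + high) / 2 but safe from overflow */
        if (target < list[mid])
            high = mid - 1;                 /* discard the upper half */
        else if (target > list[mid])
            low = mid + 1;                  /* discard the lower half */
        else
            return mid;                     /* found */
    }
    return -1;                              /* not present */
}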
void quicksort(int list[], int left, int right)
{
    if (left >= right)
        return;    /* base case: a sublist of 0 or 1 elements is already sorted */
    int pivot = partition(list, left, right);
    quicksort(list, left, pivot - 1);
    quicksort(list, pivot + 1, right);
}
  • Continuing with the previous technique, above is a small Quick Sort routine (we will study this in detail later; a sketch of the partition helper it relies on follows the note below).
  • In Quick Sort we divide the list roughly in half each time (on average), but at every level of division the partitioning touches all N elements (where N is the size of the list). As a result, the time complexity is N * log(N).
  • The running time consists of N units of work repeated over log(N) levels of splitting (iterative or recursive), so the algorithm is a mix of linear and logarithmic.
  • NOTE: As a rule of thumb, touching each of N items once is linear, touching every pair of items is quadratic, and dividing the working area in half each step is logarithmic.
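  • The Quick Sort above calls a partition helper that the snippet does not define. One common choice is the Lomuto scheme sketched here (the original article does not specify which partitioning strategy it intends):
/* Lomuto partition: use the last element as the pivot, move smaller
 * elements to its left, and return the pivot's final index. */
int partition(int list[], int left, int right)
{
    int pivot = list[right];
    int i = left - 1;
    for (int j = left; j < right; j++)
    {
        if (list[j] < pivot)
        {
            i++;
            int tmp = list[i];    /* swap list[i] and list[j] */
            list[i] = list[j];
            list[j] = tmp;
        }
    }
    int tmp = list[i + 1];        /* place the pivot between the two halves */
    list[i + 1] = list[right];
    list[right] = tmp;
    return i + 1;
}
  • Note that each partition call does work proportional to the length of its sublist, which is where the N factor in N * log(N) comes from.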

Types of Notations for Time Complexity

Now we will discuss and understand the various notations used for Time Complexity.
  1. Big Oh denotes "fewer than or the same as" <expression> iterations.
  2. Big Omega denotes "more than or the same as" <expression> iterations.
  3. Big Theta denotes "the same as" <expression> iterations.
  4. Little Oh denotes "fewer than" <expression> iterations.
  5. Little Omega denotes "more than" <expression> iterations.
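  • For reference, the standard formal definitions behind these informal phrases are as follows (the little-oh and little-omega variants replace "there exists some c > 0" with "for every c > 0"):
f(n) \in O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \ \text{and}\ f(n) \in \Omega(g(n))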

Understanding Notations of Time Complexity with Example

  • O(expression) is the set of functions that grow slower than or at the same rate as expression.
  • Omega(expression) is the set of functions that grow faster than or at the same rate as expression.
  • Theta(expression) consists of all the functions that lie in both O(expression) and Omega(expression).
  • Suppose you've calculated that an algorithm takes f(n) operations, where,
f(n) = 3*n^2 + 2*n + 4.   // n^2 means square of n
  • Since this polynomial grows at the same rate as n^2, you can say that the function f lies in the set Theta(n^2). (It also lies in the sets O(n^2) and Omega(n^2) for the same reason.)
  • The simplest explanation is that Theta denotes "the same as" the expression: f(n) grows at the same rate as n^2, so its time complexity is best represented as Theta(n^2). The explicit constants behind this claim are worked out below.
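  • To make this concrete, here are explicit constants witnessing the Theta bound (a routine check):
3n^2 \;\le\; 3n^2 + 2n + 4 \;\le\; 3n^2 + 2n^2 + 4n^2 \;=\; 9n^2 \quad \text{for all } n \ge 1
  • So with c_1 = 3, c_2 = 9, and n_0 = 1 we have c_1 * n^2 <= f(n) <= c_2 * n^2, which is exactly the statement f(n) = Theta(n^2).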
