Started DSA preparation (Day - 1)




I have some prior knowledge of data structures and their implementation, but I never finished the topic or practised it in depth.

I started writing this blog to track my progress, and it will also help me stay on track.

It's Day 1. I enrolled in a course to make systematic progress with my DSA knowledge. Today I learned about asymptotic notation; below is what I got to know.

Asymptotic notation

It gives us a relation between time and input: it shows how the running time will grow as the input size grows. There are three main notations.

  1. Big-O notation (Worst case)

    It is represented as O(). It gives us the upper bound of an algorithm, which means the running time of our algorithm cannot grow beyond the given time complexity. For example, O(n) is the worst-case time complexity of linear search.

    Mathematically, f(n) = O(g(n)) if

    f(n) <= c*g(n) for some positive constant c and all n >= n1.

  2. Big-omega notation (Best case)

    It is represented as Ω(). It gives us the lower bound of an algorithm, which means it is the least time our algorithm can take. For example, Ω(1) (constant time) is the best-case time complexity of linear search. Mathematically, f(n) = Ω(g(n)) if

    f(n) >= c*g(n) for some positive constant c and all n >= n1.

  3. Theta notation (Average case)

    It is represented as Θ(). It gives us a tight bound, sandwiching the running time from above and below, and is often used to describe the average time an algorithm takes. For example, merge sort takes Θ(n log n) time to execute.

    Mathematically, f(n) = Θ(g(n)) if c1*g(n) <= f(n) <= c2*g(n) for some positive constants c1, c2 and all n >= n1.
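To make the linear-search examples above concrete, here is a minimal sketch in Python (my own illustration, not taken from the course): the best case, Ω(1), happens when the target sits at index 0, and the worst case, O(n), when the target is last or absent, so every element is scanned.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):
        if value == target:  # best case: match at index 0 -> Omega(1)
            return i
    return -1  # worst case: scanned all n elements -> O(n)

data = [4, 8, 15, 16, 23, 42]
print(linear_search(data, 4))   # -> 0  (best case: one comparison)
print(linear_search(data, 42))  # -> 5  (worst case: n comparisons)
print(linear_search(data, 7))   # -> -1 (worst case: target absent)
```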

I also solved some problems on this topic. Here is a series of functions in increasing order of time complexity:

C < log(log n) < log n < n^(1/3) < n^(1/2) < n < n^2 < n^3 < n^4 < 2^n < n^n. That is all I learned on Day 1. Thanks!
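As a quick sanity check on the ordering above, we can evaluate each function at a single point, say n = 100 (my own spot check; the ordering only has to hold for large enough n):

```python
import math

# Evaluate each growth function at n = 100 and confirm the chain
# C < log(log n) < log n < n^(1/3) < n^(1/2) < n < n^2 < n^3 < n^4 < 2^n < n^n
n = 100
growth = [
    ("C (constant)", 1),
    ("log(log n)", math.log(math.log(n))),
    ("log n", math.log(n)),
    ("n^(1/3)", n ** (1 / 3)),
    ("n^(1/2)", n ** 0.5),
    ("n", n),
    ("n^2", n ** 2),
    ("n^3", n ** 3),
    ("n^4", n ** 4),
    ("2^n", 2 ** n),
    ("n^n", n ** n),
]

# Each entry should be strictly larger than the one before it
for (name_a, a), (name_b, b) in zip(growth, growth[1:]):
    assert a < b, f"{name_a} should be below {name_b} at n={n}"
```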