Chapter 3: Time Complexity and Programs

Introduction

Theoretical knowledge alone is not enough; you become a pro only when you put that knowledge into practice. Let's dive into the coding world and apply everything we have learnt so far. In this chapter we will use functions and methods to solve problems, and you will learn strategies for solving them in a simple way while keeping time complexity in mind.

Time complexity

Before diving into the code, let's understand what time complexity is. Time complexity is something you have to consider while writing any program. What do notations like "O(n), O(n log n), O(n²)" mean for the time complexity of an algorithm?

1. Constant time — O(1)

2. Logarithmic time — O(log n)

3. Linear time — O(n)

4. Linearithmic time — O(n log n)

5. Quadratic time — O(n²)

6. Cubic time — O(n³)

The notation O(n) is used to describe the time complexity of an algorithm, which is a measure of how the running time of the algorithm grows with the size of the input.

Algorithms with a time complexity of O(1) are often referred to as constant-time algorithms and are considered to be the most efficient algorithms in terms of running time. They take the same amount of time to execute, no matter how big or small the input size is. Examples of algorithms with O(1) time complexity include basic arithmetic operations, accessing an element in an array at a fixed index, and checking if an element is present in a hash table.
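As a rough sketch of those examples (the class and method names here are just illustrative), each operation below takes the same time whether the array or map holds three entries or three million:

```java
import java.util.HashMap;

public class ConstantTime {
    // Accessing an element at a fixed index is O(1):
    // the position is computed directly, with no search.
    static int firstElement(int[] arr) {
        return arr[0];
    }

    // A hash-table lookup is O(1) on average:
    // the key is hashed straight to its bucket.
    static boolean contains(HashMap<String, Integer> map, String key) {
        return map.containsKey(key);
    }

    public static void main(String[] args) {
        int[] arr = {7, 3, 9};
        HashMap<String, Integer> map = new HashMap<>();
        map.put("a", 1);
        System.out.println(firstElement(arr)); // 7
        System.out.println(contains(map, "a")); // true
    }
}
```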

The notation O(n) is used to describe the time complexity of an algorithm, which is a measure of how the running time of the algorithm grows with the size of the input.

For example, if an algorithm has a time complexity of O(n), it means that the running time of the algorithm is proportional to the size of the input, represented by “n”. More specifically, it means that the running time grows linearly with the size of the input. So, if the size of the input is doubled, the running time of the algorithm is also roughly doubled.
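A minimal sketch of a linear-time algorithm (the class name is just illustrative): summing an array visits each element exactly once, so doubling the array roughly doubles the work.

```java
public class LinearTime {
    // One pass over n elements: the running time grows
    // in direct proportion to the array length, O(n).
    static int sum(int[] arr) {
        int total = 0;
        for (int x : arr) {
            total += x;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[]{1, 2, 3, 4})); // 10
    }
}
```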

The notation O(nlogn) is used to describe the time complexity of an algorithm, which is a measure of how the running time of the algorithm grows with the size of the input.

For example, if an algorithm has a time complexity of O(n log n), it means that the running time of the algorithm grows proportional to the size of the input multiplied by the logarithm of that size, represented by "n log n". Doubling the input slightly more than doubles the running time. Efficient comparison-based sorting algorithms such as merge sort fall into this class.
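A classic O(n log n) algorithm is merge sort, sketched minimally below (the class name is just illustrative): the array is halved about log n times, and each level of halving merges n elements in total.

```java
import java.util.Arrays;

public class MergeSortDemo {
    // Split into halves (about log n levels of recursion),
    // then merge n elements per level: O(n log n) overall.
    static int[] mergeSort(int[] arr) {
        if (arr.length <= 1) return arr;
        int mid = arr.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(arr, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(arr, mid, arr.length));
        return merge(left, right);
    }

    // Merge two sorted arrays into one sorted array in linear time.
    static int[] merge(int[] a, int[] b) {
        int[] out = new int[a.length + b.length];
        int i = 0, j = 0, k = 0;
        while (i < a.length && j < b.length) {
            out[k++] = (a[i] <= b[j]) ? a[i++] : b[j++];
        }
        while (i < a.length) out[k++] = a[i++];
        while (j < b.length) out[k++] = b[j++];
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(mergeSort(new int[]{5, 2, 8, 1}))); // [1, 2, 5, 8]
    }
}
```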

The notation O(n²) is used to describe the time complexity of an algorithm, which is a measure of how the running time of the algorithm grows with the size of the input.

For example, if an algorithm has a time complexity of O(n²), it means that the running time of the algorithm grows quadratically with the size of the input, represented by “n”. More specifically, it means that the running time of the algorithm grows proportional to the square of the size of the input.
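A typical O(n²) pattern is two nested loops over the same input, as in this minimal duplicate-check sketch (the class name is just illustrative): every element is compared against every later element, so n items cost on the order of n² comparisons.

```java
public class QuadraticTime {
    // Two nested loops over n elements: roughly n*n/2
    // comparisons in the worst case, i.e. O(n^2).
    static boolean hasDuplicate(int[] arr) {
        for (int i = 0; i < arr.length; i++) {
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[i] == arr[j]) return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasDuplicate(new int[]{1, 2, 3, 2})); // true
        System.out.println(hasDuplicate(new int[]{1, 2, 3}));    // false
    }
}
```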

The notation O(n³) is used to describe the time complexity of an algorithm, which is a measure of how the running time of the algorithm grows with the size of the input.

For example, if an algorithm has a time complexity of O(n³), it means that the running time of the algorithm grows cubically with the size of the input, represented by “n”. More specifically, it means that the running time of the algorithm grows proportional to the cube of the size of the input.
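Cubic time usually comes from three nested loops, as in this naive triple-sum sketch (the class name is just illustrative): checking every triple of elements for a target sum takes on the order of n³ iterations.

```java
public class CubicTime {
    // Three nested loops over n elements give about n^3
    // candidate triples in the worst case: O(n^3).
    static boolean hasTripleSum(int[] arr, int target) {
        for (int i = 0; i < arr.length; i++) {
            for (int j = i + 1; j < arr.length; j++) {
                for (int k = j + 1; k < arr.length; k++) {
                    if (arr[i] + arr[j] + arr[k] == target) return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // 4 + 10 + 8 = 22, so this prints true.
        System.out.println(hasTripleSum(new int[]{1, 4, 45, 6, 10, 8}, 22));
    }
}
```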

1. Check whether a number is prime or not

Let's check whether the given number is prime or not in a better way.

import java.util.*;
public class Main{
    public static void main(String[] args){
        Scanner sc=new Scanner(System.in);
        int n=sc.nextInt();
        boolean ans=isPrime(n);
        System.out.print(ans);
    }

    static boolean isPrime(int n){
        // Numbers below 2 (0, 1 and negatives) are not prime.
        if(n<2){
            return false;
        }
        // A composite number always has a factor at or below sqrt(n),
        // so checking up to sqrt(n) is enough: O(sqrt(n)) time.
        for(int i=2; i<=Math.sqrt(n); i++){
            if(n%i==0){
                return false;
            }
        }
        return true;
    }
}

Always remember when testing for primality: if a number has any factor other than 1 and itself, its smallest such factor lies at or below the square root of n, so the loop only needs to run up to √n.

2. Armstrong number or not

Let's check whether the given number is an Armstrong number or not.

import java.util.*;
public class Main{
    public static void main(String[] args){
        Scanner sc=new Scanner(System.in);
        int n=sc.nextInt();
        boolean ans=isArmstrong(n);
        System.out.print(ans);
    }
    static boolean isArmstrong(int n){
        // An Armstrong number equals the sum of its digits, each raised
        // to the power of the digit count (e.g. 153 = 1³ + 5³ + 3³).
        int digits=String.valueOf(n).length();
        int temp=n;
        int sum=0;
        while(temp>0){
            int rem=temp%10;
            sum+=(int)Math.pow(rem, digits);
            temp/=10;
        }
        return n==sum;
    }
}

We know that an Armstrong number equals the sum of its digits, each raised to the power of the number of digits. For a 3-digit number such as 153, that means the sum of the cubes of its digits equals the number itself.

3. Print all 3-digit Armstrong numbers

public class Main{
    public static void main(String[] args){
        // Every candidate here has exactly 3 digits, so summing the
        // cubes of the digits is the correct Armstrong test.
        for(int i=100; i<1000; i++){
            if(isArmstrong(i)){
                System.out.print(i + " ");
            }
        }
    }
    static boolean isArmstrong(int n){
        int temp=n;
        int sum=0;
        while(n>0){
            int rem=n%10;
            sum+=rem*rem*rem;
            n/=10;
        }
        return temp==sum;
    }
}
//The output is 153 370 371 407

4. Factorial using recursion

import java.util.*;
public class Main{

    public static void main(String[] args){
        Scanner sc=new Scanner(System.in);
        int n=sc.nextInt();

        System.out.print(factorial(n));
    }

    static long factorial(int n){
        // Base case: 0! and 1! are both 1.
        if(n==0 || n==1){
            return 1;
        }
        // Recursive case: n! = n * (n-1)!
        // Using long instead of int delays overflow (int overflows at 13!).
        return n*factorial(n-1);
    }
}

The base case is when n is 0 or 1, in which case the factorial is 1. In the recursive case, it calculates the factorial using the formula n! = n * (n-1)!.

Conclusion

In conclusion, understanding time complexity is crucial for analyzing the efficiency of algorithms and making informed decisions when selecting them. Time complexity gives us a quantitative measure of an algorithm's performance, allowing us to predict how its execution time will scale with input size. Our exploration of programming through functions and methods in Java has provided a solid foundation for building modular and maintainable code: by breaking complex tasks into smaller, more manageable functions, we enhance readability, promote reusability, and streamline debugging. That's all for today; stay tuned for the next chapter. Cheers to all, and happy coding!