Algorithm:
What is an Algorithm?
• Algorithms represent step-by-step methods designed to tackle specific
problems.
• A set of sequential steps, usually written in natural language, to solve a
given problem is called an algorithm.
• Rather than providing direct answers, they offer detailed procedures to arrive
at solutions.
Steps involved in algorithm development
1. Understanding the problem (identification of input/output)
• Understand the problem fully.
• Designing algorithms is a creative skill, not fully automatable.
• An input to an algorithm specifies the problem instance.
• A correct algorithm works for every input.
2. Check the computer's capabilities
• Identify the machine's type, speed, and memory.
• Know if it supports sequential or parallel algorithms.
3. Choose appropriate data structures (array, stack, queue, etc.)
• Data structures are key for designing and analyzing algorithms.
• They organize related data items.
• Basic types include arrays, lists, and sets.
• Algorithm + Data Structure = Program
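As a small illustration of why the choice of data structure matters (a sketch, not from the notes; the `membership_*` function names are illustrative), the same membership query behaves very differently on a list and on a set:

```python
# The same membership query with two data structures.
def membership_list(items, key):
    # A plain list is scanned element by element: O(n) per query.
    for x in items:
        if x == key:
            return True
    return False

def membership_set(item_set, key):
    # A hash set answers the same query in O(1) average time.
    return key in item_set

data = list(range(1000))
data_set = set(data)
print(membership_list(data, 999), membership_set(data_set, 999))
```

Both return the same answer, but the underlying data structure determines how much work each query costs.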
4. Algorithm Design Technique
• General ways to solve problems with algorithms used in many
computing areas
• Main methods:
• Brute force
• Greedy
• Divide and conquer
• Dynamic programming
• Backtracking
• Linear programming
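One of the techniques listed above, divide and conquer, can be sketched with binary search on a sorted list (a minimal illustration, assuming the input is already sorted):

```python
# Divide and conquer: binary search repeatedly halves the search range.
def binary_search(a, key):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2     # split the problem in half
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1         # continue in the right half
        else:
            hi = mid - 1         # continue in the left half
    return -1                    # key not present

print(binary_search([2, 5, 8, 12, 16], 12))  # → 3
```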
5. Coding an Algorithm
• Write the algorithm in a programming language.
• Verify it with small test programs.
• Test and debug to ensure it works.
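Step 5, verifying the coded algorithm with small test programs, can be sketched as follows (the choice of Euclid's GCD algorithm here is purely illustrative):

```python
# Verifying a coded algorithm with small tests before wider use.
def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    while b != 0:
        a, b = b, a % b
    return a

# Small test cases act as a quick correctness check.
assert gcd(12, 18) == 6
assert gcd(7, 13) == 1
assert gcd(0, 5) == 5
print("all gcd tests passed")
```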
Example:
Find the sum of first N numbers
Step 1: Input N
Step 2: sum = 0
Step 3: For i = 1 to N, sum = sum + i
Step 4: Output sum
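The steps above translate directly into Python (a sketch; the function name is illustrative):

```python
# Direct translation of the four steps above.
def sum_first_n(n):
    total = 0                   # Step 2: sum = 0
    for i in range(1, n + 1):   # Step 3: for i = 1 to N
        total = total + i
    return total                # Step 4: output sum

print(sum_first_n(10))  # → 55
```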
Find the average of three numbers
Step 1: Read the numbers a, b, c
Step 2: sum = a + b + c
Step 3: result = sum / 3
Step 4: Print the value of result
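These four steps can likewise be written as a small function (an illustrative sketch):

```python
# The four steps above, written as a function.
def average_of_three(a, b, c):
    s = a + b + c        # Step 2
    result = s / 3       # Step 3
    return result        # Step 4

print(average_of_three(4, 8, 12))  # → 8.0
```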
Characteristics of an Algorithm
1. Input/Output: Each algorithm must take zero, one, or more quantities as input
data and produce at least one output value.
2. Definiteness: Each step of the algorithm must be precisely and unambiguously
stated.
3. Finiteness: An algorithm must terminate in a finite number of steps.
4. Effectiveness: Each step must be effective, that is, basic and executable
(convertible into a program statement), and must be performable exactly in a
finite amount of time.
5. Generality: The algorithm must be complete in itself so that it can be used to
solve problems of a specific type for any input data.
Analysis of Algorithms
• Analysis of Algorithms is a fundamental aspect of computer science that involves
evaluating performance of algorithms and programs.
• It checks how good an algorithm is before it is coded.
• Efficiency is measured in terms of time and space.
• An algorithm should be relatively fast.
• An algorithm should require minimum computer memory to produce the
desired output in an acceptable amount of time.
Efficiency Analysis of the Algorithm
• Efficiency analysis of the algorithm is the way of finding the resources
required by the algorithm.
• This helps to decide the best algorithm among the set of candidate
algorithms.
• The running time of an algorithm grows with the problem size n.
• The amount of resources required grows in the order of linear, quadratic,
factorial, and so on.
• We don’t worry about performance if we’re solving a problem for a small
number of instances. However, if a problem must be addressed repeatedly
for large size, efficiency analysis is important.
• The amount of time and space required to solve the problem is referred to as
the algorithm’s time and space complexity, respectively.
1. Space Complexity
•Problem-solving using a computer requires memory to hold temporary data or
final result while the program is in execution.
•The amount of memory required by the algorithm to solve a given problem is
called the space complexity of the algorithm.
•The space complexity of the algorithm is controlled by two components:
• Fixed-size components : the parts of the program whose memory
requirement does not change during program execution. For example,
• Instructions
• Variables
• Constants
• Arrays
• Variable-size components : the parts of the program whose size
depends on the problem being solved. For example,
• Size of loop
• Stack required to handle a recursive call
• Dynamic data structures like a linked list
Example,
(1) Addition of two numbers
Read a, b
c=a+b
Print c
The addition of two scalar numbers requires one extra memory location to hold the
result. Thus the space complexity of this algorithm is constant, hence S(n) = O(1).
(2) Addition of n numbers
Read n
sum = 0
For i=1 to n
sum=sum+i
Next i
Print sum
The space complexity of the algorithm is constant and S(n) = O(1).
(3) Addition of two arrays
For i=1 to n
c[i]=a[i]+b[i]
Next i
The space complexity of the above code segment would be S(n) = O(n).
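The three space-complexity examples can be contrasted in Python; the extra memory each version allocates mirrors its space complexity (a sketch, not part of the original notes):

```python
# Space complexity of the three examples in code form.
def add_two(a, b):
    c = a + b            # one extra scalar: S(n) = O(1)
    return c

def add_first_n(n):
    s = 0                # one scalar, reused each iteration: S(n) = O(1)
    for i in range(1, n + 1):
        s += i
    return s

def add_arrays(a, b):
    c = [0] * len(a)     # a whole new array of n cells: S(n) = O(n)
    for i in range(len(a)):
        c[i] = a[i] + b[i]
    return c

print(add_two(2, 3), add_first_n(5), add_arrays([1, 2], [3, 4]))
```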
2. Time Complexity
•The time complexity of the algorithm refers to how long it takes the algorithm to
solve a given problem.
Example,
(1) Addition of two numbers
This requires only one operation, so T(n)=O(1)
(2) Addition of n numbers
This will require n operations (additions), so T(n)=O(n)
(3) Addition of two arrays
The time complexity of the above code is T(n) = O(n)
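Counting basic operations makes these time complexities concrete (an illustrative sketch; the operation counter is not part of the original pseudocode):

```python
# Counting basic operations mirrors the time complexity T(n) = O(n).
def add_first_n_counted(n):
    ops = 0
    s = 0
    for i in range(1, n + 1):
        s += i
        ops += 1          # one addition per loop iteration
    return s, ops

total, ops = add_first_n_counted(100)
print(ops)  # → 100 (n additions for input size n)
```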
3. Measuring Input Size
• The algorithm’s execution time is mostly determined by the size of the input.
• How big the input is for the algorithm.
• Example: For a list, input size = number of elements (n).
• For larger input, the algorithm runs longer.
• The complexity is independent of hardware architecture, available memory,
CPU speed, and so forth.
4. Measuring Running Time
• Actual time taken by an algorithm to execute.
• This depends on:
◦ The algorithm
◦ The size of the input
◦ The computer speed
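Actual running time can be measured roughly with a wall-clock timer (a sketch; absolute values depend on the machine, which is exactly why the notes treat complexity separately from measured time):

```python
import time

# Rough wall-clock measurement of an algorithm's running time.
def running_time(func, arg):
    start = time.perf_counter()
    func(arg)
    return time.perf_counter() - start

t_small = running_time(sorted, list(range(1_000)))
t_large = running_time(sorted, list(range(100_000)))
print(t_small, t_large)   # larger input generally takes longer
```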
5. Order of Growth
• The efficiency of the algorithm is expressed in term of input size n.
• The relationship between input size and performance of the algorithm is called
an order of growth.
• The order of increase reflects how rapidly the time required by the algorithm
rises in relation to the size of the input.
Efficiency class   Order of growth rate   Example
Constant           1                      Add two numbers; delete the first node from a linked list; find the largest element in a max heap
Logarithmic        log n                  Binary search; insert/delete an element in a binary search tree
Linear             n                      Linear search; insert a node at the end of a linked list; find the minimum/maximum element in an array
n log n            n log n                Merge sort; quicksort; heap sort
Quadratic          n²                     Selection sort; bubble sort; input a 2D array; find the maximum element in a 2D matrix
Cubic              n³                     Matrix multiplication
Exponential        2ⁿ                     Finding the power set of a set; finding the optimal solution to the knapsack problem; solving TSP using dynamic programming
Factorial          n!                     Generating the permutations of a given set; solving TSP using a brute-force approach
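The growth rates in the table can be tabulated for a few input sizes to see how quickly they diverge (an illustrative sketch):

```python
import math

# Growth-rate values from the table above, for a given input size n.
def growth(n):
    return {
        "log n":   math.log2(n),
        "n":       n,
        "n log n": n * math.log2(n),
        "n^2":     n ** 2,
        "n^3":     n ** 3,
    }

for n in (8, 64, 1024):
    g = growth(n)
    print(n, g["log n"], g["n log n"], g["n^2"])
```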
6. Cases of Complexity
Best case analysis:
• If an algorithm takes minimum time to solve the problem for a given set of input,
it is called best case time complexity.
• The best-case running time for removing the first node from the list is constant,
T(n) = O(1).
Worst-case analysis:
• If an algorithm takes maximum time to solve the problem for a given set of input,
it is called worst-case time complexity.
• If the element to be removed is at the last position in the linked list, we must
compare each node's value to the key element. In this situation, we must make
n comparisons, i.e. T(n) = O(n).
Average case analysis:
• An input sequence that is neither the best nor the worst is called the average case.
• This occurs when the node to be removed is neither at the first nor at the last
position, i.e. T(n) = O(n/2) = O(n).
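The three cases can be demonstrated with linear search, which has the same scan-until-found behavior as the linked-list example above (a sketch; the comparison counter is illustrative):

```python
# Best, worst, and average cases for linear search, as comparison counts.
def linear_search_count(a, key):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

a = [7, 3, 9, 1, 5]
print(linear_search_count(a, 7))  # best case: found first, 1 comparison
print(linear_search_count(a, 5))  # worst case: found last, n = 5 comparisons
print(linear_search_count(a, 9))  # average case: somewhere in between
```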