
Algorithms | Definition and Types of Algorithms

Definition:

An algorithm is any well-defined computational procedure that takes some value, or a set of values, as input and produces some value, or a set of values, as output. An algorithm is thus a sequence of computational steps that transform the input into the output.

An algorithm is a tool for solving a well-specified computational problem. The statement of the problem specifies, in general terms, the desired input/output relationship, and the algorithm describes a specific computational procedure for achieving that relationship. Or, more simply:

An algorithm is a step-by-step procedure that solves a particular problem within some specific amount of time and produces a correct output.
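
As a concrete illustration of this input/output view, here is a minimal sketch in Python (the function name and sample list are chosen here purely for illustration, not taken from the original text): the input is a list of numbers, the output is its largest element, and the loop is the well-defined sequence of steps that transforms one into the other.

    # Input:  a list of numbers.
    # Output: the largest number in the list.
    # Illustrative sketch only; find_maximum is a name assumed here.
    def find_maximum(values):
        largest = values[0]
        for v in values[1:]:
            if v > largest:      # keep the larger of the current best and the next value
                largest = v
        return largest

    print(find_maximum([3, 7, 2, 9, 4]))  # prints 9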

Types of Algorithms

The most common types of algorithms are:

  1. Divide and conquer approach.
  2. Dynamic programming algorithm.
  3. Greedy algorithm.

1. Divide and conquer approach:

The divide and conquer paradigm involves three steps at each level of recursion.

  1. Divide.
  2. Conquer.
  3. Combine.

DIVIDE:

The problem is divided into a number of sub-problems.

CONQUER:

The sub-problems are then solved recursively. If the sub-problem sizes are small enough, the sub-problems are simply solved in a straightforward manner.

COMBINE:

In this step, the solutions to the sub-problems are combined into the solution for the original problem.
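
Merge sort is the textbook illustration of these three steps. The Python sketch below is an illustrative implementation, not taken from the original article: it divides the list into two halves, conquers each half recursively, and combines the two sorted halves.

    def merge_sort(items):
        # Base case: a list of zero or one element is already sorted (solve directly).
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])    # DIVIDE into halves and CONQUER each half recursively
        right = merge_sort(items[mid:])
        return merge(left, right)         # COMBINE the two sorted halves

    def merge(left, right):
        # Merge two already sorted lists into one sorted list.
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 8, 1, 9, 3]))  # prints [1, 2, 3, 5, 8, 9]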

2. DYNAMIC PROGRAMMING:

Dynamic programming, like the divide and conquer approach, solves problems by combining the solutions to sub-problems. ("Programming" in this context refers to a tabular method, not to writing computer code.) Dynamic programming is applicable when the sub-problems are not independent, that is, when sub-problems share sub-sub-problems.

Dynamic programming is typically applied to optimization problems. In such problems there can be many solutions. Each solution has a value, and we wish to find a solution with the optimal (minimum or maximum) value. We call such a solution an optimal solution to a problem, as opposed to the optimal solution, since there may be several solutions that achieve the optimal value.

The development of a dynamic programming algorithm can be broken into a sequence of four steps.

  1. Characterize the structure of an optimal solution.
  2. Recursively define the value of an optimal solution.
  3. Compute the value of an optimal solution in a bottom up fashion.
  4. Construct an optimal solution from computed information.
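
The rod-cutting problem is a standard example that exercises all four steps. The Python sketch below is an illustrative bottom-up tabulation (the price table is an assumed example, not from the original text): it computes the best revenue for every rod length from the smallest up, then reconstructs an optimal set of cuts from the stored choices.

    def cut_rod(prices, n):
        # Bottom-up dynamic programming for the rod-cutting problem.
        # prices[i] is the price of a piece of length i (prices[0] is unused).
        revenue = [0] * (n + 1)       # revenue[j]: best revenue for a rod of length j
        first_cut = [0] * (n + 1)     # first_cut[j]: length of the first piece in an optimal cut
        for j in range(1, n + 1):     # solve sub-problems from smallest to largest
            best = float("-inf")
            for i in range(1, j + 1):
                if prices[i] + revenue[j - i] > best:
                    best = prices[i] + revenue[j - i]
                    first_cut[j] = i
            revenue[j] = best
        # Step 4: construct an optimal solution from the stored choices.
        cuts, remaining = [], n
        while remaining > 0:
            cuts.append(first_cut[remaining])
            remaining -= first_cut[remaining]
        return revenue[n], cuts

    prices = [0, 1, 5, 8, 9, 10, 17, 17, 20]   # assumed example price table, indexed by piece length
    print(cut_rod(prices, 8))                  # prints (22, [2, 6])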

3. GREEDY ALGORITHM:

A greedy algorithm always makes the choice that looks best at the moment. That is, it makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution. Greedy algorithms do not always yield optimal solutions, but for many problems they do. The greedy method is quite powerful and works well for a wide range of problems. A greedy algorithm obtains an optimal solution to a problem by making a sequence of choices: at each decision point, the choice that seems best at the moment is chosen. This heuristic strategy does not always produce an optimal solution, but for some problems, such as the activity-selection problem, it does.
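
For the activity-selection problem mentioned above, the greedy choice of always taking the compatible activity that finishes earliest does yield an optimal solution. The Python sketch below is an illustrative version, with the sample activities chosen here as an assumption rather than taken from the original text.

    def select_activities(activities):
        # Greedy activity selection: always pick the compatible activity
        # that finishes earliest.  'activities' is a list of (start, finish) pairs.
        selected = []
        last_finish = float("-inf")
        for start, finish in sorted(activities, key=lambda a: a[1]):  # order by finish time
            if start >= last_finish:         # compatible with the last selected activity
                selected.append((start, finish))
                last_finish = finish
        return selected

    activities = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
    print(select_activities(activities))     # prints [(1, 4), (5, 7), (8, 11)]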

The following steps are involved in designing a greedy algorithm:

  1. Determine the optimal substructure of the problem.
  2. Develop a recursive solution.
  3. Prove that at any stage of the recursion, one of the optimal choices is the greedy choice; thus, it is always safe to make the greedy choice.
  4. Show that all but one of the sub-problems induced by having made the greedy choice are empty.
  5. Develop a recursive algorithm that implements the greedy strategy.
  6. Convert the recursive algorithm to an iterative algorithm.

GREEDY CHOICE PROPERTY:

The first key ingredient is the greedy choice property: a globally optimal solution can be arrived at by making a locally optimal (greedy) choice. In other words, when we are considering which choice to make, we make the choice that looks best in the current problem, without considering results from sub-problems.

Difference between greedy algorithms and dynamic programming algorithms:

In a dynamic programming algorithm, we make a choice at each step, but the choice usually depends on the solutions to sub-problems. We typically solve dynamic programming problems in a bottom-up manner, progressing from smaller sub-problems to larger sub-problems. In a greedy algorithm, we make whichever choice seems best at the moment and then solve the sub-problem that arises after the choice is made. The choice made by a greedy algorithm may depend on the choices made so far, but it cannot depend on any future choices or on the solutions to sub-problems. Thus, unlike dynamic programming, which solves sub-problems bottom up, a greedy strategy usually progresses in a top-down fashion, making one greedy choice after another and reducing each given problem instance to a smaller one.
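
A simple way to see this difference is coin change with the coin system {1, 3, 4} and amount 6 (an assumed, illustrative instance): the greedy strategy of repeatedly taking the largest coin that fits returns 4 + 1 + 1 (three coins), while a bottom-up dynamic programming table finds 3 + 3 (two coins). The Python sketch below contrasts the two approaches.

    def greedy_coins(coins, amount):
        # Greedy strategy: repeatedly take the largest coin that still fits.
        # Locally optimal at every step, but can be globally sub-optimal.
        used = []
        for coin in sorted(coins, reverse=True):
            while amount >= coin:
                used.append(coin)
                amount -= coin
        return used

    def dp_coins(coins, amount):
        # Bottom-up dynamic programming: fewest coins for every amount up to 'amount'.
        INF = float("inf")
        best = [0] + [INF] * amount        # best[a]: fewest coins needed to make amount a
        choice = [0] * (amount + 1)        # choice[a]: a coin used in an optimal solution for a
        for a in range(1, amount + 1):
            for coin in coins:
                if coin <= a and best[a - coin] + 1 < best[a]:
                    best[a] = best[a - coin] + 1
                    choice[a] = coin
        used = []                          # reconstruct the coins from the stored choices
        while amount > 0:
            used.append(choice[amount])
            amount -= choice[amount]
        return used

    print(greedy_coins([1, 3, 4], 6))      # prints [4, 1, 1] -- three coins
    print(dp_coins([1, 3, 4], 6))          # prints [3, 3]    -- two coins (optimal)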