Compare and contrast the 5 algorithm design approaches (divide and conquer, dynamic programming, greedy approach, backtracking, and branch and bound)
Question
Compare and contrast the 5 algorithm design approaches (divide and conquer, dynamic programming, greedy approach, backtracking, and branch and bound). Explain each algorithm design approach, what kind of problems are best suited to that approach (do not just list them - explain), and how that approach is similar to and/or different from the others, and why you would want to use one versus the other. Give specific examples using any textbook material to demonstrate your arguments when comparing the algorithms. At the end, please provide references.
Please provide the solution in Java.
Explanation / Answer
Algorithm types we will consider include:
Simple recursive algorithms
Backtracking algorithms
Divide and conquer algorithms
Dynamic programming algorithms
Greedy algorithms
Branch and bound algorithms
Brute force algorithms
Randomized algorithms
1) Backtracking algorithms are based on a depth-first recursive search.
A backtracking algorithm for the map-coloring problem, which colors each country so that no two adjacent countries share a color (a Java sketch follows this outline):
If all countries have been colored (n > number of countries), return success; otherwise,
For each color c of the four colors,
If country n is not adjacent to a country that has been colored c,
Color country n with color c
Recursively color country n+1
If successful, return success
If the loop exits, return failure
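To make the outline concrete, here is a minimal Java sketch of this map-coloring backtracking search; the adjacency matrix, the color(...) and isSafe(...) methods, and the country/color numbering are illustrative assumptions rather than textbook code.

// A minimal backtracking sketch for map coloring (illustrative names and data).
public class MapColoring {

    // adjacent[i][j] is true when country i borders country j (assumed input data)
    private static final boolean[][] adjacent = {
            {false, true,  true,  false},
            {true,  false, true,  true },
            {true,  true,  false, true },
            {false, true,  true,  false}
    };
    private static final int NUM_COLORS = 4;
    private static final int[] colorOf = new int[adjacent.length]; // 0 = uncolored

    // Try to color country n and all later countries; return true on success.
    private static boolean color(int n) {
        if (n >= adjacent.length) {
            return true;                      // all countries colored
        }
        for (int c = 1; c <= NUM_COLORS; c++) {
            if (isSafe(n, c)) {
                colorOf[n] = c;               // tentative choice
                if (color(n + 1)) {
                    return true;              // choice led to a full coloring
                }
                colorOf[n] = 0;               // undo and try the next color (backtrack)
            }
        }
        return false;                         // no color works for country n
    }

    // A color is safe if no already-colored neighbor uses it.
    private static boolean isSafe(int n, int c) {
        for (int other = 0; other < adjacent.length; other++) {
            if (adjacent[n][other] && colorOf[other] == c) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        if (color(0)) {
            for (int i = 0; i < colorOf.length; i++) {
                System.out.println("Country " + i + " -> color " + colorOf[i]);
            }
        } else {
            System.out.println("No valid coloring found");
        }
    }
}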
2) Divide and conquer algorithms:
A divide and conquer algorithm consists of two parts:
- Divide the problem into smaller subproblems of the same kind and solve them recursively
- Combine the solutions to the subproblems into a solution to the original problem
Examples:
Quicksort: partition the array into two parts around a pivot, then sort each part recursively (the combine step is trivial)
Mergesort: cut the array in half and sort each half recursively, then merge the two sorted halves
Example program for QuickSort in Java:

package com.java2novice.sorting;

public class MyQuickSort {

    private int array[];
    private int length;

    public void sort(int[] inputArr) {
        if (inputArr == null || inputArr.length == 0) {
            return;
        }
        this.array = inputArr;
        length = inputArr.length;
        quickSort(0, length - 1);
    }

    private void quickSort(int lowerIndex, int higherIndex) {
        int i = lowerIndex;
        int j = higherIndex;
        // calculate the pivot; here the pivot is the middle element of the range
        int pivot = array[lowerIndex + (higherIndex - lowerIndex) / 2];
        // divide the range into two parts
        while (i <= j) {
            /**
             * In each iteration, find a number on the left side that is
             * greater than the pivot value and a number on the right side
             * that is less than the pivot value, then exchange them.
             */
            while (array[i] < pivot) {
                i++;
            }
            while (array[j] > pivot) {
                j--;
            }
            if (i <= j) {
                exchangeNumbers(i, j);
                // move the indices to the next position on both sides
                i++;
                j--;
            }
        }
        // call quickSort() recursively on each part
        if (lowerIndex < j)
            quickSort(lowerIndex, j);
        if (i < higherIndex)
            quickSort(i, higherIndex);
    }

    private void exchangeNumbers(int i, int j) {
        int temp = array[i];
        array[i] = array[j];
        array[j] = temp;
    }

    public static void main(String a[]) {
        MyQuickSort sorter = new MyQuickSort();
        int[] input = {24, 2, 45, 20, 56, 75, 2, 56, 99, 53, 12};
        sorter.sort(input);
        for (int i : input) {
            System.out.print(i);
            System.out.print(" ");
        }
    }
}

Output: 2 2 12 20 24 45 53 56 56 75 99
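Mergesort, listed above as the other classic divide and conquer example, shows the pattern even more plainly: both recursive calls do real work and the combine step (the merge) does the rest. Here is a minimal sketch, assuming a simple top-down formulation; the class and method names are mine, not from a textbook.

import java.util.Arrays;

public class MyMergeSort {

    // Sort arr[lo..hi] by sorting each half and merging the results.
    public static void mergeSort(int[] arr, int lo, int hi) {
        if (lo >= hi) {
            return;                       // zero or one element: already sorted
        }
        int mid = lo + (hi - lo) / 2;
        mergeSort(arr, lo, mid);          // conquer the left half
        mergeSort(arr, mid + 1, hi);      // conquer the right half
        merge(arr, lo, mid, hi);          // combine the two sorted halves
    }

    // Merge the sorted ranges arr[lo..mid] and arr[mid+1..hi].
    private static void merge(int[] arr, int lo, int mid, int hi) {
        int[] left = Arrays.copyOfRange(arr, lo, mid + 1);
        int[] right = Arrays.copyOfRange(arr, mid + 1, hi + 1);
        int i = 0, j = 0, k = lo;
        while (i < left.length && j < right.length) {
            arr[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length)  arr[k++] = left[i++];
        while (j < right.length) arr[k++] = right[j++];
    }

    public static void main(String[] args) {
        int[] input = {24, 2, 45, 20, 56, 75, 2, 56, 99, 53, 12};
        mergeSort(input, 0, input.length - 1);
        System.out.println(Arrays.toString(input));
    }
}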
Binary tree lookup:
- Here's how to look up something in a sorted binary tree:
- Compare the key to the value in the root
- If the two values are equal, report success
- If the key is less, search the left subtree
- If the key is greater, search the right subtree
This is not a divide and conquer algorithm because, although there are two recursive calls, only one is used at each level of the recursion
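A short sketch makes that point concrete: two recursive calls appear in the code, but only one of them executes at each level. The Node class and contains(...) method below are assumed, minimal definitions for illustration.

public class BstLookup {

    // Minimal binary search tree node (assumed structure for illustration).
    static class Node {
        int value;
        Node left, right;
        Node(int value, Node left, Node right) {
            this.value = value;
            this.left = left;
            this.right = right;
        }
    }

    // Two recursive calls appear below, but exactly one is taken per level,
    // so this is a simple recursive search, not divide and conquer.
    static boolean contains(Node root, int key) {
        if (root == null) {
            return false;                       // ran off the tree: not found
        }
        if (key == root.value) {
            return true;                        // found it
        } else if (key < root.value) {
            return contains(root.left, key);    // search only the left subtree
        } else {
            return contains(root.right, key);   // search only the right subtree
        }
    }

    public static void main(String[] args) {
        //        5
        //       / \
        //      3   8
        Node root = new Node(5, new Node(3, null, null), new Node(8, null, null));
        System.out.println(contains(root, 8));  // true
        System.out.println(contains(root, 7));  // false
    }
}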
3) Dynamic programming algorithms:
A dynamic programming algorithm remembers past results and uses them to find new results
Dynamic programming is generally used for optimization problems
Multiple solutions exist, need to find the “best” one
Requires “optimal substructure” and “overlapping subproblems”
Optimal substructure: Optimal solution contains optimal solutions to subproblems
Overlapping subproblems: Solutions to subproblems can be stored and reused in a bottom-up fashion
This differs from Divide and Conquer, where subproblems generally need not overlap
Fibonacci numbers: To find the nth Fibonacci number:
If n is zero or one, return one; otherwise,
Compute, or look up in a table, fibonacci(n-1) and fibonacci(n-2)
Find the sum of these two numbers
Store the result in a table and return it
Since finding the nth Fibonacci number involves finding all smaller Fibonacci numbers, the second recursive call has little work to do
The table may be preserved and used again later
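A minimal sketch of this memoized (top-down dynamic programming) Fibonacci, assuming the convention fib(0) = fib(1) = 1 used in the outline above; the memo array and method names are my own.

public class FibonacciDP {

    // memo[n] holds fibonacci(n) once computed; 0 means "not computed yet"
    private static long[] memo;

    // Top-down dynamic programming (memoization), matching the outline above.
    static long fibonacci(int n) {
        if (n <= 1) {
            return 1;                       // base cases: fib(0) = fib(1) = 1
        }
        if (memo[n] != 0) {
            return memo[n];                 // already in the table: just look it up
        }
        long result = fibonacci(n - 1) + fibonacci(n - 2);
        memo[n] = result;                   // store the result for later reuse
        return result;
    }

    public static void main(String[] args) {
        int n = 40;
        memo = new long[n + 1];             // the table can be kept and reused later
        System.out.println("fibonacci(" + n + ") = " + fibonacci(n));
    }
}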
4) Greedy algorithms:
An optimization problem is one in which you want to find, not just a solution, but the best solution
A “greedy algorithm” sometimes works well for optimization problems
A greedy algorithm works in phases. At each phase:
You take the best you can get right now, without regard for future consequences
You hope that by choosing a local optimum at each step, you will end up at a global optimum
Example: Counting money
Suppose you want to count out a certain amount of money, using the fewest possible bills and coins
A greedy algorithm for this would be:
At each step, take the largest possible bill or coin that does not overshoot
Example: To make $6.39, you can choose:
a $5 bill
a $1 bill, to make $6
a 25¢ coin, to make $6.25
a 10¢ coin, to make $6.35
four 1¢ coins, to make $6.39
For US money, the greedy algorithm always gives the optimum solution
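A minimal sketch of this greedy change-counting, assuming US denominations and working in cents to avoid floating-point rounding; the denomination table and names below are illustrative.

public class GreedyChange {

    public static void main(String[] args) {
        // US denominations in cents, largest first (assumed set for this example)
        int[] denominations = {2000, 1000, 500, 100, 25, 10, 5, 1};
        String[] names = {"$20 bill", "$10 bill", "$5 bill", "$1 bill",
                          "25¢ coin", "10¢ coin", "5¢ coin", "1¢ coin"};

        int amount = 639; // $6.39 in cents
        for (int i = 0; i < denominations.length; i++) {
            // Greedy step: take as many of the largest denomination as fits
            int count = amount / denominations[i];
            if (count > 0) {
                System.out.println(count + " x " + names[i]);
                amount -= count * denominations[i];
            }
        }
        // For US denominations this always uses the fewest bills and coins;
        // for arbitrary denominations the greedy choice can be suboptimal.
    }
}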
5) Branch and bound algorithms:
Branch and bound algorithms are generally used for optimization problems
As the algorithm progresses, a tree of subproblems is formed
The original problem is considered the “root problem”
A method is used to construct an upper and lower bound for a given problem
At each node, apply the bounding methods
If the bounds match, it is deemed a feasible solution to that particular subproblem
If the bounds do not match, partition the problem represented by that node and make the two subproblems into child nodes
Continue, using the best known feasible solution to trim sections of the tree, until all nodes have been solved or trimmed
Example branch and bound algorithm: Traveling salesman problem: a salesman has to visit each of n cities at least once and wants to minimize the total distance traveled
Consider the root problem to be the problem of finding the shortest route through a set of cities visiting each city once
Split the node into two child problems:
Shortest route visiting city A first
Shortest route not visiting city A first
Continue subdividing similarly as the tree grows
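A compact sketch of branch and bound on a tiny traveling salesman instance, assuming each city is visited exactly once; the 4-city distance matrix is made-up data, and the lower bound used here (cost so far plus the cheapest edge leaving every city that still has to be left) is deliberately simple rather than the strongest bound a textbook would use.

public class TspBranchAndBound {

    // Symmetric distance matrix for 4 cities (illustrative data only).
    static final int[][] dist = {
            {0, 10, 15, 20},
            {10, 0, 35, 25},
            {15, 35, 0, 30},
            {20, 25, 30, 0}
    };
    static final int n = dist.length;
    static int best = Integer.MAX_VALUE;   // cost of the best complete tour found so far

    // Extend a partial tour that ends at 'current' with 'costSoFar' spent.
    static void branch(int current, boolean[] visited, int visitedCount, int costSoFar) {
        if (visitedCount == n) {
            // Complete tour: close it by returning to the start city 0.
            best = Math.min(best, costSoFar + dist[current][0]);
            return;
        }
        // Bound: costSoFar plus the cheapest edge leaving each city we still
        // must leave (the current city and every unvisited city). If even this
        // optimistic estimate cannot beat the best known tour, trim this node.
        int bound = costSoFar;
        for (int city = 0; city < n; city++) {
            if (city == current || !visited[city]) {
                int cheapest = Integer.MAX_VALUE;
                for (int other = 0; other < n; other++) {
                    if (other != city) {
                        cheapest = Math.min(cheapest, dist[city][other]);
                    }
                }
                bound += cheapest;
            }
        }
        if (bound >= best) {
            return;                        // trimmed: this subtree cannot improve
        }
        // Branch: try each unvisited city as the next stop.
        for (int next = 0; next < n; next++) {
            if (!visited[next]) {
                visited[next] = true;
                branch(next, visited, visitedCount + 1, costSoFar + dist[current][next]);
                visited[next] = false;     // undo for the next branch
            }
        }
    }

    public static void main(String[] args) {
        boolean[] visited = new boolean[n];
        visited[0] = true;                 // start every tour at city 0
        branch(0, visited, 1, 0);
        System.out.println("Shortest tour length: " + best);  // 80 for this matrix
    }
}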