Welcome to Design and Analysis of Algorithms. Today's lecture is an overview of the course: I want to introduce you to the main problems in the course and also convey its spirit. Let us start with the familiar question: given a certain problem, how do you solve it on a computer?
For many problems it is relatively easy to design algorithms that will somehow solve them. However, quite some cleverness is needed to design algorithms that are also fast, that is, algorithms which give answers very quickly.
This will be the major challenge in the course. Our main course goal, as I said, is the design of fast algorithms. As you may realize, designing anything, in computers or elsewhere, is an art, so in some sense you have to be creative and it cannot be taught. But in another sense there are also some very well-defined design techniques developed for this purpose, and the goal of this course is to study these techniques.
Basic Approach to Design Algorithm
Analytical
Build a mathematical model of a computer
Study properties of the algorithm on this model
Most important: reason about the algorithm, i.e., mathematically prove its properties, such as facts about the time taken
Overview of Course
Basic Framework
Fast?
Techniques for fast algorithms, Optimization, Graph theory
NP Completeness Theory
I want to give an overview of the course. In the next few lectures we will develop the basic framework: we will define the mathematical model and say what "fast" means, that is, what I mean when I say fast algorithm. Then we will embark on a fairly long stretch which involves techniques for designing fast algorithms. In doing this we will also be surveying many problems: we will look at problems from optimization, graph theory, some problems from geometry, and also some others. It will turn out that there are some problems which do not quite respond to our algorithm design techniques. Of course we will see many problems for which we can design really good algorithms, where the techniques work beautifully, but there are also some problems for which our techniques do not work so well. For these problems a fairly intricate theory has developed over the last 10, 20, 30 years, the so-called theory of NP-completeness, so we will be studying this theory as well.
Algorithm types we will consider include:
- Simple recursive algorithms
- Backtracking algorithms
- Divide and conquer algorithms
- Dynamic programming algorithms
- Greedy algorithms
- Branch and bound algorithms
- Brute force algorithms
- Randomized algorithms
Optimization problems
An optimization problem is one in which you want to find not just a solution, but the best solution.
Greedy Algorithms
A greedy algorithm works in phases. At each phase:
- You take the best you can get right now, without regard for future consequences
- You hope that by choosing a local optimum at each step, you will end up at a global optimum
The Greedy Method Technique
The greedy method is a general algorithm design paradigm, built on the following elements:
- configurations: different choices, collections, or values to find
- objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property:
- a globally-optimal solution can always be found by a series of local improvements from a starting configuration.
Let's see some examples where we can use a greedy algorithm.
1.Making Change/Counting money
Problem: A dollar amount to reach and a collection of coin amounts to use to get there.
Configuration: A dollar amount yet to be returned to the customer, plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
- Suppose you want to count out a certain amount of money, using the fewest possible bills and coins
- A greedy algorithm for this would be:
At each step, take the largest possible bill or coin that does not overshoot
Example: To make $6.39, you can choose:
- a $5 bill
- a $1 bill, to make $6
- a 25¢ coin, to make $6.25
- a 10¢ coin, to make $6.35
- four 1¢ coins, to make $6.39
- For US money, the greedy algorithm always gives the optimum solution
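The largest-coin rule above can be sketched in a few lines of Python. This is an illustrative sketch: amounts are in cents, and the function name and denomination list are my own choices, not part of the original notes.

```python
# Greedy change-making: always take the largest coin/bill that fits.
# Amounts are in cents; the denomination list below is US currency.
def greedy_change(amount, denominations):
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

us = [500, 100, 25, 10, 5, 1]   # $5, $1, 25c, 10c, 5c, 1c
print(greedy_change(639, us))    # [500, 100, 25, 10, 1, 1, 1, 1]
```

For $6.39 this reproduces the choices listed above: a $5 bill, a $1 bill, a 25¢ coin, a 10¢ coin, and four 1¢ coins.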
A failure of the greedy algorithm
In some (fictional) monetary system, “krons” come in 1 kron, 7 kron, and 10 kron coins
- Using a greedy algorithm to count out 15 krons, you would get
- A 10 kron piece
- Five 1 kron pieces, for a total of 15 krons
- This requires six coins
- A better solution would be to use two 7 kron pieces and one 1 kron piece
- This only requires three coins
- The greedy algorithm results in a solution, but not in an optimal solution
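A short sketch makes the failure concrete: the same largest-coin greedy as before, compared against a dynamic-programming baseline that is guaranteed to find the fewest coins. The function names and structure are illustrative.

```python
# Greedy vs. optimal change in the fictional kron system (1, 7, 10).
def greedy_change(amount, denominations):
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

def min_coins(amount, denominations):
    # Dynamic-programming baseline: fewest coins for each value up to amount.
    INF = float("inf")
    best = [0] + [INF] * amount
    for v in range(1, amount + 1):
        for d in denominations:
            if d <= v and best[v - d] + 1 < best[v]:
                best[v] = best[v - d] + 1
    return best[amount]

krons = [1, 7, 10]
print(len(greedy_change(15, krons)))  # 6 coins: one 10 and five 1s
print(min_coins(15, krons))           # 3 coins: 7 + 7 + 1
```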
2.A scheduling problem
You have to run nine jobs, with running times of 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes
You have three processors on which you can run these jobs
You decide to do the longest-running jobs first, on whatever processor is available
P1: 20 10 3
P2: 18 11 6
P3: 15 14 5
Time to completion: 18 + 11 + 6 = 35 minutes
This solution isn’t bad, but we might be able to do better
Another approach
What would be the result if you ran the shortest job first?
Again, the running times are 3, 5, 6, 10, 11, 14, 15, 18, and 20 minutes
P1: 3 10 15
P2: 5 11 18
P3: 6 14 20
That wasn’t such a good idea; time to completion is now
6 + 14 + 20 = 40 minutes
Note, however, that the greedy algorithm itself is fast
All we had to do at each stage was pick the minimum or maximum
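Both orderings can be simulated in a few lines, assuming "whichever processor is available" means assigning each job to the currently least-loaded processor (one natural reading of the text; the names are illustrative).

```python
import heapq

def makespan(jobs, num_processors):
    # Assign each job, in the given order, to the currently least-loaded
    # processor, and return the time at which the last job finishes.
    loads = [0] * num_processors
    heapq.heapify(loads)
    for t in jobs:
        lightest = heapq.heappop(loads)
        heapq.heappush(loads, lightest + t)
    return max(loads)

jobs = [3, 5, 6, 10, 11, 14, 15, 18, 20]
print(makespan(sorted(jobs, reverse=True), 3))  # longest-first: 35
print(makespan(sorted(jobs), 3))                # shortest-first: 40
```

This reproduces the two completion times above: 35 minutes for longest-first and 40 minutes for shortest-first.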
An optimum solution
Better solutions do exist:
P1: 20 14
P2: 18 11 5
P3: 15 10 6 3
This solution is clearly optimal (why?)
Clearly, there are other optimal solutions (why?)
How do we find such a solution?
One way: Try all possible assignments of jobs to processors
Unfortunately, this approach can take exponential time
Huffman encoding
- The Huffman encoding algorithm is a greedy algorithm
- You always pick the two smallest numbers to combine
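A minimal sketch of just this greedy step, tracking only the total weighted cost of the resulting code rather than building the tree itself; the frequency list is an illustrative example, not from the notes.

```python
import heapq

def huffman_cost(freqs):
    # Greedy step: repeatedly merge the two smallest weights.
    # Returns the total weighted path length of the resulting code.
    heap = list(freqs)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b
        heapq.heappush(heap, a + b)
    return total

print(huffman_cost([5, 9, 12, 13, 16, 45]))  # 224
```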
Minimum spanning tree
A minimum spanning tree is a least-cost subset of the edges of a graph that connects all the nodes
- Start by picking any node and adding it to the tree
- Repeatedly: Pick any least-cost edge from a node in the tree to a node not in the tree, and add the edge and new node to the tree
- Stop when all nodes have been added to the tree
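The three steps above can be sketched as follows (this is essentially Prim's algorithm; the small example graph is illustrative).

```python
import heapq

def prim_mst_cost(graph, start):
    # graph: {node: [(weight, neighbor), ...]}. Grows the tree one
    # least-cost crossing edge at a time, as described above.
    visited = {start}
    edges = list(graph[start])
    heapq.heapify(edges)
    cost = 0
    while edges and len(visited) < len(graph):
        w, v = heapq.heappop(edges)
        if v in visited:
            continue
        visited.add(v)
        cost += w
        for e in graph[v]:
            heapq.heappush(edges, e)
    return cost

g = {
    "A": [(1, "B"), (4, "C")],
    "B": [(1, "A"), (2, "C"), (6, "D")],
    "C": [(4, "A"), (2, "B"), (3, "D")],
    "D": [(6, "B"), (3, "C")],
}
print(prim_mst_cost(g, "A"))  # edges of cost 1, 2, 3 -> total 6
```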
Traveling salesman
A salesman must visit every city (starting from city A), and wants to cover the least possible distance
- He can revisit a city (and reuse a road) if necessary
- He does this by using a greedy algorithm: He goes to the next nearest city from wherever he is
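A minimal nearest-neighbor sketch, simplified so that each city is visited exactly once over a small hypothetical distance matrix (so revisiting never comes up); city 0 plays the role of city A.

```python
def nearest_neighbor_tour(dist, start):
    # dist: symmetric matrix of pairwise distances (a hypothetical map).
    # Greedy: from the current city, always go to the nearest unvisited
    # city, then return to the start at the end.
    n = len(dist)
    tour = [start]
    unvisited = set(range(n)) - {start}
    total = 0
    while unvisited:
        here = tour[-1]
        nxt = min(unvisited, key=lambda c: dist[here][c])
        total += dist[here][nxt]
        tour.append(nxt)
        unvisited.remove(nxt)
    total += dist[tour[-1]][start]  # return home
    return tour, total

d = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]
print(nearest_neighbor_tour(d, 0))  # ([0, 1, 3, 2], 23)
```

As with the kron example, this heuristic gives a valid tour but not necessarily the shortest one.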
Analysis
A greedy algorithm typically makes (approximately) n choices for a problem of size n
- (The first or last choice may be forced)
Hence the running time is typically:
O(n * choice(n)), where choice(n) is the cost of making one choice among n objects
- Counting: Must find the largest usable coin from among k sizes of coin (k is a constant), an O(k) = O(1) operation; therefore, coin counting is O(n)
- Huffman: Must sort n values before making n choices
- Therefore, Huffman is O(n log n) + O(n) = O(n log n)
- Minimum spanning tree: At each new node, must include new edges and keep them sorted, which is O(n log n) overall. Therefore, MST is O(n log n) + O(n) = O(n log n)
Other greedy algorithms
Dijkstra’s algorithm for finding the shortest path in a graph
- Always takes the shortest edge connecting a known node to an unknown node
Kruskal’s algorithm for finding a minimum-cost spanning tree
- Always tries the lowest-cost remaining edge
Prim’s algorithm for finding a minimum-cost spanning tree
- Always takes the lowest-cost edge between nodes in the spanning tree and nodes not yet in the spanning tree
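Kruskal's rule of "always try the lowest-cost remaining edge" can be sketched with a small union-find structure; the edge list here is an illustrative example.

```python
def kruskal_mst_cost(n, edges):
    # edges: list of (weight, u, v); nodes are 0..n-1.
    # Greedy: consider edges in increasing cost, keeping one only if it
    # connects two different components (checked with union-find).
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    cost = 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            cost += w
    return cost

edges = [(1, 0, 1), (2, 1, 2), (4, 0, 2), (3, 2, 3), (6, 1, 3)]
print(kruskal_mst_cost(4, edges))  # keeps edges of cost 1, 2, 3 -> total 6
```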