chadsxe

Can someone help me better understand the greedy method....


I have read that it is useful in determining the suitability of an algorithm for a programming problem. I think it narrows down a useful range of values or solutions that makes the algorithm effective. Am I way off on this...

"Greedy method" is usually used to describe a family of algorithms (greedy algorithms; then again, if this isn't what you're talking about, I have no clue). The basic concept of these algorithms is that you are given a dataset and you need to find x. Finding the perfect x, though, can take a REALLY long time, so with the data we know, we pick the value that looks like the best one in our dataset (the pick is only random if there's a tie). We then take that value and put it into our solution set. We keep doing this until the solution set is filled (we have what we think x is) or the dataset is exhausted. Note that at the end we may not have found the best x out there, but most likely we've found a "good enough" one.

If you want an example, read about minimum spanning trees.
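To make that "grab whatever looks best right now" loop concrete, here is a rough Python sketch of Kruskal's algorithm, one classic greedy way of building a minimum spanning tree (the kruskal_mst name and the little edge list at the end are just made up for illustration):

# Kruskal's algorithm: repeatedly take the cheapest remaining edge that
# doesn't create a cycle -- a textbook greedy strategy.
def kruskal_mst(num_vertices, edges):
    """edges is a list of (weight, u, v) tuples; vertices are numbered 0..num_vertices-1."""
    parent = list(range(num_vertices))      # union-find structure used to detect cycles

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving keeps the trees shallow
            x = parent[x]
        return x

    mst = []
    for weight, u, v in sorted(edges):      # greedy step: cheapest edges first
        root_u, root_v = find(u), find(v)
        if root_u != root_v:                # keep the edge only if it joins two components
            parent[root_u] = root_v
            mst.append((u, v, weight))
    return mst

# Tiny made-up example: a square 0-1-2-3 with one diagonal 0-2
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0), (5, 0, 2)]
print(kruskal_mst(4, edges))                # -> [(0, 1, 1), (1, 2, 2), (2, 3, 3)]

Each pass just commits to the cheapest edge that doesn't close a cycle, which is exactly the "pick the best-looking value and move it into the solution set" loop described above.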

Quote:
Original post by TheFez6255
"Greedy method" is usually used to describe a family of algorithms [...] If you want an example, read about minimum spanning trees.


Here is another source about greedy algorithms

The fundamental attribute of any greedy algorithm -- the part that makes it a "greedy" algorithm, as opposed to some other kind -- is that at each step, you make the choice that looks best right then. Initially, this may seem like a "d'uh!" idea: why would we ever pick anything other than the best option available to us? In many problem scenarios, however, a locally worse choice winds up being an excellent idea. One example is chess, where sacrificing a powerful piece is generally a foolish move; in certain scenarios, though, it's advantageous to sacrifice a powerful piece to bait your opponent into a position that is extremely advantageous for you.

You may have experienced this in real life too: perhaps you decided to take a shortcut through some back roads one day to avoid rush-hour traffic, only to discover that the bridge is washed out, so that your total trip wound up taking longer than if you had stuck to the traffic-jammed highway. What seemed to be a good idea on the local scale was a poor choice on the global scale.

In other cases, the greedy algorithm does work very well. The canonical example is making change with coins: with standard denominations like US coins, if at each step you pick the largest coin that still fits in the remaining amount, you will make change using the optimal (fewest) number of coins.
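For instance, a quick Python sketch of that idea (the greedy_change name and the 25/10/5/1 US-style denominations are just for illustration):

def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Make change greedily: always grab the largest coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):   # consider the biggest coins first
        while amount >= coin:
            coins.append(coin)
            amount -= coin
    return coins

print(greedy_change(68))   # -> [25, 25, 10, 5, 1, 1, 1], i.e. 7 coins, which is optimal here

Note that this only stays optimal for "nice" coin systems like the US one; with made-up denominations such as 1, 3 and 4, greedy would make 6 as 4+1+1 even though 3+3 uses fewer coins.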

In particular, the greedy algorithm is optimal in problem scenarios where the choices available to you are independent of the choices you've already made (the converse doesn't hold, though: a greedy algorithm being optimal doesn't mean the steps were independent). Intuitively, this makes sense: if your options don't depend on previous actions, and you always pick the best option available, then you'll wind up with the best solution. The making-change example bears this out: you always get to select from the same set of denominations regardless of your previous choices.

hope that helps,

Quote:
Original post by kSquared

hope that helps,


Wow.....that was a thing of beauty. Life would be so much simpler if everything was explained that clearly. As of now I have no more questions, but I am sure something will pop up soon.

Thanks everyone...

Chad

Quote:
Original post by TheFez6255

If you want an example, read about minimum spanning trees.


Awesome....Correct me if I am wrong, but a minimum spanning tree is a subtree that shares the same vertices as the original graph, correct?

Quote:
Original post by chadsxe
Awesome....Correct me if I am wrong, but a minimum spanning tree is a subtree that shares the same vertices as the original graph, correct?

No, that's just a spanning tree in general. A minimum spanning tree has the additional property that no other spanning tree of the graph has a smaller total edge weight.

Illco

The point was that computing minimum spanning trees is a paradigm problem often used to demonstrate greedy algorithms. So don't spend too long reading up on MSTs unless the article eventually gets to greedy strategies [rolleyes].

Admiral

Greedy algorithms are also useful for speeding up branch-and-bound solutions to optimization problems. In branch-and-bound you recursively look for the best solution, trying out all possible paths that could lead you there (at each step you try all of the possible choices), EXCEPT for those that you KNOW cannot lead to a better solution than one you have already found in a previous recursive branch.

Suppose you're looking for the shortest path from A to B in a graph with non-negative edge weights. Once you've found a path of, say, length 10, you know that any partial path that grows longer than 10 before it even reaches B cannot possibly improve on the solution you've already found, so you don't have to explore it any further.

Branch-and-bound usually does quite a good job of optimizing the average run time, but sometimes it just isn't enough by itself (the worst-case complexity unfortunately stays the same). It turns out that in some cases, running a quick greedy pass first to set an initial upper bound for the solution (provided the greedy result is close enough to the true optimum on average) can significantly improve the run time of the main branch-and-bound search. In other words, by using a greedy algorithm before branch-and-bound, you quickly cut down your "list of choices" and limit your search to where it will matter most.
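Just to sketch the idea in Python (this is my own toy illustration of a shortest-path search, not code from any particular source): a quick greedy walk sets the initial upper bound, and the recursive search then prunes any partial path that already costs at least that much:

import math

# graph is an adjacency dict: graph[u] = list of (v, edge_weight), weights non-negative
def greedy_path_cost(graph, start, goal):
    """Greedy walk: always follow the cheapest outgoing edge.
    May fail to reach the goal, in which case it returns infinity (no useful bound)."""
    cost, node, visited = 0, start, {start}
    while node != goal:
        choices = [(w, v) for v, w in graph.get(node, []) if v not in visited]
        if not choices:
            return math.inf
        w, node = min(choices)              # greedy: take the cheapest edge out of here
        visited.add(node)
        cost += w
    return cost

def branch_and_bound_shortest(graph, start, goal):
    best = greedy_path_cost(graph, start, goal)   # greedy pass sets the initial upper bound

    def search(node, cost_so_far, visited):
        nonlocal best
        if cost_so_far >= best:             # bound: this branch can't beat what we already have
            return
        if node == goal:                    # reached the goal with a better cost
            best = cost_so_far
            return
        for v, w in graph.get(node, []):
            if v not in visited:
                search(v, cost_so_far + w, visited | {v})

    search(start, 0, {start})
    return best

graph = {
    'A': [('B', 1), ('C', 4)],
    'B': [('C', 1), ('D', 7)],
    'C': [('D', 2)],
    'D': [],
}
print(branch_and_bound_shortest(graph, 'A', 'D'))   # -> 4  (A -> B -> C -> D)

In this toy graph the greedy walk already happens to find the optimum (cost 4), so the recursive search only has to confirm it; when the greedy bound is merely close, it still cuts off a lot of branches early.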

Hope that was helpful :)

[Edited by - Devilogic on September 7, 2006 3:22:59 AM]

