Give a linear-time algorithm to test whether a tree has a perfect matching, that is, a set of edges that touches each vertex of the tree exactly once.
This is from Algorithms by S. Dasgupta, and I just can't seem to nail this problem down. I know I need to use a greedy approach in some manner, but I just can't figure this out. Help?
Pseudocode is fine; once I have the idea, I can implement in any language trivially.
The algorithm has to run in linear time in the size of the input; O(V + E) is fine.
I think I have the solution. Since we know the graph is a tree, we know of the existence of leaf nodes, nodes with exactly one incident edge. For a leaf to be covered by the perfect matching, its one edge MUST be in the final solution.
Ergo, we can repeatedly take a leaf, add its unique edge to the solution, and remove both endpoints (and their incident edges) from the tree. Deleting the matched vertices can expose new leaves, so we keep repeating. Note that if two leaves share the same neighbor, only one of them can be matched, and the other is stranded. If, at the end of this process, any node is left untouched, there exists no perfect matching.
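Here's a minimal Python sketch of that greedy idea (the function name, the adjacency-set representation, and the 0..n-1 vertex labeling are my own choices, not from the book):

```python
from collections import deque

def tree_perfect_matching(n, edges):
    """Return a list of matched edges if the tree on vertices 0..n-1
    has a perfect matching, or None if it does not.
    `edges` is a list of (u, v) pairs; the input is assumed to be a tree."""
    if n % 2 != 0:
        return None  # an odd number of vertices can never be perfectly matched

    adj = [set() for _ in range(n)]
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    # Queue of current leaves (degree-1 vertices).
    leaves = deque(v for v in range(n) if len(adj[v]) == 1)
    matched = [False] * n
    matching = []

    while leaves:
        leaf = leaves.popleft()
        if matched[leaf] or not adj[leaf]:
            continue  # already matched, or stranded with no edge left
        parent = next(iter(adj[leaf]))
        if matched[parent]:
            return None  # the leaf's only edge is already taken
        matching.append((leaf, parent))
        matched[leaf] = matched[parent] = True
        # Delete both endpoints; neighbors of parent may become new leaves.
        adj[leaf].clear()
        for w in adj[parent]:
            adj[w].discard(parent)
            if len(adj[w]) == 1 and not matched[w]:
                leaves.append(w)
        adj[parent].clear()

    return matching if all(matched) else None
```

Each vertex enters the queue a constant number of times and each edge is touched a constant number of times, so this runs in O(V + E), which for a tree is just O(V).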