Branch-and-cut methods are highly successful techniques for solving a wide variety of integer programming problems, and they can provide a guarantee of optimality. In this paper, we motivate the idea behind branch-and-cut algorithms with a simple example, present the general framework of the algorithms, and analyze whether the algorithms converge.