Suppose that f is a continuous function on the interval [a, b] with f(a)f(b) < 0. By the intermediate value theorem, f has at least one zero in [a, b]. We compute the midpoint c = (a + b)/2 and evaluate f(c). If f(c) = 0, then c is a root and we are done. Otherwise, either f(a)f(c) < 0 or f(b)f(c) < 0. In the former case a root lies in [a, c], so we rename c as b and repeat the process; in the latter case a root lies in [c, b], so we rename c as a and repeat. Either way, the root is now bracketed in an interval whose length is half that of the original. The process is repeated, and we stop the iteration when f(c) is very nearly zero or the interval [a, b] has become sufficiently small.
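The procedure above can be sketched in Python as follows; the function name `bisect`, the tolerance `tol`, and the iteration cap `max_iter` are illustrative choices rather than part of the original description.

```python
import math

def bisect(f, a, b, tol=1e-10, max_iter=100):
    """Find a root of f in [a, b] by repeated interval halving.

    Assumes f is continuous on [a, b] and f(a), f(b) have opposite signs.
    """
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        c = (a + b) / 2.0
        fc = f(c)
        # Stop when f(c) is nearly zero or the bracket is very small.
        if abs(fc) < tol or (b - a) / 2.0 < tol:
            return c
        if f(a) * fc < 0:
            b = c  # a root lies in [a, c]: rename c as b
        else:
            a = c  # a root lies in [c, b]: rename c as a
    return (a + b) / 2.0

# Example: the root of x^2 - 2 on [1, 2] approximates sqrt(2).
root = bisect(lambda x: x * x - 2.0, 1.0, 2.0)
```

Since the bracket halves on every step, the error after n iterations is at most (b - a)/2^n, so the method converges for any valid starting bracket, if only linearly.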