In this paper we propose to detect lane and pavement boundaries jointly by fusing information from optical and radar images acquired with a sensor system mounted on top of the host vehicle. The boundaries are described by concentric circular models, and the optical and radar imaging processes are modeled by Gaussian and log-normal probability densities, respectively. The multisensor boundary detection problem is posed in a Bayesian framework, and a maximum a posteriori (MAP) estimate is employed to locate the lane and pavement boundaries. Because the circular-model parameters have compatible units and are of the same order of magnitude, the estimation problem is much better conditioned than with the previously used parabolic models, whose parameters are mutually incompatible. The fusion algorithm achieves analytical integration of the two sensing modalities because the likelihoods that describe the optical and radar imaging processes have been validated. Experimental results show that the fusion algorithm outperforms single-sensor boundary detection algorithms in a variety of road scenarios.
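To make the fusion idea concrete, the following is a minimal illustrative sketch (not the paper's actual algorithm) of a MAP estimate that combines a Gaussian likelihood for an optical measurement with a log-normal likelihood for a radar measurement of a single hypothetical boundary parameter, a circle radius `r`. All data values, noise parameters, and the grid search are assumptions for illustration; the paper's models operate on full images and multiple concentric-circle parameters.

```python
# Toy MAP fusion over one boundary parameter r (circle radius).
# Assumptions (not from the paper): scalar pseudo-measurements,
# fixed noise scales, a flat prior, and a grid search.
import numpy as np

def log_gaussian(x, mu, sigma):
    # Gaussian log-density (stands in for the optical imaging model)
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def log_lognormal(x, mu, sigma):
    # Log-normal log-density (stands in for the radar imaging model)
    return (-np.log(x * sigma * np.sqrt(2 * np.pi))
            - (np.log(x) - mu)**2 / (2 * sigma**2))

def map_radius(optical_obs, radar_obs, radii):
    # Joint log-posterior over candidate radii; with a flat prior,
    # the MAP estimate maximizes the sum of the two log-likelihoods.
    scores = [log_gaussian(optical_obs, r, 1.0)
              + log_lognormal(radar_obs, np.log(r), 0.3)
              for r in radii]
    return radii[int(np.argmax(scores))]

radii = np.linspace(50.0, 200.0, 301)        # candidate radii (meters, say)
r_hat = map_radius(optical_obs=120.0, radar_obs=118.0, radii=radii)
```

Because both likelihood terms are expressed over the same parameter `r`, the two modalities combine by simple addition of log-likelihoods, which is the sense in which a validated probabilistic model of each sensor enables analytical fusion.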