A recently published idea is to use the A* algorithm to optimize the topology of neural networks. In this paper, we investigate optimization techniques that combine the A* algorithm with different parallel training algorithms, namely the backpropagation algorithm and several hybrid algorithms. The hybrid algorithms combine backpropagation's steepest-descent method with different sets of genetic operators. The algorithms are compared with respect to solution quality and computation time. We show that using the hybrid algorithms significantly improves the topology optimization.
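To make the general idea concrete, the following is a minimal sketch, assuming topologies are encoded as tuples of hidden-layer sizes and searched with A*: the path cost grows with network size and the heuristic is the validation error after brief training. The names, bounds, and the placeholder brief_training_error (standing in for a few epochs of backpropagation or a backpropagation/genetic hybrid) are illustrative assumptions, not the paper's implementation.

import heapq
import itertools

MAX_HIDDEN_UNITS = 8      # assumed bound on units per hidden layer
MAX_HIDDEN_LAYERS = 2     # assumed bound on network depth
UNIT_COST = 0.01          # assumed structural cost per hidden unit (prefers small nets)

def brief_training_error(topology):
    """Placeholder: briefly train a network with this topology and return its
    validation error.  Simulated here by the distance to a fixed 'best' topology."""
    target = (4, 3)
    return sum(abs(a - b) for a, b in
               itertools.zip_longest(topology, target, fillvalue=0)) * 0.1

def successors(topology):
    """Grow the topology: add one unit to an existing layer or open a new layer."""
    for i in range(len(topology)):
        if topology[i] < MAX_HIDDEN_UNITS:
            yield topology[:i] + (topology[i] + 1,) + topology[i + 1:]
    if len(topology) < MAX_HIDDEN_LAYERS:
        yield topology + (1,)

def a_star_topology_search(error_goal=0.05):
    start = (1,)                                   # single hidden unit
    g = {start: UNIT_COST}                         # accumulated structural cost
    frontier = [(g[start] + brief_training_error(start), start)]
    visited = set()
    while frontier:
        _, topo = heapq.heappop(frontier)          # expand cheapest f = g + h
        if topo in visited:
            continue
        visited.add(topo)
        err = brief_training_error(topo)
        if err <= error_goal:
            return topo, err                       # topology reaches the error goal
        for nxt in successors(topo):
            cost = g[topo] + UNIT_COST
            if nxt not in g or cost < g[nxt]:
                g[nxt] = cost
                heapq.heappush(frontier, (cost + brief_training_error(nxt), nxt))
    return None, None

if __name__ == "__main__":
    topo, err = a_star_topology_search()
    print("selected topology:", topo, "estimated error:", err)

In an actual experiment, brief_training_error would be replaced by a parallel training run (backpropagation or a hybrid with genetic operators), which is where the algorithms compared in this paper differ.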