Tangible Acoustic Interfaces (TAIs) are innovative acoustic Human-Machine Interaction devices. A number of contact sensors distributed on a surface acquire the vibrational signal generated by the interaction between the surface and an object moved by the user; this signal is then analyzed to recognize what the user is doing on the device. The use of vibrational sensors also naturally opens the way to classification and recognition applications. In this paper, a system for audio-based interaction object recognition is presented. The aim of the system is to recognize which object the human is using to interact with the TAI, by exploiting feature analysis and classification techniques. In particular, a frame-by-frame SVM-based classifier architecture is used to perform object recognition. The result is then filtered to eliminate possible classification outliers. By training and testing our system on signals from four interaction objects at different Signal-to-Noise Ratios, we have reached accuracies between 73% and 100%, depending on the object used, the quality of the acquired signal, and the optional use of the classification filtering algorithm.
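The abstract describes filtering the per-frame classifier output to remove isolated misclassifications. The paper's exact filtering algorithm is not given here; as a minimal illustrative sketch, one common choice is a sliding-window majority vote over the frame-level labels (the `majority_filter` function, window size, and object labels below are assumptions, not the authors' implementation):

```python
from collections import Counter

def majority_filter(labels, window=5):
    """Smooth a sequence of per-frame class labels with a sliding
    majority vote; isolated outlier labels are replaced by the
    dominant label of their neighborhood. Illustrative only."""
    half = window // 2
    out = []
    for i in range(len(labels)):
        lo, hi = max(0, i - half), min(len(labels), i + half + 1)
        # most_common(1) returns the label occurring most often in the window
        out.append(Counter(labels[lo:hi]).most_common(1)[0][0])
    return out

# Hypothetical per-frame SVM outputs with one spurious "finger" frame:
frames = ["pen", "pen", "finger", "pen", "pen", "pen",
          "chalk", "chalk", "chalk"]
print(majority_filter(frames))
# → ['pen', 'pen', 'pen', 'pen', 'pen', 'pen', 'chalk', 'chalk', 'chalk']
```

Such a filter preserves genuine transitions between objects while suppressing single-frame classification errors, which is consistent with the outlier-elimination step described above.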