Abstract
Sound synthesis methods based on physical modelling of acoustic instruments depend on data obtained from measurements and recordings. When an instrument is played by a human, variability becomes difficult to filter out: excitation parameters lack repeatability, and the varying physical contact between the musician and the instrument damps the vibrating elements. Musical robots can solve this problem; their repeatability and controllability allow even subtle phenomena to be studied. This paper presents an application of a robot to the study of the re-excitation of a string in an acoustic guitar. The obtained results are used to improve a simple synthesis model of a vibrating string based on the finite difference method. The improved model reproduced the observed phenomena, such as the alteration of the signal spectrum, damping, and ringing, all of which can be perceived by a listener and contribute to the final sound of the instrument. Moreover, as demonstrated by the use of two different string-plucking mechanisms, musical robots can be redesigned to study other sound-production phenomena and thus further improve the behaviour and sound of models used in sound synthesis.
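To illustrate the kind of model the abstract refers to, the following is a minimal sketch, not the authors' implementation, of a finite-difference simulation of a damped vibrating string that is re-excited (plucked a second time) while still ringing. All parameter names and values (string length, damping coefficient, pluck positions, re-excitation time) are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of a finite-difference string model
# with frequency-independent damping and a mid-simulation re-excitation.
import numpy as np

fs = 44100            # audio sample rate [Hz]
f0 = 196.0            # open-string fundamental [Hz] (illustrative)
L = 0.65              # string length [m] (illustrative)
c = 2 * L * f0        # wave speed of an ideal string [m/s]
sigma = 1.5           # damping coefficient [1/s] (illustrative)
dt = 1.0 / fs
dx = c * dt           # grid spacing at the stability (CFL) limit
N = int(L / dx)       # number of grid intervals
lam2 = (c * dt / dx) ** 2

def pluck(n, position=0.28, amplitude=0.005):
    """Triangular initial displacement approximating a pluck."""
    x = np.linspace(0.0, 1.0, n + 1)
    shape = np.where(x < position, x / position, (1 - x) / (1 - position))
    return amplitude * shape

# Two previous time steps of the displacement; fixed ends u[0] = u[N] = 0.
u_prev = pluck(N)
u = u_prev.copy()

duration = 2.0                    # seconds of output
repluck_at = int(0.8 * fs)        # re-excite the string after 0.8 s
pickup = int(0.8 * N)             # read displacement near the bridge
out = np.zeros(int(duration * fs))

for t in range(out.size):
    u_next = np.zeros_like(u)
    # Explicit update of u_tt = c^2 u_xx - 2*sigma*u_t at interior points.
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + lam2 * (u[2:] - 2 * u[1:-1] + u[:-2])
                    - 2 * sigma * dt * (u[1:-1] - u_prev[1:-1]))
    u_prev, u = u, u_next
    if t == repluck_at:
        # Re-excitation: superimpose a second pluck on the still-vibrating
        # string, which alters the spectrum of the resulting sound.
        u = u + pluck(N, position=0.22, amplitude=0.003)
    out[t] = u[pickup]

out /= np.max(np.abs(out))        # normalise for listening or plotting
```

Running the simulation at the CFL limit keeps the explicit scheme stable; the `out` signal can then be written to a WAV file or analysed spectrally to compare the single-pluck and re-excited cases.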