With the use of new per-operative Computer-Aided Surgery (CAS) systems, the surgeon is increasingly confronted with the problem of saturation by excessive feedback information. The aim of this work is to send back the information collected by the CAS system through a channel that is neither visual nor auditory, both of which are already highly saturated (patient, screens, videos, conversations, ...). To this end, a system called the Tongue Display Unit (TDU) is studied. This unit was originally introduced by Paul Bach-y-Rita (Center for Neuroscience, University of Wisconsin, Madison, USA), and proposes to use the tongue's tactile proprioception as a feedback channel. A small matrix of 12 by 12 electrodes is placed in contact with the lingual surface, and a signal is sent to each electrode in order to stimulate the tongue mucosa. This unit was previously evaluated in the context of sensory substitution, and is introduced here in a Computer-Aided Surgery framework to guide the surgeon's gesture per-operatively. The idea is to project a planned 3D trajectory (for example, between a percutaneous incision and a given target) onto the 2D matrix of electrodes. This work is currently in progress.
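The projection of a planned 3D trajectory onto the 2D electrode matrix could be sketched as follows. This is a minimal illustration, not the authors' actual method: it assumes the tool tip's lateral deviation from the planned line (incision to target) is quantized onto the 12-by-12 grid, and the field width `FIELD` (the deviation, in millimetres, mapped to the grid edge) is a hypothetical parameter.

```python
import numpy as np

GRID = 12     # electrodes per side (12 x 12 matrix, as in the TDU)
FIELD = 20.0  # hypothetical half-width (mm) of deviation spanned by the grid

def electrode_for_deviation(entry, target, tip):
    """Map the tool tip's lateral deviation from the planned line
    (entry -> target) onto one cell (row, col) of the electrode matrix.
    `entry`, `target`, `tip` are 3D points in the same reference frame."""
    entry, target, tip = map(np.asarray, (entry, target, tip))
    axis = target - entry
    axis = axis / np.linalg.norm(axis)           # planned direction
    # Build an orthonormal basis (u, v) of the plane orthogonal to the axis.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, axis)) > 0.9:          # avoid a near-parallel helper
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(axis, u)
    # Component of the tip's offset orthogonal to the planned line.
    d = tip - entry
    lateral = d - np.dot(d, axis) * axis
    x, y = np.dot(lateral, u), np.dot(lateral, v)
    # Quantize the (x, y) deviation to a grid cell, clamping at the edges.
    col = int(np.clip((x / FIELD + 1.0) / 2.0 * GRID, 0, GRID - 1))
    row = int(np.clip((y / FIELD + 1.0) / 2.0 * GRID, 0, GRID - 1))
    return row, col
```

A tip lying exactly on the planned line maps to the central region of the grid, and growing lateral error drives the active electrode toward the matrix edge, giving the surgeon a directional correction cue on the tongue.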