Auditory and tactile modalities for a non-visual representation: the “Blind Sailing” application.

Mathieu Simonnet1, Jean-Yves Guinard2, Jacques Tisseau3

(1) (2) (3) LISyC EA3883 UBO-ENIB
European Center for virtual reality
BP 38 , F-29280 Plouzané, France

Tel : 02 98 05 89 89 ; fax : 02 98 05 89 79

E-mail :{mathieu.simonnet, jean-yves.guinard} ;


This research concerns the elaboration of a spatial strategy for blind sailors. With auditory information, they can locate the sound buoys along the track, and vocal watches allow time measurement during the race. Experiments isolating these different tools led to the conclusion that tactile maps allow storing an accurate pictured representation of the race track. However, these maps do not allow any updating of the boat's position during action. Even with limited precision, auditory feedback makes it possible for blind people to turn around the buoys along the track. Thanks to a space-time relation, they can transform a time value into a distance value on the map. This strategy implies a heavy cognitive load. That is why we would like to use virtual reality techniques to update haptic information about the position of the boat on a virtual map during sailing.

Key words:

blind sailors, spatial representation, haptic modality, auditory localisation, cognitive map.

1.                  Preamble

The spatial reality an organism can access fundamentally depends on its sensory equipment [1]. Blind people perceive space through the vestibular, kinaesthetic, haptic[1], tactile and auditory modalities. Even if these pieces of information are useful for building spatial representations, no other modality compares with vision for the quality and quantity of data concerning the spatial properties of the environment [4]. Blind people nevertheless elaborate spatial representations of their environment. What is their nature? Are they precise? Do non-visual maps exist?

Situated cognition theory is especially useful in the case of non-visual representation. Many researchers consider that cognition cannot be understood without taking into account the organism inserted in a particular situation with a specific configuration, that is to say ecologically situated [5]. Only In Situ experiments can be helpful in our research. In addition, understanding the process used by blind people to locate themselves in space cannot be separated from motion. Moreover, important studies at the Université de Technologie de Compiègne in France attribute an essential role to action in the emergence of representation [6]. These works are interesting for explaining how action and perception interact.

Finally, we performed experiments with blind sailors at the Brest sailing club in France. These sailors use the wind and gliding sensations to steer their sailboats. Indeed, the maritime environment allows easier movement than the urban environment because of its wide tracks. What is more, they practise match-racing regattas on a track marked with sound buoys. We are trying to set up a spatial localisation strategy using specific tools: sound buoys, tactile maps and vocal watches.

After defining the different components of space, we will study how the haptic and auditory modalities can be used to build cognitive maps during navigation.

In the end, we will examine different possibilities offered by virtual reality in order to extend our research to maritime space.

2.                  Introduction

2.1.                      Spatial representation

2.1.1.                      Individual space

Through the body's geometry and its motor possibilities, we can distinguish two boundaries which separate three different spaces: the body space, limited by cutaneous tissues; the close-by space, defined by all the points an individual can touch without moving; and the distant space, comprising all the points which cannot be reached without moving [7].

The body space is mostly identified through proprioceptive modalities. The close-by space can be reached through sight and touch. The distant space can be apprehended through visual and auditory information.

Which non-visual pieces of information are relevant to understanding the distant space? The further the space, the more the lack of vision matters in building spatial representation.

The capacity of subjects to represent a mathematical, or Euclidean, space is the final result of the cognitive development of human beings [8]. Until eighteen months old, children only have a topological active space: they can only perceive relationships between objects, without being able to perceive distances. This space is structured through an invariant, the permanent object. Cognitive evolution then allows the construction of a representative space. This space is projective, so subjects can have a mental representation of objects which are outside their tactile and auditory range. These mental operations, which are interiorised actions, eventually result in the construction of new invariants such as distance, volume… [9].

Finally, when the subject is able to use a metric system to measure those invariants, the construction of Euclidean space is accomplished.

How can these representations of distant space be defined?

2.1.2.               Pictured representations

Thorndike and Hayes-Roth [10], Byrne [11], Pailhous [12] and Richard [13] draw a distinction between verbal codes and pictured codes. Graphic coding includes spatial characteristics whereas verbal coding does not. In other words, visual representations make finding one's way in space easier thanks to the use of mathematical data for orientation and distance.

Which pieces of information coming from the haptic and auditory modalities allow building a distant-space representation with pictured but non-visual properties? How can action and perception be involved in Euclidean spatial operations?

2.2.                    Perception–action

Perception consists in knowing and making good use of sensory-motor contingency laws [14]. These laws are rules linking motor actions and sensory transformations [15].

This approach insists on the necessity for the subject to act in order to constitute perception. The quality of perception is not determined by the kinds of receptors involved, but by the structure of the sensory transformations produced during action. The important point is the structure of the sensory-motor law involved.

The perceived object is defined not by an invariant in sensation but by an invariant in sensory-motor loops. This means that when a subject is not acting, he does not perceive anything.

If spatial location comes from sensory-motor laws, then spatial percepts emerge from relations contained in the subject's exploration. For blind people, map exploration requires kinaesthetic movements in order to obtain tactile feedback. The rules of this sensory-motor loop are linked to the haptic modality. Here, action and perception are strongly linked.

Eventually, we try to show how sensory-motor rules make it possible to perceive distant space without vision, through the haptic and auditory modalities.

2.3.                 Auditory modality for spatial location in action

Spatial localisation consists in evaluating the direction and distance of the sound source. We work in the horizontal plane, where the direction is called the azimuth. The localisation process is as follows: the auditory wave reaches the nearest ear first, then the furthest one. This time difference changes as a function of the azimuth [16]. With reference to the sensory-motor concept presented above, successive changes of this time difference allow perceiving spatial location during action.
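As an illustration of this interaural time cue, a simple sine model can be sketched in a few lines (the head diameter and the far-field path model are our own simplifying assumptions, not values from the study):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air, approximate
EAR_DISTANCE = 0.18     # m between the ears, an assumed typical value

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate ITD in seconds for a distant source at a given azimuth.

    For a far source, the extra path to the furthest ear is roughly
    d * sin(azimuth), so ITD = d * sin(azimuth) / c: zero straight
    ahead, maximal (about 0.5 ms here) for a source at ninety degrees.
    """
    return EAR_DISTANCE * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND
```

During action, it is the succession of such time differences, not any single value, that carries the spatial information.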

Blind sailors define the buoys' azimuths with a clock reference in order to get their bearings on the auditory track. If the sound is at about twelve o'clock, the sailboat is heading for the sound source. If the sound is at three o'clock, it comes from ninety degrees off the axis of the boat. Thanks to the sound buoys, the auditory organs perceive the variations of the sound sources' azimuths during motion.
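The clock convention can be written as a small conversion rule (a hypothetical helper, assuming the standard reading in which twelve o'clock is dead ahead and each hour mark spans thirty degrees):

```python
def clock_to_azimuth(hour: float) -> float:
    """Convert a clock-face bearing into degrees from the boat's axis.

    Twelve o'clock is dead ahead (0 degrees); each hour mark adds
    thirty degrees clockwise, so three o'clock is ninety degrees to
    starboard and half past one is forty-five degrees.
    """
    return (hour % 12) * 30.0
```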

A study by Morrongiello et al. [17] shows that the differences between blind subjects' final positions and the target position are smaller with a sound cue. These results show that the auditory modality affords feedback on sound events. The sounds of the buoys are essential for blind sailors to go all over the regatta track. Although this activity suggests that the information coming from the sound buoys is sufficient, we seek to associate other information with it in order to increase precision.

Spatial localisation demands information about distance. A priori, humans do not precisely evaluate the distance of a sound source, except if the source is very familiar. In free field, intensity decreases with the square of the distance to the source. Even if this cue helps the organism, physical variations of the environment, like wind, limit the precision of the evaluation of the distance and orientation of the real sound source. However, intensity variations at least inform about the progress of the action: although intensity does not give a precise distance to the subject, a succession of intensity variations indicates that the sound source is coming closer. If we consider this sensory-motor rule, sound buoys are useful without being precise.
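The inverse-square rule invoked here can be written down directly (a free-field sketch that ignores the wind and the other environmental variations discussed above):

```python
def relative_intensity(distance: float, reference_distance: float = 1.0) -> float:
    """Free-field sound intensity relative to the level at reference_distance.

    Inverse-square law: I(d) = I_ref * (d_ref / d)**2, so halving the
    distance to the buoy quadruples the received intensity.
    """
    return (reference_distance / distance) ** 2
```

What matters for the sailor is not the absolute value, which wind easily corrupts, but the monotonic increase of intensity as the buoy gets closer.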

We suggest the following hypothesis: auditory spatial representation is efficient during action without being precise. This representation affords a regulation of action during motion, with a precision inversely proportional to the distance of the sound source.

Could haptic information be efficient where auditory information is not, that is, bring precision to non-visual spatial representation? Can tactilo-kinaesthetic perception constitute a support for this spatial representation?

2.4.              Pictured haptic perception

The most important difficulty of haptic spatial processing, in comparison with vision, is its sequential nature [18]. Indeed, vision catches spatial cues simultaneously and permits a relative positioning of the different visual components of the environment, whereas the haptic modality implies a successive perception of information. This analytic nature presents a major difficulty for tactile mapping. Nevertheless, small objects can be appreciated as a whole. Ballesteros et al. [19] showed a helping effect of using both forefingers at the same time in order to build a global space representation. In addition, symmetric marks can appear during this bimanual exploration. Actually, the logic of the regatta course requires knowing the symmetric position with reference to the wind axis (cf. pict. 1). Indeed a tactile map, rather than an abstract tactile referential, can give an interesting pictured spatial representation. In this case, we have to be careful about the limits of the haptic sense.

In order to better understand haptic perception of space, we suggest studying the limits of the haptic modality in building spatial representation for blind people.

In the middle of the twentieth century, gestaltists extended their studies of visual illusions to haptic illusions. These works were continued by Hatwell [4] [18] and Gentaz [20], who explained some haptic illusions. The “oblique effect”, the “detour effect” and the consequences of the speed of exploration are the most important illusions in the haptic perception of distances and orientations. Haptic perception of orientations is subject to the “oblique effect”: this illusion consists in a better evaluation of vertical and horizontal lines than of oblique lines [21]. A recent study [20] shows that this effect comes from the use of a subjective gravity reference. This means the illusion results from multiple references: an external one (gravity) and an internal one (the vertical line with reference to the subject's body).

Experiments on the “detour effect” [22] prove an overestimation of Euclidean distances which increases with the number of detours effected by the hand.

Even if the conditions of apparition of the “detour effect” are still discussed by different researchers [22] [23] [24] [3] [20], it appears very difficult for blind people, and especially congenitally blind people, to precisely estimate sinuous distances by touch.

The slower a manual exploration is, the greater the overestimation [25]. When we use tactile maps, this temporal factor has to be considered in order to build a precise spatial representation: a fast haptic exploration is thus more efficient than a slow one.

These illusions prove the limits of the spatial precision of the haptic modality.

Nevertheless, some experiments with the « Optacon/TVSS » (Tactile Vision Substitution System) [26], which transforms luminosity into tactile vibrations, show that the limits of the haptic modality are not the only point: blind people also have to look for the relevant information.

The presence of these haptic illusions shows the importance of the exploration strategy for precisely estimating distances and orientations. Heller [27] observed that an adapted exploration allows blind people to obtain a precise haptic perception. Even an important illusion can disappear when stimuli are small enough to be touched as a whole with one hand. In that case, haptic spatial representation preserves distances and orientations.

[Figure: tactile map of the sound track. Tactile lines show the possible motions of the sailboat, with a departure line, a sound buoy far from the wind source, a sound buoy near the wind source, and an arrow indicating the wind direction.]

Picture 1: Tactile map of the sound track

Haptic spatial processing can complement imprecise auditory information in order to build an overall Euclidean representation. Some studies support the possibility of scanning mental pictured representations in the same manner as a physical space. This possibility shows the principle and the interest of using a virtual map: the maritime space of a sound track can be discovered by touch, and the motions can be virtually operated. Blind subjects affirm that they use a spatial haptic representation thanks to the map.

2.5.              Cartography study

What can we learn from other studies about tactile maps used by blind people?

From a cognitive point of view, the first difficulty, mostly for congenitally blind people who have not built a projective space, is to understand that the plane of a paper map represents their three-dimensional or two-dimensional space [24].

A case study of a congenitally blind child between the ages of two and five concludes that the capacity to read a tactile map is precocious (as early as four years old) and “does not require any specific learning” [28]. Nevertheless, this conclusion was not accepted by Millar [29], who showed that neither sighted nor blind persons have an innate ability to read a map. A child has to understand that the movements of his hands exploring the map inform him about the real moves he has to make in his environment [29]. Espinosa and Ochaita [30] observed better learning of a new track in a town when subjects used a map rather than a real exploration. A map is a virtual world presenting analogies with the physical world, but these analogies have to be correctly interpreted. As a matter of fact, the change of scale implied by the passage from tactile space to physical motion space requires cognitive skills which can be different from those used in narrow space [24]. In other words, merely reducing the scale is not in itself sufficient to make the mental representation effective. Through which process can these two Euclidean reference spaces be linked thanks to the mental use of cognitive maps?

2.6.              Cognitive map

Rieser et al. [31] show a learnt correlation between locomotion and progressive changes of distances and orientations. The relations between external objects and the subjects themselves allow them to navigate, i.e. to find their way in physical space. Haptic and auditory modalities equally permit building a route, a spatio-temporal sequence of rectilinear segments and turns leading from one point to another. Nevertheless, a route only consists in the repetition of a locomotor chain learnt by heart. It does not allow creating new ways such as shortcuts or supplementary detours. This process is automatic and leaves no place for comprehension and initiative.

On the opposite, the constitution of a “cognitive map” is a kind of Euclidean aerial representation, which supports spatial cognitive operations and makes it possible to imagine shortcuts and new routes [4].

In order to increase the precision of the blind sailors' motions on a sound track, we suggest using tools that allow building a cognitive map of the track as a function of the wind direction.

2.7.        Auditory and haptic complementarity

We already use a non-visual aerial representation of the regatta track, in the form of a tactile map (cf. pict. 1).

A precise sailboat position is required to allow the subjects to perform spatial operations in real time. For now we only use sound buoys, which are not precise enough. That is why we suggest using other tools to build a strategy based on a Euclidean system.

After some first experiments, we can say that the sailboat's speed is constant when the wind conditions are stable. Thus, in order to answer our questions, we can use the relation distance = speed × time.

The strategy we are testing is very simple. Subjects go all over the track using the sound buoys, but they also use a vocal watch in order to time their run (cf. pict. 2); this run gives a reference. When they divide the time, they also obtain a division of the distance. If they report this division of the distance on the tactile map of the track, they can find their position precisely during the next round.
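This division strategy reduces to a simple proportion. A minimal sketch, assuming the constant-speed hypothesis stated above (the function name and the leg length are illustrative, not part of the protocol):

```python
def position_on_map(elapsed_s: float, reference_time_s: float,
                    leg_length_on_map: float) -> float:
    """Distance along the map segment after elapsed_s seconds of sailing.

    With constant speed, distance = speed * time, so the fraction of the
    reference time already elapsed equals the fraction of the leg already
    sailed, and can be reported directly on the tactile map.
    """
    return (elapsed_s / reference_time_s) * leg_length_on_map

# A leg timed at 120 s on the reference round and drawn 12 cm long:
# after 30 s the boat is a quarter of the way, i.e. 3 cm along the segment.
```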

Within situated cognition, we expect the subjects to make an overall sensation emerge directly from the context. Indeed, the time measurement can be adjusted with sensory-motor rules coming from the different modalities. For example, thanks to the vestibular system, the subjects are able to interpret the heeling of the boat as an increase of the sailboat's speed; in this case, they can decrease the expected time. Eventually, this strategy supports the realization of spatial, Euclidean operations.

The following experiments try to verify this hypothesis.

Picture 2: Tactile map and vocal watch used together

3.                  Experimentation

The experimentation takes place in the maritime environment. We compare situations with wind (more than ten knots) and without. We work either on foot on the beach, or at sea on a seven-metre sailboat in the bay of Brest. We attempt to show that the wind is a disturbing element when subjects try to locate sounds; however, we think that the wind can also help blind people to keep an orientation. The sailboat situations compare the following tools: sound buoys, tactile maps and vocal watches.

3.1.              Subjects

Six blind sailors and blindfolded sighted sailors agreed to participate in the experiments in order to check the efficiency of the strategy. We try to increase the precision of the non-visual representation in the course of action [32].

3.2.              The pre experiments

The first four experiments are not explained in detail; however, the experiment on the strategy will be described more precisely.

The first experiment shows that the wind is an orientation aid. The subjects take the helm of the sailboat in order to follow a rectilinear trajectory. They steer it in the three following situations: “engine without wind”, then “engine with wind” and finally “with wind in the sails”. The trajectories recorded with G.P.S. (Global Positioning System) allow us to validate the first hypothesis: “In a maritime locomotion task, the presence of the wind gives an orientation mark to people without vision”.

The second experiment shows that the wind also decreases the precision of sound localisation in distance and orientation. Subjects stand on the beach, first with wind and then without, in the middle of three circles of one, two and three metres radius. Forty sounds are buzzed at random on the three circles. The subjects' task consists in indicating the distance and the azimuth of each sound (clock analogy: twelve o'clock faces the wind) [33]. In conclusion, the results validate the second hypothesis: “Without vision, the number of correct auditory position evaluations decreases in the presence of wind”.

In the third experiment, we compare the efficiency of a tactile map and of a sound buoy in a terrestrial locomotion situation. The subjects' task consists in reaching a point somewhere on a forty-metre-radius circle. The point is signalled on a tactile map aligned with the physical world, or with a sound buoy, or both. Eventually, the results validate our third hypothesis: “In a locomotion task with wind and without vision, the tactile information of a map aligned with the physical world is efficient for constructing a pictured spatial representation that improves the direction of the initial trajectory; conversely, auditory signals allow blind people to update their representation and realize a better final trajectory.” Furthermore, auditory signals are a sine qua non condition for reaching the arrival point. Taking these results into account, we see that a rule links the reception of a sound and a sensory-motor pattern.

The fourth experiment is a sailing situation. We try to show the efficiency of the time mark at constant speed. After a reference round of the track, subjects use a vocal watch, or not, to determine the moment of the turn in order to reach the sound point situated in the wind axis. Trajectories are collected with G.P.S., and we measure the difference between the ideal trajectory and the realized one. In conclusion, the results show that “in a maritime locomotion task, a constant-speed time unit corresponds to a division of the reference time and distance for blind sailors”.

3.3.                Strategy experimentation

The positive results of the first four experiments allow us to test a situation that synthesizes all the tools and results into one strategy.

The final experiment of this study asks the blind sailors to realize a defined trajectory on the sound track after a reference round. The subjects then draw their trajectory on tactile paper. Each subject participates in this test in the four following situations: first they only hear the sound buoys (condition 1, C1); secondly they also use a vocal watch (C2); then they use the sound buoys and a tactile map (C3); eventually, they use all the previous tools (C4). Each condition requires two trajectories: one with two turns and another with three turns.

3.4.                Experimentation strategy results (cf. tables 1 & 2)

Difference between the requested trajectories and the realized ones:

|                                         | C1: sound buoys | C2: sound buoys and vocal watch | C3: sound buoys and tactile map | C4: sound buoys, vocal watch and tactile map |
|-----------------------------------------|-----------------|---------------------------------|---------------------------------|----------------------------------------------|
| Trajectory 1: distance in tacks 1; 2    | 4; 10           | 2; 1                            | 2; 5                            | 3; 2                                         |
| Trajectory 2: distance in tacks 1; 2; 3 | 3; 3; 1         | 1; 1; 2                         | 5; 3; 3                         | 2; 0; 2                                      |
| Standard deviations                     |                 |                                 |                                 |                                              |
| Average difference                      |                 |                                 |                                 |                                              |

Table 1: Distances (without unit) between realized tacks (GPS) and requested tacks in C1, C2, C3 and C4.

Trajectories are recorded with G.P.S. for the motion, and on paper for the subjects' drawings. We first compare the difference between the requested trajectories and the realized trajectories (cf. table 1), then between the realized trajectories and the drawn trajectories (cf. table 2).

We see better realized trajectories with the vocal watch (cf. table 1, average difference D = 1.4). Conversely, the precision of the drawn trajectories (cf. table 2) is better with the tactile map (D = 1.6) than with the vocal watch (D = 2.2). When both tools are used together, the results are better (D = 1.8 and 1.5). We conclude that they are complementary.
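The D values quoted here are simply the means of the per-tack differences listed in the tables, as can be checked (hypothetical helper name):

```python
def average_difference(tack_errors: list[float]) -> float:
    """Mean of the per-tack distance errors of one condition (tables 1 & 2)."""
    return sum(tack_errors) / len(tack_errors)

# Table 1, condition C2 (sound buoys and vocal watch): tacks 2; 1 of
# trajectory 1 and 1; 1; 2 of trajectory 2 give the quoted D = 1.4.
assert average_difference([2, 1, 1, 1, 2]) == 1.4
```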

Difference between the drawn trajectories and the realized ones:

|                                         | C1: sound buoys | C2: sound buoys and vocal watch | C3: sound buoys and tactile map | C4: sound buoys, vocal watch and tactile map |
|-----------------------------------------|-----------------|---------------------------------|---------------------------------|----------------------------------------------|
| Trajectory 1: distance in tacks 1; 2    | 5; 7            | 3; 2                            | 2; 1                            | 3; 1                                         |
| Trajectory 2: distance in tacks 1; 2; 3 | 1; 2; 3         | 1; 2; 3                         | 2; 1; 2                         | 1; 0; 2                                      |
| Standard deviations                     |                 |                                 |                                 |                                              |
| Average difference                      |                 |                                 |                                 |                                              |

Table 2: Distances (without unit) between realized tacks (GPS) and drawn tacks in C1, C2, C3 and C4.

3.5.              Interpretation

These results demonstrate the capacity of the blind sailors to store in long-term memory the pictured representation resulting from the tactile map. Furthermore, they use the information of the vocal watch after having referenced the course, i.e. measured total times in order to divide them into units. However, the sound buoys remain essential for the use of the “tactile map” and “vocal watch” tools because they provide feedback during action. In our opinion, it is interesting to hear the testimonies of the subjects concerned: while using the information previously quoted, the subjects explain that they modify the appointed times according to the speed sensations they collect. This demonstrates that many sensory-motor loops emerge from the overall situation.

Here, the subjects mix high-level cognitive operations, like the conversion of a time into a division of distance, with the emergence of speed sensations resulting from the vestibular, kinaesthetic, auditory, tactile and haptic systems [32]. The tactile map provides a support for a precise representation a priori or a posteriori. However, this tool is hardly usable during action because of the important time that positioning operations require on an abstract tactile map. Conversely, time gives information during the action but is not easily reusable from one regatta to another.

The results show that the use of the tactile map confers on the blind subjects a precise but too slow abstract representation in the course of action, whereas the auditory method is not very precise but is effective during motion.

Overall, we can say that the haptic and auditory modalities can help blind people build a pictured but non-visual representation.

4.                  Discussion

The previous results prove the capacity of the blind sailors to build cognitive maps of a sonified and windy maritime space. The cognitive activity uses the tactile map as an abstract support stable in time. This representation favours the subjects' interpretation of the constantly updated Euclidean values announced by the watch. Only the auditory feedback of the buoys presents a bond between the physical track and its mental representation. The sounds of the buoys and the gliding sensations mentioned above offer the subjects a capacity to act in a tangible way [34], i.e. a capacity to move in a controlled way on the track. Here, tangible means that the subject concretely feels his physical motion thanks to the consecutive variations of the auditory signals and of the gliding sensations. The sound buoys also match a recent theory of spatial memory [35] according to which an intrinsic reference frame of a collection is used to learn the positions of a set of objects in a new environment: here the sound buoys constitute intrinsic reference marks of the “course” collection. Thus, the subjects build suitable perception-action couples in order to anticipate the sensations of motion. This anticipation allows the updating of this pictured and finally global representation.

In spite of real progress compared to the reference situation, we remain cautious about the effectiveness of the system of sound buoys, tactile map and vocal watch for the spatial representation of the blind sailors in the course of navigation. Indeed, the weight of the cognitive operations requested from the subjects, and the relative precision which results, lead us to question ourselves about setting up a more effective strategy. Theoretical points of Paillard, Honoré and Hatwell [1] [4] [7] converge towards some limits of the auditory and haptic senses for the construction of a Euclidean distant space. Our results, although in agreement with the works of these authors, let us imagine a spatial localisation strategy founded on the capacity to interact with relevant auditory and haptic information. In order to improve the construction of pictured but non-visual representations, we have to increase the redundant information coming from different modalities. If we combined the redundant and relevant spatial information, we could allow blind people to control their motion in a better way. Thus, haptic and auditory pieces of information must be updated in the course of navigation in order to give further information in real time. Could we use the tools of virtual reality to immerse blind people in a moving haptic world?

5.                 Future work in virtual reality

Virtual reality is interesting for at least one reason: “It can represent data in numerous ways. It is a media that allows transmission of messages of different types, and in multiple formats; it can even lead to the creation of new models of perceptive mediation” (p. 7) [36]. In this way, we are trying to set up a new auditory and haptic mediation of the GPS position on geographic maps. We also have to merge data from the map and the GPS by implementing new haptic software.
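As a minimal sketch of such a merge (function and parameter names are ours, and the real system would read vectorial maritime charts), a GPS fix can be projected onto local map coordinates with an equirectangular approximation, accurate enough over a regatta-sized area:

```python
import math

EARTH_RADIUS = 6_371_000.0  # metres, mean value

def gps_to_map(lat: float, lon: float,
               origin_lat: float, origin_lon: float,
               map_scale: float = 1.0) -> tuple[float, float]:
    """Project a GPS fix onto local planar map coordinates.

    Equirectangular approximation around the map origin: x grows
    eastward and y northward, in metres, then multiplied by the
    scale of the tactile or virtual map.
    """
    x = math.radians(lon - origin_lon) * EARTH_RADIUS * math.cos(math.radians(origin_lat))
    y = math.radians(lat - origin_lat) * EARTH_RADIUS
    return x * map_scale, y * map_scale
```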

The techniques of virtual reality are founded on interaction with a virtual world in real time, using behavioural interfaces allowing the “pseudo-natural” immersion of the user in this environment [37]. Behavioural interfaces offer the user means of interacting physically, or in a tangible manner, with the virtual world. Haptic interfaces can immerse the blind subjects “in” the virtual map. We speak of immersion if the subjects use these new tools as a prolongation of their body [38].

In order to do that, which hardware system should we choose?

A recent study about the tactile flow mechanism in virtual reality shows that 4 Hz vibrations present a higher spatial resolution when subjects use a dynamic exploration rather than a passive one [39]. Taking into account the hurdles to overcome, we seek a haptic interface with multiple degrees of freedom. Our point is that it is important to leave blind people the freedom to explore in the way they like, in order to let them link voluntary action and perception.

During previous experiments, “because of the presence of force feedback, the magnitude of unwanted incursions into a virtual wall were reduced by up to eighty percent, as compared to the case with no force feed-back” (p. 21) [40]. We therefore have to choose a haptic interface with force feedback. The most important point is to make the virtual maps tangible. Different kinds of devices are suitable: the PHANToM from SensAble Technologies, the 3-DOF OMEGA from Force Dimension, or force-feedback gloves: the DATA GLOVE 14 ULTRA (with vibromotors in addition) from 5DT and the CYBERGRASP from Immersion Corporation.

With all of this in mind, we imagine software in which blind sailors virtually discover the bay of Brest. They could let their finger(s) slip into the current before feeling the rough texture of the Quelern rocks.

For us, the virtual world evokes a numerical substitute of the paper tactile map, still corresponding to a symbolic (or abstract), two-dimensional (or three-dimensional), projective representation, of reduced size, of a real space [24]. An interesting point is that virtual worlds can be animated in real time; this dynamism helps to link perception and action.

The particular difficulty of this ambition lies in the interfacing with users. The problem of perceiving distant space in real time without vision implies making tangible what is not. In the field of industrial design research [41], this approach requires seeking the link between the symbolic system and the tangible one by:

- identifying the experiences that make sense for the user, i.e. defining which haptic symbols most intuitively translate a given physical object into a cartographic item;

- translating these subjective values into technical data, that is to say using vector maritime charts and adding haptic and/or auditory symbolic items;

- exploiting the contribution of virtual reality to product design and to the evaluation of the interactions produced by the user in situations of use.

The technology needed to build a real-time virtual map system for blind people already exists. However, experiments are necessary to define the rules for producing a system truly adapted to blind users.

For example, given the difficulty of scale variations without vision [24], the new "bubble technique" [42] would be suitable. This technique makes it possible to interact with a large virtual environment using a haptic device with a limited workspace. With such software, blind people might also "feel" the inner surface of the bubble, since the spherical workspace is "haptically" displayed by applying an elastic force feedback when crossing its surface.
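The elastic boundary of the bubble can be sketched as a spherical spring: inside the workspace the tip moves freely, and once it crosses the surface a force proportional to the overshoot pulls it back toward the centre. The radius and stiffness values below are illustrative, not taken from the cited paper:

```python
import math

def bubble_force(tip, center=(0.0, 0.0, 0.0), radius=0.1, stiffness=200.0):
    """Elastic force at the boundary of the 'bubble' workspace.

    Inside the sphere the tip moves freely (zero force); beyond the
    surface, a spring force proportional to the overshoot pulls the
    tip back toward the centre. Returns a 3-D force vector.
    """
    offset = [t - c for t, c in zip(tip, center)]
    dist = math.sqrt(sum(o * o for o in offset))
    if dist <= radius:
        return (0.0, 0.0, 0.0)
    overshoot = dist - radius
    # Unit vector pointing from the tip back toward the centre.
    direction = [-o / dist for o in offset]
    return tuple(stiffness * overshoot * d for d in direction)

print(bubble_force((0.05, 0.0, 0.0)))  # inside the bubble: zero force
print(bubble_force((0.15, 0.0, 0.0)))  # outside: force points back inward
```

In the full technique [42], crossing the surface also drags the bubble itself, which is how a small physical workspace covers a large virtual map.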

This stage corresponds to putting the system into situation (haptic interface, vector-map reading software, blind subject). This experimental logic respects spatial enaction and the concept of situated cognition [5]. The image according to which we progressively build our external space through our motions echoes the various authors quoted above. We therefore wish to compare how cognitive maps of the Iroise sea develop during controlled displacements carried out In Situ and In Virtuo.

Concretely, this means that we will first experiment with situations using paper maps and vocal information only. These results will then be compared with the trajectories collected (by GPS) when using the haptic hardware and the virtual map system.
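One simple way to compare trajectories collected In Situ and In Virtuo would be the mean point-to-point deviation between two GPS tracks sampled at the same instants. This metric is a hypothetical illustration, not the analysis method adopted in the paper:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in metres

def deviation_m(p, q):
    """Approximate distance in metres between two (lat, lon) points
    given in degrees, using a local equirectangular approximation
    (accurate enough at the scale of a regatta track)."""
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return EARTH_RADIUS_M * math.hypot(x, y)

def mean_track_deviation(track_a, track_b):
    """Mean deviation between two tracks sampled at the same instants."""
    return sum(deviation_m(p, q) for p, q in zip(track_a, track_b)) / len(track_a)
```

Applied to the GPS track sailed from a paper map and the track sailed with the virtual map system, such a measure would give one quantitative basis for the comparison.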

6.                 Conclusion

We have developed a low-technology spatial location strategy for blind sailors. For this public, information from the haptic (tactile-kinaesthetic) and auditory modalities is well suited to building pictorial spatial representations. We used the haptic modality through tactile maps representing the track. The auditory modality is exploited through sound buoys which mark out the track and a vocal watch which measures regatta times. The experiments showed us that the tactile map makes it possible to store a precise tactile image of the track in long-term memory, and that the sound buoys facilitate localisation in the course of action when the subjects pass nearby. The difficulty of coordinating these two systems leads us to call upon virtual reality techniques to facilitate these interactions.

However, several questions arise about the effectiveness of this future virtual reality system for maritime digital tactile maps. How can a map be explored globally with the haptic modality, which transmits analytical sensations? Could we imagine reducing the virtual map to the size of the hand? What about the new "bubble technique"? Should we rather follow the interest that Ballesteros [19] takes in bimanual exploration? How many different textures can a user discriminate with a vibration-producing glove? Is a standard force feedback arm the right choice? To what extent is it worthwhile to create a spring (attractive field)?

Finally, we pose the question of how blind people can apprehend a geographical map in real time. In future work, we hope to show that virtual reality systems are well suited to this.

7.                 References

  • Journal articles:

[3] Faineteau H., Gentaz E. and Viviani P. The kinaesthetic perception of Euclidean distance. In Experimental Brain Research, 152, 166-172, 2003.

[5] Varela J. Entretien avec Francisco J. Varela par Hervé Kempf. In La Recherche, 308, 109-112, 1998.

[7] Honoré J., Richard C. and Mars F. Perception de l’image du corps et action, In Y. Coello and J. Honoré. Percevoir s’orienter et agir dans l’espace, approche pluridisciplinaire des relations perception-action. Solal. Marseille, 2002.

[11] Byrne R. N. Memory for Urban Geography. In Journal of experimental Psychology, 31, 147-156, 1979.

[15] O’Regan J. K. and Noë A. A Sensorimotor Account of Vision and Visual Consciousness. In Behavioral and Brain Sciences, Cambridge University Press, 24, 35, 2001.

[17] Morrongiello B. A., Timney B., Humphrey C. K., Anderson S. and Skory C. Spatial knowledge in blind and sighted children. In Journal of Experimental Child Psychology, 59, 211-233, 1995.

[19] Ballesteros S., Millar S. and Reales J. Haptic discrimination of bilateral symmetry in 2-dimensional and 3-dimensional unfamiliar displays. In Perception and Psychophysics, 59, 37-50, 1998.

[21] Appelle S. Perception and discrimination as a function of stimulus orientation: The “oblique effect” in man and animals. In Psychological Bulletin, 78, 266-278, 1972.

[22] Lederman S. J., Klatzky R. L. and Barber P. Spatial and movement-based heuristics for encoding pattern information through touch. In Journal of Experimental Psychology: General, 114, 33-49, 1985.

[23] Klatzky R. and Lederman S. J. Relative availability of surface and object properties during early haptic processing. Journal of Experimental Psychology : Human perception and performance, 23, 1680-1707, 1997.

[25] Wong T. S. Dynamic properties of radial and tangential movements as determinant of the haptic horizontal-vertical illusion with an L figure. In Journal of Experimental Psychology: Human perception and performance, 3, 151-164, 1977.

[27] Landau B. Early map use as an unlearned ability. Cognition, 22, 201-223, 1986.

[30] Espinosa M. A. and Ochaita E. Using tactile maps to improve the practical spatial knowledge of adults who are blind. In Journal of Visual Impairment and Blindness, 92, 339-345, 1998.

[31] Rieser J., Ashmead C. R. T., Taylor C. R. and Youngquist T. A. Visual perception and the guidance of locomotion without vision to previously seen targets. In Perception, 19, 675-689, 1990.

  • Books and PhD theses:

[1] J. Paillard. Psychophysiologie du comportement. PUF. Paris. 1973.

[2] G. Revesz. The human hand, a psychological study. Routledge and Kegan Paul. London. 1958.

[4] Y. Hatwell. Psychologie cognitive de la cécité précoce. Dunod. Paris. 2004.

[8] J. Piaget and B. Inhelder. La représentation de l’espace chez l’enfant. PUF. Paris. 1977.

[9] J. Piaget and B. Inhelder. L’image mentale chez l’enfant. PUF. Paris. 1966.

[12] J. Pailhous. La représentation de l’espace urbain. PUF. Paris. 1970.

[13] J.-F. Richard. Les activités mentales, comprendre, raisonner, trouver des solutions. Armand Colin. Paris. 1990.

[16] S. Mac Arthur. Audition : physiologie, perception et cognition. In M. Richelle, J. Requin and M. Robert. Traité de psychologie expérimentale. PUF. Paris. 1994.

[18] Y. Hatwell. Toucher l’espace. Presse Universitaire de Lille. 1986.

[20] E. Gentaz. Explorer pour percevoir l’espace avec la main. In J. Bullier and C. Thinus-Blanc. Agir dans l’espace. CNRS. Paris. 2005.

[24] Y. Hatwell, A. Streri and E. Gentaz. Toucher pour connaître. PUF. Paris. 2000.

[26] P. Bach-y-Rita. Brain mechanisms in sensory substitution. Academic Press. New York. 1972.

[28] M. A. Heller. Les illusions perceptives haptiques. In Y. Hatwell, A. Streri & E. Gentaz. Toucher pour Connaître. PUF. Paris. 2000.

[29] S. Millar. Understanding and representing space. Theory and evidence from studies with blind and sighted children, Clarendon Press. Oxford. 1994.

[32] M. Simonnet. Voile et Cécité, les repères utilisés par un sujet non-voyant tardif pour barrer un voilier au près. Mémoire de Maîtrise S.T.A.P.S. Brest : Université de Bretagne Occidentale. 2002.

[33] D. Mason. Sailing Blind, an instruction manual of sailing techniques for blind and vision impaired people, their sighted helpers and instructors. New Zealand Council for Blind Sailors. New Zealand. 2000.

[35] T. P. McNamara. How are the locations of objects in the environment represented in memory? In Spatial Cognition III, Springer-Verlag. Mifflin. 2003.

[36] D. Mellet-d’Huart. De l’intention à l’attention. Contributions à une demarche de conception d’environnements virtuels pour apprendre à partir d’un modèle de l’(én)action. Thèse de doctorat en informatique. Université du Maine. 2004.

[37] Ph. Fuchs, B. Arnaldi and J. Tisseau. La réalité virtuelle et ses applications, Traité de la réalité virtuelle, 2ème édition, volume 1, chapitre 1, pages 3-52, Les Presses de l'Ecole des Mines de Paris. 2003.

[38] G. Vigarello. Une histoire culturelle du sport, techniques d'hier et d'aujourd'hui. Robert Laffont et E.P.S. Paris. 1988.

  • Conference papers:

[6] Lenay C. Mouvement et perception : médiation technique et constitution de la spatialisation in Le mouvement. Des boucles sensori-motrices aux représentations cognitives et langagières, actes de la Sixième école d'été de l'Association pour la Recherche Cognitive, Bonas, Communication orale, 1997.

[10] Thorndyke P. W. and Hayes-Roth B. Spatial knowledge acquisition from maps and navigation. Communication at the Congress of the Psychonomic Society. San Antonio. 1978.

[14] Declerck, G. La perception du tangible comme obstruction: approches expérimentales minimalistes. Actes du colloque « Le virtuel et le tangible : ce qui résiste ». Compiègne Tech U. 2005.

[34] Lenay C. Design of a tactile prosthesis for blind users with a virtual reality system. Actes du colloque « Le virtuel et le tangible : ce qui résiste ». Compiègne Tech U. 2005.

[39] Bicchi A., Dente D. and Scilingo P. Haptic Illusions Induced by Tactile Flow. In Conference Proceedings, EuroHaptics 2003, 314-329, 2003.

[40] Wagner C. R. and Howe R. D. Mechanisms of Performance Enhancement With Force Feedback. In World Haptics Conference, CNR, University of Pisa, 2005.

[41] Guénand A. Sémantique de l’interaction en design industriel. Actes du colloque « Le virtuel et le tangible : ce qui résiste ». Compiègne Tech U. 2005.

[42] Dominjon L., Lécuyer A., Burkhardt J.-M., Andrade-Barroso G. and Simon R. The “Bubble” Technique: Interacting with Large Virtual Environments Using Haptic Devices with Limited Workspace. In World Haptics Conference, 2005.


[1] The word "haptic" was introduced by Revesz [2] to characterize perception involving cutaneous and kinaesthetic signals. The term is used for manual exploratory perception of material objects. In this paper, we use the word haptic for the whole of the tactile-kinaesthetic sensations; that is, we do not strictly distinguish tactile from kinaesthetic perceptions, although some studies [3] show interesting differences between them.