Effect of Interpersonal Touch on Third-party Observers

Observing Touch Boosts Social Interests

Sun Yating


Abstract

The goal of this study is to investigate the effect of interpersonal touch on third-party observers. Specifically, we aim to test whether observing interactions that involve touch biases individuals towards social information in their environment. To this end, participants will be presented with prime images depicting touch and no-touch interactions. Primes will be followed by a set of target images comprising social (i.e., faces) and non-social (i.e., vehicles) elements. Participants' processing of these images will be explored using behavioural measures (e.g., recognition accuracy) and eye-gaze data obtained through eye-tracking. We expect stimulus recognition and gazing to be enhanced for face as compared with vehicle stimuli. Moreover, this difference should be more pronounced for stimuli primed with touch images as compared with no-touch images.

Keywords: Interpersonal touch; Social information; Facial bias; Eye-gazing.

Observing touch boosts social interests

Touch is crucial to our daily life, as it allows us to communicate with the external world (Barnett, 1972). Through touch we can feel the warmth of a human hand, tap the screen of a smartphone, send messages to a friend, feel the sharpness of a pencil tip, or feel the softness of a comfortable mattress. Among the different types of touch, interpersonal touch is the "social" subcategory that mainly involves our interactions with other people (Field, 2001), for example a firm handshake, an encouraging pat on the back, a short tap on the forearm, or a comforting pat on the shoulder. Research has examined how interpersonal touch influences our social perception, our social behaviour, and our social brain.

Regarding social perception, Fisher, Rytting, and Heslin (1976) conducted a behavioural study in which a library clerk, when handing back a library card, either casually touched the subject's forearm or did not. The results showed that subjects reported more positive affect and evaluated the clerk more favourably when they had been casually touched than when they had not. In other words, interpersonal touch enhances our positive social perception of others. Regarding social behaviour, Crusco and Wetzel (1984) asked waitresses either to briefly touch customers' hands or shoulders, or to behave in a control manner (no touch). They found that, regardless of the place being touched (hand or shoulder), customers tipped more when they had been touched by the waitress than customers in the control group, who had not been touched. These results suggest that interpersonal touch affects our social behaviour, a practical piece of knowledge that we could use in daily life.

In research on the social brain, both neuroimaging and EEG studies have shed light on the role of interpersonal touch. For example, Olausson et al. (2010) conducted an fMRI study and identified a system of unmyelinated low-threshold mechanoreceptors (C-tactile, or CT, afferents) that contributes to pleasant touch and provides an important sensory underpinning of social behaviour. It is important to note that their stimulation was not strictly interpersonal touch: a robot arm holding a brush stimulated participants. The brush touch was a slow, dynamic form of light touch on hairy skin, but this kind of touch has been suggested to be salient in tactile interactions between individuals (Gallace & Spence, 2010; Vallbo et al., 1999). Also, because social processing involves multiple channels (e.g., vocal, facial, and olfactory), Olausson et al. (2010) could only speak to the function of CT afferents in interpersonal touch. It is therefore possible that CT afferents do not support social processing as a whole, but contribute only to one specific aspect (i.e., the interpersonal touch aspect). Despite these issues, the study by Olausson et al. (2010) suggests that the CT afferent system, projecting to the posterior and middle insular cortex, is a brain system that serves both interpersonal touch and social processing. As for EEG evidence, Maria et al. (in progress) conducted an experiment with facial stimuli. They asked participants to attend to facial images displayed on a computer screen while a brush briefly touched the participants' forearms. According to the results, being touched by the brush enhanced the N170 ERP component, which has been shown to reflect the processing of neutral faces (Rossion et al., 2000). Again, this study used a brush rather than a real human hand to touch participants, which may limit its ecological validity. However, previous work has shown a similarity between being touched by a brush and by a human hand. This EEG study therefore suggests that our social perception of faces can be enhanced after being touched.

Given this knowledge of how interpersonal touch influences our social life, it is interesting to ask whether such influence extends to situations in which individuals are not themselves being touched, but simply observe a vicarious interpersonal touch interaction performed by two other people. It is important to address vicarious interpersonal touch, because this is a social interaction that we have to process in everyday life. For example, the mass media publish countless news items, including pictures that capture interpersonal touch interactions between politicians. When observing such pictures, readers may form impressions of both politicians in the interaction. Sometimes, being the toucher or the receiver in such a simple picture can shape the public's social-emotional attitudes towards the politicians, which may later strongly affect their voting rates. Therefore, knowledge of vicarious interpersonal touch is important for guiding our behaviour in social life.

For both real and vicarious experiences to be supported, the same neural system, the so-called "mirror system", should be activated in both cases. According to Blakemore et al. (2005), such a mirror system lies in the primary somatosensory cortex (SI). Moreover, Keysers, Kaas and Gazzola (2010) suggested that different sub-regions of SI make different contributions: BA2 contributes to the perception of others' experiences, whilst BA3 processes signals originating in our own body. Furthermore, Schaefer, Heinze and Rotte (2012) conducted an fMRI experiment in which participants either observed a painful vicarious touch (a paintbrush touching a hand), observed a non-painful vicarious touch, or were placed in a real touch condition (in which the participant watched no visual display and was touched by a paintbrush). The results showed a significant overlap in SI for the contrast between observed and real touch. In other words, SI acts as a mirror system for non-painful touch. Also, among all sub-regions of SI, BA2 accounted for around 50% of the overlapping activation.

However, like previous studies, this fMRI study risked validity by using a paintbrush rather than a more realistic interpersonal stimulus. Also, although the study hinted at an influence of vicarious touch on social perception, it did not directly test any social-emotional measure as a dependent variable. A recent study by Schirmer et al. (2014) developed the Social Touch Picture Set (SToPS), in which each image contains two characters. The interaction between the two characters can be either touch or no-touch. Also, the interaction can be reciprocal (with no distinct toucher and receiver) or non-reciprocal (where one character is the toucher and the other is the receiver). This picture set enables the presentation of a more realistic vicarious interpersonal touch scene than the previously used paintbrush stimulation. In their study, participants watched the SToPS images and then rated the perceived valence, arousal, and likeability of the characters and of the interaction. When participants observed touch images, the characters and the interaction seemed more positive, arousing, and likeable than when they observed no-touch images. Moreover, an eye-tracking system recorded participants' gaze patterns while they observed the touch/no-touch images. The results showed more and longer fixations on the upper body area rather than on the expected touching area.

Given these unexpected results, it is interesting to locate the exact position(s) within the upper body area that people focus on after observing vicarious touch images. Among all upper body areas, the facial area is perhaps the most important for social information processing. As Allison, Puce and McCarthy (2000) noted in their review of social perception, facial information is crucial to our daily communication because the human face provides not only multiple kinds of explicit information (e.g., age, sex, emotional state) but also implicit information. Therefore, the face is likely the "key" area of the upper body that we focus on after being primed with a vicarious interpersonal touch scene. More specifically, the enhanced social processing induced by observing interpersonal touch may bias our attention towards faces and help us process face-related information more deeply.

As no previous study has focused on the link between vicarious interpersonal touch and facial information processing, the current study addresses this topic and will test: 1) whether observing touch biases individuals to attend to faces, and 2) whether a potential face bias facilitates the retrieval of facial information.

Methods

Participants

A pilot study with 10 participants will be run first, and the effect size obtained there will be used to determine the sample size for the main study. A balanced number of male and female participants will be preferred, as sex differences may influence the results of the current experiment. Also, as the face images will all be Asian, we will only include Asian participants.

Experimental design

The current experiment is a 2 (Touch: touch/no-touch) x 2 (Stimulus: face/car) x 8 (AOI) x 2 (Old target: face/car) within-subject design, with looking duration, number of fixations, and the d' value for the recognition data as dependent variables.

Stimuli

SToPS images (vicarious interpersonal touch images). There are 480 SToPS images (see Figure 1 for an example), which can be divided evenly across the within-participant conditions of the current study. The SToPS images show ten different gestures, each of which can be either touch or no-touch. There are three actor versions of each image type, and each version includes four dyads: female-female, female-male, male-female, and male-male. This yields 10*2*3*4 = 240 images. A mirrored version of each image is also included to avoid handedness confounds, giving 480 vicarious interpersonal touch images in total.
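The image counts above can be verified with a quick arithmetic check; the following minimal sketch simply restates the factors given in this section and introduces no additional assumptions.

```python
# Quick check of the SToPS image counts described above.
gestures = 10        # distinct gestures
touch_levels = 2     # touch vs. no-touch
actor_versions = 3   # actor versions per image type
dyads = 4            # female-female, female-male, male-female, male-male

base_images = gestures * touch_levels * actor_versions * dyads  # 240
total_images = base_images * 2                                  # plus mirrored copies -> 480

print(base_images, total_images)  # 240 480
```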

Face and vehicle recognition task images. Instead of the house images traditionally used as control stimuli, vehicle images will be used in the current experiment. This is because the shape of a frontal vehicle image is comparable to a human face, which allows us to standardize the face and vehicle images in the same manner (see Figure 2 for an example).

Figure 1: Example of touch/no-touch images in the SToPS picture set.

Figure 2: Example of face and vehicle images.

Procedure

The experiment will contain two phases: a study phase and a test phase. In the study phase, each trial will start with a 200 ms presentation of a white fixation cross in the middle of the screen against a gray background. Participants will then be presented with either a face image or a car image for 1 s and will be asked to explore the picture and try to remember it. In total, we will present 60 face images and 60 car images during the study phase. The inter-trial interval will be 1-3 s, during which a blank screen is displayed. The study phase will last 6-7 minutes for the 120 images. Participants will then be told that this is the end of the study phase and that the test phase will begin (see Figure 3 for an illustration).

Figure 3: The procedure of the study phase.

The test phase will comprise 240 trials. In each trial, a fixation cross will be presented for 200 ms. Participants will then be primed with a Touch or No-touch image for 1 s. After the prime, a pair of target face/car images will be displayed on the computer screen for 1 s. For each prime type (Touch or No-touch), the target image pair will be one of the following combinations: FaceNew/CarOld (30 trials), FaceOld/CarNew (30 trials), or FaceNew/CarNew (60 trials). Target images will be paired with primes in a counterbalanced manner. After viewing the target pair, participants will have to decide whether they have encountered an old image. To indicate their decision, they will press the left or right button to mark the left or the right image as an old one; alternatively, if they decide that both images are new, they will press "Enter" to start the next trial. During the inter-trial interval, a gray blank screen will be presented for 1-3 s. Based on this design, each trial will last around 5 seconds, so the total length of the test phase will be around 20 minutes (see Figure 4 for an illustration).
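The trial counts and the roughly 20-minute estimate can be checked with the short sketch below; the 2 s mean inter-trial interval (the midpoint of the 1-3 s range) and the ~1 s response window are assumptions introduced here only to reproduce the per-trial estimate, not parameters fixed by the design.

```python
# Rough check of the test-phase trial count and duration estimate.
trials_per_prime = {"FaceNew/CarOld": 30, "FaceOld/CarNew": 30, "FaceNew/CarNew": 60}
prime_types = ["Touch", "No-touch"]

total_trials = len(prime_types) * sum(trials_per_prime.values())  # 2 * 120 = 240

fixation, prime, targets = 0.2, 1.0, 1.0   # seconds, as specified above
mean_response, mean_iti = 1.0, 2.0         # assumed values, not fixed by the design
trial_duration = fixation + prime + targets + mean_response + mean_iti  # ~5.2 s

print(total_trials, round(total_trials * trial_duration / 60, 1))  # 240 trials, ~20.8 min
```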

Figure 4: The procedure of the test phase.

Measurements

Eye-tracking system. Eye-tracking data will be analysed using 8 AOIs (see Figure 5 for an illustration). We will measure the looking duration and the number of fixations during the test phase.
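As an illustration of how these two gaze measures could be derived from exported fixation data, a minimal sketch follows. The record format (x, y coordinates plus a duration in ms) and the rectangular AOI definitions are assumptions for illustration only; they do not correspond to the output of any particular eye tracker.

```python
# Minimal sketch: looking duration and number of fixations per AOI.
# Fixation records and rectangular AOIs are hypothetical placeholders.
from collections import defaultdict

def aoi_metrics(fixations, aois):
    """fixations: list of dicts with keys 'x', 'y' (pixels) and 'duration' (ms).
    aois: dict mapping AOI name -> (x_min, y_min, x_max, y_max)."""
    duration = defaultdict(float)
    count = defaultdict(int)
    for fix in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= fix["x"] <= x1 and y0 <= fix["y"] <= y1:
                duration[name] += fix["duration"]
                count[name] += 1
    return dict(duration), dict(count)

# Example with two of the eight AOIs (coordinates are placeholders):
aois = {"eyes": (300, 200, 500, 280), "mouth": (340, 380, 460, 440)}
fixations = [{"x": 410, "y": 240, "duration": 310},
             {"x": 400, "y": 410, "duration": 180}]
print(aoi_metrics(fixations, aois))  # eyes: 310 ms, 1 fixation; mouth: 180 ms, 1 fixation
```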

Figure 5: There will be 8 AOIs for each face image as well as for each vehicle image. Among the 8 AOIs, our interest will focus on the eye and mouth areas.

d' value for facial recognition. We will use d' (d-prime) to quantify differences in sensitivity to facial images. We will compute the hit rate H = hits / (hits + misses), the false-alarm rate F = false alarms / (false alarms + correct rejections), and the corresponding d' value for each participant and condition.
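A minimal sketch of this computation follows, using the standard signal-detection formula d' = z(H) - z(F). The log-linear correction (adding 0.5 to each cell) is an assumption added here to keep the z-transform finite when a rate equals 0 or 1; it is not part of the design described above.

```python
# Minimal sketch of the d' computation: d' = z(H) - z(F).
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction (assumed): add 0.5 to each cell so z() stays finite.
    h = (hits + 0.5) / (hits + misses + 1)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(h) - norm.ppf(f)

# Hypothetical example: one participant's face trials in one priming condition.
print(round(d_prime(hits=25, misses=5, false_alarms=8, correct_rejections=22), 2))
```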

Expected results

For the looking duration, we expect a main effect of Stimulus and an interaction between Touch and Stimulus (Figure 6).

Figure 6: We expect that: 1) in general, participants will look at face images longer than at vehicle images; 2) priming with touch images will increase the looking duration towards faces.

For the number of fixations, we expect a main effect of Stimulus and an interaction between Touch and Stimulus (Figure 7).

Figure 7: We expect that: 1) the number of fixations on face images will be larger than the number of fixations on vehicle images; 2) priming with touch images will increase the number of fixations on face images.

For facial recognition, we expect a main effect of Stimulus and an interaction between Touch and Stimulus on the d' values (Figure 8).

Figure 8: We expect that: 1) participants will be more sensitive to face images than to vehicle images; 2) sensitivity to face images will be enhanced by touch priming, but not by no-touch priming.

Discussion

The current study aims to examine the relationship between observing interpersonal touch and the processing of social information. We expect stimulus recognition and gazing to be enhanced for face as compared with vehicle stimuli. Moreover, this difference should be more pronounced for stimuli primed with touch images as compared with no-touch images. If the results match our expectations, we can conclude that observing touch biases individuals to attend to faces, and that this face bias facilitates the retrieval of facial information. Such findings would shed light on the link between observing interpersonal touch and facial information processing. Future studies could use fMRI to locate the brain areas that serve both the observation of interpersonal touch and facial information processing.

References

Allison, T., Puce, A., & McCarthy, G. (2000). Social perception from visual cues: Role of the STS region. Trends in Cognitive Sciences, 4(7), 267-278.

Barnett, K. (1972). A theoretical construct of the concepts of touch as they relate to nursing. Nursing Research, 21(2), 102-109.

Blakemore, S. J., Bristow, D., Bird, G., Frith, C., & Ward, J. (2005). Somatosensory activations during the observation of touch and a case of vision-touch synaesthesia. Brain, 128(7), 1571-1583.

Crusco, A. H., & Wetzel, C. G. (1984). The Midas touch: The effects of interpersonal touch on restaurant tipping. Personality and Social Psychology Bulletin, 10(4), 512-517.

Field, T. (2001). Touch. Cambridge, MA: MIT Press.

Fisher, J. D., Rytting, M., & Heslin, R. (1976). Hands touching hands: Affective and evaluative effects of an interpersonal touch. Sociometry, 416-421.

Gallace, A., & Spence, C. (2010). The science of interpersonal touch: An overview. Neuroscience & Biobehavioral Reviews, 34(2), 246-259.

Gobbini, M. I., & Haxby, J. V. (2007). Neural systems for recognition of familiar faces. Neuropsychologia, 45(1), 32-41.

Keysers, C., Kaas, J. H., & Gazzola, V. (2010). Somatosensation in social perception. Nature Reviews Neuroscience, 11(6), 417-428.

Kleinke, C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100(1), 78.

Perrett, D. I., Smith, P. A. J., Potter, D. D., Mistlin, A. J., Head, A. S., Milner, A. D., & Jeeves, M. A. (1985). Visual cells in the temporal cortex sensitive to face view and gaze direction. Proceedings of the Royal Society of London B, 223, 293-317.

Rossion, B., Gauthier, I., Tarr, M. J., Despland, P., Bruyer, R., Linotte, S., & Crommelinck, M. (2000). The N170 occipito-temporal component is delayed and enhanced to inverted faces but not to inverted objects: An electrophysiological account of face-specific processes in the human brain. NeuroReport, 11(1), 69-72.

Schaefer, M., Heinze, H. J., & Rotte, M. (2012). Embodied empathy for tactile events: Interindividual differences and vicarious somatosensory responses during touch observation. NeuroImage, 60(2), 952-957.

Schirmer, A., Reece, C., Zhao, C., Ng, E., Wu, E., & Yen, S. C. (2014). Reach out to one and you reach out to many: Social touch affects third-party observers. British Journal of Psychology.

Vallbo, A. B., Olausson, H., & Wessberg, J. (1999). Unmyelinated afferents constitute a second system coding tactile stimuli of the human hairy skin. Journal of Neurophysiology, 81(6), 2753-2763.
