Gesture - Volume 10, Issue 2-3, 2010
- Gesture and multimodal development
  Author(s): Michèle Guidetti and Jean-Marc Colletta, pp. 123–128 (6)
- Pointing gesture in young children: Hand preference and language development
  Author(s): Hélène Cochet and Jacques Vauclair, pp. 129–149 (21)
  This paper provides an overview of recent studies that have investigated the development of pointing behaviors in infants and toddlers. First, we focus on deictic gestures and their role in language development, taking into account the different hand shapes and the different functions of pointing, and examining the cognitive abilities that may or may not be associated with the production of pointing gestures. Second, we try to demonstrate that when a distinction is made between pointing gestures and manipulative activities, the study of children’s hand preference can help to highlight the development of speech-gesture links.
- Support or competition? Dynamic development of the relationship between manual pointing and symbolic gestures from 6 to 18 months of age
  Author(s): Claire D. Vallotton, pp. 150–171 (22)
  Dynamic Skills Theory (DST) posits that skills within domains may promote or suppress one another as they first develop, resulting in spurts of growth in one skill concurrent with regression in another. I test this premise by examining the development of two preverbal representational skills: manual pointing and symbolic gestures. Pointing is a robust early communicative gesture, indicating infants’ awareness of others’ attention, but limited in its ability to represent infants’ conceptual repertoires as they grow beyond the immediate environment. Symbolic gestures are more specific but less flexible representational tools. Both skills predict language, yet no study has addressed the effects of these skills on each other. I observed the gesturing behavior of 10 infants over 8 months in a gesture-rich environment to test the effects of each skill on the other. Supporting DST, results show that early pointing predicted earlier, but not more, symbolic gesturing, while symbolic gesturing suppressed pointing frequency.
- From gesture to sign and from gesture to word: Pointing in deaf and hearing children
  Author(s): Aliyah Morgenstern, Stéphanie Caët, Marie Collombel-Leroy, Fanny Limousin and Marion Blondel, pp. 172–202 (31)
  In this paper, we explore the issue of (dis)continuity between gestures and signs and between gestures and words by comparing three longitudinal follow-ups: of a hearing monolingual French-speaking child, a deaf signing child (LSF), and a hearing bilingual (French-LSF) child. Our study indicates that the development of the same manual form (the index-finger point) is influenced by the input children receive in the modalities they have at their disposal. Interestingly, the bilingual (French-LSF) child presents an intermediate profile as far as the number of points she uses is concerned. Our analyses do not enable us to differentiate pointing “gestures” from pointing used as a linguistic sign, since we could observe no systematic formal distinction. But our study suggests that pointing facilitates all three children’s entry into syntax: pointing gestures and/or signs are increasingly combined with words and/or signs, facial expressions, and gaze in complex linguistic productions, and they take on more and more deictic and anaphoric values.
- How the hands control attention during early word learning
  Author(s): Nancy de Villiers Rader and Patricia Zukow-Goldring, pp. 202–221 (20)
  We have proposed that gestures play a significant role in directing infants’ attention during early word learning when caregivers synchronize the saying of a word with a dynamic gesture; this synchronization brings sight and sound together, providing a basis for perceiving them as belonging together (Zukow-Goldring & Rader, 2001). To test this claim, we presented 9- to 14-month-old infants with videos of speakers using synchronous dynamic vs. static gestures (Study 1) or synchronous dynamic vs. asynchronous dynamic gestures (Study 2) while introducing a novel object. Eye tracking allowed us to measure where infants looked over time during the word–object pairings and during a test for word learning. We hypothesized that dynamic gestures would draw infants’ attention from the mouth to the object, that infants would attend more to the object at the time the word was spoken when the gesture was dynamic and synchronous with speech, and that synchrony of gesture and speech would result in better word learning. All three hypotheses were supported.
- Infant movement as a window into language processing
  Author(s): Laurel Fais, Julia Leibowich, Ladan Hamadani and Lana Ohira, pp. 222–250 (29)
  We demonstrate differential, systematic, cross-modal responses to language by contrasting the regularities of infant movement behavior in contexts in which infants are presented with language stimuli with those exhibited in the context of music. Using a detailed coding system, we show that infants recognize the social underpinnings of language and respond to language stimuli with vocal, gaze, head, and torso (but not arm or manual) movements that differ from those exhibited to music stimuli. We propose that measures of these bodily gestures can not only provide a reliable supplement to looking-time measures for gauging infant language abilities, but also uncover novel, richly textured nuances in our understanding of infant language acquisition.
- Children’s lexical skills and task demands affect gestural behavior in mothers of late-talking children and children with typical language development
  Author(s): Angela Grimminger, Katharina J. Rohlfing and Prisca Stenneken, pp. 251–278 (28)
  To evaluate the influence of lexical development and task demands on maternal gestural behavior, we observed 17 German-speaking mothers and their children with typical language development (TD), and 9 mothers and their late talkers (LT), aged 22–25 months, in task-oriented dialogues. Mothers instructed their children to put two objects together; canonical and, as more difficult tasks, noncanonical spatial relationships were requested. Deictic gestures were dominant in both groups and were used to reinforce speech. However, mothers of LT children gestured more than mothers of TD children and tended to hold their gestures throughout a complete utterance. Regarding task demands, all mothers gestured more in noncanonical settings, and this trend was more pronounced in mothers of LT children. Thus, certain aspects of gestural motherese (frequency and duration of gestures, but not redundancy) seem to ‘operate’ on a scale between task difficulty and children’s language skills, suggesting that maternal communicative behavior is fine-tuned to children’s learning process.
- The type of shared activity shapes caregiver and infant communication
  Author(s): Daniel Puccini, Mireille Hassemer, Dorothé Salomo and Ulf Liszkowski, pp. 279–296 (18)
  For the beginning language learner, communicative input is not based on linguistic codes alone. This study investigated two extralinguistic factors which are important for infants’ language development: the type of ongoing shared activity and non-verbal, deictic gestures. The natural interactions of 39 caregivers and their 12-month-old infants were recorded in two semi-natural contexts: a free play situation based on action and manipulation of objects, and a situation based on the regard of objects, broadly analogous to an exhibit. Results show that the type of shared activity structures both caregivers’ language usage and caregivers’ and infants’ gesture usage. Further, there is a specific pattern with regard to how caregivers integrate speech with particular deictic gesture types. The findings demonstrate a pervasive influence of shared activities on human communication, even before language has emerged. The type of shared activity and caregivers’ systematic integration of specific forms of deictic gestures with language provide infants with a multimodal scaffold for a usage-based acquisition of language.
- Transcribing and annotating multimodality: How deaf children’s productions call into question the analytical tools
  Author(s): Agnès Millet and Isabelle Estève, pp. 297–320 (24)
  This paper deals with the central question of transcribing deaf children’s productions. We present the annotation grid we created in ELAN, explaining in detail how and why the observation of the narrative productions of 6- to 12-year-old deaf children led us to modify the annotation schemes previously available. Deaf children resort to every resource available in both modalities, voice and gesture, so their productions are fundamentally multimodal and bilingual. In order to describe these specific practices, we propose considering verbal and non-verbal, vocal and gestural materials as parts of one integrated production. A linguistics-centered transcription is not adequate for describing such bimodal productions, since describing bimodal utterances implies taking into account the ‘communicative desire’ (‘vouloir-dire’) of the children. For this reason, both the question of the transcription unit and the issue of the complexity of semiotic interactions in bimodal utterances need to be reconsidered.
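  As an illustration of the kind of multi-tier, time-aligned annotation the abstract describes, here is a minimal sketch that builds an ELAN (.eaf) file with parallel vocal and gestural tiers using the pympi-ling Python library. The tier names and annotation values are hypothetical examples, not the authors’ actual grid.

  ```python
  # A minimal sketch, assuming the pympi-ling library (pip install pympi-ling).
  # Tier names and annotation values are hypothetical, not Millet & Estève's grid.
  import pympi

  eaf = pympi.Elan.Eaf()

  # One tier per semiotic channel, so vocal and gestural material can be
  # read together as parts of one integrated, time-aligned production.
  for tier in ("vocal", "gesture", "gaze"):
      eaf.add_tier(tier)

  # Overlapping annotations (times in milliseconds) on parallel tiers capture
  # bimodal utterances that a single linguistic transcript would flatten.
  eaf.add_annotation("vocal", 1000, 1800, "le chien")  # 'the dog'
  eaf.add_annotation("gesture", 1100, 2000, "index point at dog picture")
  eaf.add_annotation("gaze", 1100, 1600, "toward addressee")

  eaf.to_file("narrative_child01.eaf")  # open in ELAN for inspection
  ```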
- Mathematical learning and gesture: Character viewpoint and observer viewpoint in students’ gestured graphs of functions
  Author(s): Susan Gerofsky, pp. 321–343 (23)
  This paper reports on a research project in mathematics education involving the use of gesture, movement and vocal sound to highlight mathematically salient features of the graphs of polynomial functions. Empirical observations of students’ spontaneous gesture types when enacting elicited gestures of these graphs reveal a number of useful binaries (proximal/distal, being the graph/seeing the graph, within sight/within reach). These binaries inform an analysis of videotaped gestural and interview data and appear to predict teachers’ assessments of student mathematical engagement and understanding with great accuracy. Reframing these data in terms of character viewpoint (C-VPT) and observer viewpoint (O-VPT) adds a further layer of sophistication to the analysis and connects it with deeper findings in cognitive science, neuroscience, and gesture studies.
Volumes & issues
- Volume 20 (2021)
- Volume 19 (2020)
- Volume 18 (2019)
- Volume 17 (2018)
- Volume 16 (2017)
- Volume 15 (2016)
- Volume 14 (2014)
- Volume 13 (2013)
- Volume 12 (2012)
- Volume 11 (2011)
- Volume 10 (2010)
- Volume 9 (2009)
- Volume 8 (2008)
- Volume 7 (2007)
- Volume 6 (2006)
- Volume 5 (2005)
- Volume 4 (2004)
- Volume 3 (2003)
- Volume 2 (2002)
- Volume 1 (2001)
Most Read This Month
- Home position
  Author(s): Harvey Sacks and Emanuel A. Schegloff
- Depicting by gesture
  Author(s): Jürgen Streeck
- Some uses of the head shake
  Author(s): Adam Kendon