Gesture - Volume 19, Issue 1, 2020
“When you were that little…”
Author(s): Josefina Safar
pp. 1–40
Abstract: In this article, I analyse how conventional height-specifier gestures used by speakers of Yucatec Maya become incorporated into Yucatec Maya Sign Languages (YMSLs). Combining video data from elicitation, narratives, conversations and interviews, collected from YMSL signers in four communities as well as from hearing non-signers in another Yucatec Maya village, I compare the form, meaning and distribution of height-specifiers in gesture and sign. Co-speech gestures that depict the height of upright entities, performed with a flat hand with the palm facing downwards, come to serve various linguistic functions in YMSLs: a noun for human referents, a verb GROW, a spatial referential device, and an element of name signs. Special attention is paid to how height-specifier gestures fulfil a grammatical purpose as noun classifiers for human referents in YMSLs. My study demonstrates processes of lexicalisation and grammaticalisation from gesture to sign and discusses the impact of gesture on the emergence of shared sign languages.

Emotion matters
Author(s): Rachel S. Levy and Spencer D. Kelly
pp. 41–71
Abstract: Recent theories and neural models of co-speech gesture have extensively considered its cognitive role in language comprehension but have ignored its emotional function. We investigated the integration of speech and co-speech gestures in memory for verbal information with different emotional connotations (positive, negative, or neutral). In a surprise cued-recall task, gesture boosted memory for speech of all three emotional valences. Interestingly, gesture was more likely to become integrated into memory for neutrally and positively valenced speech than for negatively valenced speech. The results suggest that gesture-speech integration is modulated by the emotional valence of speech, which has implications for the emotional function of gesture in language comprehension.

The more you move, the more action you construct
Author(s): Tommi Jantunen, Danny De Weerdt, Birgitta Burger and Anna Puupponen
pp. 72–96
Abstract: This paper uses motion-capture data, processed on corpus principles, to investigate the characteristics of head and upper-torso movements in constructed action and in regular narration (i.e., signing without constructed action) in Finnish Sign Language (FinSL). Specifically, the paper evaluates the validity of two arguments concerning constructed action: that constructed action forms a continuum with regular narration, and that constructed action divides into three subtypes (overt, reduced, and subtle). The results support the first argument but not directly the second. Because of the ambiguous position of reduced constructed action between the subtle and overt types, we argue that the present three-part typology of constructed action may need revising. As an alternative way of subcategorizing the phenomenon, we propose a division between strong and weak constructed action.

Gestures in patients’ presentation of medically unexplained symptoms (MUS)
Author(s): Agnieszka Sowińska and Monika Boruta-Żywiczyńska
pp. 97–127
Abstract: The aim of this paper is to explore speech-accompanying gesture use in the presentation of medically unexplained symptoms (MUS). The data are 19 video-recorded semi-structured interviews with patients presenting MUS. Four patterns of gestural behavior are established in symptom presentation: (1) no gesturing; (2) an overall low gesture rate; (3) an overall high gesture rate with a low rate for symptoms; (4) an overall high gesture rate with a high rate for symptoms. Patients with an overall low gesture rate tend to perform deictic gestures, pointing to the exact locations of their symptoms; those with an overall high gesture rate but a low rate for symptoms produce metaphoric gestures; and those who gesture at high rates both overall and for symptoms produce mainly iconic and metaphoric gestures. Although the exact factors behind these four gesturing patterns remain unclear, the findings encourage medical professionals to attend to the information conveyed in gesture in order to better understand patients' experience of MUS.

Learning from an avatar video instructor
Author(s): Nicholas A. Vest, Emily R. Fyfe, Mitchell J. Nathan and Martha W. Alibali
pp. 128–155
Abstract: Teachers often produce gestures, and, in some cases, students mimic their teachers' gestures and adopt them into their own repertoires. However, little research has explored the role of gesture mimicry in technology-based learning contexts. In this research, we examined variations in the rate and form of students' gestures when learning from a computer-animated pedagogical avatar. Twenty-four middle school students received a lesson on polynomial multiplication from a gesturing avatar video instructor. After the lesson, students were asked to explain what they had learned. Students varied in their gesture rates, and some produced gestures similar in form to the avatar's. Students whose gestures aligned with the avatar's gestures scored higher than those who did not produce such gestures. These results suggest that middle school students' gestures play a key role when learning a mathematics lesson from an avatar instructor.