Three years of gesture with Silent Sign (recording made 2003).
This eleven-year-old student re-tells a story learnt through Silent Sign.
This student and the others in his class moved up to secondary school with competent speaking and listening skills in English (B1/B2).
Before discussing how long the Silent Sign tool should be used in the English language classroom, I will summarize the benefits of working with a gesture code. Gestures in the English language classroom can help the teacher to do the following.
1) Transfer meanings of words to hand gestures so as to later elicit language from students at any time.
2) Maintain meaning and understanding in the classroom even though communication is at a spoken level and there are no or few written clues.
3) Facilitate oral input of further new vocabulary and structures thanks to enhanced understanding.
4) Allow students to practise spoken English for a greater proportion of class time and so develop speaking skills faster.
5) Require students to make multiple decisions about spoken English vocabulary and structures at the sentence level during intensive input.
6) Accelerate input of new and recycled language orally during English classes through speed of delivery and instant recognition of the hand gestures, their meanings and sounds.
As we can see, all the above points are concerned with facilitating comprehension of spoken English in the classroom. In elementary English courses where a gesture code is not used, many teachers feel the written word is necessary. Words provided in written form help students visualize their phonemic value. Without words written on the board, spoken English can be very difficult for beginners to understand and follow. The "writing tool" allows learners to assimilate the sounds and separate one word from another.
In one way, gesture is similar to the writing tool in that both allow students to visualize words. The important difference is that gestures remove the phonemic clues that writing provides and ensure students produce the sounds from their own interlanguage knowledge. Gestures convey meanings, not phonetics. Furthermore, when students "read" the gestures, they must also interpret them: conjugate verbs and construct parts of speech, taking an active cognitive part in production. Writing, in comparison, is a passive phonetic transcription of spoken language that makes few calls on a learner's interlanguage. Indeed, a learner of English can read aloud a written sentence in the target language without understanding a single word. The essence of gesture and the Silent Sign technique, then, is that learners produce utterances from input which is itself meaning, albeit in iconic form. Including text as input for practising the target language may obscure the task of acquisition by adding elements that are often indecipherable for learners. When text is removed from the essential input, learners can focus on a direct fusion of meaning into utterance, which should enable them to concentrate fully on holistic spoken language skills development.
If a gesture code compensates for difficulties in comprehending the spoken foreign language, it follows that once students can understand sufficient spoken language unaided, gestures are no longer needed. A time comes when the teacher can explain new vocabulary and structures in the target language, and students understand, give feedback, ask questions and converse with the teacher and peers relying on their listening skills.
The longest GestureWay courses I have given have lasted three years. Students who started at seven years of age and finished at nine or ten achieved lower fluency than those who started at eight or nine and finished at eleven or twelve, the latter group reaching a B1 to B2 level. This is probably due to the more mature cognitive skills and metalinguistic awareness of the older students. The Silent Sign gesture approach as I have explained it here could ideally start from seven years old onwards. Younger learners could well flounder trying to interpret raw gestures of this type.
However, though I can offer no personal experience of this at the moment, learners could start with gestures in pre-school and early primary. The nature of the gesture input, I believe, should be different from the video examples you have seen on this site. Less metalinguistic skill could be demanded, so gestures would be more literal, representing non-abstract words, child-language collocations, simple set phrases and accompanying songs and chants.
Article based on findings from a PhD thesis (Bilbrough 2017).
Copyright © 2019 Mike Bilbrough
All rights reserved