Protactile is a language used by DeafBlind people that relies on tactile channels. Unlike other sign languages, which depend heavily on visual information, protactile is oriented toward touch and is practiced on the body. Protactile communication originated among DeafBlind people in Seattle in 2007 and incorporates signs from American Sign Language. Protactile is an emerging system of communication in the United States, with users relying on shared principles such as contact space, tactile imagery, and reciprocity.

Protactile
Protactile American Sign Language
Native to: United States
Region: Washington, Oregon
History

In 2007, a group of three DeafBlind women working at the Deaf-Blind Service Center in Seattle, aj granda, Jelica Nuccio, and Jackie Engler, communicated with each other in American Sign Language (ASL) through the use of interpreters.[1] Relying on ASL meant the group either had to use interpreters to communicate simultaneously or was limited to conversations between just two people at a time (using hand-over-hand signing).[1] The three worked together to devise ways to talk with each other directly, using their sense of touch as the primary source of information.[2] They began inviting other DeafBlind people into their conversations and interacting using these new communication practices.[2]

In describing the origin of protactile, granda and Nuccio write:[1]

It happened organically. We didn't "invent" [protactile]. What we did was use our positions at the DeafBlind Service Center to set up programs and events that would put DeafBlind people in a teaching role more often. And then when practices started really changing, we created a politics around it. We labeled things, and tried to document what was happening.

Description

Protactile has emerged in communities of people who were born deaf, learned ASL as children, then gradually lost their sight over decades, as is common in Usher syndrome.[3] Leaders and educators granda and Nuccio describe a "protactile movement" that empowers DeafBlind people with a sense of community, with a language in their preferred modality providing a remedy to the isolation imposed by hearing and sighted culture.[4] They describe a protactile philosophy as supporting DeafBlind culture, relationships, and politics.[4] Protactile is described by Helen Keller Services for the Blind as "much more than a system of touch signals," instead "a philosophy and a movement which focuses on autonomy and equality for people who are deaf-blind."[5]

In protactile, communication takes place through touch and movement, focused primarily on the hands, wrists, elbows, arms, upper back, and, when seated, the knees and tops of the thighs.[6] In formal instruction of protactile while sitting and facing a conversation partner, the "listening hand" rests on the other participant's thigh with the thumb, index finger, and pinky extended.[7] For example, several rapid taps on the thigh with all four fingers indicate "yes," whereas a rapid back-and-forth brushing movement with the fingers indicates "no."[7]

Tactile maps are used in protactile, communicating spatial information about the environment to the DeafBlind person.[6] A map can be drawn on a recipient's hand, arm, or back to describe surroundings or give directions.[6]

Instead of the "air space" used in visual sign languages (that is, the space around a signer's body), protactile is rooted in "contact space."[8] While ASL and other sign languages rely on handshape as one of the core components distinguishing one sign from another, in protactile the handshape is less important than the sensation received; for example, a series of tapped signs made with different handshapes would all be received simply as taps, with the handshapes indistinguishable.[9]

Reciprocity

A significant innovation in protactile involves the concept of reciprocity.[10] Communication partners are encouraged to use the same communication method (as opposed to using signed or spoken language along with protactile) to ensure vision is not unduly privileged.[1] Sharing experience is a core principle of protactile, with tactile imagery evoking sensations in storytelling in the same way that facial expressions do in a conversation between sighted people.[1]

Serving the same function as body language or verbal acknowledgments (such as "mm-hmm" or "yeah"), tactile backchanneling allows for smoother communication in protactile conversations. Tapping the partner's arm or leg during pauses or as confirmation of understanding serves as a continuous loop of backchannel feedback.[6] Agreement, disagreement, laughter, and other responses are signaled using manual cues.[6] These cues are not standardized, but are developed according to the needs of the individual and specific situation.[5]

Education and impact

The DeafBlind Interpreting National Training and Resource Center was launched in 2017 as a resource for DeafBlind people.[11] The Center's staff work to train protactile interpreters; as DeafBlind author John Lee Clark writes, "instead of providing 'accurate and objective information' in a way that unsuccessfully attempts to create a replica of how they're experiencing the world, Protactile interpreters must be our informants, our partners, our accomplices."[11]

A grant from the National Science Foundation led to the creation of a hybrid learning environment for young deafblind children.[12] The DeafBlind Kids! website provides parents and caretakers with information about protactile concepts such as tactile exploration, backchanneling, and co-presence.[12]

Protactile communication fosters inclusion and autonomy by providing DeafBlind people with more information about their environment.[13] More robust communication leads to fewer misunderstandings and a greater sense of involvement and connection.[13]

References

  1. ^ a b c d e granda, aj; Nuccio, Jelica. "Protactile Principles" (PDF). World Association of Sign Language Interpreters. Tactile Communications. Retrieved February 12, 2022.
  2. ^ a b Van Wing, Sage (January 5, 2022). "New Protactile language emerges in Oregon". Oregon Public Broadcasting. Retrieved February 12, 2022.
  3. ^ Edwards, Terra; Brentari, Diane (2021). "The Grammatical Incorporation of Demonstratives in an Emerging Tactile Language". Frontiers in Psychology. 11: 579992. doi:10.3389/fpsyg.2020.579992. ISSN 1664-1078. PMC 7838441. PMID 33519599.
  4. ^ a b granda, aj; Nuccio, Jelica (March 2016). "Pro-Tactile Vlog #5". Pro-Tactile: The DeafBlind Way. Retrieved February 12, 2022.
  5. ^ a b "Touch Signals Terminology & Signs". Helen Keller Services. Retrieved February 12, 2022.
  6. ^ a b c d e Collins, Steven D. "Pro-Tactile : Empowering Deaf-Blind People" (PDF). Human Development Center. Louisiana State University Health Sciences Center New Orleans. Retrieved February 12, 2022.
  7. ^ a b "Unit 2: Proper Hand Placement and Use". DeafBlind Kids. Retrieved February 12, 2022.
  8. ^ Edwards, Terra; Brentari, Diane (2020). "Feeling Phonology: The conventionalization of phonology in protactile communities in the United States". DeafBlind Culture and Community. Retrieved February 12, 2022.
  9. ^ Nuccio, Jelica; Clark, John Lee (2020). "Protactile Linguistics: Discussing recent research findings". Journal of American Sign Languages and Literatures. Retrieved February 12, 2022.
  10. ^ Yeh, James (December 1, 2020). ""New kinds of contact": A DeafBlind poet's push for a radical language of touch". Inverse. Retrieved February 12, 2022.
  11. ^ a b Clark, John Lee (2021). "Against Access". McSweeney's Quarterly Concern. 64. Retrieved February 12, 2022.
  12. ^ a b "DeafBlind Kids!". DeafBlind Kids!. Retrieved February 12, 2022.
  13. ^ a b "Q&A: How Pro-Tactile American Sign Language — PTASL — is changing the conversation". Perkins School for the Blind. October 2018. Retrieved February 12, 2022.