Map of the various sign languages spoken across Turtle Island, excluding Francosign languages. Plains Sign Language is labelled in red as Hand Talk.
Plains Indian Sign Language (PISL), also known as Hand Talk, Plains Sign Talk, Plains Sign Language, or First Nation Sign Language, is an endangered sign language common to the majority of Indigenous nations of North America, notably those of the Great Plains, Northeast Woodlands, and the Great Basin. It was, and continues to be, used across what is now central Canada, the central and western United States and northern Mexico. This language was used historically as a lingua franca, notably for international relations, trade, and diplomacy; it is still used for story-telling, oratory, various ceremonies, and by deaf people for ordinary daily use.
In 1885, it was estimated that there were over 110,000 "sign-talking Indians", including Blackfoot, Cheyenne, Sioux, Kiowa, and Arapaho. As a result of the European colonization of the Americas, most notably including American boarding and Canadian residential schools, the number of sign talkers has declined sharply. However, growing interest and preservation work on the language has increased its use and visibility in the 21st century. Historically, some have likened its more formal register, used by men, to Church Latin in function. It is primarily used today by Elders and Deaf citizens of Indigenous nations.

History
Hand Talk's history is intimately associated with both ancient and recent petroglyphs of the continent; however, little is known to academia about Plains Sign Talk's historical antecedents. The earliest records of contact between Europeans and Indigenous peoples of the Gulf Coast region, in what is now Texas and northern Mexico, note a fully formed sign language already in use by the time of the Europeans' arrival there. These records include the accounts of Cabeza de Vaca in 1527 and Coronado in 1541.
Signing may have started in the south of North America, perhaps in northern Mexico or Texas, and spread into the Plains only in recent times, though this impression may be an artifact of European observation. A complex of Maya sign languages, called Meemul Chʼaabʼal or Meemul Tziij in the Kʼicheʼ language, is known to exist, but it is unknown to what extent Meemul Tziij has influenced Hand Talk.
The Northwest is home to Plateau Sign Language, either a single language or a family of sign languages spoken by the local nations. It is also unknown how closely Plateau Sign Language is related to Hand Talk, though a relationship is probable. Although it is still spoken, especially by the Ktunaxa, the Plateau nations historically shifted to using Chinook Jargon instead.
Historically, the nations of the Northeast Woodlands, such as the Haudenosaunee, spoke a variant of Hand Talk. In recent years, the Oneida Nation has taken steps to revive their sign language: the Oneida Sign Language Project officially began in 2016, and more signs are being added to this day.
Geography
Sign language use has been documented across speakers of at least 37 spoken languages in twelve families, spread across an area of over 2.6 million square kilometres (1 million square miles). In recent history, it was highly developed among the Crow, Cheyenne, Arapaho and Kiowa, among others, and remains strong among the Crow, Cheyenne and Arapaho.
Melanie R. McKay-Cody, a Cherokee Deaf woman and Hand Talk speaker and researcher, argues that "Plains" Sign Language is actually a family of interrelated languages extending beyond the Great Plains. She breaks the regional languages down as Northeast Hand Talk (including Oneida Sign Language), Plains Sign Language, Great Basin Sign Language (spoken, for example, by the Ute), and Southwest Hand Talk. She also notes a West Coast language spoken by the Chumash and advances the idea that Inuit Sign Language has some relation to this complex of manual North American Indigenous languages; Coast Salish Sign Language goes unmentioned. Within each of these languages, she explains, individual nations have their own dialects, such as that of the Blackfoot.
Southwest Hand Talk is spoken by the Navajo, Hopi, Apache, and Pueblo peoples. However, among the Navajo and Keres people, two unrelated sign languages are also spoken: Keresan Sign Language and, by a Navajo clan with a large number of deaf members, Navajo Family Sign. Likewise, Plateau Sign Language may or may not be related to Hand Talk.
I’m making a bot for my roguelike game and I’m getting a lot of pushback from my friends for using “AI”. I’m making little autonomous bots that use textual embeddings and a sprinkle of local LLMs to help plan their actions so they can seem real. No AI slop is shown to the user; it’s all behind the scenes, for the bots to plan their actions and navigate the world as I work around the limitations of reinforcement learning. I publish my experiments on a microblog, and apparently people were gossiping that “I was doing something bad (with AI)”, and I was met with a lot of hostility when I tried to share it with friends. Made me feel really bad, but maybe they’re right?
so instead of a state machine or a bunch of conditionals, you're using machine learning?
that sounds really interesting, is there a performance hit for it?
That’s correct. A decision tree is very similar to a classification task on some input state. So we invert the problem: we just need to determine which actions are valid, give a negative reward to invalid actions, and then let the algorithm figure out the best sequence to take.
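Not from the post itself, but a minimal sketch of the “negative reward for invalid actions” idea, assuming a gym-style step interface; the `DoorWorld` toy environment and `INVALID_PENALTY` are made up for illustration:

```python
# Toy sketch: the environment only needs to know which actions are valid
# and to hand out rewards; the learner discovers the sequence on its own.
import random

INVALID_PENALTY = -1.0

class DoorWorld:
    """Toy world: the agent must pick up a key, then unlock a door."""
    ACTIONS = ["move", "pick_up_key", "use_key_on_door"]

    def __init__(self):
        self.has_key = False

    def is_valid(self, action):
        # "use_key_on_door" is only valid once the agent holds the key.
        return action != "use_key_on_door" or self.has_key

    def step(self, action):
        if not self.is_valid(action):
            return INVALID_PENALTY, False   # punish invalid choices
        if action == "pick_up_key":
            self.has_key = True
        elif action == "use_key_on_door":
            return 1.0, True                # goal reached
        return 0.0, False

# Random rollout for demonstration; a learned policy (e.g. PPO) would
# replace random.choice and maximize the return instead.
env, done, total = DoorWorld(), False, 0.0
while not done:
    reward, done = env.step(random.choice(DoorWorld.ACTIONS))
    total += reward
print("episode return:", total)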
Getting a story-relevant action sequence is something else. Let’s say you have a goal of “unlock the door”. This is semantically similar to “Get key. Use key on locked door” (when using text embeddings and cosine similarity). I’m using proximal policy optimization / MCTS to find the sequence of events that best fits the goal narrative by simulating the environment. For more complex goals like “leave through the exit”, I’m using an LLM to generate intermediate goals in plain English. There are a lot of limitations and it’s very experimental, but it works well enough to allow arbitrary goals.
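As a rough sketch of the goal-scoring idea: embed the goal and each candidate action sequence, then rank by cosine similarity. The model name and the candidate sequences below are placeholders; any sentence-embedding model would do.

```python
# Score candidate action sequences against a narrative goal using
# text embeddings and cosine similarity.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

goal = "unlock the door"
candidates = [
    "Get key. Use key on locked door.",
    "Attack the merchant. Flee town.",
    "Eat bread. Sleep until morning.",
]

goal_vec = model.encode(goal)
for seq, vec in zip(candidates, model.encode(candidates)):
    print(f"{cosine(goal_vec, vec):.3f}  {seq}")
```

The highest-scoring sequence is the one semantically closest to the goal, and that score can serve as the reward signal the PPO/MCTS search described above tries to maximize.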
It is less computationally complex than an exhaustive search, and it is also “online”, so we can use it and continue to train it at the same time; we just get more optimal actions over time.
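The post uses PPO/MCTS; as a simpler stand-in that fits in a comment, here is tabular Q-learning on the same door/key toy, just to show the “online” property: the policy improves episode by episode instead of enumerating every possible action sequence up front. The states, actions, and hyperparameters are all illustrative.

```python
# Online learning sketch: Q-values are updated after every step, so the
# agent gets better as it plays, with no exhaustive sequence enumeration.
import random
from collections import defaultdict

ACTIONS = ["move", "pick_up_key", "use_key_on_door"]

def step(state, action):
    """state = (has_key, door_open); returns (next_state, reward, done)."""
    has_key, _ = state
    if action == "use_key_on_door" and not has_key:
        return state, -1.0, False          # invalid action: negative reward
    if action == "pick_up_key":
        return (True, False), 0.0, False
    if action == "use_key_on_door":
        return (True, True), 1.0, True     # goal reached
    return state, 0.0, False               # "move" changes nothing here

Q = defaultdict(float)                      # Q[(state, action)] -> value
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(200):                        # learn online, episode by episode
    state, done = (False, False), False
    for _ in range(20):                     # cap episode length
        if random.random() < eps:
            action = random.choice(ACTIONS)                     # explore
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])  # exploit
        nxt, reward, done = step(state, action)
        target = reward + gamma * max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += alpha * (target - Q[(state, action)])
        state = nxt
        if done:
            break

start = (False, False)
print(max(ACTIONS, key=lambda a: Q[(start, a)]))  # -> "pick_up_key"
```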
that’s really cool honestly