Meredith Small's Blog
August 4, 2017
The Anthropology of Alzheimer's
Anthropologists are not often in the news, and they are even more rarely in the spotlight for discovering what might be a connection between a human ill and how other people live. But the anthropologist Ben Trumble was recently profiled (or his work was) in The New York Times for suggesting that Alzheimer's is so common in Western culture because we are devoid of regular parasitic infection.
Trumble's work comes from a long-term and collaborative study of the Tsimane of Bolivia. The Tsimane live in savannas and forests and they make a living by growing things to eat, foraging and hunting. Although many South American cultures have been affected by missionary or governmental zeal to acculturate them into Western ways, the Tsimane have resisted and maintain their identity and way of life.
Researchers have followed the Tsimane for years, and their main focus has been Tsimane health and welfare over the human life cycle. Trumble focuses on aging, and contrary to what Westerners might expect, once successfully past childhood, many Tsimane live into their 90s and are physically very healthy. Trumble became especially interested in the possible role of Alzheimer's and dementia in their lives.
Trumble put together what is currently known about the genetics and biology of Alzheimer's with the particular life circumstances of the Tsimane and came up with the possibility that being subject to repeated parasitic infections might hold a key to understanding why some people exhibit Alzheimer's and others do not.
Working with the Tsimane, Trumble compared the incidence of the ApoE4 gene, which, when present in two copies, increases the risk of Alzheimer's tenfold in the West, to the results of cognitive tests with his volunteers. He found that elderly Tsimane who had a copy of the gene in fact did better on those tests than those with no ApoE4.
But that's not the really surprising part. Trumble also compared these results, genetic and cognitive, with parasite load. It seems that those with the gene, in single or double form, who also carried parasites retained their mental acuity, while those without parasites and even one copy of ApoE4 often did not. They looked like Westerners in terms of cognitive ability.
In other words, what we now see as a "bad" gene might have been protective and "good" when our ancestors lived a less parasitic life. Those genes might have evolved to protect our aging brains from parasites and now they are, instead, damaging brain cells because their evolutionary environment has changed.
Published on August 04, 2017 11:02
July 21, 2017
Take a Little Walk With Me
One of the most amazing things humans have ever done is walk around the world.
I don't mean one of those adventures where someone walks from Maine to Florida to lose weight, or California to New York just to prove they can. No, I mean that humans, as a species, have walked and walked and walked across the globe.
This desire to keep going must have very ancient roots for us. Early humans evolved in Africa, but about 1.5 million years ago a species we now call Homo erectus walked out of Africa and headed east. They ended up in China and Indonesia and probably lots of other places we haven't discovered yet.
That first wave of world walkers was then followed, again and again, by various early humans and eventually by what we think of as fully modern humans, or Homo sapiens sapiens.
And those people traveled even further, all the way to Australia, populating the Pacific Islands as they went. They were helped by land masses that have since been covered over by the Pacific and other bodies of water. But still, that's a long way to go to settle down.
Anthropologists have not been very sure about the exact dates of that exodus, but this week new archaeological evidence from Australia helped fill in the picture.
Preserved campfires, mortars and pestles, stone tools and painting materials confirm that modern humans, who evolved about 200,000-300,000 years ago in Africa, reached Australia pretty quickly, kept to themselves, and became the aboriginal people of today.
In fact, some of the researchers on the archaeology team were aborigines, a telling connection to the artifacts they were discovering.
This new archaeological work confirms earlier research that showed a date of at least 50,000 years ago for the ancestral aboriginal population.
But why did anyone leave Africa? And when they left, why in the world did they go so far?
We are not talking about a road trip that covered miles in hours, but an expedition that spanned continents in tens of thousands of years. One can only guess that they were following food (as in animals that walk around) or looking for food (as in vegetable matter during a drought), or maybe running away from something. But what? Over and over? We wait to hear more.
And by the way, Australia, a young nation? Yeah, no. Not at all.
Published on July 21, 2017 16:35
July 5, 2017
The Plague Is Back
As the New Mexico Department of Health recently reported, plague has turned up there again.
There are, it seems, three kinds of plague—pneumonic, which brings on a dangerous pneumonia; septicemic, which makes the skin and other body parts turn black; and bubonic, which causes the lymph nodes to swell to the size of baseballs. In other words, plague is not just a generic term for any disease that spreads rapidly and potentially kills millions. Plague is, instead, caused by a particular bacterium called Yersinia pestis.
What makes plague different from other contagious diseases is the rapidity of its spread and the high risk of fatality if not treated.
We can blame rodents, and the fleas that bite them, for spreading plague to people, but who we really should blame is ourselves. Plague, like any infectious disease, is a disease of civilization. That is, these scourges can spread because humans like to live in big groups, and those groups are just the host that Yersinia pestis thrives on.
Over the centuries, plague has devastated various populations and changed the history of the Middle Ages and the Renaissance in Europe. But plague also laid the cornerstone of public health, and that happened in Venice, Italy.
In 1348, Venice was the first city that attempted to stop the spread of disease by making ships anchor offshore for 40 days (quaranta is Italian for 40, thus quarantine), and this was before anyone had a good idea how diseases spread. That quarantine was a sign of the strength of the Venetian Republic's desire to protect its citizens and its commerce, which depended on ships coming and going.

That same year Venice implemented a rule that anyone who died of plague had to be buried on remote islands far away from the city, suggesting they knew, or hypothesized, that one could get this disease even from a corpse.
In 1423 Venice then built a permanent plague hospital on a lagoon island just offshore, Santa Maria di Nazareth, where victims and their families were deported at the first sign of plague. Eventually that place was called Lazaretto (after Lazarus, a hope that someone could rise from the dead, even from plague). Other cities then followed Venice's example.
And today we quarantine everything from people to plants as a standard public health measure, the best, and easiest, just-in-case measure to protect large populations from each other.
Published on July 05, 2017 11:03
June 12, 2017
Baby Food
My daughter's first solid food, at three months of age, was ossobuco, the Italian dish of meat, wine and vegetables. If we had been Italian, or even visiting Italy, this would have made sense.
In fact, if she had been gumming her mashed up ossobuco in a restaurant in Italy the other diners would have ignored her, or clapped.
Since then, she has gone through phases of liking or not liking particular foods, but this past weekend she texted me a photograph of herself eating jellyfish, and I thanked the ossobuco and the wide array of dishes that she's been offered over nineteen years for her adventurous gastronomic spirit.
But apparently, loading her baby spoon with food from other cultures is not all that was going on at our table. In a series of experiments with over 200 one-year-olds, developmental psychologists Zoe Liberman and Katherine Kinzler of Cornell University watched babies as the babies watched films of adults eating. This research protocol is a walk in the park because babies are fascinated with other people, and when they gaze at someone for a long time, it's meaningful. In general, the babies paid little attention if one person liked a food and the next person did as well, but they stared longer, presumably confused, when the subsequent diner was disgusted by the test food.
More important, the babies also made layered social distinctions. If the two diners acted like friends and spoke the same language the babies expected them to like the same things. If they acted like enemies or spoke different languages, the babies expected different reactions to food.
We know that what we eat is highly cultural. Just discuss cupcakes and orange soda with the Maasai and watch them make the yuk face while we, in Western culture, would be hard pressed to drink a cocktail of milk and blood. Every culture’s diet is based on a particular kind of subsistence pattern linked to such mundane things as climate, topography, and available raw materials. The recent study shows that babies are not just being indoctrinated into their own cultures by the foods they are offered. They are also innately clocking people who look or talk the same or different, noting enemies and friends, figuring out who to trust and who not to trust. In other words, eating with others is one way babies go about filling in their social map. And eating alone is lonely.
As such, food is not just a cultural moment or a window to the past, it is not just identity or nutrition. Food and what we like or dislike is also one of the threads of connection that signal someone is one of us or not, a point of social communication that even infants recognize.
If I had known all this nineteen years ago, I might have paid more attention to the context of my daughter's first real meal. I would have seen her taking note of the reactions by the people at the table, good friends and devoted foodies who loved ossobuco. Her growing baby brain, already geared to such calculations, would have surely digested the fact that these people were part of our tribe and that she was culturally home.
Published on June 12, 2017 11:54
May 25, 2017
The Monkey in the Coal Mine
A recent outbreak of yellow fever in Brazil has frightened people into blaming, and killing, monkeys.
In fact, the monkeys have nothing to do with it, and authorities are now begging citizens to stop killing them.
Yellow fever is transmitted by mosquitoes, not mammals, and certainly not monkeys. As the death toll shows, these animals, fellow primates, are just as vulnerable to yellow fever as humans, maybe more so.
And of course, the fault is really ours. Slash and burn agriculture, deforestation, and climate change have made swamps out of large swaths of tropical forest. Swamps where mosquitoes thrive. The human touch, fueled globally by greed, is turning a once pristine ecosystem into a charnel house.
In that scenario, monkeys are actually useful and shouldn't be bludgeoned to death because they can be harbingers of infectious disease. (This sort of explanation, pointing out how some animal should be saved because it's useful to humans, pisses me off. But then I don't think humans are in charge of everything and every creature.)
That is, Brazilian authorities point out, the monkeys are the tropical equivalent of "canaries in the coal mine." Miners used to bring caged canaries into the mines and when a canary died, they knew it was time to get out of that hole as soon as possible.
Danilo Simonini Teixeira, the president of the Brazilian Society of Primatology, says that people living in areas gripped with yellow fever don't seem to understand that monkeys are crucial to signaling the onset and march of diseases. Monkeys and humans are closely related primates, and so when monkeys start dying it means something bad for humans.
Also, monkey deaths from yellow fever are putting some species, such as the golden lion tamarin, at risk of extinction.
Brazil has the greatest diversity of primate species on earth, and what a shame to lose any of it at the direct hand of humans, as if the human-caused habitat destruction weren't enough.
When we are scared, we pick on the vulnerable, even when it wasn't their fault, even when they had absolutely nothing to do with it, even if they are suffering as well.
And even when those vulnerable are so incredibly beautiful.
Published on May 25, 2017 11:45
May 22, 2017
A Head for the Future
Oh, the endless thinking. The ruminating that never, ever, stops, even when we are asleep. We think and think about the past and the future and often, very often, it takes effort to focus on the present.
I've always considered this mind buzz an unwelcome consequence of having a big brain. In fact, I believe that self-consciousness, which seems like such a good idea, is actually a human curse that we have to endure because it came along with the much more important puzzle solving skills that evolved to help us survive.
A recent essay takes this idea a bit further—into the future. Authors Martin Seligman (a psychology professor at the University of Pennsylvania) and John Tierney (a science journalist) suggest that what really separates humans from other animals is not tool use nor language, but our ability to think about the future and come up with all sorts of possible scenarios.
The evolutionary advantage of this skill is obvious—lying awake at night trying to map where a tasty antelope might be going, or projecting where some tree might be fruiting, must have been a good thing.
But these days all this evolved forethought is not so advantageous. The problem is that humans are unable to turn off that mental shuffling through a zillion ways that things can turn out and we often get stuck on the negative possibilities while ignoring the positive possibilities.
Imagined catastrophes can be paralyzing, and they are at the root of depression. Depression is, after all, the loss of hope and thinking that nothing will ever get better. In other words, the future looks bleak. But in reality, the future is unknown and things might actually turn out pretty well.
The trick is to include the good possibilities, not just the bad and depressing ones, when you let your mind wander on its own into the future.
Published on May 22, 2017 12:30
May 15, 2017
Regrets
We all have regrets, and usually they are highly personal. Most of them are about decisions we made long ago, and when ruminating (or obsessing) about these regrets, we fantasize that a different choice might have led to a different life. One that would, of course, be better than how we ended up.
But there are some people who regret  what they did to others, and that must be a hell of its own kind. And what if the people you hurt were strangers? What if you played a major role in actually changing their way of life? And not for the better?
I'm not talking about politicians, or despots, or law makers, but anthropologists, people who have also, in many situations, had a hand in destroying the very cultures they studied.
In the last few months, The New York Times published two articles on the people of the Andaman Islands. These islands are in the Bay of Bengal and under the jurisdiction of India. And what makes Andaman Islanders so special is that they are hunters and gatherers who have only recently been integrated into modern Indian life, and it's fair to say things are not going well.
Western culture has a very long history of trying to end, or protect, what they see as "primitive cultures" (an insult in itself). Although the Indian government decided to try and protect the Jarawa and Sentinelese people by surrounding their land with a buffer zone, the modern world leaked in. The Indian anthropologist T. N. Pandit, now 82, knows he is partly, or fully, to blame for encouraging these people to leave the forest and interact with Indians. For two decades he spent time with the islanders and eventually, they did indeed leave their home and seek the goods of modern culture. The result is a destruction of aboriginal life and the very soul of a people. And lots of horrific culture clash.
For example, The Times also reported on a baby that was conceived when a Jarawa woman was raped by an outsider. The light-skinned baby was clearly not a full Jarawa, and apparently they have a tradition of killing babies outside their blood line.
It's a mess and the authorities don't know who to prosecute or what to do.
Pandit now regrets his role in what has turned out to be the usual story of a society that went from self-sufficiency and an intact cultural structure to corruption and poverty. As Pandit said to The Times, "Now they have gotten infected. They have been exposed to a modern way of life and they cannot sustain. They have learned to eat rice and sugar. We have turned a free people into beggars."
Mr. Pandit sits at home ruminating on his life as a destroyer of a culture and a people. But he is not alone. As the world becomes more populated, and its most remote environs penetrated, this story will continue to be the usual one. There is something about "modern" culture (read Western culture) that has a manifest destiny about it. We can't help butting in, and these groups can't help wanting what we have.
Our culture and society, when first viewed, seems so shiny and full of great things to see and to own. But in the end, it just doesn't fit everyone. In fact, it doesn't even work for many who have been citizens of modern culture since they were born.
Published on May 15, 2017 12:17
February 6, 2017
Keep Going
It's no secret that Western culture is not exactly the most healthy of places. Sure, we have lots of stuff and lots of food, but our affluence has also brought lots of down time. Or sitting down time. On sofas, in cars, across La-z-Boys. And the result is not pretty.
Anthropologists have long suggested that lounging around and stuffing our faces is not exactly how we were evolutionarily brought up. Instead, the theory goes, humans are physically hunter-gatherers, people who have to run after game or wander about the landscape for tubers and so our bodies are supposed to be on the go.
Now that theory has been put to the test, with the generous aid of hunter-gatherers themselves. Researchers from the University of Arizona and Yale University recruited Hadza hunter-gatherers in Tanzania to wear heart monitors for two weeks as they went about their day.
As the data show, the Hadza have great heart health at any age and, as expected, it's because the Hadza are always on the go. They aren't running and jumping but simply briskly active for more than two hours a day.
Men follow game all day and women walk into the bush and dig vigorously and so their heart rates are up.
As a result, their blood pressure is down and their heart muscle gets exercised.
The Hadza also lie around a lot, but that's the reward for finding food, not the normal position for eating it.
Of course, this is not rocket science, but it is one way anthropologists have added to the conversation about the rate of obesity in Western culture and our modern health crisis. We now live in a world where it's almost mandatory to drive to get food, drive it home, and eat it sitting in front of the T.V.
Funny, Western culture also has the highest rate of depression in the world. Hey anthropologists, anything to add about that?
Published on February 06, 2017 09:15
January 30, 2017
Lucy in the Trees Without Diamonds
When I first learned the human fossil record back in my undergraduate days, it was a straight shot from Homo habilis to modern humans. But since then, the path of human evolution has become a tangled tree with many branches, and it's much harder to explain. Now we have all kinds of Australopithecines and any number of the species Homo and with each discovery comes a rethinking of our past.
One of those controversies focuses on Australopithecus afarensis, and more specifically on the 40% complete fossil specimen affectionately known as "Lucy."
I admit that Lucy has a special place in my heart. For many years I hung out with the paleontologists who found her, and so her fame at the time as the most ancient hominid feels like part of my personal history.
During my first years as a professor at Cornell, I had the job of purchasing and organizing fossil casts so that students could look at, and even hold, their ancestors during Introductory Biological Anthropology.
For about two years, all sorts of casts were mailed to me, but none were as special as those of AL 288-1 (Lucy's official fossil name). She arrived in many boxes and I had to unwrap each piece and place them, one by one and in correct anatomical orientation, onto sheets of foam set in wood drawers. It was, for me, not so much about getting ready for a class as a sacred, deeply moving, act.
I remember so distinctly holding the cast of Lucy's tiny pelvis and thinking about bipedalism and how that bone confirmed that Lucy and her kind walked on two legs, even 3.2 million years ago, meaning she was a human, not an ape.
But since that time, researchers have deduced that Lucy actually retained some ape characteristics—curved hand bones and long arms, all suggesting, perhaps, lots of time spent in trees.
I don't really care where Lucy spent her time. She's my special fossil. It was this woman's bipedalism that pushed the hominid lineage back millions of years. In a sense, Lucy stood up for us. And she was, in fact, the first woman to stand up for herself and others.
Our legacy, in other words, started with her.
Published on January 30, 2017 10:57
January 28, 2017
Double Clicking But Not With A Mouse
When I was an undergraduate, it was standard fare for students to watch the classic anthropology film called The Hunters. Made in 1957, it was still a revelation when I saw it the first time in 1975.
The Hunters is a documentary of the Ju/'hoansi people of Namibia. You might know them as !Kung or Bushmen, but that's certainly not what they call themselves.
The focus of the film is a days-long hunt of a giraffe by four men. The viewer might be struck by their tireless pursuit of the failing giraffe, or the savanna landscape, or maybe what the hunters are wearing and also using as weapons. But what struck me most back then, and the many times I've watched the film since, is the language.
Sounds that we might group together as "clicks" pepper the Ju/'hoansi language (called Taa or !Xoon, where the ! signals a particular click that attaches to the word), making it a symphony of sound and rendering any Romance language flat and boring in comparison.
Also, a listener is struck dumb by how a person could actually make those sounds, and make them quickly and repeatedly. An English speaker attempting to imitate Taa usually ends up striking the tongue against the roof of the mouth and sputtering tsk tsk tsk, as if indicating someone has done something wrong. So not even close.
The home base of this treasure trove of human linguistic ingenuity is the Kalahari Desert of Namibia, in Southern Africa. Many Ju/'hoansi are still full-time or part-time hunters and gatherers, while some have settled on government land. And there are only a few thousand people left who utter these sounds as they talk to each other and go about their day.
Taa is surely one of the most beautiful human mouth sounds on earth, along with babies laughing and whispered words of love, and what a shame that it is disappearing from the human linguistic playbook.
Published on January 28, 2017 12:56