
I was so absorbed in the Hidden Figures story, so in awe of the women’s accomplishments, that I needed to know if it was true, if it was based on real people and actual events. If so, a movie showing three women’s critical contributions to NASA during the Space Race in the 1960s, nearly sixty years ago, fits into a theme I’ve been following. Like the movies in my last two blog posts, Hidden Figures presents a new and modern perspective on women, on accepting women in roles previously held almost exclusively by men, on a woman being something and someone other than a mother, a wife, daughter, sister, or an essentially feminine, female presence. Until recently, movies made in Hollywood and elsewhere, with few exceptions, defined women, even strong women, in relation to a man. If the woman was employed, it was in a job considered appropriate for a woman: secretary, nurse, librarian, primary school teacher. The women in Hidden Figures are scientists and engineers working at the cutting edge of technology; only in recent years has the public been prepared to hear their story and give them the respect they deserve.

Included in this excellent review of the movie is an idea for another such movie. In the late 1800s “ … the Harvard College Observatory employed a group of women who collected, studied, and cataloged thousands of images of stars on glass plates. As chronicled in Dava Sobel’s book The Glass Universe, these women were every bit as capable as men despite toiling under less-than-favorable conditions. Williamina Fleming, for instance, classified over 10,000 stars using a scheme she created and was the first to recognize the existence of white dwarfs. While working six-day weeks at a job demanding ‘a large capacity for tedium,’ they were still expected to uphold societal norms of being a good wife and mother.” Little girls of previous generations, I among them, were told that scientists are always men because boys are naturally good at math and girls are not, and everyone knows that this is true.

The movie Hidden Figures is based on Margot Lee Shetterly’s non-fiction book Hidden Figures: The American Dream and the Untold Story of the Black Women Mathematicians Who Helped Win the Space Race (2016). She was born in Hampton, Virginia, where mathematician Katherine Johnson (played by Taraji P. Henson), engineer Mary Jackson (Janelle Monáe) and supervisor/computer expert Dorothy Vaughan (Octavia Spencer) lived and where the NASA Langley Research Center is located. Shetterly’s father was a research scientist at NASA and her mother a professor at Hampton University. Shetterly is Black and knows the Hampton community from the inside.

How faithful is the movie to reality? From Richard Brody’s review in The New Yorker, on the racial situation — “ … the movie is aptly and thoroughly derisive toward the discriminatory laws and practices that prevailed at the time. The insults and indignities that black residents of Virginia, and black employees of NASA, unremittingly endured are integral to the drama. …” The first scene of the three women together shows them on a country road, their car stalled, the hood open and Dorothy underneath the engine, trying to fix it. “ … A police cruiser approaches. They tense up; Dorothy says, ‘No crime in a broken-down car,’ and Mary responds, ‘No crime being Negro, neither.’ Their fearful interaction with the officer—a white man, of course, with a billy club in hand and a condescending bearing—is resolved with a comedic moment brought about by the women’s deferential irony. What emerges, however, is nothing less than an instance in a reign of terror. …”

The movie’s historical accuracy is discussed here. The scene where Harrison smashes the Colored Ladies Room sign did not happen. In real life Katherine refused to walk the extra distance to use the colored bathroom and, in her words, “just went to the White one.” Harrison letting her into Mission Control to witness the launch was also added. Screenwriter Theodore Melfi said he saw no problem with the changes he made; the movie representations are essentially true in showing the racism that Black women dealt with every day. Because of Virginia’s segregation laws, African American “computers,” as the women mathematicians were called, had to work in a separate “colored” building at the Langley Research Center. Not mentioned in the movie: the White women “computers” were provided with special housing, while the Black women were left to find their own accommodations. An interview with Katherine Johnson is here, in a video. She’s an attractive person. She says, “I didn’t feel any segregation. I knew it was there, but I didn’t feel it.”

Johnson’s calm self-confidence is impressive, as is that of Mary and Dorothy. I wondered where it came from and decided to look into the personal history of each of the three and learn something of the community and culture that shaped her. To quote James Baldwin: “History is not the past. It is the present. We carry our history with us. We are our history.”

While working at NASA, the three women were living in Hampton, in the city’s Black middle-class community. And what a history the city has.

It dates from 1607, when Captain Christopher Newport and his men, having sailed from London, landed at Old Point Comfort, later Fort Monroe, on the southeastern end of the Virginia Peninsula, and claimed it for the colony of Virginia. They continued sailing up the James River and established Jamestown, America’s first permanent English colonial settlement. A few years later, Virginia colonists returned to Old Point Comfort, took over the Native American community on the Hampton River and established their own town on the site.

The Thirteen Colonies

We know from a letter written in 1619 by John Rolfe, widower of Pocahontas, that some 20 Africans from Angola had been brought ashore at Old Point Comfort from a slave ship. They were the first Africans to come ashore on English-occupied land in the future United States. The child of one couple, Antoney and Isabella, was baptized in 1624 in Hampton’s Anglican Church. At that time and in that place, the Africans were considered indentured servants, but the slave trade, mostly of Black slaves from the Caribbean, soon made slaves easily available in the colonies. In the North they lived in towns and worked as domestic servants, artisans, sailors, longshoremen. In the South, in Virginia, the two Carolinas, and Georgia, where the weather was warm and wet and suitable for growing profitable cash crops, slaves worked on plantations growing tobacco, rice and indigo. In South Carolina, from the early 1700s to the mid-1800s, slaves outnumbered free Whites. Not insignificantly, South Carolina had in 1696 adopted the first full-scale slave code in British North America, modeled on the colonial Barbados slave code of 1661, which came from the Caribbean, from island estates where White men ran plantations with African slaves who did the nasty, hard work of growing and processing sugar to be sold in Europe. In the northern colonies many colonists began to call for the abolition of slavery. They objected to slavery on moral grounds and, except for New England slave traders, it was not important to their economy. Nevertheless, after the American Revolution, 1775–1783, as a concession to the four Southern states, to keep them in the Union, the northern states agreed to a U.S. Constitution acknowledging, in coded language, the institution of slavery.

Wherever the plantation and whatever the crop being grown, plantation and slavery — dehumanizing, repressive, violent slavery — went hand in hand. In the 19th century, after the 1793 invention of the cotton gin, a machine that removed seeds from the plant’s fiber, cotton plantations throughout the South grew in number and in size, becoming enormously profitable by producing an easily storable and transportable raw material for the equally profitable textile factories of England and New England.

Ruins of Hampton, 1862

The American Civil War, from 1861 to 1865, a war over the future of slavery in America, set the circumstances for the city of Hampton and Fort Monroe to play a unique role in the conflict. Most of Virginia seceded from the United States and became part of the Southern Confederate States of America, but Fort Monroe remained in Union hands, part of the North. Confederate-owned slaves, to gain their freedom, escaped to the Union fort and were protected there in the Grand Contraband Camp, America’s first self-contained African-American community. In 1861, Confederate troops burned Hampton to keep it from being used by Union forces, but after the war the Black people who stayed and lived there revived the city. A number of modern-day Hampton streets still carry names from that community. In 2010, 49.6% of Hampton’s population was African-American.

Emancipation Oak, Hampton

Mary Jackson, who had grown up in Hampton, attended Hampton University, a special university with a special history. In November 1861, Mary Smith Peake, the first Black teacher of the American Missionary Association (AMA), taught the children of Black freedmen who were living in the contraband camp. Until a building was provided for her, she held class under an oak tree for up to 50 children and at night for some 20 adults. In 1863, the Black community gathered under the Emancipation Oak to hear the first Southern reading of President Abraham Lincoln‘s Emancipation Proclamation.

Hampton Institute, 1899 class in mathematical geography

Hampton University grew out of the Hampton Normal and Agricultural Institute, later Hampton Institute, established in 1868 by the AMA and the post-Civil War federal Freedmen’s Bureau. It is one of the historically Black colleges and universities and, as well, a land-grant university. The latter were institutions of higher learning benefiting from a 19th-century federal program that enabled state governments to use federal land and other resources to support such colleges. Booker T. Washington was one of Hampton Institute’s first students.

 

Dorothy Vaughan was born in 1910 and grew up in a West Virginia town. The state was formed after the American Civil War had begun, when counties on the northwestern side of Virginia broke away, separating from the Confederacy to stay in the Union and in the North, where they shared a border with Pennsylvania and Ohio; West Virginia was admitted to the Union as a state in 1863. Western West Virginia had coal mines for steam engines, of interest to northern railway building, and to the east mountain country suited to small family farms, not to plantations, which were run more like businesses and made slaveholding profitable. The new state’s constitution provided for the gradual abolition of slavery and for “free colored children” to be provided with schools. African Americans were free, but White prejudice and segregation continued. I wrote an essay here, on the movie Race and on Jesse Owens, the famous Black Olympic champion from Ohio. He was about Dorothy Vaughan’s age, and his history is relevant to this discussion.

The West Virginia state government required that schools be provided for all children, but the town where Dorothy lived, Morgantown, at the Pennsylvania border, provided Black children with only part-time schooling and not in a public school building. Instead, classes were held in the St. Paul’s African Methodist Episcopal Church. In 2010, only 4% of the town’s population was African-American, and in the 1910s and ‘20s the Black community may have been just as small. The very talented Dorothy graduated from high school at age 15 and was moved by her family to Ohio, where she attended, on full scholarship, the historically Black Wilberforce University.

I thought it curious that an African American university would be situated in an Ohio town. The search for an explanation of why and how became an education in my country’s history.

Ohio joined the Union in 1803 as a free state, i.e., with no slavery. The town was named for William Wilberforce, a statesman in England who worked for the abolition of slavery and the end of the slave trade in the United Kingdom and its empire. The Wilberforce townspeople were small farmers and tradesmen, most of them likely members of the Methodist Episcopal Church (MEC), the organization that helped establish Wilberforce College.

The Methodist Episcopal Church grew out of the Methodism that reached the Thirteen Colonies in the 1760s, an extension of the Methodist church being created in England by John Wesley, a priest in the Church of England who was organizing the common people largely neglected by the Church hierarchy. I once asked an English Methodist pastor about his church, and the first thing he thought to tell me was of the early Methodists going into the factories and teaching workers to read and write, at the time an illegal activity. The Methodists in this break-off church were the poor but also shopkeepers, craftsmen, workers and small farmers, the sort of people who would emigrate to the New World. They respected hard work, honesty and virtue, and repudiated upper-class values and lifestyles. Their strong anti-slavery beliefs and actions were based on moral principles but also on the fact that cheap and docile slave labor undercut employment and decent wages for free men. They welcomed slaves as allies and members of their congregations.

By 1800, Methodism was expanding into the region around Cincinnati, Ohio, which includes Wilberforce, and by 1807, the first Methodist church had been built in Cincinnati. African American freedmen were members of the church but lacked equal status with Whites in the congregation, and in 1816 a group left the Methodist Episcopal Church to establish the African Methodist Episcopal Church (AME). In the 1820s, the MEC began, in a number of the states, to build colleges for its membership, and in 1856, to provide classical education and teacher training for Black youth, the Ohio Conference of the MEC and the AME founded Wilberforce College. Both Black and White community leaders were on the college board. The school was later bought by the AME, making it the first college to be owned and operated by African Americans. It has an interesting history, described here.

Wilberforce was a station on the Underground Railroad, a network of secret routes and safe houses that operated in the United States from the early 19th century until the Civil War. Slaves used it to escape the plantations into free states and Canada, with the aid of abolitionists and allies who were sympathetic to their cause.

Dorothy graduated from college at age nineteen, in 1929, hoping to continue her education at Howard University, the highly ranked historically Black university in Washington, D.C., but the Great Depression set in and she needed to find a job. How she came to teach math in a poor Black school in the middle of Virginia, I never discovered. I did learn, however, something new about the ground-breaking 1954 Supreme Court decision Brown v. Board of Education. In 1896, the Supreme Court had ruled in Plessy v. Ferguson that the segregation of races in public facilities was constitutional if the separate facilities were equal. In the 1954 decision the Court ruled that separate cannot be equal and that public school systems must desegregate.

I discovered that the students of the school where Vaughan taught for ten or more years, the Robert Russa Moton High School, had played a critical role in bringing about the 1954 decision. In 1951, 16-year-old Barbara Johns organized a student strike to protest the unequal education provided to Black students, and the NAACP convinced the parents to pursue the protest through the courts. Moton High provided three-fourths of the plaintiffs in Brown v. Board of Education. The school building is now a National Historic Landmark and a civil rights museum. The school was named for Robert Russa Moton of the Hampton Institute.

In 1943 Dorothy moved with her husband and children to Newport News, a city on the James River adjoining Hampton. It was during World War II, and NACA, based in Hampton, needed mathematicians. She rose to supervisor of the Black women “computers” and later prepared them, along with the White “computers,” for the arrival of the machine computers by teaching herself and her staff Fortran. She went on to head the programming section of the Analysis and Computation Division (ACD) at Langley. All this while also raising her six children.

 

Katherine Johnson was born in 1918 in a small but wealthy West Virginia town near Virginia. Her curiosity and extraordinary math talent were evident at an early age, and because the town had only a primary school for Black children, her parents took her to a high school on the campus of West Virginia State College, which was, like Hampton Institute, a Black land-grant college that attracted the top professors of the day, including the sociologist W. E. B. Du Bois. Her Black professors tutored her and arranged for her to receive a level of education in mathematics not ordinarily available to highly talented Black students, or perhaps to many students anywhere. She graduated from high school at age 14 and from college at age 18, with high honors in mathematics and French.

In 1939, a few years after Katherine graduated, West Virginia State College became the first of six historically Black colleges to be authorized by the Civil Aeronautics Authority to establish an aviation program, a program that graduated a number of the Black aviators of the World War II U.S. Army Air Corps. Some of them joined the famed 99th Fighter Squadron and 332nd Fighter Group (the Tuskegee Airmen) that served with distinction in the European Theater. Rose Agnes Rolls Cousins was the first African American woman to become a solo pilot in the Civilian Pilot Training Program.

Following Brown v. Board of Education, West Virginia State College desegregated and was transformed from an all-Black land grant college to one with mostly White students.

After graduation, Katherine taught in Black high schools. When West Virginia decided, in 1939, to integrate its graduate schools, she and two young men were the first Black students to be offered places at West Virginia University, in Morgantown. She resigned from her teaching job and enrolled in the graduate math program. At the end of the first semester, however, she left school to have a child, then returned to teaching when her three daughters were older. In 1952, having learned of positions opening at NACA for Black women, Katherine and the family moved to Newport News so she could take advantage of the opportunity. Her husband died of cancer in 1956. In the movie, she is shown raising her daughters with the help of, I think, her mother, and we see the beginning of her marriage to James A. Johnson, who had been a Second Lieutenant in the Army.

On September 22, 2017, the 99-year-old Katherine Johnson cut the ribbon for the Katherine G. Johnson Computational Research Facility at the NASA Langley Research Center in Hampton, Virginia, where she was honored as a trailblazing “human computer.”

It felt good seeing Katherine Johnson so honored and knowing that she and Mary Jackson and Dorothy Vaughan are being honored by the movie and the book on which it is based.

 


My previous post is a commentary on Their Finest, a pleasant-to-watch, true-to-the-period movie of people making a movie in London at the time of Britain’s entry into the Second World War, centered on a woman finding herself as a professional screenwriter. In A Woman in Berlin, also set in WWII, we see women coping with total destruction and destitution at war’s end. It is a grim but powerful movie. I saw it over a year ago and can still recall many of the scenes and images. From Roger Ebert’s review – the central character, a woman known as Anonyma, “… is played by Nina Hoss, who … has emerged as a strong, confident actress with innate star quality.” And from A.O. Scott’s review in the New York Times – “ … Ms. Hoss, whose strong frame and graceful bearing suggest both old-style movie-star glamour and Aryan ideals of feminine beauty, is an actress of haunting subtlety, and the film, episodic, ambitious and a few beats too long, is held together by the force of her performance. …” I was too engrossed throughout to think it too long.

It strikes me that these two movies, both made in the 2000s, one British and the other German, share a modern perspective on their women characters and, implicitly, on women’s status in society. Until recently, movies from Hollywood and elsewhere, with few exceptions, defined women, even strong women, in relation to a man, or in a position considered a woman’s job, such as nurse, librarian, primary school teacher. In Their Finest women are shown using the circumstances available to them during the war to become independent, self-reliant, autonomous individuals who move out into the working world, defining themselves in terms of their skills and accomplishments. Reviewers wrote that in real life many women at the time, in the mid-1940s, believed the effect would be permanent, but I was a teenager then and remember no such talk. When peace and prosperity returned, women not only returned to their traditional roles, they forgot their wartime independence. For example, an American woman I knew who successfully held a high-level government position in Washington during the war was replaced at war’s end by three young, inexperienced men, each of whom received a beginning salary higher than what she was ever paid. When she told me this I was incensed, but she was not; she thought it perfectly normal. She went on to open a small shop and became a successful businesswoman, then married and willingly closed the shop to become a proper housewife. This was America, and my experience has been that European women tended to be even less independent-minded. Victoria, in her comment on such matters, here, would agree with me. Her comment is at the end of my essay on women’s roles throughout history, roles which would begin to change in 1960, when we at last had The Pill, a reliable, affordable contraceptive. (Victoria’s delightful blog is The Franco-American Flophouse.)

The woman at the center of A Woman in Berlin is based on a real person, a journalist who kept a diary at the end of World War II when the Red Army took over Berlin. She recorded the systematic rape of German women, including herself, by Russian soldiers and how she and other women, always on the edge of starvation, used sex and their wits in dealing with the men to obtain food, some degree of safety, simply to survive. She published her diary, anonymously, as a book. From The Guardian – “ … When the diary that provided the source material for ‘A Woman in Berlin’ was first published in Germany in 1959, it was attacked in print and quickly pushed aside. In West Germany patriarchal attitudes defining male sexual violence as a matter of female honor made the frankness of the diary seem brazen and shameful, while in the East (East Germany) criticism of the heroic Soviet liberators was forbidden. The sexual depredations of the victorious Red Army in Germany at the end of World War II were hardly secret at the time, certainly not to the women who suffered them. But the systematic rape of German women by Russian soldiers was nonetheless shrouded in silence for decades. …”

“The author records … the world actually in front of her eyes, and here no detail escapes her — the stench of buildings where Russians have defecated wherever it suited them, the eerie silence of a whole city hunkering down, the behavior of her neighbors, often petty even in crisis. She has written, in short, a work of literature, rich in character and perception. It is dispiriting that shame or fear of social ostracism caused her to hide behind the label Anonymous (her fiancé left her when he heard about the rapes), but even anonymously she has given us something that transcends shame and fear: the ability to see war as its victims see it. …”

From Roger Ebert – “ … What little I know about war suggests that sometimes it comes down to a choice between two dismaying courses of action. Some people would rather die than lose their honor. Most people would rather not die, particularly if their deaths would not change anything. Why is a woman’s sexuality her honor? A man using sex as an instrument to survive would not be shamed.”

In 2003 the diary was republished, still controversial, but this was after the women’s movement and the collapse of Communism, and it became a best seller. And thus, finally, a movie was made that tells what actually happened and respects the women who survived the terror inflicted upon them.

There are other signs in cinema that the revolution in women’s status is happening everywhere, throughout the world. I enjoyed Whale Rider. It is a well-acted, beautifully filmed, authentic story of a Māori girl seeking to play a role in her community traditionally held only by a male. Her grandfather opposes her at every turn, but a number of modern-minded men and boys actively assist her. Modern means including women in traditional men’s roles if that mode suits them, freeing boys and men from taking on traditional roles that do not suit them, and giving individuals the opportunity to be themselves. The movie was made in 2002, a coproduction between New Zealand and Germany, directed by Niki Caro and based on the novel of the same name by Witi Ihimaera. It stars the wonderful actor Keisha Castle-Hughes as Kahu Paikea Apirana, a twelve-year-old Māori girl.

Here for a preview.  I learned something about film directing from this analysis of various scenes.

 

It’s been a long time since I watched a movie twice in the same day, but after seeing Their Finest on DVD, I wanted to hear the director’s commentary. Watching the movie a second time and listening to Lone Scherfig were both well worth the time.

Their Finest is set in London in 1940, during the height of the Blitz, and takes its title from a speech by Winston Churchill. I recommend that if you are watching it online or on DVD you first watch this ten-minute explanation of what happened at Dunkirk and why it is important. Actually, the video is so well done it is worth watching, period.

Their Finest is in the movie-within-a-movie genre, but it is special. The movie being made within the movie is for the British public but also serves as propaganda for Americans. It will remind the British of the heroism of ordinary men and women who came to the rescue in the Dunkirk crisis, raise their morale and encourage them to carry on even as they endure the terrible bombing. The Americans will see the British as brave allies holding the fort for democracy and needing American help. Their Finest is history accurately represented, and it shows us how important movies and the cinema were for people in those most trying times.

Scherfig remarks, with evident pleasure, that because the British maintain and respect their built environment, the film crew found locations and ways to present the streets and buildings of London as they really were in the 1940s, aided by the use of CGI to recreate the images of a bombed cityscape. Scenes of the Dunkirk beach and shore were shot on a beautiful beach in Pembrokeshire, Wales.

The leading man in Their Finest, a screenwriter, Tom Buckley, played by Sam Claflin, expresses his dream of creating a quality movie that is worth people’s time to watch. We watch how they accomplish this, how all the elements come together. I was intrigued by the ways in which the script took shape. Scherfig in her comments speaks of the extraordinary effects produced in the main film by the camera men and women and by the access to archived films, by the set designers, the crews working behind the scenes, and above all, by the amazing actors. All the roles in Their Finest, even the smallest, are played by experienced and skillful actors. Gemma Arterton as Catrin Cole is just right as the central character, subtle and expressive. I was engaged with her throughout. Jeremy Irons does a wonderful cameo as the Secretary of War quoting Shakespeare. Favorites of the reviewers are Bill Nighy as Ambrose Hilliard and Helen McCrory as Sophie Smith, who becomes his agent. From the New York Times review: “… sly puss Bill Nighy as a faded star in permanent high dudgeon over his career …” steals scenes, and “… Sophie, as a talent agent, brings man and dog to heel in a few short, barbed scenes. She’s the kind of no-nonsense woman you can imagine contributed to the real war effort, including in the film industry. …” The director reports that she had to omit from an already complex movie a subplot they had filmed of Sophie supporting Jewish refugees in her home.

I liked the Phyl Moore character, played by Rachael Stirling, and the way she evolves. She is the executive who keeps tabs on the film crew for the Ministry of Information. Tom, the screenwriter, definitely disapproves of her and of women generally taking on work he believes belongs to men, and he is no different from any of the other men on this. Catrin’s main challenge is to prove herself as a writer, as an independent individual, and to make certain their screenplay does justice to the bravery of women in the movie they are making. Hilliard points out that the women, and he as an aging actor, are benefiting from the war taking away the young men who would ordinarily step in and push them aside. From the Guardian review — we see “… the quiet revolution in wartime sexual politics – the key female characters are in their jobs because the chaps are otherwise engaged, but for the most part, the women have no intention of going “back into their boxes” once the war is over. It also acknowledges the dismissive, tweedy sexism of the era by having even the most sympathetic of the male characters, sarcastic bespectacled screenwriter Tom Buckley, blithely dismiss women’s dialogue in a movie as “slop”. …”

Phyl says to Catrin that the men will expect the women to go back into their box once the war is over, implying that women will resist this, but in fact they did go back to traditional roles. This is clear from the BBC’s Call the Midwife, which I comment on here. The fundamental change for women, and for the larger society, came later, with reliable contraception.

This is an excellent review that includes a video showing a critical scene in the story line, one where Catrin approaches Hilliard and they, as writer and actor, join forces to save the movie they are making. You will see a bit of the film’s marvelous acting.

The Guardian review is very good and includes a six and a half minute video.

My Indian husband, Ravi, told me long ago, when he was still new to American culture and could contrast it with his own, that the American view of life was different from his, that Americans see an individual’s life as a trajectory that begins in the home as a launching platform, goes on into school, the job market, marriage and children, gradually rising to the prime of life, followed by decline into old age. Indians, he said, see life in four stages, the ashramas: student, family, post-family, and a fourth, last stage. At the time we were students, still believing, like young people everywhere, that we would live forever, always young. I thought of the ashramas as little more than an interesting cultural concept and went through the first three stages without remembering or being reminded of it. It was in my Sannyasa stage, reexamining our lives together and studying Ravi’s life in India to better understand him, that I rediscovered the ashramas. Of course, traditionally, with rare exceptions, the stages are for men. What I write here is an addition to recent modifications that include women. (I discussed in a previous post my thoughts on women in the scheme of things.)

I read that within Hindu philosophy the last stage of life, Sannyasa, is one of renunciation and asceticism. I have already redefined the third stage, Vanaprastha (forest dweller), to be the time in life when family responsibilities are lessened and one turns to activities in and for the community. I use the modern word, Senior. The second stage, Grihastha (householder), is the central period of adulthood, beginning when the first stage, the Brahmacharya (student), ends. Traditionally the family arranged the boy’s marriage when he was in his late teens, maybe early twenties. Marriage for a girl was linked to her first menstruation, when she was able to have babies. (I discuss a charming, excellent movie, The Householder, here.) And here is the idea of age grades rather than stages of life, and of when and how age grades began changing in our modern society.

The sannyasi, as he entered the last stage of life, was expected to detach himself from material life and spend his time reflecting on the spiritual, on matters more abstract than a specific religion. It seems to me the sannyasi was, and still is, searching for something called the meaning of life. Even though that phrase may sound banal and overused, it is a profound philosophical question to be asked and explored in multiple ways, without a universally accepted answer ever being found. I certainly have not found the meaning of life. I cannot imagine a creator god as an answer to how and why we exist. All I know at this moment for certain is that we are here, must take care of one another and more actively protect our endangered planet from the ravages we inflict upon it.

I do believe the Sannyasa stage of life grows out of our biology and psychological nature but changes in the specifics with each era and from culture to culture. My era is the Information Age (discussed in my previous post) and my culture is middle-class in an advanced post-industrial society. I heard in an interview on a PBS television program a remarkable comment that has given me a new way to think about what I have become. Bill Moyers, a public intellectual now in his mid-eighties, remarked that when one lives this long and is still in good health the mind becomes one’s university.

My mind has become my university. From my mind I retrieve the experiences, information and ideas that are stored there. The traditional sannyasi delved into his mind as I do into mine. The differences in outcomes are due to the differences in the times in which we live. He had his mind’s acquisitions plus, maybe, a book of sacred learning. I, in contrast, have the computer, both a carrier of the internet and an instrument that makes the act of writing infinitely easier. I have access to the equivalent of hundreds of universities and their libraries. I do not think as fast or remember as well as I once did, but I can turn to the internet for help, some of it simple, such as for spelling or confirming the definition of a word, but also to check the accuracy of a memory. Information via the internet introduces me to entirely new ideas I use for completing and disciplining my thoughts. The technology allows me to be patient with myself, to follow at my own pace any path I choose to take as I wander through my mind full of memories.

I continue to learn. I miss my friends and being out in life, but that is my past. Fortunately, I loved school, especially the university, both as a student and as a teacher. Now I am both: a teacher to my student self. I find it exciting and deeply satisfying.

The fourth stage of life, described here and in following essays, is an idea Ravi, my husband, introduced into the way I see life’s progression.

I am in my fourth stage and write from that perspective but admit to finding it increasingly difficult to concentrate on my past. I follow the chaotic American political scene, read and watch the news obsessively, contemplate the future of our democracy. I won’t be around for much of it but still wonder, worry and am curious about the new world we are entering. And it is a new world. I believe we are on the threshold of one of the most fundamental changes ever made in human societies — a change in the basic nature of our relationships with one another, but first —

The Digital Revolution has been with us for a while, having begun in the late 1950s with computers and digital computing. I was approaching my thirties by then, but being in or around a university while living in the States, in Ankara, and in Paris, I was an early adopter. In 1983, in a French research institute, I saw one of the first interactive videodiscs and tried, and failed, to get a grant to develop, with a young computer guy, a health care application for one. No one where I applied could imagine the technology I proposed to use. In 1985 I bought my first computer, a Macintosh. Throughout this time I read about and was aware of the personal computer’s effects on bureaucracies. It was hollowing out middle management; fewer managers, usually middle-class, were needed for supervising front-line, often working-class, workers. The computer certainly lessened the importance of the secretary, a middle-class job for non-college-educated women.

In 1995, Ravi and I retired to a city with three major universities, and I used their libraries while writing Tales of Mogadiscio. By 2012, driving to the libraries had become too inconvenient for me to manage. Today, without the internet I would go crazy. We have entered the Information Age, an age of societal change brought on by essentially new technologies and comparable in its magnitude to the Agricultural Revolution, the Neolithic, some 12,000 years ago, and to the Industrial Revolution, from the late 1700s to the mid-1800s, the age when the steam engine expanded our sources of energy beyond human, animal, water and wind. In between were the Bronze Age and cities, when metal tools replaced stone tools, and the Iron Age, when a stronger metal smelted from a more widely available ore came into use.

Each of the Revolutions and Ages resulted from new technologies for producing more food more efficiently, producing more surplus, allowing more people to survive and freeing more of them to produce more goods, manage larger and denser societies, be creative and advance the culture. Of course, the individuals freed to participate in these more complex societal activities were, for better or for worse, male. Despite profound technological, social, and cultural change, women continued in their primordial narrow range of roles. They had babies, raised children, tended to family needs and participated in public activities if and when doing so supported the roles their male family members were playing outside of family life. The great historical revolutions, agriculture and the civilizations that followed, have not favored women; only very recently, and in only a few societies, has that begun to change. Women fared far better in hunter-gatherer societies, the sort we lived and evolved in for tens of thousands of years. The average woman had better health, a longer life expectancy and certainly a higher social status in a hunter-gatherer band than in the Neolithic or any other form of society that followed.

I discovered this reassessment of the original form of human societies when returning to one of my favorite books, a history of what we humans were as hunter-gatherers for tens of thousands of years, of how and why and where agriculture began and how and where our world civilizations developed. The book is Guns, Germs, and Steel: The Fates of Human Societies, published in 1997, by Jared Diamond, professor of geography and biology. It is encyclopedic in range, brilliant in its basic thesis and fascinating in its detailed expositions based on scientific evidence. Additionally, Diamond being a great storyteller, it’s a good read, but it is written in modules, so I take it in stages, skipping sections and returning to them when some relevant question or idea particularly interests me. A readable summary and discussion of Guns, Germs, and Steel is here.

In 2005, the National Geographic Society made a documentary film based on the book. Episode One is available online, here. It is about the hunter-gatherer society Diamond lived with. Episode Two is here. It is about the Spanish conquest of the Inca civilization in Peru, informative, enlightening and beautifully photographed, but not within my concerns at the moment. Outlines and explanations of the two documentary films are here and here.

Here for an engaging and informative interview with Jared Diamond.

Diamond’s passion for studying birds in all their variety led him into Papua New Guinea and to living in the jungle with the Kaulong, a hunter-gatherer, band-level people who knew their environment intimately. He soon realized that as individuals they were essentially no different from people in his own society, obviously equally intelligent and certainly more resourceful. The question then arose of why they are still hunter-gatherers and why his society is more complex, or, as Yali, a man in the local government, asked him: “Why is it you white people developed so much cargo, but we black people had little cargo of our own?”

Diamond decided to find out why, to discover the causes of societal differences. He begins his book with a history of Homo sapiens hunter-gatherers and why some invented agriculture and others did not, moves on to why some societies with a Neolithic village level of technology, economy, social organization or a Bronze Age/urban level of complexity continued to develop and others ceased changing or went into decline. He argues that the gaps in power and technology between human societies originate primarily in environmental differences, which are amplified by various positive feedback loops. He views Europe, Asia and the Mediterranean area as one continuous landmass with environmental conditions that gave it a number of advantages over other areas of the world and that advances there, such as written languages and dominance in trade, occurred through the influence of geography on societies and cultures.

In accounts of his New Guinea hunter-gatherers, a band-level society of about thirty persons, he covers where and in what context a woman gives birth, how a child is weaned and how children are raised, but mostly he writes of the men’s activities.

I recently watched a television program on the Hadza, a hunter-gatherer people living today in the vast grassland area of Tanzania’s Rift Valley, on land not yet taken over by other people for raising cattle or farming. They are the indigenous people, and except for using steel knives instead of stone tools and wearing clothing, they live much as did their ancestors.

The San

Seeing them took me back to the 1960s, when I taught Anthropology 101 and showed my students John Marshall’s 1957 documentary film, The Hunters, about the Kalahari Bushmen, the San people of southern Africa. I remember the scenes of San men shooting a giraffe with a poisoned arrow and tracking it down as the poison took effect, then the activities that followed as everyone in the band shared in the eating and celebrating.

The program on the Hadza began with a journalist entering their camp with her interpreter, a young man, who introduced her to a group of men who were sitting under a tree, doing men’s things, such as making arrows for the hunt, and talking. Diamond and other observers of hunter-gatherer people remark on their continual talking, comparing notes, being social. She chatted with the men, interviewed them and after some time asked, “Where are the women?” They were, of course, taking care of children and out gathering food, picking berries and fruit and using the digging stick to bring up tubers, the band’s basic food. Men hunt and women gather. Both activities require skill and knowledge of the environment, but the men’s side of the culture is more dramatic, less time-consuming, more fun to watch and more likely to be studied and recorded. In this simple band-level society there is no social hierarchy; all are equal, but men dominate. My own pregnancies and babies taught me why women are less mobile and mostly do the work compatible with childcare, while also managing the inconvenience of menstruation. Besides, men are bigger and stronger. In this very readable account of Hadza society, Michael Finkel wrote, “Gender roles are distinct, but for women there is none of the forced subservience knit into many other cultures. … women are frequently the ones who initiate a breakup—woe to the man who proves himself an incompetent hunter or treats his wife poorly. … some of the loudest, brashest members were women. …”

Here for a 7 minute video and here for photos, all of men.

In 1987, after having lived in the New Guinea forest with his friends, Diamond wrote for Discover Magazine a brief, widely read article in which he called the adoption of agriculture “the worst mistake in the history of the human race,” a mistake from which we have never recovered. Humans as hunter-gatherers, he wrote, living in bands of around thirty individuals, were more successful, as measured by increase in population and territory, than any other animal ever, but agriculture, the Neolithic Revolution, was the beginning of our taking over the earth through our continual, accelerating population growth. Archeological evidence shows that band-level people had more leisure, were healthier and longer-lived than people growing their food and living in villages. Diamond notes that circumstances for women changed with settled village living, and not for the better.

The Neolithic entailed making new sorts of stone tools, growing food that could be stored, domesticating animals, building substantial housing, living in settled village communities of a hundred or more households, making pottery, weaving cloth. In a hunter-gatherer mode of life a woman can manage only a carried baby and a walking child; pregnancies and births were, and are, spaced by lactation suppressing ovulation, by abstinence and by infanticide. (A child is not quite human until it proves likely to live, and only then is it given a name.) Settled village life made it possible for a woman to have a baby every two years and for children to have a better chance of survival into adulthood. In band-level society the population increased; with agriculture, the rate of increase itself increased. The larger community needed a more complex social organization. In the hunter-gatherer band everyone was, and is, equal (although Diamond does mention that a newborn girl is more likely than a newborn boy to be put to death). In the settled village community one man, a chieftain from a particular family or lineage, ranked above other individuals and families or lineages and became the center of social organization. As the population increased, villages split, sending families out to establish new villages, and the founding family usually held a special, often religious, role in the new village as it grew. The villages multiplied and together became a tribal unit. A chieftain could organize men and take military action against other villages and tribes.

During the Bronze Age and onwards, technological innovation continued, societies grew in size and cultures became more complex and diverse, at least for the men. The lives of women remained much the same. They had babies, raised children, tended to family needs and participated in public activities primarily through their family or lineage roles.

Not all women were pleased with their limitations. The Queen of Sweden and Norway, wife of King Charles XIII and II, known for her beauty and vivacity, kept a diary from 1775 to 1817. Since she either miscarried or her children died soon after birth, she was not absorbed in the usual woman’s occupation. She wrote in her diary, “You have to admit, my dear friend, that woman is truly an unhappy creature: while men have their complete freedom, she is always burdened by prejudice and circumstance; you may say that men also have that hindrance, but it is not in equal degree. I am convinced that most women would ask for nothing more than to be transformed to men to escape the unhappy bondage and enjoy their full freedom.”

In my comments here on the Cyrano de Bergerac movies I include Roxane’s words as written by the playwright Edmond Rostand in 1897. Cyrano and Christian moon over Roxane, but that does not change her status. She will have to marry some man or be a nun. All she asks for, most eloquently, is some say in her fate. Rostand’s wife, Rosemonde-Étiennette Gérard, was a poet and playwright, and it shows.

In my 1970 study of a traditional town in Central Turkey, I decided to record and call attention to the unrecognized, unappreciated women’s contribution to the town economy. It is here in a three-part essay – “Surviving the Patriarch” – Part I is on middle-class Ankara, Part II on women’s work and the traditional family, Part III on the women’s day in the hamam.

As a girl growing up I was certainly aware of my lower status as a female but had no mother to enculturate me into the attitudes and behaviors expected of a woman or into how to value being female. It was evident to me that my father wanted a son. No one I knew ever questioned standard male/female roles. Everyone accepted that a woman would hold a job after high school, then quit it to become a housewife when she married. In the 1940s and ‘50s I heard both middle-class White women and a middle-class Mexican-American woman say they liked having a man dominate them; it made them feel feminine and loved. It puzzled me. I wonder how many middle-class women today think or feel that way. As a girl I always had a boyfriend and, more than most girls, friends who were boys, always as equals, or so I believed. The rule for girls, and for nice boys with girls they might marry, was no sex before marriage, so by our teenage years there was lots of what we called “necking” or “petting” but no genital contact.

In 1952, I read, in English translation, the 1949 book The Second Sex by Simone de Beauvoir. Given my educational background, most of it went over my head but I did remember Beauvoir asking, “What is woman?” She argues that man is considered the default, while woman is considered “The Other … Thus humanity is male and man defines woman as not herself but as relative to him.” She stated what I knew to be true.

 

In 1960, The Pill came onto the American market, the first reliable and discreet contraceptive. For the first time in human history a woman could easily, dependably decide if and when she would become pregnant. It was the beginning of a slowly growing revolutionary force.

In my early blog posts I wrote of why, in 1953, not yet 23, I married Ravi. He and I went on to graduate school together. I had two unplanned pregnancies, and as much as I loved my babies, I always feared having another. Then the Pill arrived. It transformed my life, and Ravi’s. I had been using the diaphragm as our contraceptive. If there had been a third pregnancy and baby, I could not have continued in school, could not have done research or written my M.A. thesis. We could not have lived twice in Somalia as a family. I could not have taught even part-time as a Lecturer in a university. Ravi could not have accepted the position that took him and us to Turkey and eventually to Paris. Another baby, maybe two, would have meant spending my life managing our family on a professor’s modest salary. And Ravi did not want to spend his life as a professor. The Pill saved us. I did not have the academic career I dreamed of, but I did all right.

Other young faculty wives in the 1960s were reading Betty Friedan’s The Feminine Mystique. The first chapter of the book concludes by declaring, “We can no longer ignore that voice within women that says: ‘I want something more than my husband and my children and my home.’”

In 1957, American women had an average of 3.7 children. Catholic women had an average of 4.5 children. Those numbers began to fall immediately after the introduction of the pill. Today, American women have an average of 1.9 children, an all-time low.

All is not ideal. The rate of unintended pregnancies in the U.S. is higher than the world average, and much higher than that in other industrialized nations. Almost half (49%) of U.S. pregnancies are unintended, more than 3 million per year. In 2001, of the 800,000 teen pregnancies per year, over 80% were unintended.

Jonathan Eig tells the history of The Pill in his 2014 book The Birth of the Pill: How Four Crusaders Reinvented Sex and Launched a Revolution. The four people who created this revolution were: Margaret Sanger, who believed that women could not enjoy sex or freedom until they could control when and whether they got pregnant; scientist Gregory Pincus, who was fired from Harvard for experimenting with in-vitro fertilization and bragging about it to the mainstream press; John Rock, a Catholic OB-GYN who worked with Pincus to conduct tests of the pill on women; and Katharine McCormick, who funded much of the research. Women’s control over when and how many children they have is indeed revolutionary.

I suppose my form of feminism is a concern for how and when a woman becomes a mother. Nothing is more important than being a mother, but one must also participate in society as a full individual. How does one manage that? In 1972, I returned to graduate school at the U. of Chicago to study the management of family planning programs, followed by an MBA in hospital/healthcare management. All but one of my consultancies were on health care programs, but family planning and population concerns were always with me.

On the broader population concerns, demographers have noted the demographic transition. In the 1800s, in Western Europe’s advanced industrial societies, urban middle-class families were already having fewer children, even without effective contraception. Poor women still lacked control over when they got pregnant, and upper-class families could afford as many children as came naturally. Overall, the fertility rate, roughly the number of births per woman, declined. Nevertheless, since the number of young women in the population was still high and the death rate was decreasing, the population continued to increase.
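The arithmetic behind that last point, sometimes called population momentum, can be sketched with a toy calculation. The numbers below are hypothetical, chosen only for illustration and not taken from any demographic source; the point is simply that when the cohort of young women is large, the total keeps rising for a generation even after births per woman fall below the replacement level of roughly two.

```python
# A toy sketch of population momentum (hypothetical numbers, women only).
# Even with fertility below replacement (about 2 births per woman),
# a large cohort of young women keeps the total growing for a generation.

def project(young, old, births_per_woman, generations=3, step=30):
    """Crude two-cohort model: each ~30-year step, the young cohort has
    daughters (half of all births), then ages into the old cohort, and
    the old cohort dies off."""
    history = [(0, young + old)]
    for g in range(1, generations + 1):
        daughters = young * births_per_woman / 2  # girls only
        young, old = daughters, young             # everyone ages one step
        history.append((g * step, young + old))
    return history

# 100 young women, 40 older women, 1.9 births per woman: the total rises
# from 140 to 195 before it slowly starts to shrink.
for year, total in project(young=100.0, old=40.0, births_per_woman=1.9):
    print(f"year {year:3d}: women {total:6.1f}")
```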

I remember the women I knew while doing sociological research. In the 1950s it was my friend in the Mexican-American community who was having a baby every two years. In Somalia it was the woman who scolded me for having only two children. What would I do, she asked, when they died, as her children had died? And Savamma, the poor woman in a South Indian town where I was doing a study, whom I tried and failed to get into proper care for an abortion. And I think of the poor women in so many countries and refugee camps who do not have contraceptive services available.

The empowerment of women. And the changed circumstance for the family and children. The complexities are beyond my even listing them. I discussed The Intern, A Movie for Our Time here. Robert de Niro is great. The situation of the young woman and her husband would have been unimaginable a decade or two earlier.

On another aspect of controlling the timing and number of births — I do believe the greatest threat to the earth’s environment is overpopulation. That next.

It was sometime in the mid-1980s, while visiting friends at the university where Ravi and I had once taught, that I told an anthropologist about important differences I had noticed between the various countries where I had been living and working. I attributed the differences to the agricultural base, whether it was wheat-growing or rice-growing. We discussed my observations at some length and he included them, crediting me, in a footnote to an article he was writing for an anthropological journal. I had left academia in 1968, had become an expatriate wife and mother following Ravi from country to country as he followed his career. For myself, I did research projects, went back to school for two years with the kids, and finally, on a consulting basis, did evaluations of primary health care programs in developing countries. For a number of years I lived in Turkey, a wheat-growing country, and did a rural-to-urban migration study while there. In India, broadly, the Northwest is wheat-growing, while the East and South are rice-growing. I had done health-care-related research projects in both regions, and of course, had visited Ravi’s family across India, from Bombay, Delhi, Benares, Calcutta, down to Bangalore. In rice-growing Indonesia, I worked in rural and urban contexts with Ministry of Health colleagues across Java, Madura, Sulawesi and Sumatra. (Somalia, where I lived and did research, was a tribal, pastoral nomadic society, like the Arab Bedouins, with yet another, quite different pattern of psychological-cultural characteristics.)

Woodcut by Piero Crescentio

Threshing wheat in Delhi, at Humayun’s tomb

I told my anthropologist friend how different the wheat villages were from the rice villages and how the differences affected me. When I first lived in Turkey, in 1968, farming was still traditional, not very different from earlier Europe or northern India, rather like peasant farming. Tractors were still rare. The village was a compact settlement of a hundred or so households, sometimes more, surrounded by wheat fields, pasture, fallow land, and wooded areas. Travel time from one village to another was considerable. At the center of production was a man with his plow and an ox, mule or horse. (I wrote here on women's role in the economy of a traditional Turkish town.) Inheritance was from father to son, and women were outsiders who married into the family.

Planting rice

Plowing the field

I first went to Bali in 1978 and after that, until 1994, to the other islands. As I crossed Indonesia, it was the nature of the farming more than the shape of the villages that caught my attention. Without actually studying the agriculture, it was obvious to me that rice farming is multi-phased and especially complex in the relationships required between farming households within the community. I describe Bali's farming complex here. The man with his plow and bullock was there, but only as one part of the work sequence.

Women worked in the planting, weeding, and harvesting, and they participated, at least in Bali, in the community's agricultural decision-making meetings. (A photo of the woman's blade for harvesting rice is here.) I read that in Java, descent is reckoned from both the father's and the mother's family line.

By habit, I view life from a sociological perspective, but psychologists also have found differences between people dependent upon one or the other of the two basic food crops. (Maize/corn was the base in Native American civilizations and the potato in South America.) Psychologist Thomas Talhelm did research in China on individuals in both the wheat-growing north and the rice-growing south. I found the results fascinating and relevant to my particular interests. Ordinarily, China had not been a culture I read or thought about, but last year an archeological discovery having to do with amazing art led me incidentally into months of acquiring background for understanding Indonesia. The discovery related to the 8,000 terracotta warriors that for millennia have kept watch over the tomb of China's first emperor. Two of these remarkable statues had been shown in an art museum and I saw them. The Terracotta Army depicts the armies of Qin Shi Huang, the first Emperor of China, and dates to 210–209 BCE. Based on DNA evidence, archaeologists now think Greek sculptors could have trained the local artists, indicating that the north China civilization, based on millet as its grain, had early contact with Indo-European culture via Central Asia and the northern steppe – and with growing wheat. Entirely separately, in South China, by 7500 BCE, rice-based agriculture had developed along the Yangtze River. Here for a map showing where rice, wheat, and corn are grown in China.

Migrations of the Dai people

It was in the rice-based Yangtze River Neolithic, before the northern Han expansion, that the agricultural peoples of S.E. Asia have their origins. They are the Austronesians, named for their language family. Genetic research shows that an ethnic minority in southern China, the Dai, are the people from whom the Indonesians are descended. Most likely their ancestors came from the region around the Tonkin Gulf, the homeland of the Dai, and migrated to Indonesia through the Vietnam corridor. (I previously wrote here, based on linguistic theory, that the indigenous people of Taiwan were the original Austronesians. That now seems unlikely. For Japan's deep history, see comment five below.)

Thomas Talhelm and his research team propose that a history of farming wheat makes cultures more independent, while farming rice makes cultures more interdependent, and that these agricultural legacies continue to affect people in the modern world. He reports that the northern Chinese seemed more direct, while people in the south were more concerned about harmony and avoiding conflict. The team tested 1,162 Han Chinese participants in six sites and found that those in rice-growing southern China were more interdependent and holistic-thinking than respondents in the wheat-growing north. To control for confounds like climate, they tested people from neighboring counties along the rice-wheat border and found differences that were just as large. Individualism is often considered a trait related to modernization, but modernization did not fit the data: the wheat-growing/rice-growing differences persist with urbanization and modernization.

Quoting from the National Geographic article –  “To see if these agricultural differences led to contrasting psychological traits, Talhelm’s team recruited 1,162 Han Chinese students from around the country and showed them sets of three objects, such as a train, a bus, and tracks. When asked to pair two of them together, volunteers from rice-growing regions were more likely to choose holistic pairings based on relationships (train and tracks), while those from wheat-growing areas chose analytic pairings based on abstract similarities (train and bus).”

According to a New York Times article, social psychologist Hazel Rose Markus asked people arriving at San Francisco International Airport to fill out a survey and offered them a handful of pens to choose from, for example, four orange and one green. Those of European descent more often chose the one pen that stood out, while the Asian respondents more often chose one like the others.

I wonder how Ravi would have chosen. His family was both north Indian and south Indian; oddly, for him a proper Indian main meal included both wheat chapattis and rice. In making a pair from the bus, train, and tracks set, I agonized over my choice. I am from a wheat culture but as a professional woman felt more at ease in rice-culture countries, in cultures where even in the traditional sectors of the society women participated in public life. I will reflect more on this as I write about my experiences in Indonesia. One aspect that immediately comes to mind is my American habit of smiling at people in public and how this affected me, as a woman, in the various countries where I lived and worked. I describe that here.

Addendum — I wrote here of my first experience, other than Bali, in a paddy rice village and remarked on ways in which it differed from the wheat societies I grew up in. My work in public health shaped much of how I understood each culture I came to know.

 

I think about movies and their influence in our lives. Gaslight, for example, is a good movie and it added a useful word to the language; "gaslighting" is a verb that names behavior we previously could only describe. Now we say "he gaslighted me" and people know what we mean. Decades ago Rashomon, a truly great movie, gave us a phrase, the "Rashomon effect." It did not become part of our common, everyday vocabulary but is used by psychologists, psychotherapists, journalists, and people like me because it names, and therefore helps us understand, a particular sort of complex social interaction.

But first, the movie – I saw Rashomon in 1952, soon after it arrived in the U.S. Roger Ebert wrote that "Rashomon (1950) struck the world of film like a thunderbolt." It won the Golden Lion at the Venice Film Festival, introducing Japanese cinema to the world, and won the Academy Award as best foreign film, setting box office records in the U.S. for a subtitled film. At the time I was a university student holding down a job to pay for it, so I had little time for movies, which I then thought were, anyhow, mostly a waste of time. Student friends more sophisticated than I were foreign film enthusiasts, but a movie from Japan did not interest them. It was outside their experience, so I, the anthropology student, asked my sociology student friend, who happened to be Japanese-American, to see Rashomon with me. (During World War II, his family was relocated and interned in a camp for Japanese-Americans, a totally unjust action taken by the U.S. government.) Less than half an hour into the movie he stood up, abruptly, and walked out of the theater. I was so involved in the movie, in the story and the marvelous images, that only after it had ended, after sitting there for some time, still caught in its spell, did I wonder why my friend had left. He never spoke of it and I felt it somehow better not to ask. Here for a full description of the movie.

No one discussed Rashomon with me; in college social circles cinema was not yet considered an art form and I was not yet reading movie reviews. Bosley Crowther's review in the New York Times seems to be the only Rashomon review from 1951. Roger Ebert wrote his in 2002, in concert with the Criterion Collection release on DVD. For many years I believed I was the only person who knew of Rashomon, but the question it asks and scenes from the movie stayed with me. I have since seen it again on DVD and online.

Not all critiques of Rashomon have been unstinting in their praise. Phillip Lopate, reviewing Paul Anderer's book "Kurosawa's Rashomon: A Vanished City, a Lost Brother, and the Voice Inside His Iconic Films," believes that Anderer overrates Rashomon. In Lopate's opinion – "… The fact that it is "iconic" does not necessarily make it a masterpiece — certainly not one of comparable depth to, say, Mizoguchi's "Ugetsu" or Ozu's "Late Spring." Visually dazzling, yes, but the hammy and naïve aspects remain irksome. Toshiro Mifune's monkey-scratching bandit, charming at first, becomes one-note; the drifter's cynical laughter is excessive; and the woodcutter's rescue of the baby at the end, a crudely sentimental device. Kurosawa's Big Thoughts, like What is truth? and Is man inherently evil?, seem trite. The problem is not that these questions are undeserving of consideration, but that Kurosawa poses them in a didactic, simplistic, self-congratulatory manner."

I prefer Roger Ebert’s view. He accepted the elevated emotional level of the actors, as did I. Kurosawa was not looking for realism. He had the actors, in the mode of silent film, use their faces, eyes and gestures to express emotion, and the story takes on a universal, mythical meaning beyond what language can ordinarily communicate.

In another review, from 2015, Kim Newman places Rashomon as less than a masterpiece but important in film history; he considers it essential viewing.

James Berardinelli’s thoughtful commentary is well worth reading to comprehend the significance of the movie and the questions it raises.

Whether one considers Rashomon a great movie or merely consequential, the idea of the "Rashomon effect" remains. For the anthropologist it is central; a main problem is how to deal with different understandings in different cultures.

Michael Lakahn defines it: “Stated simply, the “Rashomon effect” is the effect that our subjective perceptions have on our memories of events. The result is that two or more observers of the same event will describe substantially different but equally plausible accounts of the event. As neurological science has demonstrated, we are all subject to the Rashomon effect. We are all unreliable narrators.”

In the first scene of Rashomon, in a heavy wind-driven rain, two men sit in the ruins of a once-massive city gate. The woodcutter says and repeats, "I don't understand. I just don't understand." The priest, in a sad voice, lists many of the wrongs perpetrated by man and by nature — wars, plagues, floods, fires — but finds the court trial testimonies he and the woodcutter heard, sitting there in the background, the most distressing and disastrous of them all. A commoner who joins the woodcutter and the priest states the obvious. He points out that murder and violence are everyday occurrences and wonders why they agonize over this particular murder, over how it was committed and by whom.

For the priest, the succession of personal tales may have reinforced his fear that human beings are inherently flawed, self-serving, locked into the individual's own perspective, while the very essence of being religious is putting the good of others above self. For the woodcutter, perhaps the stories told were reminders that people in other places and other social statuses are different from himself, not really understandable; but since he had played a part, if only a small part, in the drama of the thief, the noblewoman, and the nobleman, he could not separate himself from them. The commoner shrugged it off, mindlessly, and looked to his own interest.

I have lived my life as a friend, wife, citizen, anthropologist and public health worker trying to understand why people from different backgrounds think and act differently from the ways I acquired growing up in America. I recognized the Rashomon effect but until recently did not find it, as in the movie, a reason to lament the nature of human nature; I thought differences in perception were simply a fact of life to analyze and work with. Now, however, everything around me has changed. I observe new technologies and the global economy changing American society and culture. Americans are divided as never before into factions with sharply opposing self and national interests. We have at present a toxic political climate and a government dominated by a President and his appointees who represent only one part of the society, and a minority part at that. I wonder how our traditional democratic institutions, the economy, and ordinary people will weather this ideological storm.

I cannot help but reconsider the grave view held by the priest in Rashomon. We are all subject to the Rashomon effect. We are not only unreliable narrators, we are unreliable observers who interpret circumstance and relevant information to favor our own self-interest and to act in that regard, justifying our actions as beneficial for the larger society. Thus, we have no hope for a universal court where everyone can agree on what is true and just. However, unlike some of the commentators, I do believe that a knowable psychological and social reality does exist and can be discovered through scientific study.

But enough of this for now.