The Neolithic: Revolution or Evolution?

I, like you, am a small piece of a community, which is part of a society, which is a piece of an ecosystem, which is part of a planet, which is part of a solar system, which makes up a bit of a galaxy, which is one of a cluster, which is part of a supercluster, which is part of a universe.  And yet, I am also a collection of individual cells, organized into differentiated organs, each with its own role.  Each of those cells is itself a small community of organelles, including the nucleus and the mitochondria (the latter once a separate species), and each cell harbors a community of genes, which struggle individually and separately to replicate themselves through cell division, reproduction, survival of the individual person, survival of the community, survival of society and so on.  I am a part of a larger whole and I am a whole of smaller parts.

The grand organization I exist in developed slowly, very slowly, over billions of years, beginning with the simplest genes, or so we assume, and developing more and more complex forms of organization.  We now live in a multi-species community.  It is hard to call it anything else.  We convince ourselves that we are its masters, that we have molded it and not the other way around, that we have shaped the plants and animals around us into a world that meets our liking.  Perhaps we have, but things are never so simple.

Since the Neolithic Revolution brought multiple species of plants and animals into permanent relationships with human beings in different parts of the world at similar times, we have come to depend on a variety of other species for our existence, and they have come to depend on us.  It is more than symbiosis and it is something different than an ecosystem.  Human beings and our numerous allied species have transformed much of the planet into something we find more suitable to the propagation of our genes.  The plants and animals that evolved into human beings’ pleasures or pests have succeeded in propagating their genes in a way that would have been unimaginable to our hunter-gatherer forebears, if they ever thought to imagine such a thing.

Look at corn, or maize, as it is more properly called.  It descended from the humble teosinte grass and is so different from its ancestor that its origins are still debated.  Human beings bred and adapted this plant in Mesoamerica, as they bred and adapted wheat in the Middle East.  Or did the plant evolve to take advantage of human agriculture, making itself more appealing to us so we would spread its genes far and wide?  Did we choose corn or did it choose us?

I am more fascinated with the chili pepper, a plant that evolved long ago to be pleasant to birds, which spread its seeds, and unpleasant to mammals, which do not.  The exception is human beings, the only mammals that eat the fruit of the plant, and we have happily spread its seeds around the globe without once thinking we were doing the plant a favor.  Did we choose chili peppers or did they somehow choose us?

There are some species that have clearly joined up with human beings voluntarily, especially pests, but also cats and dogs, two carnivores that seem out of place in human society.  Cats’ participation in our communities appears to have been completely voluntary and some people believe dogs also domesticated themselves, joining up to get a regular lunch.  Maybe someday a raccoon subspecies will tame itself in a similar way.

But what about cows?  Cows no longer exist as a wild species.  We feel sorry for them.  I think that is appropriate, but at the same time, they are a spectacularly successful species mostly because human beings like to eat them and drink their milk.  If people never ate them, never bred them, how many cows would there be?  Cows may suffer, but their genes are fulfilling their one and only goal: reproduction.

In any case, human societies stopped being exclusively human thousands of years ago.  We are now part of a web of interdependent species.  Many of the species we rely on would struggle to survive in any other environment and we would struggle to survive without them.  We are part of a new ecosystem that can pick itself up and move in toto from one continent to another, an ecosystem that has cut down forests, drained swamps, filled in shallow seas, terraced hills, irrigated deserts and paved over grasslands in all but the least amenable climates.

Trees, grasses and corals are among the species that have played similar transformative roles, making homes for numerous other species, but ours is just a bit different.  In forests, grasslands and reefs, there are also complex relationships, but it is less important which species of tree, grass or coral is providing the home and which species play other roles, like pollinating flowers or spreading seeds.  In contrast, human ecosystems require a specific set of species.  Since the dawn of the Neolithic Revolution, when people move, they take their web of species with them.

The hybridized human ecosystems that developed after Columbus have been more successful than the older ones.  The new mix of plants and animals that we can’t live without is one of the reasons for the huge growth in human numbers during the last few centuries.

So, did the Neolithic Revolution spark a new way of life or a new kind of life?  Was it cultural change like the Industrial Revolution or evolutionary change like multicellular life?  Are we really separate from the plants and animals we rely on?  Or do we need them as much as we need human society?  And is there a difference anymore?

What I Know to Be True

One of the few things I regret from my nearly 50 years as a member of the Church of Jesus Christ of Latter-day Saints (formerly known as the Mormons) was standing up in church and declaring that I knew the church was true.

I know what experiences I based those statements on, but I no longer believe those experiences meant the church was the one and only church with authority to act in the name of Jesus Christ. I believe such experiences are available in other churches and religions. I believe they do indicate that God exists, but I can’t say that for sure, either.

The one thing I know is that it is wrong to mistreat other people. I know it is right to be good to each other. I know it is right to be kind. I know it is wrong to deliberately hurt someone else. That’s the sort of thing I know. Everything else is belief.

Why Occam’s razor works for science, but not religion

For those who don’t know, “Occam’s razor” describes a way of choosing between two different explanations for the same thing.  It basically says that the simplest explanation, the one that requires the fewest assumptions, is usually the best one.  It was named after a 14th century English scholar/friar (because pretty much all scholars were churchmen at the time).

It is my observation that Occam’s razor really describes an almost universal human strategy.  It is the way human beings approach truth.  “Where there’s smoke, there’s fire” is one example of how people apply the principle to ordinary life.  The saying captures the idea that the simplest explanation for what we observe is usually the true one.

The problem with Occam’s razor arises when it is used to determine absolute truth, rather than which truth is most likely.  It’s not unusual for people to use Occam’s razor as an argument-ender (Hypothesis A is simpler, so Occam’s razor says A is true.  End of discussion.).  The problem with this is that the principle is only as good as the data we have.  More data can change what the simplest explanation is.  What we thought was the simplest explanation can turn out to be very complicated.

For example, for most of the period that human beings have been around to contemplate the world we live in, most people believed the sun revolved around the earth.  That was the simplest explanation.  Until the 16th century, it was, in fact, the explanation demanded by Occam’s razor in almost all human societies.

What changed?  More and better data.  Europeans realized that the model of the universe they were using didn’t explain the movements of the planets well enough.  If all the planets revolved around the earth, their movements could not be described by nice, neat orbits.  The geocentric model of the universe became more and more complicated.  Copernicus put the sun at the center, and the motions of the planets became far simpler to describe.  More data made a heliocentric model of the universe the simplest explanation and it was eventually adopted by everyone (until it became clear that the sun was not the center of the universe, either).

The amount of data we have today makes a geocentric model completely impossible (the few people who claim to doubt it are obliged to say the data is false).  We have sent people to the moon and machines to the farthest reaches of the solar system.  We have taken so many photographs and measurements of objects in space and of the earth from space that a geocentric model of the universe isn’t even an option.  It’s easy to forget that there was a time when the simplest, most scientific explanation for the motion of the sun, moon, planets and stars was that everything revolved around the earth.

When Copernicus first described his theory, there was nowhere near as much data available and only experts knew the data existed.  At first, Copernicus’ model was only highly probable, not proven, and non-experts had little or no reason to believe him.  We judge the critics of Copernicus and Galileo a bit too harshly.

There is also a flaw in the human mind.  People have a hard time with concepts like “almost certain,” and “highly probable.”  Human beings see them as meaning the same thing as either “true” or “false,” depending on what they are inclined to believe.  If you are willing to believe an idea that is “almost certain” or even “highly probable,” you will probably see “highly probable” as being the same as “true.”  If you really don’t want to believe an idea, you will probably focus on the inherent uncertainty in the term “highly probable” or “almost certain” and say that the idea is “false.”

This is a problem in science, for both scientists and the public, since science rarely declares an idea to be completely true or false at first.  In most cases, science initially rates ideas as being “probable,” “unlikely,” “highly probable,” etc.  Scientists themselves often take sides in scientific debates and talk about their side as if it were “true,” while scientists on the other side talk about it as if it were “false.”  We can hardly expect the public to be more nuanced than scientists are themselves.

As scientists accumulate more data, their ideas stop being “probable” and become either “highly probable” if the evidence supports it or “not very likely” if it doesn’t.  Then—if we’re lucky—more data will show the idea to be either “proven” or “disproved.”  This has happened again and again.  It is how science works, as a whole.

Take evolution.  The amount of data Darwin was working with was fairly small, but over time, biologists described more and more species and paleontologists dug up a seemingly immense number of fossils, with approximate dates provided by dating methods that have themselves gained more and more certainty as more data has accumulated.  Biologists examined minute cell structures under the microscope and then geneticists added in DNA evidence.

The amount of evidence that supports evolution is now staggering, with much of that evidence discovered in the last 50 years.  Of course, not everyone believes it, for the same reasons human beings rejected previous new ideas: they don’t know the evidence and hearing any uncertainty about an idea they really don’t like is the same as hearing that it’s false.

Of course, not every idea in science is proven to be true.  Some are quietly forgotten as new evidence shows that they aren’t just unlikely, but false.  For example, scientists used to declare that there was a sharp division between animal intelligence and human intelligence.  That division is slowly blurring as scientists accumulate more data.  New evidence consistently contradicts the idea, rather than confirming it.  The idea that there is a large gap between the intelligence of human beings and that of all (other) animals is headed for the dustbin of history.

This does not bother scientists because science is supposed to be the best description we have of the world around us and how it works, and science is always evolving.  That’s the whole point of doing it.  Some ideas in science are now beyond dispute, but others are not.  The disputable ideas are the ones scientists love investigating and arguing about, by the way.

Religion is not science.  Religion does not purport to be “the best description of the world available,” but “the truth.”  It also makes claims about things that cannot be investigated, proven or disproved.  That’s the whole point of religion.  “Highly probable” is not an acceptable level of certainty in religion.  Religion is supposed to go beyond the available data.

Religion is, indeed, a matter of faith.  For the believer, it is a matter of knowing true things that cannot be discovered by science.  In a religious context Occam’s razor becomes unhelpful because it describes what is most likely true, given the available data, while religion is supposed to describe what is true, without any available data.  That is the very definition of faith.

For example, does evolution disprove the Bible?  Some people believe that it does, but many others believe that it does not.  These believers do not see the question as being “Is the creation story in Genesis true or false?” but rather as “Can the Bible be true even if the creation story is false?” or “Can my religious beliefs be true if the creation story in Genesis is false?”  Millions of people have decided that the answer to one or both of those questions is yes.

For the religious, it is not a question of what is most likely, but of what is possible. Religious people arrive at their beliefs through methods that are not subject to scientific investigation.  When they use science and evidence to test their religious beliefs, their question is not usually “Is it likely that my religious beliefs are true?” but “Is it possible that my religious beliefs are true?”

While specific beliefs of religion can be proven or disproved, the scant evidence surrounding religious belief almost always leaves some uncertainty in general matters, enough wiggle room for people to say “Yes, my central religious beliefs can be true.”  For true believers, the possibility that their beliefs are true is all they need, since they didn’t base their belief on physical evidence in the first place and never expected to have proof of them.

That’s the attitude they have if they’re objective, which most people aren’t.  Most people will do the same thing they do with science, only in reverse.  When people want to believe in a religion, any uncertainty about evidence that contradicts it will make them see that evidence as “false,” but when people don’t want to believe in a religion, any evidence that “very probably” contradicts it will be seen as “certain proof.”  Believers will think the evidence is irrelevant, while doubters will think that the belief in question has been as thoroughly disproved as the idea that the sun revolves around the earth.

In sum, Occam’s razor is a useful tool, but it is not the same as proof.  When applied to faith, it loses its usefulness.  In addition, the human emotions surrounding religion will cause most people either to exaggerate what it says or to ignore it entirely.

Religion, Atheism and Critical Thinking

To begin with, I have to say that I believe in God, but I do not believe atheists are going to hell.  In my opinion, there is little significant difference between believing in goodness and believing in God.  I am more confident in the eternal fate of an atheist who tries to be the best person they can than I am in the fate of a believer who muddles through life without making the hard choices that true goodness requires on a regular basis.

I believe I am on firm ground here, by the way.  Changing your opinion is easy.  Changing your character is not.  As far as I’m concerned, character is what counts, now and forever.

So, when the New Testament, for example, talks about the need to believe in Jesus Christ, I see that belief as being measured by a person’s actions, not their opinions.  And if we cannot change our opinions after death, we are all in trouble.  Imagine being stuck with the opinions you have now for all eternity!  I know that a belief in God is not just any opinion, but is it so different that it cannot be changed after death?

My religion leads me to this view, of course.  While I am not an “active” Mormon at the moment, the church’s teachings still inform my beliefs and the idea that people can choose to convert after death is essential to the religion’s view of life, death and the eternities.  So, the arguments I have made here are really nothing more than a way of explaining and defending Mormon belief with generic terminology.

As a result of this belief, Mormons do not usually fear for the eternal fate of non-believers.  I am hardly alone in this perspective.  While individual Mormons may look at some of their loved ones and believe they are bound for hell, most Mormons are quite optimistic about the fate of the people who are dear to them.  The religion gives you a choice in how you see others.

The real issue I wanted to address, however, was the claim that religious people (especially Mormons) abandon critical thinking to maintain their beliefs.  I had to include the above paragraph so that I didn’t contribute to—or participate in—the atheist-bashing that is so common in American culture.

It is simply not true that religious people do not engage in critical thinking.  It is not even true that Mormons do not engage in the practice, no matter how many people may claim otherwise.  (For examples of Mormons using critical thinking, see Joanna Brooks and Jana Riess or the sprawling website By Common Consent.)

Someone who does not use critical thinking will encounter facts that counter their beliefs and ignore them.  Someone who uses critical thinking will encounter those same facts and see if there is some way they can be reconciled with their beliefs.  They ask whether it is their understanding of God and religion that is wrong, rather than rejecting the facts or abandoning their beliefs.  They ask whether they need to adjust and adapt their beliefs.  They ask whether rejecting one of their beliefs will really diminish the rest of them.

Someone who engages in critical thinking does not immediately abandon their beliefs, although they may do so eventually if they can find no way of reconciling their new knowledge with their former beliefs.  I’m not saying that critical thinkers never come to the conclusion that they must abandon what they previously believed.  It is clear that some do.  But the majority of critical thinkers with deeply held beliefs about God maintain most of them, without rejecting the new facts they encounter.

In sum, you cannot divine a person’s character or their eternal fate by asking if they believe in God, and neither can you use the question to assess a person’s critical thinking skills.

Noble Selfishness: Donald Trump, Bernie Sanders, Brexit and Islamic Terrorism

I woke up this morning in a world that seemed somehow changed.  After months of hourly news developments related to Donald Trump’s latest outrage, followed by the “worst mass shooting in American history” (which may or may not have been an act of terrorism), even as we approached one year since a white supremacist murdered nine black worshippers in a historic Charleston church, we now face a British vote to exit the European Union.  As my mind struggles to make sense of it all, I recall a common phenomenon I have noticed among parents: noble selfishness.

An individual’s efforts to advance their own welfare are labeled, quite appropriately, as acts of selfishness, but a parent’s efforts to advance the welfare of their children are often seen as acts of love.  It is rare for a parent’s advocacy for their children to be seen as a fault.

Coincidentally, we have also seen one of those rare moments in recent weeks, as the father of a young rapist was dragged through hell on the internet for defending his son and brushing aside the profound effect his son’s actions had on another human being.  Even so, the selfishness of his words would have been completely overlooked if the tears of his son’s distraught victim had not spread around the nation before his unfortunate letter did.  Having absorbed her pain before we heard his compassion for his son, we reacted quite differently than the lenient judge who decided the case.

While that case may be extreme, it is hardly an isolated phenomenon.  Parents are usually given great leeway in advancing the interests of their children, even when other children are indirectly—or even directly—harmed as a result.  Schools and teachers are quite familiar with this kind of noble selfishness as they deal with the righteous indignation of a parent whose child did not receive every benefit and every accolade the school could provide.

My personal code for life

This is my personal code for life.  It’s a bunch of obvious things that most people already know.  I am sure I’ve missed some significant things, but I still like it:

  • Anything that leads to love is good. Anything that leads away from it is not.
  • I claim the god-given right to be imperfect.  I also claim the inherent right to have “issues.”  I recognize that everyone else has the exact same rights.
  • I cannot control other people and if I try to do so, I will make both them and me miserable. I do have the right to set boundaries for the treatment I will accept from them. It is my responsibility to communicate those boundaries to them. I have the inherent right to disassociate myself from them if they refuse to honor those boundaries. If someone still tries to hurt me, I can seek out appropriate protection.
  • The universal golden rule still applies: treat others the way you want to be treated. It is wrong to harm others. Allowing someone else to come to harm through inaction is also wrong. And one way we cause harm through inaction is choosing to remain ignorant of the harm we cause to others.
  • It is also wrong to cause harm to ourselves, through action or inaction.
  • I am never responsible for another person’s behavior, but I am responsible for the temptations I create for them. I will inevitably tempt others to be angry, to lash out, to be jealous, to seek revenge, etc., but if I choose to ignore the temptations I create for others I am harming them through inaction.
  • I have the right to decide to believe in God or not. If I believe in God, I have the right to decide what expectations that being has of me. I also have the right to follow those expectations. What I may not do is harm others through either action or inaction in order to satisfy God.
  • It is more important to be wise than to be happy. Seeking happiness over wisdom is likely to lead to neither one, but seeking wisdom over happiness is likely to lead to both. Wisdom is the key to happiness. Understanding is the key to wisdom. Knowledge is the key to understanding.
  • There may be nothing in life I can truly control. Life is a matter of probabilities and odds. I cannot change that. What I can do is change the odds.

Why I cannot be silent

I would like to relate an experience I had while I was a student at Brigham Young University. I had many good experiences there, but I had a few bad ones as well.  And since painful experiences stay with us in ways that positive experiences do not, this event left an impression on me that has never gone away.

It happened in a religion class, almost 25 years ago. All students at BYU are required to take religion classes as part of their General Education requirements, no matter what their beliefs may be. If this sounds odd, you may ascribe it to the fact that the church subsidizes the tuition of all students, even those who are not Mormon.  The tuition non-Mormons pay may be higher than the rate members of the church pay, but it still does not cover the total cost of their education, or at least it did not do so when I attended the university. The religion classes are part of the deal.

It may seem unusual to people from other Christian traditions, but even though BYU religion teachers teach church-designed classes at a church university, they do not have any kind of official capacity within the church. Our church does not have trained clergy or any training program (which sometimes really shows) and BYU religion teachers are all professors who have been trained in other fields. As professors, they have some freedom to teach whatever they would like, meaning that while their lessons usually conform to established church doctrines, this is not always the case.

This particular professor, on this particular day, decided to talk about birth control and how wrong it was. That is not the position of the church, either then or now. In fact, while it is true that some Mormons eschew birth control, it is actually a common practice among faithful members of our church, which means that it is also common at BYU, where a large percentage of the students are married.

One of the women in the classroom decided to express her disagreement with his lesson and included the fact that she and her husband practiced birth control and why they did so. The teacher responded by condemning her for not having sufficient faith that God would take care of them if she became pregnant. A few of the members of the class joined in and ganged up on her. It was ugly and she left the room in tears.

While several people attacked her faith, no one stood up for her. More importantly to me, I did not stand up for her, even though I agreed with her completely. When she left the room crying, that hit me more than anything else could have. I resolved never to let such a situation go by again. I was determined to stand up for people who were being unfairly attacked.

I am sure I have not been perfect in following this principle, but I have tried. One of my strongest memories was when I once defended my church and my own beliefs. Defending my own group wasn’t quite what I had determined to do, but it was close enough to satisfy me. It happened a few months after my experience in that religion class.

My sister Tamara was working as a nanny in New York City and I made some extra money over the summer.  Instead of spending my extra cash on something practical like a well-used car, I kept walking and riding my bike around town and flew to New York City to tour it on the cheap (still not a cheap trip). Tamara set me up in the apartment of a friend and arranged for people to take me to interesting places during her work hours. It was an amazing trip that included seeing Les Miserables on Broadway. The actor who had originated the role of Jean Valjean on Broadway had returned for a special visit that night and we took in the show from the fourth row. I remember a lady we met in line praising my sister for the low price she had found for those prime seats. The musical and the entire week were an incredible experience for me and I have never properly thanked her for what she did for me.

One of the things we did together was go to her Institute class. Institute classes are religion classes for Mormon adults of college age, but outside of a formal university setting. They are similar to the BYU classes and follow the same textbook.  I do not know whether or not the teachers have the same freedoms that BYU religion professors have, but I doubt it.

New York City did not have a large number of Mormons or Mormon chapels at the time.  The lack of church buildings meant that my sister’s institute class was held in a public building at West Point, a military university campus. At the same time we were there, another religious group was meeting just down the hall. Someone in our group realized the others were watching the movie The Godmakers. This is a piece of anti-Mormon propaganda that was very common at the time and there probably weren’t any two words in existence that could raise Mormon hackles the way the title of that movie could then.

And it certainly raised mine. I stood outside the door of that antagonistic meeting, listening, not daring to go in…until it ended. As soon as it was over, I went in and confronted the group and told them that the movie they had been watching was full of half-truths and lies. They were fairly polite in their responses. For my part, I felt I had followed through on my recent determination to stand up for those who were unfairly attacked.

Generally speaking, my determination to stand up for others has not led me to be so bold, nor has its focus been on defending my own religion, though I have done that as well. Occasionally it has led me to confront members of my own religion regarding their treatment of others.

In the end, no matter who I am defending from whom, it all goes back to that day in my religion class at BYU when I saw that woman leave in tears. I guess I already kind of knew how it felt to be an outsider, to feel like everyone was against you. After seeing bullying like that happen as an adult, I could never be a part of it again. I could never again be comfortable remaining silent and letting other people treat another human being that way without raising an objection.

This matters to me. It is one of my personal rules. And even if I am not perfect in following it, I do try and I intend to continue.