Why Occam’s razor works for science, but not religion

For those who don’t know, “Occam’s razor” describes a way of choosing between two different explanations for the same thing.  It basically means that the simplest explanation, the one that requires the fewest assumptions, is the best one.  It was named after a 14th-century English scholar/friar (because pretty much all scholars were churchmen at the time).

It is my observation that Occam’s razor really describes an almost universal human strategy.  It is the way human beings approach truth.  “Where there’s smoke, there’s fire” is one example of how people apply the principle to ordinary life.  The saying treats the simplest explanation for what we observe as the true one.

The problem with Occam’s razor arises when it is used to determine absolute truth, rather than to determine what truth is most likely.  It’s not unusual for people to use Occam’s razor as an argument-ender (Hypothesis A is simpler, so Occam’s razor says A is true. End of discussion.).  The problem with this is that the principle is only as good as the data we have.  More data can change what the simplest explanation is.  What we thought was the simplest explanation can turn out to be very complicated.

For example, for most of the period that human beings have been around to contemplate the world we live in, most people believed the sun revolved around the earth.  That was the simplest explanation.  Until the 16th century, it was, in fact, the explanation demanded by Occam’s razor in almost all human societies.

What changed?  More and better data.  Europeans realized that the model of the universe they were using didn’t explain the movements of the planets well enough.  If all the planets revolved around the earth, their paths could not be described by nice, neat orbits; astronomers had to keep piling circles on top of circles (epicycles) to make the model match what they saw.  The geocentric model of the universe became more and more complicated.  Copernicus put the sun at the center and the motions of the planets were all described by nice, smooth paths.  More data made a heliocentric model of the universe the simplest explanation and it was eventually adopted by everyone (until it became clear that the sun was not the center of the universe, either).

The amount of data we have today makes a geocentric model completely impossible (the few people who claim to doubt it are obliged to say the data is false).  We have sent people to the moon and machines to the farthest reaches of the solar system.  We have taken so many photographs and measurements of objects in space and of the earth from space that a geocentric model of the universe isn’t even an option.  It’s easy to forget that there was a time when the simplest, most scientific explanation for the motion of the sun, moon, planets and stars was that everything revolved around the earth.

When Copernicus first described his theory, there was nowhere near as much data available and only experts knew the data existed.  At first, Copernicus’ model was only highly probable, not proven, and non-experts had little or no reason to believe him.  We judge the critics of Copernicus and Galileo a bit too harshly.

There is also a flaw in the human mind.  People have a hard time with concepts like “almost certain,” and “highly probable.”  Human beings see them as meaning the same thing as either “true” or “false,” depending on what they are inclined to believe.  If you are willing to believe an idea that is “almost certain” or even “highly probable,” you will probably see “highly probable” as being the same as “true.”  If you really don’t want to believe an idea, you will probably focus on the inherent uncertainty in the term “highly probable” or “almost certain” and say that the idea is “false.”

This is a problem in science, for both scientists and the public, since science rarely declares an idea to be completely true or false at first.  In most cases, science initially rates ideas as being “probable,” “unlikely,” “highly probable,” etc.  Scientists themselves often take sides in scientific debates and talk about their side as if it were “true,” while scientists on the other side talk about it as if it were “false.”  We can hardly expect the public to be more nuanced than scientists are themselves.

As scientists accumulate more data, an idea stops being “probable” and becomes either “highly probable” if the evidence supports it or “not very likely” if it doesn’t.  Then—if we’re lucky—more data will show the idea to be either “proven” or “disproved.”  This has happened again and again.  It is how science works, as a whole.
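For readers who like to see the arithmetic, here is a toy sketch of what “more data” does to an idea.  It is purely my own illustration, not anything from a real scientific debate: the 70% and 50% figures are invented, and the code is just a minimal Bayesian update between two made-up hypotheses.

```python
# Toy illustration (invented numbers): hypothesis A says an event happens
# 70% of the time; hypothesis B says 50%. Starting from even odds, we update
# the probability of A as observations come in.

def update(prior_a, observations, p_a=0.7, p_b=0.5):
    """Return P(A) after observing a sequence of True/False events."""
    posterior = prior_a
    for hit in observations:
        like_a = p_a if hit else 1 - p_a           # likelihood of this event under A
        like_b = p_b if hit else 1 - p_b           # likelihood under B
        evidence = posterior * like_a + (1 - posterior) * like_b
        posterior = posterior * like_a / evidence  # Bayes' rule
    return posterior

# Data generated as if A were true: 7 hits out of every 10 observations.
data = [True] * 7 + [False] * 3
print(update(0.5, data))        # ~0.70   -- merely "probable"
print(update(0.5, data * 10))   # ~0.9997 -- "highly probable," nearly certain
```

Ten observations leave the idea merely “probable”; a hundred of the same kind make it all but certain.  That is the trajectory described above, in miniature.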

Take evolution.  The amount of data Darwin was working with was fairly small, but over time, biologists described more and more species and paleontologists dug up a seemingly immense number of fossils, with approximate dates provided by dating methods that have themselves gained more and more certainty as more data has accumulated.  Biologists examined minute cell structures under the microscope and then geneticists added in DNA evidence.

The amount of evidence that supports evolution is now staggering, with much of that evidence discovered in the last 50 years.  Of course, not everyone believes it, for the same reasons human beings rejected previous new ideas: they don’t know the evidence and hearing any uncertainty about an idea they really don’t like is the same as hearing that it’s false.

Of course, not every idea in science is proven to be true.  Some are quietly forgotten as new evidence shows that they aren’t just unlikely, but false.  For example, scientists used to declare that there was a sharp division between animal intelligence and human intelligence.  That difference is slowly blurring as scientists accumulate more data.  New evidence always seems to contradict that idea, rather than confirm it.  The idea that there is a large gap between the intelligence of human beings and that of all (other) animals is headed for the dustbins of history.

This does not bother scientists because science is supposed to be the best description we have of the world around us and how it works, and science is always evolving.  That’s the whole point of doing it.  Some ideas in science are now beyond dispute, but others are not.  The disputable ideas are the ones scientists love investigating and arguing about, by the way.

Religion is not science.  Religion does not purport to be “the best description of the world available,” but “the truth.”  It also makes claims about things that cannot be investigated, proven or disproved.  That’s the whole point of religion.  “Highly probable” is not an acceptable level of certainty in religion.  Religion is supposed to go beyond the available data.

Religion is, indeed, a matter of faith.  For the believer, it is a matter of knowing true things that cannot be discovered by science.  In a religious context Occam’s razor becomes unhelpful because it describes what is most likely true, given the available data, while religion is supposed to describe what is true, without any available data.  That is the very definition of faith.

For example, does evolution disprove the Bible?  Some people believe that it does, but many others believe that it does not.  These believers do not see the question as being “Is the creation story in Genesis true or false?” but rather as “Can the Bible be true even if the creation story is false?” or “Can my religious beliefs be true if the creation story in Genesis is false?”  Millions of people have decided that the answer to one or both of those questions is yes.

For the religious, it is not a question of what is most likely, but of what is possible. Religious people arrive at their beliefs through methods that are not subject to scientific investigation.  When they use science and evidence to test their religious beliefs, their question is not usually “Is it likely that my religious beliefs are true?” but “Is it possible that my religious beliefs are true?”

While specific beliefs of religion can be proven or disproved, the scant evidence surrounding religious belief almost always leaves some uncertainty in general matters, enough wiggle room for people to say “Yes, my central religious beliefs can be true.”  For true believers, the possibility that their beliefs are true is all they need, since they didn’t base their belief on physical evidence in the first place and never expected to have proof of them.

That’s the attitude they have if they’re objective, which most people aren’t.  Most people do with religion what they do with science, only in reverse.  When people want to believe in a religion, any uncertainty about evidence that contradicts it will make them see that evidence as “false,” but when people don’t want to believe in a religion, any evidence that “very probably” contradicts it will be seen as “certain proof.”  Believers will think the evidence is irrelevant, while doubters will think that the belief in question has been as thoroughly disproved as the idea that the sun revolves around the earth.

In sum, Occam’s razor is a useful tool, but it is not the same as proof.  When applied to faith, it loses its usefulness.  In addition, the human emotions surrounding religion will cause most people either to exaggerate what it says or ignore it entirely.

Religion, Atheism and Critical Thinking

To begin with, I have to say that I believe in God, but I do not believe atheists are going to hell.  In my opinion, there is little significant difference between believing in goodness and believing in God.  I am more confident in the eternal fate of an atheist who tries to be the best person they can than I am in the fate of a believer who muddles through life without making the hard choices that true goodness requires on a regular basis.

I believe I am on firm ground on this, by the way.  Changing your opinion is easy.  Changing your character is not.  As far as I’m concerned, character is what counts, now and forever.

So, when the New Testament, for example, talks about the need to believe in Jesus Christ, I see that belief as being measured by a person’s actions, not their opinions.  And if we cannot change our opinions after death, we are all in trouble.  Imagine being stuck with the opinions you have now for all eternity!  I know that a belief in God is not just any opinion, but is it so different that it cannot be changed after death?

My religion leads me to this view, of course.  While I am not an “active” Mormon at the moment, the church’s teachings still inform my beliefs and the idea that people can choose to convert after death is essential to the religion’s view of life, death and the eternities.  So, the arguments I have made here are really nothing more than a way of explaining and defending Mormon belief with generic terminology.

As a result of this belief, Mormons do not usually fear for the eternal fate of non-believers.  I am hardly alone in this perspective.  While individual Mormons may look at some of their loved ones and believe they are bound for hell, most Mormons are quite optimistic about the fate of the people who are dear to them.  The religion gives you a choice in how you see others.

The real issue I wanted to address, however, was the claim that religious people (especially Mormons) abandon critical thinking to maintain their beliefs.  I had to include the above paragraph so that I didn’t contribute to—or participate in—the atheist-bashing that is so common in American culture.

It is simply not true that religious people do not engage in critical thinking.  It is not even true that Mormons do not engage in the practice, no matter how many people may claim otherwise.  (For examples of Mormons using critical thinking, see Joanna Brooks and Jana Riess or the sprawling website By Common Consent.)

Someone who does not use critical thinking will encounter facts that counter their beliefs and ignore them.  Someone who uses critical thinking will encounter those same facts and see if there is some way they can be reconciled with their beliefs.  They ask whether it is their understanding of God and religion that is wrong, rather than rejecting the facts or abandoning their beliefs.  They ask whether they need to adjust and adapt their beliefs.  They ask whether rejecting one of their beliefs will really diminish the rest of them.

Someone who engages in critical thinking does not immediately abandon their beliefs, although they may do so eventually if they can find no way of reconciling their former beliefs with their new knowledge.  I’m not saying that critical thinkers never come to the conclusion that they must abandon what they previously believed.  It is clear that some do.  But the majority of critical thinkers with deeply held beliefs about God maintain most of them, without rejecting the new facts they encounter.

In sum, you cannot divine a person’s character or their eternal fate by asking if they believe in God, and neither can you use the question to assess a person’s critical thinking skills.

Noble Selfishness: Donald Trump, Bernie Sanders, the Brexit and Islamic Terrorism

I woke up this morning in a world that seemed somehow changed.  After months of hourly news developments related to Donald Trump’s latest outrage, followed by the “worst mass shooting in American history” (which may or may not have been an act of terrorism), even as we approached one year since a white supremacist murdered nine black worshippers in a historic Charleston church, we now face a British vote to exit the European Union.  As my mind struggles to make sense of it all, I recall a common phenomenon I have noticed among parents: noble selfishness.

An individual’s efforts to advance their own welfare are labeled, quite appropriately, as acts of selfishness, but a parent’s efforts to advance the welfare of their children are often seen as acts of love.  It is rare when a parent’s advocacy for their children is seen as a fault.

Coincidentally, we have also seen one of those rare moments in recent weeks, as the father of a young rapist was dragged through hell on the internet for defending his son and brushing aside the profound effect his son’s actions had on another human being.  Even so, the selfishness of his words would have been completely overlooked if the anguished words of his son’s victim had not spread around the nation before his unfortunate letter did.  Having absorbed her pain before we heard his compassion for his son, we reacted quite differently than the lenient judge who decided the case.

While that case may be extreme, it is hardly an isolated phenomenon.  Parents are usually given great leeway in advancing the interests of their children, even when other children are indirectly—or even directly—harmed as a result.  Schools and teachers are quite familiar with this kind of noble selfishness as they deal with the righteous indignation of a parent whose child did not receive every benefit and every accolade the school could provide.

My personal code for life

This is my personal code for life.  It’s a bunch of obvious things that most people already know.  I am sure I’ve missed some significant things, but I still like it:

  • Anything that leads to love is good. Anything that leads away from it is not.
  • I claim the God-given right to be imperfect.  I also claim the inherent right to have “issues.”  I recognize that everyone else has the exact same rights.
  • I cannot control other people and if I try to do so, I will make both them and me miserable. I do have the right to set boundaries for the treatment I will accept from them. It is my responsibility to communicate those boundaries to them. I have the inherent right to disassociate myself from them if they refuse to honor those boundaries. If someone still tries to hurt me, I can seek out appropriate protection.
  • The universal golden rule still applies: treat others the way you want to be treated. It is wrong to harm others. Allowing someone else to come to harm through inaction is also wrong. And one way we cause harm through inaction is choosing to remain ignorant of the harm we cause to others.
  • It is also wrong to cause harm to ourselves, through action or inaction.
  • I am never responsible for another person’s behavior, but I am responsible for the temptations I create for them. I will inevitably tempt others to be angry, to lash out, to be jealous, to seek revenge, etc., but if I choose to ignore the temptations I create for others I am harming them through inaction.
  • I have the right to decide to believe in God or not. If I believe in God, I have the right to decide what expectations that being has of me. I also have the right to follow those expectations. What I may not do is harm others through either action or inaction in order to satisfy God.
  • It is more important to be wise than to be happy. Seeking happiness over wisdom is likely to lead to neither one, but seeking wisdom over happiness is likely to lead to both. Wisdom is the key to happiness. Understanding is the key to wisdom. Knowledge is the key to understanding.
  • There may be nothing in life I can truly control. Life is a matter of probabilities and odds. I cannot change that. What I can do is change the odds.

Why I cannot be silent

I would like to relate an experience I had while I was a student at Brigham Young University. I had many good experiences there, but I had a few bad ones as well.  And since painful experiences stay with us in ways that positive experiences do not, this event left an impression on me that has never gone away.

It happened in a religion class, almost 25 years ago. All students at BYU are required to take religion classes as part of their General Education requirements, no matter what their beliefs may be. If this sounds odd, you may ascribe it to the fact that the church subsidizes the tuition of all students, even those who are not Mormon.  The tuition non-Mormons pay may be higher than the rate members of the church pay, but it still does not cover the total cost of their education, or at least it did not do so when I attended the university. The religion classes are part of the deal.

It may seem unusual to people from other Christian traditions, but even though BYU religion teachers teach church-designed classes at a church university, they do not have any kind of official capacity within the church. Our church does not have trained clergy or any training program (which sometimes really shows) and BYU religion teachers are all professors who have been trained in other fields. As professors, they have some freedom to teach whatever they would like, meaning that while their lessons usually conform to established church doctrines, this is not always the case.

This particular professor, on this particular day, decided to talk about birth control and how wrong it was. That is not the position of the church, either then or now. In fact, while it is true that some Mormons eschew birth control, it is actually a common practice among faithful members of our church, which means that it is also common at BYU, where a large percentage of the students are married.

One of the women in the classroom decided to express her disagreement with his lesson and included the fact that she and her husband practiced birth control and why they did so. The teacher responded by condemning her for not having sufficient faith that God would take care of them if she became pregnant. A few of the members of the class joined in and ganged up on her. It was ugly and she left the room in tears.

While several people attacked her faith, no one stood up for her. More importantly to me, I did not stand up for her, even though I agreed with her completely. When she left the room crying, that hit me more than anything else could have. I resolved never to let such a situation go by again. I was determined to stand up for people who were being unfairly attacked.

I am sure I have not been perfect in following this principle, but I have tried. One of my strongest memories was when I once defended my church and my own beliefs. Defending my own group wasn’t quite what I had determined to do, but it was close enough to satisfy me. It happened a few months after my experience in that religion class.

My sister Tamara was working as a nanny in New York City and I had made some extra money over the summer.  Instead of spending my extra cash on something practical like a well-used car, I kept walking and riding my bike around town and flew to New York City to tour it on the cheap (still not a cheap trip). Tamara set me up in the apartment of a friend and arranged for people to take me to interesting places during her work hours. It was an amazing trip that included seeing Les Miserables on Broadway. The actor who had originated the role of Jean Valjean on Broadway had returned for a special visit that night and we took in the show from the fourth row. I remember a lady we met in line praising my sister for the low price she had found for those prime seats. The musical and the entire week were an incredible experience for me and I have never properly thanked her for what she did for me.

One of the things we did together was go to her institute class. Institute classes are religion classes for Mormon adults of college age, but outside of a formal university setting. They are similar to the BYU classes and follow the same textbook.  I do not know whether the teachers have the same freedoms that BYU religion professors have, but I doubt it.

New York City did not have a large number of Mormons or Mormon chapels at the time.  The lack of church buildings meant that my sister’s institute class was held in a public building at West Point, the United States Military Academy campus. At the same time we were there, another religious group was meeting just down the hall. Someone in our group realized the others were watching the movie The God Makers. This is a piece of anti-Mormon propaganda that was very common at the time and there probably weren’t any two words in existence that could raise Mormon hackles the way the title of that movie could then.

And it certainly raised mine. I stood outside the door of that antagonistic meeting, listening, not daring to go in…until it ended. As soon as it was over, I went in and confronted the group and told them that the movie they had been watching was full of half-truths and lies. They were fairly polite in their responses. For my part, I felt I had followed through on my recent determination to stand up for those who were unfairly attacked.

Generally speaking, my determination to stand up for others has not led me to be so bold, nor has its focus been on defending my own religion, though I have done that as well. Occasionally it has led me to confront members of my own religion regarding their treatment of others.

In the end, no matter who I am defending from whom, it all goes back to that day in my religion class at BYU when I saw that woman leave in tears. I guess I already kind of knew how it felt to be an outsider, to feel like everyone was against you. After seeing bullying like this happen as an adult, I could never be a part of it again. I could never again be comfortable remaining silent and letting other people treat another human being that way without raising an objection.

This matters to me. It is one of my personal rules. And even if I am not perfect in following it, I do try and I intend to continue.

Speculation Unlimited Part VIII: The Future

The future is a wonderful thing to speculate about. You can say anything you like and by the time it gets here, almost everyone will have forgotten about it. Just don’t market it too loudly or you will be on the receiving end of movies like Tomorrowland.

Before we get to the future, I want to say a few things about the present. We live in a remarkable age. This blog post could theoretically be read by most of the adults on Earth, or at least those who can read English, which is still an amazing number of people. There are people who cannot access the internet, but grand efforts are currently being made to reduce that number.

Not only that, but there are seven billion people on the planet, and that number is expected to reach nine billion or more. The possibility of communicating with so many people and with every corner of the globe means we have just started to really think as a human race. We have only just begun.

It was Douglas Adams who spread the idea of the human race as a thinking machine; in The Hitchhiker’s Guide to the Galaxy, the Earth itself is a supercomputer designed to work out the ultimate question. He probably invented the idea as well. Of course, he pointed out that this world does a much better job of creating questions than it does of creating answers, but it is still a remarkable concept.

Human beings do not think by themselves. Everything we think is a synthesis of a mishmash of other people’s thoughts. We pass along knowledge to the next generation, primarily our children, but not exclusively so, and they keep working on the problems and ideas we leave to them. The human race really does resemble a massive supercomputer.

Increasingly superior methods of communication have allowed people to exchange ideas like never before. First, postal systems democratized long-distance communication, then the telegraph, the telephone, radio, movies and television all ramped up the possibilities. Now, the internet explosion has made long-distance communication so easy and so democratic that it is hard to imagine communication methods improving much from here on out.

The current barrier to communication is now language, so I suppose once everybody learns English the possibilities will multiply several times more. (That was actually intended to be a joke, since it is a rather common and arrogant assumption among my cultural fellows that English is all we need. Strangely, it didn’t really sound like one.)

Still, we can now trade ideas almost without limit. Someone from China or Nigeria or Chile or Samoa or Egypt could read this post and correct me or point out something that turns my thoughts in a new direction. Or it could stimulate their own thoughts and allow them to come to a better conclusion or to understand something entirely different.  Or they could mention one of my ideas to someone else, possibly in Chinese, Yoruba, Spanish, Samoan or Arabic, and that could trigger a new idea in the mind of that third person, who could go on to transmit that new concept to someone else, possibly in a completely new language.

This massive interaction started about 500 years ago and is now reaching astounding levels. It has enormous implications for the development of culture, philosophy, science and technology. People are exposed to new ideas to a degree that has never been possible. People move around the world in ways that were never possible. People marry outside of their culture more often than was ever the case before. Cultural change is inevitable, everywhere.

This makes the future inherently unpredictable. There is no telling where we will be in a hundred years. It could be a total disaster. It could be a paradise. It will most likely be something in between, because people have been pretty consistent in that respect for a very long time and in every location.

Still, here are a handful of predictions that I am willing to make: in the next two decades carbon-based fuels will go into serious decline and computing power will massively increase. We will also come to understand what our genes and our epigenome are and what they do. That’s it. (The “epigenome” is the technical term for the layer of chemical markers that switches our genes on and off.)

I make these three predictions because I am alive, and also because I am an avid reader of the website sciencedaily.com. This website carries news articles about new scientific research. It is clear that a great deal of time and money is currently being invested in finding ways to reduce the cost of solar power, create better batteries (which are needed for alternative energy sources to be truly viable), increase computing power and study our DNA.

We may have an “artificial leaf” in the relatively near future that turns sunlight into fuel the way plants do. We will certainly have much cheaper solar cells. We will also have much better batteries to run our cars and store our electricity. We may have quantum computers or photonic computers that make our electronic computers seem like something out of the dark ages. We will definitely have a much better understanding of our DNA and what it does, although it will probably take longer than 20 years to sort that out.

That is where we are headed. Carbon emissions may mostly disappear, which would be a great blessing. The question is, will it be soon enough? I have hope that we will avoid the worst of the effects of increased carbon dioxide levels, but we may not. The Earth is near a tipping point when it comes to land ice and ocean acidity. We really do not know how well the ecological system can adapt to this new reality. We can only try to make things better and hope. My gut instinct tells me that we will make it, but the future is probably not going to be a total paradise. As human beings, we continually solve old problems and create new ones. That is one thing that is not likely to change any time soon.

Speculation Part VII: Economic growth and the rise of the United States

This is another question that has intrigued me: Why did the United States become so wealthy and powerful? Why wasn’t it some other nation? I did not think of this question on my own, however. I read it as part of my Spanish and International Relations studies in college. The question I read was raised by an Argentinian and it specifically asked why it was the U.S. and not Argentina that grew so wealthy.

The author pointed out all the similarities between the United States and Argentina at the beginning of the 20th century. Americans don’t pay much attention to that part of the world even today, so they rarely realize that Argentina and some other South American countries like Uruguay and Brazil were also up-and-coming nations. They attracted their own share of European emigrants looking for opportunity. They were not impoverished lands of peasants and landlords. They had thriving, vibrant economies that were attractive to people who lived in the less-favored lands of Europe.

I also asked why other wealthy countries were wealthy. Why Japan? Why were the so-called Asian Tigers growing so quickly? How about Israel? I wanted an explanation that would cover everything I saw.

And, as you might expect by now, I have an idea, or rather four or five (this goes along with my tendency to believe that if you can’t think of at least three reasons something happened, you aren’t trying very hard).  I think it was the result of the combination of education, a positive business environment, quality infrastructure and low corruption.  I think it also really helped to have close cultural ties with the United Kingdom, which dominated so much of the world’s business in the 19th century.

In essence, I think that these factors had two major effects: they supported internal economic growth and they attracted foreign investment, particularly British investment.  As I write this, it seems so obvious to me that I can hardly find things to say to support each factor.  Unfortunately, that doesn’t prove I am right; it only shows that these ideas are deeply entrenched in my society.

Of course education helps economic growth. An educated population is better able to create innovations, increase productivity and so forth. That is the assumption of the culture I live in. It seems to be true.

Of course a positive business environment helps economic growth.  A balanced budget during inflationary times makes it possible to keep interest rates lower without expanding the money supply, because the government is not competing with private borrowers for credit.  Reasonable regulations allow people to take risks and innovate.  The ability to keep a reasonable amount of your profits has the same effect.  A healthy banking sector can give loans that also help people innovate and take reasonable risks.  And so on.

A capitalist system does have some downsides. It creates booms and busts and has a tendency to foster inequality, among other things. As a result people challenge it during the down times, but most people love it during the booms.

But quality infrastructure is obviously good for economic growth, right?  Infrastructure allows goods and services to be transported from one place to another.  (We don’t usually talk about transporting services, but that is exactly what happens when long-distance communication is possible.)  That expands the possibilities for trade and increases the chance that someone can find another person who is willing and able to join them in a mutually beneficial financial arrangement.  (It’s a win-win situation!)

Now, the economic benefits of low corruption may not be quite as obvious to someone who has lived in a country where corruption is less widespread, but corruption can create a serious drain on an economy. This doesn’t escape anyone who lives with corruption. You can create a great regulatory system, but corruption will defeat it. You can create a great tax system that favors healthy investment, but corruption will negate it. You can create an education system that allows your best and brightest to contribute their all, but nepotism will nullify it. Corruption works against you at every turn.

The least-noticed factor in the growth of the United States, at least if you are an American, is probably the close relationship we had with the United Kingdom.  We can see how our economic and political relationships helped Israel, Japan and the Asian Tigers.  We can also see how we helped Western Europe after World War II (although Western Europe had been wealthy and powerful for centuries before the war, making it difficult to say exactly how much the United States helped).  What we tend to ignore is how much British investment made America great.

I have heard that the American railways were built with Chinese labor and British money. I expect that was true. Britain has invested countless millions of dollars in the United States over a long period of time, in railroads and everything else. There is no possible way we could have developed economically the way we have without British money. The British invested in us for all of the reasons I have mentioned here, because the same things that made it more attractive for Americans to take risks and innovate made it attractive for the British to do the same.

Our shared language, our shared culture, and the shrinking distance between us (due to faster transportation) made our economy even more attractive to British investment.  Some of the vast flows of money that were directed to the United Kingdom were diverted to the United States and invested here.  I suspect that portion was fairly significant.  The U.S.A. was the land of opportunity for Europe’s poor people and even more so for its rich ones.

The combination of all these things gave the United States a growing, thriving economy.  Combined with everything else (its size, its relative isolation from Europe’s wars, the particulars of the moment in time, etc.), it had everything it needed to become the most influential nation the world had ever seen, at least for a few decades.

So, if you are running a country, here is my advice: educate your people, invest in infrastructure, create a reasonable environment for investment, fight corruption and cozy up to the big wealthy countries of the world. It certainly worked for us.