0:33 | Intro. [Recording date: February 13, 2018.] Russ Roberts: I'm both excited and slightly embarrassed to say that our topic for today is skin in the game. It's in a way our third episode on the topic. We did one episode on the paper you wrote with Constantine Sandis that had the title "The Skin In The Game Heuristic for Protection Against Tail Events." Then last August of 2017 we did have an EconTalk episode on the book that's coming out shortly. Nassim Nicholas Taleb: You mean on some aspects of the book. Russ Roberts: Some aspects of the book when it was in draft form. And today we're going to talk about a number of topics from the book that we didn't get to. Nassim Nicholas Taleb: Which are in fact central. Russ Roberts: Which in fact are central. I mean, I don't know how we had that other episode, but we managed it somehow; and I'm sure we're going to get into some other things, as well. But, our topics for today are rationality broadly defined; decision-making under uncertainty. And I think we're going to get to religion as well. Nassim Nicholas Taleb: And the notion of survival. Russ Roberts: And the notion of survival. Which is actually, the more I think about it, and the more I read your work, the more I think of it as being central to the lessons that you have to teach in terms of decision making under uncertainty and skin in the game. So, I'm going to start with two kinds of probability that you talk about. One is ensemble probability and the second one is time probability. Set this up with the example from the casino that you used in the book. Nassim Nicholas Taleb: So, most people have the illusion that you can compute probabilities--what we call the state space in finance--by looking, say, at the returns on the market, what people are making, returns on businesses. And then we would apply that to you. In fact, if you have the smallest probability of an absorbing barrier, then you're never going to be able to capture that market return or that ensemble return. Russ Roberts: But explain what an absorbing barrier is, first. Nassim Nicholas Taleb: Yeah. An absorbing barrier is a point that you reach beyond which you can't continue. You stop. So, for example, if you die, that's an absorbing barrier. So, most people don't realize, as Warren Buffett keeps saying: in order to make money, you must first survive. It's not an option. It's a condition. So, once you hit that point, you are done. You are finished. And that applies in the financial world of course to what we call ruin, financial ruin. But it can be any form of ruin. It can be ecological ruin; it can be personal ruin. It could be the death of a community. Whatever it is. So, let's isolate the point with the following thought experiment. You send a hundred people to a casino, and you don't know the return from the casino. It's just set up by some weird person, and you don't know if the person who set it up is giving you the edge or not. Okay? So, you send them--you give each of the people, him or her, an allowance, and ask them to gamble for an entire day. And then you compute the expected return of the casino per day by what comes back. So, if Person Number 27 goes bust, loses everything, would it affect Number 28? Russ Roberts: Not at all. Nassim Nicholas Taleb: Not at all. 
Okay, so you probably will have a certain number of people go bust in your sample, but you don't mind; you count that as zero and you compute the expected return; and you can figure out if it's a lunatic or a very smart person running the casino. You get the exact return per day of the casino for the [?] strategy. Now if, on the other hand, you found one person, say yourself: you go 100 days to the casino, with the same strategy, and on day number 27 you are bust, will there be a day 28? Russ Roberts: There will not. Nassim Nicholas Taleb: Exactly. So, that's the absorbing barrier. So, eventually, if you have an absorbing barrier, the question is not, you know, whether you are going to survive or not. The question is: when are you going to go bust? Because eventually you are going to go bust. So, your expected return, if you have a strategy that entails ruin, is--depending how you calculate it--that you are going to lose everything. The expected ruin, you can count it at negative infinity if you are using logs, or negative 100%, or whatever it is. So, any strategy that has ruin will eventually, if you extend time to infinity, have a -100% return. And that's not very well understood, because a lot of people engage in strategies that entail ruin, not realizing that eventually it's going to catch up to them. But, one thing that I learned when I was a trader, the very first lesson from all traders, was, 'Listen. Take all the risks you want. But make sure you're going to be here tomorrow.' The game is about being in the office tomorrow at 7 a.m., because you can always start early. And that was the game. You can take all the risks you want. And effectively, every single surviving person, they take [?]--all these people, all they worry about, is ruin. They don't worry about return, all this complicated stuff. In finance, two paradigms emerged. One, Markowitz, which is entirely academic, not even used by Markowitz himself, which is like computing complicated probabilities of what may happen with returns, [?], very complicated. And then the other one is a very simple one that focuses on two things: what you expect to make, adjusted every day, and survival. Make sure you don't go bust. So, almost all traders that survive use the latter. Okay? And every single academic who went to trade--and we counted, I think in 1998, how many academics went bust after LTCM [Long-Term Capital Management]--academics in finance, I mean, not in mathematics--and we noticed it was close to 100%. There's only one person who may have survived the 1998 collapse, when Long-Term Capital Management--effectively a short-term firm--went bust making bets on small probabilities. |
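The ensemble-versus-time distinction can be made concrete with a small simulation. A minimal sketch in Python, with invented numbers (a strategy with a positive expected one-day return but a 1% per-day chance of hitting the absorbing barrier); none of these figures come from the conversation:

```python
import random

random.seed(7)

P_RUIN = 0.01   # per-day probability of hitting the absorbing barrier (illustrative)
GAIN = 0.06     # return on a day with no ruin event (illustrative)
# Ensemble expectation for one day: 0.99 * 6% - 1% * 100% = about +4.9%.

def one_day(wealth):
    """One day of the strategy: a small chance of total ruin, otherwise a gain."""
    if random.random() < P_RUIN:
        return 0.0                      # absorbing barrier: ruin, game over
    return wealth * (1 + GAIN)

# Ensemble probability: 100,000 independent people each play a single day.
# One person's ruin does not affect the others, so the average captures the edge.
people = [one_day(1.0) for _ in range(100_000)]
print("ensemble average one-day return: %+.1f%%" % (100 * (sum(people) / len(people) - 1)))

# Time probability: ONE person plays the same strategy day after day.
# Survival over n days has probability (1 - P_RUIN) ** n, which goes to zero,
# so the time outcome is eventually -100% despite the positive ensemble return.
wealth, day = 1.0, 0
while wealth > 0:
    wealth = one_day(wealth)
    day += 1
print("single player ruined on day", day)
```

The ensemble average comes out around +5% per day, while the lone repeated player is ruined, typically within a few hundred days--the "Person 27 versus day 27" asymmetry in the casino example.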
7:50 | Russ Roberts: So, let me restate this a little bit. I think--in thinking about the casino, there's a presumption that the odds are in favor of the casino. You started out by saying we don't know how the casino owner is setting things up; but if you have a long-running casino like in Las Vegas today, the odds are slightly in favor of management. And so, one way to say what you just said is: You can't have a lifetime strategy of earning money by going to the casino. Nassim Nicholas Taleb: No, that's not what I'm saying. Actually, what I'm saying is even stronger. I am saying that even if you have the edge, in the presence of the probability of ruin, you will be ruined. Even if you had the edge. Russ Roberts: If you play long enough. Nassim Nicholas Taleb: If you play long enough. Unless you engage in strategies designed by traders and rediscovered by every single surviving trader, very similar to something called the Kelly criterion, which is to play with the house money. In other words, you start betting in a casino, and the strategy is as follows: You go with $100, whatever you want; and you bet $1. If you lose, your next bet is less than a dollar--you bet, say, 90 cents, or whatever; and if you make money, you start betting with the house money. And this is called playing with the market's money, or playing with the house money. So you increase your bet as you are making money, and you reduce your bet as you are losing money. And that strategy is practically the only one that allows you to gamble or engage in a risky strategy without ruin. Russ Roberts: It challenges the--in other words, think about it as an asymmetry there between wins and losses: something that one might think of as--I don't, but many people think of as--irrational. But you are saying it's not irrational; and more than that, often we as economists make fun of people who say, 'Well, I was way ahead and I took a big gamble because I wasn't using my own money. I was using the house money.' And economists look at that and laugh and we say, 'But it's your money. You could have walked away. You could have kept it.' And you are saying that it's actually rational to treat the money you win differently from the money you lose. Nassim Nicholas Taleb: Exactly. Behavioral economists have something called mental accounting, which labels exactly what you just said: they say that treating money according to its source is irrational--but these are one-period models. That's how they view the world, as a one-shot experiment. They don't view the world as repetition. A repetition of bets. So, if you look at the world as a repetition of bets, under the condition of survival, then mental accounting is not only not irrational but is necessary. Any other strategy would be effectively irrational. |
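A rough sketch of the "play with the house money" idea: bet a fraction of your current bankroll rather than a fixed amount, so stakes shrink after losses and grow after wins. The numbers below are illustrative assumptions (an even-money bet won 55% of the time), not figures from the conversation; the fractional rule is in the spirit of the Kelly criterion mentioned here.

```python
import random

random.seed(1)

P_WIN = 0.55     # probability of winning an even-money bet (assumed edge)
N_BETS = 1_000

def fixed_bets(bankroll, stake=10.0):
    """Bet the same absolute amount every time: ruin is possible."""
    for _ in range(N_BETS):
        if bankroll < stake:
            return 0.0                               # absorbing barrier
        bankroll += stake if random.random() < P_WIN else -stake
    return bankroll

def proportional_bets(bankroll, fraction=0.1):
    """Bet a fixed fraction of the CURRENT bankroll, so the stake falls as you
    lose and rises as you win. 0.1 happens to be the Kelly fraction (2p - 1)
    for an even-money bet with p = 0.55. The bankroll can never hit zero."""
    for _ in range(N_BETS):
        stake = fraction * bankroll
        bankroll += stake if random.random() < P_WIN else -stake
    return bankroll

ruined_fixed = sum(fixed_bets(100.0) == 0.0 for _ in range(1_000))
ruined_prop = sum(proportional_bets(100.0) < 1e-6 for _ in range(1_000))
print("fixed-stake gamblers ruined:        %d out of 1000" % ruined_fixed)
print("proportional-stake gamblers ruined: %d out of 1000" % ruined_prop)
```

Even with the edge, a meaningful share of the fixed-stake gamblers go bust, while the proportional bettors effectively never do--the asymmetric treatment of wins and losses is what removes the ruin.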
10:46 | Russ Roberts: So, I'm going to read a long quote from the paper, which I think sums this up really well; and it's shockingly provocative. Especially when we think about what is going through people's heads when they are sitting in an experiment that we are trying to generalize from. This is what you say: The flaw in psychology papers is to believe that the subject doesn't take any other tail risks anywhere outside the experiment, and, crucially, will never take any risk at all. The idea in social science of "loss aversion" has not been thought through properly--it is not measurable the way it has been measured (if it is at all measurable). Say you ask a subject how much he would pay to insure a 1 percent probability of losing $100. You are trying to figure out how much he is "overpaying" for "risk aversion" or something even more foolish, "loss aversion." But you cannot possibly ignore all the other financial risks he is taking: if he has a car parked outside that can be scratched, if he has a financial portfolio that can lose money, if he has a bakery that may risk a fine, if he has a child in college that may cost unexpectedly more, if he can be laid off, if he may be unexpectedly ill in the future. All these risks add up, and the attitude of the subject reflects them all. Ruin is indivisible and invariant to the source of randomness that may cause it. So that's, I think, a very deep insight into how careful we have to be in interpreting what seems to be a very clean experiment: your willingness to pay for insurance, say, of a particular event. Nassim Nicholas Taleb: Okay. Let me sample it[?] by my methods. The way you approach a problem, say an economic theory: you wonder if it changes if you make things dynamic, not static--you see, in other words, it's not a one-shot experiment but many, many experiments. Or many repetitions of the same risk. And the second one is what I call perturbation: you perturbate. In other words, you just assume that you may have the wrong model here and there. And so these two tricks effectively dispose of much of the results of the hero[?] economics. Not the psychology. The psychological experiments are fine. But the hero[?] economics are trying to derive the rules from simplified sets. And let me also add another dimension that people miss. Say I ask an economist, or a person who studied economics but not well enough: 'What's the riskiest scenario?' He or she would answer, 'Well, my death?' And then I would say, 'Well, do you have family? Can something be worse than just your death?' And effectively they say, 'Oh, yeah, yeah, my death plus the deaths of my parents and children, and cousins, and pets and so on.' I say, and you continue, 'How about the ruin of your tribe?' They say, 'That's worse--that's a set that's worse than the previous one.' And so on, till you hit the environment, and the earth. And then, what you notice is that effectively, intuitively--when they don't, you know, repeat what they've learned at school--they will consider a risk based on both repetition and the life expectancy that is reduced by taking that risk. So, for example, if I cross a street, I am not--I am of course reducing my life expectancy, maybe by a second, or not even--by a nanosecond. Russ Roberts: The expected value. Nassim Nicholas Taleb: Exactly. The life expectancy, I am reducing it. But if I am taking a risk for something higher than me, namely a tribe, the tribe is supposed to survive longer than me. 
And of course humanity is supposed to have an extra few billion years, so you are reducing from that--the value of that. And of course, when you talk about the ecosystem, you'd like it to be permanent, or whatever you can call permanent--billions of years. And you are reducing that, I think, by some actions. So, the ranking of risks based on the lifetime--the life expectancy--that you are reducing is something that has not been in the literature. So, when we did our precautionary principle--and I had a talk with you about that--our point was that humanity is supposed to survive forever; so if you take these small pieces of risk that threaten, okay, humanity, or threaten something we call total human extinction or extinction risk, then you are gambling with something much more dangerous. And there is a pyramid of ruin risks. My ruin is not a big deal. I would just--I think listening to your podcasts extends my life, so maybe I live another 50 years. And 50 years is--yeah, I reduce my life expectancy by a little bit. It's not a big deal. But if I reduce the life expectancy of something that should survive an extra billion years, that's a big, big, big cost. And effectively, you can phrase that in terms of cost/benefit along these lines and obtain results that are vastly different from what is believed by the so-called risk community. |
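The repetition arithmetic behind this ranking of risks is easy to state. A minimal sketch with placeholder numbers (the per-exposure probability and the counts are invented for illustration):

```python
# If each exposure carries an independent ruin probability p, the chance of
# surviving n exposures is (1 - p) ** n, which tends to zero for any fixed p > 0.
def survival(p, n):
    return (1 - p) ** n

# An individual's repetitions are bounded by one lifetime:
print(survival(p=1e-6, n=100_000))       # ~0.905: tiny risk, bounded repetition

# Something meant to persist (a tribe, humanity, the ecosystem) faces an
# effectively unbounded number of repetitions, so the same p is eventually fatal:
print(survival(p=1e-6, n=100_000_000))   # ~4e-44: same risk, unbounded repetition
```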
16:00 | Russ Roberts: You criticize your critics. When you talk about the precautionary principle, they respond, 'But you do cross the street.' So, even though the expected loss is very small because the odds of being struck by a car are very small, you do cross the street. You do take some risk of ruin. You don't just stay home in your bed. And, what's your response to that? Nassim Nicholas Taleb: My response is: the way to treat these risks is, how many times over my life will I cross the street? Okay, several thousand times. Crossing the street reduces your life expectancy by 1 in 47,000 years. It's not a big deal. So crossing the street basically is close to zero risk for me, because my life expectancy is not infinite. But if you made humanity cross the street, that would be a problem, because it would reduce life expectancy commensurably. So, the problem with these analyses that people throw around is that they ignore the value, the life expectancy, of whatever you are threatening. Russ Roberts: So, give the-- Nassim Nicholas Taleb: And repetition. But let me give you one simple example of how they miss repetition. The way they treat it--and I say it in a chapter on rationality and on risk--survival is what matters first. Okay? So, people have developed the instinct of paranoia. Basically, for us to have survived as a species--humans, however you define it, whatever species we were--we have to have had some paranoia; otherwise we wouldn't be here, being here after millions of years. So, people developed good reasoning. So, if you ask a psychologist--if you narrow the experiment the way they do it, and you say, 'Okay, why shouldn't I smoke a cigarette?'--in a one-shot experiment, it makes a lot of sense. The risk is tiny and the pleasure is good. So I should smoke a cigarette. But, your grandmother would say, 'I've never seen someone smoke a cigarette and enjoy it and not smoke another one.' So your grandmother will think in dynamic terms. Because that's how we think. We think in dynamic terms. You see? Paranoia, locally, for example: If someone points out the risk of some, whatever, terrorism or something, apparently we are overestimating. But people don't understand that if you eliminate paranoia, you've eliminated eventually the human race. You have to have that paranoia for anything that entails, you know, massive tail risk. And that's the only way to do it. But I see it, you know, in reduced form in trading. You see traders that basically are paranoid about anything that will bankrupt them. But they don't care about variation. They bet, they speculate, so long as they know they are not going to be, you know, wiped out. Yeah. It worked out. So that's the idea of separating these risks from the risk of being wiped out--and what are you wiping out? Are you wiping out a community, or are you wiping out something? And, in the process, talking to my co-author Sandis--who, you know, does philosophy, philosophy of action, and he does ethics--we encountered a paradox that remained unsolved, as follows. Aristotle, in his Nicomachean Ethics, has various statements that present courage as the highest virtue, and at the same time prudence as the highest virtue. Now, also, there is a belief among the ancients that you should have all virtues or you have none. So, in other words, if you have one virtue, you should have all the others. Okay? 
And also there's another belief that one virtue equates to all the others. There's an equivalence. So, whatever. So, it looks like a paradox: How could you value both courage--you know, i.e., risk-taking--and prudence, which is the avoidance of some classes of risk? Well, it turns out courage is prudence. Because if I save a collection of children from drowning, effectively I have reduced my life expectancy. But I have increased theirs. Which is longer. And the children are more numerous. So we understand that if you take risk for the collective, you are courageous for yourself, but prudent for the collective. So, that is how we solved that paradox, Constantine and I. And we are probably going to publish something if we get to it; but for now, we are sort of confident that we have solved that paradox, which wasn't seen that way before. But if you start doing the things we are talking about--dynamic, in other words things are repeated; and layering, in other words, some things have higher life expectancy than others--then you can solve a lot of paradoxes. And a lot of the things that appear to be biases in the literature--not the economics, the behavioral economics literature--are not really--well, they are biases, maybe? But they are not bad biases. They are necessary biases. You necessarily have to have that paranoia about survival, particularly of something much higher than you. |
21:41 | Russ Roberts: So, one example you use in the book which I think brings this home is the smoking example that you just mentioned. Let's just structure it a slightly different way. A hundred people smoking one cigarette a day might be relatively harmless. One person smoking a hundred cigarettes a day is not so good. And you can't--as you point out often in the book, scaling is tricky. You can't just say, 'Well, if 100 people smoking one cigarette is not so bad, then one person smoking a hundred is the same thing.' But they are not the same thing. Nassim Nicholas Taleb: They are not the same. They are not the same. This is in Antifragile, and people are starting to get it now, 5 years, 6 years later. And, as I say, if a hundred people jump 1 meter it's not the same risk as one person jumping 100 meters. Because you have acceleration. And you have accumulation. And these things are, in fact, well understood by us, psychologically. We are excellent risk managers when we are left on our own. And it's not some psychologist who just read a few books and knows maybe some mathematics who is going to make us look irrational. And try to nudge us into some different behavior. The point is, we have survived so much. We have a huge track record. And any statistician would say that something with such a track record has to have some evidence of skill in surviving. Russ Roberts: I have to confess that when I worked at a racetrack in Monmouth, New Jersey, for a summer, my grandmother did tell me not to place any bets. She was a wise woman. I, of course, was a--I thought I was a wiser 18-year-old. And since I had promised I wouldn't make any bets, I did keep that promise--sort of. I would occasionally--well, actually once a day, I would split a bet with a woman who worked in the kitchen. I was the ice man. And it turned out, we did okay. We didn't go on to that second bet. But I think that's what she was worried about. And correctly so. She was worried about me losing my summer money, my summer's earnings, through addictive behavior. And I think it's a very interesting challenge to think about life as one-shot deals versus longer-term dynamics. You know, one more cookie is always harmless. But, when it's 10 because you had 9 before, it's not so harmless. So it's hard to keep that in mind. It's a good thing to think about. Nassim Nicholas Taleb: Yeah. Thanks. Let me make a confession about gambling. I've been a--I've traded for so many years. And I have such an allergy to gambling. I've never gambled. Every year I go to Las Vegas for a seminar or a conference where you drink, you eat, you do a lot of things. But the gambling table--I can't even concentrate at the table. I mean, I try to watch a game and I can't concentrate--there is something about it that is so contrived that you really have to have a certain mindset to gamble, and it's not that of a trader. A trader doesn't like constrained rules. You see? And I know very few traders who gamble. Some of them play bridge. Some play poker--a slightly different dynamic. But gambling is not something that attracts a trader. Plus there is something horrifying, somewhere, about entering a trade knowing you are losing. You see? Russ Roberts: Well, I think there's an opportunity here. Someone out there listening should fund or create the documentary: Nassim Taleb at the Mirage. They would follow you through the casino; we would allow you to expound on the things you are talking about in the first few minutes of this conversation. 
I see it as sort of a stop action, Claymation kind of thing. I think it would be awesome. |
25:28 | Russ Roberts: I want to ask you a question about--well, first, let's talk about religion. Now, a lot of people--it's very fashionable--that's a disrespectful word. I'm going to rephrase that. A lot of smart people are very critical of religion these days. And, one of the things that you hear is that religion is irrational: There's no evidence for it; it's a superstition that was comforting to people before we had the Enlightenment. And you argue in the book that religion--that's not the right way to think about the rationality of religion. And the fact that certain religions have survived for a long time shows that they are "rational." And your definition of rationality in that context is the same as you've been using in this gambling context, which is: It leads to survival. It promotes survival. So, talk about religion. Nassim Nicholas Taleb: Yeah. The comment I would make is that it's not the religion that survives. It's the people who have it that survive. So, whatever beliefs these people have that allow them to survive cannot be discounted by looking at their cosmetic expression. So, let me--so, religion. A few things that I talk about: Let's make sure that we don't equate all religions, because some religions are religions[?] and other religions are not. Some are more literal; others are more, let's say, semi-literal or definitely metaphorical. But, one thing about belief, okay? And [?] support of rationale. This came to me from meeting, finally, Ken Binmore, who really probably did more fundamental work, foundational work, on rationality than anyone else. And Ken Binmore effectively says that all these attacks on economics, on economic decision-making--you know, arguing about irrationality--hinge on how you define irrationality. You see? For example, conventional economics doesn't define the economic gains as accounting gains. That's just a vocabulary. There are other things. So, if you, for example, give your money to the poor, there's nothing irrational about it. You see? The only restriction is on incoherence. So, I thought about what he was saying and how people define rationality, and that went back to how people express what they call rational. And what I noticed is that it is usually an ex ante, hence non-empirical, definition of rationality. Ex ante means that I define an action as being irrational beforehand. You know: it means that you know everything that is going to go on around that action. In other words, that your model represents the world. And we've known since Simon [Herbert Simon], since bounded rationality, that effectively you will never be able to build a model that can understand the world. So, when I say an action is irrational ex ante, beforehand, I'd better have a track record of that action, because we need to see--maybe there are things that are not included in that model. So, for example, if I say that it is irrational to prefer A to B, B to C, but then C to A, I'd better have a good model showing that this holds in the real world. That's called the transitivity condition. And I have argued in Antifragile that if you expand the model: for an individual it may make sense to be coherent, but collectively we cannot all operate on what's coherent for the individual, because you deplete resources--for example, if you always prefer tuna to steak, you would deplete the tuna supply. And so, therefore, you need to cycle. And nature makes you randomly change preferences. And that's a great way for things to survive. 
So, for example, these are the modifications to the narrowly defined--what I call baby models--that you encounter in behavioral economics, and then in decision-making, and all these so-called decision sciences, what I call decision pseudo-sciences, which find that we are irrational[?]. Or, for example, in intertemporal preferences: if someone offers you an apple today versus two apples tomorrow--well, in an ecological framework, you may say, 'Well, what if he's a person who is full of baloney? Okay, I'll take the apple now. I'm not taking it now because I prefer to eat an apple now. I'm taking it now because he may disappear'-- Russ Roberts: He may not come back tomorrow. Nassim Nicholas Taleb: He may not come back. He may die tomorrow. If you include these things in the models, then a lot of this hyperbolic discounting--well, all of these behaviors--become much more coherent. So, let me say something now about religion. So, if I judge religion without its track record, I'm going to get into a lot of theoretical--I'm not just saying empirical--theoretical mistakes; because if you think of what would have happened if we didn't have these religions, I think a lot of people wouldn't have made the right decisions. And so religion allows you, sort of intergenerationally, to convey some kind of behavior. Okay? Now, if you have to give a story with the religion to justify that behavior, well, that's it. Who cares? And the example I use in the beginning is that even our perception has distortions: when I look at Greek columns, you see, there is a distortion built in, for instance. Religion may be a distorted view, or way for us to view the world, that has allowed us to survive. So, I give a lot of examples of how to judge religion--you should judge it ex post, not ex ante. And I take, for example, something that seems, for non-religious Jews, you know, not rational, which is to have 500-and-some dietary laws and two sinks in your kitchen. Now, when you think about it, it's the wrong way to just judge that on the basis of rationality. The way you've got to see it is as follows. What if Jews didn't have these dietary laws? What would have happened? Well, you know that those who eat together stick together. Without the dietary laws, they would have been more dispersed, therefore much more vulnerable. And so they owe their survival to their dietary laws. So-- Russ Roberts: If you take that view--a variant on that is that eating pork or shellfish is bad for your health in times when there's not good refrigeration, etc.--if you take that anthropological perspective, rather than, say, a holy or divine one, then there's no case for people to keep kosher today. If that's your view, right? Nassim Nicholas Taleb: No, I really don't--we don't quite--we don't fully understand the world. And a rule that has survived a long time may have a reason behind it that we haven't detected yet. You see? The idea that, you'd say, not eating shrimp is because they are impure--well, it may be because they are impure, or it may be because it's good to have dietary laws. Maybe it disciplines you elsewhere. I don't buy the idea that pork being insalubrious is the reason the Semitic religions, except for Christianity, refuse pork. The reason, to me, is probably deeper, because the Greeks, also living in the same environment, and the Cypriots and the Egyptians initially didn't have these dietary laws. The North Africans also didn't have these dietary laws; those came to them later. 
So, I don't believe that we should give a lot of reasons for these--that we should go back and point to one effect as necessarily the reason. It's a possible one, but you can never test it. We know that these religions have helped in survival, and whatever is related to survival is essential, because there's a path dependence. To do science, you must first survive. |
34:10 | Russ Roberts: So, I want to--do you want to say anything else about religion? Because I want to switch gears in a minute. But, do you want to say anything else? Listeners know that I keep Jewish law; there are certainly parts of Jewish law that are not easy to accept or to view as rational. But, as you say, I take the whole picture. I do not choose one by one. And the outcome for me has been very good. I don't mean this sense of very good in that I'm rich or I'm healthy or whatever. I find the practice of my religion deeply satisfying, and I'm Talebian enough to say I can't then take one plank out of the boat and say, 'Well, this one doesn't make sense.' I accept the whole thing, with all of its flaws, and the outcome, ex post, is good for me. Nassim Nicholas Taleb: Yeah; you'll notice one thing: that religions come as a package and you can't pick and select. It's not like political parties: you can be on the Left with respect to abortion but on the Right with respect to economics. It doesn't come that way. Religion comes as a single block. You take it all or leave it all. Russ Roberts: Yeah--you're either in the club or--there are different clubs with different rules, so you can choose to that extent. Nassim Nicholas Taleb: Yeah, of course, of course. Even then--but, one thing that's misunderstood, as I started looking--I've been looking for 20-some years, I've been looking at religion, mostly because I'm interested in Semitic languages and Semitic beliefs, not so much initially in theology. And I've noticed that, you know, people are very confused about what they call religion. And the following tries to explain the main difference between atheism and secularism--why we should focus on secularism, not atheism. So, religion--you notice that for the Jews it was initially law. It was a legal system. But it was tribal initially; and then later on, of course, it expanded. For the Arabs, for the Moslems, religion was law. And actually, the word 'din' in Arabic is 'law.' If you use Aramaic, you use 'nomus[?]' for 'law,' not 'din'--religion. But 'din' in Arabic means law. And a 'medina' is the place where law prevails. And actually the name of the State of Israel is Medinat Yisrael, and the Arabic word for city is 'medina.' So you realize that 'din' means law. And the 'beit din' is the courthouse, where you are taking the law. And that's basically it. It's a unified body of law that is prescribed to you. That's religion. That was religion. Now came Christianity. Christianity, fundamentally, is a secular religion. Because Christ himself, you know, didn't really want any earthly rule. It was his 'Give Caesar what belongs to Caesar.' So, it's not like he was taking over from Caesar, okay? So, of course, Christianity evolved here and there into theocracies. But it could not fundamentally accommodate the notion of being a theocracy. Why? Very simply, because of how it developed. It was absorbed by the Roman Empire. It developed within a system in which the law was Roman law. And I know the subject quite well, as I was interested in the school of law in Beirut, which was effectively where the law was made. And then you can see how the documents of the pagan scholars became--when Theodosius made his code, the main code for the Byzantine, for the Roman Empire later on, the Theodosian Code, all he took was pagan Roman law, and he added a blessing, a couple of pages of blessing, at the beginning. So, it was not Sharia. 
And you know this: so Christianity had the separation between church and state from the beginning. And that separation is what a lot of the modern world, and the secular approach, developed from. And, the second thing I mention in the book is that when you look at the behavior of people, you should not look at what people say, but at how they behave. You notice that--and the chapter is called "Is the Pope Atheist?"--if you look at the behavior of anyone within the main branches of Christianity--of course you've got to exclude the fringe ones like Scientology and all of that--you will notice that when facing big decisions they act the same way as an atheist. For example, the Pope and Richard Dawkins would go to the same hospital to get the same treatment. The difference is they would wrap it up differently. And then you notice, also, how atheists go to a concert where they are silent and meditative, and Christians, you know, go to Mass where they are also doing the same thing. As a matter of fact, they are sometimes listening to the same music. So, the idea is--the entire concept of Skin in the Game is: Look at what people do, not what people say. So, that's what I have to say about religion. |
39:33 | Russ Roberts: But, I didn't understand that parallel precisely. I didn't understand your takeaway. I understand my takeaway. My takeaway comes from David Foster Wallace, who says everyone worships. We all have an urge to be part of something bigger than ourselves, and some people express that through their religion; some people express that through a concert; some people express it through a sports team; some people express it through a political party, a political movement. And that sense of belonging, that tribal sense of belonging, is a very powerful part of who we are. So, when I think about your point about--I'm going to expand on what you said in the book rather than what you said just now--when the Pope goes to the hospital, there are a lot of well-wishers and prayers and believers who hope to get some kind of divine response. But he also goes to the hospital. He doesn't just rely on the prayer. And similarly, when Richard Dawkins goes to the hospital, he also goes to the hospital, first. He also has well-wishers. They don't think they are bringing divine intervention, but they are hoping that he turns out all right. And there's some sort of a community response among people who like his work, just like there are people who like the Pope. What's the point of those parallels, and what does it have to do with skin in the game? Nassim Nicholas Taleb: Okay. It has to do with the following: the whole idea of skin in the game, as I outline in the Prologue, is, I don't really care what people think. I care about what they do. It's about action, not what comes behind as ornaments. Thought as ornament. I consider thought as just your background furniture. And that may lead you to certain actions. And that's skin in the game. Skin in the game is there to establish that difference; and the problems, a lot of the pathologies that we have in the modern world, come from the fact that we forget that almost everything that was developed came from skin in the game, not from thinking. Sometimes you find thinking as justification. Like, we didn't develop the steam engine by looking at previous work--the Greeks had a model of one--no, it came from developing it with our own hands. So in other words we live in a world that is very easy to capture by doing but not easy to capture by thinking. And thinking, to me, is--of course, I put it in its proper context. Russ Roberts: But are you saying that the Pope talks like a religious person but he acts like an atheist because he doesn't just rely on prayer: he actually goes to a doctor? Nassim Nicholas Taleb: Exactly. It's about people doing things, how they would act in circumstances; and I notice that there is no difference between a secular Christian--a secular Christian being someone who is a Catholic or an Orthodox person--and an atheist when facing some action. So, therefore, I don't see the point in atheism, because of that. Russ Roberts: I don't see that. Explain. Nassim Nicholas Taleb: Okay. In other words, let's not focus on what people think. Focus on what they do. And if you judge people--if a Martian observed the behavior of atheists and secular Christians, they would observe the same behavior on the things that matter. Russ Roberts: And so you are saying--are you suggesting that the Pope is a hypocrite for going to the hospital? Nassim Nicholas Taleb: No, not at all. That we have-- Russ Roberts: And are the atheists hypocrites for going to the concert, because they are also religious? 
Nassim Nicholas Taleb: No, the difference is that the idea of atheism assumes that religion is literal; the criticism of religion by atheists, and the promotion of atheism, assumes that it's the thought that matters, not the behavior. And the behavior of Christians is pretty much the one that atheists like. Russ Roberts: Oh, I see. Okay. Nassim Nicholas Taleb: That was my point. Russ Roberts: I get it. Nassim Nicholas Taleb: But there's another thing about religion that I'm going to say here: that religion historically was about skin in the game. So, the gods do not like cheap talk. They like you to do something. So, you had to offer sacrifices. And it was a great model in the past, because it forced you into sacrifices. And there is something from that that stays with us, in behavior: that talk is cheap. Another thing I'm going to say about religion: I thought for a long time about why the Christian religion insisted on Christ being both man and God. And the fact is, he had skin in the game by being man. And people respect those with skin in the game. Had he been God, he wouldn't have suffered. And I noticed that a lot of people who have scars effectively are exhibiting their skin in the game. They are not empty bureaucrats or something like what I call an empty suit in the book. And you would be an empty suit if you were never harmed by anything. And I observed how Trump owed most of his appeal, all during the Republican primary, to the fact that when he was standing next to the other people, he looked real. Because he lost money. His adversaries were saying, 'He lost so much money.' It made him real. It's much better than someone who lives in cyberspace just writing memos, you see. And the American public understood that, something that the intellectuals didn't get: that America is not about talking. It is about doing. And losing money is evidence that you are in a doing business, not in a talking business. Russ Roberts: Yeah, well the claim is, in Silicon Valley--I don't know if it's still true but it used to be the claim--that if you'd gone bankrupt, if you'd had a startup that failed--or even better, a couple--that maybe it was easier to raise money, because then you'd at least shown that you had those scars. Carrying those around with you, you'd learned something. And in theory now, you could go off and be successful. Of course, it doesn't necessarily follow. Nassim Nicholas Taleb: But that's--warriors try to show off their scars. And these scars, visibly, are a sign of competence: 'Oh, look. He has a scar. He's a good warrior.' The person who escaped. Scars mean that you are in business. And that creates an appeal. So, the sufferings of Christ are part of that. So, I have these things on theology that are sort of counterintuitive. But that allowed me to engage in a few discussions with people into these things, into theology. Russ Roberts: I like the line you have in the book from the Spartan mother, the mother in Sparta, who says to her son, 'Come back with your shield or on it.' I thought that's an incredibly powerful way to think about skin in the game. Right? As you said, if you run away, you can run faster without your shield. Your mom doesn't want you showing up at home without your shield. Nassim Nicholas Taleb: Yeah, and society has put a huge premium on individual courage. And by courage I mean 
not courage to gamble in a casino or throw yourself off a cliff, but courage in order to help others, in battle or protecting something larger than you. |
46:57 | Russ Roberts: I want to talk about the modern challenge that skin in the game faces that I don't think we've talked about before, which came to me as I was reading the book, now, for the second time. You give the example of Hammurabi's Code, where, if the house that a builder built collapses, the builder is put to death, I think. Is that right? Nassim Nicholas Taleb: Yes. It prevents a builder from hiding risks in the foundations. Russ Roberts: Right. Because the builder knows more than the buyer. There's asymmetry of knowledge. And so, to prevent the builder from taking advantage of that, cutting corners and making a flawed building--and, if I remember correctly--maybe I'm wrong, but it's the building collapsing and killing somebody. Nassim Nicholas Taleb: Yeah, exactly. And also, there's a symmetry: if it kills the first-born son of the owner, the first-born son of the architect is put to death. Russ Roberts: So, in our modern world, I would argue, we've moved increasingly away from skin in the game. The welfare state is an example of it. The corporate bailouts that we have are examples of it--we don't like--a lot of us are uncomfortable with this idea of skin in the game. And, given how appealing it is to you, and somewhat to me, I'm thinking, 'Well, so why is that?' And one answer is, of course, that buildings don't just collapse because you cut corners. They collapse because of bad luck, a hurricane--a lot of things happen outside the control of the architect or the builder. The idea of executing him for something that isn't his fault doesn't sit so well with us. So, we love do-overs. We love giving people a second chance. We love extra-credit homework to bring your grade up. And all these things. And of course, this encourages people to act imprudently. It has all kinds of costs. But the other side is also somewhat unpleasant to people. Nassim Nicholas Taleb: No, but I mean, take medicine--and I think in the second chapter, in the course of the second chapter, I discuss the case of medicine. It's not that, if a doctor amputates the wrong leg, you have to amputate one of the doctor's legs--because we have started looking at things statistically. We look at medical performance by doctors, or risk caused by doctors, statistically. I mean, if you do it once, that's fine; if you do it twice, maybe; a third time, you're going to be in trouble. You see? So, the idea of an architect killing one person--that may definitely need to be penalized. But, let's go back to the central idea that you have detected, that very few economists have detected--and again, in the literature of economics, we can only find two or three papers on the subject. And it is as follows. Most of economics is perceived to be incentives and disincentives. So, skin in the game would be to incentivize people if they do well, and also to disincentivize them. That's not it. No. Skin in the game for me is about filtering. It's evolution. You cannot have evolution if you don't have skin in the game. In other words, you are filtering people out of the system. And I give the example of bad drivers. Now, why is it that when I drive on a highway, I don't really encounter people who, you know, go berserk and drive crazily and kill 30 people? Why doesn't it happen? Well, it doesn't happen because bad drivers kill themselves. 
Partly because they kill themselves, and partly because, okay, we catch them--we filter them out of the system by taking away their driver's license. And we're good at doing that, for those who have survived. So, this is filtering. Filtering is necessary for the functioning of nature. Necessary for the functioning of anything. And that's called evolution. Now, restaurants: if you allow bad restaurants to survive, soon, you know, you'd be eating cafeteria food, in immortal cafeterias--because basically university cafeterias are immortal, you know, relaxed, state-like institutions. Whereas where you have that pressure, you have great food. I get my squid ink in places because they are mortal. So, that's filtering. So now, on that point of skin in the game--you pointed out a paper to me, and I found another couple, in economics--the point is that if you put in evolutionary filters, you get the same aggregate behavior even from individuals you would call irrational. Would you like to comment on that? Russ Roberts: Yeah. So, I want to back up a little bit. Because, when we did this interview last August on this, a related topic came up. And you said, 'Skin in the game is a disincentive.' And I said, 'Yeah, it's not just you get rewarded if you do well, but you get punished if you do badly.' And I totally misunderstood your point. Your point is that you don't have to be "rational," as an individual. The normal idea of skin in the game--so let me try to re-state it the way I think of it as an economist. The normal idea of skin in the game is what economists call incentives. So, if I know I can get rich, I'm going to try really hard. If I know I can lose all my money, I'm going to be cautious. And your point is, even if you're not aware of those incentives, even if you ignore the incentives, people who are wise and make good investments are going to be around, because they don't hit that absorbing barrier. And people who make bad investments are going to be wiped out, be taken out of the pool. And that is a very different level of rationality. You might call it meta-rationality, or systemic, or-- Nassim Nicholas Taleb: or people like them [?], collective-- Russ Roberts: Collective rationality. Or systemic rationality-- Nassim Nicholas Taleb: One comment here, one footnote on rationality, before you continue: the other problem that I address in my chapter on the minority rule, or collective behavior versus individual behavior, is that you can easily have what you define as irrational people, okay?-- Russ Roberts: Rational or irrational? Nassim Nicholas Taleb: Irrational people. You can define irrationality however you want. And the collective may behave in a way you would define as rational. So, collective behavior doesn't flow from a naive, you know, arithmetic sum of individual behavior, because of the asymmetries built into it. Russ Roberts: That's Vernon Smith's point, right? Vernon Smith, who got the-- Nassim Nicholas Taleb: Yeah, yeah, of course, of course, of course. Vernon Smith, yes-- Russ Roberts: Vernon Smith, who got the Nobel Prize at the same time as [Daniel] Kahneman. Kahneman was saying, people do all these irrational things. And Vernon Smith's point was: Sure they do. But the market, partly through this filter of what we might call profit and loss, or survival and thriving, is going to be rational, because it's going to punish people--even if they are not paying attention. If they are not paying attention, they are going to be punished. 
It doesn't matter whether they notice it. Nassim Nicholas Taleb: Actually, yeah. I have another argument in the book, which is that the market is not driven by the arithmetic sum of participants but by the most motivated buyers[?]--the minority rule, which we discussed last time. And if you look at it based on the minority rule, then you realize that you can't really study the behavior of individuals and gain any inference about the behavior of the market. So, that's one thing about rationality. And I've seen it even beyond the market: when you talk about humanity, even if humans collectively--each one--make the mistake of, say, for example, having intransitive preferences--you prefer apples to oranges, oranges to pears, but pears to apples, whether sequentially or immediately--it doesn't mean that the whole world collectively will have the same intransitivity; these things wash out so beautifully in aggregation. But that's quite central beyond markets, because when we look at society versus the individual, when we look at self-interest[?] versus other, more collective forms of preference--I mean behavior--you have, you have--and mathematically you can see that very clearly, if you do the mathematics. That's the problem: all these conclusions[?] drawn by economists make no sense when you look at the collective versus the individual. |
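Taleb's distinction between skin in the game as incentive and skin in the game as filter can be sketched with a toy simulation: agents get random, fixed risk appetites, never learn and never respond to anything, yet the surviving population looks prudent because the reckless hit the absorbing barrier. All parameters are invented for illustration.

```python
import random

random.seed(11)

N_AGENTS, N_PERIODS = 10_000, 200

# Each agent is born with a random, fixed risk appetite: the fraction of wealth
# staked on a gamble each period. Nobody optimizes or reacts to incentives.
agents = [{"risk": random.random(), "wealth": 1.0, "alive": True}
          for _ in range(N_AGENTS)]

for _ in range(N_PERIODS):
    for a in agents:
        if not a["alive"]:
            continue
        stake = a["risk"] * a["wealth"]
        # A gamble with positive expectation but a fat left tail:
        # 5% of the time the whole stake is lost, otherwise it gains 10%.
        if random.random() < 0.05:
            a["wealth"] -= stake
        else:
            a["wealth"] += 0.1 * stake
        if a["wealth"] < 0.01:           # absorbing barrier: filtered out
            a["alive"] = False

survivors = [a for a in agents if a["alive"]]
print("average risk appetite at birth:     %.2f" %
      (sum(a["risk"] for a in agents) / len(agents)))
print("average risk appetite of survivors: %.2f" %
      (sum(a["risk"] for a in survivors) / len(survivors)))
```

No individual ever "responds" to the disincentive; the population statistics change because ruin removes the over-exposed, which is the filtering, evolutionary reading of skin in the game.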
55:41 | Russ Roberts: And the paper I sent you that you referenced is a paper Gary Becker wrote a long time ago--I think in 1962? I can't remember the year. And Gary's gone, alas. But, he wrote a paper that I thought was kind of a silly paper as a graduate student. Not that I think anything of Gary Becker's is silly. But I never understood it. Which was: Even if people don't make rational--even if people aren't utility-maximizers--when prices change, when prices go up, they are more likely to buy less of something simply because the domain from which they can choose has gotten smaller. And the example he gives--let's just assume people choose randomly. There's no rationality. They are not maximizing anything. And they just choose randomly. And he shows that if people just choose randomly they are more likely to choose less of something when its price goes up and more of something when its price goes down. And he used that as a justification for demand curves. Despite the fact that you might not find utility maximization very palatable. And, that's part of what you are saying. You are saying individuals could be erratic, but the system is going to purge people who make bad decisions, and enhance the survival of people who happen to make, perhaps by random choice, good decisions. Nassim Nicholas Taleb: Yeah--so, there is something on zero-intelligence agents. And what Vernon called it--it was like a big revelation at the time, a wonderful idea. It's as follows. We could have zero-intelligence players, and a very intelligent market. Russ Roberts: Yeah. It's crazy. |
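In the spirit of the Becker 1962 paper Roberts describes ("Irrational Behavior and Economic Theory") and of the zero-intelligence-players idea: the agents below pick a point on the budget line completely at random, with no maximization, and average demand for a good still falls when its price rises, because the budget set itself shrinks in that direction. The parameters are arbitrary illustrations.

```python
import random

random.seed(3)

INCOME = 100.0

def random_bundle(price_x, price_y=1.0):
    """Spend a uniformly random share of income on good x and the rest on y:
    a zero-intelligence chooser on the budget line."""
    share = random.random()
    return share * INCOME / price_x, (1 - share) * INCOME / price_y

def average_demand_x(price_x, n=50_000):
    return sum(random_bundle(price_x)[0] for _ in range(n)) / n

for p in (1.0, 2.0, 4.0):
    print("price of x = %.0f -> average quantity of x demanded: %.1f"
          % (p, average_demand_x(p)))
```

The demand curve slopes down (roughly 50, 25, and 12.5 units here) even though no individual is maximizing anything: zero-intelligence players, a well-behaved aggregate.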
57:19 | Russ Roberts: Let's talk for a minute about inequality. You make the point in the book, which reminds me a little bit of the points we talked about earlier, about probability over time versus at a point in time. You argue in the book that the way people are looking at inequality is wrong: that they should look at lifetime incomes. And, if they do that, they'll see that people move in and out of different classes of income over the course of their lifetime. And that therefore there's no such thing as "THE Rich" or "THE Middle Class." Is that an accurate way to describe what you are saying? Nassim Nicholas Taleb: Yes. Exactly. I mean, the problem is that the measures of inequality we have are in effect ensemble measures of inequality. In other words, you take all Americans and look at how much the winners control, and so on. So, you say, the top 1% has 50% of the wealth, and things like that; let's have a revolution. And let's tax them. But, what people don't get--and I show the statistics--is that something like 10% of Americans will spend a year in the top 1%, and about half of Americans will spend a year in the top 10%. And the way to analyze inequality is in fact the same as with the dynamic probability of ruin. You've got to look at it over time. Over your lifetime. Of course, you are going to spend years not making money. You are going to be at the bottom. And you are going to have years, some of the time, making a lot of money. So, the way you look at the health of a country isn't so much the opportunity to rise, okay, or the number of people who are middle class; it's the probability of losing your status as top dog. You see? And nobody--I mean, very few people look at it that way. For example, take the Forbes 500, 1985 versus 2015. You'd be shocked. Thirty years later, a very small proportion, something like 10% of the families, were on both lists. See? So you have an engine in America to destroy the very strong. Although it creates inequality, it does also create opportunity. And opportunity is not--I mean, if someone rises, someone at the top has to fall. And it's easy to fall in America. Take France, and you get shockingly depressing results: effectively, some people stay in the same class all their lives. You know, the upper middle class of civil servants, or friends of the state, or heads of companies related to the state. Once you have studied at certain universities, you are set for life. You have that effect in America, but those who rise are usually those who come out of nowhere. And if you take Florence, you notice that the wealth in medieval times was in the same families as the wealth that's found there today. Largely. So, people discuss mobility naively; I just propose a measure of inequality based on transition probabilities, a completely different approach. It will give you a much rosier image of America. Now, another interesting thing that comes with America: the health of companies. The same applies to corporations. In America today, a corporation tends to stay 12 years on average in the S&P 500 [Standard and Poor's 500]. And that's very good news. It is very good news. Look at Europe: what happens is that companies, becoming cozy with the state, manage to stick around. You see? So, it's the same thing--you've got to look at it the same way as inequality. Plus there are other metrics of inequality, very technical, measurements of Ginis [Gini coefficients] and stuff, that are not right. 
In other words, people give you the illusion that this has been growing over time, when it may be just the wrong computation. Russ Roberts: Yeah, well, I like to point out that if you go back to, say, 1985, some of the people in the top 1% today weren't even born. Certainly in 1970 or 1975. But, having said that-- Nassim Nicholas Taleb: But, they are talking about families. Families, also. Russ Roberts: Right, and their families were not wealthy. Nassim Nicholas Taleb: In France, it's families, yes; 60% is dominated by families. |
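The same ensemble-versus-time contrast can be sketched for inequality: simulate people whose income ranks get reshuffled over a career and compare a single year's top-1% share with the top-1% share of lifetime incomes. The income distribution and mobility rate below are invented for illustration, not estimates of anything.

```python
import random

random.seed(5)

N_PEOPLE, N_YEARS = 10_000, 40
MOBILITY = 0.2          # each year, 20% of people redraw their income (illustrative)

def draw_income():
    return random.paretovariate(2.0)    # heavy-tailed incomes, so a top 1% exists

def top_share(values, pct=0.01):
    top = sorted(values, reverse=True)[: max(1, int(pct * len(values)))]
    return sum(top) / sum(values)

incomes = [draw_income() for _ in range(N_PEOPLE)]
lifetime = [0.0] * N_PEOPLE
ever_top1 = [False] * N_PEOPLE

for _ in range(N_YEARS):
    cutoff = sorted(incomes, reverse=True)[int(0.01 * N_PEOPLE)]
    for i in range(N_PEOPLE):
        lifetime[i] += incomes[i]
        if incomes[i] >= cutoff:
            ever_top1[i] = True
        if random.random() < MOBILITY:   # transition: this person's rank reshuffles
            incomes[i] = draw_income()

print("single-year top 1%% income share: %.1f%%" % (100 * top_share(incomes)))
print("lifetime top 1%% income share:    %.1f%%" % (100 * top_share(lifetime)))
print("share of people who spent a year in the top 1%%: %.1f%%"
      % (100 * sum(ever_top1) / N_PEOPLE))
```

With any churn at all, the lifetime (time) concentration comes out well below the single-year (ensemble) concentration, and far more than 1% of people pass through the top 1% at some point--the transition-probability view of inequality.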
1:01:58 | Russ Roberts: So, I'm sympathetic to your point, as listeners will know. I think it's very important to remember that people can move in and out of different levels of income. But I do think, being financially well off myself, I think my children have a lot of advantages that other children don't have. And it's not just genetics. Some of it's genetic. My kids have pretty good genes, I think. But, they also have pretty good opportunities--connections I've made, things I've been able to teach them, that are going to make it more likely that they do not fall into the bottom half of the income distribution. And people in the bottom half are going to struggle to get into the top half, because they don't have some of the advantages that my children have. And people have gone so far with this as to say it's immoral--just to go to an extreme here--it's immoral to read to your kids before they go to bed because it gives them a leg up on the competition. That repulses me. But I do accept the point that there is some--there's a much smaller chance for my children to fall into subsistence poverty, say, than somebody in the bottom half of the income distribution: they start far from it, and they have certain advantages that keep them from it. So I think there are some issues there. I think the most important thing to keep in mind is, I think that people want to get ahead. They don't necessarily want to get ahead of others. And I think we should always discourage the natural human urge to get ahead of others rather than just ahead. But, there are some challenges, I think, in the American system today that make it harder for people to be upwardly mobile that I think are bad. Nassim Nicholas Taleb: You'll encounter a problem that I didn't put in the book but may in the future, or may in some other book or writing, which is: what do you consider as the unit? So, the response to that is: if I cannot transmit my wealth to my children, what's my motivation? Why do I have to work? I am working to give them a better future. So, without the unit, the effort is not worth it. If you consider, as in modernity, that the individual is the unit, then of course it's unfair, because my children are going to get more money than others. But if I consider that the unit is my family and my bloodline and everything, as I do, then depriving me of the possibility of transmitting my wealth makes no sense, because your children are part of you. You see? If they are hurt, it's worse than if you are hurt. You see? How can you not give them your money? So, you've got to think along these lines; and I've been doing a lot of thinking--I didn't do it fully in the book--about what is the definition of the unit. Is the unit you? Is the unit your tribe? Is the unit your descendants? Is the unit you and your [?]? Is the unit you and, say, the Stanford Club for Insightful Economic Discussions? What is your unit? And your inability to transmit some of what you have to your unit, because you feel that is you, is a limitation that no government should be allowed to impose without further investigation, or deeper thinking about the problem. Russ Roberts: Well, the other point I want to make is that we shouldn't just care about how much stuff we have. Obviously, stuff's important. But what we really care about, I think, is flourishing, and using our skills. Nassim Nicholas Taleb: Yeah, but that's another thing I've noticed: all these discussions about inequality don't come from people who are at the bottom of the pyramid. 
They come from people, professors of--not you, of course, but left-wing professors of economics who feel they are making a lot of money, but they are envious of the richer. And I've cited lots of papers, going all the way from the ancients to the moderns, less modern than we are, but people are jealous of people around them. Envious. So if you ask someone on the bottom, 'What would you like?' they'd like a better 'fridge, a new car, and that's it. But, if you ask someone, a professor at Harvard, of one subject[?] sociology at Harvard, what they'd like, they would like their neighbor to be poorer. Russ Roberts: It's uncharitable-- Nassim Nicholas Taleb: This is because it is sort of like the gold medal--exactly--the silver medalist is the one who hates the gold medalist. Russ Roberts: It's an uncharitable view of my fellow academics, although they're not in the Stanford Club for Insightful Economic Discussions. That's a club I'm definitely going to have to start. We're going to have t-shirts. I love that. SCIED. |
1:07:18 | Russ Roberts: Before we finish, I just want to add one challenge to you, Nassim, which is not--it just came to me. Which is, we were talking about the filtering power of skin in the game, and yet we do want that restaurateur to come back and make the second restaurant better. We do want Bill Belichick after his failure at Cleveland not to be wiped out; we want him to come back and try again. And that entrepreneur in Silicon Valley who has the three failures, we don't say, 'Oop, you're out of the club.' So, it is a little more complicated. Nassim Nicholas Taleb: It is and it's not. In other words, the fact is--and the beauty of the idea of skin in the game is--that you should have the same risk, like when you drive, as you inflict on others. And that was the symmetry of the archetype in Hammurabi's Code: what you inflict on others, you should also inflict on yourself. Russ Roberts: You should eat your own cooking [?] Nassim Nicholas Taleb: Exactly. So, for tail risks, this works effectively. And for medium risks, of course you survive, but everybody survives, so you are not inflicting any big danger on others. In the previous discussion on Skin in the Game, I spoke about people being morally calibrated, most people. Everybody is morally calibrated, you see? It's about removing the tail risk--preventing people from coming back if they inflict a lot of risk on others. Like, for example, warriors. Every warrior--traditionally, society has--we are what we are today because warriors are in battle. So, if you are a complete, uncontrollable warmonger, like many people in Washington today or some journalists, many think that[?] people, you would end up dying in battle--and these people don't die in battle. So that's what I meant. The restaurant owner, of course, is going to be filtered. Or, the theme[?] is going to be filtered--he won't have a bad restaurant; it will be something else. But he is not inflicting undue risk on others. He's only inflicting risks on his investors and on himself. And eventually, if he is very bad, he'll run out of money. Russ Roberts: Which is sufficient punishment. He's not executed. Unless he kills people through food poisoning. Nassim Nicholas Taleb: Exactly. If he eats his food, he'll be out of--exactly. |
1:09:44 | Russ Roberts: Well, let's close with the following. We--I can't remember; I didn't check what year we did our first interview but it was some while back-- Nassim Nicholas Taleb: 2007. Russ Roberts: 2007. It was about The Black Swan, as I said the last time we talked. I actually liked Fooled by Randomness better: it's--Fooled by Randomness remains one of my all-time favorite books. You wrote that book a long time ago; I think you said almost 20 years ago. And, a set of other books just sort of emerged without planning. The Black Swan came next. And then, I think you had The Bed of Procrustes, the aphorisms from your Twitter experience. Then you had Antifragile; and now you've got Skin in the Game. And you call this entire project Incerto. Nassim Nicholas Taleb: Yes, Incerto. Russ Roberts: I wanted to pronounce it correctly. I don't know what Incerto means, by the way. What does incerto mean? Nassim Nicholas Taleb: Uncertainty, in Latin. Russ Roberts: Okay. So, you have explored, through a set of books and papers and now, for me, our conversations, a topic that is inherently unknowable, which is uncertainty, though one can get better understandings of it as you think about it more and more. And, my question is: Are you done? Is this the last book of Incerto? Is there another book in the works? Are you going to go count your royalties, which are quite nice? Nassim Nicholas Taleb: It is uncertain. The subtitle for the Incerto is: an investigation of luck, uncertainty, probability, opacity, human error, risk, disorder, and, especially, decision-making in a world we don't understand--we don't publish that. So, it's about things we don't understand and how to make decisions, while visibly the set is going to expand over time, and I may write another book. And I think I have an idea of what I'm going to write next, but I would like to take a break. Writing doesn't bother me. What really bothers me is the book tours. When you become a marketer rather than a producer; and you know, there's something about marketing that makes me feel like I'm doing something--I'm betraying my work[?], because you're spending time away from your real work. So, I don't like book tours, or not too much. But I like conversations with you, because, as I told you, every time I have a conversation on EconTalk and also skin in the game, I've learned from EconTalk: there is a symmetry [?]. So, what really scares me about writing a book is the packaging of it, not so much the composition. But, I've been writing papers in the meanwhile, and papers are book-tour free. There's no paper tour. Academics do have paper tours, but I don't have to do it. I just post it and that's it. Russ Roberts: So, I have a great idea for you. I think for your next book, if there is one, you tell your publisher: I'm not going to do any touring. All I'm going to do, the only promotion, will be to make my 9th appearance on EconTalk-- Nassim Nicholas Taleb: Well that's what I told them. Believe it or not, that's what I told them for this book-- Russ Roberts: and then, that way, your publisher has an enormous incentive to help promote EconTalk. Because your publisher will have skin in the game with me. Which, I think, is phenomenal. Nassim Nicholas Taleb: You don't know publishers. This is what I told them. Actually, almost what I told them. I told them I'm going to do a few appearances with friends and I said no book tour. And guess what? We put the book on embargo; now it's unembargoed[?].
But then they came up with a list of things they want--they want to send me here, there, talk to this journalist. No. So, I'm not really--I told them no book tour and they still gave me a book tour. Russ Roberts: Yeah. It's just a diminutive one, though. It's a short book tour. Nassim Nicholas Taleb: Hopefully. Russ Roberts: Keep up the good work. |
READER COMMENTS
Nonlin_org
Mar 5 2018 at 11:28am
Some good ideas but also some bad ones. On religion, atheism is of course a religion, hence there is no criticism of religion as such, only criticism of other religions – we all do that. Here’s proof:
Science = Observation + Assumptions, Facts Selection, Extrapolations, Interpretations…
Assumptions, Facts Selection, Extrapolations, Interpretations… = Sum of Axiomatic Beliefs
Sum of Axiomatic Beliefs = Religion …therefore,
Science = Observation + Religion
Makes sense for the Christian pope and the pope of atheism to both rely on the hospital and well wishers. Russ, you almost got it with “everyone worships” but you keep reverting back to “criticism of religion” and “evolutionary emergence” (not in this podcast). And of course, secularism = atheism.
No Nassim, removing the bad driver is not “evolution” as it does not transmute the other drivers into super-drivers (per Darwin’s fantasy). It just maintains a common standard.
Good point on statist, elitist Europe – goes to show why the super rich are all pro socialism – let’s not fall into that trap. Also good points on the rationality of risk avoidance, absorbing barriers, etc. A good podcast overall, thanks.
Mike
Mar 5 2018 at 12:18pm
Maybe it reflects poorly on me, but I can’t stand to listen to someone with such an over-inflated view of their own brilliance. I had to shut it off when Taleb claimed to have solved for the first time a paradox from the Ethics. Aristotle is easily the most influential Western philosopher and the Ethics one of his most studied and celebrated works… To claim that you have solved a problem there that has eluded two millennia of the brightest minds of Europe and the Islamic world is truly astounding.
FWIW, Aristotle famously places the virtues as means between two extremes and courage is defined between cowardice and excessive or irrational risk-taking / rashness. “Prudence” isn’t used as a translation of any of the virtues in my copy (Broadie and Rowe, which I highly recommend), but assuming he means something like the modern common understanding, it doesn’t seem to conflict with Aristotelian courage at all.
Dr Golabki
Mar 5 2018 at 1:16pm
@Mike
It’s been a while since I read Nicomachean Ethics… but based on Taleb’s description I couldn’t tell what the paradox was. I agree with you that what he was describing sounded totally aligned with Aristotle’s view.
Rucksack Revolution
Mar 5 2018 at 1:50pm
I found Taleb’s views on religion to be pretty unpersuasive. His argument, as I understand it, is that we should be very careful about calling religion “irrational” and trying to eliminate it. We should even be careful about removing particular “planks” (eg ignoring the Jewish prohibition against the eating of pork) from the religious edifice since the whole package has necessarily had great adaptive value.
I see a couple problems with this. First, it seems like if we followed Taleb’s advice, Christians and Muslims for example should never have stopped persecuting gays, stoning adulterers, etc. After all, these “planks” might be key reasons why the religion conferred a survival advantage! Who are we to pick and choose?
Secondly, you could make the same sort of argument for many deeply held sets of ideas. If they’ve been around for a while, they must have survival value. Maybe we shouldn’t have stopped burning witches at the stake for example. Or maybe we should continue to allow women to be mistreated in the workplace. It’s true that some of these practices aren’t codified in the way that certain religious edicts are, but their status as cultural norms/rules seems similar to me.
Thirdly, due to technology, the world is changing faster than it ever has before. A set of beliefs that were adaptive at any one time could very quickly become maladaptive. It’s possible that we need to use reason to consciously pick and choose ideas that will be good going forward. We can’t rely as easily on traditional ideas given how fast the world is changing.
Finally, our reason is itself an evolved adaptation. Individuals that had the ability to sift through traditional ideas, picking and choosing the best ones presumably had greater survival odds than those who followed tradition more slavishly. Maybe we should accept our distrust of tradition as itself something adaptive and not be so scared to knock down sacred planks.
Dr Golabki
Mar 5 2018 at 5:19pm
On religion:
Off the bat Taleb seems to imply that he’s taking the question of whether or not god actually exists off the table. We’re not going to evaluate theism v atheism on who is objectively right or wrong. Instead, we’re going to evaluate them on which belief system is the most pragmatically useful. This was a bit confusing in the interview because Taleb seemed to jump right past this important premise, but fair enough. Interesting tack.
Taleb’s argument for religion seems to be that…
1. We care about survival
2. Our beliefs and culture play a part in our survival
3. As such we ought to judge belief systems on their historical robustness
4. It happens that religion has been remarkably robust…
5. …so let religion go at your own peril
First, I think you could have used that argument to defend slavery before the civil war. Not that religion and slavery are the same, but just to make that point that a historically robust belief is not the same as a good belief.
That said, atheists have to acknowledge religion is a dominant feature of pretty much all societies. I think there are 2 plausible explanations for this –
1. Humans evolved to seek meaning and reasons behind everything for good evolutionary reasons totally unrelated to religion. Religion cropped up from this as an unintended consequence of that evolution and persists because it’s not sufficiently bad for human survival to be selected away.
2. Collective action beyond the small family unit is incredibly important for human survival and religion is a very powerful tool to form collective groups.
I think “1” is the reason why religion arises so frequently, but “2” is the reason religion is so important in human history.
“2” seems to be what’s mainly on Taleb’s mind, and a lot of religious traditions make sense in this context. Why do religions demand “irrational” sacrifice (sacrificing an animal to god or fasting during a holiday)? Because it proves to everyone that you are willing to sacrifice for the community, so we can trust you will sacrifice for us when it does matter (skin in the game). It’s also a communal experience that brings groups together.
Rucksack Revolution
Mar 5 2018 at 7:31pm
One additional comment about the religion section of the podcast: Taleb’s main criticism of atheists was that, in a lot of ways, they behave like religious people (eg both the Pope and Richard Dawkins go to the hospital when sick). At this point I was waiting for him to claim that he discovered the concept of revealed preferences. Taleb then seems to conclude that they are misguided in their atheism. Thankfully Russ noticed what a poor argument this was. Do ISIS fighters behave like Dawkins in all situations? Belief obviously does influence behavior and it’s quite possible that certain beliefs are a net negative for humanity. Showing that atheists and believers sometimes behave the same way is not an argument against anything.
Dan Hanson
Mar 5 2018 at 8:24pm
Like most of Taleb’s stuff, a mix of insightful observations along with speculation presented as facts.
I really did not like his use of the phrase ‘playing with the house’s money’ to explain gambler’s ruin. The common use of the phrase is to describe a particular form of money management fallacy – the idea that you can improve your expectation by ‘locking in’ your winnings. Of course this is false. Your expectation is nothing more or less than the total amount of money risked multiplied by your advantage. All money management does is change the distribution of wins and losses.
Gambler’s ruin is different. It says that even if you have a positive expectation, you will crash your bankroll to zero if you overbet your advantage and continue to do so. The Kelly criterion is a mathematical expression of this, and tells us that you should bet at most the percentage of your bankroll equal to your advantage. For example, if you have a $10,000 bankroll and your advantage is 1%, you should bet no more than $100. If you bet more and keep repeating the bet, eventually you will hit a run of bad luck that will wipe you out.
In practice, most professional gamblers bet more like half their advantage, in order to minimize their bankroll swings.
So, what Taleb was getting at, I suppose, is that if you made a winning bet or trade, you can add that money to your bankroll and increase subsequent bets. But if you lose, your bankroll shrinks and your next bet should be smaller.
This proportional betting scheme is the only way to avoid gambler’s ruin while staying in the game and optimizing your profit. It has nothing at all to do with the source of the money in your bankroll. The concept of ‘bankroll’ itself gets fuzzy if you have other sources of income and can replenish it, or if you also need to use it to pay your bills.
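This point lends itself to a quick simulation. The sketch below is only illustrative and is not from the episode or from any commenter: the 1% edge, even-money payoff, round counts, and the ruin floor are all invented assumptions. It shows how staking near the Kelly fraction tends to preserve the bankroll, while heavily overbetting a positive-expectation game tends to end in ruin.

```python
# Illustrative sketch (invented parameters): even-money bets with a small positive edge,
# staking a fixed fraction of the current bankroll each round.
import random

def simulate(bet_fraction, edge=0.01, rounds=10_000, bankroll=10_000.0, floor=1.0):
    """Play repeated even-money bets with win probability 0.5 + edge/2, betting a fixed
    fraction of the current bankroll. Returns the final bankroll, or 0.0 if ruined."""
    p_win = 0.5 + edge / 2              # expected gain per unit staked equals `edge`
    for _ in range(rounds):
        stake = bankroll * bet_fraction
        bankroll += stake if random.random() < p_win else -stake
        if bankroll < floor:            # treat a tiny bankroll as ruin (absorbing barrier)
            return 0.0
    return bankroll

random.seed(0)
kelly = 0.01                            # for an even-money bet, the Kelly fraction equals the edge
for frac in (kelly / 2, kelly, 20 * kelly):
    results = sorted(simulate(frac) for _ in range(200))
    ruined = sum(r == 0.0 for r in results)
    print(f"bet fraction {frac:.3f}: median final bankroll {results[100]:,.0f}, "
          f"ruined {ruined} of 200 runs")
```

Run as written, the half-Kelly and full-Kelly bettors typically end up near or above their starting bankroll, while the bettor staking twenty times the edge is almost always wiped out long before the 10,000 rounds are up.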
Phil
Mar 5 2018 at 11:36pm
I would suggest that what the guest calls “rationality” is actually more existential efficacy. Rational belief is a degree of belief that maps to the degree of the relevant evidence. It has nothing to do with the accident of survival.
MF
Mar 6 2018 at 10:51am
I usually really like the episodes of EconTalk where Taleb is a guest, but this one had a couple of things that really didn’t pass the sniff test:
1. Russ claiming (1:02:45) that some people say that “…it’s immoral to read to your kids before bed because it gives them a leg up on the competition…“. Come on Russ, can you name even one credible source that actually said that in a serious way? That seems a lot more like a strawman you built to knock down than a serious argument to defend against…If you can actually cite someone’s work who made that claim please post a link.
2. Nassim’s claim (1:06:40) that “if you ask someone… a professor at Harvard, you know of something… sociology at Harvard what they would like, they would like their neighbor to be poor.” Not buying it. Did you read a paper where someone made that claim? Did you ask a professor at Harvard what they would like?…Did they say they want their neighbor to be poor or poorer? I agree that government interventions to decrease inequality are ineffective at best, malicious at worst but I would stop short of putting words in a hypothetical professor’s mouth that what they really want is for someone else to be poor. If Nassim did indeed read something by a Harvard sociology professor who wrote “I want my neighbor to be poorer” then please post a link.
It sounds a lot more like Nassim is taking a group of bad ideas and ascribing what he thinks might be their collective motivation (if there is such a thing) to a single hypothetical person in order to slam said hypothetical person as being a bad sort of person…NO THANK YOU! The idea that our society is better served by letting hard work and market forces determine how much inequality exists is an idea that totally stands on its own merits, not on the back of slamming those with opposing views as petty or cruel. If you cannot think of a better path to defending your idea than by claiming the other side is just intrinsically horrible then your chance of convincing those on the other side to come around to your way of thinking drops to about 0%. So are you trying to bring those on the other side around to your way of thinking or just interested in slamming them to make yourself look good?
Beck
Mar 6 2018 at 3:13pm
I don’t understand how his definition of rationality solves any problem.
If you use an ex-ante definition, sure, you have the fundamental problem of needing to understand the entire universe in a given instance.
But if you use an ex-post definition, that what is rational is that which increases survival, what time period do you use? Maybe X increases survivability over 100 years but not 1,000. Extrapolate that out, and while you no longer need to understand the entire universe you now need to wait for the entirety of time.
Dr Golabki
Mar 6 2018 at 3:28pm
@ Phil
I get the point that Taleb is a bit sloppy with terms like rationality, but I don’t think this is totally fair to him.
Take the religious sacrifice example. Scenario: Once a year a shepherd sacrifices and burns the best animal in his flock and the rest of his community offers similar sacrifices.
The modernist says, well that’s clearly irrational because you’re destroying something of value. What’s more, it must be terrible for the community as a whole because you’re making the whole community poorer. The only way it would make sense is if you believe you’re conducting a literal transaction with a real god through the sacrifice.
Taleb would say, ignore the question of whether or not the god of this specific community is going to reward them. The modernist misunderstands the transaction.
The shepherd isn’t wasting his animal, and he probably isn’t buying anything from god with it either. He’s buying the respect and trust of the community, which turns out to be incredibly valuable to the community as a whole… but also in the shepherd’s own rational self-interest.
Now the modernist will say… well then the sacrifice is just signaling. Wouldn’t it be much better for all parties if the price of the signal was much much lower?
Taleb’s “skin in the game” point is, actually, the value of the signal really is related to the value of the sacrifice. If the shepherd sacrificed something of little value it would not give the community much confidence in his willingness to sacrifice for the community in other situations.
One real-world example of a small sacrifice is that I recently interviewed a job candidate and the interviewee took the trouble to write a well-thought-out handwritten thank-you. This probably cost him an extra 5 or 10 minutes over and above the time it would have taken him to write a boilerplate email thank-you note, but it still had an impact. Maybe we’d be better off if we were more willing to sacrifice for each other.
Russ Roberts
Mar 6 2018 at 3:31pm
MF,
It was perhaps a slight exaggeration, but this is from Adam Swift, Professor of Political Theory at the University of Warwick. He is willing to tolerate bedtime reading (because of family bonding) but it does create inequality and maybe we parents should think twice about doing it:
But I also may have confused that story with this speech from Al Gore where he talks about the harm from growing up with commuting parents:
A G McDowell
Mar 6 2018 at 4:01pm
Some of the observations about religion do not agree with my own experiences at a mainstream (Church of England) church.
Most services are not meditative. Services are broken up by hymns (stand to sing) and responses from the congregation. The effect is rather to keep people awake, reasonably alert, and involved.
Members of the congregation with illnesses act as if they believe that prayer can help them. They ask to be named in group prayers. They attend healing services – when they are ill and elderly, some having to be wheeled in by attendants. If the service leaders have a more sophisticated view of prayer than this implies, it does not stop them from allocating time to this part of the service, and generally going along with this.
(I do note that praying for good weather appears to have disappeared completely, after being a joke for some time. The church recently cancelled an associated meeting on a Thursday at church on Sunday, following a (correct, as it turns out) weather forecast of snow. Apparently modern weather forecasts are just too accurate to make divine intervention on demand plausible).
Arnold Layne
Mar 6 2018 at 9:54pm
Taleb brings up interesting points about rationality and odds of ruin. Opened my mind to things I had not thought of.
I think he’s on shakier ground with respect to religion. I must be missing something, since surely someone would have pointed these things out to him in their discussions.
1) He says religion is a take-it-or-leave-it proposition. This has not been my experience with most religious people I know. They pick and choose. What they get out of it, and what is adaptive, is the sense of community. You don’t need to believe in all tenets of your religion to find community.
2) He mistakes Scientology for Christian Science. It could have just been a brain freeze, but anyone familiar with the religious landscape would not make such a rookie mistake.
3) He concludes, based on zero evidence, that it’s the dietary laws that bring people together. This is along the same lines as all just-so evolutionary sociology explanations for things: practically unfalsifiable.
Also don’t like the straw-manning of socialist Harvard professors. If you ask them, they’ll say they’re for equality and believe it. Taleb doesn’t explain how he has this special insight into their thinking. Russ tries to explain that this is not the most charitable view of their opponents.
I really respect Russ for this. I know he feels strongly about things, but always tries his best to be charitable. I like how he corrects himself after starting to say atheism is fashionable. Taleb, not so much.
Dr Golabki
Mar 6 2018 at 10:37pm
@ Russ (and MF)
When I heard the “immoral to read to your kids” comment I thought it was absurd, but I quickly forgot about it. But now I feel like I have to comment.
Russ, your examples are both seemingly examples of the exact opposite. (A) a guy who explicitly says we should “encourage” parents to read to their kids, and (B) an Al Gore speech where he laments that sometimes “a parent gets home too late to read a bedtime story”. Both clearly think reading to your kids is a moral good.
That’s not “perhaps a slight exaggeration”, it’s a total reversal of the claim made.
Now there may be someone out there that really thinks we should be somehow preventing parents from reading to their kids, but I promise, there’s no left wing constituency out there that thinks Vonnegut was writing about a utopia in “Harrison Bergeron”.
What this reminds me of is the inverse version of the following conversation…
Libertarian – I oppose the minimum wage because I think it is actually hurting many poor people in America.
Reactionary leftist – I’m outraged! I can’t believe you want poor people to starve in the street!
Dr Golabki
Mar 7 2018 at 8:25am
@ MF and Arnold
On the Harvard professor point… Taleb certainly explained this poorly. Part of the reason for that was (I think) that he’s actually used this example on this podcast in the past.
I think his underlying point is actually reasonable, which, as I understand it, is…
1. There’s a metric ton of literature showing that measures of quality of life (like “happiness”) are much more closely related to relative wealth than to absolute wealth (which then maybe has interesting implications for egalitarian policies).
2. Almost all this literature comes from quite well-off people, who are usually professors of sociology at prestigious universities like Harvard. These people have no skin in the game. They have incredible job and personal security and would generally not be directly impacted by their own policy proposals (although their proposed policies might cut into the wealth of their neighbors in their bucolic Boston suburb).
3. For people with real “skin in the game”, the absolute really matters. If a 20% decrease in wealth means a significant increase in the chances of death for you and your family… you will find it hard to care too much about the relative.
Dave Lull
Mar 7 2018 at 9:53am
Branko Blagojevic ”. . . wanted to test Taleb’s idea of dynamic strategies in games of chance. Rather than diving into the math, [he] just setup experiments.” For his report of these “experiments” see his “Nassim Taleb, Absorbent Barriers and House Money” here: https://medium.com/ml-everything/nassim-taleb-absorbent-barriers-and-house-money-8b21cff2e338
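The linked piece is essentially the casino thought experiment run in code. A rough sketch of the same kind of experiment, not Blagojevic's actual code and with an invented 1% ruin-per-session probability, shows how the ensemble view and the time view diverge:

```python
# Rough sketch (invented numbers): compare many gamblers playing one session each
# against one gambler playing many sessions in a row, when each session carries
# a small chance of total ruin (the absorbing barrier).
import random

RUIN_PER_SESSION = 0.01   # assumed probability that a single session wipes a gambler out
SESSIONS = 100

random.seed(1)

# Ensemble view: 10,000 different gamblers, one session each.
gamblers = 10_000
ensemble_survivors = sum(random.random() > RUIN_PER_SESSION for _ in range(gamblers))
print(f"ensemble: {ensemble_survivors / gamblers:.1%} of gamblers survive their one session")

# Time view: one gambler plays 100 sessions; ruin in any session ends the run.
def survives_all(sessions=SESSIONS):
    return all(random.random() > RUIN_PER_SESSION for _ in range(sessions))

trials = 10_000
time_survivors = sum(survives_all() for _ in range(trials))
print(f"time: {time_survivors / trials:.1%} of repeated-play runs survive all {SESSIONS} sessions")
# Roughly 99% survive a single session, but only about 0.99**100, i.e. ~37%, survive 100 in a row.
```

The per-session numbers look harmless, but exposure repeated through time compounds the chance of hitting the barrier, which is the distinction the episode draws between ensemble probability and time probability.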
Eric
Mar 7 2018 at 10:09am
And yet, is this behavior the same when facing the biggest decision, which is one of his own central principles?
It is ironic that the example of Christianity under Rome is discussed. That was a time in which Christians were executed because they declared “the Messiah Jesus is Lord” and refused to affirm “Caesar is Lord”. When it came to the issue of personal survival — something central to Taleb’s reasoning, Christians made a radical departure in behavior.
They still do so today. 2014 hit a high point in history for persecution of Christians. 2015 surpassed that and set a new high. Then 2016 set another new high. (Not sure where 2017’s tally ended.)
A dead believer cannot aid other believers. Their sacrifice doesn’t signal that they can be counted on to help others in the future. Those who became Christians under these circumstances could not expect that it improved their prospects for a long and healthy life in this world. Exactly the opposite was true, not only by observation but also by instruction. Jesus said in advance that they should expect this. Peter wrote a letter (1 Peter) that reinforced this expectation, saying that this treatment was not surprising.
Additionally, even as believers die today, those that remain still follow the instruction of expressing love and forgiveness even toward their enemies. For instance, we’ve seen that following church slayings in Egypt and also the U.S. to mention just two recent examples.
Does Taleb consider this acting “the same way” “when facing big decisions”?
Dr Golabki
Mar 7 2018 at 11:58am
@Eric
I do agree that most religious people aren’t doing a rational self-interest calculation. Whether they are martyrs or just going to church on Sunday, the action is motivated by something deeper (unless you’re a sociopath). I don’t think Taleb disagrees with that. It’s just that there’s also an underlying “rational” motivation.
I think Taleb has two possible responses to your broader criticism:
1. Sure, some people take religion too far and it ends up getting them killed. That’s true of everything. Some people take drinking water too far and it ends up killing them. The fact that a few people make bad religious choices is to be expected.
2. You’re thinking of the wrong unit. If the unit isn’t the individual but the Christian “tribe”, then there probably is a good rational case for martyrs. Jesus dying for our sins may have saved our souls… but also probably wasn’t terrible for marketing for Christianity.
Michael Smith
Mar 7 2018 at 3:18pm
Near the hour mark, Nassim says something along the lines of “if someone rises, someone at the top has to fall.” This zero-sum view of relative mobility seems to miss at least two cases. First, if the population is growing (e.g., due to immigration), you could have people rising into the top, say, 1% without anyone falling out of the top 1%. Second, even if the population were steady, if you have lower birthrates among, say, the top 1%, then you could have people born into a lower income/wealth level rising into the top 1% without anyone falling out.
I have often wondered how much mobility is due to some people moving up and others moving down, and how much mobility is due to people rising into positions made available by population growth and/or differences in birth rates.
Has anyone looked into this or read any interesting research on these issues?
Seth
Mar 7 2018 at 9:15pm
Sounds like Hayek.
“We may not like the fact that our rules were shaped mainly by their suitability for increasing our numbers, but we have little choice in the matter now…” -FA Hayek, The Fatal Conceit
Lauren
Mar 8 2018 at 9:51am
Michael Smith remarks:
Just to clarify, I wondered if Taleb was conflating two points while he was speaking on the spot. I know I, too, was surprised at the zero-sum view, so I wondered if what Taleb was saying by “if someone rises, someone at the top has to fall” was purely an arithmetic point. That is, if you start with 1000 people, the top 1% of that would be 10 people. If, after a time, someone originally ranked below the top 10 increases in income enough, he could leap into the top 10. Nothing about anyone previously in the top 10 may have changed or decreased. However, because by definition the “top 1%” can only include 10 people, if someone rises from, say, position 11 to position 9, then the person who was previously in position 10 has to fall out of the top 1% just by the arithmetic that the top 1% of 1000 people can only be 10 people.
That would actually be a valid point to make. But it’s pure arithmetic, or a kind of static view. It does not seem to be where Taleb goes in his next sentences or ideas.
I found myself unsure of what Taleb was getting at, so I appreciated Michael’s questioning this matter.
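A toy version of this arithmetic, with invented numbers, makes both points concrete: in a fixed population the top 1% has a fixed number of slots, so a riser must push someone out, whereas population growth can add slots without displacing anyone.

```python
# Toy illustration (all numbers invented) of rank arithmetic in a fixed vs. growing population.
def top_1_percent(incomes):
    """Indices of the people in the top 1% of the income list (at least one slot)."""
    slots = max(1, len(incomes) // 100)
    ranked = sorted(range(len(incomes)), key=lambda i: incomes[i], reverse=True)
    return set(ranked[:slots])

incomes = list(range(1000, 0, -1))  # 1,000 people in rank order, so the top 1% has 10 slots
before = top_1_percent(incomes)
incomes[10] = 2_000                 # the person ranked 11th leaps to the very top
after = top_1_percent(incomes)
print("displaced by the riser:", before - after)            # exactly one person falls out

incomes += [500] * 100              # population grows by 100 people of modest income
grown = top_1_percent(incomes)      # the top 1% now has 11 slots
print("slots added by growth:", len(grown) - len(after))    # 1
print("anyone displaced by growth?", bool(after - grown))   # False
```

So the "someone has to fall" claim holds as pure arithmetic only when the population and the group boundaries are held fixed, which is the static case Lauren describes; Michael Smith's growth and birthrate cases relax exactly that assumption.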
Eric
Mar 8 2018 at 11:12am
Notice that Taleb makes a correction to Russ’s description. He essentially said, “Your description used the wrong unit.” Taleb is explicitly not treating the religion as the unit of survival. He is building his case on the idea, “It’s people who have it that survive.” Yet that claim seems to be directly contradicted by the case of Christians from the earliest days (and in much of the world to this day). As I pointed out, those who follow Jesus as the promised Messiah have had decreased prospects for long life and physical well being — exactly as Jesus and his apostles told them to expect.
As an atheist, Taleb seems to prefer to think the “story” (which he regards as fictional) is an arbitrary added veneer given to justify the beneficial survival behavior, which would be rational behavior even for atheists. The improved individual survival is his foundation for attributing rationality. Yet Christianity from the beginning was a counterexample at its core (not just at extremes or fringes). It has always been rational if and only if its core claim is true and irrational otherwise. The apostle Paul pointed this out explicitly less than 30 years after the execution of Jesus.
In a great many passages (too many to quote here), the core foundation is that persecution and even early death are outweighed by the value of what is gained that goes beyond death. Viewed in this way, there is no sacrifice, only wise investment. But that obviously depends entirely upon whether the claim is factually true.
Christianity was a foolish choice, if one was seeking personal survival in this life.
Jeff
Mar 8 2018 at 12:11pm
Philip Tetlock, former EconTalk guest, has pushed back on Taleb’s central claims in this book. He points out that skin-in-the-game is neither sufficient (military blunders) nor necessary (Kahneman’s research). Tetlock claims it is at most “helpful.”
Taleb responded by blocking Tetlock on Twitter, so I wish Russ would have pushed Taleb on these points to see how he responded.
Seth
Mar 8 2018 at 12:48pm
@Eric — And yet Christians exist in large numbers today.
Eric
Mar 9 2018 at 5:10pm
Yes, quite true. Even during times and places of persecution, it is sometimes the case that the net number of Christians actually increases. That is not always so, but it has been true in China, for example. The death of Jim Elliot (quoted above) and his fellow missionaries eventually led to the conversion of their killers and their tribe. It was said that even in Rome as Christians were being publicly executed, others who looked on were moved to convert despite the consequences. In early centuries Tertullian wrote:
My point was to examine the position of the guest Nassim Nicholas Taleb, which doesn’t seem to hold up. He is explicitly not making the claim that the religion survives and grows. Instead, he is trying to explain religion as being “rational” in the sense that its teachings promote behaviors that lead to better survival for each individual that believes them. The behaviors cause the survival, even if an atheist did them without the religious “story”, which Taleb considers fictional window dressing.
(This may connect back to his distinction about ensemble probability vs. time probability and gambling and survival. It seems to me he’s trying to make a case that religion helps each individual to not “go bust” (die) and to survive over the long run.)
Practical Test Example: The Revealing Case of Matthew A.
On February 12, 2015, the Islamic State published images of their killing of 21 men in Sirte, Libya. 20 of them were Coptic Christians from Egypt. Though given the chance to recant their Christianity and save their lives, they did not and were killed. Already this does not fit with Taleb’s idea of religion enhancing individual survival. (Similar examples could be easily multiplied, including replacing the Islamic State with the Roman State and demands to affirm Caesar as Lord when there were few Christians.)
But the most interesting part of this example is the 21st man, Mathew Ayairga (or Ayariga) who was not from Egypt and had not been a Christian at all. When he was asked, “Do you reject Christ?”, it would have been easy to answer safely and survive. Yet, instead he answered, “Their God is my God.” So he was killed with the others.
None of that fits Taleb’s model of behaviors promoting individual survival. None of it would be “rational” on his model of behavior that makes sense even for an atheist seeking survival. Taleb doesn’t seem to anticipate that these beliefs, choices and behaviors are about something more than survival in this life and are based on a rationality that directly depends on historical events that point beyond the grave.
Nonlin_org
Mar 10 2018 at 9:46am
@Eric
Taleb is Greek Orthodox Christian, not atheist. According to one of his books, he fasts regularly, so he’s quite observant too.
@ Dr Golabki
“1. Humans evolved to seek meaning and reasons behind everything for good evolutionary reasons totally unrelated to religion.”
It’s so very funny when atheists quote their purely religious dogma trying to distance themselves from religion which is an impossibility as demonstrated.
Seth
Mar 10 2018 at 4:56pm
@Eric – Improved probability doesn’t mean that it works in every individual case.
I also don’t believe he said that people take up religion to improve their chances of survival, which addresses your Libyan story.
Rather, it’s an unintended consequence of some religions.
Roger D. McKinney
Mar 10 2018 at 10:50pm
On Jewish dietary laws, Taleb needs to consider archaeology. It has shown that God forbade the Hebrews from following the dietary practices of the pagan cultures around them, who ate certain foods as part of the worship of false gods. For millennia, eating certain foods at certain times was an act of worship.
Most religions are rational in Taleb’s sense in that the major ones at least promote a common morality: thou shalt not steal, murder, commit adultery, etc. This common morality allows for community and survival. Hayek mentions in Fatal Conceit the necessity of religion for transmitting such values when no one can see the short-term value of them for himself.
If people were good atheists in that they acted consistently with their philosophy, the lack of morality would destroy society and maybe end humanity. So atheists have to be inconsistent with their beliefs and borrow religious morality and declare it good because they know the outcomes of widespread immorality.
Christianity is rational in a way that other religions are not. As the great philosopher Alfred North Whitehead wrote, modern science developed in the West because Christianity placed far greater emphasis on reason than did other religions, most of which have capricious gods.
To Taleb’s “survival” as a source of rationality, I would add flourishing as a measure of the veracity of a religion. McCloskey’s “hockey stick” of economic development as well as modern science grew out of traditional Christianity’s obsessions with reason and confidence that God is a God of reason.
I’m thinking of what some converts to Christianity in China have written. They credit Christianity for the rise of the West. Therefore it must be true. In other words, no other religion has empowered mankind to flourish as has Christianity. It “fits” mankind the best so that man can be his best. That’s a powerful apologetic.
Finally, I think the a priori principles should be applied to religion as Mises did with economics. The Church Scholastics did that quite well, beginning with Aquinas. They have often been ignored, but rarely confronted by atheists.
Eric
Mar 14 2018 at 6:36pm
First, I’d like to apologize for referring to Nassim Nicholas Taleb as an atheist. The confusion was mine. He didn’t refer to himself that way. In any case, “This episode focuses on rationality, religion, and the challenge of thinking about probability and risk correctly in a dynamic world.” (description) The focus of my critique is not on him but on the claims he makes in this episode about these ideas and “the notion of survival“, which Russ considers “as being central to the lessons that you have to teach in terms of decision making under uncertainty and skin in the game.”
Taleb discusses a different view about risk in terms of time probability (vs. ensemble probability). See the extended discussion about the risk of going “bust” over time and rational strategies for individual survival over the long run. Then the conversation shifts to applying similar ideas about rationality, risk and individual survival to religion as a major part of the episode. In that setting, going “bust” gambling is replaced by risks to the life of the individual (e.g. the risk of death). Notice that Taleb is not talking about the survival of the religion (the ensemble) but about individual survival.
He makes a key claim about rationality and religion being based on behavior and big decisions that (he claims) are independent of the stories that “wrap it up”.
Is that claim made by Taleb true? Does this model (looking at religion as decisions and behaviors mitigating risk and promoting individual survival regardless of story wrapping) actually reflect reality? At least in regard to the historical example of Christianity, I would claim the development of Christianity is a stark counter example to the thesis Taleb is making about rationality, religion, risk, decisions and individual survival. In fact, it would be difficult to design a religion more completely contrary to Taleb’s model. Please allow me to show why Christianity breaks the mold Taleb is trying to propose in this episode. It doesn’t fit his box.
The unalterable core of Christianity is the affirmation that Jesus is the fulfillment of God’s promise through the Jewish prophets of a coming king of a special kingdom from God unlike manmade kingdoms (e.g. Daniel 2; 7:13,14). That king is the “Messiah”, which translates into Greek as the “Christ”. By definition, there is no authentic Christianity apart from the Christ/Messiah/promised King. The charge against Jesus for which he was executed was his claim to be that King/Christ/Messiah foreseen by Daniel and other Jewish prophets (e.g. Mark 14:55-64; John 19:18-22).
Since then, when Christians affirm and give their primary allegiance and obedience to Jesus as the Messiah/Christ, King above all Kings, Lord above all Lords, that is obviously unacceptable to every power in this world that wants to be supreme and demands unqualified allegiance and submission. This leads to the violent suppression of the spread of this competing allegiance. It was true under the Caesars. It has been true of the authoritarian governments since then. Jewish prophets foresaw that it would reach a peak of persecution at the end of this age (e.g. Daniel 7).
Consequently an essential foundation and recurring message of Jesus and his apostles was 1) to expect increased risk and persecution as a result of this path, and 2) that faithfulness to Jesus and his commandments is more important than individual survival in this life. Following Christ is understood to involve readiness to be treated as he was, potentially including an early death. That is why one must “count the cost” of that path. An important identified goal of the death and resurrection of Jesus was to set people free from the slavery of the fear of death (e.g. Hebrews 2:14,15; 1 Corinthians 15:50-58). This sets people free to seek what is truly good and of greater value. In this life, “For your sake we are being killed all day long; we are regarded as sheep to be slaughtered.” but this life is not the end (cf. Romans 8:18, 36-39).
Taleb claimed, “when facing big decision they act the same way as an atheist“. Yet, the original Christian sales pitch is essentially this: “Come join with King Jesus and receive increased risk of a shortened life now. It’s worth it.” If that sounds irrational, it is supposed to sound irrational — according to Taleb’s proposed model of rationality. That’s the point. In Taleb’s model, religion’s rationality is that it mitigates risk and promotes individual survival, regardless of whether its stories are true.
Jesus taught that this way of thinking about big decisions is exactly the wrong path. Reducing risk and seeking individual survival in this life are misleading as priorities. By contrast, even Christians themselves have known from the start that it would be foolish to follow Christ — unless Jesus had historically, factually, truly risen from the dead. What could be more contrary to what Taleb has “to say about religion”?
SaveyourSelf
Mar 15 2018 at 1:02pm
The most important Econtalk episode ever!
Discounting the concept of ‘rational’ and replacing it with ‘rationalization’. Pointing out our inability to perfectly recognize ‘rational’ vs. ‘irrational’ in advance. The importance of failure in determining what we call rational. Reconceiving of survival, evolution, and adaptation as ‘filtering’. Exploring filtering through market mechanisms. Considering filtering through non market mechanisms. Recognizing that filtering can and does occur even in the absence of reason. [If anything, reason is defined by and draws its meaning from the outcomes of a filtering process]. Separately but also significant, the surprising differences in wealth across lifetimes and generations compared to single moments in time across a population.
Nassim finally hit the pinnacle, I think, in his lifetime of work. Thank you, Russ, for interviewing him and for your frequent clarifications on his points and for contributing to his and our thinking.
Marilyne Tolle
Mar 18 2018 at 1:42pm
“‘I don’t think parents reading their children bedtime stories should constantly have in their minds the way that they are unfairly disadvantaging other people’s children, but I think they should have that thought occasionally,’ quips Swift.”
So Professor Swift suggests parents occasionally impose a moral tax on themselves as they read bedtime stories to their children. What would it achieve? And, more to the point, why stop here?
Why not take the “level the playing field” argument to its logical conclusion – by applying it to all inherited “privileges”, whether intangible (like the emotional stability and trust in life afforded by loving and attentive parents) or tangible (wealth is the obvious one and already taxed, but what about good looks and good health)?
Few equality advocates have the intellectual honesty of considering the full implications of what they stand for.
Kurt Vonnegut’s sci-fi short story “Harrison Bergeron” does it for them. In the story, above-average people are assigned handicaps to level the playing field: the good-looking are made to wear a mask, the strong are burdened with bags of lead, the intelligent are made to wear disorienting earphones to dumb them down etc…
Life is not fair. Get over it and get on with it.
Comments are closed.