
#11: Bryan Caplan

You Will Not Stampede Me: Essays on Non-Conformism

Bryan Caplan is a professor of economics at George Mason University, research fellow at the Mercatus Center, adjunct scholar at the Cato Institute, writer at EconLib and Bet On It, and best-selling author of eight books, including You Will Not Stampede Me: Essays on Non-Conformism, the subject of this episode.

0:00 - Intro

2:04 - The Next Crusade

3:44 - Moderating X

6:11 - Inventing Slippery Slopes

8:04 - Right-Wing Antiwokes

10:20 - Nonconformism and Asperger’s

12:02 - Making society less conformist

16:44 - The rationality community

20:30 - Polyamory

23:28 - Caplan vs. Yudkowsky on methods of rationality

26:40 - Updating on AI risk

29:35 - Checking your nonconformity

31:10 - Making LinkedIn not suck

33:53 - The George Mason economics department

38:35 - Does tenure still matter?

40:03 - Improving education

46:50 - Should people living under totalitarianism conform?

49:30 - Natalism and birth rates in Israel

51:19 - Hedonic adaptation in the age of AI

53:52 - Should we abolish the FDA?

57:15 - Being a prolific writer

1:00:30 - Bryan’s writing advice

1:02:35 - Outro


Bryan’s Twitter:

Bryan’s Blog, Bet On It:

Buy You Will Not Stampede Me on Amazon



Apple Podcasts:

My Twitter:

My Substack:


Theo Jaffee (00:00)

Hi, welcome back to Episode 11 of the Theo Jaffee Podcast. We're here today with Bryan Caplan.

All right, so let's get into some questions. First question: in your essay Crusades and You, you talk about the eight crusades of hysteria and herding that you've lived through: Islamist Iran, the war on drugs, Free Kuwait, the war on terror, the Iraq War, the 2008 financial crisis, COVID, and BLM. So do you have any ideas about what the next crusade might be, or do you just have no way of knowing?

Bryan (00:35)

Hmm. Hmm. Gee, that's a really tough one. Yeah, if you could figure out what the next crusade's going to be. I mean, a lot of this does hinge upon there being a shocking event. I think there wouldn't have been any George Floyd protests without George Floyd. It really does depend upon having the right shocking event at the right moment. In terms of what would be next, hmm.

Yeah, I mean, I really wish I knew. I mean, normally I will say I don't feel like I've been very good at foreseeing which things would happen next. Obviously, I didn't see the Israel-Palestine thing. I didn't see Ukraine coming. I was writing about it as a possibility, but that's very different from saying that's the one.

Theo Jaffee (01:21)

Do you think Ukraine would count as a full-blown crusade on the level of the others?

Bryan (01:25)

No, no, it's more of a minor crusade. I don't think we've had any true full-blown crusades since COVID. You don't have them all the time. That's at least one of the saving graces: probably most years there isn't any one issue that everyone is supposed to be thinking about and getting worked up over. But maybe one year in three is in that category.

Theo Jaffee (01:48)

So in The Identity of Shame, you talk about the dangers of large, unselective groups.

So one such group is X, Twitter. So how do you think it should go about moderating itself? You know, what's the right amount of selectiveness, if any, to avoid trampling free speech?

Bryan (01:55)

Well, it's an interesting point. It's not like, at least I've never met anyone who identifies with Twitter itself or X itself. Even Elon Musk is not going to say, I love everything happening on my platform, it's all fantastic. So that is very different from what I'm talking about in that essay, which is: if you identify as Irish, then you sit around talking about how great everyone Irish ever was and defending Ireland against any possible criticism. I don't think that...

Twitter, or any social media platform, actually falls in that category. In terms of what they could do in order to improve their brand, I think a lot of what they have done since Elon is improve the brand from being a place where only woke voices are wanted to an actual vibrant center of argument, and one where they are not trying to stamp out any particular view. And they're just saying, you know, there's bad views, and it's not our job to go and get rid of them.

So in the end, that's most of what the brand is. I think that is actually a brand worth defending. Same thing goes for Substack, by the way. So Substack recently has come under pressure to go and hunt down possible Nazis and get rid of them. And I wrote a piece saying, this really is a strong example of the slippery slope, where once you get rid of them, who's next? It seems unlikely that that would be the only group that you would get rid of and then you would stop, because the people that want to get rid of them aren't the kind of people to stop.

Theo Jaffee (03:35)

First they came for the communists and I did not speak out

Bryan (03:38)

Yeah, yeah. It's important to realize that often the slippery slope argument is wrong. You need to go and look at particular cases and see what's going on here. I do think wokeness is one where if you did not have a slippery slope argument before, wokeness would cause you to invent it, just to see how eventually things that seem to be completely normal become forbidden thought crimes, which is just weird. All the way to "there's two genders": like 10 years ago,

would that even be something that people could get in trouble for thinking? What is the alternative view even? And now people do get in trouble for it, strangely.

Theo Jaffee (04:16)

Can you go into a bit more detail about what you mean by inventing a slippery slope?

Bryan (04:21)

Right. So the slippery slope argument, which we've all heard, says if you go and make one exception to a good rule, you won't just wind up making one exception. You'll wind up making other exceptions and further exceptions, and finally there is no rule. There's a great scene in the Brazilian movie City of God where they start off with this one character who says, well, look, I'm going to go and commit some crimes only against bad people,

and we're not going to actually kill any innocent people here in the Brazilian underworld. Then what happens is they come to a point where either you have to kill an innocent security guard or get shot yourself, and they shoot someone and say, eh, the exception proves the rule. And then the voice of narration says, and then the exception became the rule. And then you get a montage of all the horrible things they start doing. So that's the slippery slope argument in general. Not always true, obviously. There are exceptions that we make that don't spiral out of control into...

eviscerating the original rule. It requires some judgment, but also just some experience in seeing what kinds of exceptions eventually spread far and wide. What I would say is that in the case of wokeness, it's one where the exceptions that started being made just expanded so rapidly, and in directions that would just have been confusing to almost anyone if you had forecast them 20 years ago.

If you just imagine going back in time 20 years and saying the following things will be reasons for a person to be shunned, you would just be like, what? The story, of course, is not that you just wake up and say we're gonna start shunning people for the following list of things almost everybody believes. Instead, you start with marginal cases, and then you shun some more and more and more, and finally you end up where we are.

Theo Jaffee (06:09)

Are you as worried, or maybe not as worried, but how worried are you about right-wing anti-wokes compared to the woke left?

Bryan (06:17)

Yeah, I'll say about as low as you can be while still being positive. I think they just have so little cultural influence. And the cases that people have pointed to of them abusing power, I think when you actually study the facts, I don't think it is reasonable to think of them as abusing their power. So Florida is probably the main case that people talk about. This is one where, look, you've got public schools, they've got a curriculum, and what's gonna be in the curriculum? Should it be...

a pile of woke dogma, or should it be regular stuff? And you're like, well, you can't do both. And choosing between those, yeah, I don't see why it should be woke dogma. In terms of any kind of censorship on college campuses, if you actually know how college campuses work, this is just absurd as a worry. It is such a remote possibility that anyone is going to have to worry about this in real life, right? You know, it's a big world, so you can find very isolated examples, but it's really rare.

And if you understand how universities work, and I do, because I've been in universities now for 27 years, the entire DNA of the system exists to go and promote wokeness and crush dissent. They have a bunch of rules that have hindered them from doing it, including, of course, tenure. Woo, tenure. Valuable for me, because I actually am a dissident, not necessary for the others. But in any case, one of the...

simplest examples to me is grievance studies departments. Many people feel like it would be tantamount to censorship to get rid of them. It's like, well, suppose we had departments of creation studies being funded by taxpayers. Would it be censorship to get rid of those? Like, no, I think it's a violation of the First Amendment that you have taxpayer support for them in the first place. When you have an academic discipline that actually is just dogmatic propaganda, where you cannot be a practitioner of the discipline while saying

highly critical things about it, then yeah, I don't think that there is an issue of academic or intellectual freedom. The issue is the other way around: taxpayers being forced to support a secular religion.

Theo Jaffee (08:25)

So what do you think about the relationship between nonconformism and Asperger's? Because Peter Thiel has said, you know, individuals with Asperger's have an advantage in Silicon Valley. And Elon Musk has said that he has Asperger's, and of course, he's wildly successful. So what do you think about that?

Bryan (08:42)

Yeah, great question. I would say two things. First of all, Asperger's people do not really need to think of nonconformism as a conscious philosophy, because they're doing it already. So in a way, there's this old line: I'm not here to help the saved, I'm here to save the sinners. So similarly, the reason you write a book about nonconformism is not primarily to go and tell people with Asperger's to stop conforming, they're already not doing it, but rather to go and get people, the vast majority, that

are paralyzed by fear of strangers judging them and point out that that is a silly fear to have in the modern world. See, the main reason why my book is useful for people who are on the spectrum is that I do emphasize being strategic about it and realizing a lot of times being nonconformist is fine or helpful, but there's other times when it is actually going to hurt you in real life and to recognize the difference between those cases.


Bryan (09:42)

Right, and as to how you would do it, I'd say step one is try small deviations and see what happens to you. Right, so start small, see if people freak out at you. If they don't, you can probably go a bit further. If on the other hand, the smallest deviation gets you crushed, that's a different story. So if you say, well, I'll just do a small deviation, I will refuse to do a foreign language in high school. Yeah, you might not even be able to graduate high school if you do that, sorry.

Theo Jaffee (10:07)

What do you think can be done to make people on the whole less conformist? Not like individual people, but society. Like, to what extent is this even possible and not baked into human nature?

Bryan (10:18)

Well, it does vary quite a lot between countries. If you go to Japan, I think they're obviously a lot more conformist than we are. I think they themselves will agree that we are less conformist than they are. So since it is something that varies, it can't be that everybody is always at the same maxed out level. Obviously, even in Japan, there's people that do things like dye their hair, and the first Japanese person to dye their hair was definitely not conforming, in a country where pretty much 100% of the people are born with black hair.

Let's see, so what can be done at the societal level? A lot of it does hinge upon individuals doing it. And if individuals do it, it becomes easier. So that would be where I would start. Probably, in terms of what arguments are the ones that are most helpful, the honest one is just saying, look, we have a lot of these emotions that come from our ancestral environment where we lived in bands of 20 to 40 people. And the modern world is so different from that.

Historically, there just wasn't any such thing as anonymity, and now anonymity is the main thing we have vis-à-vis almost every other person in the world. And then to say, well, we've got these emotions that don't really fit our modern environment, so you can either keep doing that stuff that doesn't really optimize for the situation we're in, or you can try to do something else. Obviously, it's really hard for people to go against very strong evolved emotions, but for that, you just say, like, just baby step it.

Just do a little bit. Just find some small thing where your emotions tell you to conform, but your reason tells you that you can totally get away with it. You actually want to do it. It will benefit you. And then just break from the mold to that small degree. And we'll start from there.

Theo Jaffee (12:05)

Do you think the average person would even respond to, like, an evo-psych argument like that?

Bryan (12:12)

The average person? No, I mean, of course, there's the general base rate of almost everyone being impossible to persuade of almost anything. So I just begin with that, all right? Then the next step is, all right, given that, what can be done? It's like, well, there's a subset of people that are a bit more flexible anyway. So out of people that are a bit more flexible, I mean, I would think that, I would say that out of people that are open to arguments of any kind,

appeal to Darwinian thinking is in a way one of the easiest because it's so widely accepted in principle. So you are starting with a principle that is widely known and accepted among people that would even listen to an argument. You say, well, you're alienating creationists. All right, yeah, I didn't think I was going to do very well with them anyway. And then, let's see, what was I going to say there? Oh, yes, and then it's also one where it's very easy to...

get people to see that introspectively this is correct. When you just say, well, suppose that you could go and get $1,000 by wearing an embarrassing shirt in front of a bunch of people that you knew would never know who you were. Would you do that? It's like, I don't want to. Yeah, but why not? It's like, well, because it's gonna hurt my reputation. We stipulated in the thought experiment that it won't really hurt your reputation. So does that make you feel better about it? It's like, I still don't really.

It's like, all right, well, but it's like a thousand bucks. How much could it really matter? How about 10,000 bucks? Will you wear the stupid shirt in front of a bunch of people that will never have any idea who you were in order to get $10,000? There's got to be some point where you would do it. But along the way you also learn, oh gee, I just care a lot about the opinions of people that I really don't have any good reason to care about. It's gotta be evolution here that is tricking me.

Theo Jaffee (14:01)

Was that the actual amount of money, by the way? Yeah. Yeah, OK.

Bryan (14:04)

No, no. I'm alluding to a famous experiment on the spotlight effect, where, as part of the experiment, they just made people wear a stupid shirt and then walk through a room. And then they asked, first of all, the person: how many people noticed your stupid shirt? And second of all, they asked the other people: did you notice the shirt? And there was a massive disparity, where people just thought others were paying a lot more attention to their shirt than they really were,

which is another reason, by the way, not to worry about nonconforming is that to a large degree you're just invisible. People are so caught up in their own heads and they're thinking about themselves all the time, it's just hard to realize how little other people are thinking about you. Once you realize that, it's very liberating.

Theo Jaffee (14:50)

So you talk a lot about stuff like, you know, focus on the truth, and don't let other people influence what you think unjustly, and quantitative decision making, and betting on your ideas, and a lot of things that remind me a lot of Eliezer Yudkowsky's rationalist movement. So how similar would you say your methods of rationality are to the kind of standard Yudkowsky LessWrong rationality methods? And secondly,

Bryan (15:05)

Mm-hmm. Mm-hmm.

Theo Jaffee (15:17)

What do you think about the rationality movement? Do you think they're true nonconformists or just kind of collectivists?

Bryan (15:25)

Yeah, so let me start with the second question first. I've got a very positive view of the self-styled rationality community. They've always done right by me. Sometimes it seems like they get a bit cultish to me and they get fixated on some strange ideas. But then again, if you go and compare them to almost any other group, then it's a lot less clear what's going on. In terms of my specific levels of agreement and disagreement.

So the most glaring one is I'm not really worried about artificial intelligence. I even have a bet with Eliezer on the end of the world on January 1st, 2030. And he's saying, well, it's not the end of the world, it's the end of humanity on the surface of the Earth. Oh, sorry, my mistake, I misspoke. But that bet says that if there's any human beings left on the surface of the Earth on January 1st, 2030, then he owes me some money. You might wonder, well, how?

Do you do a bet on the end of the world? And the answer is the person who is the optimist, namely me, prepays. And that's what I did. So I'm still feeling fine there. I think that Eliezer in particular and a lot of other people have just allowed their youthful fondness for sci -fi to carry them away on flights of fancy and paranoia. Obviously, they disagree. I don't have any really good argument to change their minds at this very moment or under these time constraints, but that's where I stand.
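The payment structure Bryan describes can be sketched as a tiny function. To be clear, the dollar amount and interest rate below are made up for illustration, not the actual terms of the Caplan-Yudkowsky bet:

```python
# Sketch of a "bet on doom": the optimist pays the pessimist up front, because
# the pessimist could never collect after doom. If humanity is still around at
# the deadline, the pessimist repays the stake with interest.
# All numbers here are hypothetical, not the real bet's terms.

def doom_bet_settlement(prepaid: float, annual_rate: float, years: int,
                        humans_survive: bool) -> float:
    """Return what the pessimist pays the optimist at the deadline."""
    if not humans_survive:
        return 0.0  # nobody is around to settle; the optimist is out `prepaid`
    # the pessimist repays the prepaid stake, compounded at the agreed rate
    return prepaid * (1 + annual_rate) ** years

# e.g. $100 prepaid, 5% annual interest, settled 7 years later if we're still here
payout = doom_bet_settlement(100.0, 0.05, 7, humans_survive=True)  # ~$140.71
```

The incentive logic is that the optimist's prepayment makes the bet enforceable in the only state of the world where settlement is possible, which is why Bryan, as the optimist, pays first.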

In terms of other things, I think that they are pretty crazy about polyamory too, as being something that is widely going to work out for people. I agree with Aella that there's probably five or 10% of human beings that are psychologically equipped to be happy doing this, but that leaves a whole lot of others who aren't. And especially I think that for families with kids, it's probably a really bad idea, unless you just don't care about

getting to raise your own kids, or you don't mind having them grow up in a broken home. I think that is actually really bad. Not, by the way, because I think that it messes up your future. I just think it messes up your childhood. It's just unpleasant for kids to have to be going back and forth between multiple homes and dealing with adults that are in conflict with each other.

I think that's another case where they are underestimating the power of evolution. I think jealousy has so strongly evolved, I think that most people just cannot get rid of it. And if you say, well, we'll all be rational about this, all right, well.

It's the kind of thing where people's emotional constitution generally doesn't actually adjust very well. It is very standard among practicing polyamorists to wind up saying, yeah, well, there was this period where we were totally lying because of jealousy, or the jealousy tore us apart. So I think that is probably another big issue. But, you know, overall, I've had great relations with the rationality community. They're fun people.

They're not very pushy, except on the AI risk, even there, I've yet to meet someone that yelled at me for not worrying about it, which is different from almost every other community that's worried about some terrible disaster, if you call it into question.

Theo Jaffee (18:35)

Do you think your arguments on polyamory apply just as much to kids living in more traditional societies? Where, in most of these traditional societies with polyamory, it's like one man and multiple women, and they all live in the same house. Do you think that kids there are also not well off?

Bryan (18:55)

That's a good question. So that's usually called polygamy rather than polyamory. There is quite a bit of social science of polygamy saying, well, a few things. One of them is, in very primitive societies, that's not really how it works. In very primitive societies, it's more like you just have pair bonding for two or three years while the kid is a toddler, and then the relationship dissolves. But if you live in a band of 20 to 40 people, you still see both of your parents, so you've got that going. You don't need to...


Bryan (19:25)

have a shuttle system between the huts of people who live within sight of each other. By traditional societies, you mean more of the ancient empires or something like that. Ones where the very most successful guys have had hundreds of kids, or I think there's one guy with over a thousand. It's definitely one where kids have very little contact with their dads. So there's that. Also, it's very noted in societies like that, there's just a lot of conflict between the mothers,

most grotesquely in things like the Turkish Sultanate, where there was a period when the Sultan's first job was to murder all of his brothers. Gruesome. Or mostly half-brothers, murder all your half-brothers. I think they even murdered the full brothers just to be safe. Anyway, that's pretty gruesome.


Bryan (20:15)

Let's see. And then the other major issue that most social scientists have had with polygamy is that if it's widely practiced, then it means that you've got a lot of guys who don't get to marry anyone and are left alone, and there's a lot of other side issues from that. You know, I think that in the modern world, I'm not at all concerned about polygamy becoming so widespread that we start seeing these negative consequences. I mean, we don't even see billionaires having harems in the modern world in the sense of...

They've got a bunch of women and they have kids with all of them and they all hang out. Elon is sort of the closest, but even he's not actually doing that. He's not really doing that. So I think that we are so culturally far from it. Let's see, like the Harvard anthropologist, Joseph Henrich, I think he's testified in some hearings that are somehow related to.

Theo Jaffee (20:50)

Yeah, I thought that too.

Bryan (21:09)

preventing the legal recognition of polygamy. And I think that's pretty paranoid too, this idea that it's going to lead to some horrible negative effects, because it's just a small fringe thing. You can imagine that it would spread, but I don't see it spreading. I think the main thing that has spread is just broken homes, not from polyamory, just from monogamous people who don't stay together.

Theo Jaffee (21:33)

So back to my first question on rationalism. How similar do you think your methods of rationality are to Eliezer's?

Bryan (21:41)

Hmm, let's see. I think there's a lot of similarities. I mean, I would say that I'm especially influenced by Phil Tetlock's Superforecasting, where a lot of his advice is start with base rates and then do adjustments up and down. Scott Alexander has this line, specifically about AI risk, where he says, well, this just leads to base rate ping pong, where I have my base rate, you have your base rate. My base rate is, like, the number of times that the world has ended,

and his base rate is number of times that a superior intelligence has come into contact with an inferior intelligence. This is one where...

In principle, you could sit around saying, oh, we can't figure out what the base rate is, but I don't think it's actually that hard in practice, unless one side is determined to go and get a certain kind of answer. So I do put a lot of reliance on base rates. A lot of my arguments with Tyler Cowen come down to, he'll say, oh, here's something that could happen. And it looks like that's never happened before, base rates say no. And he's like, well, but you're not engaging with the argument. And I'll say, yeah, well, you're not engaging with the base rate.

So I think the base rate is a lot more important. People tend to get really sucked in by the details, which leads them astray. Whatever you're going to tell me, I'm going to treat it as a...

modest adjustment of the base rate rather than something that's rocking my world and saying, oh my god, I can't believe it. I will pile on and just say that I haven't seen that Tyler has any great predictive abilities. Super smart guy, very knowledgeable, but in terms of saying anything falsifiable that's gonna happen before it happens, I think he's probably below average for a thinker, maybe above average for a human, but that's not his forte.

In terms of other methods, of course, Bayes' rule. This is a very big part of the way that I approach the world, as it is for anybody that's a fan of Tetlock as well. Just things like: you can see some evidence in favor of a view and rationally become less confident, because you were expecting to see even stronger evidence in favor of the view.

something that people have trouble with. But you see a headline and it says, you know, 100 people murdered by an immigrant terrorist. And then you say, well, but if we go and average over all the headlines of the past three years, it's only 200. And I think that a person that had a reasonable view would have thought it'd be 500. So actually, oh, this is in fact a reason to become more optimistic. Emotionally, of course, this drives people crazy, but the logic is completely sound.

You've got to specify, well, what did you think was going to happen? What would have been consistent with your view? The style of the normal person who just opens up the newspaper and says, see, everything I said has been proven. That is something that Bayes will stop you from doing because you'll say, well, wait a second. What would have to be on the newspaper headline for me to say that my view was disproven? What would it even look like?

You're always going to be able to go and find something that is an example of your complaint and then claim vindication, but that's ridiculous.
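Bryan's point that an alarming-sounding headline can rationally make you more optimistic falls straight out of Bayes' rule: compare what you observed to what each view predicted, not to zero. A minimal sketch, with all of the numbers invented for illustration: suppose the "big threat" view predicts around 500 deaths over three years, the "minor threat" view predicts around 100, and the headlines add up to 200.

```python
import math

def log_poisson(k: int, lam: float) -> float:
    """Log-probability of observing k events under a Poisson(lam) model."""
    return -lam + k * math.log(lam) - math.lgamma(k + 1)

def posterior_big_threat(observed: int, lam_big: float, lam_small: float,
                         prior_big: float = 0.5) -> float:
    """Bayes' rule over two hypotheses about the underlying death rate."""
    log_big = math.log(prior_big) + log_poisson(observed, lam_big)
    log_small = math.log(1 - prior_big) + log_poisson(observed, lam_small)
    # normalize in log space to avoid floating-point underflow
    m = max(log_big, log_small)
    p_big = math.exp(log_big - m)
    p_small = math.exp(log_small - m)
    return p_big / (p_big + p_small)

# 200 deaths over three years sounds alarming in a headline, but the
# "big threat" view predicted ~500, so the observation actually favors
# the milder view, and credence in the big-threat view collapses from 50%.
p = posterior_big_threat(observed=200, lam_big=500.0, lam_small=100.0)
```

Under these stipulated numbers the posterior on the big-threat view ends up vanishingly small, which is exactly the move Bryan describes: the headline is evidence, but evidence weaker than the alarmed view predicted counts against it.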

Theo Jaffee (24:45)

Yeah, so going back to what you said about base rates, where you said your base rate is that the world will not end, and Tyler, sorry, Scott Alexander's base rate was how many times a superior intelligence has come into contact with an inferior intelligence. Back in July 2022, for the audience, Bryan and I had lunch, and one of the things we talked about then was AI risk. And he mentioned his bet with Yudkowsky about how...

Caplan thought that it was not going to end the world. And since then, of course, ChatGPT has come out, and you've made another bet, which it looks like you're going to lose, about AI capabilities. So on that, I know you haven't made a huge update, but have you updated on AI risk at all?

Bryan (25:22)

Mm-hmm.

Yeah, of course, very slightly. So before, I was skeptical that there'd be an AI that would be able to get A's on my economics exams, and I did a bet on that. First of all, I went and gave GPT-3 my economics exams, and it got a D, after hearing a lot of people saying it's so incredible, it will blow your mind. I even had a friend say, oh yeah, it'll be able to get A's on your test. And it got a D. And I'm like, all right, well, they're overselling again,

just like the last hundred times they've oversold. So I did do a bet on that. And then when GPT-3.5 came out, it was able to get A's. So I will say, all right, that's considerably more impressive than I was expecting. The progress was a lot faster anyway. But there's still a world of difference between you can get A's on my econ exams and you're gonna destroy the world one way or another.

I mean, there I've just also had a lot of more particular arguments, like there's gonna be a kill switch, a lot of kill switches. It's not that human beings are just going to hand over the reins and let the AI do what it wants for itself. Then in terms of just the base rate for anything designed by human beings: often there have been things designed by human beings that have ended up being terrible for human beings, but only because some human beings consciously unleashed them on other human beings.

Which is where, I would say, almost all of the AI risk should reasonably be put. It's not that the AI will achieve autonomy and then will go and do bad stuff to us. Rather, it's that there's gonna be some humans that will say, help me come up with the best possible plan to go and kill as many other humans as I can. So that seems a lot more likely, which is what we've seen with almost all of the great technological achievements in the last 200 years. I think you'd have to be a fool to see electricity and then not wonder,

Could this be used for bad purposes? Yeah, of course electricity can be used for bad purposes. Of course mass production can be used for mass murder. Nuclear weapons can go and exterminate vast populations. But in all these cases, it is not that the technology takes over. It's that human beings do bad things with their tools.

Theo Jaffee (27:40)

And while we're still talking about rationality, what do you think are the best ways to check yourself, to make sure that you are being a nonconformist and not just a contrarian or a collectivist?

Bryan (27:53)

I think a big part is coming up with concrete tests of what's going to happen if you do something that is not conforming, and seeing what happens. So, I mean, obviously just applying simple rationality processes and saying just because most people think it doesn't mean it's false. So it's putting just little weight on the fact that something is a popular view, rather than putting negative weight. I think the contrarian is someone putting negative weight on a view's popularity: if other people think it's true, I'm going to think that it's false.

The rational thing is to say, well, I'm not gonna put a lot of weight on it, just because we know there's so many areas where human beings have embraced silly views. There's just a lot of popular views that are wrong. In a way, that itself kind of begs the question, right? Because, like, well, how do we know that there's so many popular views that are wrong? And that's something where I would just go case by case and say, well, here's a list of a bunch of things that are widely thought but turn out to be incorrect. And these are not just small,

cherry-picked or lemon-picked examples. These are pretty big examples of things that people are really wrong about and have been really wrong about in the past. And it's not that hard in hindsight to see that they're wrong.

Theo Jaffee (29:03)

By the way, this reminds me a bit of a Charlie Munger quote where he said something like, being a good investor requires the temperament that doesn't derive too much pleasure from either following the crowd or going against it.

Bryan (29:13)

Yeah, that's very reasonable.

Theo Jaffee (29:16)

So we talked earlier about social media with X, but another social media platform is LinkedIn. And I go on LinkedIn periodically, and I find that it sucks because it's conformist. And it seems like everyone on there is just trying to please other people. So like, do you think that there's a way to fix LinkedIn, to fix professional social media in general? Or is it just kind of a property of, like, professionalism that it ends up conformist?

Bryan (29:27)


Yeah, yeah, I think it is heavily a property of professionalism. An important thing to remember is that most original and creative ideas are terrible, especially on a practical task that a lot of smart people have been working on for a really long time. If it's, like, what's the best way to fly a plane? Well, you're not the first person to think about this, you know? There's a lot of really smart people, there's a lot of money on the line, and there's already immense selection pressure to do a good job on this. When someone says they've got it all figured out, they're probably incorrect. So that is one thing to keep in mind. Then, in terms of fixing it: if you know my book, The Case Against Education, I say a lot of what people are signaling with education, sure, it's intelligence, sure, it's work ethic, but a lot of it's just conformity. Just saying: I know there's no I in team, and I'm going to be part of the team, be a loyal member, I

will not rock the boat. Probably some of my best nonconformist advice, actually, is: focus on being friends with your boss instead of being liked by coworkers. This is one where it's like, oh, what kind of a suck-up are you? Well, it's a person who appreciates that the boss probably got there through their hard work and greater understanding of the field, and that they actually have a really tough job dealing with a lot of recalcitrant people.

Every manager has to herd cats, and I'm going to be one of the easy cats to herd, because I think I have something to offer this person, and if I do a good job, I think this person is likely to have my back. Another way of thinking about it: if you're a nonconformist, who is going to be easier to win over, a bunch of coworkers or one boss? It's going to be a lot easier to win over one boss. If it's just one person, this is someone where you clearly indicate: my loyalties are on your side, and my goal here is to be a highly useful member of this team. If you talk to almost any boss, they'll say, wow, I just need a lot more people like that. It's hard running this, because people complain so much and are so hard to please, and just don't appreciate that I'm in a tough spot. So just showing some empathy for a person who has to make hard decisions is actually nonconformist in a very deep way.

Theo Jaffee (31:59)

So the econ department of George Mason is full of nonconformists: you, Tyler Cowen, Robin Hanson, Alex Tabarrok. And famously, you're not just popular within academia, but outside it. Probably mainly outside it. So...

Bryan (32:03)

Oh yeah.

Yeah, you got that right.

Theo Jaffee (32:16)

How do you think this can be replicated at other schools? Like, I go to the University of Florida, and I can't think of any UF professors who are famous in the way that you and Cowen and the rest are.

Bryan (32:28)

Hmm, yeah, great question. I mean, a lot of it depends upon getting some people who have paid their dues and gotten the regular signals, who then are willing to take advantage of this crazy tenure system to do something cool. Unfortunately, there's just not that many people like that. Once you get one person to do that, then often you'll find that there are other people that were sympathizers and wanted to, but they were just too scared. So you need to get...

a focal individual who's willing to stick their neck out, which on the one hand is not as hard as it sounds because with a tenure system, they know they've got this massive job security. The real difficulty is that despite the incentives being in favor of nonconformism on that level, the system weeds out the nonconformists before they get there usually. So that is a big part of it. If you really wanted to go and foster it, the idea of...

having schools create independent centers of nonconformist thinkers. That's of course how grievance studies got off the ground: you go and find someone who says, my work isn't appreciated because I'm the only one who understands how fantastic Albanian culture is. Give me my Albanian Studies department, and then I can really do it. Unfortunately, that's a case where you're getting nonconformists who are definitely defying society, but at the same time,

they're really just wanting to create their own cult. It isn't like they want to have some very thoughtful exploration of all the possibilities or anything like that. In terms of where I would start, I would generally start with economics first, because economics does have this long tradition of being willing to entertain socially unacceptable hypotheticals and consider possibilities where other people just say, that's an evil thoughtcrime, don't think it. Secondly, honestly, philosophy departments: they are famous for hypotheticals, and while their discipline has gotten worse over time, there is still a sense that we can consider an idea without agreeing with it.

Whereas if you go over into your grievance studies departments, that is a really alien idea to them. Like, what do you mean we're gonna consider the possibility that actually there is not a lot of discrimination against African Americans? That's crazy, we all know there is. Yeah, but what if there isn't? Well, there is, so we're not gonna talk about it. And you, by wanting to talk about it, are an evil person.

It's like, oh, my mistake. So if you did want to go and foster this kind of thing, you'd basically need to find a few people that already fit the bill, give them some money, and then let them have independent hiring authority so they can replicate themselves. Not perfect, but I think it's the best formula for success.

Theo Jaffee (35:15)

Well, you talk about it like a formula for success and like a plan if you wanted to do this, but it seems like GMU didn't, you know, plan to have an econ department like this. So how much harder would it be to do it spontaneously?

Bryan (35:32)

I think actually it was planned. I've been around for at least half the life of GMU having any kind of a public profile. Basically, there were donors that wanted this kind of thing to happen, and they gave money so that it could.

I think the first big donation was to bring the Center for Study of Public Choice here, so to bring future Nobel Prize winner James Buchanan and his team here in 1983, if I'm not mistaken. Then there were further donations. There was another big donation to bring Vernon Smith's team, another future Nobel Prize winner. And by the late 90s, we were consciously talking about wanting to become the Hoover Institution of the East. So this was actually a conscious plan. And in fact, there really was one single individual who was at the epicenter of all of this, which is Tyler Cowen. He was the one that is great at bringing together donors and the existing faculty and new talent and making it all happen. So he deserves a ton of credit for that.

Theo Jaffee (36:20)

Hmm, I don't know.

So do you think that this kind of existing infrastructure of academia and tenure and donors matters as much nowadays? Like, you talk a lot about tenure and how great it is because you can research and write about what you want. But today we have people like Noah Smith and Scott Alexander who make a lot of money just writing on Substack.

Bryan (37:00)

Yep, so I'll say that it's great for me personally. I think it's a terrible system, actually. Tenure is a disaster. It has a few benefits that are swamped by overwhelming costs. So in no way should you think that I'm pro-tenure. I think tenure is terrible. Yes, but what I will say is that for people who want to do contrarian stuff but are risk-averse, or just don't have a ton of star-power charisma, it remains one of the best bets.

Bryan (37:28)

So Scott Alexander especially: he was able to get where he is through having this incredible personal charisma and an ability to create a new community almost out of nothing. But most people are just nothing like that, and would not be able to do more than eke out a meager existence on Substack or other kinds of social media.

It's great that they exist, and what they're doing is wonderful, and yeah, there's no doubt that what Scott's doing is way better than 99% of professors. However, I don't think there's room in the market for a thousand Scott Alexanders, the way there is for a thousand professors.

Theo Jaffee (38:08)

So on education: in A Portrait of My School, you talk about your ideas for how you'd run a school, but not a lot of specifics for curriculum. You mentioned reading, writing, and math. So a couple of questions. One, do you think computer science and programming should be elevated to the same level as math? And two, how would you scale this approach beyond five to 15 students?

Bryan (38:34)

I'd say it's reasonable to think about putting CS at the level of math, but in the end I wouldn't, because I would say, look, math is one of the things that you need for CS, but there's a lot of other things that you can do with math, whereas CS is something where, if you don't want to be a programmer, then the actual career value is not that large. I mean, I would say that if there's someone that is really good at math, has a good background there, and then when they're 18 they decide they want to become a programmer, they can do it.

On the other hand, if there's someone who does not do much math, and then they're 18 and they say, I want to get up to speed on doing enough math to do CS or engineering or physics or whatever, well, at 18, unless you're a complete genius, it's pretty much too late. It's just too cumulative. You've missed this critical window. It's just gonna be too hard to ever catch up. But yeah, it's very reasonable. And definitely, if we could say that you can do CS instead of a foreign language, that would be one of the best curriculum revisions that we could make, because I think a ton of people would rather do CS than foreign language, and they would get a lot more value out of it. I mean, it's very standard for people to spend two or three or four years of high school on foreign languages. Almost none of them learn the language to any remotely usable level. Even if they did, there's not that much use for it. CS, on the other hand: this would be giving useful job skills to a generation of students. So that would be a big improvement.

But I'm not quite sold enough on it to think that everyone should be doing it standardly.

Theo Jaffee (40:03)

Yeah, I mean, I've always kind of thought of it as, you know, at least on the same level as chemistry or physics, which every student learns in high school. Yeah, yeah. And when I was in elementary school, I remember hearing about like, oh yeah, we're all going to be learning about computers and computer science soon. Obama was talking about this 15 years ago and then it just never happened. So I went through elementary, middle and high school. I took two CS classes only because they were APs, but they're just not in the standard curriculum at all.

Bryan (40:09)

Oh yeah, yeah, of course. Better than chemistry or physics.

Oh yeah. What's going on is that curricula are very backwards-looking. In fact, if you want to understand the curriculum, it is best to remember that it all evolved out of a system that was designed to teach three things: law, medicine, and theology. This is what Anglo-American universities did for hundreds of years. They just taught law, medicine, and theology. And if you're thinking, wait, medicine? It wasn't until around 1900 that doctors started saving more people than they killed. Yeah, that's true. But still, they were teaching this crap for hundreds of years. Law, on the other hand: almost by definition, lawyers have to be effective, because they are the ones that are judging their own success, in a way. And then theology, again, my view is it's a fake subject. So it's not fake in the same way as early medicine, where you're actually killing people with it. But still.

So anyway, if you just realize this is what our system grows out of, then everything else pretty much just got tacked on afterwards. But really, the idea that we are here to train people in these three professions, the fingerprints of that are still on the system that we have. And so we have a lot of requirements that make very little sense in terms of the modern world. Basically, they make very little sense forward-looking, in terms of what will be beneficial to the student, but make a lot of sense backwards-looking, in terms of we've always done it that way.

Theo Jaffee (42:03)

And then.

Bryan (42:03)

It's a little complicated, because modern sciences weren't taught until the late 19th century; math was. But the idea that you would put modern science in the curriculum, I think that starts with the top German universities, then spreads to Johns Hopkins, and then moves over to the rest of US academia after that.

Theo Jaffee (42:23)

So you also mentioned that this approach to education is only for people who are already interested in it and have the aptitude for it, and it would only be 5 to 15 students. So how would you scale this approach beyond that? Can you? Or would you have to do something totally different?

Bryan (42:38)

Well, not totally different. I mean, I would say that when you have kids that just lack any intrinsic motivation, this is where you really need to do some soul-searching and say, why do I want to make them do something when they have no intrinsic motivation? The good answer is extrinsic motivation: because they're a child, they don't understand what the labor market is like, and you don't want them to grow up unable to take care of themselves. So for things where you have very strong evidence that it will be a severe handicap to them to just let them do whatever they want, that's when I think it is a good idea to push it on them whether they like it or not. On the other hand, there's a lot of things that we do in school right now, where we push kids, and you say, well, why do they need to know this? Well, we don't know, much beyond we've always done it that way.

I've got an essay just called Unschooling Plus Math, where I say that there is this homeschooling philosophy called unschooling, which almost everyone thinks won't work at all, that it'll be a total disaster. There are defenders who say, no, it's not a total disaster. I think they're right. But there is one notable deficit that I have seen unschoolers have, and the little data that we have is consistent with this, which is that unschoolers are deficient in math, because very few people get intrinsic enjoyment out of math, and yet it is so vital for so many high-status occupations. So I say, look, if you're willing to do unschooling with the tweak that every day you do have to do an hour or two of math, then I think that solves most of the problems with unschooling.

Theo Jaffee (44:02)


I wonder, and maybe I'm just speaking anecdotally, but for me, going through elementary and middle school, I hated math. I could not stand algebra and geometry and algebra II. But when I got higher up into calculus, I started to really like it.

Bryan (44:17)

Mm-hmm. Yeah, that's like one person in a hundred. So that's a great kind of person to be: I didn't like the boring easy math, I only liked the hard stuff. But normal people just don't like it, and it's not because it's too easy, it's because it's too clear when they're wrong. It's so depressing. Math is the opposite of the labor theory of value: you can put a hundred hours into a math problem, and if it's wrong, it's wrong. It doesn't matter that you tried hard. That's a lot of what's so bitter about it. And there's also just no room to say, well, there's some sense in which I'm right, which you'll see in almost all the humanities. Whereas in math, no, there's no sense in which you are right. You are just wrong.

Theo Jaffee (44:54)

Yeah. So to what extent do you think people living under totalitarian governments should be non-conformist? Like...

Bryan (45:04)


Yeah, great question. It's likely to get you killed, so that would be a reason not to do it, definitely. It's one where you need to be a lot more careful, because by definition, totalitarian regimes will harshly punish you for very minor deviations. Even there, I would say that you can't really survive in most totalitarian regimes, maybe any of them, without having enough nonconformism to say, wait a second, I'm gonna die if I don't break the rule, so I gotta figure out a way to somehow weasel my way out of this, whether that's avoiding being sent to the Eastern Front to fight during World War II, or getting an illegal job, or breaking rules against corruption in order to get enough food to feed your family. So you could not be a full conformist and survive in totalitarian regimes.

Unless you happen to be born into a ruling family or something like that, where you're taken care of and you're never given a dangerous job and you've got plenty of food and all that other good stuff. But otherwise... So I'm thinking here about North Korea: after the Soviets withdrew their subsidies in the early 90s, they had a massive economic collapse. Their whole economy was based upon getting a bunch of subsidies from the Soviets, which they no longer had.

And then there's the question of, well, what do we do about all these people who are working in a fully 100% government-owned economy? And the answer was, well, let's see, we're running short, so we're going to fire them. And then what are they supposed to do? The answer was: no answer. What happens if you're in a fully state-monopolized economy and you lose your job and they don't give you a new job? Either you starve to death or you work illegally. There's a great book called, let's see, Nothing to Envy, where they go over the plight of North Koreans who lost their jobs during this period. And yeah, it's like, well, I can either get caught being a black marketeer and get sent to the slave labor camp or executed, or I can starve to death. I guess I better take my chances with the slave labor camps, and maybe I can make enough money to bribe my way out. So you do need that. But obviously, totalitarian regimes are very harsh on people that stand out.

So, you know, in a way they exemplify the otherwise irrational fear that most people have that if you do anything different, society will crush you.

Theo Jaffee (47:36)


In your essay Natalism as Nonconformism, you wrote that one of the most important things you can do, both in general and as a nonconformist, is to have kids. Israel has a total fertility rate of 2.94, which is not only much higher than any other developed country, but actually higher than it was in 1989. You mentioned religiousness and secularism a little bit, but...

Theo Jaffee (48:03)

Can you go into a little bit more detail about how they managed to do this, and what other countries can learn from it?

Bryan (48:05)

Right, so I'm not an expert on Israel. I think a lot of it actually is just exponential selection, where the high-fertility groups in the country, namely the ultra-Orthodox, have just become a much larger percentage of the country. As long as every subgroup sustains its fertility rate, and the high-fertility subgroups are much higher than the others, so that they're a rising share of the population over time, then it almost follows as a matter of pure arithmetic that your overall fertility rate will go up. So I think that's a lot of what Israel did.

People also talk about things like just having a very pro-natal attitude, so that's probably something too. Even there you have to wonder, well, isn't that really just a reflection of the fact that they've got so many kids and so many large families? I mean, just to be clear, as I say in that essay, I'm not claiming that natalism in general is nonconformist, because if you're in a highly natalist subculture, then the conformist thing is to be a natalist too. Rather, what I'm saying is that if you're in a typical first-world country, where we have very strong anti-big-family norms, that's where you need to be a nonconformist in order to have a lot of kids.
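The selection arithmetic Bryan describes can be sketched in a few lines of code. To be clear, this is my own toy illustration with made-up numbers, not Israeli data and not anything from the conversation: two subgroups each keep a constant fertility rate, yet the population-wide average rises purely because the high-fertility group becomes a larger share.

```python
# A toy model of "exponential selection": even if every subgroup's
# fertility rate stays constant, the population-wide average rises as
# the high-fertility subgroup becomes a larger share of the population.
# The TFRs and shares below are hypothetical illustrative numbers.

def average_tfr(shares, tfrs):
    """Population-weighted average total fertility rate."""
    return sum(s * t for s, t in zip(shares, tfrs))

def next_generation(shares, tfrs):
    """Each subgroup grows in proportion to its own fertility rate."""
    grown = [s * t for s, t in zip(shares, tfrs)]
    total = sum(grown)
    return [g / total for g in grown]  # renormalize to shares summing to 1

# Two subgroups: a low-fertility majority and a high-fertility minority.
tfrs = [2.0, 6.5]          # constant within each subgroup
shares = [0.90, 0.10]      # initial population shares

for gen in range(4):
    print(f"gen {gen}: shares={[round(s, 3) for s in shares]}, "
          f"avg TFR={average_tfr(shares, tfrs):.2f}")
    shares = next_generation(shares, tfrs)
```

With these hypothetical inputs, the weighted-average TFR climbs generation after generation even though neither subgroup's rate ever changes, which is the "pure arithmetic" point.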

Theo Jaffee (49:24)

So in A Conservative Confession, you talk about hedonic adaptation. Are you at all worried about a future where we'll figure out how to do something like wireheading, directly affecting our brain's reward system, and then essentially running out of hedonic adaptation?

Bryan (49:31)

Mm-hmm.

Or is it sort of the other way around? Wouldn't it just be that we'll make ourselves hedonically adapted to whatever we've got? Isn't that really the worry?

Theo Jaffee (49:54)

No, the worry is essentially that we'll put ourselves on, like, a heroin drip, and live, you know, kind of terrible lives.

Bryan (50:02)

Oh, okay. So you're talking about right now you achieve something, you feel good for a bit, but then you're motivated to go achieve another thing because the thrill wears off. Yeah, okay, I get it now. I guess I would say I am a little bit worried about that. I mean, it's the kind of thing where evolution will save us in the end if that happens because the people that go on the heroin drip will have no children and they will be wiped out and the people that remain will be those that had an aversion to it.

Bryan (50:29)

It might be that we first have to have our population fall by 90 or 99% in the extreme scenario before we reverse, and then we're replaced by people who are so horrified by the idea of a heroin drip that they just won't do it. So I think that is the long-run answer, but yeah, obviously having a period of a few hundred years where we go into a massive decline is somewhat worrisome. I mean, I will confess that I usually just don't worry that much about...

things that are over 100 years out, because I just figure the world and the future will be so different, there's really not much that we can do about it. And in fact, when someone starts talking about, I'm gonna get ready for the world in 100 or 200 years, my thinking is, it's probably more likely than not that you're just gonna make things worse. You're more likely to try to crush progress and hold it off than you are to

Theo Jaffee (51:02)

In the long run we're all dead.

Bryan (51:23)

create a fertile groundwork for further progress. That seems pretty remote. And if you just think about someone in 1800 saying, what can we do in order to get lots of progress here? What would they even have thought back then? I suppose there were a few Enlightenment figures who would have said, well, we just need to have a lot of freedom in order to explore new ideas, and we need to make sure that industry is not overly burdened with regulations, so they can implement the ideas. So there'd be some people like that. But I think that anyone who had anything much more specific than that would have just been messing things up, probably.

Theo Jaffee (51:57)

In your essay Bioethics: Tuskegee vs. COVID, you talk about the problems with bioethics. This was probably my favorite essay in the compilation, by the way. So some people, and I've been hearing this a lot recently, say we should entirely abolish the FDA, because even though that will almost certainly lead to some problems, on balance it will be a tremendously good thing, since no amount of regulation is worth blocking transformative med tech that can cure cancer or something from getting to market. So what do you think about this idea?

Bryan (52:11)

Mm-hmm.

Yeah, I'm all in favor. I've been an FDA abolitionist for a long time, really since my senior year of high school, when I had never heard anything other than arguments in favor of the FDA. I really was actually brainwashed in my history classes about how there was this whole horrible period before the FDA, where pharmaceutical companies were killing people left and right, and then finally wise government came and established it, and now we're protected, and the only danger is that we might not be protected enough. This was actually explicitly taught. It was in the curriculum.

Then in 12th grade, I read some economist saying, well, you realize, if there's a drug that saves 10,000 lives a year and the FDA delays it for seven years, then you've killed 70,000 people. And when I read that, I'm like, hmm, I don't see any way around that argument. That is about as good as any argument could ever possibly get. And then the question is, how many lives are being lost by approving drugs too soon? And looking there, it was hard to come up with very much. Thalidomide is the drug that most people point to as showing that we need the FDA. But the main story there is that thalidomide was approved in the UK, and then people discovered that it caused a whole lot of birth defects. In the US, it was not the FDA that caught it; it was another country with lighter regulation that allowed us to catch it. Otherwise, it probably would have been approved here too.

Funny footnote: it was finally approved as a treatment for, I believe, leprosy. Just don't give it to pregnant women, because then it will still cause horrible birth defects. But anyway, this case against the FDA seems very strong to me. And you just see the asymmetric response: someone was killed by an approved drug, well, you have to change everything; whereas people lose their lives because they had to wait for a drug, well, that's not even a thing.

Bryan (54:23)

Like, during COVID, I was gratified, at least, and kind of amazed at how quickly the drugs were approved, because my friends would say, oh, everything's gonna be great, and I'm like, look, even if we get drugs that totally work, which itself is a good outcome, an above-average outcome, how do you know they're not just gonna be held up for years? And this was a case where suddenly people woke up to: yeah, I guess if we delay it, then it's gonna kill a lot of people.

You know, combined then with a lot of unfair demonization of normal, cautious people who say, how do you know it won't have bad side effects in five years? And the honest answer to those people is: yeah, we don't. We'd have to wait five years. But we're losing a lot of people now, and we're just gonna gamble that it's actually going to be a net positive, probably a big positive based on historical experience, but no, we can't prove that you're wrong. That would be the honest thing to say, but obviously, in politics, honesty is in very short supply.

Theo Jaffee (55:20)

So I think we have time for one last question. You've written a lot, I think something like over 2,000 essays on EconLib, a bunch more on Substack, and several books. So what advice would you have for writers, other than just read a lot and write a lot?

Bryan (55:22)

Sure, sounds good.

My honest answer is I don't even feel like I'm that hard of a worker. I feel like I'm daydreaming a lot and goofing off a lot. The main thing I can say is that every day I get something done. I get something done every day, and that just adds up. Do 20 years of chipping away. This is the plot of The Count of Monte Cristo, right? Every day the guy chips away a little bit from his prison cell on this island off the coast of France, and eventually he escapes.

Similarly, if every day you just get a little bit done, it adds up to an enormous amount. I am honestly puzzled by all the people who are tenured professors who have so little output. It's like, what do they even do all day? If I can get this much done while goofing off this much, what are they doing? In the end, I'm kind of puzzled. Part of me thinks, are there just lots of people who are horrible alcoholics and drug addicts or something? And...

They just do the bare minimum, and otherwise they're putting all of their energy into their vices. Is that why they don't get much done? Or are they putting a ton of hours into their teaching, even though their teaching doesn't appear very good either? They just sort of spin their wheels a lot. I am mystified about what other people are doing that leaves their productivity so low. In terms of how you can get motivated:

For me, a lot of motivation comes from my iconoclasm. I really don't like hearing people say things that I think are false or dubious in a giant self-righteous tone. It motivates me to argue the other way. And especially if you are an iconoclast like me, there really is a lot of low-hanging fruit: ideas that are true yet barely discussed, because most people are too afraid to write about them. So, like that piece on bioethics and Tuskegee, right? It's pretty obvious when you read it, but I think most people would be like, look, we can't possibly talk about Tuskegee as if it wasn't the worst thing that was ever done. And it's like, well, look, obviously it wasn't the worst thing that was ever done, because there are just much worse things that have been done. I'm gonna say it's way worse to kill a million people than to give a horrible disease to 200 people, right? But...

To say that, it's like, oh God, we can't possibly say that, this is sacred. And it's like, who says it's sacred? I mean, I can understand why you might not want to talk about it at work, assuming your coworkers even know what you're talking about. But for someone like me, with tenure, why not stick my neck out and just say what I think is correct? There's always this fear that eventually you're going to be crushed. I actually do have a bet with someone who says that eventually, in my own mind, I will declare myself to have been treated very unfairly by my university. So far, so good.

Theo Jaffee (58:36)

And for individual essays: you just talked about amassing lots of creative output, but for each individual essay, do you have any specific advice?

Bryan (58:46)

Yeah, well, here's a lot of my advice: anytime you get an idea, instantly write it down, because you're going to forget. I have a queue of hundreds of ideas. Normally, I just put down a title. Sometimes the title isn't clear enough, so I write a sentence or two to remind myself what the idea was. And that means that I never have any issue with I-can't-think-of-anything-to-write-about. I have the opposite problem: I have way more ideas than I feel like I would ever have time to write about. But I just try to keep refreshing the queue and adding more in, so that I've got a good set of choices.

A lot of where I get my ideas is just iconoclasm, where I see something and I say, huh, well, that sounds wrong. Yeah, but people would be upset if you said it. Huh, well, in that case, it probably hasn't been said by anybody yet; I can't remember anyone saying it before. All right, then I'll be the person that says it. It is in a way scary to me how often I can quickly become the number one Google hit for anything I care about, because it just shows that what I care about is stuff that most other people don't even want to talk about. It's not bragging, it's just what I care about. Often, if there's anyone interested in it, it's only the audience that I make, because otherwise it just didn't exist as a topic.

Theo Jaffee (59:52)

All right, well, thank you so much, Bryan Caplan, for coming on the show.

Bryan (59:58)

I'm very happy to, and let me let you know that you can get this new book, You Will Not Stampede Me: Essays on Non-Conformism, for just 12 bucks as a paperback on Amazon, or $9.99 for the ebook. I've also got four other books of my collected essays already out there, available for the same price, and three more coming. And then I've got all my other books, including my New York Times bestseller Open Borders. And on May 1st, I've got my second graphic novel coming out, Build, Baby, Build: The Science and Ethics of Housing Regulation.

So I'm really excited about that. The book looks fantastic. It took longer than I thought it would, but I stand by the product. It's great.

Theo Jaffee (01:00:35)

Alright, looking forward to it.

Bryan (01:00:37)

Okay, thanks a lot. Great talking to you again.


This transcript was generated automatically with Riverside and probably contains lots of errors.
