
Eli Dourado

Hard tech, growth, societal collapse

Timestamps

(0:00:00) Intro

(0:00:43) Non-TFP econ metrics

(0:04:59) Are ideas getting harder to find?

(0:06:52) Economics as a field

(0:10:43) Tyler Cowen’s influence

(0:13:00) Culture and economic growth

(0:16:13) Uniqueness of the United States

(0:19:43) Education

(0:23:54) Revisiting “notes on technology”

(0:37:28) Investing in hard-tech

(0:44:30) Government’s role in tech

(0:49:22) Talent

(0:54:34) NEPA

(0:56:56) AI

(1:01:20) Societal collapse

(1:09:54) Advice to grow the economy

Transcript

[00:00:09] Dan: Welcome. I assume that many listeners of the show are familiar with the fact that American economic growth began to stagnate in 1973. This is probably one of the most discussed topics in tech and econ circles, and my guest today is an expert on the issue. Eli Dourado is the chief economist at the Abundance Institute and one of the most interesting thinkers and investors in hard tech. We talk about whether ideas are getting harder to find, culture's influence on economic growth, and Joseph Tainter's theory of societal collapse. I hope you enjoy it. Let's jump right in.

All right. I'm here today with Eli Dourado. Eli, welcome.

[00:00:41] Eli Dourado: Dan, it's great to be here.

[00:00:43] Dan: Great. First question: what is the most important metric you think about outside of total factor productivity? Maybe another framing of this: is there anything that you would not sacrifice for us to achieve 2% total factor productivity growth?

[00:00:58] Eli: Yes. Much more important than economic growth, actually, is human rights and other kinds of decency and well-being; I would place those higher than economic growth on my hierarchy of needs. That does matter more to me. I think we've attained a reasonable level of that, and I think they're not really in conflict with each other. At the margin, the more important thing for most US-based people is economic growth, but I definitely wouldn't want to sacrifice the more civic qualities that we already have. They could improve, but they seem good enough. At the margin, I think it's more important to have economic growth, but I wouldn't sacrifice those basic freedoms, freedom of speech, et cetera.

[00:01:57] Dan: Yes. Yes. That makes total sense. What about something quantitative, specifically from economics? Are there any economic indicators where you think, "Hey, if we're going to shoot for 2% TFP growth, we should be watching this one and make sure it doesn't get impacted"?

[00:02:10] Eli: Yes. Sometimes people talk about inequality and they think of growth and inequality as being at odds. I happen to think that's not true at all. If you look at the data, inequality has been increasing in the low TFP growth era. You see massive inequality in TFP growth by industry, which implies also by geography. We have a huge tech boom in San Francisco. People in Arkansas are not necessarily doing so well from that and the remedy is to increase TFP growth across all industries. That actually improves inequality, but I would worry if we were getting some sort of boom that was leaving people behind. That would concern me significantly.

Yes, you don't want to have growth without it being broad-based, or with it being less broad-based than it was before. If it were the case that we were getting huge TFP growth but, say, 20% of the population was just worse off than before, that would trouble me quite a lot.

[00:03:29] Dan: What's the right way to think about inequality? Let's say that the wealth of everybody goes up, including at the bottom quartile as well, but inequality increases. Is that a good thing, is that a bad thing, how do we think about that?

[00:03:40] Eli: It's fine on its own if everyone's going up. The thing that worries me is political stability. This is something I even wrote about in my dissertation. People have to feel like they're getting part of the surplus of being members of society. If they don't, that's when you start getting people concerned, or it starts raising concerns, for me, that people might defect from the system. That can take a number of forms: apathy, or just outright hostility to the existing system. That's actually something I worry we're going through today: we haven't delivered growth, and we definitely haven't delivered growth in all parts of the country, all parts of the income spectrum, all industries, and so on.

There's a fraction of the population, I don't know how big it is, that doesn't care about our society, about the common good and that worries me. I think of that as a reason to have more growth. I don't think of it as at odds with broad-based TFP growth, but I can imagine a scenario where they are at odds and that would concern me.

[00:04:59] Dan: Got it. Are ideas actually getting harder to find?

[00:05:03] Eli: Okay. I think you probably know my general take: whether or not they're getting harder to find, that's not the reason we're stagnating. I'll just preface it with that. I think they are getting harder to find in some fields and easier to find in others. In fundamental physics, it's probably getting harder. There still is new physics to discover, but it takes big teams and billion-dollar pieces of equipment and so on to make progress. In other areas, say biology, which is the one I feel most strongly about: if you think about what a graduate student in biology, by themselves, in a normally equipped lab is able to do today versus what the best teams of well-funded senior biologists could do 50 years ago, the grad student today can do more: sequence DNA, edit DNA, create new synthetic organisms, et cetera.

In biology, ideas are getting easier to find. In physics, they're getting harder to find. It just depends on the tools being developed and the new levels of abstraction they're generating. When new tools serve as platforms, as a new abstraction, you don't actually have to learn how to do a million things to unlock the new capability; those are areas where ideas are getting easier to find. Software, too: most computer programmers today do not need to know how a computer works. You abstract away all that burden of knowledge, and it's easier than ever for them to make progress.

[00:06:53] Dan: Economics as a field. I think many people have a feeling that it's gotten a lot weirder over the past few years. COVID broke a lot of people's general models of how the world should work. I'm just curious what you think of the field of economics. Do you view it in higher or lower regard than you did pre-COVID?

[00:07:09] Eli: I've been on this interesting path of being trained as an economist and coming to identify less and less with the field over time. When you're in grad school, getting a PhD, you really want to be accepted as part of the field by your peers and by its senior members. More and more, I just say, "Don't put labels on it," and, "I'm going to do what I find most interesting. Whether or not it's economics is irrelevant to me." I'm not very self-conscious about the field anymore. I'm curious: what do you think has gotten weirder about it? Give me your case.

[00:07:48] Dan: Probably the most salient one is just inflation and government spending during COVID. For basic models of how the world should work, it just seemed like nobody was very good at predicting how the economy would behave. Housing has been very weird. There are several of these things where I think the markets are just behaving differently than people would have expected.

[00:08:08] Eli: Okay. I was thinking you meant more academic economics, the profession with a capital P.

[00:08:16] Dan: That works too. Yes, that's interesting as well.

[00:08:19] Eli: I think the thing that has been underrated by economists doing forecasting is demographics. The problem, say 20 years ago and for a few years before, was the global savings glut. You had very low interest rates and so on. That's an artifact of the baby boom and the fact that a high percentage of the population was in peak earning years and also peak saving years. If you're earning a lot and saving a lot, that means you're producing a lot and not consuming as many goods, so there's this glut of goods on the market and everyone wants to save. That's going to be a period with low interest rates, which penalizes saving.

Now, we're through that. If you think about these multi-decade-long trends, when the baby boomers retire, that shifts the trend pretty significantly, because all of a sudden, the big demographic hump is no longer producing a lot and saving a lot. They're producing zero and spending down their savings. You'd predict that would have a big impact on interest rates and inflation, and we'd no longer be at the point where you could just keep interest rates low forever and not get inflation out of it. I think that may be what's going on, but, A, I'm not 100% confident that's the main explanation, and B, very few people were building that into models.

C, you're right, it's crazy [unintelligible 00:10:10]. If this keeps going, with the size of the federal budget deficit and the increased expense we're going to have supporting old people over the next 10 years or longer, people are losing their minds, and I don't see how we can support it. Then again, if you had told me 20 years ago the path of deficits we would be on, I would have said, "That's not sustainable. That can't happen." We've been continually surprised by the size of the deficits.
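The demographic mechanism Eli describes can be sketched as a toy model; the cohort sizes and per-person saving figures below are invented purely for illustration, not numbers from the episode:

```python
def net_saving(cohort_sizes: dict[str, float]) -> float:
    """Net saving: workers each save a fixed amount; retirees each dissave."""
    save_per_worker, dissave_per_retiree = 10.0, 8.0
    return (cohort_sizes["workers"] * save_per_worker
            - cohort_sizes["retirees"] * dissave_per_retiree)

# A large cohort in peak earning/saving years: many workers, few retirees.
glut_era = net_saving({"workers": 100, "retirees": 40})
# The same hump retired: fewer workers, many retirees spending down savings.
retired_era = net_saving({"workers": 60, "retirees": 100})

print(glut_era)    # positive: excess saving, downward pressure on rates
print(retired_era) # negative: net dissaving, upward pressure on rates and prices
```

The sign flip with no change in individual behavior, only in cohort sizes, is the point: the same life-cycle saving pattern produces a savings glut in one decade and net dissaving in the next.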

[00:10:44] Dan: We're going to talk a bit about your background in academia. I understand, anyway, that Tyler Cowen was your PhD advisor.

[00:10:50] Eli: Yes.

[00:10:51] Dan: Where did he most influence you?

[00:10:53] Eli: Oh, gosh. So many ways. I think the number one thing is having multiple models of the world in your head at the same time and having a portfolio approach to thinking about the world, not holding any one of them too strongly, being able to see multiple sides of a question. It sounds basic, but a lot of people don't do that, and Tyler's very good at asking the right question to make you do that, be able to explain all the sides and actually see them as having some value. Even if it's your minority position or whatever in your portfolio in your head, it has some value and it has some explanatory power and you don't want to dismiss it. That open-mindedness is really important.

The value of studying widely, of bringing in models from different parts of economics and from outside of economics, and seeing the applicability of those models to the question that you're interested in. Tyler, a lot of his early work was influenced by finance models. He was thinking about Fischer Black and stuff like that. Of course, I've read all the Fischer Black stuff and it became influential for me as well, but thinking about, "Okay, how does this influence questions of macroeconomics if you have a finance underpinning behind your thinking?" Then just completely other fields that you got to go think about in terms of those different inputs. I would say those are some ways. Plus, he's just a very kind and very-- he's a student of human nature. If you're working with him, he will study you. He will develop a model of who you are. He's, like I said, very kind. Always had something constructive and helpful to say.

[00:13:01] Dan: Yes. That last piece, the older I get, the more I realize how important that is in whoever you work with.

[00:13:06] Eli: Yes, exactly.

[00:13:07] Dan: That's great. We're going to talk a bit about just technology and regulations as drivers of economic growth. The one driver I wanted to ask you about is culture. How much weight do you place on cultural factors in driving economic growth?

[00:13:23] Eli: Certainly, it has an impact, and I don't know how to operationalize it. For one, culture seems to be influenced a lot by material factors, so economic conditions. Even the style of music: you have economic growth in the '80s, you get poppy, happy music. Then there's a depression or recession in the early '90s and you get grunge. I think there are some inputs there. The other piece you could look at is that we were growing pretty fast from, let's say, 1920 through the early 1970s. I do think part of what led to stagnation in '73 or so was that by the early to mid-1960s, culturally, we started to, A, take the economic growth for granted, and B, view it as somewhat fake.

Even Ralph Nader's critique is like, "Oh, it's unsafe." It's like, "Yes, we're growing, but it was unsafe." He probably didn't say it this way, but I would put it, "Once you account for the unsafety, if you factor that into economic value, we're not growing as fast as we think we are." Then you have Rachel Carson and Silent Spring. This is another way in which the growth is fake, because we're not accounting for the decrease in biodiversity or in the environmental amenities that we have. If the culture starts to have maybe even some valid critiques of the growth that you're having, there's just going to be a backlash and a push for increased attention on those parts of the notional growth that they say you're not getting. I think that's maybe what happened.

I don't know how to measure culture; it's so multidimensional. It's not just more or less culture, it's culture in all these different dimensions. Certainly, if you look internationally, cross-sectionally across countries, the US has a certain culture that does seem well-suited to entrepreneurship and other kinds of activity conducive to being at the frontier, whereas other cultures are either more inherently pessimistic or more deeply cynical or something that makes them not well-suited to that.

[00:16:14] Dan: Yes. I actually had this question ahead of time, so it's good you led into this: why has there only been one United States? It's not like the US is perfect, but if you look at human history, our story of growth would probably look very different without the US, and it seems to be an outlier. It's not totally clear that someone else would have stepped in to take its place in the same way that we did. That almost leads you to believe that either a positive growth sentiment is not part of human nature, or corruption is part of human nature, all of these things that the majority of countries deal with heavily. Why do you think there was only one?

[00:16:46] Eli: I think the US benefited a lot from geographic isolation, being so far away from any threats from peer countries. The history of Europe is they're all invading each other all the time. They develop in that way. That means they do need more centralized government. When you have a wartime footing, you need centralized command and control. Maybe you don't, but let's at least go with that. You probably need more centralized command and control than you would otherwise, at least. The US hasn't had that need because there hasn't been a real threat of invasion. It's been less centralized command and control. Parts of the American West, for most of American history, have been ungovernable basically from Washington.

Tyler has this paper about whether technology drives the growth of government. Without computers or whatever, you wouldn't be able to keep all the records you would need to have a large administrative state. The US being, for most of its history, basically ungovernable created a unique culture that valued the individual and the contributions of the individual. Whereas in Germany, okay, they have these train systems because they have to be able to move war materiel to the front lines or whatever, the US is just, "We can't control what happens in most of our country." I think that's a big part of it: when you have centralized control, it's much higher variance.

You can have really good outcomes that come from central direction, but you can also have these disastrous centralized policies. The US, by taking a more decentralized governance path, has just chugged along at a more or less constant rate without any major setbacks. That, I think, maybe has contributed to it. The other thing is that there is path dependence here. There are some recent papers that came out, I guess last year, NBER working papers on zero-sum thinking. The conclusion of the papers is, "Okay, positive-sum attitudes and thinking are a causal factor in economic growth, but the causation also runs the other way." If you grow up in a stagnant or declining economic environment, you're more likely to be a zero-sum thinker, because if the economy is in fact zero-sum and not growing, then you're going to be a zero-sum thinker. Once you're locked into those thinking patterns, it's hard, socially, to get out of them.

Anyway, I don't know if that's a complete answer to your question, but those are where my mind goes when you ask it.

[00:19:43] Dan: Yes. I'm curious about education. How important do you think education policy is for the eventual goal of increasing total factor productivity? It strikes me that it's now becoming pretty easy to teach yourself very high levels of any technical thing you want to learn online and you could just meet people online and get engaged in communities. It's a lot easier. Is education policy going to be more or less relevant over the next few years?

[00:20:08] Eli: The funny thing about education is it probably hasn't been about conveying information for a long time. Even say 50 years ago, you could get a library card and go down to the library and get any book you wanted. Once you learn to read at least, you could, in principle, go down and read the books, read the journal articles even, and teach yourself anything you want to know. I think a lot of education is really about motivating students. Coaching them, keeping their interest levels up. Another important component of education, of course, is babysitting. The reason we have schools in large part is because parents have to work.

The other finding, and I think it's pretty robust, is that we're not necessarily teaching in the most effective manner. The most effective manner is tutorial-style with mastery-based learning. If you had something like that, you could maybe get much better outcomes. We're not, in general, doing a good job. The students are, for the most part, meeting grade-level expectations. But if you then go back and survey American adults, something like over half of US adults, I think 53% was the latest data, don't read at a sixth-grade level, which would mean being able to read multiple texts, compare and contrast the sources, and draw conclusions from the different things they read across those sources.

If you're thinking that's the standard, everyone should read at the sixth-grade level, we're not doing it. We're not illiterate, we can read, but we can't-- Obviously, students are attaining those standards as they're going through school, but then once they're out of school, it goes away. One thing I've been thinking about lately is to what extent AI tools, LLMs, et cetera, can make the mastery-based system more accessible for everybody. If you have an LLM that is coaching you, trying to keep you engaged, always at your level, giving you material appropriate to what you already know, reviewing what you already know at spaced intervals, making sure you retain it, and being fun and engaging in that way--

You could imagine an inversion of the school where you have some tablet or whatever that does the teaching. Schools are still going to have a babysitting function, so you still need human teachers there most of the time, but they don't have to be specialists in how to convey certain information. They can be there to meet the child's emotional needs, to make sure fights don't break out, to care for the child. It's a different person you would hire to be a teacher in that world, but you still need teachers. My question for the next 10 years of education policy is to what extent we can realize the dream of-- Neal Stephenson wrote about the Young Lady's Illustrated Primer in The Diamond Age, and if you could have something like that available to all students and then decouple the teacher figure from actually doing the education, I think that would maybe be the best way forward.

Education matters, but it's not so much how effective you are at communicating information to children; it's how good you are at fostering a love of learning and creating an association in the children that doesn't make them want to graduate and then never read again.

[00:23:54] Dan: I want to revisit your 2020 post, "Notes on Technology in the 2020s." You go down a list of different areas and talk about potential technologies that might push total factor productivity up to that 2% rate. Broadly speaking, before we dive into any specific area, do you feel more or less confident, sitting here in 2024, that this is an achievable goal than you did when you wrote it in 2020?

[00:24:23] Eli: In terms of the goal is to increase TFP, I think it's very achievable. I don't think all the detail I had in that post was right by any means, but I feel it's still very much achievable.

[00:24:35] Dan: What I'd like to do is go through each one. Do you have any interesting updates off the top of your head from the last four years? Have there been any interesting breakthroughs worth noting?

[00:24:47] Eli: Yes. One of them, off the top of my head, I think I was pretty bearish on battery improvements in the post. Today, I'm way more bullish on it. I ended up making a battery investment. That's why.

[00:25:03] Dan: What changed there? Why are you more bullish now?

[00:25:06] Eli: A better understanding of the bottlenecks in batteries, and actually seeing the-- There's a company I invested in that's going to smash that bottleneck. What I realized is that the bottleneck is cathodes. Most battery startups are working on better anodes, which is not the bottleneck. For cathodes, we only have two in lithium-ion: nickel manganese cobalt (NMC) cathodes, which are higher density and a bit more expensive, and lithium iron phosphate (LFP), an iron-based cathode that is cheaper but less dense. Those are the two paths forward.

We are seeing the prices of battery cells start to stall out: for a long time, we had 20% annual cost improvements, and over the last 5 or so years, we've had 5% annual cost improvements. That is stalling, but I think a new cathode that is denser and cheaper would be a game changer. I invested in this company called [unintelligible 00:26:20], and I think they have that. They've already tested it in single-layer cells and confirmed it is much denser and also much cheaper than NMC or LFP. That's one thing I got wrong, and I actually made a bet on the other side of it.
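The gap between the earlier ~20% and recent ~5% annual cost declines compounds dramatically over a decade. A minimal sketch; the $/kWh starting point is an assumed, illustrative number, not one from the episode:

```python
def project_cost(cost_per_kwh: float, annual_decline: float, years: int) -> float:
    """Compound a constant annual percentage cost decline."""
    return cost_per_kwh * (1 - annual_decline) ** years

START = 150.0  # assumed cell cost in $/kWh, purely illustrative

fast = project_cost(START, 0.20, 10)  # the earlier ~20%/yr improvement rate
slow = project_cost(START, 0.05, 10)  # the recent ~5%/yr rate

print(f"20%/yr for 10 years: ${fast:.0f}/kWh")
print(f" 5%/yr for 10 years: ${slow:.0f}/kWh")
```

Ten years at 20%/yr cuts cost by roughly 9x, while ten years at 5%/yr cuts it by only about 40%, which is why a cathode breakthrough that restores the faster rate would be such a game changer.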

[00:26:40] Dan: Okay. Actually, another bet you made was on space. You bet Robin Hanson $100 against his $200 that a human would set foot on Mars by the end of Q1 2030. Now, sitting here four years later, would you double down, or are you less confident?

[00:26:55] Eli: No, I think I'm going to lose.

[00:26:57] Dan: You think you're going to lose? Okay.

[00:26:57] Eli: To be fair, I got two-to-one odds. I told Robin I was 40% confident. He offered me two-to-one odds, which was generous. I am less confident now only because I think the schedule will slip. I do think Starship will enable that to happen at some point, but it might be four years later or something like that.

[00:27:19] Dan: Okay. What about in biotech and health? The main things you talked about there were mRNA vaccines and DeepMind's protein folding. Both of these strike me as technology breakthroughs waiting to be commercialized. Do you have any updates off the top of your head on either of those that make you more or less bullish?

[00:27:36] Eli: I've been thinking a lot about gene therapy in the last couple of weeks, and, related to mRNA but not only mRNA, the biomolecular kinds of treatments. I just think the potential is so enormous. My concern right now is that the regulatory system is not right for those kinds of breakthroughs. These biomolecular treatments are big molecules, and the existing drug approval system is designed for small molecules, even in terms of preclinical testing for toxicity. Toxicity in small molecules happens because the molecule accumulates in some organ or something like that and causes damage.

What they do is put this in a mouse and pump up the-- It's actually cruel. We pump up the doses so high that we determine the LD50, the dose that is lethal to 50% of the mice. Then you make sure that the human dose, on a per-kilogram basis, is way lower than that when you do human trials. With these bigger molecules, which are more the natural language of our existing cells, there's not really that same concern. The immune system will clear it automatically if there is too much, because it's used to dealing with molecules of that size, to the point that we've had to redefine toxicity for biologics. It doesn't even mean the same thing, but they still require toxicity testing.

I worry a lot about that. I'm concerned that we won't reap all the benefits of those breakthroughs, but I think they're very, very significant. You could have a different system where maybe you regulate the vector or the platform, how you deliver the genetic payload to the cell, but you don't regulate the exact payload, and it's at the doctor's discretion. In the same way that with surgery, doctors don't get every individual operation FDA-approved, even though each one is completely unique and customized to the needs of the patient.

There was some positive news this week. Congress basically instructed the FDA to come up with a way to identify these genetic platforms, gene therapy and other biologic platforms, and the FDA has issued some draft guidance on how it will do that. I don't know enough about it to evaluate whether it's going to be sufficient or doesn't go far enough, but I think this is incredibly powerful. On the multi-decade timeline, it's going to be very important.
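The per-kilogram dose scaling Eli described for small-molecule toxicity testing can be sketched in a few lines; the LD50 value, body mass, and safety factor here are illustrative assumptions, not real trial parameters:

```python
def max_human_dose_mg(ld50_mg_per_kg: float, human_mass_kg: float,
                      safety_factor: float) -> float:
    """Scale a mouse LD50 (mg per kg of body mass) to a total human dose,
    divided by a large safety margin so the trial dose sits far below lethality."""
    return ld50_mg_per_kg * human_mass_kg / safety_factor

# Assumed numbers: LD50 of 300 mg/kg in mice, a 70 kg human, a 100x margin.
print(max_human_dose_mg(300, 70, 100))  # 210.0 mg
```

Eli's point is that this whole framework presumes accumulation-driven toxicity; for biologics that the immune system clears, the LD50-and-margin logic is a poor fit, yet the testing requirement remains.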

[00:30:40] Dan: Okay. Let's talk about energy. The main topics you discussed were wind, solar, and batteries, which we already talked about. The big one I didn't know a lot about is geothermal. Nuclear gets a lot of attention right now because there are policy disputes going on all over the world about, "Should we do nuclear? Should we not?" and you get a bunch of people taking a hard stance on each side. You stated that geothermal has the potential to make nuclear not even relevant if it really works. What's the latest on geothermal? Do you have any updates to the post from the last couple of years?

[00:31:11] Eli: Yes, we've had some next-generation test wells, or actually even commercial projects. Fervo did a project with Google for a data center, and the results were good. A lot of the model in terms of making geothermal economical is going to come down to drilling costs, and drilling costs seem to be getting very low even with mechanical drilling. I made an investment in energy-based drilling, but my friend Austin Vernon keeps telling me, "Oh, mechanical drilling is also getting cheap." Just using regular drill bits and going through granite, there's a lot we could do to optimize those bits and the surrounding systems. That means we can get much deeper and get to hotter temperatures and so on.

Everything seems bullish. Maybe one update is, I would say, that pure thermal energy seems like the lowest-hanging fruit for geothermal. If you think about a paper mill, it runs 150-degree-Celsius steam through pipes to help with the drying of the paper. That's low-temperature heat that geothermal can provide very easily. There's a bunch of industrial demand like that. Dairy farms need heat to pasteurize milk and so on. That's something you could potentially decarbonize very quickly with geothermal. Electric conversion, conversion to electricity, I think is still very much on the table.

One other thing that maybe I've updated on is I just think steam turbines suck. Think about how important they still are to us; this is a couple-hundred-year-old technology. Coal plants: you burn this rock, use that to make steam, and then use that to run a turbine. Geothermal also uses a turbine. Nuclear is also using neutrons to make heat, to boil water, to run through a turbine, and even some fusion concepts are still using neutrons to create heat, to boil water, to spin a turbine. More and more, I'm appreciating the benefits of solid-state stuff. A solar panel has no moving parts, and it's easier to manufacture; you get more cost improvements in manufacturing at scale because there are no moving parts.

The holy grail, as I'm thinking about it, is something that converts heat to electricity efficiently, is solid-state, and can be manufactured in a way similar to solar panels. There are some ideas for this. A lot of them maybe rely on higher temperatures, which you can't get to with, say, nuclear, but if we could do something like that, it would breathe life into a lot of thermal sources.
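Why source temperature matters so much for any heat-to-electricity conversion, turbine or solid-state alike, follows from the Carnot limit. A quick sketch; the temperatures chosen are illustrative:

```python
def carnot_efficiency(t_hot_c: float, t_cold_c: float = 25.0) -> float:
    """Maximum fraction of heat convertible to work; temperatures in Celsius,
    converted to kelvin for the Carnot formula 1 - Tc/Th."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

print(f"{carnot_efficiency(150):.0%}")  # ~150 C, paper-mill-grade steam
print(f"{carnot_efficiency(600):.0%}")  # much hotter source, much higher ceiling
```

This is why low-temperature geothermal heat is better used directly (drying paper, pasteurizing milk) than converted to electricity, and why conversion schemes that can exploit higher temperatures than nuclear reaches would be so valuable.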

[00:34:17] Dan: Got it. We'll move on from this post. You articulate really well this idea that just because you have a tech breakthrough doesn't really mean that it gets digested in the GDP numbers or is useful to anyone. What you end up really needing is you need someone to productize it and commercialize it and actually make it useful. Just as a thought experiment here, let's pretend that all tech progress on the breakthrough side paused tomorrow, but we still had our existing breakthroughs, so things like CRISPR or GPT-4 that haven't really been dispersed throughout the economy. Do we have enough to budge TFP today? How excited are you about the existing just science breakthroughs?

[00:34:52] Eli: I think there's a lot. There are a lot of gains. As you may know, supersonics is near and dear to my heart. That's something we had 50 years ago; Concorde's first prototype flight was in 1969. Bringing back supersonic flight would be a huge thing, but not even just that. We have ramjets. We've had ramjet engines since, I think, the 1940s; I think the Soviets had one. We haven't commercialized that. You could easily go Mach 3, Mach 4 based on existing technology if we could commercialize it. A lot of the energy stuff is about deployment at scale: wind, solar, even geothermal.

It depends on what you want to count as a tech breakthrough, because I think there is significant learning by doing. This is an important point: a lot of the progress is made outside of the science lab, but it's still an innovation, still an idea, and it comes from deploying, having contact with the real world, seeing what goes wrong, iterating. Think about why SpaceX and Boeing have such divergent paths. Boeing just designs the product. Whether it's a rocket or a capsule or an airplane, they design the product end-to-end, then build it, then fly it and expect it to work flawlessly. SpaceX is, "We're going to build this half-baked version. We're going to fly it. We're going to see where the weak points are, why it blows up. Then we're going to figure out what the problems are and address those in the next iteration."

Iteration speed is so important for development. I don't know if you count that as pure ideas or pure deployment, but it seems really important as attested by the fact that SpaceX has a higher valuation than Boeing's market cap with 1/10 of the employees. There's plenty of things. A lot of the genetic stuff, we don't need any fundamental breakthroughs to just deploy that at scale. You could cure a lot of illnesses. You would need maybe some innovation in the sense of designing the particular molecule for this condition, but no real breakthroughs are needed. You just need to actually design the stuff, administer it to people, see what goes wrong, adapt, and so on, and so on. I think there's plenty. Particularly if you count the more iterative aspect of it, I think there's plenty that we could do.

[00:37:29] Dan: Got it. We don't have to make that trade-off, so that's great. I'm curious, as someone who goes really deep on all of these different technologies, but then you also invest, how do you think about understanding which tech breakthroughs are going to be able to be turned into a useful product? I feel like this is one of the really big questions of venture capital. I feel like in these types of domains, it's actually a little bit even more challenging than in a software business because you're trying to figure out, "What does it mean that we have the scientific breakthrough? Does it actually have a useful application in the real world?" For the investments that you make, how do you think about which technologies to bet on that you think will actually end up productizing?

[00:38:08] Eli: You're not betting on a technology. Usually, you're betting on a product. If a founder comes to me and he's like, "I have this technology," the first question is, "Well, okay, what's the product?" If there's no proposed product, I just wouldn't invest in it. If that product doesn't obviously have a big market, I wouldn't invest in it. If they can build this thing that they say they can build, will it be a billion-dollar company or a multi-billion-dollar company? If the answer is no, that's a screen right away to say no. Then a third one is just, do I understand how this works? If you don't understand how it works, then I would say don't invest in this idea.

This is actually a big problem in hard tech VC: there are a lot of VCs who are mainly software funders, but they've seen the success of SpaceX or whatever and they want to have a little bit of a hard tech portion of their portfolio, but they don't actually know enough to diligence the idea, or a lot of times they don't do enough diligence. It is an unhealthy dynamic in venture. Peter Thiel says, "Okay, you want to be contrarian," and everyone's like, "Oh, yes, we want to be contrarian." What most other VCs (not Peter) actually mean by that is they want to be six months ahead of the consensus. They want to make the bet now on what will be the consensus belief six months from now. You'd get a big markup in six months because everyone comes to believe it. If that's your play, your object of study is not so much the company as your peers in the industry. You're thinking about, "What will these other firms believe in six months that I can bet on now, so that they'll come to believe it and it'll be a markup for me and my investment?" Yes, just basically understanding, at least at a very basic level, how this technology or invention works is table stakes, I think.

[00:40:25] Dan: When you invested in a deep tech company and you're looking at the founder, what's the most important skills that you look for? I'm assuming that this is different than you might look for in a software founder.

[00:40:35] Eli: In a hardware founder, it's really important to know the field really well, know the industry really well. In a software founder, it's like, "Does he need to have deep experience across the software industry and know what the trends are and what these technology stacks are or whatever?" A hardware founder needs to know their industry cold, I think. They need to really know, "Why did this company fail? If I use this material to build my thing, what are the trade-offs associated with that? If I had to manufacture this literally myself, with my own hands, how would I do it?"

I think that expertise is really valuable. Then the other thing that's really important, and I think this is probably also important in software, is just iteration speed, again. How important does the founder consider speed and the cycle of having at least something to test against the real world very quickly, then iterating and fixing what went wrong? I think that's really important as well.

[00:41:58] Dan: Got it. You commented on this, but this was going around Twitter and the internet a lot. Jensen Huang said in an interview that he would have never started Nvidia if he knew how hard it would be. You commented on a post basically saying you think this sentiment is actually a pattern in hard tech that a lot of these founders end up saying like, "I don't know if I would have done it if I knew how hard it would be." The implication that you draw here is that maybe a little naivete is actually good.

Is that just a fact of life that this stuff is painfully hard? To me, that would be bad news because how much naivete can we have out there that feels like it would be a bottleneck to get people to go into this stuff. I'm curious how you just think about that.

[00:42:37] Eli: I think it is basically a fact of life. Maybe it's bad news. I think the good news is there's a lot of naivete out there, particularly with young people. Yes, it is way harder to build a company than you think, even if you adjust for that fact. Even if everybody tells you like, "It's going to be way harder than you think," it's still going to be harder than you think. I contrast this a little bit with Google's lab X. They have this moonshot factory. It's led by Astro Teller, who's a super smart guy. The whole MO of that lab is like, "We're going to study this potential product, and we're going to cut bait on the ones that seem too technically hard."

I think the result of that is that you're cutting bait on everything, whereas each of those could have been a founder-led startup that has to go through a year of hell to get the product to market; they want to give up, but they can't. There are some things they probably cut bait on that would have worked if they had instead been done by a founder who had funding and, when it got hard, had no choice but to keep going. Yes. We do these things not because they're easy, but because we thought they would be easy.

I see that over and over again in hard tech startups: it's always just harder than you think. Then the other thing is you have founders who are very technical, and they have to learn the business side of it. They have to learn how to fundraise. It can be excruciating. You do 300 pitches or whatever, and you still don't raise your round. That's like, "Guess what? You have to keep going." It is hard.

[00:44:30] Dan: You've got founders, you've got corporate labs. What about the government? When should they take on a hard tech project? Obviously, at some point in time, they take us to the moon, but in this day and age, what role do you think they have to play?

[00:44:41] Eli: It's a good question. The way they can be productively involved is in funding first-of-a-kind projects. Think about geothermal. There are, I don't know, say, four different modalities you could use that I would consider next-generation geothermal energy production. For whoever the first company is to show up wanting to drill a distinctly new kind of well, if the government would say, "We will fund you to do this project," I think that would be really valuable. Then they do the project and, hopefully, it succeeds. Then investors see that it succeeds, and the private capital is there to take it to the next step.

There is an Office of Clean Energy Demonstrations that's supposed to do this. They don't have any geothermal money, interestingly. They got funding for a lot of other things but not geothermal. Then there's the Loan Programs Office. They don't quite do this; they take the next step. They don't want to take too much technical risk, but if the technical risk is burned down and private markets still don't see that, then they step in. I think that's one thing. Then I'm thinking about, say, nuclear. We have a national nuclear reactor laboratory. It's called Idaho National Laboratory. Now, they have not actually built a new reactor since the 1960s.

No new reactor of theirs has come to market since then. Building test reactors would be great. They are building one right now, but they haven't done it for over 50 years. Our nation's nuclear reactor laboratory went over 50 years between test reactors. Then I think about something like partnerships with industry. At INL, they have an exemption from NRC rules. They have a lot of facilities out in the desert. You can imagine a regime where they basically say to all US nuclear startups, "You can come here, use our facilities, build your reactor in six months. You can test it destructively. You can blow it up out in the middle of the desert where no one will get hurt. Then you can iterate."

Iteration is so important. That's something that's not done in the nuclear industry. It's basically 12 years and $12 billion between reactors. Providing those facilities, a safe harbor, and regulatory exemption would be really valuable in nuclear. I'm less enamored of just production subsidies. If you think about the IRA, one of the things it does is put these huge subsidies on clean energy generation. That, to me, feels more like productionism. There's this constant problem in industrial policy: if you coddle an industry too much, it just gets fat and happy. US shipbuilding is a perfect example. We have numbers of shipbuilding workers pretty similar to Korea and Japan.

We produce way fewer ships than they do. The productionism we've done there has not worked well. I think something similar could happen in energy generation. If you don't actually face market pressures because your subsidies are so generous, you might stop innovating a little bit. To answer your question, I think governments can be helpful, but it's hard, and you have to think through carefully whether you're helping, whether you're getting a lot of bang for your buck, or at least whether you're not being harmful in the support you're giving to an industry. I spent a couple of years at Boom working on policy and so on, and had a great relationship with NASA people.

I remember one time we had to figure out how to use some NASA prediction software. Our engineers tried to learn it and couldn't. We hired an outside consultant to teach them. The consultant was basically learning it as they went and then trying to teach our team. It was like, "That didn't work." I called up some contacts at NASA, and they were like, "Yes, just fly out, bring a team, and we'll walk you through it." They did, and it was great. It was just an amazing resource for the industry to have these NASA experts who are accessible and willing to talk to engineers in the private sector.

[00:49:23] Dan: Yes, interesting. I was just looking this up in prep for this call, but it looks like the number of people that graduated with a computer science degree in the US, it increased from about 40,000 in 2010 to over 100,000 today. It's still growing really, really rapidly year over year. Do you think we have too many people going into software?

[00:49:40] Eli: Probably. What margins and what constraints are we talking about? If you believe the physical world is over-regulated and it's hard to get things done in the physical world, then that increases the rate of return in the software world, where you have the First Amendment, which limits any preemptive restriction on publishing software or something like that. The returns to software are probably higher than they should be relative to the returns in other sectors, if you believe those other sectors are over-regulated in a way that depresses their rate of return.

Now, I don't know. With computer science education, my understanding is a lot of it is super theoretical and not actually very useful to industry. No matter what, we're almost certainly overproducing people who understand the Church-Turing thesis and these more abstract, less industry-relevant points of computer science.

[00:50:44] Dan: What about the corollary to this, which is, do you think we have enough top talent doing biotech, energy, transportation, these more physical world industries?

[00:50:52] Eli: I certainly think it would be better. If the rate of return in those industries was higher, we would get more talent flowing to them.

[00:51:01] Dan: I guess this is the point. Is your view that rate of return in the industry is what causes talent to flow to it?

[00:51:06] Eli: I think so. I think it's at least part of it. Yes. There's more opportunity. If there were higher returns in biotech because, say, the FDA adopts some reform that's more permissive but still smart and good for the industry and maintains trust in those products and so on, then, yes, I think there'd be a gold rush in biotech, and more people would flow to that industry. Making sure that you're not artificially depressing the rate of return in these other areas is how you should think about managing talent flows. It's not about managing the talent flow directly, but about making sure the rate of return is sufficiently high in those sectors.

[00:51:59] Dan: Got it. We've alluded to, in a lot of these questions, this idea that regulation, broadly defined, hinders productivity growth. I'm curious: say we have a new Elon Musk, and he's 12 right now, and he's going to go forth and lead the country in some way. At the margin, do we need him more to be a policy expert and go adjust policy and get into politics? Or do we need him to just say, "Screw it. I'll work within the existing balance of how everything works. I'll just do three more really good companies." Where do we need that incremental Elon Musk?

[00:52:31] Eli: Yes, I would say do the thing. The policy stuff is important. Obviously, I'm working on the policy stuff, so I think it's important. I also believe that a lot of times you can actually be even more effective at policy change if you have an example, a thing that you're trying to push through. When I was at the Mercatus Center, I wrote a paper on supersonics. That paper circulated within DOT, got some interest, and stuff like that. We were able to move the needle on supersonic policy more. I felt like I could move the needle more once I got into the industry and was actually saying, "We want to build this thing, and you need to change policy so that we can build this thing."

You can create jobs and whatever else; there's economic activity associated with it that the regulator doesn't want to stand in the way of. Creating an intellectual foundation is important, but at the margin, I would say it's more about people pushing on the boundaries from within industry and trying to do things that have regulatory risk. I think a lot of times, if what you're doing is actually a singular thing, regulatory risk is lower than people think. I think regulators do want to accommodate. When I say singular, I mean unlike cryptocurrency, where everyone's doing cryptocurrency stuff.

If you're just, "I'm a crypto person and I want to engage the SEC and change the rules," that's going to be tough because they have to take everybody into account. If you actually have a unique thing that you're trying to do, I think regulators tend to be open to engaging and figuring out a solution for you, particularly if you're also investing in lobbying and they're getting letters from members of Congress saying, "What's going on here? Why are you stopping this one entrepreneur from my district?" I think there are paths forward if you're one of these singular entrepreneurs.

[00:54:34] Dan: Got it. You've been talking about NEPA for many years. I think you're actually the person where I first heard about it, and now it's everywhere. How close are we? Is there any realistic world where in the next couple of years it gets either repealed or reformed to the extent that it's not a serious hindrance to growth? What is your confidence on that?

[00:54:53] Eli: I think there are interesting discussions going on on the Hill now and an awareness of it. I don't think the solutions being discussed go far enough. I think a possible trajectory here is something passes, it's still not enough, and we keep beating our heads against it. Then I think, ultimately, the driving force for reform here is actually the climate movement. That's not going away. If we reformed NEPA and it's still really hard to deploy clean energy, there's going to be more interest even two years from now. Say we did something now and it wasn't enough; two years from now, the industry is going to be like, "Hey, this didn't work. We need more reform."

I'm actually super excited about the climate movement because they're the one mainstream movement that's not complacent. They actually want and need a solution, will push hard for it, and have very strong conviction about it. I think there's just basically no way we can deploy clean energy on the scale that we need, or that the climate movement believes we need. I think we could actually use even more than they think they need, but that quantity of energy is not getting deployed in any reasonable timeframe without bigger NEPA reform.

That would be my point of optimism: even if what we do now doesn't work, I don't think it's one and done, where we do a reform and that sucks up all the attention and everyone says, "We've done NEPA reform, and now we can move on." I think we will continue to enact reforms until something actually works to deploy energy, because the climate movement thinks it's so important.

[00:56:57] Dan: There's a view out there that AI is the last tech breakthrough that we would technically ever need because the theory goes you create AGI, it can solve all of your other problems for you. You just ask it like, "Hey, can you build a spaceship that gets us to Mars? Can you cure cancer? Can you do anything that we care about?" What do you make of this?

[00:57:17] Eli: I am learning along with everybody else. On one hand, these are very cool tools. I'm super thrilled to have these tools available. On the other hand, I see many problems with a purely optimistic or purely pessimistic approach, regardless of where you come down. I think the most likely thing is that not that much changes, sadly. Part of this is pattern matching from past experience. I'm old enough to remember the '90s, the web. Everyone was like, "Oh, this is going to change everything." On one hand, it did. It changed a lot of social stuff. It didn't really fundamentally remake the economy in the way we thought it would.

The example I like to give is if I want to sell my house, I'm probably still paying 6% to a realtor, which was the biggest no-brainer thing that I thought the web would fix. It's very easy to list the house and disrupt the real estate cartel. It turned out, no, it didn't do that. I think there's going to be a lot of domains where AI is powerful and interesting and could theoretically do something, but it's going to be a long time in adoption. Health sector, you could replace doctors for a lot of things with a medical chatbot or even something interpreting test data or whatever. They're still going to want human doctor eyes on it, I think, for a long time.

That wasn't your question, though. I think your question was, is it going to be capable of doing a lot of these tasks? I think the tools are pretty good, but they have some error rate. Any interesting question is actually a long series of smaller, more isolated tasks. If you have a 1% error rate on each step, and it's a 50-step problem, it's not going to be a reliable solution to the task. Maybe there'll be something self-correcting or something like that. I don't know. My inclination comes from just looking at the web. What else was going to save us? Crypto was going to save us; smartphones were going to be revolutionary.
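The error-compounding point is easy to check with a quick calculation. Here is a minimal sketch with illustrative numbers (a stylized model assuming independent, equal per-step error rates, not a claim about any particular AI system):

```python
# End-to-end reliability of a multi-step task with a fixed per-step error rate.
def chain_success(per_step_error: float, steps: int) -> float:
    """Probability all steps succeed, assuming steps fail independently."""
    return (1 - per_step_error) ** steps

print(round(chain_success(0.01, 50), 3))   # ~0.605: 1% per-step error leaves ~60% odds on a 50-step task
print(round(chain_success(0.01, 200), 3))  # ~0.134: reliability collapses as the chain grows
```

Even a seemingly small per-step error rate compounds quickly over long chains, which is the intuition behind the skepticism here.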

IoT. I remember the internet of things and smart cities were going to revolutionize the economy. I'm not saying AI is like that, but I've gotten more skeptical. I want to see it; I want to see the change before I raise my hopes, because I've hoped for fast growth for a long time and I've just been disappointed so many times. Then I think the other thing is I'm not sure people's mental models around AI are correct. Think about even superintelligent AI. I don't even know what this means, but let's say there's some entity that's much smarter than us.

Is it uniquely capable in a real-world way? I think about, "What are the most competent and capable entities on our planet?" It's corporations. Corporations run the world. Are corporations smart? If you think about it from an information-processing standpoint, they are very slow institutions. Input filters into the corporation, and then it gets passed around for weeks or months before a decision is made. In a sense, every single corporation on the planet is dumber than you and me, and yet they run the world, and we don't. Anyway, speed of information processing doesn't seem to correlate with actual capability in the real world. That's at least a lingering doubt in my mind about whether people have the right model here.

[01:01:20] Dan: Got it. I want to talk about your most recent posts on Joseph Tainter's book, The Collapse of Complex Societies. His basic theory is that you increase societal complexity, and at some point you start to get diminishing returns to that complexity, until eventually people get fed up, and this can lead to collapse. He claims it has in many cases in the past. The question I had for you, just riffing on this, is that our society seems way more complex, by many orders of magnitude, than past societies. Why should we believe that we'll hit some point where it becomes too complex?

[01:01:58] Eli: Yes. The reason we can be more complex is that we have a different technological dispensation. We have the technological tools to turn that complexity into additional output at a much higher rate than past societies did. One way to think about complexity is how much specialization you have, say, in the labor market, or how many administrators you have. If, I don't know, the Maya or the Minoans or whoever had gotten to the point where the fraction of their economy that was administrators matched ours, I think they would have gotten fed up way before we do.

We could tolerate a higher percentage of people being administrators, because to some extent, that administration does add value in our economy in a way that it wouldn't in an agricultural society. It's not just how complex are you in absolute terms, but how complex are you relative to the technological dispensation? In particular, what is the marginal return to that complexity? The answer is going to be different depending on how much technology you have available.

[01:03:20] Dan: Oh, I see. You'd be more complex, but you're getting more returns.

[01:03:24] Eli: Yes. As long as you're getting positive returns at the margin to each additional unit of complexity, everything's fine.

[01:03:30] Dan: Yes. Until it starts to slow down.

[01:03:33] Eli: Until it's not, yes. Until you're starting to actually lose output from the additional complexity.

[01:03:40] Dan: How much do you worry about this? Do you worry about Tainter's idea coming true for the United States or leading Western nations today?

[01:03:46] Eli: I'm not, by nature, a worrier. I would say I'm not terribly worried. On the other hand, do I think this is a dynamic that seems to be going on right now? I would say yes. One of the things Tainter identifies is that it manifests in apathy toward the wellbeing of the polity. People don't care anymore about their society functioning, or continuing to exist, or doing well. It's a decline in, maybe patriotism is a related word here, people not caring about their government, not identifying with it, seeing it as somewhat oppositional. I think we have that in spades. A, we have a lot of complexity. B, we have a lot of apathy or even hostility to the current system.

Insofar as you have people repeating Russian talking points about foreign policy and stuff like that. If you think about it, the Roman peasants got so fed up that when the barbarians came in, they were like, "Come on in. We don't care." For at least, I don't know, 10% of the population or something like that, we're there. I do think it's relevant. I think the cure is actually higher returns. More growth seems to be basically the cure to the disease that Tainter is talking about, partly in the sense of exogenous growth: if we just got more growth, higher returns, that would be great.

I think the other thing is to get more growth, we actually do need to, in a concerted way, try to reduce complexity and root out the negative value forms of complexity, like the bureaucratization and the over-administration and the over-regulation of the economy. If you did that, that would also generate more growth. Different than exogenous growth, but just actually addressing the problem that Tainter identifies head on and trying to cultivate our garden of complexity, let's say, weed out the bad parts of it. I think that would be really beneficial.

[01:06:17] Dan: You mentioned we have two options: continue to increase growth and abundance, or decrease complexity. It seems like we've been really good at outrunning the problem, finding growth in little pockets over the last hundred years. Even if it's been slowing down, we're still eking it out relative to how bad things could be. But it doesn't really seem like we've done anything on reducing complexity. Do you think this is a problem we're capable of solving? Do you see any indications or areas of the economy where we've been able to reduce complexity?

[01:06:49] Eli: Yes, I don't know. I worry about this. Coming out of George Mason and doing an econ PhD there, it's like the home of public choice economics and a lot of public choice analysis. On one hand, that's fully positive, not normative analysis; it's just thinking about how these actors interact. But if you want to translate it to a more normative framework, a lot of the analysis is very low agency; it's just, "This is the way it is." My view is we don't have agency if we approach it with that attitude. If you just assume that's the way it is, that's a self-fulfilling prophecy.

If you can approach it with higher agency, these policies and stuff, people do make and influence policies all the time. Why not us? Why wouldn't we be the people to do that? I think we can make progress on it. I do worry that we are fighting an uphill battle in the sense that it's much easier to add complexity than it is to systematically reduce it.

[01:08:08] Dan: How do you stack rank, call it, internal collapse from the Tainter model against the more commonly talked about existential risks, nuclear war or AI killing us all? Which one do you think we should be more concerned with?

[01:08:23] Eli: They are different. Even in collapsed societies, it's not that everybody dies. If you're approaching it from a Will MacAskill EA perspective, it sounds like he's actually concerned about literally everybody dying and all the future value getting wiped out. I don't think the Tainter collapse model would support that being as important a risk. I'm much more concerned about Tainter-style collapse, in part because, A, it's happened a couple dozen times already in history. This is a thing we know happens. Then I think the other thing is a lot of the existential risk stuff is iterated first-principles reasoning, which I don't actually think is a very good method.

You could be a very rational person, but you take a couple of premises, you draw a conclusion, that becomes your new premise, and then you do it over and over again. It doesn't take much error anywhere in that chain to make your ultimate conclusion completely fallacious. I just don't think that's a very good method. Asteroid risk is a thing; that has also happened. There is empirical data on that. We should take that seriously, maybe. The dinosaurs didn't have a space program, which was bad for them. Yes, 20-something times in the last 5,000 years, we've had this complexity cycle. That seems worth thinking about.

[01:09:55] Dan: Final question here, let's say, a young high schooler, they stumble upon your blog and they say, "I want to help get total factor productivity to 2%. This is my goal. You've inspired me." What do you tell them to do with their career?

[01:10:08] Eli: I would say, "A, be curious." I would say, "Actually, don't look at it as a career. Look at it more as a series of jobs," which sounds like the same thing but is actually different. I don't think you can optimize the next 40 years of your impact; I think that's just an impossible task. I would think of it in five-year chunks. Think about, "What do you want to do in the next five years that's going to set you up to add value?" I think a big piece of it is, "Find what you think is the most interesting thing going on right now. Given your values and your interest in growth, I'm assuming that's going to be a growth-producing activity.

Find out what you think is most interesting, and find a way to put yourself at the center of that activity. Then just get there and just indiscriminately add value. Don't worry about being compensated for it. Just add value to everybody around you. That's how you advance in your field. People will want you around if you're just indiscriminately producing value because they want to be around people who indiscriminately produce value. Yes, I don't think there's a master plan, but follow that interest and follow that passion, and then just add value."

Then, "Over the course of five years, what you find most interesting in the world might change, and then you just do it again. You'll have a series of big contributions that probably take you somewhere interesting, but not a planned career."

[01:11:52] Dan: Eli, thank you so much for your time today.

[01:11:54] Eli: My pleasure. It was great questions.
