The next decades might be wild

By mariushobbhahn @ 2022-12-15T16:10 (+130)

This is a crosspost, probably from LessWrong. Try viewing it there.

Stefan_Schubert @ 2022-12-15T17:52 (+13)

Fwiw I think it would have been good to explain technical terminology to a greater extent - e.g.  TAI (transformative artificial intelligence), LLM (large language model), transformers, etc.

It says in the introduction:

I expect some readers to think that the post sounds wild and crazy but that doesn’t mean its content couldn’t be true.

Thus, the article seems in part directed to readers who are not familiar with the latest discussions about AI - and those readers presumably would benefit from technical concepts being explained when introduced.

mariushobbhahn @ 2022-12-15T18:03 (+5)

Fair. I'll go over it and explain some of the technical concepts in more detail. 

I also expect many people who are familiar with the latest discussions in AI to have longer timelines than this. So the intended audience is not just people who aren't familiar with the field. 

Sjlver @ 2022-12-15T19:18 (+10)

I really liked this... the post made me think, and will continue to do that for some time. It doesn't seem all that unrealistic to me 🤯

One little nit: you seem to write "century" when you mean "decade".

mariushobbhahn @ 2022-12-15T21:32 (+3)

Thanks for pointing out the mistake. I fixed the "century" occurrences. 

Corentin Biteau @ 2022-12-15T23:25 (+7)

Interesting!

I'm curious: do you have any thoughts about the impact that the decline of fossil fuels may have during these decades?

For instance, the recent claims by the International Energy Agency that fossil fuels should peak within 5 years? (I have my own thoughts on the topic, but I'd like to know how you see things going.)

mariushobbhahn @ 2022-12-16T08:37 (+4)

I'm bullish on solar + storage. But I think it will take a while to adapt the grid, so it will take at least a decade before we can even think about phasing out fossil fuels. 

Corentin Biteau @ 2022-12-16T09:04 (+1)

Ok - so there would be a phase where fossil fuels are declining but renewables can't compensate yet, if I understand correctly.

What consequences do you think this gap would have? 

I'm interested in the effect this might have on the economy, since there appears to be a very strong link between energy and GDP at the global level (see for instance this report, p. 11-16).

mariushobbhahn @ 2022-12-16T12:17 (+2)

No, I think there is a phase where everyone wishes they had renewables but can't yet get them, so they still use fossil fuels. I think energy production will stay roughly constant or increase, but the way we produce it will change more slowly than we would have hoped. 

I don't think we will have a serious decline in energy production. 

Corentin Biteau @ 2022-12-17T13:54 (+1)

Ok, this is interesting. Your take is that energy production will stay constant or increase.

In that case, what makes you think that the International Energy Agency is wrong in saying that fossil fuel production will peak within 5 years and then probably decline?

For instance, where would the additional oil come from to compensate for the decline of, say, Russia (which announced its peak in 2019) or Saudi Arabia (which announced a peak for 2027)?


(For the record, my personal take is that this would likely lead to a decline in global energy production, since fossil fuels represent 85% of energy production - unless there is a massive increase in renewables, which you do not see happening soon.)

mariushobbhahn @ 2022-12-17T13:57 (+2)

I don't have any strong opinions on that. There is a good chance I'm just uninformed and the IEA is right. My intuition is just that countries don't like it if their energy gets more expensive, so they'll keep digging for coal, oil or gas as long as renewables aren't cheaper.

Corentin Biteau @ 2022-12-17T14:08 (+3)

Oh, yes, they'll try.

I was more wondering about the fact that coal, oil and gas are finite and getting harder and harder to extract, so it won't be exactly a choice. The IEA mentioned that fossil fuels would peak even if climate policies are not followed. But yeah, if we could extract an infinite amount of fossil fuels, we'd probably do that - and use cheap solar on top of it for more electricity.

If you're curious about energy depletion, I tried to address that in this post.

mariushobbhahn @ 2022-12-18T08:05 (+1)

Interesting. Let's hope they are right and we are able to replace fossils with renewables fast enough. 

Corentin Biteau @ 2022-12-18T11:21 (+1)

I personally don't think we can replace fossils with renewables at scale, for reasons explained in the post (renewables have worse properties than storable, dense liquid oil, and there aren't enough metals to build everything).

Anyway, I think we'll have more news on that front seeing how we fare in the next winters with the current energy crisis.

titotal @ 2022-12-18T12:32 (+3)

It's worth pointing out that the IEA is actually most famous for almost always underestimating renewable energy uptake. So if even they think fossil fuels are on the way out, I'm inclined to think they're right. 

Davidmanheim @ 2022-12-21T12:49 (+5)

Fantastic post overall, but I have a nitpick about what I think may be an important issue. 

Additionally, automated coding becomes pretty good, i.e. it reliably more than doubles the output of an unassisted coder. This has the effect that much more code is generated by ML models and that more people who don’t have a coding background are able to generate decent code. An effect of this is that the overall amount of code explodes and the quality degrades because it just becomes impossible for humans to properly check and verify the code.

I strongly doubt this part, and my expectation is the opposite - human-written code is pretty mediocre, and most production systems rely on millions of lines of code, much of which is unnecessary. The reason code isn't better is that programmer time is more expensive than the short-term benefits of improving the code. Automated tools are likely to reduce this gap and write far tighter and simpler code. In addition, maintainability becomes far easier when automated rewrites are possible. And checking code generated by narrow AI isn't actually as hard as checking human code for correctness when there are narrow AI systems that are far better than humans at writing well-documented unit tests. Similarly, it can excel at generating computationally verifiably correct code - something humans can do only slowly and painfully.
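
To illustrate the "verifiable" point, here is a minimal, hypothetical sketch in Python (assuming the `hypothesis` property-testing library; the function under test is only an example): a model-generated routine can ship with a machine-checkable specification, so a reviewer checks the property rather than the implementation.

```python
# Hypothetical sketch: a machine-checkable spec for a (possibly model-generated) routine.
# Requires the `hypothesis` library (pip install hypothesis); run with pytest.
from hypothesis import given, strategies as st

def dedup_sort(xs):
    # Example implementation under test: return the distinct elements in ascending order.
    return sorted(set(xs))

@given(st.lists(st.integers()))
def test_dedup_sort(xs):
    out = dedup_sort(xs)
    # The output is sorted...
    assert all(a <= b for a, b in zip(out, out[1:]))
    # ...contains exactly the distinct input elements...
    assert set(out) == set(xs)
    # ...and has no duplicates.
    assert len(out) == len(set(out))
```

Checking that these properties are the right ones is still a human-sized task, but it is much smaller than reading the implementation line by line.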

mariushobbhahn @ 2022-12-21T15:27 (+4)

Interesting perspective. Hadn't thought about it this way but seems like a plausible scenario to me. 

Jakub Kraus @ 2022-12-16T03:36 (+3)

You mention AI taking over many jobs and people having trouble specializing in useful ways, but you don't include any discussion AFAICT of why this won't cause massive unemployment.

On a related note, I'd like to see some ideas about how AI will change our education system. What kind of homework will high school students have in 2026? What will a typical day look like for undergraduate and graduate students?

mariushobbhahn @ 2022-12-16T08:35 (+3)

I think narrow AIs won't cause massive unemployment, but the more general they get, the harder it will be to justify using humans instead of ChatGPT++.

I think education will have to change a lot because students could literally let their homework be done entirely by ChatGPT and get straight A's all the time. 

I guess it's something like "until class X you're not allowed to use a calculator, and then after that you can", but for AI. So it will be normal that you can just produce an essay in 5 seconds, similar to how a calculator lets you do in 5 seconds complicated math that would usually take hours on paper. 

Brian E Adams @ 2022-12-16T05:24 (+2)

Technology causing unemployment is a concern as old as the Industrial Revolution, and yet it has never been borne out.

It creates so many new jobs that you can't even fathom them. Imagine explaining the job of an SEO consultant to someone in 1990.

Makes me think of Walmart. They went to mostly self-checkout (which is awesome). Did they lay off a bunch of cashiers? Heck no. They just launched curbside (also awesome). Joe Schmos like me now have personal shoppers! Thanks to technology.

I think AI would have to be delivering an era of radical abundance before jobs somehow began disappearing, and then, by definition, you wouldn't need a job anymore.

Jakub Kraus @ 2022-12-16T16:51 (+4)

Here are some thoughts on why AI might be different than previous technologies.

  1. Faster pace of change. E.g. it took less than 3 years to go from GPT-3 to ChatGPT, and ChatGPT might already be changing the nature of knowledge work (see this story on Twitter), whereas GPT-3 mostly has not. I think it's plausible that code-writing AI will cause similar disruptions for software engineering within the next 5 years. In contrast, people had several decades to respond to changes from the industrial revolutions.
  2. Automation and autonomy. AI breakthroughs are especially useful for automating tasks compared to previous technologies (although previous technologies certainly enabled some automation), and automated tasks require less human labor.
  3. Higher time costs to transition between jobs. Some of the new jobs that AI creates may be more technical in nature and require coding skills that take years to develop. Truck drivers can't retrain as AI researchers. A counterpoint is that there will be new manual labor jobs as well. I like your example of self-checkout enabling curbside pickup.
  4. Performance improvements at many new tasks with a single AI breakthrough. E.g. language models demonstrate in-context learning, and fine-tuning them produces coding and math models. A counterpoint is that there have been other general purpose technologies like electricity and the digital computer.

I do think it's plausible that we'll see a period of radical abundance before any serious unemployment difficulties, but I'm not sure that this abundance will reach everyone equally. I could imagine low-income countries missing out on many of the benefits for years, which might cause dangerous political unrest.

Brian E Adams @ 2022-12-17T04:14 (+2)
  1. Yes. Though I have a higher opinion of how adaptable humans can be.

  2. Using my own work as a benchmark (residential real estate pricing), AI automation would be a huge benefit to enable me to spend my time on higher level analysis. There's a lot of AI that my role can absorb while my job still being safe.

That's especially true in my industry, where adverse selection is prominent. AI making me more effective will be necessary merely to keep up with our competition. It won't replace us, because that will be the default starting position from which we as humans then need to compete at another level relative to the competition.

  3. I'm not convinced that the AI revolution benefits high-tech roles relative to blue-collar roles. AI is a lot closer to writing code from a language prompt than it is to massaging someone's back, or even truck driving, for that matter.

  4. I hope this is right. Would love the kinds of breakthroughs AI will provide and whatever we can do to get closer to radical abundance, which I think is the holy grail.

I also don't see a systemic reason that A) AI benefits won't be broadly shared, or that B) shy of radical abundance, economic principles won't continue to reward non-automated work. Automated activities will become cheap because they're abundant, while activity that can't be automated will rise in demand/price and be rewarded as a consequence.

Jakub Kraus @ 2022-12-22T17:38 (+1)

Thanks for your reply. I think the biggest cruxes are about how quickly humans can adapt to change and how quickly AI capabilities can grow.

To my original point in (2), I'd also add something like "crossing the finish line" or "reaching the end": within the next few decades, I expect AI to be capable of automating nearly all knowledge work. By "all knowledge work," I mean all thinking-related tasks, including both 2022 jobs and post-2022 jobs. I worry that this capability level (or a level reasonably close to it) might arrive quickly, before we're prepared to deal with the ensuing unemployment spike.

Brian E Adams @ 2022-12-23T06:43 (+1)

My half-baked theory is that, shy of radical abundance, there will always be jobs - and with radical abundance, jobs won't be necessary.

If AI automated all knowledge work WITHOUT delivering radical abundance, then there would still be jobs delivering goods/services that AI is, by definition, not delivering.

And if so, we have nothing to fear.

mariushobbhahn @ 2022-12-16T08:40 (+2)

I think narrow AIs won't cause mass unemployment but more general AIs will. I also think that, objectively, this isn't a problem at that point anymore, because AIs can do all the work - but I think it will take at least another decade before humans can accept that. 

The narrative that work is good because you contribute something to society and so on is pretty deeply ingrained, so I guess lots of people won't be happy after being automated away. 

Brian E Adams @ 2022-12-17T04:16 (+1)

General AI isn't the same as radical abundance, but to the extent they coincide, I see that as making the unemployment concern moot.

I also have faith that humans will continue with the "purpose driven life" under such circumstances.

Jens Brandt @ 2022-12-15T21:21 (+2)

This was a really interesting read. I think a lot of this is plausible and it is similar to my own expectations in many ways. Thank you for writing this.

That said, I think your vision is too conservative. AI is the field that will change the world massively this century, where transformative changes are already certain. But there are other candidates as well. Space development, self-replicating macro- and nano-machines, human enhancement, and new physics are options I consider likely, in descending order. The interactions between those fields are the truly scary thing. You acknowledge that your predictions are very likely to be false, but I think the biggest error people make when trying to predict the future is that they look at only a single dimension and ignore the interactions. For example, a power-seeking AI could reasonably prioritise gambling on an interstellar land grab in the forties.

Another point where I disagree with you is that AIs will cause more damage with spreadsheets, medical procedures and legal procedures. The human standard is already pretty dismal here, but we are wired to trust people in those positions. I think that while it is certain that AI will fail in some cases, it is far from certain that it will perform worse than the human baseline.

mariushobbhahn @ 2022-12-15T21:34 (+3)

Yeah. I kept the post mostly to AI but I also think that other technological breakthroughs are a possibility. Didn't want to make it even longer ;)

I think you could write more of these stories for other kinds of disruptions and I'd be interested in reading them.

Vasco Grilo @ 2022-12-15T18:43 (+2)

Hi Marius,

Thanks for writing this!

Just one note, I think it would maybe have been helpful to have some subsections.

mariushobbhahn @ 2022-12-15T18:57 (+4)

Yeah. I thought about it. I wasn't quite sure how to structure it. I guess I'm not used to writing "story-like" texts. Most of my other writing is just a glorified bullet-point list ;) 

Vasco Grilo @ 2022-12-16T12:44 (+6)

I made my comment too early, sorry. I have now fully read it and actually think the structure is good. I strongly upvoted the post, and plan to share it with some people!