"Full Automation" is a Slippery Metric
By Ozzie Gooen @ 2024-06-11T19:53 (+18)
Research Status: Written & researched quickly. I think the key point is fairly simple and obvious. I relied on Claude to help with rewriting.
There's been a growing interest in predicting when various products or jobs will be "fully automated." Will we soon have popular movies, books, or even CEOs that are entirely AI-generated?
Some very quickly-found links:
- https://manifold.markets/ChaseStevens/will-an-aigenerated-paper-be-accept
- https://manifold.markets/GabeGarboden/will-aigenerated-art-win-a-major-tr
- https://manifold.markets/probajoelistic/-by-2026-will-any-fully-aigenerated
- https://www.forbes.com/sites/sherzododilov/2024/01/11/can-ai-become-your-next-ceo/
It's an intriguing question, but I believe its definition is slipperier than some realize. Here's why.
First, the idea of "full automation" for complex tasks (movies, books, CEO duties) is something of a false dichotomy. In practice, automation exists on a spectrum, with diminishing returns at the extremes. Consider self-driving cars. Even the most advanced autonomous vehicles still have remote human monitors who can intervene if necessary. This human involvement might be significant at first (one intervention every few miles, in the cases of Tesla and Cruise) but will likely diminish over time as the technology improves. However, even if there were just one person left overseeing all vehicles, we technically wouldn't have reached "full automation."
This leads to what I call the Hilda[1] Scenario. If we ask, "When will movies be 100% automated?" we're effectively asking, "When will Hilda, the very last human involved in the moviemaking process, be let go?" Perhaps Hilda has a unique talent for crafting prompts that yield remarkable AI-generated special effects, or a keen eye for making subtle but impactful script adjustments. If the cost of retaining Hilda is minimal, her involvement could persist even in an otherwise automated workflow. As long as Hilda remains cost-effective to keep in the loop somewhere, the process isn't fully automated.
The question of a "fully automated CEO" also highlights the limitations of this framing. Even if the vast majority of a CEO's responsibilities could be automated, there might still be significant value in having a human in the role. It's one question to ask, "When will ~99.99% of current CEO duties be automatable?" It's an entirely different question to ask, "When will companies officially deem it best to not technically have a human at the helm?" A human might well be needed in the role for legal reasons, even if they functionally have few duties.
Separately, there's the added complexity that automation sometimes makes humans more valuable over time.
Consider fields like visual effects. Despite rapid advances in automation and innovation in VFX, the industry hasn't seen a drastic reduction in its human workforce. Instead, rising quality standards and increasing demand for VFX shots have led to growth in VFX artist employment.
This phenomenon mirrors the Jevons paradox in economics: as efficiency increases, consumption of a resource may rise rather than fall. In the context of automation, as certain tasks become more efficient, the demand for those tasks may grow, ultimately leading to an increase in human labor rather than a decrease.
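To make that dynamic concrete, here's a minimal arithmetic sketch in Python. All of the numbers (hours per shot, shots demanded) are hypothetical, chosen only to show that total human labor can rise even as the human time per unit of output falls sharply.

```python
# A minimal, illustrative sketch of the Jevons-paradox dynamic described above.
# All numbers are hypothetical; the point is only that when demand is
# sufficiently elastic, a large efficiency gain can increase total human labor.

hours_per_shot_before = 40      # human hours to produce one VFX shot (hypothetical)
hours_per_shot_after = 10       # after automation: 75% less human time per shot
shots_demanded_before = 1_000   # shots the market buys at the old cost
shots_demanded_after = 6_000    # demand grows as shots get cheaper (hypothetical elasticity)

labor_before = hours_per_shot_before * shots_demanded_before   # 40,000 hours
labor_after = hours_per_shot_after * shots_demanded_after      # 60,000 hours

print(f"Total human hours before automation: {labor_before:,}")
print(f"Total human hours after automation:  {labor_after:,}")
# Even though each shot needs 4x less human time, total human labor rises by 50%,
# because demand for shots grew by more than the efficiency gain.
```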
Rather than fixating on the notion of "full automation," I think we should focus on other, more precise benchmarks. Coming up with these is difficult, but here are some suggestions.
For different industries (like visual effects, software engineering, management, etc):
- When will X be "mostly" automated? As in, when will it take 80%+ less human time than it does today to produce work at the quality level we'd expect in [2024]? (One way this could be operationalized is sketched after this list.)
- How will employment and salaries change over time?
- When will it be possible to generate work equivalent to that of a ~[2020, 2024] employee, using less than [$100, $1M] of compute spending?
- How much computation will be used? How much money will be spent on compute, vs. employees?
- Which tasks might get outsourced, using a system augmented by AI? (For example, "autonomous driving" arguably now means, "driving is outsourced to a company that heavily relies on AI.")
- Will any specific economic benchmarks (productivity, revenue) be impacted?
- Can we measure changes in quality and cost over time?
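As a rough illustration of how a couple of these benchmarks could be made concrete, here's a minimal Python sketch. The field names, thresholds, and numbers are all hypothetical assumptions for the example, not established metrics.

```python
# A rough sketch of how some of the benchmarks above might be operationalized.
# Every number and field name here is hypothetical; the goal is only to show
# that "mostly automated" can be turned into a concrete, checkable condition.

from dataclasses import dataclass

@dataclass
class IndustrySnapshot:
    human_hours_baseline: float   # human hours per unit of output at reference-year quality
    human_hours_now: float        # human hours per unit of output today, with AI assistance
    compute_cost_per_employee_equivalent: float  # $ of compute to match a reference-year employee

def mostly_automated(s: IndustrySnapshot, threshold: float = 0.80) -> bool:
    """'Mostly automated': 80%+ less human time for the same quality bar."""
    reduction = 1 - s.human_hours_now / s.human_hours_baseline
    return reduction >= threshold

def employee_equivalent_cheap(s: IndustrySnapshot, budget: float) -> bool:
    """Can the work of a reference-year employee be generated under the budget?"""
    return s.compute_cost_per_employee_equivalent <= budget

# A made-up future snapshot for, say, the VFX industry:
vfx_snapshot = IndustrySnapshot(
    human_hours_baseline=40.0,
    human_hours_now=6.0,
    compute_cost_per_employee_equivalent=250_000,
)

print(mostly_automated(vfx_snapshot))                            # True: 85% less human time per unit
print(employee_equivalent_cheap(vfx_snapshot, budget=100))       # False at a $100 budget
print(employee_equivalent_cheap(vfx_snapshot, budget=1_000_000)) # True at a $1M budget
```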
More work thinking along these lines seems useful.
- ^
Hilda is an example person here. My guess is that the absolute last person in this chain won't be named Hilda, but it's a possibility. The name Hilda was one of the first to come from a random name generator.