Debate: should EA avoid using AI art outside of research?
By titotal @ 2025-04-30T11:10 (+32)
There is a growing movement to ban or discourage the use of AI art, citing ethical concerns over unethical data scraping, environmental cost, and harm to the incomes of real artists. This sentiment seems most prevalent in left-leaning online spaces like reddit and bluesky. Some are even starting to associate AI art with the far-right, with one popular article declaring it to be “the new aesthetics of fascism”.
As an example of how far this movement is spreading, the subreddit for the poker roguelike video game Balatro had a kerfuffle a few months ago, when a volunteer moderator for the subreddit stated that AI art was allowed. A person on bluesky screenshotted the post and declared that if they had known the Balatro creator was okay with AI art, they wouldn't have bought or owned the game.
In response, the creator of the game stated that “Neither Playstack nor I condone AI 'art'. I don't use it in my game, I think it does real harm to artists of all kinds. The actions of this mod do not reflect how Playstack feels or how I feel on the topic. We have removed this moderator from the moderation team.”
I bring this up not because the subreddit for an indie game is important, but because it’s not: If an indie game subreddit refuses to use AI art for ethical reasons, should the effective altruism movement be doing the same?
At the very least, the discussion should be had, and today's debate week seems like a good time to have it. And conveniently enough, it also provides a good example of the AI art usage I'm trying to gently discourage here: the announcement thread for DIY debate week has its own cute AI art tacked to the end:
I'm not trying to cancel Toby or Bulby here. Obviously he was just trying to spice up the post with a cute picture, and I'm not going to argue that this image is a major threat to humanity or anything. But on the other hand, not creating images like this costs very little, so if there are tangible harms then avoiding AI art could still be effective.
I’m also not saying AI art should be banned outright. Many people are trying to research the development and impact of AI, including AI images, and it would be dumb to prevent someone studying the progression of AI image models from showing AI images. So I want to clarify that I am only talking about cases like Bulby above, where the AI art is only used for illustrations or entertainment purposes.
I'll start off the debate with a few anti-AI art talking points, as well as some points on the other side. I'm personally anti-AI art, so I don't expect that I have produced the best pro-AI art arguments here.
Some arguments for avoiding AI art:
- AI art is being generated with scraped images without the consent or compensation of the original artists, and then used to undercut those artists in the marketplace. Many people, including myself, think this is highly unethical, so there is harm in normalising this practice.
- Following on, even if you don't think AI art is majorly harmful, you may be concerned about setting a precedent: by tolerating unethical behaviour from AI companies now, you might be making it easier for them to do worse things later.
- AI safety people (and AI harm people) generally see OpenAI as acting recklessly and dangerously, and OpenAI currently has the best AI image generation technology. Showing off their images might drive more people to use ChatGPT and pay money for it, boosting OpenAI's income, data, and market advantage.
- The use of AI art seems quite unpopular among real artists and creatives (although there are exceptions). By using AI art, you may be turning real artists (and their skills) away from the movement.
- Many people concerned about AI safety want to form a common front against AI companies with those concerned about short-term harms. The widespread use of AI art could hurt this effort, by giving the impression that AI safety doesn’t care about short-term harms.
- Reputationally, it might be a point of weakness: it could be pointed to as hypocrisy. "These people say they are against OpenAI, yet they are slapping ChatGPT's plagiarised images all over their forum."
- There is an environmental cost to each image generated. A 2023 estimate placed it as equivalent to a full charge of a smartphone. I think it's likely that the impact of the new GPT image generator is significantly higher than this.
- AI art has negative associations with quality and is often seen as being tacky “slop”, which might hurt messaging using AI art.
- The upside of using AI art doesn't seem to be very high: subjectively it mostly seems like it's used for decorations.
Some arguments against avoiding AI art:
- Some people enjoy it, and discouraging AI art might help contribute to a “fun police” atmosphere on the forum that makes it less appealing to read or participate in.
- Most of the AI art is being used in blog posts: we wouldn't have paid artists to do this stuff anyway, so there is very little impact on an artist's bottom line.
- The lack of AI art might make some articles less readable or persuasive. There are a lot of free images out there, but they are less versatile than image generators.
- Using AI art may help the average person get a better feeling and insights about AI, including what it is and isn’t good at doing.
- The stance may alienate EA from AI enthusiasts, and reputationally this would reinforce the perception of EA as luddite technology haters.
- Real art may be mistaken for AI art, which could result in people being unfairly maligned.
Daniel_Friedrich @ 2025-04-30T12:23 (+39)
I think the comparison in energy consumption is misleading because phones use unintuitively little energy: one full charge is equivalent to about 10 Google searches (Andy Masley has good articles on AI emissions), and charging a smartphone for a full year costs less than a dollar. I think a good heuristic is "if it's free, it uses so little energy that it's not worth considering".
If you're not paying to generate it, you're also not taking any income away from artists.
The argument that it's bad vibes for artists is a good one.
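The "under a dollar a year" claim above is easy to sanity-check with rough figures. The battery capacity and electricity price below are my own assumptions, not numbers from the thread:

```python
# Rough sanity check of "charging a smartphone for a year costs less than a dollar".
# Both constants are assumptions, not figures from the thread.
PHONE_CHARGE_KWH = 0.015  # ~15 Wh per full charge (typical phone battery)
PRICE_PER_KWH = 0.15      # USD per kWh, rough US residential average

yearly_kwh = PHONE_CHARGE_KWH * 365       # one full charge per day
yearly_cost = yearly_kwh * PRICE_PER_KWH  # USD per year

print(f"{yearly_kwh:.2f} kWh/year, ${yearly_cost:.2f}/year")
```

With these assumptions the total comes to roughly 5.5 kWh and about 80 cents a year, consistent with the claim.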
Jamie Huang @ 2025-04-30T14:29 (+11)
According to the U.S. Energy Information Administration, "the average US household consumes about 10,500 kilowatthours (kWh) of electricity per year"
The study cited in the article says that 1,000 AI images uses around 3 kilowatt-hours of electricity. If someone made 10,000 AI images in a year, they would have increased their electricity usage by ~0.3%, which is not nothing, but not significant.
A moderately efficient air-conditioner seems to use one kilowatt hour per hour, so generating 10 AI images is approximately equivalent to using an air-conditioner for 2 minutes.
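The arithmetic in the two comparisons above can be reproduced directly. The only inputs are the figures already quoted in this comment (3 kWh per 1,000 images from the cited study, the EIA household average, and a 1 kW air conditioner):

```python
# Reproducing the arithmetic from the comment above, using only its quoted figures.
KWH_PER_1000_IMAGES = 3.0        # cited study's estimate
HOUSEHOLD_KWH_PER_YEAR = 10_500  # EIA average US household
AC_KW = 1.0                      # moderately efficient air conditioner

# 10,000 images as a share of annual household electricity use
image_kwh = 10_000 / 1_000 * KWH_PER_1000_IMAGES           # 30 kWh
pct_of_household = image_kwh / HOUSEHOLD_KWH_PER_YEAR * 100

# 10 images expressed as minutes of air-conditioner runtime
ten_images_kwh = 10 / 1_000 * KWH_PER_1000_IMAGES          # 0.03 kWh
ac_minutes = ten_images_kwh / AC_KW * 60

print(f"{pct_of_household:.2f}% of household use; {ac_minutes:.1f} AC-minutes")
```

This gives about 0.29% of household use and 1.8 minutes of air conditioning, matching the "~0.3%" and "approximately 2 minutes" in the comment.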
Jason @ 2025-04-30T14:07 (+5)
Also, we'd need to consider the environmental costs of creating Bulby by non-AI means. Even assuming they are lower than AI generation now, I could see the argument flipping into a pro-AI art argument with sufficient technological advancement.
Larks @ 2025-04-30T20:12 (+6)
Surely the environmental externalities are dramatically lower with AI than with humans. There's a reason people bringing up this argument never do the actual apples-to-apples comparison: because once AI is capable of doing something, it can do it very cheaply.
Neel Nanda @ 2025-04-30T20:45 (+16)
Should EA avoid using AI art for non-research purposes?
In my opinion, the counterfactual is highly likely to be zero art or free art rather than paying a human artist. I think art adds value, and the marginal harm of another AI-generated image that does not result in foregone income for an artist is fairly negligible. I think we should have a high bar for trying to create norms in a community against a fairly normal action, and this does not meet that bar.
NickLaing @ 2025-04-30T11:58 (+14)
Love this question, what a great one for DIY debate week (strong upvote). You definitely moved my needle a bit away from AI art, although I'll continue to use it for the moment.
One strong disagreement is with these two points of yours, based on my wife's recent experience:
- "AI art has negative associations with quality and is often seen as being tacky “slop”, which might hurt messaging using AI art."
- "The upside of using AI art doesn't seem to be very high: subjectively it mostly seems like it's used for decorations. "
My wife recently made 20 large, really high-quality photorealistic AI images to help her run trainings in the lead-up to a conflict mediation here in Uganda. The images demonstrated land conflicts and interpersonal violence, and depicted a range of emotions people could have been experiencing. They were so realistic that nearly all the participants thought they depicted real people... This tool was super useful for the training and the conflict meeting. She would never have considered (or been able to afford) paying someone to make these images, so without AI the images wouldn't have been used and value would have been lost.
Quality can be high, and the upsides can also be high at times.
Yarrow🔸 @ 2025-05-07T01:52 (+11)
EA should avoid using AI art for non-research purposes?
My strongest reason for disliking AI-generated images is that so often they look tacky, as you aptly said, or even disgustingly bad.
One of the worst parts of AI-generated art is that sometimes it looks good at a glance and then as you look at it longer, you notice some horribly wrong detail. Human art (if it's good quality) lets you enjoy the small details. It can be a pleasure to discover them. AI-generated art ruins this by punishing you for paying close attention.
But that's a matter of taste.
What I'm voting "disagree" on is that the EA Forum should have a rule or a strong social norm against using AI-generated images. I don't think people should use ugly images, whether they're AI-generated or free stock photos. But leave it up to people to decide on a case-by-case basis which images are ugly, rather than making a rule that categorically bans AI-generated images.
I am trying to be open-minded to the ethical arguments against AI-generated art. I find the discourse frustratingly polarized.
For example, a lot of people are angry about the supposed environmental impact of AI-generated art, but what is the evidence of this? Anytime I've tried to look up hard numbers on how much energy AI uses, a) it's been hard to find clear, reliable information and b) the estimates I've found tend to be pretty small.
Similarly, is there evidence that AI-generated images are displacing the labour of human artists? Again, this is something I've tried to look into, but the answer isn't easy to find. There are anecdotes here and there, but it's hard to tell if there is a broader trend that is significantly affecting a large number of artists.
It's difficult to think about the topic of whether artists should need to give permission for their images to be used for AI training or should be compensated if they are. There is no precedent in copyright law to cover this because this technology is unprecedented. For the same reason, there is no precedent in societal norms. We have to decide on a new way of thinking about a new situation, without traditions to rely on.
So, if the three main ethical arguments against AI-generated art are:
-It harms the environment
-It takes income away from human artists
-AI companies should be required to get permission from artists before training AI models on their work and/or financially compensate them if they do
All three of these arguments feel really unsubstantiated to me. My impression right now is:
-Probably not
-Maybe? What's the evidence?
-Maybe? I don't know. What's the reasoning?
The main aesthetic argument against AI-generated art is of course:
-It's ugly
And I mostly agree. But those ChatGPT images in the Studio Ghibli style are absolutely beautiful. There is a 0% chance I will ever pay an artist to draw a Studio Ghibli-style picture of my cat. But I can use a computer to turn my cat into a funny, cute little drawing. And that's wonderful.
I'm a politically progressive person. I'm LGBT, I'm a feminist, I believe in social justice, I've voted for a social democratic political party multiple times, and I've been in community and in relationship with leftists a lot. I am so sick of online leftist political discourse.
I am not interested in thinking and talking about celebrities all the time. (So much online leftist discourse is about celebrities.)
I don't want to spend that much time and energy constantly re-evaluating which companies I boycott and whether there's a marginally more ethical alternative.
I don't want every discussion about every topic to be polarized, shut down, moralized, and made into a red line issue where disagreement isn't tolerated. I'm sick of hyperbolic analogies between issues like ChatGPT and serious crimes. (I could give an example I heard but it's so offensive I don't want to repeat it.)
I am fed up with leftists supporting authoritarianism, terrorism, and political assassinations. While moralizing about AI art.
So, please forgive me if I struggle to listen to all of online leftists' complaints with the charity they deserve. I am burnt out on this stuff at this point.
I don't know how to fix the offline left, but I'm personally so relieved that I don't use microblogging anymore (i.e., Twitter, Bluesky, Mastodon, or Threads) and that I've mostly extricated myself from online leftist discourse otherwise. It's too crazymaking for me to stomach.
bluballoon @ 2025-04-30T19:51 (+9)
Should EA avoid using AI art for non-research purposes?
I voted 100% agree.
As a young person, I know that my first impression on seeing AI art on this forum, a forum that's specifically meant to be concerned about the harms of AI, was that it came off as extremely hypocritical, and I know for a fact that many of my friends sympathetic to EA principles would feel the same.
Even if EAs could argue that using AI art is a net positive (which I don't think it is, considering how small the benefits are in the vast majority of cases), I just don't think this is a hill EAs should be willing to die on.
An overwhelming majority of young people, leftists, and people concerned about AI (basically our target audience) strongly oppose AI art, and in plastering AI art everywhere we basically guarantee that our target audience will at least have an initial sour taste in their mouth seeing this forum.
AI art is far from the biggest potential concern surrounding AI. However, it's the concern that people seem to care the most about, and it's crucial to use this concern to get people on our side, and promote other more pressing issues related to AI.
Do we really want to promote something that's wildly unpopular among our target audience and make ourselves seem like hypocrites for very little apparent benefit?
TFD @ 2025-04-30T21:05 (+3)
An overwhelming majority of young people, leftists, and people concerned about AI (basically our target audience) strongly oppose AI art
Can you say why you think this?
I would also say that I think it would be helpful to get people who aren't currently concerned about AI to be concerned, so I don't strictly agree that the target audience is only people who currently care.
Alejandro Ruiz @ 2025-05-02T09:41 (+1)
Let's suppose we agree this is so, as a working hypothesis.
How do you propose a community which would cater to the aesthetic tastes of its majority would avoid evaporative cooling of group beliefs? This is a grave concern of mine. That and O'Sullivan's Curse, itself related to group polarization.
OscarD🔸 @ 2025-05-01T07:17 (+8)
Should EA avoid using AI art for non-research purposes?
Seems somewhat epistemically toxic to give in to a populist backlash against AI art if I don't buy the arguments for it being bad myself.
Toby Tremlett🔹 @ 2025-04-30T12:17 (+6)
Should EA avoid using AI art for non-research purposes?
Treating "agree" as "yes"
I think the strongest reason against (using AI art for non-research purposes) for me is the idea of all the uncompensated art that made up the data. It's a bit of an original sin for AI in general (including text generation), and not one which we've found a good response to.
Reasons for (using AI art for non-research purposes):
- It makes sense for EAs to adopt a place in the memetic space where we think that AI is and will be very powerful (and therefore it's important to learn how to use it), and it's likely to be very dangerous. I don't think there is a contradiction there, and avoiding the use of AI would be increasingly hobbling. This is relevant because I don't think we can make a clean distinction between AI-generated art and AI-generated text: both are likely built on an amount of stolen/uncompensated data.
- Using AI images (like Bulby above) is just a bit of fun, i.e. the scale of use is pretty small, which correlates with this not being a very big deal.
- A lot of the concerns are hypothetical comms concerns, I'd take this more seriously if things played out that way, but right now I'd guess that the anti-AI-use camp is fairly loud but not strategically useful. And since I disagree with them for the other reasons above, I'd rather not pretend that I do for optics reasons.
Overall: I'm definitely open to changing my mind on this. I especially don't feel like I have a principled response to the un-compensated labour that went into creating AI, and it'd be great to have one.
Jason @ 2025-04-30T13:42 (+5)
How different is the process of how AIs "learn" to draw from how humans learn for ethical purposes? It seems to me that we consciously or unconsciously "scrape" art (and writing) we encounter to develop our own artistic (or writing) skills. The scraping student then competes with other artists. In other words, there's an element of human-to-human appropriation that we have previously found unremarkable as long as it doesn't come too close to being copying. Moreover, this process strikes me as an important mechanism by which culture is transmitted and developed.
Of course, one could try to identify problematic ways in which AI learning from images it encounters differs from the traditional way humans learn. But for me, I think there needs to be that something more, not just the use in training alone.
Most art is, I think, for "decorations" -- that way of characterizing most art is a double edged sword for your argument to me. It reduces the cost of abstaining from AI art, but also makes me think protecting human art is less important.
TFD @ 2025-04-30T16:36 (+5)
I've seen this machine/human analogy made before, and I don't understand why it goes through. I think people over-index on the fact that the "learning" terminology is so common. If the field of ML were instead called "automatic encoding", I don't think it would change the IP issues.
I think the argument fails for two reasons:
- I assume we are operating in some type of intellectual property framework. Otherwise, what's the issue? Artists don't have a free-standing right to high demand for their work. The argument has to be that they have ownership rights which were violated. But in that case, the human/machine distinction makes complete sense. If you own a work, you can give permission for certain people/uses but not others (like only giving permission to people who pay you to use the work). Thus, artists may argue: however it was we made our works available, it was clear/reasonable that we were making it available for people, but not for use in training AI systems. If developers had a license to use the works for training, then of course there would be no issue.
- We could reverse the analogy. Let's say I go watch a play. The performers have the right to perform the work, but I haven't secured any rights to do something like copy the script. As I watch, surely I will remember some parts of the play. Have I "copied" the work within the meaning of IP laws? I think we can reject this idea just on a fundamental human freedom argument. Even if the neurons in my brain contain a copy of a work that I don't have the rights for, it doesn't matter. There is a human/machine difference because, below a certain threshold of machine capabilities, we probably believe humans have these types of rights while machines don't. If we get to a place where we begin to think machines do have such rights, then the argument does work (perhaps with some added non-discrimination against AIs idea to answer my #1).
At the same time, though, I don't think I personally feel a strong obligation not to use AI art, just because I don't feel a strong obligation to strongly respect IP rights in general. On a policy level I think they have to exist, but let's say I'm listening to a cover of a song and I find out that the cover artist doesn't actually have the appropriate rights secured. I'm not gonna be broken up about it.
A different consideration though is what a movement that wants to potentially be part of a coalition with people who are more concerned about AI art should do. A tough question in my view.
Joseph_Chu @ 2025-04-30T15:26 (+3)
A difference between how human artists learn and AI models learn is that humans have their own experiences in the real world to draw from and combine these with the examples of other people's art. Conversely, current AI models are trained exclusively on existing art and images and lack independent experiences.
It's also well known that AI art models are frequently prompted to generate images in the style of particular artists like Greg Rutkowski, or more recently, Studio Ghibli. Human artists tend to develop their own style, and when they choose to deliberately copy someone else's style, this is often looked down upon as forgery. AI models seem to be especially good at stylistic forgeries, and it might be argued that, given the lack of original experiences to draw from, all AI art is essentially forgery or a mixture of forgeries.
Matrice Jacobine @ 2025-04-30T19:38 (+4)
Stylistic pastiche is unambiguously protected by the First Amendment, not "forgery".
Joseph_Chu @ 2025-04-30T20:45 (+1)
Can you cite a source for that? All I can find is that the First Amendment covers parody and to a lesser extent satire, which are different from pastiche.
Also, pastiche usually is an obvious homage and/or gives credit to the style's origins. What AI art makers often do is use the name of a famous artist in the prompt to make an image in their style, and then not credit the artist when distributing the resulting image as their own. To me, even if this isn't technically forgery (which would involve pretending this artwork was actually made by the famous artist), it's still ethically questionable.
Jason @ 2025-04-30T21:08 (+3)
This is more a copyright law question than a First Amendment one, at least under current law. E.g., https://www.trails.umd.edu/news/ai-imitating-artist-style-drives-call-to-rethink-copyright-law.
I believe whether the 1A requires this outcome is unclear at present. Of course, there's a lot of activity protected by the 1A that is horrible to do.
Jason @ 2025-04-30T21:01 (+3)
So I think we may have a crux -- are "independent experiences" necessary for work to be transformative enough to make the use of existing art OK? If so, do the experiences of the human user(s) of AI count?
Here, I suspect Toby contributed to the Bulby image in a meaningful way; this is not something the AI would have generated itself or on bland, generic instructions. To be sure, the AI did more to produce this masterpiece than a camera does to produce a photograph -- but did Toby do significantly less than the minimum we would expect from a human photographer to classify the output as human art? (I don't mean to imply we should treat Bulby as human art, only as art with a human element.)
That people can prompt an AI to generate art in a way that crosses the line of so-called "stylistic forgeries" doesn't strike me as a good reason to condemn all AI art output. It doesn't undermine the idea that an artist whose work is only a tiny, indirect influence on another artist's work has not suffered a cognizable injury because that is inherent in how culture is transmitted and developed. Rather, I think the better argument there is that too much copying from a particular source makes the output not transformative enough.
Joseph_Chu @ 2025-04-30T21:36 (+1)
You could argue that Toby's contribution is more what the commissioner of an artwork does than what an artist does.
On the question of harm, a human artist can compete with another human artist, but that's just one artist, with limited time and resources. An AI art model could conceivably be copied extensively and used en masse to put all or many artists out of work, which seems like a much greater level of harm possible.
Alejandro Ruiz @ 2025-05-02T09:35 (+4)
EA should avoid using AI art for non-research purposes?
Ethical concerns aside, I'd rather EA not place themselves on yet another political axis.
I'm concerned about the extent (and this is a fully general argument, I'm aware) of the recent phenomenon where some highly-connected, highly-concerned, highly-online group is able to unilaterally polarize the discussion of <insert-topic-here> by simply taking a certain stance and being very vocal about it, plus or minus associating this viewpoint with one of the big political camps, thus forcing everybody else into one of two camps, because the matter is formulated so as to leave no possible middle ground or position of neutrality.
To this I say we should say "mu", in "I explicitly reject your formulation of the question as invalid". I say we should carve a middle ground even if it doesn't seem possible. And wherever there is a supposed binary of "do or don't", I say we should explicitly reject that binary. You can do X and mean Y, with Y being <thing-supposedly-associated-with-X>. Or you can do X and mean not-Y. Or you can do X and mean Z, with Z orthogonal to Y. The same when you do not-X.
Having said all that, please use AI art, or do not use AI art, and ignore the resulting noise. For if "this causes noise" is to be the main drive for all of your decisions, you've already abdicated all power to the noise-makers.
EDIT: I voted "strong disagree" (as in "we should do it if it makes sense to do it"), though I don't see it reflected in this comment.
After further reflection, I realize I embedded an implicit assumption in my reasoning. There is a default, neutral position, and it is "what everybody else is doing". I've always thought that rule of "every blog post must have an image, by decree of the SEO gods" was silly, but we have to deal with it. I've seen a strong move towards AI-generated images around me, instead of clip art collections. This reflects simple economic logic: image collection sites are more expensive than AI (the paid ones), less useful than AI (the free ones), or both. Deviating from what is quickly becoming a de facto norm around us will tend to group us with one or another political camp. Thus, the safest norms are almost always either the "everybody is doing this", the "it's the price, stupid", or both.
Marcus Abramovitch 🔸 @ 2025-05-01T02:12 (+4)
Should EA avoid using AI art for non-research purposes?
To the contrary, you probably cost the AI labs a bunch of compute and this is the overwhelming effect.
Also, the environmental costs are tiny. If you are doing it for the environment, nearly all your environmental footprint is your diet and transportation, not electricity usage.
akash 🔸 @ 2025-04-30T20:02 (+3)
Should EA avoid using AI art for non-research purposes?
Voting under the assumption that by EA, you mean individuals who are into EA or consider themselves to be a part of the movement (see "EA" is too vague: let's be more specific).
Briefly, I think the market/job displacement and environmental concerns are quite weak, although I think EA professionals should avoid using AI art unless necessary due to reputational and aesthetic concerns. However, for images generated in a non-professional context, I do not think avoidance is warranted.
Joseph_Chu @ 2025-04-30T15:55 (+3)
Should EA avoid using AI art for non-research purposes?
In addition to the reasons already given, I've recently started coming round to the idea that we should straight up be boycotting AI that can potentially replace existing humans.
If we take, for instance, the ideas of PauseAI seriously, we should be slowing down AGI development in whatever way we reasonably can. A widespread boycott of certain forms of AI could help with this by reducing the market incentives that companies currently have to accelerate AI development.
Now, I don't think we should boycott all AI. AlphaFold for instance is a good example of a form of narrow AI that doesn't replace any humans because it does something complementary to what humans can do. Conversely, AI art models compete directly with human artists, much in the way future AGI would compete with all humans eventually.
It does seem to me that there is a lot of support already among artists and creatives in particular to boycott AI, so I think there's a better chance for this to gain traction, and is more tractable a method than trying to pause or ban AI development outright. Whereas pauses or bans would require government coordination, a boycott movement could come from individual acts, making it much easier for anyone to participate.
Edit:
Just wanted to add, in some sense an AI boycott resembles going vegan, except with regards to AI issues instead of animal ones. Maybe that framing helps a bit?
Also, another thought is that if it becomes sufficiently successful, an AI boycott could allow for part of the future economy to maintain a "human made" component, i.e. "organic art" in the way organic food is more expensive than regular food, but there's still a market for them. This could slow down job losses and help smooth out the disruption a bit as we transition to post-scarcity, and possibly even give humans who want purpose some meaningful work even after AGI.
philbert schmittendorf @ 2025-05-01T21:13 (+1)
EA should avoid using AI art for non-research purposes?
EA already gets a lot of flak externally for being obsessed with AI and dominated by AI discourse ("rich tech bro morality" etc.). Using AI art -- which seems to be emblematic of everything the average person hates about the tech world -- really does not help our case here...
Benjamin M. @ 2025-05-01T00:28 (+1)
Should EA avoid using AI art for non-research purposes?
I'm unconvinced by the arguments for the first-order harms (environment, copyright) being sufficiently big, but I think it's worthwhile to send a signal that EA is anti-giving-AI-too-much-power. Also, I think most AI art is mediocre, but I'm only a mild "agree" vote because it's not really something worth policing. Maybe this is what people mean by disagree-reacting the post itself?
Oisín Considine @ 2025-04-30T21:26 (+1)
Should EA avoid using AI art for non-research purposes?
While I understand that it may help people better understand some things through imagery while saving time and effort, I do have some ethical concerns about the scraping of artwork from artists without their consent. On top of that, at least currently, you can usually spot AI art, as opposed to human-created art, from a mile away, and to me it looks a bit cheap. So unless it's really needed, I believe there should be a norm against the use of AI art.