No topic generates more heated debate in indie publishing communities right now than AI-generated book covers. Designers argue they're devaluing the craft. Authors argue they're democratizing access to professional-quality imagery. Readers, for the most part, don't know or care how the cover was made — they just know whether it makes them want to read the book.

CoverCrushing has now run hundreds of tests that include AI-generated cover variants, enough to offer something more useful than opinion: data. The results are more nuanced than either the AI enthusiasts or the skeptics predicted.
AI-generated covers perform very differently across genres. In fantasy and science fiction — genres where readers expect otherworldly, impossible imagery — AI-generated covers are performing competitively with traditionally designed covers. The ability to generate photorealistic images of things that don't exist (alien landscapes, impossible architecture, mythological creatures) is genuinely valuable in these genres.
In romance, the results are more mixed. AI-generated human figures still have the "uncanny valley" problem — faces and hands that look almost right but not quite. Romance readers, who are highly attuned to emotional expression and physical attractiveness, notice this. AI romance covers that avoid close-up human faces perform significantly better than those that feature them prominently.
In thriller and crime, AI covers are performing below average. The genre relies heavily on photographic realism — a dark city street, a figure in shadow, a crime scene detail — and readers can often detect when an image has been generated rather than photographed. The detection triggers a subtle credibility discount.
The most significant finding from CoverCrushing's AI cover data is what we call the "looks AI" penalty. When readers can identify a cover as AI-generated — even without being told — it scores lower on purchase intent, independent of aesthetic quality.
This penalty is not universal. In fantasy and sci-fi, it's minimal. In contemporary fiction and thriller, it's significant. The penalty appears to be driven by an association between AI imagery and lower production quality — an association that may fade as AI tools improve and become more widely used.
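The genre-by-genre penalty described above boils down to simple arithmetic on paired test scores. The sketch below is illustrative only: the numbers are made up, not CoverCrushing data, and the `scores` structure (per-genre purchase-intent ratings split by whether respondents flagged the cover as AI-generated) is an assumption about how such results might be organized.

```python
from statistics import mean

# Hypothetical purchase-intent scores (1-10 scale), grouped by genre and by
# whether respondents identified the cover as AI-generated. Illustrative only.
scores = {
    "fantasy":  {"flagged_ai": [7.1, 6.8, 7.0], "not_flagged": [7.2, 7.0, 7.1]},
    "thriller": {"flagged_ai": [5.2, 5.0, 5.4], "not_flagged": [6.6, 6.8, 6.5]},
}

def looks_ai_penalty(genre_scores):
    """Per-genre gap in mean purchase intent between covers readers did
    not flag as AI-generated and covers they did flag."""
    return {
        genre: round(mean(g["not_flagged"]) - mean(g["flagged_ai"]), 2)
        for genre, g in genre_scores.items()
    }

print(looks_ai_penalty(scores))  # small gap for fantasy, larger for thriller
```

With numbers like these, the penalty comes out near zero for fantasy and well over a point for thriller, mirroring the pattern described above.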
The highest-performing AI-assisted covers in CoverCrushing tests are not purely AI-generated — they use AI imagery as a starting point, then apply traditional design techniques (typography, color grading, composition adjustments) to produce a final cover. This hybrid approach captures the cost and speed advantages of AI while avoiding the "looks AI" penalty.
If you're considering an AI-generated cover, the data suggests:
4. **Consider the hybrid approach.** AI-generated imagery + professional design = the best of both worlds.
**Will readers eventually stop caring whether covers are AI-generated?**

Probably. The "looks AI" penalty appears to be driven by novelty and association with early, lower-quality AI tools. As AI imagery becomes ubiquitous and quality improves, the penalty will likely diminish. But we're not there yet.

**Are there genres where AI covers are already performing at parity with traditional covers?**

Yes — fantasy and sci-fi sub-genres that rely on impossible imagery are the clearest examples. In these genres, AI covers are performing competitively with traditionally designed covers in CoverCrushing tests.

**Should I disclose that my cover is AI-generated?**

This is an ethical question the data can't answer. What the data can tell you is that disclosure doesn't appear to significantly affect purchase intent: readers who are told a cover is AI-generated don't score it differently from readers who aren't told.