Parents and Children Share Honest Reactions to AI-Generated Images in Kids’ Storybooks


A new study from North Carolina State University offers a detailed look at how both parents and young children respond to AI-generated illustrations in children's storybooks. The research explores what families like, what concerns them, and what they believe publishers should do when using AI tools in books meant for early readers. The topic is timely: AI-assisted art is becoming far more common in digital publishing and on self-publishing platforms, yet little work has examined how real families actually feel about it. This study helps fill that gap with concrete observations.

What the Study Looked At

Researchers worked with 13 parent-child pairs, each made up of one child aged 4 to 8 and at least one parent. Each pair read two of three short stories provided by the research team, and the researchers tried to keep the experience as close as possible to the families' normal storytime routines. Every story included a blend of three illustration types:

  • Fully AI-generated artwork
  • Human artwork enhanced using AI tools
  • Completely human-created artwork

After each story, the children answered simple, age-appropriate questions about how much they enjoyed the story and the illustrations, while the parents took part in longer interviews about their preferences and concerns. The goal was to understand how much the presence of AI-generated imagery changes the reading experience, if at all.

How Children Reacted to AI-Generated Images

According to the study, children noticed emotional inconsistencies more often than their parents did. For example, when characters in the story were described as happy or excited, some AI-generated images displayed faces that looked tense or angry. Children pointed out when the emotional tone of the images didn’t align with the emotional tone of the text. This happened because AI tools still struggle to interpret emotional cues reliably, especially when context matters.

Older children also picked up on errors in realism, such as inaccurate animal behavior, incorrect body proportions, or strange object sizes. These inconsistencies didn't just seem odd; they occasionally pulled children out of the story, making the reading experience feel less natural.

What Parents Thought About AI-Generated Art

Parents had a wide range of opinions, but certain patterns stood out.

Most parents were open to AI-generated illustrations, but only under specific conditions. They said they would feel comfortable if the story text was clearly written by a human and if the AI-generated images were screened by experts, such as librarians, educators, or other professionals familiar with children's literature. This review process mattered because parents worried that certain kinds of errors in AI images might encourage unsafe behavior or teach incorrect information about the real world.

For example, if a science-themed story depicted an animal doing something that could be dangerous or impossible, parents felt that could lead to misunderstandings. This concern was stronger when the story was realistic or educational rather than a fable or fantasy.

Some parents, however, had fundamental objections to using AI at all. These parents believed that AI-generated art diminished the role of professional human illustrators or simply disliked the sometimes distorted or “artificial” appearance of AI images. A smaller subset expressed discomfort with AI being used in creative fields that traditionally depend on human talent and imagination.

Additionally, most parents said they were not comfortable with the idea of using AI to write the story text itself. They accepted AI as a tool for illustration more than as a tool for storytelling.

The Role of Labels and Transparency

The researchers experimented with placing small labels beneath each illustration stating whether it was AI-generated. Interestingly, most parents and children did not notice these labels at all, and those who did often found them distracting.

Instead, parents suggested that publishers include a clear notice on the book's cover stating that AI tools were used for the illustrations. They preferred this approach because it let them make an informed choice before purchasing or reading the book, without interrupting the flow of the story.

Why Accuracy Matters More for Certain Kinds of Stories

The study highlighted an important nuance: realism affects expectations.

  • In fantasy or imaginative tales, AI’s artistic mistakes or unusual interpretations weren’t seen as major issues.
  • But in realistic or science-oriented stories, accuracy mattered significantly more.

Parents wanted illustrations that didn’t mislead their children about real-world animals, objects, or behaviors. Older children, too, seemed more attuned to errors when the story setting resembled everyday life.

What Experts Suggest Based on the Findings

The research team emphasized three main takeaways that publishers and authors should keep in mind:

  1. Use a simple cover label to show whether AI was involved in creating the illustrations. Page-by-page labels aren’t helpful and distract from reading.
  2. Have experts review AI-generated illustrations to catch mistakes related to emotion, realism, or safety.
  3. Understand that different types of stories need different levels of accuracy. A fictional fable can tolerate more AI quirks than a science-themed picture book.

Additional Context: AI and Children’s Media

This study is part of a broader conversation about how AI affects children. AI-generated images and stories are becoming more accessible, which means parents, educators, and publishers need to think carefully about how these tools influence learning and imagination.

Current research across multiple fields shows a few consistent themes:

  • Children are often excited by AI tools, especially ones that let them generate art or stories.
  • However, young kids may have difficulty understanding the difference between AI-generated content and real-world facts.
  • Emotional accuracy is essential for early readers, because young children rely heavily on illustrations to interpret story meaning.
  • AI tools still frequently misinterpret context, body language, and spatial relationships—areas where children are surprisingly perceptive.

This means that while AI can be a powerful creative aid, it still needs careful oversight when used in materials meant for children.

What This Means for the Future of Children’s Books

As AI becomes more common in publishing, parents may increasingly expect transparency about how books are created. The study shows that families don’t reject AI outright; instead, they want thoughtful, responsible use of the technology. Human-authored text, expert-reviewed illustrations, and clear labeling are all key elements of responsible integration.

For authors and illustrators, this research signals a shift. Traditional art may continue to be valued, but hybrid workflows—where humans and AI collaborate—might become normal, provided quality and safety standards are upheld. Publishers who want to incorporate AI tools must remain aware of the emotional and developmental needs of young readers.

In essence, this study doesn’t say AI should replace human illustrators. Instead, it encourages a balanced, transparent, and carefully reviewed approach that respects both creativity and child development.

Research Paper:
"They all look mad with each other": Understanding the Needs and Preferences of Children and Parents in AI-Generated Images for Stories. International Journal of Child-Computer Interaction.
https://doi.org/10.1016/j.ijcci.2025.100787
