Netflix’s Sketchy History of Inappropriate Content


photo courtesy of Julia Henning

Netflix promoted its new movie Cuties with a different cover than the original French version; the image sparked controversy for appearing more kid-friendly than the film's actual content.

Julia Henning, Editor-in-Chief

[Content Warning: This article includes discussions of suicide, sexual assault and self-harm]


With the release of its new and controversial movie Cuties, audiences and critics have questioned whether Netflix takes morals seriously, especially given its history of promoting age-inappropriate shows to younger audiences and its peculiar recommendation algorithm.

Cuties, originally titled Mignonnes, was released by Netflix in late 2020. According to the filmmakers, the film was intended as a critique of the hypersexualization of young girls, but it came across as overly sexual and inappropriate to many viewers. Some American politicians even called for its removal from Netflix US.

Because the actresses play characters their own ages, 11 and 12, children browsing Netflix may be inclined to watch based solely on the movie's cover image. The explicit and inappropriate content of the movie, as portrayed by these young girls, is not appropriate for the ages Netflix promoted it to.

This is not the first time Netflix has promoted content to the wrong age group. The releases of the original series ‘13 Reasons Why’ and ‘Insatiable’ sparked controversy over the messages they sent teenagers about mental health and obesity, respectively.

13 Reasons Why tells the story of a high school girl who dies by suicide and leaves behind recordings for her classmates to decipher. Its romanticization of suicide and self-harm and its on-screen rape scenes drew serious criticism from educators and parents. Olivia Davis (USG ‘24), like many others, saw the show under the ‘suggested’ section of the Netflix app and watched it for the first time at the age of 12.

“In Season 1, they bring the topic of rape a little too much,” said Davis. “And at the time, I didn’t really understand.” Unfortunately, many young students first encountered these themes of rape and suicide through the shows Netflix promoted to them.

Even seemingly lighthearted shows such as Insatiable, which appear harmless from the outside, carry deeper hidden messages, such as starving yourself to lose weight. For kids who have not heard of more constructive ways to cope with body image issues or self-harm, or who simply haven’t developed the maturity to consume such mature content, shows like these may shape what they believe is “realistic.”

Jenna Lott, upper school counselor, encourages students to talk with their parents and to seek out accurate information before engaging with such heavy topics.

“I think it’s so important to prevent them from learning about a topic in a harmful way or in a way that is glamorized. These things are real and they’re happening all around and to educate a child is our duty and to help them navigate hard times and make choices when they’re in a situation that is harmful. I think I would encourage parents to have these hard conversations early on and that it doesn’t encourage the student to do or engage in those things, it is just educating them. Or parents can reach out to the school counselor to have that conversation,” said Lott. 

In response to the criticism, Netflix created a resource portal with contact information for the Crisis Text Line and the National Suicide Prevention Lifeline. But as critics have pointed out, viewers are unlikely to seek out this information on their own. If the show’s content modeled positive ways of coping with mental health struggles, its young audience would see more ways to reach out rather than turning to suicide.

“I think there’s this fear of silence and being alone in our own thoughts,” said Lott. “And that’s why regulating our social media intake and our Netflix intake is so hard. People don’t want to sit and feel feelings and it’s hard. I think there’s a duty with parents and families to teach healthy coping skills.”

The ‘suggested’ algorithm, which led Davis to 13 Reasons Why, is another problematic aspect of Netflix’s business model. As Vox originally explained, Netflix selects a thumbnail for each show that matches the other shows you have watched. For example, if you watch shows from the “Diversity Matters” collection, which includes many shows with BIPOC characters, the thumbnail for another show may feature the only person of color in its cast. This is not only misleading, but it also singles out those actors as a ‘diversity factor,’ devaluing their acting skills and using them for the color of their skin.

Netflix could take several steps to turn things around. First, shows could present more constructive content, depicting realistic and positive outcomes for those battling obesity and mental health struggles. Second, a system that stops promoting mature shows to middle schoolers would help keep kids from discovering this content on-screen before hearing about it from their school or parents. Finally, thumbnails could be kept consistent across accounts, giving every viewer a realistic and non-discriminatory presentation of a show’s content.