Viral AI image trends in 2025: Ethical concerns and drawbacks

Following Nano Banana’s success, Chinese tech giant Meitu captured global attention with its 'AI snow scene' effect


The year 2025 will be remembered as the moment when social media feeds were completely transformed by an overwhelming wave of AI-generated images. From stunning digital art to mind-bending visual effects, artificial intelligence became the hottest and most talked-about creative trend, captivating millions of users around the world and reshaping the way we experience online content.

Whether it was cartoon-style portraits, hyper-realistic photos with favourite celebrities, or even digitally meeting one’s childhood self, these trends quickly gained global traction.

Below, we have listed the top three AI image trends that emerged in 2025.

Ghibli-style AI images


One of the biggest AI image trends emerged in March, when users began transforming their photos into dreamy, animation-style visuals inspired by Studio Ghibli. 

Powered by an image-generation tool inside ChatGPT, developed by OpenAI under CEO Sam Altman, the feature quickly went viral.

The AI tool allowed users to recreate visuals reminiscent of classics like Spirited Away and My Neighbor Totoro from Studio Ghibli, the studio co-founded by legendary animator Hayao Miyazaki.

While visually appealing, critics argued the trend blurred ethical lines by mimicking a highly distinctive artistic style without consent, compensation, or transparency.

Nano Banana image generator trends


Later in the year, Google Gemini’s Nano Banana image tool sparked another viral wave. Users uploaded selfies and generated Polaroid-style images alongside celebrities, creating convincingly realistic “memories” with stars like BTS’ Jungkook, Cristiano Ronaldo, Taylor Swift, and Lionel Messi.

The trend spread rapidly on Instagram, with many users struggling to distinguish fantasy from reality. 

Meanwhile, particularly in South Asia, the tool also fueled the viral “vintage saree” trend, producing retro, Bollywood-inspired portraits that blended cultural expression with AI-driven aesthetics.

Meitu’s hyper-real snow portraits


Following Nano Banana’s success, Chinese tech giant Meitu captured global attention with its “AI snow scene” effect. 

The tool transformed selfies into stylised winter portraits with enhanced facial features and cinematic backdrops. Its winter-themed rollout aligned perfectly with seasonal moods, amplifying its viral appeal.

AI image trends and ethical concerns

However, the rise of AI-generated images came at a price: serious ethical, legal, and privacy-related concerns that sparked growing debate worldwide.

Uploading personal photos often meant unknowingly granting platforms access to biometric data that could be used for AI training, targeted advertising, or third-party sharing.

Copyright issues

Copyright issues were equally pressing. Many AI models were trained on copyrighted material scraped without permission, sparking legal debates.

Miyazaki himself has previously criticised AI-generated art, calling it “an insult to life itself.” 

Digital rights groups such as the Electronic Frontier Foundation have also questioned whether generative AI truly democratises creativity or simply benefits corporations at creators’ expense.

Misinformation, deepfakes, and loss of trust

Advanced AI tools made it easier than ever to create deepfakes: false images capable of spreading misinformation, damaging reputations, and eroding public trust.

As AI image tools continue to evolve, experts argue that stronger safeguards, clearer consent frameworks, and respect for individual and creative rights are essential.