July 27, 2025

H&M's Digital Doppelgängers: When your fashion model is a copy

H&M surprised the industry on 2 July 2025 by posting its first lookbook built with "digital twins". Thirty real models were body-scanned once; every other pose, backdrop and lighting setup came from software. Business of Fashion, Fashion United and other outlets covered the drop within hours, and comment threads filled with cheers about sustainability and worries about lost jobs.

Why the retailer went virtual

Fashion photography is expensive. A single global shoot means flights, freight, day rates and post-production. H&M's creative team wanted faster content and a wider mix of body shapes without adding travel days. Digital twins let them swap jeans, tweak sizes and change scenery in minutes, all while the human model stays at home. The brand also promised that each model keeps legal control of their clone and shares in any new income that clone generates.

Early internal tests convinced the finance team. H&M says the virtual workflow cut image-production costs by nearly half and sped up delivery of social assets from weeks to days.


New questions for the industry

The Model Alliance and other labour groups warn that careers could turn into “disposable data” if clones replace real shoots. Photographers, stylists and hair artists echo the fear. Similar tension surfaced when Mango introduced AI avatars last year.

Regulators are moving. The New York Fashion Workers Act now requires written consent before a brand can create or use a digital replica, and the upcoming European Union AI Act will demand clear labels and bias checks for any synthetic imagery.



Does it move product?

H&M has not published full sales numbers yet, but the first denim carousel on Instagram drew engagement well above the brand's average, according to the Business of Fashion report. Mango and Levi's saw similar spikes when they tested AI models, which suggests that novelty still works. The open question is fatigue: feeds could feel crowded once every fast-fashion label pushes flawless avatars.


A reflection on where humans fit

AI twins bring speed, flexible sizing and lower travel costs. They also raise a tougher question: how much of fashion's spirit relies on real bodies in front of a camera? Live shoots capture unplanned moments, such as a quick laugh between takes or the way denim folds when someone crouches. These details show how clothes behave in real life, and they feel honest in a way that flawless computer images cannot match.

The new tools can widen representation and cut waste. They should add to the craft, not push it aside. When a brand starts with a small live shoot and later lets digital twins fill in extra angles, it keeps the fabric of human expression intact while trimming the parts that hurt budgets or the planet. The ecosystem of casting agents, stylists and young photographers keeps working, and producers gain room to test ideas without booking another flight.

Every label will draw its own line. The safest path is a dial, not a switch: turn virtual help up or down as each project evolves. The approach works as long as we remember that fashion began as a human story and still runs on that pulse.


Key takeaways

• Live shoots remain the heartbeat of fashion. They capture honest moments that digital models cannot fully recreate.

• AI twins work best as support. Use them to extend a campaign's range and cut excess cost or travel, not to replace people.

• Transparency is crucial. Always mark synthetic images so trust stays intact.

• Treat AI as a dial. Adjust the mix of real and virtual for each project rather than choosing one side forever.

• Training models on varied body shapes and skin tones helps the technology widen representation instead of narrowing it.


Images:
https://hmgroup.com/news/hm-continues-its-exploration-of-creativity-with-ai/
https://www.businessoffashion.com/articles/technology/hm-plans-to-use-ai-models/