According to Euromonitor and McKinsey, the global fashion industry generates $1.7 to $2.5 trillion in revenue, which makes it one of the biggest industries in the world. Yet it is also highly traditional and slow to adopt new technology. Some entrepreneurs believe this can change.
I talked to George Yashin, CEO & Co-Founder of ZERO10, the leading AR-fashion platform providing innovative tech solutions for brands entering the digital world, about the idea behind the project and the current tech trends that will disrupt the world of fashion.
Where did the idea for ZERO10 come from? What inspired you to create an app for users to try on digital clothes in real time and collect them in a digital wardrobe?
At first, there wasn’t a specific plan to launch an app or a platform. That decision was a natural consequence of a concept taking shape. The concept itself grew out of my experience working in fashion for many years and seeing all the problems and particularities of the industry. I came to the realization that I wanted to impact the fashion industry and make it more sustainable by eliminating the need to produce physical clothes. The most logical alternative to physical clothing seemed to be 3D digital apparel. However, 3D models can’t substitute for tangible items because you can’t try them on. Starting from that idea, our team began to think of ways to create the illusion of a real, flesh-and-blood person wearing digital clothing.
There are two methods of placing virtual clothes on a human body ― overlaying digital pieces on a photograph of that person or fitting them digitally in real time. The first approach was and still is popular among some players on the market. Their design teams render 3D models of clothing and place them on users’ photos with software such as CLO 3D and Photoshop. The process may take a few hours or even a couple of days and can be called manual because it must be done by a specialist each time. As a result, the client gets a ready-made picture of themselves wearing the digital piece and can use it on social media, but only on social media.
We didn’t want to turn to this method because there wasn’t any unique technological solution behind it, so it wouldn’t revolutionize the fashion industry. An even more serious objection was that it had nothing to do with a physical fitting. We knew exactly what direction we wanted to move in and started to work on our own real-time try-on technology. But then there was another issue to resolve: to give users the ability to try on digital clothes, we needed to provide them with the clothes. Moreover, for virtual apparel to be interesting, it wasn’t enough for it to be a technological advancement. We needed pieces that would be valuable to customers because of their uniqueness and the designer behind them.
Since real-time fitting was crucial for us, we needed to develop a process that would be smooth and resemble the experience in the physical world, yet still be achievable with today’s technology. Our focus on users’ needs and behaviors rather than on brands shaped the product and our decision to build a marketplace where brands would showcase and sell their items, and a digital wardrobe where users could keep their purchases. The most logical and convenient embodiment of that concept was an app, because users could easily use it for fittings, for choosing and storing virtual items, and for sharing their content on social media.
ZERO10 has shifted its main focus from users to fashion brands and now provides innovative tech solutions for brands entering the digital world. What’s the main difference in this approach?
We believe that the future of fashion is happening now. So we haven’t just developed our product for the present; we’re also laying the foundation for it to be widely used in the future. In the beginning, we saw people who would purchase digital clothing as our end users, but after a few months of work we came to understand that many of them were not yet ready to make digital fashion part of their everyday lives and exchange tangible pieces for virtual ones. Today they use AR to try clothes on, but only for fun and for creating content for social media.
We’ve noticed, though, that businesses — fashion brands and retailers — have started to show demand for AR and other emerging technologies. The business community has begun to see the power of these technologies to resolve some of their problems: customer engagement, omnichannel experiences, personalization, converting users’ attention into purchases, and so on. We firmly believe that we’ll be able to promote our technology among consumers through the brands and companies that are opinion leaders for them.
By creating this technology today, we’re not interested in getting a head start or chasing quick results. On the contrary, we are working for the future of fashion by developing a one-of-a-kind technology that can be easily adapted to the market’s needs, be it virtual fitting of clothing for e-commerce now or putting together a wardrobe for a metaverse a year from now. From our experience, the metaverse, Web3 and users as the ultimate consumers are a future that will become real and monetizable, but not for another two to three years. That’s why we create products for today but with our sights set on the future.
Despite our recent pivot to B2B, we haven’t given up on B2C. On the contrary, we’re planning further developments in this direction. But, understanding the modern world and the market’s specifics, we’ll be transforming the way we work with the end user. Take our latest project, the Open Call competition, which is aimed at attracting digital fashion early adopters — digital fashion creators and 3D designers. The creator economy, where users simultaneously make and consume content, is what’s on the front burner right now, and it’s going to grow even more in the future.
How can the B2B sector benefit from your technology?
We see huge potential for the fashion business to implement AR clothing try-on technology. As competition grows between online and offline retail, both face certain problems. Currently, e-commerce is dealing with a huge number of returns and a low rate of conversion from users’ attention into purchases. This is a direct result of users having no chance to try clothes on before buying them, and solutions like 360-degree product views or size guides are not effective enough to change the situation. That’s why implementing digital clothing try-on technology can help: it’s the counterpart of a physical fitting.
Our real-time try-on technology allows users to try on clothes the same way they would in the fitting room of a physical store. By offering brands and online retailers our SDK, a set of tools that lets e-commerce platforms easily add AR try-on to their apps and websites, we give their customers the chance to try things on before buying. The result is greater customer confidence during the selection process and, consequently, higher sales.
Today AR is being eagerly implemented not only in fashion but also by brands and companies that sell cars, furniture, shoes, beauty products and accessories via the internet. They do it to help their clients visualize how products will look on them or in their apartments and offices. Real-time try-on for clothing is the most complex part of the entire AR try-on sector because human bodies vary so widely in shape and proportion, which directly affects how convincingly a digital item looks on a person’s body in AR. At the same time, fashion is one of the biggest markets for the AR industry.
In offline retail, AR helps to engage customers. Consumers are actively going back to offline shopping; the numbers are close to pre-Covid levels. But consumers have changed: they’ve become more active in online shopping and have gotten used to the quick, personalized service provided by e-commerce. It’s also important to consider the preferences of the new generation of customers, Gen Z. Even though they use technology extensively, they still visit offline stores, mostly for the sake of the experience. Brands and retailers should be looking for new approaches to keep customers coming back to their physical spaces. AR try-on technology is one possible solution.
How do you enter into partnership with clothing brands? Who usually initiates it, and what are the requirements for a brand to get their items digitized by ZERO10?
It was hard to persuade brands to work with us in the beginning. We set the bar high from the start and decided to work with well-known designers rather than unknown names in order to create additional value for users. We wanted to let them try on pieces by their favorite designers and, at the same time, give brands the opportunity to try their hand at digital fashion.
At that time the discussion around virtual clothing had only just started in both the media and the industry, and most brands were not ready for it and didn’t understand the benefit. However, in about a year the situation changed drastically. The concept of AR try-on became a popular topic in the mass media, and brands came to understand how it could be useful for their business. Besides, by that time we had significantly improved our technology since its launch. Developing our own technology allows us to create more solutions and products, so we don’t have to limit ourselves to being an app with designers’ showcases. Now a lot of brands contact us directly.
What makes ZERO10 technologies different from other popular solutions on the AR-fashion market?
One of our main advantages is the unique combination of experience from a variety of fields ― tech, fashion, and product development. Such a mixture of backgrounds allows us to develop a technology that doesn’t exist in a vacuum but is applicable and feasible for the fashion industry, helping to solve businesses’ existing problems and meet brands’ needs. If our team had only a tech background, we wouldn’t be able to create a product like this because we’d lack an understanding of the market.
Developing the technology for virtual try-on and creating AR-enabled garments, ZERO10 is a full-cycle company. The ability to oversee all these processes in-house and stay fully in touch with the team at every stage of work is also an advantage. We also utilize recent innovations in computer vision and computer graphics and have a team of world-class engineers.
At the moment, there are only two companies on the market with their own AR clothing try-on technology. To distinguish our product from the competition, we constantly improve the quality of virtual fitting to bring the experience as close as possible to a physical one. It’s a widely known problem that AR try-on technology is in its early stages of development and, as a result, digital clothes often don’t look the same on a person as tangible items do in real life.
That certainly drives away many users who want to create high-quality content, as well as luxury brands, because the visual aspect, the way a piece of clothing looks, is crucial. It’s extremely important for them that the digital version of each item looks as close to the real thing as possible. We share this approach, and to achieve it we pay a lot of attention to developing our fabric simulation technology, which helps digital clothes look as lifelike as possible, down to the smallest details ― textures, materials and movements.
How does Cloth Simulation work? What technology did you use to get an accurate look on the body, and what difficulties did you encounter?
Cloth Simulation is one of the core technologies of the ZERO10 Try-On; it replicates the natural flow of fabric as it interacts with the human body. The same technology is used in the gaming and movie industries. However, our task was more challenging because we needed to simulate clothing in real time on a mobile device. We have done a lot of optimization to run the cloth simulation on the GPU.
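ZERO10 hasn’t published its solver, but the general technique behind real-time cloth (particles connected by distance constraints, advanced one frame at a time) is well established. Below is a minimal illustrative sketch in Python with NumPy; the grid size, constants and function names are assumptions of mine, and a production system would run this per-particle math in GPU shaders rather than NumPy.

```python
import numpy as np

# Minimal cloth step sketch: a grid of particles connected by distance
# constraints, advanced with Verlet integration. Real-time engines run the
# same per-particle math on the GPU; here it is vectorized for readability.

W, H = 16, 16                       # grid resolution (illustrative)
REST = 0.05                         # rest length between neighbors, meters
GRAVITY = np.array([0.0, -9.8, 0.0])
DT = 1.0 / 60.0                     # one frame at 60 fps

pos = np.zeros((H, W, 3))           # current particle positions
pos[..., 0] = np.arange(W) * REST   # lay the grid out in the x/y plane
pos[..., 1] = -np.arange(H)[:, None] * REST
prev = pos.copy()                   # previous positions, for Verlet integration

pinned = np.zeros((H, W), dtype=bool)
pinned[0, :] = True                 # pin the top row (e.g., the shoulder line)

def step(pos, prev, iterations=8):
    # 1) Verlet integration: x_new = x + (x - x_prev) + g * dt^2
    new = pos + (pos - prev) + GRAVITY * DT * DT
    new[pinned] = pos[pinned]
    # 2) Constraint relaxation: nudge neighboring particles toward REST distance
    for _ in range(iterations):
        for axis in (0, 1):         # vertical springs, then horizontal springs
            a = new if axis == 0 else np.swapaxes(new, 0, 1)
            delta = a[1:] - a[:-1]
            dist = np.linalg.norm(delta, axis=-1, keepdims=True) + 1e-9
            corr = 0.5 * (dist - REST) / dist * delta
            a[1:] -= corr
            a[:-1] += corr
        new[pinned] = pos[pinned]
    return new, pos                 # new becomes current, current becomes previous

pos, prev = step(pos, prev)         # advance the cloth by one frame
```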
A GPU, or Graphics Processing Unit, is a specialized electronic circuit that was designed for computer graphics tasks and later generalized for other workloads. Its main advantage is that it can perform far more arithmetic operations in parallel than a Central Processing Unit (CPU). However, writing efficient programs for the GPU (called shaders) requires experience and a deep understanding of how GPUs work internally. We also use shaders to simulate various materials, like fur, leather or denim, and to create visual effects, like burning.
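The studio’s shaders themselves aren’t public, but a "burning" effect of the kind mentioned here is commonly implemented as a per-pixel comparison against a noise texture. The sketch below expresses that standard dissolve technique in NumPy purely for readability; in practice it would run as a fragment shader on the GPU, and every name and constant is illustrative.

```python
import numpy as np

def burn_effect(rgba, noise, progress, edge_width=0.05):
    """Dissolve ('burn') a rendered garment layer as `progress` goes from 0 to 1.

    rgba:  (H, W, 4) float image in [0, 1] (the rendered garment)
    noise: (H, W) float noise texture in [0, 1]
    In a fragment shader the same comparison runs once per pixel, in parallel.
    """
    out = rgba.copy()
    burned = noise < progress                           # pixels already burned away
    edge = (~burned) & (noise < progress + edge_width)  # the glowing burn front
    out[burned, 3] = 0.0                                # burned pixels become transparent
    out[edge, :3] = np.array([1.0, 0.4, 0.1])           # tint the front orange
    return out

# Example: dissolve a dummy 256x256 garment layer halfway
h, w = 256, 256
garment = np.ones((h, w, 4))
noise = np.random.default_rng(0).random((h, w))
frame = burn_effect(garment, noise, progress=0.5)
```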
What is body segmentation and why was it essential to develop such technology?
Real-time body segmentation is used in a range of different applications. For example, body segmentation is used during Zoom calls to hide the background. At ZERO10 we are working on multi-class body segmentation, which allows us to segment different parts of the human body, including hands, face, hair, etc. These segmentation masks let us create a more realistic AR Try-On experience. For example, we use hand segmentation to make sure that virtual clothing does not hide the hands if they appear in front of the body.
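How ZERO10 consumes those masks internally isn’t documented, but the idea of hand-aware occlusion can be shown with a few lines of compositing math: draw the garment over the camera frame everywhere except where a hand mask says the hand is in front. The following Python/NumPy sketch is hypothetical; the function name, array layouts and scaling are my assumptions.

```python
import numpy as np

def composite_tryon(frame, garment_rgba, hand_mask):
    """Overlay a rendered garment on a camera frame with hand occlusion.

    frame:        (H, W, 3) uint8 camera image
    garment_rgba: (H, W, 4) float garment render, alpha in [0, 1]
    hand_mask:    (H, W) float in [0, 1], 1 where the hand is in front of the body
    """
    # Suppress the garment's alpha wherever the hand mask says the hand occludes it,
    # then do standard alpha blending over the camera frame.
    alpha = garment_rgba[..., 3:4] * (1.0 - hand_mask[..., None])
    out = frame.astype(np.float32) * (1.0 - alpha) + garment_rgba[..., :3] * 255.0 * alpha
    return out.astype(np.uint8)
```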
Developing high-quality body segmentation requires considerable effort in its own right.
How would you describe 3D body-tracking technology? Have you encountered problems with analyzing data from body scanners?
3D body-tracking technology helps us understand the user's pose and body shape, which are necessary ingredients for a good fit. 3D body tracking is a very challenging research problem; a number of papers on it are published at the top computer vision conferences every year. The main thing that makes body tracking so difficult is our desire to make it work in 99 percent of cases. It is relatively easy to create technology that works in 50 percent of cases, but it is hard to make technology that works robustly across all poses and for all body shapes. Our technology does not use body scanners; we think that would be an unnecessary extra step for our customers. We analyze the user’s body pose using the image from an iPhone camera.
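ZERO10’s tracking stack is proprietary, so purely as an illustration of the kind of per-frame signal such a system consumes, here is a short sketch using the open-source MediaPipe Pose estimator (no relation to ZERO10) to pull body landmarks from a single camera image. The landmark choice, the placeholder file path, and the idea of anchoring a garment to torso points are assumptions of mine, not their pipeline.

```python
import cv2
import mediapipe as mp

# Illustrative only: extract body landmarks from one camera frame with the
# open-source MediaPipe Pose model. A production try-on system would feed
# comparable pose and shape estimates into garment placement and simulation.
image = cv2.imread("frame.jpg")           # "frame.jpg" is a placeholder path
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

with mp.solutions.pose.Pose(static_image_mode=True) as pose:
    result = pose.process(rgb)

if result.pose_landmarks:
    lm = mp.solutions.pose.PoseLandmark
    for name in (lm.LEFT_SHOULDER, lm.RIGHT_SHOULDER, lm.LEFT_HIP, lm.RIGHT_HIP):
        p = result.pose_landmarks.landmark[name]
        # Normalized image coordinates; a try-on pipeline might anchor the
        # garment mesh to torso landmarks like these before simulating the cloth.
        print(name.name, round(p.x, 3), round(p.y, 3), round(p.visibility, 2))
```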
What are the next goals for ZERO10? Are you planning to add new features?
Since we've pivoted to B2B, our team has been actively developing new products for retail, both e-commerce and physical stores. At the moment, we are busy preparing to launch our SDK for e-commerce, which will be out soon. For offline retailers, we are developing the concept of pop-up stores.
We will showcase the project in New York in September. It will combine the essence of the physical and online shopping experience, but with digital-only items. It’s our first step toward developing innovative pop-ups. Later, we are going to present a completely new concept of digital retail spaces that doesn’t require any physical attributes, such as hangers or a cash register, or even any physical space at all.
We believe that the future of shopping is inside the mobile device’s camera, not inside a website. Next year we are going to ramp up our efforts to turn our in-camera AR try-on technology into a new shopping tool.