What I Saw in Our Smart Mirror
Runway scene from Suzhou Art & Design Technology Institute event (c) Balazs Fejes, 2018

I recently spent a fun weekend in Suzhou, showcasing one of our R&D prototypes. We at EPAM have been working with the Suzhou Art & Design Technology Institute to integrate our prototype Smart/Augmented Reality Mirror into the fashion department's graduation event. The runway show was complemented by our booth area in the exhibition hall, and visitors were able to virtually try on the pieces from the runway show right after the event.

It was amazing to see our collaboration result in great content and a great visitor experience! And I got to do a photo shoot for the runway event, a first for me :) But the most rewarding part was seeing the prototype product in action, and letting a couple hundred people experience it for the first time.

As with every product, we had an initial vision, which we had to adjust, refine, and at times drastically pivot during the development phase, based on technology constraints, creative input, and personal testing experience. But putting the product into a live environment and getting a large group of people to play with it generated, of course, another level of insight. Let me share some of the realizations that will help us take this research project further.

Augmented reality and the fitting algorithm

Many companies have already launched Smart Mirrors and Augmented Reality-based virtual clothing experiences. Even my own company has built quite advanced prototypes, relying on 3D cameras and AR/VR frameworks to deliver a convincing virtual fitting experience. But the product category is still in its infancy. There are many challenges that I think will push wide-scale adoption out for a few more years.

One key constraint is that 3D fitting requires the clothes themselves to exist either as a CAD-drawn 3D model or as a 3D-scanned one. Clothing brands would need to add this step to their content production pipeline. Even just creating appropriate-quality product shots, managing the related digital assets, and distributing them across the various digital and print, online and offline channels is a huge challenge. Adding a 3D-scanning step is not a small investment in effort or money.

Considering the application in luxury, high-end fashion, my understanding is that the presented outfits really come together last-minute before the runway show. By going "simple" with a 2D-overlay approach for the fitting, we enabled a really simple process of uploading the product photos just a couple of days before the runway event. The integrated Smart Mirror experience, tailored for this particular runway show, required very little effort.
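The core of the 2D-overlay idea is straightforward: anchor and scale the product photo against tracked body landmarks. Here's a minimal sketch, assuming a pose-tracking library supplies shoulder keypoints in screen coordinates; the keypoint names and the garment metadata fields (`shoulderWidthPx`, `neckOffsetPx`) are illustrative, not our actual schema:

```javascript
// Sketch: anchor a 2D clothing sprite to a tracked body.
// Assumes left/right shoulder keypoints in screen coordinates.
function placeSprite(keypoints, garment) {
  const ls = keypoints.leftShoulder;
  const rs = keypoints.rightShoulder;

  // Scale the sprite so its shoulder line matches the detected shoulder span.
  const shoulderSpan = Math.hypot(rs.x - ls.x, rs.y - ls.y);
  const scale = shoulderSpan / garment.shoulderWidthPx;

  // Anchor the sprite at the shoulder midpoint, shifted up to the neckline.
  return {
    x: (ls.x + rs.x) / 2 - (garment.imageWidthPx * scale) / 2,
    y: (ls.y + rs.y) / 2 - garment.neckOffsetPx * scale,
    scale,
  };
}
```

The appeal of this approach is exactly what made our last-minute content pipeline possible: the only per-garment "content production" is a cut-out product photo plus a couple of measured offsets.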

How's the fitting experience in 2D? We listened to the visitors' feedback, and the trend was very easy to spot:

  • Fashion professionals (designers, teachers) and technologists all noted the lack of 3D fitting and the somewhat quirky experience of our prototype: the 2D sprite trying to catch up with the moving body.
  • Visitors, who were interested in the clothes seen on the runway, all enjoyed the experience of "putting on" specific pieces that they liked, and sharing the resulting photo image on their phones via the generated QR code.
  • Shop owners and clothing-brand small-business owners immediately asked for the price and availability of the mirror, having noticed the buzz created by our AR booth and the delight of the visitors.

I'm completely convinced that 3D is the end goal for this type of functionality. At the same time, something that is 2D, fun for the customer, easy to deploy for a shop or event, and easy to generate content for is overall a good proposition.

Improvement opportunities for the 2D experience:

  • A better composite image, with improved blending/antialiasing/light matching for the overlaid image. At some moments the image looks very convincing; we just need to make those moments last longer, even as the lighting or body position changes.
  • Faster tracking of the body and its movements. For this prototype we went with a JavaScript-based set of frameworks and tools; I'm not sure whether we'd need to drop down to a different stack to dramatically improve the speed of body tracking, or whether optimizing the current JavaScript code would suffice.
  • I think the "floatiness" of the overlaid clothing piece can be removed entirely: the sprite should only be displayed fully once the customer's body has been static for half a second or so and we know we have the right position. When there's movement and we need to shift the position, we could hide the image or display just a light silhouette for fitting.
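That last idea amounts to a small stability gate between the tracker and the renderer. A minimal sketch, assuming the tracker reports a body anchor point every frame; the 5 px jitter threshold and 500 ms settle time are guesses that would need on-device tuning:

```javascript
// Show the full sprite only after the tracked anchor point has stayed
// within a small radius for a settling period; otherwise show a silhouette.
class StabilityGate {
  constructor(maxJitterPx = 5, settleMs = 500) {
    this.maxJitterPx = maxJitterPx;
    this.settleMs = settleMs;
    this.anchor = null;     // position we are settling around
    this.settledAt = null;  // timestamp when the anchor last moved
  }

  // Called once per frame with the tracked point and a timestamp in ms.
  // Returns 'sprite' when stable, 'silhouette' while still moving.
  update(point, now) {
    const moved =
      !this.anchor ||
      Math.hypot(point.x - this.anchor.x, point.y - this.anchor.y) >
        this.maxJitterPx;
    if (moved) {
      this.anchor = point;
      this.settledAt = now;
      return 'silhouette';
    }
    return now - this.settledAt >= this.settleMs ? 'sprite' : 'silhouette';
  }
}
```

A nice side effect of this gating is that it also buys time for the slower tracking pipeline: while the silhouette is shown, a laggy position update is far less jarring than a sprite drifting behind the body.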

Serious or playful?

While watching people gather around the mirror and try on pieces from a quite serious-looking, high-concept, artistic collection, I noticed one thing: literally not a single person took the experience seriously. Chinese women are extremely protective of their visual image: on average they spend about 30 minutes capturing and editing each shared selfie. Since the 2D fitting experience did not really allow the usual serious, mysterious poses, they laughed freely, snapped photos, and just went for the fun aspect of the experience. I'm thinking this could be pushed even further in a playful direction: applying comedic backgrounds, or WarioWare-style prompts asking the user to "strike a pose". This could improve the overall experience and de-emphasize the need for a precision fit.

A happy realization

One of the interesting features we developed for the Smart Mirror is an RFID-based link between the physical product item and the online, digital experience. Users can tap the clothing piece (or, more practically, its hanger) against the side of the mirror, and the product details and fitting experience pop up featuring the selected piece. The Suzhou Art & Design Technology Institute was kind enough to lend us a number of clothing pieces from last year's graduation work for our EPAM Showroom (which I'll talk about in another post). But it was not possible to get the brand-new pieces from the runway show, so we received beautifully printed photos on hangers, and we added the RFID tags to the hangers.
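Conceptually, the tap-to-try flow is a simple lookup from tag to garment. A minimal sketch, assuming the RFID reader delivers tag IDs as strings; the catalog entries and field names are invented for illustration, and in the real mirror the catalog would come from the uploaded show content:

```javascript
// Map RFID tag IDs (read from the hanger) to the garments they represent.
// Placeholder entries; real content is uploaded per event.
const catalog = new Map([
  ['tag-001', { name: 'Runway look 1', sprite: 'look1.png' }],
  ['tag-002', { name: 'Runway look 2', sprite: 'look2.png' }],
]);

// Called whenever the reader at the side of the mirror sees a tag.
// Returns the payload the mirror UI needs to start the fitting
// experience, or null for an unknown tag (e.g. a hanger with no content).
function onTagRead(tagId) {
  const garment = catalog.get(tagId);
  if (!garment) return null;
  return { action: 'startFitting', garment };
}
```

The same lookup works whether the tag is on a real garment or on a printed photo, which is exactly what let us swap in the hanger photos for the brand-new runway pieces without changing anything in the mirror.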

Our exhibition-hall booth had the hanging photos neatly arranged on an all-white frame next to the mirror, and the visual impact was quite striking: it looked and felt much more neat, clean, and fitting than the actual clothes in our Showroom. And it didn't just look good; from a functional perspective, it turned out to be incredibly convenient for a group of visitors to browse the collection collaboratively in a tactile, physical way, instead of the digital, on-screen experience. Selecting multiple pieces, queuing items up, sharing and showing favourites to each other: it all just worked in the most natural way.

Now I'm thinking about how we can remodel our own Showroom with this approach, and how to reframe the browsing/shopping experience for events and shops this way.

So overall, I was really happy to contribute to the success of the event, and I'm looking forward to seeing what new ideas we'll be able to explore for next year's graduation showcase!
