Disclaimer: Mind Over Matter is an editorial series. The featured opinions in this article are not an endorsement of Vue.ai or Mad Street Den.
In 1997, Dr. Jeff Wu, who is currently with the Georgia Institute of Technology, said that it was ‘time in the history of statistics to make a bold move’.
Perhaps understanding the impact of such a step in the next century, he proposed that the field of statistics be renamed ‘data science’. In his inaugural lecture for the H. C. Carver Chair in Statistics at the University of Michigan, he bemoaned the layman’s tendency to equate statistics with the entirely different discipline of accounting, and went on to say that ‘several good names have been taken’, but ‘data science is likely the remaining good name reserved for us’.
Apart from the oomph factor — ‘statistician’ paints a picture of a staid, old-school accountant, while ‘data scientist’ clearly says new-age whiz — the change in nomenclature also reflected the changing times. Data science was born out of the coupling of statistics and computer science.
Institutions like WGSN dedicate themselves to unearthing trends and predicting the future for fashion businesses. While they’re increasingly turning to big data technologies to take the guesswork out of connecting art with commerce, more advanced data science has the potential to be a veritable juggernaut in this space.
Stitchfix has seen massive success by combining data science with the human touch of real stylists for their subscription eCommerce model. Every brand, physical store, eCommerce website and fledgling affiliate marketing blog crunches numbers because the fashion industry hinges on one very important metric: sales.
There is a constant flow of data in and out of retail systems every day, and businesses mine a fraction of it to understand the ebb and flow of finance and consumer sentiment. But just as we get comfortable with analyzing large amounts of data, there are other curve balls heading our way. And beyond those curve balls, the future holds an immensity of business dark matter — the vast unknown of the zettabytes of data we are yet to record.
A big challenge facing the fashion industry right now is time to market. With the rise of fashion immediacy, brands are under pressure to minimize the time between shows and sale.
One of the images that comes to mind is an Instagram story in which Mr. Ralph Lauren personally opened the doors to his flagship women’s store on Madison Avenue during NYFW 2016, after presenting his first ‘See Now, Buy Now’ collection. The store was already stocked with pieces from the show, and all of them could be bought, a radical change from how couture and high fashion used to work.
Immediacy is not just a random fad. It reflects the zeitgeist of the modern shopper, who wants instant gratification every time they reach for their phone to check new content.
It’s forcing the adoption of new operational mechanisms because everyone is privy to what popular celebrities like Gigi Hadid and Kendall Jenner are doing right now through Snapchat. They can follow the lives of famous designers and see the inner workings of fashion brands. People dedicate time every day to skim, survey and follow what the industry puts out for display, and they consume storified data through multiple channels. And while they survey, their reactions are being viewed and surveyed in turn, and transformed back into data.
This surveillance is different from the chic kind that Camilla Nickerson styled for Vogue. It’s not hidden cameras capturing allure from across the building. It’s devices and machines that are out in the open or placed behind a layer of abstraction, capturing your behaviour objectively, seeing you through the lens of business.
Every click, tap and purchase is captured online. Every aisle choice, product pickup and facial emotion can be ‘seen’ in the physical store. And a large part of this information will become outdated as new data keeps rolling in. It is then relegated to the data graveyard where, perhaps, a chance query could dig it up again before it is forgotten or lost forever.
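As a rough illustration of what capturing both worlds in one place might look like, here is a minimal sketch of a unified retail event record. The `RetailEvent` structure and its field names are hypothetical, chosen only to show how an online click and an in-store pickup could share a single shape that later queries can dig back up.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical unified event record: the same structure can describe an
# online click or an in-store product pickup, so both live in one store.
@dataclass
class RetailEvent:
    shopper_id: str                 # pseudonymous identifier, not a name
    channel: str                    # e.g. "web", "app", "store"
    event_type: str                 # e.g. "click", "purchase", "aisle_entry", "pickup"
    item_id: Optional[str] = None   # product involved, if any
    detail: dict = field(default_factory=dict)  # free-form context (aisle, emotion label, ...)
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Two events, one from each world, captured in the same shape.
events = [
    RetailEvent("u123", "web", "click", item_id="sku-998"),
    RetailEvent("u123", "store", "pickup", item_id="sku-998", detail={"aisle": "denim"}),
]

for e in events:
    print(e.channel, e.event_type, e.item_id, e.timestamp.isoformat())
```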
With timely relevance coming into focus, data can’t be confined by the parameters of inventory manuals, annual growth projections or quarterly sales reports anymore. It’s turning into a living universe that we see, hear and feel. In other cases, it’s a world of data that we can’t perceive but that has a tangible impact on us. Samantha Lee, co-founder and Director of UniversalAssemblyUnit, says,
“So much about the world we live in is invisible to our bodily senses. We cannot see electromagnetic waves or smell data. We use technology as a means to evolve our senses to better navigate the world around us. The next significant step is when we move away from an ocular-centric virtual reality to one which encompasses sound, touch, smell, and taste. Then we will have the ability to conjure up real and meaningful presence in virtual space.”
Take, for instance, Datum Explorer — UAU’s multi-platform project where data from a 3D scanner and a binaural recording device installed in a forest was recast in different environments. In 2014, they projected the dataset of points onto its original environment, a forest in East Sussex.
This cohabitation of the real and the virtual could well be the stuff of our futures. Just as we have characteristics which set us apart from everyone else in the real world, our virtual selves will develop a language of characteristics that thinking, seeing machines can read.
Data science and AI give us a two-way spyglass where the parties on either side are viewing a constructed reality. In between, there’s a third eye — machines that take the stream of data coming in, understand it and show each party the interpretation of the data that interests them the most, while simultaneously profiling preferences and mapping them to individuals.
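To make the ‘third eye’ idea a little more concrete, here is a minimal sketch of the profiling half of that loop: counting which categories a shopper interacts with and re-ranking what they are shown accordingly. The interaction log, functions and weights are hypothetical illustrations, not a description of any real system.

```python
from collections import Counter

# Hypothetical interaction log: (shopper_id, item_category)
interactions = [
    ("u123", "denim"), ("u123", "denim"), ("u123", "sneakers"),
    ("u456", "dresses"), ("u456", "heels"),
]

def build_profile(shopper_id, log):
    """Profile a shopper as normalized category preferences."""
    counts = Counter(cat for uid, cat in log if uid == shopper_id)
    total = sum(counts.values()) or 1
    return {cat: n / total for cat, n in counts.items()}

def personalize(catalog, profile):
    """Re-rank catalog items so categories the shopper favours come first."""
    return sorted(catalog, key=lambda item: profile.get(item["category"], 0.0), reverse=True)

catalog = [
    {"name": "Slim jeans", "category": "denim"},
    {"name": "Summer dress", "category": "dresses"},
    {"name": "Canvas sneakers", "category": "sneakers"},
]

profile = build_profile("u123", interactions)
for item in personalize(catalog, profile):
    print(item["name"])
```

The same loop run in reverse — feeding these re-ranked results back as the shopper’s only choices — is exactly the filter-bubble risk discussed later in the piece.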
So as we move away from numbers on a sheet, we’re not just dealing with data that’s captured from and recast into the real world. We’re also building a version of ourselves online with our own personal data.
From the time we created our first MySpace accounts or used avatars in Yahoo (or going back even further, chose usernames in an AOL chatroom), we’ve been creating virtual representations of ourselves.
At Oculus Connect 3, the developer conference held in the first week of October 2016, Mark Zuckerberg gave a live demonstration of social VR avatars. Your virtual self could ‘draw’ things into existence, play a game, change your environment and have conversations with people both inside and outside VR.
And you can create Oculus avatars much the same way as you pick your colors and outfits in Pokémon Go, but Oculus goes one step further in trying to make the avatar look more realistically like you. (In a light moment in the demo, Priscilla Chan asks her husband why he looks like Justin Timberlake.) But what about our invisible selves? Fashion businesses now have the ability to deconstruct ‘small data’ to create individual personas with AI and computer vision.
Director Frédéric Tcheng, in conversation with Alexa Chung for the first part of her documentary series ‘The Future of Fashion’, said:
“Fashion…is superficial because it is literally on the surface. You see it. But if you look a little deeper…you see things about the inside of society, and how people perceive themselves.”
Brands are beginning to deconstruct people’s inner selves for a better understanding of their behaviour in retail settings. And technology is letting us articulate style and fashion stance with actions, and have them translated into custom-fitted experiences. This has the potential to bring a wider spectrum of ideas, beauty and acceptance to the table through commerce, without taking away anything from our humanity.
Data science and AI are shaping up to be the great levellers of our times. For brands, it can mean a single stream of truth that lets them know a person and cater to their individual needs, regardless of market segment, geography or channel. For people, it offers a more efficient, enjoyable way to cut through the noise and find what they want.
But there’s a caveat here. Are data models passive entities that record intent and behaviour, or do they have a heavy hand in influencing the direction of both? Samantha Lee says that we must be careful to avoid that trap.
“We have to be careful not to get caught in the self-perpetuating bubble of Eli Pariser’s book ‘The Filter Bubble’, where personalisation algorithms reflect a superficial vision of ourselves, reducing us to data, and ultimately molding our personalities into Netflix categories. It’s important to be as unpredictable as humanly possible. After all it is algorithms that should be learning from us, not the other way around.”
With the rise of personal surveillance and data collection, businesses have a responsibility to treat people’s data footprints with care. Building authentic relationships online, offline and on social media is of paramount importance in the trust economy. And it’s up to leading technology adopters to set the ground rules for using this 360-degree vision responsibly, for the next generation of fashion brands.