May 14, 2024
Welcome to the third article in our AI & UX series!
As we continue to meet with our technology partners, a consensus has emerged: AI is supercharging what we are able to test, optimize, analyze, and research through user experience (UX) work.
In this installment we dive even deeper into the potential use cases of AI with our next guest, Chip Lay, Director of Product Strategy at Fullstory.
Fullstory is a digital experience intelligence (DXI) platform that empowers businesses to deliver the best digital experiences across their websites and apps. Fullstory customers gain visibility into every step users take through the platform's product analytics and session replay capabilities.
Chip’s role has been an evolution. With a background in software development and analytics, he first joined Fullstory as a UX Designer and is now the director of product management for Fullstory's data products.
In the sections below, read about how AI is changing the way we improve user experiences and some future applications for DXI tools.
Chip:
It has been interesting to see this new wave of AI that has happened over the past year or so and how that is starting to change the way people engage with digital experiences. AI is opening up some new doors to how people think about the whole product development process and it is evolving fairly rapidly.
Tools are becoming more connected, from how designers conceptualize experiences in tools like Figma to how those experiences are actually brought to life through software engineering and product development. It is all becoming more and more of a connected experience. With AI supercharging the way in which products are created, you're really setting the stage for the next era of software in a sense, or the next era of experiences.
Chip:
You can imagine it as three phases of software. Phase one was the old world of shrink-wrap software: you would buy a box, install the software from the CD, and then go buy another box when you wanted the next version.
Phase two of software was really the age of internet software - SaaS products where companies could collaborate, develop, and start to observe people using the experience for the first time.
Even that is fairly new in the last 15 or 20 years. We went from having no visibility into this internet software - where we shipped it and hoped it worked - to this first age of analytics. People would instrument events, try to understand the user journey, and measure product usage patterns to inform how we change the next version.
Then there is what Fullstory does - part of the next wave of analytics, where you go beyond just the known things you would instrument and try to read between the lines to understand what's really happening in your software. You do that through a mix of technology, like session replay, where you can watch somebody use your product, and a really empathy-centered approach to understanding the experience and trying to make it better.
Where AI comes in is phase three, which we’re just seeing the first taste of. This next phase is the phase of adaptive software. Software that actually adapts to you, with hyper personalized experiences that are not static. It’s not one size fits all.
Chip:
You can already get a sense of how software is going to be more adaptive to your experience with generative AI. Even using ChatGPT creates a much more conversational experience, where you express your goals and it can interact with you in that way. The important thing is that the user experience is no longer static; it moves beyond the realm of knowing exactly what's going to happen, and things can change dynamically.
AI will start inferring patterns: how might we know that this customer is on the path to abandoning their cart? Could we introduce some dynamic discounting, the right kind of prompt, or the right path when we know they're stuck? How do we get them into the next mode?
When I talk about adaptive software, I think about experiences that can understand the best next action in the moment and meet the user in a very human and empathetic way - versus the traditional chatbot that we all love to hate, the one that pops up, knows nothing about you, and leaves you frustrated.
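To make that concrete, here is a minimal sketch of what next-best-action logic like this might look like. The signal names, thresholds, and intervention types below are hypothetical illustrations, not Fullstory features or Chip's implementation.

```python
# Hypothetical sketch of "next best action" logic driven by behavioral signals.
# Signal names, thresholds, and interventions are illustrative only.
from dataclasses import dataclass


@dataclass
class SessionSignals:
    minutes_idle_in_checkout: float  # time stalled on the payment step
    repeated_field_errors: int       # e.g. promo code or card form rejections
    pricing_page_revisits: int       # bouncing between cart and pricing pages
    abandonment_risk: float          # 0..1 score from a behavioral model


def next_best_action(signals: SessionSignals) -> str:
    """Choose an in-the-moment intervention instead of a one-size-fits-all popup."""
    if signals.abandonment_risk < 0.5:
        return "do_nothing"                    # don't interrupt a healthy session
    if signals.repeated_field_errors >= 2:
        return "offer_guided_checkout_help"    # the user is stuck, not undecided
    if signals.pricing_page_revisits >= 3:
        return "show_dynamic_discount_prompt"  # a price-sensitivity signal
    if signals.minutes_idle_in_checkout > 3:
        return "send_save_cart_reminder"       # hesitation or distraction
    return "surface_contextual_faq"


# Example: a hesitant, price-sensitive session
print(next_best_action(SessionSignals(4.0, 0, 3, 0.72)))  # show_dynamic_discount_prompt
```

The point is less the specific rules than the shape: behavioral signals in, a contextual intervention out, decided while the session is still happening.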
Chip:
We are already testing adaptive software today and it's pretty exciting stuff. It's really about a type of digital body language. In the physical world, if you were to walk into a retail store today and a store associate was there, they could watch you walk around, see you looking at stuff, read your body language, and know that you're headed to the sale rack or going to look at some expensive stuff. They could read between the lines and infer your intent. That is the future - where AI-powered experiences become contextually aware of your digital body language and are able to come alongside you and help you in that experience, the same way somebody can in the physical world.
The behavioral data collected through Fullstory can be used for machine learning and data science, where you can power things in real time. Behavioral data becomes much less about describing the past and more about predicting what’s happening next.
So, if somebody comes into an ecommerce site, it's no longer just about the clicks and the pages, but really about reading between the lines. Are they scrolling up and down? How can you infer heuristics from these much more nuanced behaviors?
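As a rough illustration of what "reading between the lines" can mean in practice, here is a small sketch that turns raw session events into behavioral heuristics a model could score. The event shape, function names, and thresholds are simplified assumptions for illustration, not Fullstory's data model or detection logic.

```python
# Hypothetical sketch: deriving "digital body language" features from raw events.
# The event format ({"type": ..., "t": seconds, "y": scroll position}) is assumed
# for illustration and is not Fullstory's actual schema.

def scroll_thrash_score(events, min_distance=200):
    """Count sharp up/down scroll reversals - a rough 'searching but not finding' signal."""
    scrolls = [e for e in events if e["type"] == "scroll"]
    reversals = 0
    for a, b, c in zip(scrolls, scrolls[1:], scrolls[2:]):
        first, second = b["y"] - a["y"], c["y"] - b["y"]
        if first * second < 0 and abs(first) > min_distance and abs(second) > min_distance:
            reversals += 1  # direction flipped after a meaningful scroll distance
    return reversals


def has_rage_clicks(events, window_s=1.0, threshold=3):
    """True if at least `threshold` clicks land within any `window_s`-second window."""
    clicks = sorted(e["t"] for e in events if e["type"] == "click")
    i = 0
    for j, t in enumerate(clicks):
        while t - clicks[i] > window_s:
            i += 1
        if j - i + 1 >= threshold:
            return True
    return False


def behavioral_features(events):
    """Bundle heuristics into features that a predictive model could consume."""
    return {
        "scroll_reversals": scroll_thrash_score(events),
        "rage_clicking": has_rage_clicks(events),
        "page_views": sum(1 for e in events if e["type"] == "navigate"),
    }
```

Signals like these are what let behavioral data shift from describing the past to predicting what a user is about to do.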
Chip:
Yeah, it's such a good question. I really think that AI is going to expand how we think about things. For example, take UX designers. Really great UX designers think about solving problems, not designing pixels. Where it gets hard is that there are a lot of really great problem solvers out there - design thinkers - who may not have the skills necessary to communicate their ideas visually.
I think AI is going to lift the ceiling there, allowing more contribution and more inclusiveness around who can add to this collective experience design, versus just the people who have the technical skills to get in and move things around in design tools like Figma. It is going to be an empowerment for creativity - making it much more expressive - but I don't think it's a replacement.
I just don't think you can replace human curiosity or ingenuity. With every new wave of technology, you always think, oh gosh, this is going to replace everything. There are a lot of unknowns here for sure - I don't want to sugarcoat that - and there is a lot left to figure out. I think this next year is going to be pretty wild, with a fast pace of change. We'll likely see a lot of really unexpected and exciting ways this technology is applied to our craft.
Every day, CXperts uses Fullstory to help companies build better experiences for their users. The prospect of AI bringing even more capabilities to help achieve human-centric user experiences is very exciting.
Chip's perspective on adaptive, AI-powered technology brings to light a key idea: behavioral data can not only help us understand the past, but will eventually become a very powerful tool for predicting and catering to a user's needs in real time. There is a whole new era of digital experiences just waiting for us to explore.
Head to the other articles in this series to read more about AI’s influence on testing and research possibilities in Part 1 with AB Tasty and Part 2 with WEVO.
If you would like to learn more, reach us at hello@cxperts.io with any questions!