Product pages exist to help customers learn about, consider, and purchase the right Dell product. We observed that customers felt Dell.com lacked relevant, clear product content, making it difficult to differentiate options. How might we make it easier for customers to learn how a product can meet their needs and provide them with clear next steps?
Dell offers thousands of PC configurations to millions of customers around the globe. The product pages are an integral part of a multi-billion-dollar e-commerce platform, and the vast majority of site traffic enters these pages via organic search.
The solution framed the product page's role in the larger customer journey, focusing on customer tasks across awareness, research, and consideration. We prioritized a methodical shopper persona seeking confidence at every point in this journey. Product comparison is the core of the content strategy, which consolidates the most essential content from six pages into one.
The top 20% of the page concisely communicates the core value of the product model. The images, headlines, and descriptions reflect what differentiates it from other model variants and give an effective sense of the product's size and aesthetics. Tech and marketing jargon is limited and defined when required (e.g., the term Thunderbolt anchor-links to a description of the technology on the page). A third-party review is highlighted as a valuable voice to speak for the product's quality.
The interaction design uses progressive disclosure at both a micro and macro level. Switch from the 13-inch model to the 15-inch with a button click, and all the relevant information and specs change to fit that model, including the accessories. The highest-priority specs are highlighted, while all details can be found when the customer is ready. Overall, the experience is scannable, essential, and accommodating: it supports both a quick glance to assess the product's value and a deep dive to find the right configuration and offer to purchase.
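The variant-switching behavior described above can be sketched as a simple data model: one page backed by per-variant content, with a small set of highlighted specs shown by default and the full list behind a disclosure control. This is an illustrative sketch only; the variant names, spec values, and function names are hypothetical and not Dell's implementation.

```python
# Hypothetical data model for model-variant switching with
# progressive disclosure. All product data below is invented.

VARIANTS = {
    "XPS 13": {
        "highlighted_specs": {"Display": "13.4-inch", "Weight": "2.7 lb"},
        "all_specs": {"Display": "13.4-inch", "Weight": "2.7 lb",
                      "Ports": "2x Thunderbolt 4"},
        "accessories": ["13-inch sleeve"],
    },
    "XPS 15": {
        "highlighted_specs": {"Display": "15.6-inch", "Weight": "4.2 lb"},
        "all_specs": {"Display": "15.6-inch", "Weight": "4.2 lb",
                      "Ports": "2x Thunderbolt 4, 1x USB-C"},
        "accessories": ["15-inch sleeve"],
    },
}

def page_content(variant: str, expanded: bool = False) -> dict:
    """Return the content shown for one variant.

    Switching `variant` swaps specs and accessories together (the
    button-click behavior); `expanded` models the macro-level
    disclosure that reveals the full spec list on demand.
    """
    v = VARIANTS[variant]
    return {
        "specs": v["all_specs"] if expanded else v["highlighted_specs"],
        "accessories": v["accessories"],
    }
```

The design point the sketch captures is that a variant switch replaces everything at once, so the customer never mixes a 13-inch spec with a 15-inch accessory.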
Feeling confident is about understanding context and expectations. This product details experience meets customers where they are and provides appropriate paths forward, even if forward means continuing research elsewhere and picking up here at a later point.
The model variant selection feature was quickly put into production on Dell.com. Users of the feature contributed to a 2% increase in revenue per visitor. Other aspects of the content strategy validated our efforts to make the content more relevant, contributing to increased customer satisfaction scores. The learnings generated additional investment in research to drive solutions for gaming customers, and the highlighted-specs feature continued to be iterated on to create a range of gaming features and enhancements that exceeded that program's annual operating plan by 200%.
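For readers unfamiliar with the metric, revenue per visitor (RPV) and the relative lift reported in an A/B readout can be illustrated with a toy computation. The dollar and visitor figures below are made up for illustration and are not Dell's data.

```python
# Toy illustration of a revenue-per-visitor (RPV) A/B comparison.
# All numbers are invented; only the arithmetic is the point.

def rpv(total_revenue: float, visitors: int) -> float:
    """Revenue per visitor for one test arm."""
    return total_revenue / visitors

def lift(control_rpv: float, treatment_rpv: float) -> float:
    """Relative lift of treatment over control (0.02 means +2%)."""
    return (treatment_rpv - control_rpv) / control_rpv

control = rpv(1_000_000.0, 50_000)    # $20.00 per visitor
treatment = rpv(1_020_000.0, 50_000)  # $20.40 per visitor
print(f"RPV lift: {lift(control, treatment):+.1%}")  # prints "RPV lift: +2.0%"
```

In practice a readout like this would also need a significance test before declaring a winner, which is part of why sequencing sub-hypotheses (discussed below) matters.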
With a three-month goal to launch an A/B test to customers on Dell.com, design thinking allowed us to quickly align stakeholders around the user. Stakeholder interviews allowed us to learn goals and expectations, and we reviewed and revised as a cross-functional team to frame this as our collective strategy. Rapid iterative testing was our method for taking ideas to customers quickly for feedback.
Data from the analytics, customer satisfaction, and user research teams framed the current-state experience. We learned the journey was primarily defined by three user intents: awareness, active research, and active consideration. Post-purchase needs also represent a notable aspect of the experience, as reflected in product return feedback.
Our persona was "Methodical." They are motivated by personal productivity; their tech needs include achieving personal goals, enhancing personal experiences, information from trusted sources, and access to critical technical details. They tend to shop in store but purchase online.
The site analytics and customer satisfaction teams helped provide insight into the intended purpose of customer visits to Dell.com. We leveraged this data to create scenarios for the awareness, research, and consideration phases. The awareness phase is defined by a customer arriving at a product detail page via a Google search, interested in the product's look and feel. They are assessing style and quality, and the visit lasts less than two minutes on the site.
In contrast, the scenario for a customer in active research is driven by advertising; they arrive at Dell.com seeking to compare across a product category and model variations. The visit spans several pages without going deep on configurations or accessories, and without a purchase.
By the time a customer is actively considering a purchase, they've collected information on the product from a variety of offline and online sources. They enter Dell.com directly to finalize the configuration, attach accessories, and get a price. They dig deep into videos and design details, checking for offers to get as much value as possible.
Across this journey we synthesized tangible areas of friction.
With the strategy pasted to the walls and user scenarios in hand, we held a two-day design studio workshop. Merchandising, marketing, content, analytics, research, product managers, developers, and designers sketched ideas. The design team facilitated the group to arrive at a final storyboard and a prioritized list of feature ideas.
The design team immediately took the sketches and jumped into prototyping, focusing on the highest-risk assumptions. Design and research worked together to create testing scripts covering scenarios across customer intents. Four rounds of iteration were used to learn our way to the solution that would go into A/B testing.
Scannability or confidence? As we neared our final round of qualitative testing, we identified an assumption at odds with one of our design principles. Our hypothesis was "customers need to see the top specs for all available model variants by default in order to differentiate them." This conflicted with the assumption that that level of detail would not be easy to scan, a particular concern for customers in the awareness phase, which is brief but critical.
To move forward we added a final task to the usability test to evaluate participants' ability to scan, and finally gathered their preference between solutions. As a result of the learnings, the team moved forward with the recipe that contained a model comparison of top specs.
This project was an epic experiment when you consider the number of products, paths, locations, and devices implicated. To make it feasible, we scoped it to the US, XPS laptops, and desktop. The collective mindset tends to evaluate the whole as a success or failure.
If I were to run an A/B test of this size again, I would put more time into framing sub-hypotheses as a sequence of tests that build a body of learnings.
I directed the approach, facilitated cross-functional meetings, and contributed directly to the research and design execution. The design team consisted of two product designers and a content strategist at 50% capacity who shared interaction design, information architecture, prototyping, and UX writing responsibilities.
Our design team partnered with the online business, product group, marketing, customer satisfaction, video, user research, analytics, and testing teams to realize this opportunity and successful experiment.