Dell believed they wanted a personalization (P13N) wizard, and gave it to the Pivot team - a co-located, agile, balanced team of Design, Product, and Developers. Follow as we walk through uncovering what shoppers really want through interviews and iterative design/testing.
Originally Dell wanted a Product Advisor, a wizard that would pose a few questions to customers, then suggest options that fit their answers.
In user testing, it wasn't working. It was hard to find the advisor itself (a link toggle that was secondary to the existing top nav filters), the options were vague, and the suggestions were often too plentiful.
Basically they had started with a problem statement based on assumptions (but no problems) and jumped into solutions.
Our Dell.com customers don't know which laptop they want to buy.
Let's design a product advisor to ask detailed questions about how the laptop will be used.
So the Pivot team was called in to redo this personalization wizard. This team consisted of a Designer, a Product Manager, and three Devs. This agile team was co-located to literally work closely together, quickly designing and developing based on user data as it came in.
The first thing we started with was a D&F (Discovery and Framing) kickoff with stakeholders to understand goals, success metrics, stakeholder mapping, and risks and mitigations, and to create a service blueprint.
We wanted to understand not only the stakeholders' goals (business, product, engagement, as well as anti-goals: what this project was not), but also their assumptions.
The assumptions show us what the business thinks the current landscape is: what they assume customers want, why they believe customers do what they do, what they think should be fixed, and what they think their system is capable of handling.
Then we can do a risk assessment. We think that shoppers want or do X, but how much do we really know about that? Is it backed by strong data (e.g., recent user testing), or did we just assume it (in which case we need to go interview users and test)? And how impactful is it if we get it wrong, so we can prioritize getting it right?
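That prioritization can be pictured as a simple score: the less evidence behind an assumption and the costlier being wrong, the sooner it should be tested. A minimal sketch, with made-up claims and weights purely for illustration:

```python
# Hypothetical sketch: rank assumptions by how little evidence backs them
# and how costly being wrong would be. All claims and numbers are illustrative.

def risk_score(evidence, impact):
    """evidence: 0.0 (pure assumption) .. 1.0 (recent user-testing data);
    impact: 1 (low) .. 5 (high cost if we're wrong)."""
    return (1 - evidence) * impact

assumptions = [
    ("Shoppers want a guided wizard",        0.1, 5),
    ("Shoppers can't pick a laptop unaided", 0.2, 4),
    ("Top-nav filters are discoverable",     0.6, 2),
]

# Highest score = least evidence plus biggest downside = interview/test first.
for claim, evidence, impact in sorted(
        assumptions, key=lambda a: -risk_score(a[1], a[2])):
    print(f"{risk_score(evidence, impact):.1f}  {claim}")
```

The exact weights matter less than the ordering: it turns a pile of stakeholder beliefs into a research queue.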
We now had an idea of what we thought we knew and what we didn't know, so we could interview users to confirm or discover.
But there could also be things we had not thought about (spoiler alert: we found some), as well as features we already had but didn't know how users really used.
The team worked on how to create this "experiment" in the lab and how to ask the questions without leading the user, and the whole team (Devs included) participated in hypothesis generation, writing the script, and taking notes.
We synthesized the observations, taking into account the type of user, their needs, and so on. Stacking observations, we started finding clues to what users really wanted, and found something interesting.
They didn't want a wizard, they just wanted a newer version of what they already had. They also wanted to see certain specs upfront and easy to compare.
So now, going back to the beginning, we had a validated problem statement. One that we could build towards.
Computer shoppers shop for something they are familiar with.
When computer shoppers are shopping, they want a newer version of their existing device.
Computer shoppers care about hard drive space, memory, weight, screen size, processor, ports, resolution.
Now the Pivot team would sketch concepts based on the synthesized data. We prioritized ideas using business needs, user pain points, and technical complexity. This framing of solutions led to sketches the team could vote on to wireframe and test.
As a customer, I want to find a newer version of what I have
Sketches needed to be validated with users frequently with small tests that could increase in scale and scope over time, adding features incrementally.
As I built wireframes from the sketches, the Devs could add real data behind them. The backend would take the raw information (the customer currently has computer X) and make suggestions based on criteria.
Testing with users, we would get comments like "It's a cool feature!" and "Wow, this is helpful!" They understood it, and we could see exactly what information they would enter (e.g., "Macbook pro"), so we knew to set up our backend to take answers like that and match them to Dell products.
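The matching step described above might look something like the following: map the shopper's free-text answer to a known spec profile, then rank catalog laptops by how close they are on the specs users said they cared about. This is a minimal sketch; every device name, spec, and catalog entry here is made up for illustration.

```python
# Hypothetical sketch of backend matching: free-text current device ->
# known spec profile -> catalog items ranked by spec similarity.
# All names and numbers are illustrative, not Dell's actual data or logic.

KNOWN_DEVICES = {
    "macbook pro": {"ram_gb": 16, "storage_gb": 512, "screen_in": 14.0},
}

CATALOG = [
    {"name": "XPS 14",      "ram_gb": 16, "storage_gb": 512, "screen_in": 14.5},
    {"name": "Inspiron 15", "ram_gb": 8,  "storage_gb": 256, "screen_in": 15.6},
    {"name": "Latitude 13", "ram_gb": 16, "storage_gb": 256, "screen_in": 13.3},
]

def suggest(answer, top_n=2):
    """Return the top_n catalog items closest to the shopper's device."""
    profile = KNOWN_DEVICES.get(answer.strip().lower())
    if profile is None:
        return []  # unrecognized answer: fall back to the regular filters
    def distance(item):
        # Normalized gap per spec; smaller means closer to the current device.
        return sum(abs(item[k] - v) / v for k, v in profile.items())
    return [item["name"] for item in sorted(CATALOG, key=distance)[:top_n]]

print(suggest("Macbook pro"))  # closest matches first
```

A real system would need fuzzy text matching and a much richer spec model, but the shape (profile lookup, then scoring against criteria) is the same.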
We had now tested the entry point and where to place it, what users wanted the result(s) to look like, how many to show, what they expected to see (e.g., spec info, logos, icons), side-by-side vs. cards, hide/show, and more.
The resulting page was a hit, bringing in more qualified shoppers who converted at a 17% higher rate. We launched and kept tweaking via A/B testing, with select servers collecting data on the tested versions.
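One common way to run that kind of split (not necessarily what Dell's servers did) is to bucket each visitor deterministically from a stable ID, so repeat visits land in the same variant and outcomes can be logged per arm. A sketch, with an illustrative experiment name:

```python
# Hypothetical sketch of A/B bucketing: hash a stable visitor ID with the
# experiment name so assignment is sticky across visits. The experiment
# name is made up; this is not Dell's actual test setup.
import hashlib

def variant(visitor_id, experiment="p13n-entry", arms=("A", "B")):
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# The same visitor always lands in the same arm:
print(variant("visitor-123") == variant("visitor-123"))  # True
```

Keying the hash on the experiment name means new tests reshuffle visitors independently of old ones.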