A quick overview

Priceline launched its first international travel product in 2018. After we translated our rental car experience into 5 languages, I led and executed our first overseas user research in Madrid, Spain to improve our international product.

skills: user research, usability testing, project planning, leadership

the problem

While Priceline has several sister companies that operate abroad, most of our revenue and our current digital experience come from our North American audience. The business’s first international venture was to translate the existing experience into local languages, but that investment hit a roadblock: translating an American experience into a new customer’s native language doesn’t guarantee it will resonate in a new market. The digital system has to match customer expectations and communicate important information in a coherent, digestible way. What information is important abroad? What norms did we not consider? Does the translation make sense to our new customers, or did we manage to create an (accidentally) isolating experience?

The only way to answer these questions was to go into one of our first and biggest foreign markets: Spain. And, boy, did we get some answers.

process & methodology

To get the right results, we needed a third-party moderator who knew the language well. While I’m technically declared proficient in Spanish by my university (*cough* 6 years out of practice *cough*), there was no way I would pick up on the slang and cultural expectations of the service in a savvy way. We needed an ally and agent on the ground to get us our answers, and I helped convince Priceline the investment was worthwhile. We got the fidelity we needed and were able to observe mannerisms, tics, and patterns in real time.

Since the translation of our current experience was well underway and already live in production, we planned usability testing on both our desktop and mobile experiences. We targeted customers who had rented a car in the past 6 months and recruited a general sample across typical travel demographics.

results

We came back to the States richer in insight than we expected, and much of that came from watching people struggle through the experience. A lot. (To my delight, it was a rough first day of feedback. That just meant we were discovering opportunities to improve, and discovering them quickly!)

For example, the team learned that first impressions are essential when it’s a user’s first time on your homepage (and they don’t know you or trust your brand). Watching our first Spanish user struggle to put their dates in the calendar was a note of interest; the second user was a red flag; and by the sixth user it was an important pattern. The UI didn’t meet their mental model in a couple of major ways, but the two most interesting were these: 1) the separate date inputs (indicating pick-up and return) didn’t function as expected, which made it difficult to select the correct dates or change an intended trip, and 2) European calendars start on a Monday (not Sunday), which led to the input of incorrect dates.

As the frustration with this initial experience mounted, we knew then and there that we had to invest more time in the calendar interactions and in building brand trust on the homepage. If users couldn’t get this new, foreign brand to work for them, why would they bother continuing with the experience?
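For the engineering-minded, the Monday-versus-Sunday finding is ultimately a localization setting rather than a redesign. The sketch below is illustrative only (not our production code, and the function name and fallback map are hypothetical): it shows one way a front end might resolve the first day of the week from the user’s locale instead of hard-coding the American default.

```typescript
// Illustrative sketch: resolve the first day of the week for a calendar
// widget from the user's locale. Names here are hypothetical, not from
// any real Priceline codebase.

type Weekday = 1 | 2 | 3 | 4 | 5 | 6 | 7; // 1 = Monday ... 7 = Sunday (ISO 8601)

// Fallback for runtimes without Intl.Locale week data: es-ES (and most of
// Europe) starts the week on Monday, while en-US starts on Sunday.
const FALLBACK_FIRST_DAY: Record<string, Weekday> = {
  'en-US': 7,
  'es-ES': 1,
};

function resolveFirstDayOfWeek(localeTag: string): Weekday {
  const locale = new Intl.Locale(localeTag);

  // Newer runtimes expose CLDR week data on Intl.Locale, either as a
  // `weekInfo` property or a `getWeekInfo()` method.
  const weekInfo =
    (locale as any).weekInfo ??
    (typeof (locale as any).getWeekInfo === 'function'
      ? (locale as any).getWeekInfo()
      : undefined);

  if (weekInfo?.firstDay) {
    return weekInfo.firstDay as Weekday;
  }
  return FALLBACK_FIRST_DAY[localeTag] ?? 1;
}

// A calendar grid would then rotate its weekday header accordingly:
// resolveFirstDayOfWeek('es-ES') -> 1 (Monday-first, as our Madrid users expected)
// resolveFirstDayOfWeek('en-US') -> 7 (Sunday-first)
```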

impact

Within a month, the team hit the ground running to improve the calendar experience mentioned above. To validate our investment in research (both the overseas trip and research within the company more broadly), we needed to show how one of our insights led to a lift in business. The calendar was exactly that: the improvements led to a considerable conversion lift and became a quick case study proving the power of research. This and other findings (which we also tracked against later conversion rates) were shared openly at a lunch-and-learn I led for the entire product organization.