I began my adventure at Despegar as a Lead Product Designer in the Exploration team. In this role, I led the team’s strategic direction, designed new features, and developed navigation improvements for the home and relevant landing pages in mobile apps and webpages that receive millions of daily visits.

My responsibilities: I conducted an end-to-end UX design process, including usability tests, in-depth interviews, quantitative analyses, and crafting high-fidelity prototypes. I also supported the leadership team by helping define the next strategic and tactical steps together with cross-functional teams such as product, tech, and business.




Lead Product Designer

Project Duration

06/2019 – 12/2019 (6 months)

The Challenge

The Exploration squad is responsible for user onboarding success, which includes:

  • Navigation Structure: top navigation menu, messages, notifications, footer, and logged-in/out area;
  • Homepage: search box, offer rows, banners and marketing elements, payment methods, user reviews, and institutional information;
  • Landings: pages with contextualized content, ranging from specific marketing actions to standalone product pages such as hotels, activities, and assistance;
  • Inspiration Bot: a conversational interface that suggests destinations and travel itineraries for users who haven’t yet decided where to go.

These features are well-known in the UX universe, but the story becomes complex when considering the following challenges:

  1. Millions of daily visits across 20 countries in the Americas;
  2. Cross-device experience: Desktop, Tablet, Mobile, iOS & Android Apps;
  3. Content and Ads Personalization—according to the search profile and history with Despegar. This requires taking into account a wide range of user situations, from those who haven’t conducted recent searches to those almost ready to buy or finalize their trips;
  4. Rebranding—a recent redesign that significantly impacted how users interacted with the product.

Considering all these variables while developing a UX plan is truly complex. To add a personal challenge: I’m Brazilian, and I accepted the offer to move to Argentina speaking only a rustic, basic Portuñol (a mix of Portuguese and Spanish).

After understanding the context and metrics, we listed the main challenges and opportunities ahead:

  • Lack of complete visibility on the quantitative and qualitative impacts of the rebranding that affected user behavior;
  • Various stakeholders with high expectations, and teams without a clear understanding of the value each one brought;
  • Need for a clear vision and definitions on where to start UX improvements and iterations;
  • Lack of a strategic plan to guide our actions;

Carrying out a strategic investigation project would take a lot of time and effort. On the other hand, we needed to deliver quick results with value to users and take advantage of the findings from the investigation plan in a practical way. Therefore, we chose a dual-track agile approach.

The dual-track agile approach

Dual-track agile is an agile methodology that contains two separate tracks: “Discovery” and “Delivery.”

The Discovery track focuses on generating insights and backlog items validated with evidence. The Delivery track works to turn those ideas into a real product. Dual-track agile provides a way to combine the goals of agile development and UX design. Both tracks operate in harmony and aim to create excellent products with strategy and vision.


We had to address several layers of internal and external issues, so we worked in three steps:

  • Capture internal perceptions and expectations about the Home: metric analysis, internal desk research, benchmarking, and stakeholder mapping;
  • Capture external perceptions and expectations about the Home: exploratory in-depth interviews with users from key countries;
  • Cross-referencing and feedback: understanding the common points and showing where users’ expectations diverged from internal perceptions.

To carry out the Discovery track, we nicknamed this part of the project Project Sherlock, precisely because it sought clues from multiple perspectives during the investigation.

Phase 1: Understanding internal perceptions

In this stage, we aimed to:

  • Understand the internal perception of the Homepage;
  • Define objectives based on the findings;

Metrics Analysis and Internal Desk Research

Metrics and access to previous studies were essential to understanding the context in which we found ourselves. These tools allowed us to understand relevant user segments, perform comparative assessments, and build upon the work that previous teams had already done.

Main learnings:

  • Globally, most users used the search box and ended their interaction before scrolling down;
  • The search box performed better before the rebranding;
  • The number of features grew along with Despegar, but the structure did not support new implementations;
  • We had other possible Jobs to Be Done, such as: rescheduling a flight or checking points in the Loyalty program;
  • There was a trend for exploratory searches.


Benchmark

The benchmarking showed who Despegar’s main competitors were in each country of interest. Using Google Trends and analyses with the intelligence team, we understood the main search trends.

There were two types of competitors:

  • Macro Competitors — have a general value proposition and positioning different from Despegar;
  • Specific Competitors — offer unique functionalities or services that differ from Despegar.

Learnings and opportunities:

  • Excessive number of elements in navigation;
  • High cognitive load in offer rows;
  • Increase the value of notifications and incentives;
  • Contextual use of the home screen;
  • Personalization of product suggestions;

Stakeholder Mapping

To achieve our goals, it was essential to build alliances that would enhance our work. For that, we needed to understand the teams that were interested, essential, important, and informed, so we could have a specific action plan for each of them.

We conducted Stakeholder Mapping in stages. This was crucial for improving internal communication and served as an essential tool for making decisions together. In the end, we were able to design a cross-team strategic plan that gave us visibility into the differing expectations for the Home.

Main learnings and outcomes:

  • Clarity of all stakeholders and teams involved;
  • Together, we defined the ideal Home concept for everyone;
  • We categorized teams as essential, important, or interested;
  • We understood the value each team contributed;

Phase 2: Understanding external perceptions

In this stage, we aimed to:

  • Capture users’ perceptions and expectations regarding the Home;
  • Define objectives based on the findings;

Exploratory Interviews

Once we understood the quantitative data, internal perception, and our competitiveness, it was time to understand why the phenomena we identified occurred.

We began rounds of in-depth exploratory interviews with the aim of building a solid product vision, capturing users’ perceptions of the functionalities available at that time.

We interviewed various users in key countries for Despegar’s strategy. Through recall-based investigation, we captured behavior patterns, generated ideas and hypotheses, and brought teams closer to the context and daily lives of the users.

Main discoveries and insights

1- They started their searches on a different product than what they were looking for

Users reported discomfort when starting navigation on a homepage different from what they were used to. These reports were evidence of the rebranding results and the perception of the navigation stage. We also had records of an increased bounce rate, which reinforced our hypotheses.

“I entered looking for a flight, and I only realized later that I was on packages.”

2- Access to PV, recent searches, and other tasks

We gathered evidence that users were looking to complete tasks other than travel purchases, for example: accessing search history or checking a purchased trip.

“On Despegar, they should show suggestions based on my searches.”

“It would be practical if I had the reservation number on Decolar’s Home.”

3- Navigation structure as a travel organizer

We asked users to draw what they remembered from Despegar. This allowed us to understand more deeply what they valued during their searches.

Users commented that they let themselves be guided by the navigation during their trip planning. They only realized there were other menu accesses after looking back at the page while being interviewed.

We had indications that this happened due to the benchmark, Miller’s law, and other studies, but this event was crucial to support changes in navigation.

“Wow! Looking closely, I can plan my trip from the menu.”

4- Banners directly affected the experience

Users reported not using marketing banners. We found evidence that, in the case of exploratory navigation, when viewing the banners, they would end their navigation and not reach the personalized offer lines further down. This was because they believed that what they would see below would also be offers and not personalized suggestions.

“The offers that were below the search box never caught my attention.”

This was reinforced when we analyzed recordings and captures on Hotjar, where we saw that most users did not scroll past the banners.

We conducted an A/B test that confirmed our hypothesis, learning that banners should have context and be related to what was below. We also had opportunities to implement features according to the travel stage.
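As a rough illustration of how a banner A/B test like this can be evaluated, the sketch below runs a two-proportion z-test on the share of users who scrolled past the banners in each variant. The traffic and scroll numbers are hypothetical, not Despegar’s actual data:

```python
from math import sqrt, erfc

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal CDF
    return z, p_value

# Hypothetical counts: users who scrolled past the banners in each variant
z, p = two_proportion_ztest(400, 10_000, 480, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below the chosen significance level (commonly 0.05) is what lets a team claim, as we did here, that the contextualized variant genuinely changed scrolling behavior rather than reflecting noise.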

5- Exploration, inspiration, and personalization

We discovered that users chose destinations based on the activities they could do and that the emotional connection during travel planning is much stronger with activities than with flights or hotels. Moreover, users wanted important information about the destination, such as safety, temperature, and distance between the airport and the city.

“Before buying the hotel, I checked the activities that could be done at the destination.”

To turn the patterns and findings into actionable items, we organized the insights by area of the websites/apps. Each time a specific quote pointed to the same actionable item, we colored that insight a darker shade. This conveyed more clearly the feelings captured during the sessions.


With the actionable items, we set clear objectives concerning both what users wanted and the opportunities Despegar had.

User goal: Better travel experience from a Home that anticipates needs and adapts to users’ contexts.

Business goal: Have a more competitive Home, with a scalable structure that adds value to users.

We should make suggestions that empower users; the final decision is theirs, but the recommendation is ours.

We also created design principles for the upcoming implementations.

We then defined the top three cross-device KPIs for the team:

  1. CTR
  2. Bounce Rate
  3. Conversion from the homepage
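As a minimal sketch of how these three KPIs derive from raw session counts (the event names and numbers are hypothetical illustrations, not Despegar’s analytics schema):

```python
def homepage_kpis(sessions, homepage_clicks, single_page_sessions, purchases):
    """Derive the three homepage KPIs from raw counts (hypothetical event names)."""
    return {
        "ctr": homepage_clicks / sessions,            # clicks on homepage elements
        "bounce_rate": single_page_sessions / sessions,
        "conversion": purchases / sessions,           # purchases started from the homepage
    }

# Hypothetical monthly counts
kpis = homepage_kpis(sessions=50_000, homepage_clicks=18_000,
                     single_page_sessions=21_000, purchases=1_250)
print(kpis)
```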

Cross-referencing and feedback:

After completing all analyses, it was time to share the learnings with other teams. To do this, we held feedback meetings with each team and facilitated prioritization dynamics for the actionable items. This was key to ensuring visibility and making the most of what was captured.

At this stage, we conducted dynamics, returned the data obtained, and drew conclusions together. This was crucial for disseminating knowledge, as we understood that holding meetings, giving presentations, or sending emails would not be enough.


Phase 1: Creation of the new strategic vision

We adjusted our strategy and vision for Home & Onboarding. What used to be a place to welcome and guide users during their searches became a space for support and guidance, aiming to address users’ needs intelligently and adaptively.

Now we had a list of improvements, which in turn were divided into the main stories of the semester.

Pain points:

  • High cognitive load during navigation and travel planning;
  • Low perceived value in suggested offers;
  • Lack of support during continuous search;
  • Inefficiency in exploratory searches;
  • Little perception of personalization in suggestions;
  • Breaks in cross-device experiences;

Always On: continuous design deliveries

Depending on each story’s complexity, we decided on the level of depth, ranging from A/B tests shipped directly to production to changes requiring usability tests, card sorting, questionnaires, or click testing.

All stories pointed to the strategic vision we defined. Four main stories were defined: improvements in navigation, last search functionality, search box implementations, and improvements in offer lines and marketing banners.

Story 1- Navigation and Search Box

We had the following points regarding navigation:

  1. Positioning — help users understand which part of the site they were on;
  2. Navigation — facilitate access according to the user’s context and flow;
  3. Incentive — encourage users to make decisions at the most appropriate time, such as: login, loyalty, and downloading the app;
  4. Contrast — it was necessary to emphasize the differences between navigation elements and search and information input elements;

Pain points:

  • Understanding the position
  • Overload of access points
  • Lack of relationship between access points and users’ mental models
  • No notable design difference between navigation areas and the search box

Problems and hypotheses:

User: “I don’t know which product I’m in and what I’m looking for” — We need to improve the affordance of the selected section;

User: “There are too many access points; it’s better to click only on what I already know” — reduce and group access points for hierarchical navigation;

Research: click tests, card sorting, and guerrilla tests.

Implementation results:

By giving more visibility to the sections, we were able to solve and validate the navigation and search problem in the chosen product quantitatively, with the following global, cross-device results:

  • Significant decrease in the number of clicks on the same product, for example: clicks on the flight icon while already on that page, mainly on mobile devices;
  • Increase in search accuracy;
  • On mobile, we reduced the transition to other screens without affecting conversion;
  • Significantly increased the use of search box elements;

In the end, we created a backlog of design proposals that included grouping access points on desktop and mobile, new user incentives such as invitations to download the apps, and access to the loyalty program.

Story 2- Last searches

We had evidence that users wanted to perform exploratory and comparative searches during travel planning. With this, we implemented the last search feature.

Pain point:

  • It was not possible to continue a search started earlier

Problem and hypothesis:

User: “I need to open a new tab or perform another search to continue my planning” — Provide the possibility to continue a search already performed contextually and personalized;

Last Searches on Home

We personalized the hierarchy of the module according to the user’s stage, focusing mainly on those who performed continuous searches, and a smaller hierarchy for users who had already purchased. Differentiations were created for each type of search, such as a product in the cart, search for a flight, or search for a package.
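The hierarchy rule described above can be sketched as a simple ranking function. The stage names, card fields, and ordering weights below are hypothetical illustrations of the idea, not the actual implementation:

```python
# Hypothetical weights for the rule described above: a product already in the
# cart ranks first, ongoing (continuous) searches next, and searches tied to an
# already-purchased trip receive the smallest hierarchy.
STAGE_WEIGHT = {"in_cart": 0, "continuous_search": 1, "purchased": 2}

def order_last_searches(cards):
    """Sort last-search cards by travel stage, most recent first within a stage."""
    return sorted(cards, key=lambda c: (STAGE_WEIGHT[c["stage"]], -c["timestamp"]))

cards = [
    {"id": "flight-EZE-GRU", "stage": "purchased", "timestamp": 300},
    {"id": "hotel-BUE", "stage": "continuous_search", "timestamp": 200},
    {"id": "package-CUN", "stage": "in_cart", "timestamp": 100},
]
# The in-cart package surfaces first, then the ongoing hotel search,
# then the already-purchased flight.
print([c["id"] for c in order_last_searches(cards)])
```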

Research: usability tests.

Implementation results:

Creating the functionality validated our hypothesis. We significantly increased:

  • Global cross-device conversion;
  • The number of users who moved on to the following stages of the flow;

In addition, we created the opportunity for new additions to the feature, such as access to after-sales services if the user is at a more advanced stage of their travel journey.

Story 3 – Offer lines and marketing banners

We had the opportunity to make personalized suggestions and increase the engagement of users who had not yet defined a destination for their trip.

Pain points:

  • Marketing banners directly affected the interpretation of the offer lines;
  • Product suggestions did not give a sense of personalization;

Problem and hypothesis:

User: “I feel like these offers are not for me” – We can change the design and strategy of the offer lines and thus provide personalized suggestions.

User: “If I’ve already bought my trip, why do offers keep appearing?” – We can change the order of what is shown according to the user’s travel stage and change the design and strategy to give personalized suggestions.

Research: usability tests and guerrilla tests.

Implementation results:

Our solution involved creating different versions with distinct personalized suggestions for users, from product offers to complement the travel experience to quick post-sale access when we identified users who had an active purchase.

The tests indicated that the new version achieved:

  • A significant increase in CTR compared to the marketing banners previously in the same position;
  • A significant increase in the use of the search box, which is positive as it leads users to flows with better conversion;
  • An increase in the quality of searches by offering suggestions with better contexts;

Future vision: Making the ideal Home tangible

Several other implementations were carried out during the second half of 2019. It was six months of learning and intense work, in which several of the tests we implemented produced unsatisfactory results, which is part of our role as interface designers.

More important than the failures is the learning they provide.

The work was carried out by an incredible team, completely open to suggestions, that welcomed this newly arrived Brazilian very warmly. I am proud to have been part of it all. We created our vision and strategy for the ideal cross-team Home, which changed the way we work and, above all, the goals we set out to achieve.