Research marketplace redesign
About the project
After Cint was acquired, we ended up with two overlapping respondent-marketplace platforms — both complex and held together by user workarounds. To cut costs, all customers were migrated into one. Their workflows broke overnight, the learning curve was steep, and frustration quickly turned into churn.
The goal of this project wasn’t to add features, but to fix the foundations: streamline the experience, improve efficiency, and create a modern, coherent UI to stop the churn.
About the product
Exchange is a B2B platform where market researchers define the audiences they need for their surveys. It connects them with hundreds of respondent suppliers and finds the right people for each study.
My role in it
I led the end-to-end design process — research, information architecture, wireframing, usability testing, and supporting implementation. I worked with one junior designer and collaborated closely with product and engineering.
Why it mattered
After the acquisition, churn became a real risk. Customers were already unhappy with the migration, and without a fast, meaningful improvement to the experience, we risked losing long-term accounts. Redesigning the core workflows was essential to stabilising the customer base and restoring trust.
How we did it
Research
We had a strong repository of insights from support tickets and suggestions, but we needed to understand workflows and the workarounds behind them. So we:
• Interviewed ~20 customers and mapped their workflows end-to-end
• Ran surveys with ~30 internal researchers to uncover daily friction
• Reviewed all tickets and suggestions, then created an affinity map to highlight priority areas
This resulted in personas, a user journey map, and a prioritised list of problem areas we needed to fix.
Process in a nutshell
It was an intensive six-month project that rethought the product from the ground up. Every page could easily be a case study on its own.
Based on the research, we redefined the information architecture and used it as the foundation for the redesign. From there, we identified what was required for migration. Lower-tier customers didn’t need the full feature set, so we focused on the essentials and tackled each core workflow one by one.
Below is the prototype we created throughout the project (before applying the visual layer, as we focused on usability first). Every workflow was tested with internal researchers and customers, then iterated based on their feedback. Once the product was ready for the initial release, we continued adding features for higher-tier customers and started developing the visual layer and design system.
Redesign
Core workflow 1/3: Managing projects
Key research insights
• Customers work in projects, each containing multiple surveys. As a workaround, they link surveys through naming conventions.
• To monitor survey performance, users check each survey individually. Many keep multiple tabs open all day as a workaround.
Solution
We restructured the product around projects, giving customers a clearer way to organise work, scan multiple studies at once, and monitor performance without drilling into every survey.
Core workflow 2/3: Creating target groups
Key research insights
• Creating surveys conceptually is a linear, repetitive flow, but in our app it was fragmented and required jumping between multiple screens.
At the time of recording, creating target groups in the old platform was disabled, so the video below shows the flow using an existing target group (TG). Users first filled out the TG form, then had to open the created TG and complete several tabs before they could launch it.
Solution
We extracted the fundamentals of the day-to-day flow and split them into four sections – basic settings, profiling, advanced setup and finalise. Progress could be saved as a draft at any time.
Core workflow 3/3: Monitoring target groups
Key research insights
• Target group troubleshooting mostly came down to one question: what happened to my respondents?
• Respondent analysis codes, essential for troubleshooting, were hard to interpret, so many customers kept their own reference notes.
• Individual respondent data wasn’t available in the app — users had to generate a spreadsheet report to see it.
• The Performance tab felt overwhelming (“so many things screaming at me”), making it hard to focus on what mattered.
The dashboard surfaced everything at once with no prioritisation. Key metrics were buried or missing, while less important ones dominated the screen and pulled attention away from what actually mattered.
Solution
We simplified the structure, surfaced the right data, and made troubleshooting easier. Key metrics moved into a clear top panel, respondent statuses became easier to understand, detailed respondent data was accessible without spreadsheets, and new performance graphs gave customers instant insight into survey progress.
Impact
The redesigned experience helped stop churn among the customers who adopted the new platform. Core workflows became clearer and faster, support requests dropped, and customers reported higher confidence in managing and monitoring their projects. After the initial release, we continued expanding the product to cover the key functionalities from the old platform, ensuring a smoother transition for higher-tier customers as well.





