Building an MVP to Help Underrepresented Communities Find Inclusive Workplaces

Intro

I joined Your Equal as a volunteer ahead of their MVP launch to help build an employer review flow, streamline the review process, and set up event tracking to monitor performance and surface improvement opportunities. I also participated in usability testing to validate proposed solutions.

#productdesign #analytics #growth #flowoptimisation #UI #notifications

About YourEqual

Your Equal is a non-profit that helps marginalized employees find safe, inclusive workplaces. Like Glassdoor, it lets them share firsthand company reviews to inform others. It also helps businesses improve workplace conditions.

Opportunity

Based on the discovery interviews, it was clear there was an unmet need among marginalized employees. We needed to build an employer evaluation tool that was seamless for employees to complete, and a reviews page that was easy for new applicants to scan.

Defining hypotheses

How might we encourage and support marginalized employees in sharing honest and valuable insights about their workplaces to help future applicants?

How might we provide job seekers with structured, trustworthy, and actionable insights about potential employers?

Building an MVP for initial traction

Based on our analysis, we created three tiers for employer evaluation, plus a step to complete the reviewer's profile. The review consisted of the following sections:

Company culture
- Diversity
- Welfare initiatives
- Inclusion tools
- Open communication
- Values

Compensation & Growth
- Salaries and salary bands
- Career development
- Career path documentation
- Compensation for extra shifts

Compensation & Growth
- Remote work
- Accessibility
- Facilities
- Benefits
- Mobility support

The goal for the review was to be flexible, allowing users to pause and resume easily with autosaved changes. It featured a glossary for clarity and a clean question structure. Most questions were multiple-choice, with a few open-ended input fields.
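The pause-and-resume behaviour can be sketched roughly as below. This is only an illustrative snippet, not Your Equal's actual implementation; the `createAutosaver` name is hypothetical, and the storage backend is injectable (in the browser it would be `window.localStorage`).

```javascript
// Illustrative autosave sketch (hypothetical names): persist the
// review draft on every change so users can pause and resume later.
// `storage` is anything with getItem/setItem, e.g. window.localStorage.
function createAutosaver(storage, key) {
  return {
    // Save the current draft. In production you would debounce this
    // so rapid keystrokes don't trigger a write per character.
    save(draft) {
      storage.setItem(key, JSON.stringify(draft));
    },
    // Restore a previously saved draft, or null if none exists.
    load() {
      const raw = storage.getItem(key);
      return raw ? JSON.parse(raw) : null;
    },
  };
}
```

In the browser, `createAutosaver(window.localStorage, 'review-draft')` would restore the draft whenever the user returns to the flow.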

Tracking the right events of user experience

Setting up events in Google Tag Manager

Before we launched the MVP to our first users (who were contacted through platforms focused on marginalized communities and inclusion), I took the lead in defining events to help us monitor the performance of our review form.

I set up these events using Google Tag Manager (GTM) and monitored them in Google Analytics. Google Analytics provides default metrics such as visits, time on site, and bounce rate out of the box; on top of those, I added custom events aligned with our business objectives: Sign Up, Profile Created, Start Review, Review Submitted, and Review Edited.
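Custom events like these are typically sent to GTM by pushing onto the `dataLayer` array, which a GTM trigger then picks up and forwards to Google Analytics. A minimal sketch (the `trackEvent` helper and event payloads are hypothetical; event names mirror the list above):

```javascript
// Minimal sketch of pushing custom events to Google Tag Manager.
// GTM reads from the global `dataLayer` array; a Custom Event trigger
// matching the `event` key forwards it to Google Analytics.
// (`globalThis` stands in for `window` so the snippet is portable.)
globalThis.dataLayer = globalThis.dataLayer || [];

function trackEvent(eventName, params = {}) {
  globalThis.dataLayer.push({ event: eventName, ...params });
}

// Example: fired when a user submits the review form.
trackEvent('review_submitted', { sections_completed: 3 });
```

In GTM itself, this pairs with a Custom Event trigger matching `review_submitted` and an analytics tag attached to that trigger.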

Analyzing usability issues through Hotjar

To identify usability issues in the app, I installed Hotjar to track user progress through the Review Flow and address any challenges.

To keep the data clean, I excluded internal traffic by filtering out all team members' IP addresses.
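Hotjar handles IP exclusion through its dashboard settings rather than in code, but conceptually the filter amounts to a blocklist check like the sketch below (the addresses are placeholders from a documentation IP range, not real team IPs):

```javascript
// Conceptual sketch of internal-traffic exclusion: sessions from
// team IP addresses are not recorded. Hotjar applies this via its
// dashboard settings; this snippet is only an illustration.
const INTERNAL_IPS = new Set([
  '203.0.113.7',  // placeholder addresses from the
  '203.0.113.42', // TEST-NET-3 documentation range
]);

function shouldRecordSession(visitorIp) {
  return !INTERNAL_IPS.has(visitorIp);
}
```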

Learnings after launching the MVP
to the first batch of users

We launched the first version of the MVP to allies and members of marginalized communities through word-of-mouth, a presentation at the Berlin Unicorns Hackathon, and by leveraging our networks. Hard data from Google Analytics, combined with Hotjar recordings and a round of usability testing, helped us identify functional usability issues and friction points. We synthesized the following points:

Unmet expectations: Users wanted to write a review without registering on the platform or providing an email.

Low traction: Website visitors read reviews but were unwilling to share reviews about their companies.

Usability issues: Users had difficulty navigating the lengthy review process, which consisted of multiple sections.
How we solved users' unmet expectations

Requiring email verification before review submission

Early data showed that many users dropped off at the first step, where they were asked to provide their email (create a profile) before submitting their first review.

We decided to move profile creation to the final step. Email verification then triggers a review summary, letting users check their feedback one last time before publishing.

Offering insights into the benefits of sharing an email

When users submitted and verified their email, they were informed that creating a profile would allow them to edit or remove their review in the future.

Enabling review edits was deemed important, as wording may change over time. By allowing users to update their reviews, we can ensure better accuracy for future applicants.

Additionally, we gave users the option to opt into the newsletter, helping us build a database to reconnect with them in the future.

How we increased the number of submitted reviews

Making reviews visible only after users submit their own

Our goal was to populate the Reviews page quickly. A platform with a variety of employer feedback would make us more attractive to potential investors and partners.

Since many users were visiting the site and reading reviews but not submitting their own, we decided to follow Glassdoor's example. We made viewing reviews conditional on users providing their own review first.
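The gating rule itself is simple; a hypothetical sketch of the check (the `canViewFullReviews` name and `user` shape are illustrative, not the actual codebase):

```javascript
// Hypothetical sketch of the "give-to-get" gating rule described
// above: full reviews are visible only to users who have submitted
// their own review; everyone else sees a limited preview.
function canViewFullReviews(user) {
  return Boolean(user && user.hasSubmittedReview);
}
```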

How we improved navigation in the review flow

Improved stepper navigation

Hotjar recordings and usability testing revealed issues with the stepper. Users had difficulty navigating between steps, especially as some sections consisted of multiple substeps.

We replaced the horizontal stepper at the top with a clickable vertical stepper on the left-hand side and added a progress indicator to each section that consisted of multiple sub-steps. Additionally, we added microcopy at each step to inform users that their input was saved.

Improved readability and accessibility

Another issue raised during usability testing was that users felt overwhelmed by the number of questions they had to answer and found the review hard to read as a whole.

I decided to limit free-text input fields to cases where further clarification was needed, added a character limit to those fields, and simplified the 'Glossary' boxes into expandables to reduce cognitive load.

Additionally, I explored multiple typefaces to improve readability for users with visual impairments.

Immediate results

After increasing initial traction and making the necessary changes, we started to see some improvements:

Review completion rate improved by 15% (Form submission ↗️ 15%)

Time to complete the review flow decreased by 47 seconds

Fewer rage click issues in Hotjar when navigating through the flow

The growth strategy of gating access to reviews behind submitting one's own has yet to go into production. Once live, it will be closely monitored and iterated on based on the results.

Next steps…

Implementation of NPS surveys
Hotjar offers an out-of-the-box NPS survey feature. Adding it as an additional source of feedback will help validate our assumptions and further improve our designs.

Further Readability Improvements
Readability remains an ongoing challenge. Other review platforms use a single-question-per-screen approach in their review flows. This initiative is worth testing, as it improves readability and provides more space for additional context when needed.

Different Channels for Outreach + Landing Pages
We shouldn’t rely solely on word of mouth to promote our service. Creating dedicated landing pages and targeted ads can help us increase traction, build our user base, and populate the platform with more reviews.

More Events to Track
I’m a strong advocate for using hard data to validate our assumptions. I believe it’s essential to combine both qualitative and quantitative data when making iterations that benefit users and ultimately drive business value.

© 2025 Filip Schwarz
