It takes intentionality.
At Searchlight, we evaluate candidates based on their soft skills rather than credentials — their working styles, strengths, and abilities — as measured by the people who know their work best. In doing so, we’re breaking the cycle of unconscious bias by actually measuring Quality of Hire and identifying proven signals for success.
This is how we operate as a team, and it’s also core to how we’ve built our product from the ground up.
Building a fair, ethical system
Let’s be real: there’s no way to guarantee bias won’t creep into the hiring process. We’re human, after all. But the good news is that unconscious bias can be mitigated through a system that’s accurate and fair, so it’s easy to recognize a good candidate when you see one. At Searchlight, we approach ethics in three parts:
Build an inclusive data pipeline.
The way we collect data is vetted end-to-end for inclusivity and objectivity, whether that’s about word choice, the way a question is asked, or a nuance in our user experience. Each touchpoint goes through a thorough examination by talent and DEI experts so our data is as objective as possible.
Connect recruiting data to real outcomes.
The true test of Quality of Hire comes when we measure our understanding of a candidate against outcomes data. Searchlight builds success profiles from pre-hire to promotion because truly unbiased artificial intelligence relies on outcomes data. Did the candidate we hired become a high performer? Did they stay and grow their career with our organization? Why or why not? Did we measure the right competencies for their role? What should we look for next time? Searchlight finds themes and patterns in the behavioral data so that we can proactively surface the right signals to look for in the interview process. This cuts through the noise created by traditional crutches, like education and prior employers, that bias our decision-making.
Build explainable, people-focused AI and continuously self-audit.
We assess ourselves regularly, through internal data reporting and external audits by organizational psychology experts, to reduce the adverse impact of bias at every hiring touchpoint. We slice our behavioral data along factors such as demographics, job family, and location. If we detect potential bias in our heuristics, we scale certain features, like “reference relevance,” until the distribution is comparable across subgroups. Above all, we value AI that is transparent and people-first, and we avoid black-box algorithms.
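To make the idea of a subgroup audit concrete, here is a minimal sketch of the standard “four-fifths” adverse-impact check: compute selection rates per subgroup and flag when the lowest rate falls below 80% of the highest. The data and function names are hypothetical; this illustrates the general auditing technique, not Searchlight’s actual methodology.

```python
from collections import defaultdict

def selection_rates(records):
    """Compute per-subgroup selection rates from (group, selected) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [num selected, total]
    for group, selected in records:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def adverse_impact_ratio(rates):
    """Ratio of the lowest to the highest subgroup selection rate.
    Values below 0.8 (the 'four-fifths rule') flag potential adverse impact."""
    return min(rates.values()) / max(rates.values())

# Hypothetical data: (subgroup label, whether the candidate advanced)
records = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", True),
]
rates = selection_rates(records)
print(rates)                        # {'A': 0.75, 'B': 0.5}
print(adverse_impact_ratio(rates))  # ~0.67, below 0.8 -> investigate
```

A real audit would run this kind of comparison across every hiring touchpoint and feature, then adjust or rescale features whose distributions diverge across subgroups.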
Searchlight’s system for mitigating bias helps teams hire with conviction across every role, while bringing transparency and data to the process. We believe the best teams are diverse teams, and that it’s easier to build an inclusive culture when you have a system that helps you recognize good talent when you see it.
We're proud to help underrepresented talent and career changers
I am so so grateful to have been introduced to Searchlight as a candidate. In the past, I’ve been blocked from jobs based on incorrect assumptions about my background. Going forward, I will come to every opportunity armed with the references that can best speak to who I am and how I work.