Areas of Focus
New data-driven technologies, especially artificial intelligence (AI) and machine learning, are transforming societies on a global scale.
They’re also concentrating power in the hands of undemocratic actors and demanding we expand our very definition of human rights.
Yet civil society — the people, movements, and organizations dedicated to preserving democracy and making the future accessible and safe for us all — has virtually no grasp of how these technologies work and little access to the talent that does.
We partner closely with human rights defenders to source challenges from the field that are ripe for innovation.
We provide opportunities for designers, engineers, and data scientists to apply their skills to protect democracy and human rights.
We bring together field practitioners and technologists to design, prototype, and release new tech-enabled tools, services, and experiences.
We support field practitioners and partners to apply our solutions to generate tangible impact in the field.
We have full-time and volunteer opportunities for mission-minded designers, engineers, and data scientists.
Technology developed at the Lab is being used and tested by investigators and advocates all over the world. The Lab’s tools take on white nationalist extremism in the United States, thwart harmful misinformation on YouTube, uphold the integrity of elections, protect human rights activists from persecution, and address other urgent needs. Some recent projects:
While anyone with a phone has the means to record human rights abuses, this proliferation of data can easily overwhelm those who want to use this visual evidence to hold bad actors accountable.
SurvAI uses motion and object detection technology to automatically detect and categorize violence in video, dramatically enhancing investigative capability in cases of unlawful violence.
The tool, still in pre-release, is being used in investigations into the January 6 insurrection and state violence against Black Lives Matter protestors at Lafayette Square, and is being tested by international human rights investigators and journalists.
Although it is the second most visited site in the world, YouTube remains among the most problematic platforms for the spread of misinformation used to undermine democratic societies.
Journalists and researchers use Raditube to track and analyze narratives and the users that spread them on YouTube, allowing for unprecedented early detection of the spread of dangerous content.
Natural language processing technology allows Raditube to analyze extremist content and map the ecosystem of channels, users, and narratives used to spread it.
The core technology behind Pyrra was incubated in the Innovation Lab, before becoming an independent venture-backed social enterprise in 2021.
Pyrra is a threat intelligence platform that collects data from dark social media, enabling investigations into threats and disinformation on these platforms that target companies and governments.
Pyrra’s state-of-the-art natural language processing engine detects content including violent threats, hateful language, and harmful disinformation, then automatically alerts users and generates reports when they are targeted.
Welton Chang, CEO of Pyrra, an AI product incubated by the Innovation Lab, spoke with NPR about disinformation narratives that spread online immediately after questioning by Sen. Josh Hawley, R-Mo., distorting her sentencing record in cases related to child pornography.
The Innovation Lab’s Brian Dooley spoke with Sky News about the dangers of the Taliban gaining access to US-built biometric systems in Afghanistan.
The Innovation Lab’s Kris Goldsmith spoke with the Washington Post about how far right extremists took advantage of a rule change on Twitter to further avoid public and law enforcement scrutiny online.
To Counter Domestic Extremism, Human Rights First Launches Pyrra
Social Media and Extremism
Human Rights First Calls on Twitter to End Policy Weaponized by Extremists
Technologists and Human Rights First
Counter Human Rights Disinformation
Amicus Brief – Wolf v. Innovation Law Lab
Human Rights First is a nonpartisan 501(c)(3) organization. We do not favor or oppose any candidate for public office.