Our Mission
The public square is no longer public. It's a fragmented landscape of billions of unique algorithmic silos. Personalization has created information deserts and echo chambers that erode the shared understanding democracy depends on.
You cannot fix a broken information environment if you cannot see it.
Every day, algorithms decide what billions of people see. Not what they search for — what they're shown. These systems shape public opinion, political understanding, and health decisions at a scale no newspaper, TV network, or government ever could.
But nobody can see inside them. Researchers can't audit them. Journalists can't report on them. Policymakers can't regulate what they can't measure. And communities can't advocate for themselves when they don't know what information they're being denied.
Lurk provides the transparency layer. We simulate real audiences using synthetic personas — no real user data, no surveillance, no scraping individuals — and document exactly what the algorithm serves them. The result is structured, citable evidence of how information is distributed across demographics.
We do not scrape, track, or surveil real people. Lurk uses synthetic personas, AI-driven agents with defined demographic and psychographic traits, to interact with algorithms. All of our data comes from these simulated agents; we hold no personally identifiable information about real users.
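As a concrete illustration of this approach, here is a minimal sketch of what a synthetic persona and a single feed observation could look like as structured records. Every class and field name below is an assumption for illustration, not Lurk's actual schema.

```python
# Illustrative sketch only: these classes and fields are assumptions,
# not Lurk's real data model.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SyntheticPersona:
    """An AI-driven agent with declared traits; no real-user data involved."""
    persona_id: str
    demographics: dict[str, str]   # e.g. {"age_band": "18-24", "region": "US-South"}
    psychographics: list[str]      # e.g. ["climate-skeptical", "local-news reader"]

@dataclass
class FeedObservation:
    """One item a platform served to a persona, recorded as citable evidence."""
    persona_id: str
    platform: str
    observed_at: datetime
    item_url: str
    item_topic: str
    rank_in_feed: int              # position at which the algorithm surfaced the item
```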
We prioritize use cases that promote algorithmic accountability. We help organizations identify when life-saving information — climate solutions, public health, civic engagement — is being suppressed by algorithmic bias. We expose information deserts where communities are systematically excluded from the mainstream narrative.
Lurk is a tool for narrative defense, not psychological manipulation. We help legitimate institutions identify and pre-bunk harmful disinformation by understanding how and where it spreads. We strictly prohibit the use of our technology for coordinated inauthentic behavior, bot-farming, or the intentional polarization of communities.
We provide a clear methodology for how our agents observe platforms. We don't hack anything: we observe feeds from the perspective of the marginalized, the skeptical, and the underserved, so that the black-box logic shaping their digital reality can be documented and challenged.
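For a sense of how such observations could become the demographic-level evidence described above, the sketch below aggregates the illustrative records from the previous example by group and topic. As before, the function and field names are hypothetical stand-ins, not a real Lurk API.

```python
# Hypothetical aggregation step: turns per-persona observations into
# group-level exposure counts. Builds on the illustrative classes above.
from collections import Counter, defaultdict

def topic_exposure_by_group(observations, personas, trait="region"):
    """Count how often each demographic group was served each topic."""
    group_of = {p.persona_id: p.demographics.get(trait, "unknown")
                for p in personas}
    exposure = defaultdict(Counter)
    for obs in observations:
        exposure[group_of[obs.persona_id]][obs.item_topic] += 1
    return exposure  # e.g. {"US-South": Counter({"climate": 3, ...})}
```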
We never collect, store, or process personally identifiable information from real social media users. Every data point comes from synthetic personas we create and control.
We prohibit any use of Lurk to intentionally polarize communities, spread disinformation, or conduct coordinated inauthentic behavior campaigns.
Lurk is designed for organizations whose work benefits from understanding algorithmic feeds — and whose intent is to inform, not to manipulate.
For researchers: Structured, citable evidence of what algorithms serve to different demographics, with methodology that holds up to peer review.
For policymakers: Data to measure what regulation needs to address. You can't hold platforms accountable for what you can't document.
For mission-driven communicators: Understand whether your public health, climate, or civic message is actually reaching the communities that need it.
For advocates: Evidence that the communities you serve are experiencing information deserts or algorithmic bias: not anecdotes, but structured proof.
We work only with aligned organizations.
To request access, please use your organization email and we'll be in touch.