Operation i1 is a deep web scraping tool built on Linux to streamline and automate intelligence-gathering in support of anti-human trafficking efforts. It targets publicly accessible platforms — forums, classified ad sites, and other corners of the open and deep web where trafficking activity is known to surface — and extracts structured data that would otherwise require hours of manual review.
The goal isn't surveillance for its own sake. It's efficiency — getting useful, organized information into the hands of people who can act on it faster than any manual process allows.
Anti-trafficking investigators and advocates face a signal-to-noise problem. The web is vast, the platforms shift constantly, and the volume of content that needs to be reviewed to surface a single actionable lead is enormous. Manual searching is exhausting, inconsistent, and doesn't keep pace with the way these networks operate.
Existing tools are often expensive, inaccessible to independent advocates, or designed for law enforcement environments that most community-level organizations can't access. Operation i1 exists in that gap — lightweight, open, and purpose-built for the problem.
Operation i1 runs as a set of modular scripts executed from the Linux command line. Each stage of the pipeline is designed to be inspectable, adjustable, and re-runnable — so the tool improves over time without requiring a full rebuild.
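A minimal sketch of what that pipeline-of-plain-functions shape can look like — stage names (`fetch`, `extract`, `export`), the markup structure, and the choice of Python are illustrative assumptions here, not the project's actual code. The point is that each stage reads and writes simple data, so any one of them can be inspected, adjusted, and re-run in isolation:

```python
# Illustrative sketch of a modular scraping pipeline: three plain
# functions, composable from the command line, each independently
# re-runnable. All names and markup shapes are hypothetical.
import json
import re


def fetch(source_html: str) -> str:
    """Fetch stage. A real pipeline would pull a page over HTTP; this
    pass-through stands in for a pre-downloaded document so the
    example stays self-contained."""
    return source_html


def extract(html: str) -> list[dict]:
    """Parse stage: pull structured records out of raw markup.
    This toy version captures listing titles and posting dates
    using named groups."""
    pattern = re.compile(
        r'<li class="ad">\s*<h2>(?P<title>.*?)</h2>\s*'
        r'<time>(?P<posted>.*?)</time>',
        re.S,
    )
    return [m.groupdict() for m in pattern.finditer(html)]


def export(records: list[dict]) -> str:
    """Output stage: serialize to newline-delimited JSON, one record
    per line, ready for review or downstream filtering."""
    return "\n".join(json.dumps(r, sort_keys=True) for r in records)


if __name__ == "__main__":
    sample = """
    <ul>
      <li class="ad"><h2>Listing A</h2><time>2024-01-02</time></li>
      <li class="ad"><h2>Listing B</h2><time>2024-01-03</time></li>
    </ul>
    """
    # Stages compose left to right; swapping or re-running one
    # doesn't require touching the others.
    print(export(extract(fetch(sample))))
```

Because each stage is just a function over plain data, replacing the toy `fetch` with a real HTTP client, or pointing `extract` at a different site's markup, leaves the rest of the pipeline untouched.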
The stack is intentionally minimal — no unnecessary dependencies, no GUI overhead, nothing that creates friction between writing code and running it.
This is an independent project — no team, no external spec, no client brief. Every architectural decision, every line of code, and every iteration came from a single question: what does the person trying to use this actually need?