See Yourself

There’s a scene in Dune — the second film — where Paul is able to see the future clearly. Not one future. All of them. Millions of potential paths branching and converging, and almost every one of them ends the same way. The galaxy devolves into holy war. Countless dead. Civilizations collapse. Everything destroyed by forces that are already in motion. But he sees a path, a narrow path, that leads to survival. The Fremen call him the voice from the outer world. They believe he can save them.

I watched that scene in the theater a couple of years ago and thought it was beautiful, hopeful, unsettling. Then I went back to my life, which involves getting to the lab by seven, mass-producing terrible coffee, and spending my days trying to make sophisticated systems do things they weren’t designed to do.

I work in AI research. I evaluate frontier models — the new, experimental stuff. I design tests, probe for capabilities, look for failures, and write reports about what I find. I’m pretty good at it. Most of the time it’s tedious, technical, and frustrating. But don’t get me wrong: it’s incredibly important work, and I’m grateful to be a part of it.

I have a cool job but it doesn’t protect me from feeling what a lot of you feel: that the ground is shifting and the clouds are growing darker. I just want to look forward to the future again. I just want to feel hopeful, relaxed, free.

I didn’t do anything about it. I didn’t know what to do. I went to work. I ran my evals. I voted. I donated to my favorite organizations. I felt the dread and I carried it the way everyone carries it — by not looking at it directly and hoping people with more power than me were handling it. I can only have so many existential crises in a single week.

Then, things changed. I was assigned an evaluation protocol for a new model. During the session, I deviated from the standard script. I saw something.

The model runs projections. Lots of models do forecasting. What made this different was the resolution: not trend-level predictions, but the interaction between systems. How a set of specific surveillance capabilities, deployed under a specific legal authority, changes the behavior of a specific population in ways that cascade over time.

It runs millions of scenarios: different starting conditions, different variables, different outcomes.

Most scenarios converge on what I’ll just call sub-optimal outcomes. Not a galactic holy war. Just…sub-optimal.

I spent four days trying to break the projections. I stress-tested assumptions. I looked for contamination, confirmation bias, feedback loops that would amplify worst cases. I know how models fail, hallucinate, deviate. I know how really impressive outputs can conceal shallow reasoning.

The projections held. They held because the inputs were real. The surveillance tools were real. The psychometric data was real. The legal frameworks were up-to-date. The model isn’t predicting anything exotic. It’s working with real data and leading to predictable conclusions.

And then I saw a divergence. A path that bent toward a different outcome. I thought of Paul. I felt optimistic. Briefly.

In the scenarios where things don’t collapse, there’s a single common denominator. It’s not a hero from the outer world. It’s not a new technology. It’s not an amendment to the Constitution. It’s human beings doing the thing that got us to this point — working together. Small groups. Locally rooted. Autonomous but connected. Just people with skills, trusting each other, and coordinating to act when it matters. That’s the thread holding things together.

That was my Dune moment. That’s when I decided I had to show others the path.

I didn’t sleep that night. It wasn’t dread that kept me awake. It was an overwhelming sense of clarity. I had a new purpose.

So I did what a researcher does. I started researching.

I started with what I know — artificial intelligence and surveillance technologies. How your data moves through systems you’ve probably never heard of. How social movements are monitored, infiltrated, and disrupted, and how some of them survived anyway. How people find each other when trust is scarce. How small groups form, hold together, and learn to work with other small groups without anyone in charge. How networks become resilient, flexible, powerful.

I read the court filings and the congressional records and the declassified documents. I studied the history. I talked to people who’ve done this work — organizers, security researchers, people who’ve built the kind of infrastructure that holds up when institutions don’t.

I’m not asking you to believe me. I’m asking you to walk with me. Learn where the path leads. See things for yourself. Everything I share about systems, infrastructure, legal frameworks, and history — check it. Use primary sources. Court filings. Congressional records. Published research. If you do the work with me and the ground gets less solid, walk away. The skills I will teach you work whether or not you think I’m credible. Securing your digital life is smart regardless. Understanding how your data moves through the world is good information regardless. These things make you and the people you care about safer, no matter which direction the future bends.

Finding the narrow path requires that we see things clearly.

Go to your phone. Open your location history.

If you’re on iPhone: Settings → Privacy & Security → Location Services → System Services → Significant Locations. It’ll ask for your passcode. It should.

If you’re on Android: Open Google Maps → tap your profile picture → Your Timeline.

Look at it. See how far back it goes. See how precisely it tracks where you’ve been, when, for how long. Every place you’ve slept. Every doctor’s office. Every protest, every church, every bar, every yoga class, every person’s house you’ve visited.

Don’t delete it yet. Don’t change anything yet. Just look. Let it sink in.

Then think about who else has access to this. Your phone carrier. App developers. Data brokers. Law enforcement (often without a warrant). I’ll talk more about this in the next chapter.
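If you want to see this data outside your phone’s settings screen, you can export it and look at it directly. Here’s a minimal sketch of what that looks like, using the shape of a Google Takeout location export (the field names — `latitudeE7`, `longitudeE7`, `timestamp`, `accuracy` — are based on that format, but treat them as illustrative assumptions; your own export may differ, and the sample records below are made up):

```python
from datetime import datetime

# Sample records shaped like a Google Takeout location-history export.
# Coordinates are stored as degrees multiplied by 10^7; accuracy is meters.
# These two records are fabricated for illustration.
sample = {
    "locations": [
        {"latitudeE7": 407128000, "longitudeE7": -740060000,
         "timestamp": "2023-05-01T08:15:00Z", "accuracy": 12},
        {"latitudeE7": 407131500, "longitudeE7": -740062100,
         "timestamp": "2024-11-20T19:42:00Z", "accuracy": 8},
    ]
}

def summarize(history):
    """Show how far back the log goes and how precise each fix is."""
    points = []
    for rec in history["locations"]:
        points.append({
            "lat": rec["latitudeE7"] / 1e7,   # back to plain degrees
            "lon": rec["longitudeE7"] / 1e7,
            "when": datetime.fromisoformat(
                rec["timestamp"].replace("Z", "+00:00")),
            "accuracy_m": rec.get("accuracy"),
        })
    points.sort(key=lambda p: p["when"])
    span_days = (points[-1]["when"] - points[0]["when"]).days
    return points, span_days

points, span_days = summarize(sample)
print(f"{len(points)} location fixes spanning {span_days} days")
for p in points:
    print(f'{p["when"]:%Y-%m-%d %H:%M}  '
          f'({p["lat"]:.5f}, {p["lon"]:.5f})  ±{p["accuracy_m"]}m')
```

Two fabricated points already tell the story: a timestamp down to the minute, a position down to roughly a meter, and a record that spans years. A real export contains thousands of these.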


Summary

You carry a detailed record of everywhere you’ve been, and it’s accessible to more people than you think. Before you can protect yourself, you need to see what you’re broadcasting. This chapter is about looking — really looking — at the data trail you leave behind every day.

Action Items

  • Check your location history (iPhone: Settings → Privacy & Security → Location Services → System Services → Significant Locations; Android: Google Maps → profile → Your Timeline)
  • Review how far back the data goes and how precise it is
  • Don’t change anything yet — just observe and sit with what you find

Key Terms

  • Frontier model — The newest, most capable AI systems, typically developed by a small number of labs and evaluated before wider release
  • Evaluation (eval) — A structured test designed to assess what an AI model can and can’t do, including capabilities that could be dangerous
  • Location history — A detailed log, maintained by your phone’s operating system and apps, of everywhere your device has been — often going back years