Security Is a Conversation Now

Your threat model was yours. Now it includes another person.

That changes things. When your security was individual, the decisions were yours alone — which browser, which password manager, whether to opt out of data brokers. You could make choices at your own speed, at your own risk tolerance, on your own schedule. That’s over. From here, the things you protect include someone else’s identity, someone else’s concerns, someone else’s willingness to be part of this. Your security decisions are now negotiations.


I want to name a paradox before we go further, because you’re probably already living it.

You and your partner found each other through compromised channels. Maybe it was a text message. Maybe a conversation at a coffee shop. Maybe Facebook, Discord, or a group chat on a platform that logs everything. That’s fine. I’m not going to pretend you should have established perfect operational security before your first conversation — you didn’t have the infrastructure, and you didn’t have the trust. You can’t coordinate a move to secure channels on the insecure channel you’re trying to leave. That’s the bootstrapping paradox, and every group in history has faced some version of it.

The EFF’s harm reduction philosophy — the same one that guides their Surveillance Self-Defense project — applies here: no one locks everything down in one day. You don’t go from texting on iMessage to running Tails on a burner laptop overnight. You establish a baseline, you commit to it together, and you build from there. The floor rises over time. But it has to start somewhere, and it starts with an honest conversation about where each of you actually is.


Here’s what’s different about shared security. When you were alone, a mistake cost you. Now a mistake can cost your partner. The economist Jack Hirshleifer described this as the weakest-link model in a 1983 paper on public goods: in systems where the outcome depends on the least contribution, the whole system is only as strong as its most vulnerable point. Security researchers have since applied this widely — airport security, network firewalls, epidemic response — and it maps precisely to group security. Your pair’s security isn’t the average of your individual practices. It’s defined by whichever one of you is less secure.

It’s a coordination problem. And the solution to a coordination problem is always the same: you talk about it.
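The weakest-link claim can be made concrete with a toy calculation. This is an illustrative sketch, not anything from Hirshleifer’s paper: the scores, the 0–10 scale, and the `pair_security` function are all hypothetical, invented here just to show why the minimum, not the average, is the right summary of a pair’s posture.

```python
# Weakest-link model (after Hirshleifer, 1983), sketched with made-up
# numbers: a pair's effective security is the MINIMUM of the partners'
# levels, not the average. Scores are illustrative, on a 0-10 scale.

def pair_security(levels):
    """Effective security of a pair or group under the weakest-link model."""
    return min(levels)

you, partner = 9, 4   # e.g. you use Signal; your partner still texts in the clear

print(pair_security([you, partner]))   # prints 4: the pair is as exposed as its weaker member
print(sum([you, partner]) / 2)         # prints 6.5: the average flatters the pair
```

The gap between the two numbers is the point of the shared floor: raising the weaker partner’s practices moves the minimum, while polishing the stronger partner’s setup moves only the average.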

In The Matrix, Neo and Trinity survive because their trust isn’t abstract — it’s operational. They don’t just believe in each other. They know each other’s capabilities, cover each other’s vulnerabilities, and communicate through channels they’ve verified. When Trinity tells Neo to trust her, she’s not asking for faith. She’s asking him to rely on the security practices they’ve built together. Their relationship isn’t separate from their operational security — it is their operational security. The love story is also a shared threat model.

That’s closer to reality than most people realize. The strongest security posture two people can have isn’t two individuals with perfect practices. It’s two people who know exactly where the other is strong and where they’re exposed, and who have agreed — out loud, explicitly — on how to protect each other.

So here’s your challenge. Sit down with your partner — in person if you can, on a Signal call if you can’t — and build a shared threat model. You both did individual threat models in Level 1. Pull them out. Now do this together.

Share your threat models with each other. Not the whole document — the relevant parts. What are you each most concerned about? Where do your risks overlap? Where do they diverge? One of you might be worried about an ex with a tracking habit. The other might be worried about an employer who monitors social media. Both are real. Both affect the pair.

Identify your shared floor. This is the minimum set of practices you both commit to. Not aspirational — real. Things you will actually do, starting today.

A starting floor that makes sense for most pairs:

Communication happens on Signal. Not some of it — all of it. If you’re not both on Signal yet, that’s the first thing you fix. Disappearing messages on, set to one week for general conversation. This isn’t paranoia. It’s hygiene — the same reason you don’t leave your medical records on the kitchen table when guests come over.

Device security baseline: alphanumeric passcode (not four digits, not a pattern), operating system current, notification previews off on your lock screen. You covered all of this in Level 1. Now you verify that your partner has too.

Information boundaries: what about your partnership stays private? Who knows you’re doing this? What would you tell someone who asked? Having an answer to these questions before someone asks is the difference between a considered response and a panicked one.

Write the floor down. Both of you keep a copy. Not because you’ll forget — because writing it makes it real. It transforms “we should probably use Signal” into “we agreed to use Signal, and here’s the document that says so.”


I want to plant something here that I’ll come back to in a later chapter.

Security, at the individual level, is a set of practices. You do them or you don’t. But security at the pair level — and eventually at the group level — is something else. It’s an act of care. When you configure disappearing messages, you’re not protecting yourself. You’re protecting your partner. When you use a strong passcode, you’re making a decision about someone else’s exposure, not just your own. When you have the awkward conversation about “hey, your notification previews are showing our messages on your lock screen,” you’re doing something that’s uncomfortable because you care about the person on the other end of that conversation.

Security as care. Remember that phrase.


There’s a case I need to tell you about, because it illustrates what happens when a group doesn’t have this conversation.

In 2020, the FBI paid a convicted felon named Michael Windecker to infiltrate racial justice organizing in Denver during the protests following the killing of George Floyd. The Intercept’s Trevor Aaronson reported the story in 2023, drawing on internal FBI records and undercover recordings. Windecker drove a silver hearse to protests, carried guns, and worked his way into activists’ inner circles. He was paid at least $20,000 by the FBI over that summer. He accused real activists of being informants — a tactic directly out of the COINTELPRO playbook documented by the Church Committee in 1975. He tried to entrap activists in violent plots. He pushed demonstrations toward destruction.

What made the infiltration effective wasn’t just Windecker’s tactics — it was the absence of shared security practices in the groups he entered. No one had sat down together and agreed on a security floor. There was no conversation about what information stayed internal. No agreement on communication channels. No protocol for when someone new showed up with unusual resources and extreme enthusiasm — which is exactly what Windecker brought. Signal existed. Encrypted channels were available. But no one had done together what you’re about to do: have the conversation, write the floor, commit to it as a pair.

I don’t tell you this to make you paranoid about the people around you. I tell you because the fix is concrete and you’re about to do it. The groups that survive aren’t the ones with the best tools. They’re the ones who agree to use them.


After you’ve built your shared floor, test it. Send each other a Signal message with disappearing messages on. Verify each other’s Safety Numbers — in person, scanning the QR code, not just comparing numbers on a screen. Check that notification previews are off on both phones. These aren’t trust exercises. They’re calibration. You’re making sure the floor you agreed to is actually under your feet.

Then write in your field journal what this conversation was like. Not the technical details — the experience. Was it awkward? Was it easier than you expected? What did you learn about your partner’s risk tolerance that surprised you? This is the beginning of something that shows up in every successful group formation I’ve studied: the ongoing conversation about how you protect each other. It’s not a one-time setup. It’s a practice.

The next chapter expands the frame. You and your partner have trust and shared security. The question now is: who else?


Summary

Individual security becomes shared security when a second person is involved. The weakest-link principle means your pair’s security is defined by whichever partner is less secure — not by the average. The fix is a shared security floor: minimum practices both partners commit to, written down, tested together. Security at the pair level is an act of care, not just a set of practices.

Action Items

  • Pull out your individual threat models from Level 1. Share the relevant parts with your partner.
  • Identify your shared floor — the minimum security practices you both commit to starting today. Suggested starting floor: all communication on Signal with disappearing messages (one week), alphanumeric passcode, current OS, notification previews off, information boundaries agreed.
  • Write the floor down. Both partners keep a copy.
  • Test the floor: send a Signal message with disappearing messages on, verify Safety Numbers in person (QR code scan), confirm notification previews off on both phones.
  • Record in your field journal: what the conversation was like, what surprised you about your partner’s risk tolerance, what was awkward, what was easier than expected.

Case Studies & Citations

  • Denver FBI infiltration (2020) — Michael Adam Windecker II, a convicted felon paid at least $20,000 by the FBI, infiltrated racial justice organizing in Denver during the George Floyd protests in the summer of 2020. Windecker used COINTELPRO-style tactics, including accusing real activists of being informants and attempting to entrap activists in violent plots. Reported by Trevor Aaronson, The Intercept (February 2023); documented in the “Alphabet Boys” podcast (iHeartPodcasts/Western Sound, 2023). The case illustrates what happens when groups lack shared security agreements — the absence was social, not technological.
  • Hirshleifer weakest-link model — Hirshleifer, J. (1983). “From weakest-link to best-shot: The voluntary provision of public goods.” Public Choice, 41(3), 371–386. Applied to security: the system’s protection level is determined by its least-secure component, not the average.
  • EFF harm reduction philosophy — The Electronic Frontier Foundation’s Surveillance Self-Defense project uses a harm reduction framework: incremental improvement over time rather than demanding perfect security immediately. Applied here to the bootstrapping paradox of establishing secure channels.
  • COINTELPRO and the Church Committee — The FBI’s domestic surveillance program (1956–1971) and its exposure by the Senate Select Committee to Study Governmental Operations with Respect to Intelligence Activities (the Church Committee, 1975). Tactics including snitch-jacketing (accusing real leaders of being informants) documented in the Church Committee’s final report.

Templates, Tools & Artifacts

  • Shared security floor template — (1) Communication: all conversations on Signal, disappearing messages set to one week. (2) Device baseline: alphanumeric passcode, current OS, notification previews off. (3) Information boundaries: who knows about the partnership, what stays private, agreed response to questions. (4) Verification: Safety Numbers confirmed in person via QR code scan.
  • Shared threat model worksheet — Compare individual threat models side by side. Identify: overlapping concerns, divergent risks, the least-secure practices in the pair, and specific commitments to raise the floor.

Key Terms

  • Bootstrapping paradox — The challenge of coordinating a move to secure channels while still on the insecure channel you’re trying to leave. Every group faces this. The solution is accepting imperfect starting conditions and establishing a floor that rises over time.
  • Security floor — The minimum set of security practices a pair or group commits to. Not aspirational — actual. Written down and verified. The floor rises over time but must start somewhere concrete.
  • Weakest-link (in security) — The principle that a group’s security is defined by its least-secure member, not the average. From Hirshleifer’s 1983 public goods model, widely applied to security contexts.
  • Security as care — The reframing of security practices from individual discipline to relational commitment. Configuring disappearing messages protects your partner, not just you. Introduced here, developed across Level 2.
  • Safety Numbers (Signal) — A verification feature in Signal that confirms you’re communicating with the intended person and not a man-in-the-middle. Best verified in person by scanning each other’s QR codes.