The Hardest Skill
The hardest skill I can teach you has nothing to do with technology.
It’s this: convincing someone you care about to take their security seriously when they think they have nothing to hide.
I’ve heard every version of the objection. “I’m not doing anything wrong.” “If they want to look at my boring life, let them.” “I don’t have time for this.” “You’re being paranoid.”
None of these are wrong, exactly. They’re just incomplete. The person saying them isn’t stupid and they’re not naive — they’re making a rational calculation based on incomplete information. They haven’t seen what you’ve seen over the last week. They haven’t checked their location history, or found their home address on a people-search site, or watched their browser fingerprint get read by a dozen trackers in real time. They’re assessing risk based on what they know. You now know more.
The mistake is leading with fear. Research on security behavior — particularly a 2018 longitudinal study by Mwagwabi, McGill, and Dixon on fear appeals and password compliance — found that persuasive communication improved security behavior in the short term, but the effects on compliance intentions didn’t last. More striking: neither perceived vulnerability to an attack nor perceived severity of an attack predicted sustained compliance. What predicted it was self-efficacy — the person’s belief that they could actually do something effective — and response efficacy — their belief that the recommended action would actually work. Fear gets people to act once. Believing they can act, and that acting matters, gets them to keep going.
This is why most security advice fails. It makes people feel helpless, so they stop listening.
What works is framing security as care. Not “the government is watching” — but “I want to protect our conversations.” Not “you’re exposed” — but “I found something that helps, want me to show you?” Not a lecture. A gift.
The behavioral research on adoption reinforces this. The MINDSPACE framework — developed in 2010 by Paul Dolan and colleagues for the UK Cabinet Office, and the foundation for the government’s Behavioural Insights Team — identifies nine factors that shape behavior. Three are especially relevant here. The messenger matters more than the message: people adopt practices from people they trust, not from experts or institutions. Social norms drive behavior more than logic: knowing that someone they respect already uses Signal is more persuasive than any technical argument. And defaults shape action: if Signal is where the group chat lives, people use Signal.
Here’s what works in practice.
Start with one person. Not a group — one person you’re close to. Someone who trusts you. Ask them to install Signal with you. Do it together — in person or on a call, not by text. Walk through the setup. Make it a shared activity, not an assignment.
Don’t explain the surveillance pipeline. Don’t mention metadata. Just say: “This is a better messaging app. The conversations are private, the group chats are cleaner, and it doesn’t show ads. Can we try it?”
If they push back, you have real stories now. Not abstract threats — specific cases from this book. A Catholic priest whose anonymized location data from a dating app was correlated with his church and his home until he was identifiable — and whose career was destroyed when The Pillar published the story. A cyclist in Gainesville who spent thousands of dollars on a lawyer because his location data placed him near a burglary he had nothing to do with. A woman identified through a chain that started with a reused username on a shopping site — one account linked to another linked to another until her real identity surfaced from behind layers she thought were separate. These are real people whose lives changed because the ordinary digital systems around them worked exactly as designed.
Use whichever story matches what the person cares about. Privacy? The Burrill case. Wrongful suspicion? McCoy. Identity exposure? Blumenthal. You’re not scaremongering. You’re translating what you’ve learned into something relevant to them.
Once Signal is installed, move one existing group chat there. A family chat. A friend group. A book club. The content of the chat doesn’t have to be sensitive — the point is normalizing a more secure default. Once people are using Signal for the mundane stuff, they’ll use it when it matters.
Then teach one thing. Just one. Not everything you know — one concept, one skill, one action from this book. Maybe it’s checking haveibeenpwned.com. Maybe it’s the password manager. Maybe it’s the SIFT method from Chapter 10. Pick the thing you think will land and share it the way you would share a useful tool you found, not the way you’d deliver a warning or a lecture.
That gives you three exercises for this chapter:
- Install Signal with one person who doesn’t have it yet. Do it together. In person or on a call.
- Move one existing group chat to Signal. Pick the one with the most momentum: the chat people actually respond in.
- Teach one concept from this book to someone who hasn’t read it. Record in your field journal: who you talked to, what you shared, what worked, and what resistance you encountered.
If you’ve done everything in these chapters, you’ve changed how you move through the world. You see the infrastructure. You’ve secured yourself against the most common vectors. You have a threat model, a field journal, and a maintenance schedule. You know how to evaluate information at the source. And you have at least one person you’ve brought along.
That’s Level 1. Seeing clearly.
In The Hunger Games, there’s a moment when Katniss raises three fingers in a silent salute to the cameras. She’s not giving a speech. She’s not issuing orders. She’s making a gesture that says: I see you. I’m with you. It costs her nothing but attention — and it becomes the most dangerous thing in Panem because other people start doing it too. Not because they were told to. Because they recognized something in it.
The three-finger salute isn’t a token. It’s not a password, or a badge, or a membership card. It can’t be counterfeited because it isn’t a thing — it’s a recognition. The districts don’t exchange credentials. They recognize shared experience. You know the salute because you lived through what made it necessary.
That’s how this works too.
I’m not going to give you a key, or a code, or a secret phrase to unlock what comes next. There’s no gate between Level 1 and Level 2. You can read the next chapter right now if you want. But the content of Level 2 assumes you’ve done the work — not because I’m testing you, but because the skills build on each other. If you haven’t built a threat model, the group security practices in Level 2 won’t make sense. If you haven’t had the conversation from this chapter — the one where you sit with another person and help them install Signal, or teach them one concept, or share one story that makes abstract risk feel personal — then Level 2 will be instructions for a game you’re not yet playing.
The threshold isn’t something I give you. It’s something you already have if you’ve done the work. Check:
- You have a field journal with your threat model, your security checklist, and your maintenance schedule.
- You have Signal installed and at least one conversation happening there.
- You’ve taught at least one person at least one thing from this book.
- You recorded what worked and what didn’t.
If those are true, you’re ready. You have the required skills for what comes next.
The narrow path isn’t walked alone. Every scenario I’ve studied where things hold together — where communities maintain trust, where institutions face accountability, where the machinery of surveillance breaks down — features well-secured people who found each other, built trust, and organized.
They built a network.
Summary
The shift from individual security to shared security is the most important step in this book. Fear-based messaging produces short-term compliance at best; self-efficacy and response efficacy are what sustain it, which is why framing security as care works better than framing it as threat. The bridge from individual practice to group practice is the key to accessing Level 2.
Action Items
- Install Signal with one person who doesn’t have it yet. Do it together, in person or on a call.
- Move one existing group chat to Signal — pick the one with the most activity.
- Teach one concept from this book to someone who hasn’t read it. Record in your field journal: who, what, how it landed, what resistance you encountered.
- Complete the self-assessment: field journal with threat model and security checklist, Signal installed with at least one active conversation, at least one teaching interaction documented.
Case Studies & Citations
- Jeffrey Burrill — Catholic priest identified through anonymized Grindr location data purchased from a commercial data broker. Reported by The Pillar (July 2021). Referenced as persuasion example for privacy-focused conversations.
- Zachary McCoy — Cyclist in Gainesville, FL, identified by Google geofence warrant as suspect in a nearby burglary. Spent thousands on legal defense before being cleared. Reported by NBC News (March 2020). Referenced as persuasion example for wrongful-suspicion conversations.
- Identity chain exposure (Blumenthal pattern) — Individual identified through a chain of reused usernames across platforms, linking pseudonymous accounts to real identity. Referenced as persuasion example for identity-exposure conversations.
- Fear appeals and security compliance — Mwagwabi, F., McGill, T., & Dixon, M. (2018). “Short-term and Long-term Effects of Fear Appeals in Improving Compliance with Password Guidelines.” Communications of the Association for Information Systems, 42(1). Found self-efficacy and response efficacy predict sustained compliance; perceived vulnerability and severity do not.
- MINDSPACE framework — Dolan, P., Hallsworth, M., Halpern, D., King, D., & Vlaev, I. (2010). “MINDSPACE: Influencing behaviour through public policy.” UK Cabinet Office / Institute for Government. Nine behavioral influences: Messenger, Incentives, Norms, Defaults, Salience, Priming, Affect, Commitments, Ego.
- Three-finger salute in real protest — The Hunger Games gesture was adopted by Thai pro-democracy protesters (2014, 2020–2021) as a symbol of resistance, demonstrating how fictional recognition signals cross into lived political practice.
Templates, Tools & Artifacts
- Persuasion script framework — Match the case study to the listener’s concern: privacy → Burrill, wrongful suspicion → McCoy, identity exposure → Blumenthal pattern. Lead with care (“I found something that helps”), not fear (“you’re being watched”).
- Signal migration checklist — (1) Install together, in person or on call. (2) Move one active group chat. (3) Make Signal the default for sensitive conversations. (4) The mundane conversations normalize the tool.
- Level 1 self-assessment — Four checks: field journal with threat model and security checklist; Signal installed with at least one active conversation; at least one teaching interaction completed and documented; maintenance schedule set with calendar reminders.
Key Terms
- Security fatigue — Covered in Chapter 11. The exhaustion that leads people to abandon security practices. Relevant here because fear-based persuasion triggers the same fatigue in others.
- Self-efficacy (in security adoption) — A person’s belief that they can effectively perform the recommended security behavior. Research shows this predicts sustained compliance more strongly than perceived threat severity.
- Response efficacy — The belief that the recommended action will actually work to reduce the threat. Together with self-efficacy, the strongest predictor of long-term security behavior change.
- MINDSPACE — A behavioral influence framework identifying nine factors that shape decision-making: Messenger, Incentives, Norms, Defaults, Salience, Priming, Affect, Commitments, and Ego. Developed by Dolan et al. (2010) for the UK Cabinet Office.
- Threshold (as used in this book) — Not a gate, badge, or token. A self-assessed readiness based on demonstrated competency. You cross the threshold by having done the work, not by receiving permission.