September 3, 2014

Abstinence Not Required: Protecting Yourself Until the Privacy Utopia Arrives

Nude photos of various celebrities were leaked to all corners of the Internet a few short days ago. You already know that by now.

As we wait impatiently for the rest of the gory technical details surrounding the compromise(s), many in the security echo chamber have been debating how we ended up here and whether the celebs themselves shoulder any of the accountability. While some people are tweeting about the world we want to live in, others are more interested in discussing the world we actually live in. Both are relevant and interesting, but too often the conversation devolves into people talking past one another, unaware they've started with two completely different premises. Briefly, here are the two main themes:

  • Ideal world: We deserve privacy. The companies and services we rely on should respect our privacy and build foolproof, intuitive user experiences (UX).
  • Real world: UX is hard. Many companies won't invest in security or privacy. Even privacy-respecting corporations can and will make mistakes. Users will make mistakes as well.

Let's break this down a little. Long term, as consumers, we should certainly demand that corporations do a better job. This is a challenging but worthy goal. Settings should be secure by default. UX should be intuitive. Security features should be put in place to mitigate simple attack vectors like brute-force attacks. Many tabloid-worthy hacks -- from Paris Hilton to Sarah Palin to Scarlett Johansson -- would have been prevented by two-factor authentication (2FA), provided those accounts had the feature enabled. (Apparently, that wouldn't have helped this time though; remember how I said companies make mistakes?)
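For the technically curious, mitigating brute-force attacks can be as simple as throttling failed login attempts per account. Here's a minimal sketch in Python -- the class name, thresholds, and exponential backoff policy are illustrative assumptions on my part, not any vendor's actual implementation:

```python
import time


class LoginThrottle:
    """Toy per-account throttle: after MAX_FAILURES consecutive failed
    attempts, lock the account for an exponentially growing cooldown.
    (Illustrative sketch only -- thresholds are arbitrary.)"""

    MAX_FAILURES = 5     # failures allowed before the first lockout
    BASE_LOCKOUT = 60    # initial lockout, in seconds

    def __init__(self):
        self.failures = {}      # account -> consecutive failure count
        self.locked_until = {}  # account -> unix time the lock expires

    def attempt_allowed(self, account, now=None):
        now = time.time() if now is None else now
        return now >= self.locked_until.get(account, 0)

    def record_failure(self, account, now=None):
        now = time.time() if now is None else now
        n = self.failures.get(account, 0) + 1
        self.failures[account] = n
        if n >= self.MAX_FAILURES:
            # Double the lockout with each failure past the limit.
            lockout = self.BASE_LOCKOUT * 2 ** (n - self.MAX_FAILURES)
            self.locked_until[account] = now + lockout

    def record_success(self, account):
        # A successful login clears the failure history.
        self.failures.pop(account, None)
        self.locked_until.pop(account, None)
```

Real services layer this with CAPTCHAs and IP-level limits, since naive per-account lockouts can themselves be abused to lock a victim out of their own account.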

Day-to-day, however, we shouldn't put the entire burden on others to protect us. Some accountability belongs to us. You wouldn't leave a pile of cash on the seat of your car, right? Let's be clear: nobody deserves to be victimized. But as a user, there are choices I can make in order to reduce my exposure. Here are a few.

  • I can disable cloud synchronization of photos and/or other data. Do I really need every photo I've ever taken to be accessible to me at all times from every device I own? No, really... do I?
  • I can disable broadcasting my location in every photo I take. Do I really gain enough value here to compensate for the risk?
  • I can pay attention to my device's default settings. Unfortunately, most devices default to TURN EVERYTHING ON and UPLOAD ALL THE THINGS, and many people are none the wiser. At least take a few minutes and maybe a couple Google searches to understand how your phone is configured.
  • I can treat my nude photos differently than I treat my other photos. Notice I didn't say stop taking them; this isn't a morality debate. I could do something as simple as using a standalone camera or an alternate device with cloud storage disabled. Yes, there is a convenience trade-off, but it's pretty minimal relative to the privacy benefit.
  • I can opt-in to two-factor authentication wherever possible. That way if I re-use the same password everywhere (as many people do), I'll have an extra layer of defense when other sites are inevitably breached. Hopefully Apple will remedy their inconsistent 2FA implementation soon. Meanwhile, here's a list of other websites and their stance on 2FA.
  • I can change my perspective. Without ever relinquishing the belief that I deserve privacy, I can remind myself at all times that systems are not perfect and breaches will happen. Think of it as online situational awareness. Hope for the best, but plan for the worst.
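For readers wondering what that "extra layer" from 2FA actually looks like under the hood: most authenticator apps generate time-based one-time passwords (TOTP, standardized in RFC 6238), a short code derived from a shared secret and the current clock, so a stolen password alone isn't enough to log in. A minimal sketch, assuming the common defaults of HMAC-SHA1, 30-second steps, and 6 digits:

```python
import hashlib
import hmac
import struct


def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238) using HMAC-SHA1."""
    counter = unix_time // step                       # which 30-second window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code changes every 30 seconds and the secret never leaves your device or the server, a password dumped in some other site's breach is useless on its own.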

The thing is, we already know we bear responsibility for our actions. That's why we give our teenagers practical advice rather than shielding them from reality and pretending they will always get the privacy they deserve.

Let's not pretend we are just helpless victims. Let's acknowledge the realities of the world we live in and the technology we have today while simultaneously working to improve things where we can.

Chris Eng, Chief Research Officer, is responsible for integrating security expertise into Veracode’s technology. In addition to helping define and prioritize the security feature set of the Veracode service, he consults frequently with customers to discuss and advance their application security initiatives. With over 15 years of experience in application security, Chris brings a wealth of practical expertise to Veracode.
