The following post is about a beta software release, which may (and hopefully will) change.

You know what they say about assuming... My faithful army of security-minded Twitter followers alerted me to a sudden change in the Ubuntu Linux distribution's 12.10 beta build that they found alarming: Amazon search had been integrated into the system search bar by default, so that, for example, searching for a musician's name to find your MP3s on your local hard drive would also suggest albums from the Amazon store. As everyone assumed, the purpose of this surprise feature is to help Ubuntu raise funds through a partnership program, which in and of itself is not a bad thing.

However, I am a security professional with a keen eye for privacy problems, and this immediately set off all sorts of alarm bells in my head. How is it implemented? Why is it in the default search? Have they done a thorough study of how the data is managed? Where can I read the privacy policy?

I assumed that, of course, Ubuntu would be submitting these requests over HTTPS to screen your queries from prying eyes. That is just the most basic possible threshold for decent software these days! Hence my concerns jumped to common security mistakes I have seen in other real-world software analyzed by Veracode: do they check that the encryption certificate is issued for the correct website, and by a proper certificate authority? Is Canonical (the company that produces Ubuntu) logging any of this, intentionally or inadvertently? Are they anonymizing and encrypting any logs they may have? Are they sanitizing data flowing both ways? And I still can't find that privacy policy!

It was with these deeper questions in mind that I tracked down the source code and cracked it open.

private const string OFFERS_BASE_URI = "";

I believe kids these days call this "My Face When": Melissa looks angrily at whatever Apple device is nearest. Or maybe: Owls can't believe you did that. (Dear Google: Thank you for knowing exactly what I mean when I search "oh no they didn't owl!")

If you missed it: that's HTTP, not HTTPS, meaning there is no encryption, meaning there is no network privacy whatsoever. Everything you type into your desktop search bar is, by default, blasted onto the Internet in plaintext. Gee, how could that ever go wrong? Who would ever use the default search bar to look for deeply personal or sensitive files stored on their own computer while on public Wi-Fi or at school or work? (Where was that story I was working on... "PrincessLeia_Meets_CaptainPicard.doc"... oops.)

This is what we'd call an information leak, because it's disastrously easy to accidentally shunt personal data onto the Internet in a context where the user is thinking "here on my private computer." It is also a user-hostile design, because it demands that the user consciously decide to avoid the default search and either uninstall the Amazon plugin or make a separate local-only search on a case-by-case basis. That assumes they even realize the problem exists, and even then they may, as I did, assume that the Amazon integration is implemented in a privacy-respecting manner.

There's a certain irony to this plaintext business: Canonical's founder Mark Shuttleworth also founded the respected HTTPS certificate issuer Thawte. Not that he personally wrote this Amazon plugin, of course, but he did defend it on his personal blog, mostly focusing on the question of whether it's adware. As an organization, Canonical should be enforcing strict HTTPS policies for any networked applications it is responsible for. This Amazon search plugin is still in beta, so there's time to turn this around. Speaking both as a security nut and as a potential future user: I want to see HTTPS-only.
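That fix is not exotic. Here's a minimal sketch (in Python, purely for illustration; the actual plugin is not written in Python, and the endpoint URL and function names here are hypothetical) of a client that sends the query over HTTPS with both certificate-chain and hostname validation enabled:

```python
# Hypothetical sketch: send a search query over HTTPS with full
# certificate validation. This is NOT the plugin's actual code.
import ssl
import urllib.parse
import urllib.request

SEARCH_BASE = "https://example.com/search"  # hypothetical endpoint

def search(query: str) -> bytes:
    # create_default_context() enables both CA-chain validation
    # (verify_mode=CERT_REQUIRED) and hostname matching
    # (check_hostname=True) -- the two checks the post calls out
    # as commonly botched in real-world software.
    ctx = ssl.create_default_context()
    url = SEARCH_BASE + "?" + urllib.parse.urlencode({"q": query})
    with urllib.request.urlopen(url, context=ctx) as resp:
        return resp.read()
```

The point is that the safe defaults already exist in the standard library of essentially every modern platform; a plugin author has to go out of their way to get less than this.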
I want to see a clear indication to the user that Amazon functionality has been enabled for the default search, and how to disable it or modify its settings. I want to see a clear-cut privacy policy for this plugin stating what is done with the data and what Canonical believes its responsibilities to the user are (if such documentation exists, I have not yet found it).

We are just starting to get the message across to developers that all personal data should be treated with respect by default, and users are just starting to fully appreciate how often that isn't the case. Amusingly, the flood of attention means the plugin is already racking up quite the bug count. For example, it does not currently do any NSFW filtering, and its results are also returned in plaintext. Have fun explaining that one to the ol' boss-a-roni!
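On the logging questions raised earlier: respecting the data on the server side is also cheap to do. A minimal sketch, assuming a hypothetical service that wants session-level correlation without storing raw client addresses or query text (nothing here reflects what Canonical actually does, since that is exactly what we don't know):

```python
# Hypothetical sketch of server-side log anonymization: replace the
# client IP with a keyed hash so a session can still be correlated
# without the raw address, and log query metadata instead of content.
import hashlib
import hmac
import json
import time

LOG_KEY = b"rotate-me-daily"  # hypothetical secret; rotating it limits linkability

def anonymize_entry(client_ip: str, query: str) -> str:
    ip_digest = hmac.new(LOG_KEY, client_ip.encode(), hashlib.sha256).hexdigest()
    return json.dumps({
        "ts": int(time.time()),
        "client": ip_digest[:16],  # truncated keyed hash, not the raw IP
        "q_len": len(query),       # metadata only, never the query itself
    })
```

Whether a real deployment should keep queries, keep hashes, or keep nothing at all is precisely the kind of decision a privacy policy is supposed to spell out.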

Melissa Elliott is an application security researcher who has been writing loud opinions from a quiet corner of the Veracode office for two years and counting. She enjoys yelling about computers on Twitter and can be bribed with white chocolate mocha.
