There’s a great scene in Shakespeare’s The Merchant of Venice in which Shylock, the reviled moneylender, is summoned to appear before the Duke of Venice, who is looking for a way that the bankrupt Antonio, the play’s titular merchant, can be absolved of his bond to Shylock - the now infamous “pound of flesh.”

It’s a riveting scene. Bassanio - Antonio’s friend - offers to pay Shylock many times the amount he lent, if only Shylock will release Antonio from the bond. The Duke and others join in, asking Shylock to show mercy. Nobody questions whether Shylock is within his rights to demand payment from Antonio. Rather, they want him to let Antonio make good in a manner less offensive to their morals. But Shylock - spurred by his hatred of Antonio - will have none of it. He stands fast, arguing that the bond is his to collect as he sees fit:

You have among you many a purchased slave
Which, like your asses and your dogs and mules,
You use in abject and in slavish parts
Because you bought them. Shall I say to you
’Let them be free, marry them to your heirs?
Why sweat they under burdens?. . .

You will answer
’The slaves are ours.’ So do I answer you.
The pound of flesh which I demand of him
Is dearly bought. ‘Tis mine, and I will have it.

That scene came to mind this week as I was reading the back and forth over the Maltese security firm ReVuln’s announcement that its researchers had uncovered a hoard of previously unknown vulnerabilities in commonly used industrial control products.

Holes in SCADA products aren’t big news on their own, of course. Hardly a week goes by without word of some new flaw in SCADA and ICS products. What made news was ReVuln’s claim that it would not share information about the holes with the vendors whose products were affected or with DHS’s ICS-CERT. Rather, ReVuln said it was keeping the details of the holes a secret and would provide information on them only to the organizations that purchased its services.

Not much is known about the holes discovered by ReVuln. The company released a video compilation that purports to show the holes being exploited. ReVuln named the affected vendors: Siemens, Schneider Electric, General Electric and others, which make some of the most commonly used ICS software in the world. But the video provides precious few clues as to what software products are affected. If you want to know, you gotta pay.

That seems like a violation of some deeply held principle. After all, we spent much of the last decade debating the relative merits of “full disclosure” of security holes and what Microsoft used to call “responsible disclosure.” Under the latter, security researchers who found holes were expected to turn them over, quietly, to the responsible software vendor and then give that vendor time to fix the hole and ready a patch before going public. Researchers who played nice got a “thank you” from Microsoft’s Security Response Center and a shout-out in the vulnerability note. Cool!

But “responsible disclosure” was always a fraught term. Microsoft had trouble convincing the most talented and spirited researchers to play along, and the mandarins in Redmond often found themselves in wars of words with free-spirited researchers who didn’t want to play by their rules. The company famously got into a dispute with Google researcher Tavis Ormandy, who decided not to wait for Microsoft to fix a critical hole in Windows Help and Support Center before he disclosed it. Ormandy argued that “responsible disclosure” wasn’t always in the best interests of users - especially if it was likely that those folks with the black hats on might find a zero day on their own. Microsoft cried foul, but it also ceded ground to Ormandy, changing its nomenclature from “responsible disclosure” to “coordinated disclosure” in 2010 in an effort to use less pejorative language.

Of course, by that late date, the debate over “full” versus “responsible disclosure” was an academic one. In the cyber underground and, increasingly, in above-ground markets, private vulnerability sales were (and are) the norm.

Today, ReVuln isn’t the only above-board firm that offers zero days for a price. The French firm Vupen made news in March after it demonstrated a remotely exploitable zero day hole in Google’s Chrome Web browser at the CanSecWest security conference, but pointedly refused to explain to Google how it worked. That, despite Google’s offer of $60,000 for a working exploit. “We wouldn’t share this with Google for even $1 million,” Vupen founder Chaouki Bekrar was quoted as saying, doing his best impersonation of Shakespeare’s money lender. And there are others. An investigation by Forbes’ Andy Greenberg uncovered evidence that firms like Northrop Grumman and Raytheon play in this market too, along with specialized firms like Endgame and Netragard.

Like Shakespeare’s disapproving Venetians, security researchers, privacy activists and academics take offense at this practice. The Electronic Frontier Foundation wrote that security researchers selling zero day exploits to those who want to take advantage of them are “responsible for making the Internet less secure for others” and have an “ethical responsibility” to help improve technology, not help subvert it.

Chris Soghoian, a principal technologist at the ACLU, has also warned about the implications of vulnerability sales by private corporations and freelance firms like Vupen and ReVuln. “The existence of the exploit business is problematic,” he said. “For the government to have this offensive capability, the rest of us have to be vulnerable.”

But Dale Peterson of Digital Bond, a longtime expert in the security of SCADA and industrial control systems, takes a much more nuanced, Shylock-ian view.

“Whoever finds the hole can decide what to do with it,” he said this week when I asked him about the ReVuln controversy. “We can spend time talking about whether it’s right or it’s wrong. But it’s been the case for decades and it’s not going to change.”

The real scandal, says Peterson, is the woeful state of many ICS applications, which were designed without “any concept of application security.” “Any company that’s looking for them will find a steady stream of SCADA vulnerabilities,” Peterson said.

In fact, his organization, which performs security assessments and architecture reviews on industrial control systems, often finds vulnerabilities that it doesn’t disclose, either. “We have not sold vulnerabilities and we don’t have plans to not sell them, but there’s a number that we keep to ourselves,” he said.

“Sometimes that decision is based on prior experience with the vendor,” he said. “We know the vendor and we know it won’t get fixed.” In other cases, Peterson said, they publicly disclose the vulnerability, or pass it along to ICS-CERT and the vendor. “We looked at it a year ago and couldn’t find any one policy that made sense,” he said.

In many cases, the only real “fix” for the problem is to go back to the drawing board and re-architect the entire product, Peterson said - not an answer any bottom-line-oriented software vendor wants to hear.

Secrets, greed, human fallibility? It’s all very Shakespearean. And, as is often the case with Shakespeare: there’s no guarantee of a happy ending when you get to the final act.

About Paul Roberts

Paul Roberts is an experienced technology writer and editor who has spent the last decade covering hacking, cyber threats, and information technology security, including senior positions as a writer, editor and industry analyst. His work has appeared on NPR’s Marketplace Tech Report and in The Boston Globe and Fortune Small Business, as well as on ZDNet, Computerworld, InfoWorld, eWeek, CIO and CSO. He was, yes, a guest on The Oprah Show - but that’s a long story. You can follow Paul on Twitter or visit his website, The Security Ledger.
