The L0pht Legacy

20 Years Ago, Some Hackers Visited Congress…

May 19th, 1998. I was just wrapping up my first year of college. My grades were terrible. Instead of going to classes, I had been huddled in my tiny dorm room with the computer my parents bought me. I grew up with a computer in the house from an early age, but having my own computer was different. I was interested in every aspect of it, learning everything I could and enjoying hard-wired access to the then very young Internet.

L0pht Heavy Industries was way ahead of me. At the time, L0pht was a 6-year-old hacker think tank based out of Cambridge, MA. On the morning of May 19th, 1998, seven members of the L0pht were in the nation’s capital, donning suits and preparing to testify before a United States Senate Committee. They hadn’t flown to D.C. in first class or even economy. Instead, they rented a van, piled in, and drove down. L0pht wasn’t just ahead of me — they were ahead of an entire industry.

In fact, in some ways, they ‘seeded’ much of the cybersecurity industry. On a regular basis, I’ll be talking to someone or researching a story, and someone’s work history, a product or a company will somehow lead back to L0pht or @stake, its commercial successor. It’s still a relatively young and small industry: a workforce in the hundreds of thousands or low millions, and revenues in the tens of billions. That might not sound small, but look at it this way: Home Depot’s revenue is still bigger than our entire industry’s.

Reflecting on this historic anniversary, I realized that some of the issues L0pht raised in 1998 probably remained unresolved. I started to wonder: just how far ahead were L0pht’s ideas then? How many of the problems discussed at the Capitol 20 years ago are solved today? How many still exist? Are the L0pht’s recommendations still relevant today?

Video – Hackers Testifying at the US Senate, May 19, 1998 (L0pht Heavy Industries)


Catching Up with Weld and Mudge of L0pht

I had the opportunity to exchange emails with two of the L0pht members who spoke to the Senate committee that day. Peiter Zatko (Mudge) and Chris Wysopal (Weld Pond) haven’t strayed far from the issues they so passionately drew attention to 20 years ago. They’ve both built organizations designed to address these issues over the years. They’ve both given talks to increase awareness and even developed related standards and best practices for the industry.



Even over email, it’s clear the passion captured on camera all those years ago hasn’t faded for Zatko and Wysopal. The difference now is that both are able to point to progress that has been made, some by their own hands or with others. Wysopal co-authored an IETF draft on responsible vulnerability disclosure with Steve Christey. Zatko and his wife, Sarah, co-founded the Cyber Independent Testing Labs (Cyber-ITL) together.



L0pht turned into the consultancy @stake in 1999, which was then acquired by Symantec in 2004. Since then, Mudge has largely focused on the federal space, while Weld Pond co-founded Veracode in 2006 with other ex-@stake members to help companies find vulnerabilities in applications.


Peiter Zatko AKA Mudge

Chris Wysopal AKA Weld Pond

Is 20 years a long time for our industry?

“This is clearly going to be something that’s going to hit somebody big time one of these days…” — Senator Fred Thompson

Most of what was presented that day could be published today and would fit, unnoticed, into current cybersecurity conversations. Twenty years is certainly a long time for technology, but is it as long for security? Is it possible twenty-year-old security policy is still solid today?

The L0pht covered a lot of ground in 1998. Specific vulnerabilities in existing systems and software were shared. The group detailed more systemic risks and vulnerabilities in critical infrastructure including the Internet itself. Vulnerability notification and disclosure were hot, polarizing topics in 1998, and they are hot, polarizing topics today. Finally, the group shared issues with awareness and transparency in the industry.

To give some context, Windows NT was the primary business operating system in use at the time, and there was little to no awareness of how insecure it was. For example, the average person wasn’t aware of all the issues with LM hashes and the NTLM protocol that NT defaulted to. Microsoft claimed these hashes would take thousands of years to crack. The L0pht knew better, because they created L0phtCrack, in part, to disprove this claim. Penetration testers still find LM hashes in use today, though more rarely.
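The weakness L0phtCrack exploited is easy to see with a little arithmetic. The LM scheme uppercases the password and hashes each 7-character half independently, so an attacker never searches one long, case-sensitive key. A minimal sketch (the charset sizes are approximations, not exact figures from the article):

```python
# Why LM hashes fall so quickly: LM uppercases the password and hashes each
# 7-character half independently, so an attacker cracks two tiny halves
# instead of one 14-character, case-sensitive key.

CHARSET = 69          # roughly the printable characters left after uppercasing
FULL_CHARSET = 95     # printable ASCII with case preserved

def keyspace_naive(length: int) -> int:
    """Keyspace if a full 14-char, case-sensitive password had to be searched."""
    return FULL_CHARSET ** length

def keyspace_lm() -> int:
    """Effective LM keyspace: two independent 7-char, case-insensitive halves."""
    half = CHARSET ** 7
    return 2 * half   # both halves attacked separately; cost is ~2 * 69^7

naive = keyspace_naive(14)
lm = keyspace_lm()
print(f"naive 14-char search: {naive:.2e} candidates")
print(f"actual LM search:     {lm:.2e} candidates")
print(f"reduction factor:     {naive // lm:.2e}")
```

The reduction factor is around fourteen orders of magnitude, which is the difference between "thousands of years" and a weekend on 1998 hardware.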

Detailed Coverage of the Issues

#1: Secure software, liability and perverse incentives

“There’s not incentives for the industry to do more here, is there?” — Senator Fred Thompson

The problem with software quality and security was clear to the L0pht back then, and, if they can be credited with anything, it would be spreading awareness of the lack of incentives for vendors to produce secure products. Unfortunately, effective incentives for secure software still elude us.

An interesting observation is that individuals and organizations appear to view vendors as inherently good and trustworthy. The reality is that most software vendors are inherently profit-seeking, and security adds time and complexity, posing a potential threat to profitability and growth.

Zatko points out that the incentive structure is inherently perverse. Investors reward growth over all else, even profitability. Fast growth requires speed: time-to-market matters, which prioritizes speed over quality and security. In the eyes of most companies, worrying about privacy and security looks a lot like dragging a boat anchor.

Imagine where Uber would be if it waited for its services to be legal before operating in markets. There’s no way it could have grown as quickly as it has. It simply operates illegally until the popularity of the service forces legal changes. In the case of commercial software and products containing software, there aren’t even legal hurdles to worry about — vendors aren’t required to ship secure code and aren’t legally required to fix vulnerabilities.

Currently, [there are] no assurances from the software vendor relating to security and no transparency regarding what, if anything, they’ve done to make it ‘secure’. — Chris Wysopal in 1998

Wysopal notes today that, “It is, of course, cheaper for them [software vendors] and more expensive for customers who operate the software and then need to deal with patching and breaches.”

Have we made progress here?

After the most chilling effects of export controls on cryptography were mostly removed, the Wassenaar Arrangement threatened to take us a few steps back. Earlier this month, the US state of Georgia narrowly avoided passing legislation that would have made it illegal to perform security research within its borders. Still, Wysopal and Zatko point to some places where progress has been made.

Zatko points out that current bounty and black market rates (well into six figures USD) for working exploits on some systems and applications prove that we’ve made significant progress in some areas. Wysopal points out that his company, CA Veracode, offers a service in which it acts as a mediator between software vendors and their customers. Veracode performs diligence on the software and offers a certification path for vendors to get ‘verified’. While this isn’t a guarantee of secure software, it serves as proof that a third party has reviewed the code for security issues, a basic step that many feel should be required of software vendors to sort out at least the most basic and glaring problems.

What still needs to happen?

Zatko laments, “…only a few instantiations exist as exemplars to hold up. [For] each of these hardened systems, there have been hundreds of thousands of IoT devices… that have completely ignored these defensive capabilities.” He notes that even software produced by the security industry itself, touted as the latest and greatest defense, often hasn’t implemented the most basic hardening and secure software development practices.

Again, it seems we still need better incentives for software companies. Senator Fred Thompson predicted that the market might naturally correct itself to address this issue (it didn’t), or that existing provisions in common law could possibly be used to apply legal consequences (they haven’t). Wysopal points out that “…software companies don’t want to even agree to simple things, like a bill of materials listing the open source components and versions they use in a product and a guarantee that they won’t ship known vulnerable open source libraries.”
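The bill-of-materials check Wysopal describes is simple to mechanize. A minimal sketch, with entirely hypothetical component names and a made-up vulnerable-versions list standing in for a real vulnerability feed:

```python
# Sketch of a bill-of-materials audit: list the open source components a
# product ships, then flag any that appear on a known-vulnerable list.
# All component names and versions below are invented for illustration.

KNOWN_VULNERABLE = {
    # component: versions with published vulnerabilities (hypothetical data)
    "examplelib": {"1.0.2", "1.0.3"},
    "jsonparser": {"2.1.0"},
}

def audit_bom(bom: dict) -> list:
    """Return components in the bill of materials shipping known-vulnerable versions."""
    return [
        f"{name}=={version}"
        for name, version in bom.items()
        if version in KNOWN_VULNERABLE.get(name, set())
    ]

product_bom = {"examplelib": "1.0.2", "jsonparser": "2.2.0", "tlswrapper": "0.9"}
findings = audit_bom(product_bom)
if findings:
    print("Release blocked, vulnerable components:", findings)
```

A real pipeline would compare against a live vulnerability feed rather than a hard-coded dictionary, but the gating logic is this simple: if the product can't enumerate its components, it can't make the guarantee Wysopal is asking for.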

Guarantees are an interesting and polarizing topic. In 1998, Zatko shared an example from the bike lock manufacturer, Kryptonite. At the time (and still today), Kryptonite offered to reimburse owners a certain amount if their bike was stolen as a result of the lock failing. Similarly, Jeremiah Grossman is known in the security industry for offering similar guarantees at companies for which he’s worked, namely WhiteHat Security and SentinelOne. Trustwave, Armor and Cymmetria have offered similar warranties.

Zatko calls for better standards and independent organizations or government agencies that can provide the transparency that vendors are currently unwilling to share. He makes the point that existing standards tend to focus on design and documentation processes, rather than a product’s state or properties in terms of security. The result is, for example, a product that might use ‘military-grade’ encryption, but ships with default credentials and no guidance or enforcement to change insecure defaults.

Both Zatko and Wysopal have built organizations that aim to provide more transparency around software security. Wysopal’s company, Veracode, offers a software review program for clients that results in a certification. Zatko differs here, arguing that perverse incentives create a conflict with for-profit companies and recommending that the necessary transparency be provided by a non-profit or government organization. His own Cyber-ITL organization is a 501(c)3 non-profit.

#2 Taking Down the Internet

One of the biggest takeaways from the original hearing was Zatko’s statement that any L0pht member could take down the Internet “with just a few packets”. Though Zatko didn’t dive into details, he was referring to Border Gateway Protocol (BGP) attacks, which are still a serious issue today. In short, BGP is the protocol that lets independently operated networks (autonomous systems) exchange routing information, making it foundational to how the Internet holds together. Zatko was the source of the research on this attack at the L0pht. In addition to BGP attacks, he was looking into the effect DNS attacks and application-layer denial-of-service (DoS) attacks could have on core Internet infrastructure.
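The core mechanic behind a BGP hijack can be shown with a toy routing table. Routers forward along the most specific (longest) matching prefix, so an attacker who announces a /24 inside a victim's /16 wins the tiebreak. A simplified sketch using the standard-library `ipaddress` module, with documentation-range addresses standing in for real networks:

```python
# Illustrative sketch of longest-prefix-match routing, the mechanism a BGP
# hijack abuses: a bogus, more-specific announcement outranks the legitimate
# route. Addresses are from the reserved documentation ranges, not real ones.

import ipaddress

# (prefix, next hop): the legitimate /16 plus a hijacker's more-specific /24
ROUTES = [
    (ipaddress.ip_network("203.0.113.0/24"), "attacker"),
    (ipaddress.ip_network("203.0.0.0/16"), "legitimate"),
]

def next_hop(addr: str) -> str:
    """Pick the route whose prefix is longest among those containing addr."""
    ip = ipaddress.ip_address(addr)
    matches = [(net, hop) for net, hop in ROUTES if ip in net]
    # the most specific prefix (largest prefixlen) wins, as in real routers
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("203.0.113.7"))   # inside the hijacker's /24
print(next_hop("203.0.42.7"))    # only the legitimate /16 covers it
```

Real BGP best-path selection weighs many more attributes, but prefix length comes first, which is why a single rogue announcement can silently redirect a slice of the Internet.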

Unfortunately, we’ve seen Zatko’s warnings realized in recent years. The Mirai botnet’s attack against Dyn, a DNS provider that has since been acquired by Oracle, impacted a significant number of popular websites and services for a short time. We see both accidental and intentionally malicious BGP route changes on a regular basis.

Wysopal observes, “It happens almost every day in minor ways. We saw a big attack a few weeks ago on Amazon’s Route 53 DNS service. A BGP attack routed DNS requests to a rogue DNS server that resolved to a server in Russia. This was done to trick people into depositing bitcoin in the attacker’s wallet. The attack lasted a few hours and the criminals netted around $150K.”

Have we made progress here?

In some ways, I think we have. Akamai and Cloudflare are capable of absorbing incredible amounts of DoS attack traffic. We haven’t seen anyone take down the entire Internet, but clearly parts of it can be taken down or hijacked via DNS and BGP attacks. It isn’t clear whether the lack of events impacting the entire Internet at once is the result of improvements in resilience or simply a lack of bad actors with motive.

What still needs to happen?

Again, Zatko today points out that this is technically a solvable problem, but, since businesses aren’t incentivized to do so, it doesn’t get fixed. He writes, “Again, this is a system issue and an incentive structure based on letting the free market be the main solution to a community health issue.”

There are also issues that can’t be addressed by a free market. In cases where one private or public organization controls a gas line or Internet segment, there’s often no financial incentive to secure or improve it.

#3 Critical Infrastructure

Stefan von Neumann (Stefan Wuensch) focused heavily on critical infrastructure in his commentary. He pointed out that these companies widely used insecure protocols and methods, controlling, monitoring and managing critical equipment over public infrastructure accessible to anyone with the right tools. He also pointed out that organizations, utilities in particular, were more prepared for physical attacks than digital ones.

Stefan was also concerned that utilities and software vendors were tight-lipped about security issues; an approach (secrecy) that Zatko already pointed out had been “more detrimental than beneficial by a long shot.” In addition, customers and users were unaware of security issues and therefore could take no action to protect themselves.

“I would personally like to see the same type of independent review process that should exist for software companies extended to utility companies and Internet service providers.” — Stefan, 1998

Have we made progress here?

We’ve certainly seen a lot more attention on critical infrastructure, but the risks didn’t become widespread public knowledge until much later. NERC, for example, created information security standards in 2003, but it was still common to find utilities with critical systems exposed to the Internet a decade later. Stuxnet, uncovered in 2010, was a sobering revelation and brought more attention to the problem.

What still needs to happen?

I didn’t reach out to Stefan and this isn’t my area of expertise, so I’d love to hear from others. Share in the comments section at the bottom of the article. What do you think? How are we doing here today?

#4 Full Disclosure

“Should there be public notice of successful hack attempts or software vulnerabilities?” — Senator Joe Lieberman

Disclosure was a big topic, especially in relation to software liability and commercial vendors. Zatko’s response was concise and highlighted the reason disclosure is still a hot topic today.

“This [disclosure] is definitely a double-edged sword, because if you give the information out, other people can figure out how to exploit it. However, if you don’t give the information out, the people out there can’t protect themselves.” — Zatko, 1998

Vulnerability notification was also discussed as an issue. It is the other side of the disclosure coin: notifying consumers ensures they’re both aware and protected. In the case of full disclosure, everyone is aware, because the issues are publicly released. However, Space Rogue made the point in 1998 that software vendors should ideally be notifying their own customers when issues exist.

The example he used was recent at the time: VW had just recalled the New Beetle (then a new model), which involved mailing 8,500 letters, one to each consumer who purchased an affected vehicle. Wysopal pointed out that some companies wouldn’t even tell you a fix was available for a known issue unless you contacted them and asked for it. They keep it secret from customers unless public or legal pressure forces them to act differently. The Catch-22 of this approach is that, since customers weren’t aware, they wouldn’t know to ask.

“Full disclosure is very important. You have to educate people. Education is one of the largest things that’s missing out of this. If I’m an administrator and there’s a problem in what I have to control, but companies don’t let me know about it, I can’t be expected to fix it. Even if companies don’t have a fix themselves, if I know of the problem, I might be able to put other things in front of it so I can catch it.” — Zatko, 1998

Have we made progress here?

We’ve seen coordinated disclosure have some success as a compromise between full disclosure and whatever the vendor might choose to do if given full control of the situation. In coordinated disclosure, the researcher uses the threat of exposure to keep the software vendor honest and prompt.

Google Project Zero mandates a strict 90-day timeline after notifying a vendor of a security bug. Other organizations set shorter or longer limits. Generally, it seems to work, though not all researchers use this approach. Full disclosure still occurs, while other researchers simply notify the vendor and wash their hands of the issue, uninterested in hounding or monitoring the vendor until a fix is released. Maybe it gets fixed, maybe it gets swept under the rug.
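The deadline arithmetic behind these policies is straightforward. A small sketch of the coordinated-disclosure clock, assuming a Project Zero-style policy in which details publish at the deadline, or earlier if the vendor ships a fix first (the function and parameters are illustrative, not any organization's actual tooling):

```python
# Coordinated-disclosure clock: the researcher notifies the vendor, and
# details go public after a fixed window whether or not a fix has shipped.
# If a patch lands early, disclosure can move up to the patch date.

from datetime import date, timedelta
from typing import Optional

DISCLOSURE_WINDOW = timedelta(days=90)  # Project Zero's standard window

def disclosure_date(notified: date, patched: Optional[date] = None) -> date:
    """Publish at the deadline, or earlier if the vendor fixes the bug first."""
    deadline = notified + DISCLOSURE_WINDOW
    if patched is not None and patched < deadline:
        return patched
    return deadline

print(disclosure_date(date(2018, 5, 19)))                    # no fix: deadline
print(disclosure_date(date(2018, 5, 19), date(2018, 6, 1)))  # early fix
```

The fixed window is the whole point: the clock starts at notification and runs regardless of vendor cooperation, which is what keeps the vendor "honest and prompt."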

What still needs to happen?

There’s still no federal or legal requirement for vulnerabilities to get reported or fixed. There’s no requirement for software to be reasonably free of vulnerabilities when sold. There’s no requirement for a product to undergo third-party due diligence as many electrical, physical and medical products are. Software vendors are still free to release a product that’s completely insecure and hackable.

Fear of public discovery, backlash, breach fines and civil suits is currently the primary incentive for companies to make an effort to release reasonably secure software. As Wysopal and Zatko both point out today, it’s still cheaper to release an insecure product. Federally enforced standards are likely the only effective solution and are most likely to arrive in the form of IoT security standards before they arrive for software products in general.

The ‘Forgotten’ Eighth Expert

Another thing the media tended to miss was that the seven L0pht members were not the only ones to testify that day in 1998. Before they testified, they shared their testimony with Dr. Peter G. Neumann, who was already a common sight in these sorts of subcommittee events on Capitol Hill.

Dr. Neumann was what you might call a ‘greybeard’ before current greybeards could grow a beard at all. Involved with computing since the 1950s, Dr. Neumann has a biography that could fill a small book and was present and active during many of the computing industry’s watershed moments. He has worked extensively on the problem of computing resilience, which overlaps with cybersecurity. For anyone who doesn’t mind losing an entire evening (or three), his website is fairly comprehensive and includes his written testimony from 1998.

Though Dr. Neumann’s background was deeply steeped in academia and formality, L0pht’s members knew him and got along well. In fact, Zatko recalls, “…right before his testimony, he was provided a copy of our testimony and read it. He marked it up and used it to highlight and inform parts of his comments.” Similarly, Wysopal remarks on the respect he has for Dr. Neumann and his efforts to “sound the bell on computer risks through his comp.risks mailing list since the early 90s”. In fact, that mailing list still exists and is accessible via the website and RSS feeds; it has actually existed since the 1980s and recently released its 30th volume.

Their friendship with Neumann, already a trusted face on Capitol Hill, not only provided additional credibility for the L0pht crew, it reinforced the need for clear, open and willing communication between all parties. Unfortunately, bringing vendors and service providers to the same table remains a challenge; they are typically only seen before these sorts of subcommittees after a failure has occurred (e.g. Equifax’s breach; Facebook and Cambridge Analytica; Uber and HackerOne).

The L0pht Legacy

The idea of a watchdog group, or average individuals disclosing vulnerabilities to large software companies, was fairly new at the time. It seemed risky then, but it’s now commonplace.

“Dude, are you really going to call out Microsoft… in PUBLIC?! Are you crazy? They’re gonna sue you to SLEEP.”

L0pht, along with other similar hacker groups of the time, emboldened a generation of researchers and IT staff to hold software companies accountable for the code they created and sold. Looking back, there was more to this meeting with Congressional members than just fleshing out problems. It served as a positive example we could always point to and say, “See? This can work. Hackers don’t have to be at odds with private industry and the government.”

The relationship between these groups has been tested over the years, but this event proved that both sides could come together, be reasonable and see eye-to-eye on important security topics.


“In conclusion, hopefully you having us here is not a fluke and hopefully we’ve not offended in any way, but this might be the beginning of an ongoing dialogue between the government and hacker groups such as ourselves.” — Zatko, 1998 

We have the means to make systems more secure and resilient — this much has been proven. We lack, however, the business incentives to make it happen. Several L0pht members are specifically calling for the US government to enforce some baseline security measures, especially with regards to software liability and network peering at the tier 1 and tier 2 providers. I leave you with a few additional notable quotes from 1998:

“An unhackable system shouldn’t be the goal — it should be to make attacks more difficult; to raise the bar.” — Space Rogue, 1998

“Think about what the government could do, if anything, through law to address these issues” — Senator Joe Lieberman, 1998

“In an industry where ‘time to market’ matters, who wants or cares to add security or even thoroughly test their product? Well, you should. You, the government and consumer, should care and want software products to include security and authentication mechanisms and I think you do. You should encourage the companies to include this in their products and hold them liable when their products fail.” — Zatko, 1998

“The first time some big company is compromised… it may fix itself. There will be a massive lawsuit and everyone will wonder why we didn’t address this in the beginning.” — Senator Fred Thompson, 1998

Author note on the final quote from Senator Fred Thompson: It didn’t.

The L0pht’s appearance before Congress was a teachable moment, but it was also a call to communicate. As such, we make that same pledge and call to whomever might be reading this: keep the conversation going, not only at the bottom of this article and elsewhere on EH-Net, but everywhere else for that matter.

Keep researching. Keep sharing. But most importantly, keep communicating!


Author Bio


Adrian Sanabria is the VP of Strategy and Product Marketing for NopSec. He spent a decade building security programs and defending large financial firms. He also spent many years as a consultant, performing penetration tests, PCI audits and other security-related assessments. Adrian learned the business side of the industry as a research analyst for 451 Research, working closely with vendors and investors.

Prior to NopSec, Adrian co-founded Savage Security, an applied research and consulting firm dedicated to making it easier for defenders to succeed. He continues this mission at NopSec, which helps practitioners by prioritizing vulnerability data and automating workflows.

Adrian is an outspoken researcher who doesn’t shy away from uncomfortable truths. He loves to write about the industry, tell stories and still sees the glass as half full. He can also be found on Twitter @sawaba.


Comments
      Interesting article. Good to remember our past.

Michael J. Conway

      This is a great read and I thoroughly enjoyed it. In 98, I went to basic and missed this completely. On the federal side of things, I think we have made more progress towards secure systems, particularly in the DoD. We still have a ways to go as cybersecurity still feels like a bolt on at times but I think it is getting better. This is thanks in part to the DoD pushing the NIST Risk Management Framework and some of the ground work laid by the NDAA of 2013. One of the unique provisions there was to require software developers to start doing some kind of source code analysis and looking for security bugs earlier in the development process. Since then, I have worked on several projects that have implemented source code analysis. It may take a little longer to develop “secure” software, but that time to develop is far shorter than the time to fix after it has been developed.



Copyright ©2021 Caendra, Inc.
