Interview: Bugcrowd Founders on Herding Ninjas for Crowdsourced Bug Bounties

| March 29, 2013

By Jason Haddix

Love it or hate it, crowdsourcing is here to stay. While it has mostly been confined to development and design, it was only a matter of time before it came to security. Two people trying to pioneer the space are Casey Ellis and Sergei Belokamen. As long-time hackers who have seen how the security industry works, they decided to start Bugcrowd. Here’s a description directly from the source:

“Bugcrowd is by far the most comprehensive and cost-effective way to secure websites and mobile apps. We’ll do a brief consultation and help you set the budget, the duration, and which websites or apps you’d like our curated crowd of researchers to test. The Bugcrowd researchers get to work finding security flaws in your applications. All testing can be routed through Bugcrowd’s crowd-control system, providing control and accountability. Any bugs are submitted to our Secure Operations Centre as soon as they are found. We validate the flaws and, at the end of the bounty, reward the first researcher to find each unique flaw. We provide you with an easy to understand report for you to hand to your developers… We can even recommend partners to help you fix what we find!”

Join me as I interview them both about their new venture and uncover some interesting information about security testing at massive scale, as well as how to get started. For example, if you are a tester looking to participate, it couldn’t be easier: fill out the “Ninja” form and create an online profile (public or private) in which you provide Bugcrowd with your PayPal email address. Then you wait until you receive an email announcing a new bounty.

How cool is it to get something like this from Bugcrowd in your inbox:

“********** BOUNTY ALERT: Bounty X is now open! **************

Hi All,

Bugcrowd bounty Beta X is now open. It will run for 5 days, and the reward pool is USD 3,500.

1st Place: USD 1,000 + 20 points
2nd Place: USD 500 + 15 points
3rd Place: USD 250 + 10 points
All other valid bugs (if first to find and disclose): USD 100 (or the remainder of the reward pool divided by the number of valid bugs, whichever is higher) + 5 points
All other valid bugs: 2 points

Testing is permitted on the web application at the nominated target addresses only.
All other systems and applications are out of scope. This includes systems linked to by the target application, the target application’s underlying host, its SMTP and auxiliary services, its control panels, the target’s hosting provider, and any other hosts or web apps which may be in the target domain. This is not a traditional penetration test.

Permission to test is given on condition of non-disclosure of the target and any findings. If you do not agree to this please discard this email and do not participate.
Excluded findings:

• Findings from systems other than the targets specifically nominated in the bounty brief
• Findings from social engineering
• Webserver banner disclosure
• Logout cross-site request forgery (CSRF)
• 404s
• DDoS
• Functional or UI/UX bugs
• Spelling mistakes
• Information disclosure gathered through OSINT

Cheers, and good hunting!”

That’s almost a better shot of energy than my morning coffee… almost. And if you’d like to have Bugcrowd run a bug bounty for your organization, that’s simple, too: just visit their contact page at http://bugcrowd.com/contact/ with your email address and budget, and you’re on your way. Check out The List to see who has already created a bug bounty program.
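As an aside, the payout rule quoted in that bounty alert – USD 100 per qualifying bug, or an even split of the leftover pool, whichever is higher – is simple enough to sketch. The function and example numbers below are mine, purely for illustration; they are not Bugcrowd code.

```python
def first_to_find_payout(pool_remainder, num_valid_bugs, floor=100):
    """Per-bug reward for valid 'first to find and disclose' bugs outside the
    placed rewards: USD 100, or an even split of what's left of the pool,
    whichever is higher (illustrative sketch only)."""
    if num_valid_bugs == 0:
        return 0
    return max(floor, pool_remainder / num_valid_bugs)

# Example: a USD 3,500 pool minus the three placed rewards (1,000 + 500 + 250)
# leaves USD 1,750; with 12 qualifying bugs, each pays ~USD 145.83.
print(first_to_find_payout(1750, 12))
```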

So with that, I go hunting for more information on Bugcrowd from Mr. Ellis and Mr. Belokamen.

Jason Haddix (JH): Welcome Guys!

Casey Ellis (CE) and Sergei Belokamen (SB): Hello Jason! Thanks for having us!

JH:  How many ninjas, on average, compete in a paid bounty? What about a charity bounty?

CE: A recent open paid bounty saw about 500 people participate to various degrees. You’d expect the charity bounties to have less traction, but it’s actually about the same… Which is awesome when you think about it… says good things about our industry.

JH: In your bounty launch emails you mention scoring: 1st place, 2nd place, and so on. Are the ‘places’ determined by the number of bugs submitted or by the risk/criticality of the vulnerabilities found?

CE: “Placed rewards” as we call them are for creative or high impact issues. This approach allows us to cap the budget for the client. Interestingly though, the person who takes 1st place doesn’t necessarily end up with the most cash or the most points… We’re incentivizing coverage as well as creativity.

JH: What has been the most creative bug found so far?

CE: My favourite so far is still from our first POC bounty, where one of the testers piped obfuscated JavaScript back into the app via Twitter to trigger an onMouseover XSS on a hidden page. Not an earth-shattering issue, but very creative and well executed. More recently we’ve been getting a fair few framework 0-days, which is pretty cool too.

JH: The competitions have leaned towards web applications so far.  Will Bugcrowd eventually move to binaries and mobile apps?

CE: Yep. I think by and large the concept of web application security testing is more mature which means the demand is higher. That said, we’ve got some mobile app gigs coming soon, and a couple of client-server bounties where client side daemons will be involved.

JH: During some of the last bounties, you have implemented a system called CrowdControl. Can you tell us more about it?

CE: The idea of CrowdControl is to handle the issues of how to control the testing, how to keep the participants accountable, and how clients can tell good traffic from bad traffic.

Version 1 (our current iteration) provides an encrypted tunnel for testers to route their traffic through. We can shape traffic, pause or stop testing, log traffic, and so on. CrowdControl also allows our clients to run a crowdsourced test on a staging instance without opening it up to the general internet.
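Casey doesn’t go into implementation detail here, but from a tester’s side, routing traffic through such a tunnel could look roughly like the sketch below. The proxy hostname, port, credentials, and target URL are hypothetical placeholders, not Bugcrowd’s actual endpoints or API.

```python
import requests

# Hypothetical CrowdControl-style tunnel endpoint issued to a tester for one
# bounty. Routing everything through it is what lets the operator log, shape,
# pause, or stop testing centrally. Placeholder values throughout.
PROXIES = {
    "http": "http://tester42:s3cret@crowdcontrol.example.net:8080",
    "https": "http://tester42:s3cret@crowdcontrol.example.net:8080",
}

resp = requests.get("https://staging.target.example.com/login",
                    proxies=PROXIES, timeout=10)
print(resp.status_code, resp.headers.get("Server"))
```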

JH: Do the two of you participate as ninjas as well?

CE: We validate the bugs after they are submitted and do take a bit of a poke around, but for obvious reasons we don’t participate in the bounties.

JH: What is the time progression for vulnerabilities? Meaning how fast do your ninjas work? Do most get found within the 1st hour? 2 hours? 3+ hours?

SB: The typical progression is:
- 1 hr – clickjacking, simple XSS, SSL issues, etc.
- 2 hr to 8 hr – more “simple” issues, but ones that are more deeply embedded in the app
- 8 hr to 24 hr – usually a bit of a lull
- 24 hr+ – the really interesting stuff: bug chaining and the like, submissions from those who’ve taken the time to properly grok the app and the possibilities for exploitation. I’ve talked to a few of them about this, and it tends to be that they’ve had an initial look, gone away for a bit, had an “oh… I wonder about that…” type of moment, and re-engaged.

JH: Bug bounty legalities are a scary prospect for some bug hunters. How do you protect the ninjas from liability?

CE: Sure. We get the client to put a page up that renders the bounty ID, the timeframe, and a few other bits of info. We’re putting a countdown timer in for the next one which will be cool. The main condition we put on the testers is non-disclosure… unless the client waives it (which some are doing, especially when they want to make a lot of noise about the fact that they are running a bug bounty).

JH: So far the testing has been on live sites that appear to be customer owned. Do you offer a service where customers copy their apps and Bugcrowd hosts them for testing, thus limiting the liability and exposure of the clients’ networks?

CE: Yes. We’ve got one coming up which will be an application that is generally sold and hosted on-site; we’ll be hosting it for the test. In terms of hosting VMs and the like for testing, we aren’t encouraging that, but we definitely have the infrastructure and the capability to do it.

JH: What percentage of the bugs is found via automated testing vs. manual testing?

SB: By volume roughly 50/50.

JH: How do you feel about bug hunters using automated tools? Is it discouraged or encouraged?

SB: There are a lot of things in web app testing that are completely impractical to *not* automate.

Spidering, finding unlinked pages, blind SQL injection (BSQLi), and input fuzzing are good examples.

It is allowed, but we ask testers to throttle back, given that they aren’t the only ones testing at the time. We use CrowdControl to manage this.
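For a sense of what “throttling back” can mean on the tester’s side, here is a minimal sketch of a rate-limited fuzzing loop. The target URL, payloads, and delay are arbitrary examples of mine, not anything prescribed by Bugcrowd or CrowdControl.

```python
import time
import requests

TARGET = "https://target.example.com/search"   # placeholder target
PAYLOADS = ["'", '"', "<script>alert(1)</script>", "../../etc/passwd"]
DELAY_SECONDS = 2  # crude client-side throttle so a shared target isn't hammered

for payload in PAYLOADS:
    r = requests.get(TARGET, params={"q": payload}, timeout=10)
    # Differences in status code or response size are a cheap signal that a
    # payload deserves a closer manual look.
    print(f"{payload!r} -> {r.status_code}, {len(r.content)} bytes")
    time.sleep(DELAY_SECONDS)
```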

JH: Off the top of your head, which vulnerabilities have been the most prevalent across your bug bounties?

SB: Clickjacking, autocomplete not being disabled on sensitive fields, and poor error handling are the simpler ones that are basically everywhere and get reported very quickly. Aside from those, LOTS and LOTS of XSS.
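Part of why those “basically everywhere” issues get reported so fast is that they can be spotted from a single response. The sketch below shows the kind of quick checks involved; the URL is a placeholder and this is not Bugcrowd tooling.

```python
import requests
from html.parser import HTMLParser

resp = requests.get("https://target.example.com/login", timeout=10)  # placeholder URL

# Clickjacking: a missing X-Frame-Options header (and no CSP frame-ancestors
# directive) is one of the quickest findings to spot and report.
csp = resp.headers.get("Content-Security-Policy", "")
if "X-Frame-Options" not in resp.headers and "frame-ancestors" not in csp:
    print("Possible clickjacking: no framing protections on the response")

# Autocomplete: password inputs without autocomplete="off" were a common
# low-severity report at the time of this interview.
class PasswordFieldCheck(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("type") == "password" \
                and (attrs.get("autocomplete") or "").lower() != "off":
            print('Password field without autocomplete="off"')

PasswordFieldCheck().feed(resp.text)
```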

JH: Have relationships with clients been agreeable thus far?

CE: Yep. One thing we’ve had to manage a few times is a level of embarrassment on the part of the client due to the volume of issues they receive under this approach.

JH: Tell us more about your charity portion.

CE: We run free bounties for charities and NFPs. I’ve been involved in charities in lots of different ways, and this is a great way to help them out with something that almost always gets overlooked. There’s the social responsibility angle to it, but it also helps us keep the crowd warm in between paid bounties and helps the testers build up their Kudos points so they can participate in private bounties.

How it works is we do our part for free, and the researchers who participate do their part for free. We’re actually a little surprised at the level of engagement we’ve received out of this – a combination of researchers taking every opportunity to test that they get and a desire to help out. Any charity is invited to apply for a bounty at http://bugcrowd.com/contact or by emailing us at hello@bugcrowd.com.

JH: How is work on the point tracker coming along?

CE: We’ve just stood up the first version of our tester management system, including a leaderboard and optionally public profile pages. It took a little longer than expected because, given the skills of the people who’ll be using it, we want it to be extra robust. We’ll be putting a lot of work into improving it and making it more valuable for testers over the coming months.

JH: Rather than pose a question that you’ve already answered quite well elsewhere, do you mind if we cite your blog and a Reddit thread for a moment?

SB & CE: Feel Free!

When asked in a Reddit thread about traditional assessments vs. Bugcrowd, Casey responded:

CE:  “We see 3 main consumers for Bugcrowd initially (at the moment that is… we are still very young and are working off the assumption that at least 50% of our assumptions are wrong).

1) The charity/nfp/pfp market, for which the work is completely pro-bono (aside from our points system, which testers can use as a 3rd party validation) and Bugcrowd acts as a charitable facilitator.
2) Those who think bug bounties are a great idea but haven’t implemented one yet. In this group in particular we will strongly recommend matching the reward of the bounty to the market.
3) People like the folks who just ran a $5k bounty, who are large enough to need thorough security testing (which, we’ve now confirmed, the crowd seems to be pretty good at providing) but small enough to have a budget issue when it comes to getting a useful amount of time from a consultancy.
$5k is the baseline bounty pool, and we will only recommend this for startups and lower-value targets. The larger the client, the more important it is that their bounty is higher (say $50k to $100k in the pool), and you will see this reflected in future bounties. That said, I also love the idea that a really small business can throw out $1k and still get something more usable than 1/3 of a day of normal consulting time. So maybe you’ll see us running a few of these too at some point.

The beauty of a crowd, especially in this field, is that there are a very wide range of motivations – from curiosity, to learning, to poops-and-giggles, to folks you can’t get out of bed with less than “buy a new kitchen” money. The broad goal of Bugcrowd is to provide a service which connects all types within the InfoSec community with all types of clients. Excuse the infomercial… but hopefully I’ve stayed on point and answered a few things. This thread has been incredibly useful for us, so thank you all (both likers and haters).”

JH: Thanks for your time guys!

CE & SB: Anytime Jason!


 
