Post Wed Sep 27, 2006 5:00 pm

Famous Writer Calls for End to CSI/FBI Survey

A serious topic deserves a serious survey, says Ira Winkler, author of Spies Among Us. And the CSI/FBI Annual Survey is not it.

The information security industry doesn't go more than a couple of weeks between the releases of surveys, most of which exist for marketing purposes rather than as reportage of major discoveries. Though venerable, the annual CSI/FBI Computer Crime & Security Survey is no exception -- and some of the claims it makes would, or should, stop a reasonable security pro in his tracks.

The survey is run by the San Francisco-based Computer Security Institute, which was founded in 1974. The survey began in the mid-1990s. In its early days, CSI got the FBI's Computer Intrusion Squad to co-sponsor its survey, providing a certain name cachet to a study by an organization with which few people were otherwise familiar.

While CSI offers useful training courses, education programs and major conferences, the organization feels compelled to keep conducting and releasing results from this poorly executed study. That's unfortunate, because a number of problems with the survey methodology compromise the credibility of an otherwise good organization.

The primary weakness of the CSI study is sample control -- that is, its sources aren't sound. The initial respondent pool is drawn from two sources: CSI membership rolls and the roster of paying attendees at its conferences and training events. CSI claims that it surveys 5,000 people, but that's simply how many surveys it sends out. One year, I personally received six of those 5,000 surveys, which are sent via both e-mail and snail mail; I can only imagine how many copies other people received -- people who should actually be receiving the survey, for instance. I'm not one of those people, yet I still get one or more copies every year. I could easily have made up data and returned the surveys to skew the results -- several times over.

Then you've got the response rate to that mailing of 5,000 people. By my calculations, the response rate for the survey has historically hovered around 10%, with one exception. (This year's survey garnered 616 responses, or 12.32% of those solicited.) While this doesn't necessarily mean that the study is faulty per se, it does mean that there is an extremely high margin of error.
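The response-rate arithmetic can be checked with a quick sketch. Note that the textbook margin-of-error formula below assumes a simple random sample -- a self-selected respondent pool like this one does not satisfy that assumption, so the real uncertainty is larger than the formula suggests:

```python
import math

surveys_sent = 5000
responses = 616

# Response rate quoted in the article: 616 / 5000 = 12.32%
response_rate = responses / surveys_sent

# Worst-case margin of error at 95% confidence (z = 1.96, p = 0.5).
# Valid only under the simple-random-sample assumption, which a
# self-selected mailing-list pool does not meet.
z = 1.96
p = 0.5
margin_of_error = z * math.sqrt(p * (1 - p) / responses)

print(f"Response rate: {response_rate:.2%}")         # 12.32%
print(f"Margin of error: +/-{margin_of_error:.2%}")  # about +/-3.95%
```

The point is that the headline number (616 responses) only bounds the sampling error if respondents were chosen at random; with a duplicated, self-selected pool, no such bound applies.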


He concludes by writing:

While in the course of consulting I might normally give people advice on how to apply a study or what messages to take away, my best advice in this case is reserved for CSI. At this point, the organization is only hurting its reputation with the survey as conducted. CSI needs to respect its audience and give us a report that incorporates proper survey techniques and accurate descriptions of the limits of its polling. Its results are so far from what the average member, subscriber or reader experiences on a regular basis, or reads in every other source of information, that it's time to call the study quits and preserve CSI's otherwise good reputation.


For full story:
http://www.computerworld.com/action/art ... Id=9003640

Don
CISSP, MCSE, CSTA, Security+ SME