I am about 60% done building a new web application crawler. My goal is not to replace Nikto or any other tool for that matter; I am thinking more of an application we can use in the reconnaissance/information-gathering phase.
So far, all my prototypes have been successful. Some of the tasks it performs are already handled by existing tools, and I know that. My goal is to combine some of these existing functionalities plus many new ones in one single tool.
It will have a GUI and a command line interface. It could become multi-threaded later if people like it. Finally, it will be free!
So here are the tasks it can/will do against a web site:
- Create a wordlist
- Find all emails, telephone numbers, fax numbers, etc.
- Find names and guess possible usernames based on email addresses
- Find broken links
- Create a site structure
- Display robots.txt file details
- Find typical HTML files
- Identify all forms
- Search for login screens
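To give an idea of what two of these tasks might look like, here is a minimal sketch of email harvesting and username guessing. The regex and the first/last naming schemes are my own assumptions for illustration, not necessarily how the final tool will implement them:

```python
import re

# Simple pattern for spotting email addresses in fetched page text.
# (Assumption: a pragmatic regex, not a full RFC 5322 validator.)
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def find_emails(page_text):
    """Return the unique email addresses found in a page, sorted."""
    return sorted(set(EMAIL_RE.findall(page_text)))

def guess_usernames(email):
    """Guess likely account names from an address like first.last@example.com.

    The naming conventions tried here (first, last, flast, firstl)
    are common defaults and purely an assumption.
    """
    local = email.split("@", 1)[0]
    guesses = {local}
    if "." in local:
        first, last = local.split(".", 1)
        guesses.update({first, last, first[0] + last, first + last[0]})
    return sorted(guesses)

page = "Contact John.Doe@example.com or sales@example.com for details."
for addr in find_emails(page):
    print(addr, "->", guess_usernames(addr))
```

The same harvesting pass could feed the wordlist generator, since page text, email locals, and guessed usernames are all candidate entries.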
My questions are:
1) What else would you find useful from a tool like this?
2) Output format from the command line version?
I want to launch a useful tool, not a script or two...
Thanks for your comments!