What other way is there to find 216 million flaws in sub-second scanning time? Google, of course. How about 160,000 strictly within .gov? These numbers are absurd, especially since I’m only searching for one type of URL rewriting for J2EE. This type of flaw usually rates as a medium – the result of combining high impact with low likelihood.
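
For the curious: J2EE URL rewriting leaves a telltale ;jsessionid= token sitting right in the URL, so dorks roughly like these are all it takes (illustrative queries, not necessarily the exact ones I ran – operators and result counts drift over time):

    inurl:jsessionid
    inurl:jsessionid site:gov

The first is the shape of the 216-million-class search; the second scopes it to .gov.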

URL rewriting is really only a problem because people aren’t especially good at rotating the user’s session ID post-authentication. If you can trick a user into clicking a link that has the session ID in it, they will keep using that session ID from that point forward. All you have to do is wait five minutes, then use the session ID you sent them to hijack their identity – and, most likely, their Twitter account, because apparently they take security lessons from Oracle, which, for the uninitiated, is like taking gun safety lessons from Plaxico Burress (my 2nd-round pick in fantasy football; I am salty).
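
The fix, for what it’s worth, is cheap: mint a brand-new session ID the instant the user authenticates, so whatever ID the attacker planted in that link is dead on arrival. Here’s a minimal sketch in servlet terms – SessionRotator and rotate are illustrative names, not part of any framework, and I’m assuming the Servlet 3.0 API:

    import java.util.Enumeration;
    import java.util.HashMap;
    import java.util.Map;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;

    // Illustrative helper (not from any framework): call this immediately
    // after a successful login so any session ID an attacker planted via a
    // jsessionid URL is discarded before the session becomes authenticated.
    public final class SessionRotator {

        public static HttpSession rotate(HttpServletRequest request) {
            HttpSession old = request.getSession(false);
            Map<String, Object> saved = new HashMap<String, Object>();

            if (old != null) {
                // Carry pre-login state (cart, locale, etc.) over to the new session.
                Enumeration<String> names = old.getAttributeNames();
                while (names.hasMoreElements()) {
                    String name = names.nextElement();
                    saved.put(name, old.getAttribute(name));
                }
                old.invalidate(); // the fixated ID dies here
            }

            HttpSession fresh = request.getSession(true); // container mints a new ID
            for (Map.Entry<String, Object> e : saved.entrySet()) {
                fresh.setAttribute(e.getKey(), e.getValue());
            }
            return fresh;
        }

        private SessionRotator() { }
    }

(Servlet 3.1 later boiled this down to a single call, HttpServletRequest.changeSessionId().)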

URL rewriting is bad for one-click session fixation, bad for SEO (page 8), bad for usability – it’s just a bad, bad idea. Why don’t you use some clever JavaScript for tracking cookieless user state instead?
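
Better still, if your container speaks Servlet 3.0, tell it to never rewrite in the first place by pinning session tracking to cookies in web.xml – a minimal sketch, assuming you can live without the cookieless fallback:

    <session-config>
        <tracking-mode>COOKIE</tracking-mode>
    </session-config>

With that in place, response.encodeURL() stops appending jsessionid, and your URLs stop leaking session IDs for Google to index.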

Of course none of these Google hacking techniques are new, but neither is David Spade and he’s banging it out with all kinds of hotties (note to Nicollette Sheridan: you can invade my Gaza strip anytime – also, do you have a son named Eric?). It’s just that the numbers for this particular area are so crazy I had to write something up. And when I brought this up to j-dubs, he of course tried to outdo me (typical) by conjuring up huge numbers in Google’s Code Search for the most blatant reflected XSS and the most obviously exploitable SQL injection vulnerabilities. He couldn’t come close, but he notes correctly that many of those hits could be in widely deployed software.

Can anyone beat that number? With a similar- or higher-severity vulnerability?