I caught up with Michael Eddington’s short and sweet analysis of the request validation in ASP.NET 2.0. So far I’ve seen a few people blast it, but I think it will actually help ASP.NET security against XSS in general, thanks to the Pareto principle (also called the 80/20 rule). I’ll quickly summarize Mike’s post and the situation in general.

ASP.NET 1.1 had relatively strong blacklist protection against XSS turned on by default. A few ways around the blacklist were discovered, but after the hotfixes were released it seemed pretty strong. It was, in fact, too strong for its own good: it “caught” so much legitimate input that developers frequently turned it off. Anecdotally, I can say that most (70-80%) of the ASP.NET applications I look at explicitly disable the feature.
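
For reference, turning the feature off is a one-line change, which is part of why it gets flipped so casually: either per page with the <%@ Page ValidateRequest="false" %> directive, or for the entire application in web.config:

```xml
<!-- web.config: disables request validation for every page in the application -->
<configuration>
  <system.web>
    <pages validateRequest="false" />
  </system.web>
</configuration>
```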

Then, in ASP.NET 2.0, Microsoft dumbed down the request validation significantly. To quote Mr. Eddington:

While asp.net v2.0 and higher performs the following:

  1. Look for &#
  2. Look for ‘<’ then alphas or ! or / (tags)
  3. Skip elements with names prefixed with double underscore (__)
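
In rough C# terms, my paraphrase of those three rules (not the actual framework source; the names below are mine) looks something like this:

```csharp
using System;

// Illustrative sketch of the three ASP.NET 2.0 rules quoted above.
// Not the real framework code; just the gist of the checks.
static class RequestValidationSketch
{
    public static bool IsDangerous(string value)
    {
        for (int i = 0; i < value.Length - 1; i++)
        {
            char c = value[i];
            char next = value[i + 1];

            // Rule 1: "&#" anywhere in the value (decimal/hex character references)
            if (c == '&' && next == '#')
                return true;

            // Rule 2: '<' followed by a letter, '!' or '/', i.e. anything tag-shaped
            if (c == '<' && (char.IsLetter(next) || next == '!' || next == '/'))
                return true;
        }
        return false;
    }

    // Rule 3 is an exemption applied by the caller: fields whose names start with a
    // double underscore (__VIEWSTATE, __EVENTTARGET, etc.) are never checked at all.
    public static bool ShouldValidate(string fieldName)
    {
        return !fieldName.StartsWith("__");
    }
}
```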

So, after thinking about this for about 30 seconds, we can see that applications relying on this filter are still vulnerable to at least the following types of attacks:

  • Attribute-based XSS (injecting into an attribute)
  • JavaScript-based XSS (injecting into JavaScript)
  • UTF7/US-ASCII encoded attacks
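
A few illustrative payloads make the point (reusing the IsDangerous sketch from above; the payloads are just examples of each category, not anything specific to Eddington's post). The first three contain no "&#" and no tag-shaped '<', so the 2.0 rules wave them straight through:

```csharp
using System;

static class BypassExamples
{
    static void Main()
    {
        // 1. Attribute injection:  <input value="[here]">  becomes
        //    <input value="" onmouseover="alert(1)">
        string attributePayload = "\" onmouseover=\"alert(1)";

        // 2. JavaScript injection:  var name = '[here]';  becomes
        //    var name = ''; alert(1); //';
        string scriptPayload = "'; alert(1); //";

        // 3. UTF-7 encoded tag: no literal '<' for the filter to see, but it decodes
        //    to <script>alert(1)</script> if the response charset can be forced to UTF-7.
        string utf7Payload = "+ADw-script+AD4-alert(1)+ADw-/script+AD4-";

        // By contrast, the garden-variety "body" payload is exactly what the new rules catch.
        string bodyPayload = "<script>alert(1)</script>";

        foreach (string p in new[] { attributePayload, scriptPayload, utf7Payload, bodyPayload })
        {
            Console.WriteLine(p + "  ->  flagged: " + RequestValidationSketch.IsDangerous(p));
        }
    }
}
```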

So this is a step backwards for Microsoft, right? Were they just munching on FAIL cookies?

Microsoft developers fail?

They no longer look for event handlers, expression calls, etc. This is great for attackers, right? This means that I can focus more on improving my Halo 3 scores and banging that Michelle chick in GTA IV since XSS is now much easier to perform? Well, those are very important goals, but I think the truth is that this helps Microsoft customers reduce the number of XSS incidents they’ll have.

Let’s think about the facts:

  1. Request validation in ASP.NET 1.1 was very strong
  2. Most everybody turned off request validation in ASP.NET 1.1
  3. The request validation in ASP.NET 2.0 is weaker than 1.1
  4. Most people will leave request validation on in ASP.NET 2.0

If this holds, security will actually improve. The big assumption in my argument is #4: because the 2.0 validation mechanism won’t be as big a roadblock to developers, it will be left on. Developers may still turn it off out of habit, though.

Maybe I’m giving them too much credit, but I bet Microsoft analyzed the XSS vulnerabilities customers produced and found that some percentage, say 75%, were what I call “body” XSS vulnerabilities – those that are not attribute-based/JavaScript-based/DOM-based. So, the 80/20 rule comes into play. If they prevent “body” XSS vulnerabilities (I hate the name, please don’t use it), they eliminate 75% of the vulnerabilities. This is better than the 0% they were preventing before when developers were turning it off completely.

True, when looked at in isolation as an XSS defense, the ASP.NET validation appears to be something Shooter McGavin would eat for breakfast, since it doesn’t stop all attacks. In its new version, however, it will stop the most common attacks (the “body” XSS vulnerabilities) without getting in developers’ way.

It seems like voodoo that they may actually improve security by weakening it. My question after writing this post, though, is this: what problems and legitimate test cases were developers hitting with request validation on? Should users legitimately be allowed to send in strangely encoded data or HTML tags? It sounds like Giorgio’s problems with “false positives” in NoScript (hilarious read).