I caught up with Michael Eddington’s short and sweet analysis of the request validation in ASP.NET 2.0. So far I’ve seen a few people blast it, but I think it will actually help ASP.NET security against XSS in general, thanks to the Pareto principle (also called the 80/20 rule). I’ll quickly summarize Mike’s post and the situation in general.
ASP.NET 1.1 had relatively strong blacklist protection against XSS turned on by default. A few ways around the blacklist were discovered, but after the hotfixes were released it seemed pretty strong. It was, however, too strong for its own good. It "catches" so much that developers frequently turn it off. Anecdotally I can say that most (70-80%) of the ASP.NET applications I look at explicitly disable the feature.
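For reference, turning it off is a one-liner: per page with the `ValidateRequest="false"` attribute on the `<%@ Page %>` directive, or for the whole application in `web.config`, roughly like this:

```xml
<!-- web.config: disables request validation for every page in the app -->
<system.web>
  <pages validateRequest="false" />
</system.web>
```

That one-line switch is exactly why the feature gets disabled wholesale instead of worked around per input.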
Then, in ASP.NET 2.0, Microsoft dumbed down the request validation significantly. To quote Mr. Eddington:
ASP.NET v2.0 and higher performs the following:
- Look for &#
- Look for ‘<’ then alphas or ! or / (tags)
- Skip elements with names prefixed with double underscore (__)
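To make the checks concrete, here is a rough Python sketch of those three heuristics. This is my own approximation of the rules as Mike describes them, not Microsoft's actual implementation:

```python
import re

def is_dangerous(value: str) -> bool:
    """Approximate the ASP.NET 2.0 heuristics: flag '&#' anywhere,
    or '<' followed by a letter, '!', or '/' (i.e. a tag start)."""
    if "&#" in value:
        return True
    if re.search(r"<[A-Za-z!/]", value):
        return True
    return False

def validate_form(form: dict) -> list:
    """Return the names of suspicious fields, skipping ASP.NET-internal
    fields whose names start with a double underscore (__VIEWSTATE etc.)."""
    return [name for name, value in form.items()
            if not name.startswith("__") and is_dangerous(value)]
```

Note how lexical the checks are: a lone `<` followed by a space or digit is fine, which is what keeps ordinary prose like "a < b" from tripping the filter.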
So, after thinking about this for about 30 seconds, we can at least see that applications relying on it are vulnerable to the following types of attacks:
- Attribute-based XSS (injecting into an attribute)
- UTF-7/US-ASCII encoded attacks
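Both bypasses fall out directly from how lexical the two triggers are. The payloads below are illustrative examples I made up (not from Mike's post); neither contains `&#` or a `<` followed by a tag character, so neither trigger fires:

```python
import re

# Attribute-based XSS: breaks out of a quoted attribute value,
# so it never needs a '<' at all.
attribute_xss = '" onmouseover="alert(document.cookie)'

# UTF-7 encoding of <script>alert(1)</script>:
# '+ADw-' decodes to '<' and '+AD4-' to '>', so the raw bytes
# contain no literal angle brackets for the filter to see.
utf7_xss = "+ADw-script+AD4-alert(1)+ADw-/script+AD4-"

for payload in (attribute_xss, utf7_xss):
    # Trigger 1: literal '&#' -- absent from both payloads.
    assert "&#" not in payload
    # Trigger 2: '<' followed by a letter, '!' or '/' -- also absent.
    assert re.search(r"<[A-Za-z!/]", payload) is None
```

Both payloads sail straight through the 2.0 checks; whether they execute then depends on where the application echoes them (inside an attribute, or to a browser that sniffs the charset as UTF-7).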
So this is a step backwards for Microsoft right? Were they just munching on FAIL cookies?
They no longer look for event handlers, expression() calls, etc. This is great for attackers, right? It means I can focus more on improving my Halo 3 scores and banging that Michelle chick in GTA IV, since XSS is now much easier to perform? Well, those are very important goals, but I think the truth is that this helps Microsoft customers reduce the number of XSS incidents they'll have.
Let’s think about the facts:
1. The request validation in ASP.NET 1.1 was very strong
2. Almost everybody turned off request validation in ASP.NET 1.1
3. The request validation in ASP.NET 2.0 is weaker than in 1.1
4. Most people will leave request validation on in ASP.NET 2.0
If this holds, security will actually be better. The big assumption in my argument is #4: because the 2.0 validation mechanism won't be as big a roadblock to developers, it will be left on. Developers may, out of habit, still just turn it off.
True, looked at in isolation as an XSS validation mechanism, the ASP.NET filter appears to be something Shooter McGavin would eat for breakfast, since it doesn't stop all attacks. In its new version, however, it will stop most of the attacks (the "body" XSS vulnerabilities) while not interfering with developers.
It seems like voodoo that they may actually improve their security by weakening it. My question after writing this post is: what legitimate inputs and test cases were developers failing with request validation on? Should users legitimately be allowed to send in strangely encoded data or HTML tags? Sounds like Giorgio's problems with "false positives" in NoScript (hilarious read).