Tuesday, December 28, 2010

Man-in-the-middle fun with Perl LWP

Note: If you don't care about Perl anymore, and you never have to audit projects that have Perl in them, just skip this entirely.  This is basically about a Perl gotcha that seems to have hit every project I've come across that uses LWP.  Nothing more, nothing less.

When you use Perl and you want to access HTTP/HTTPS/FTP content, you'll typically use a pre-made module to accomplish the task.  One of the most common modules for this is LWP.  LWP is not only used directly, but also by several other popular modules to handle network communications.  And this scares me...

Why? Because Perl programmers often use LWP to handle "web stuff".  Code I have encountered that uses LWP may disallow certain protocols, such as "file:", because the authors know they only want to access remote URLs, but it rarely does anything special to handle HTTPS securely.  This may be because the entire notion of validating certificates the LWP way is so counterintuitive that even the most experienced Perl folks I know have never encountered it.

Here's the issue with LWP and HTTPS:

Unless you define a custom HTTP request header called "If-SSL-Cert-Subject" and set it to a regular expression that properly matches the subject field of an SSL certificate, you won't be doing proper hostname validation.  Without hostname validation, you might as well not be using SSL.

I have not come across a single project that uses LWP and has hostname validation enabled.  There's a lot of code out there and I'm sure someone has done it right somewhere.  I have a feeling this lack of validation is because (a) programmers expect HTTPS libraries to do this by default, and (b) the interface basically makes no sense. 

Here's a snippet from the LWP perldoc that describes this functionality:

       The request can contain the header "If-SSL-Cert-Subject" in order to make the request conditional on the content of the server certificate.  If the certificate subject does not match, no request is sent to the server and an internally generated error response is returned.  The value of the "If-SSL-Cert-Subject" header is interpreted as a Perl regular expression.

Remember that a certificate's subject may look like this:


subject=/C=US/ST=California/L=Mountain View/O=Google Inc/CN=www.google.com

or like this:

subject=/C=US/ST=Arizona/L=Scottsdale/O=GoDaddy.com, Inc/OU=MIS Department/CN=www.GoDaddy.com/serialNumber=0796928-7/2.5.4.15=V1.0, Clause 5.(b)

As the value of this header is a regex, one cannot simply use a hostname.  Remember that /domain.com/ will match "someotherdomain.com" or even "hacker.anotherdomain.computerparts.topleveldomain", since the unescaped dot matches any character and the pattern can land anywhere in the subject.  Your code will still be vulnerable to attackers unless your regular expression is properly anchored and you take the time to properly parse the subject line, which is a chore to say the least.  Additionally, subjects are not guaranteed to all follow the same order or format, which makes writing a general-purpose check rather difficult.
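To make the pitfall concrete, here's a small sketch.  The subject strings are illustrative (the "real" one is based on the GoDaddy example above; the attacker one is hypothetical), and they show how an unescaped, unanchored pattern matches a hostile certificate subject while an anchored one does not:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# A real-looking subject and a hypothetical attacker-controlled one:
my $real = 'subject=/C=US/ST=Arizona/L=Scottsdale/O=GoDaddy.com, Inc/OU=MIS Department/CN=www.GoDaddy.com/serialNumber=0796928-7';
my $evil = 'subject=/C=US/CN=www.GoDaddy.computerparts.example';

# Unescaped and unanchored: "." matches any character and the pattern can
# land anywhere in the subject -- the attacker's cert passes.
my $loose = qr/GoDaddy.com/;
print "loose matches attacker cert\n" if $evil =~ $loose;

# Escaping the dots is not enough: "www.GoDaddy.computerparts" still
# contains the literal string "www.GoDaddy.com" (the "com" comes from
# "computerparts").
my $escaped = qr{/CN=www\.GoDaddy\.com};
print "escaped-only still matches attacker cert\n" if $evil =~ $escaped;

# Anchoring the tail so the CN must end at a field separator or at the
# end of the string finally rejects the attacker:
my $strict = qr{/CN=www\.GoDaddy\.com(/|\z)};
print "strict matches real cert\n"     if $real =~ $strict;
print "strict rejects attacker cert\n" if $evil !~ $strict;
```

Even this anchored pattern only handles subjects where CN appears as a slash-delimited field, which is exactly the parsing chore described above.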

And remember, if you don't do this (correctly), you might as well not be using SSL.

I have filed a couple of bugs against the larger perl projects I am aware of that use LWP.  I would strongly suggest that any perl programmers reading this check to see if they have used LWP and if so, if they have actually enabled correct hostname validation. 

Here's a snippet of Perl blatantly ripped out of the perldoc for LWP::UserAgent that demonstrates setting the header so you can play around with it:

require LWP::UserAgent;
my $ua = LWP::UserAgent->new;
# With no If-SSL-Cert-Subject header set, no hostname validation is performed at all
# Here's a certificate check that can be bypassed by anyone. Think attacker.domain.combinationlockfactory.com
# The following line will happily "validate" https://www.godaddy.com...
#$ua->default_header("If-SSL-Cert-Subject"=>'Domain Control Validated');
my $response = $ua->get('https://www.godaddy.com/');
if ($response->is_success) { print $response->content; } else { die $response->status_line; }
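For contrast, here's a sketch of the same idea with a dot-escaped, tail-anchored pattern.  The CN value is taken from the GoDaddy subject shown above and may not match whatever certificate the site serves today, and this is still a regex over a subject line rather than a real parser:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;

# Escape the dots and anchor the tail so the match cannot bleed into a
# longer hostname such as www.GoDaddy.com.attacker.example.  Field order
# and format still vary between CAs, so this remains fragile.
$ua->default_header('If-SSL-Cert-Subject' => 'CN=www\.GoDaddy\.com(/|$)');

# With the header in place, a request like
#   my $response = $ua->get('https://www.godaddy.com/');
# gets an internally generated error response, instead of being sent at
# all, whenever the server certificate's subject fails to match.
print $ua->default_header('If-SSL-Cert-Subject'), "\n";
```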

This seems like a pretty dangerous, hard-to-use interface.  I've reached out to some people in the Perl community about trying to fix this.  I'm interested in your comments.  Hopefully reading this has not been a waste of your time.

Sunday, December 5, 2010

iOS Safari text search - a feature that boldly ignores user privacy and security

In case you missed it, Apple added a new feature -- one I can't believe they haven't had until now -- the ability to search for text in the current page.  They must have thought pretty highly of it, given that they tout it on their main iOS feature page.  Here's their description of the feature:

In Safari, you can do a quick text search to find and highlight specific words and phrases on even the longest web pages.

I'm highlighting this feature because the way this feature works is ridiculous.  I mean, staggeringly ridiculous.

1. You go to a web page
2. You start entering the text you want to find in the search box
3. If your search term is found in the page, you'll see it in the search results and can select it to highlight it in the current web page.

Here are some screenshots of this in action searching for "iPhone" at https://www.paypal.com (note the nice green EV title!):

And the corresponding request:

GET /complete/search?json=t&nolabels=t&client=iphonesafari&q=iPhone HTTP/1.1
Host: clients1.google.com
User-Agent: Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2_1 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5
Accept: */*
Accept-Language: en-us
Accept-Encoding: gzip, deflate

Okay, so what's wrong with this?  Clearly anyone using this interface knows they are typing data into the search box from Google (or whatever search engine is selected).

Wait, stop, what?

You mean they designed a brand new feature to let you search the text of the current page, but only on the condition that you give up your privacy and security by sending it cleartext over the wire to your search provider and everyone watching the network? Yes.

Well, it's a good thing that sniffing local networks is not much of an issue... oh wait... ever heard of Firesheep?  The fact that Firesheep caused such problems not only highlighted the lack of basic encryption on popular websites, but also that there are indeed often attackers on the network sniffing your data.

I've always been a big fan of the way Apple makes user interfaces work.  The settings I need are normally right where I'd look for them, nice and integrated with each other.  Whoever made this feature was probably hoping for that kind of elegance, but clearly fell on their face.

Here's why I think so:

1.  You shouldn't force users to do something insecure to get the job done.  Users will do that insecure thing, and you really can't blame them.  They need to get their job done on their smartphone.

2. You shouldn't mix security contexts in such an irresponsible way.  I doubt most users understand they are sending this data cleartext over the wire, for everyone to see.  A traditional text search in a web page doesn't do this, and I'm not sure why users would be expected to realize this one does.  Especially since they probably have faith that Apple will do its job properly.

This was a fail.  Apple is setting users up for failure.  There is absolutely no reason for this data to be disclosed at all, let alone sent in the clear over the wire, just to search the current web page.

I urge Apple to fix this, because they are good enough, smart enough, and doggone it people like them.  Also, it's about to be 2011.  You should make a New Year's resolution not to let people work on Safari UI who don't understand at least the basics of web security.