Archive for November, 2009

WatchGuard 11.1 and HTTP headers

WatchGuard 11.1 firmware came out recently, and it features a new security option: replacing HTTP headers. The firewall admin can maintain a set of approved HTTP headers. As web traffic flows through the WatchGuard proxy, the proxy inspects each request and strips any header that is not on the approved list.
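
Conceptually, the filtering works like an allowlist. Here is a minimal sketch of the idea in Python; it is only an illustration, not WatchGuard's implementation, and the approved list and function name are my own.

# Sketch of HTTP header allowlisting (illustrative, not WatchGuard code).
APPROVED_HEADERS = {
    "host", "user-agent", "accept", "accept-encoding", "accept-language",
    "cookie", "referer", "content-type", "content-length", "connection",
}

def filter_headers(headers):
    """Keep only headers whose names appear on the approved list."""
    return {name: value for name, value in headers.items()
            if name.lower() in APPROVED_HEADERS}

print(filter_headers({
    "Host": "www.example.com",
    "User-Agent": "Mozilla/5.0",
    "X-Custom-Tracking": "abc123",   # non-standard header, silently dropped
}))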

Certain websites may have an issue with this, such as websites that rely on non-standard HTTP headers. If that happens, the firewall admin has two choices. The non-standard headers can be added to the approved list. Alternatively, the website can be added to a proxy bypass list, so that traffic to and from that site bypasses the proxy rule altogether.

What risk is this control mitigating? Several HTTP attacks rely on Host header manipulation or header injection. There are also attacks that trick a server into sending what the browser interprets as two or more HTTP responses in a single reply (HTTP response splitting). Both are thwarted by the header controls in the 11.1 HTTP proxy.
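
To make the response splitting case concrete, here is a small, hypothetical example (mine, not from the WatchGuard documentation) of an application that reflects user input into a header. The embedded CRLFs let an attacker terminate the real response and append a forged one; a proxy that rebuilds the header block from an approved list breaks the trick.

# HTTP response splitting via CRLF injection (hypothetical vulnerable app).
user_input = ("/home\r\nContent-Length: 0\r\n\r\n"
              "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n"
              "<html>forged page</html>")

# Vulnerable: the injected CRLFs make one server reply look like two responses.
vulnerable = "Location: " + user_input

# Safer: strip CR/LF from the value, the same net effect a header-sanitizing
# proxy provides downstream.
sanitized = "Location: " + user_input.replace("\r", "").replace("\n", "")

print(repr(vulnerable))
print(repr(sanitized))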

Is it worth the effort? Time will tell.

Penetration testing Microsoft Office Communications Server

Pentesting your Microsoft Office Communications Server? Need a tool? VIPER Lab updated its OAT (OCS Assessment Tool) to v2.0 this month. OAT automates testing OCS with: online dictionary attack, domain user enumeration, presence stealing, contact list stealing, domain IM flood, communicator call DoS, and domain call walk. Like SimWitty, OAT is written in C# and available under the BSD license.

“VIPER Lab created OAT because OCS and other Microsoft products are frequently being used as part of a unified communications infrastructure in many enterprises. Our mission is to help IT manager and security practitioners evaluate the security architecture of their deployments and ensure that their mission-critical communications and systems are protected.”

http://voat.sourceforge.net/

The pack is not online — Diskpart errors on some file systems

VDS returns the following error when you select a partition with a file system it does not recognize:

C:\> Diskpart

DISKPART> list disk
DISKPART> select disk (id)
DISKPART> list part
DISKPART> select part (id)

Virtual Disk Service error:
The pack is not online.

The "pack is not online" error (VDS_E_PACK_OFFLINE, 0x80042444L) is returned when Diskpart attempts to read the file system properties of, say, an ext3 or HFS+ partition. Diskpart works only with FAT and NTFS file systems. If the goal is to delete the non-Microsoft partition, use the clean command; note that clean removes all partitions and volumes on the selected disk.

DISKPART> list disk
DISKPART> select disk (id)

DISKPART> clean

Audit for SSL/TLS renegotiation

An SSL/TLS renegotiation attack has been carried out against Twitter. The Register has some details on the Twitter attack, while Educated Guesswork has the technical details on the renegotiation vulnerability itself.

SSL/TLS renegotiation has been abused before to get a web server to downshift its cipher and key length. The new angle is using renegotiation to splice the attacker into the connection between the web server and the browser, creating a man-in-the-middle scenario. Once inserted between web server and browser, the attacker can inject chosen plaintext into the front of the victim's HTTP stream.
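
To see why that matters, here is a sketch of the kind of plaintext prefix the attacker supplies before splicing in the victim's renegotiated session (the path and header name are made up for illustration). The victim's own request, cookies included, is appended where the dangling header leaves off, so the server runs the attacker's request with the victim's credentials.

# Sketch of the renegotiation prefix-injection trick (illustrative values only).
attacker_prefix = (
    b"GET /account/transfer?to=attacker HTTP/1.1\r\n"
    b"Host: www.yourhosthere.com\r\n"
    b"X-Ignore-This: "   # no CRLF here: the victim's request line is
)                        # swallowed into this bogus header
# After the splice, the server effectively sees:
#   attacker_prefix + victim_request   (victim_request carries the Cookie header)
print(attacker_prefix)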

Being an IT operations security guy, my focus is on auditing for and protecting against the weakness. The mitigation is simple: disable renegotiation. As for auditing, you can use openssl on any Linux OS to test.

sudo openssl s_client -connect www.yourhosthere.com:443

You will see the certificate chain, server certificate, SSL handshake, and SSL session details. The session is established when OpenSSL reports Verify return code: 0 (ok).

Now suppose OpenSSL reports verify error:num=20:unable to get local issuer certificate. I have seen this error on GoDaddy websites. To resolve it, browse to the website with Firefox, open the certificate viewer, and click the Details tab. There, below the details, click the Export button. Save the certificate file in X.509 PEM format with a .pem extension (example: godaddy.pem). Then rerun OpenSSL and specify the certificate authority file.

sudo openssl s_client -connect www.yourhosthere.com:443 -CAfile godaddy.pem

Start an HTTP request and then ask for renegotiation (in s_client, a line beginning with R triggers renegotiation):

HEAD / HTTP/1.0

R

An ssl handshake failure error indicates the web server is refusing client-initiated renegotiation. If OpenSSL renegotiates successfully, you will see a new certificate chain, followed by read:errno=0 when the connection closes. Contact your web server administrator if the server renegotiates.
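
If you have more than a handful of sites to check, the manual session above can be scripted. Below is a rough sketch that drives openssl s_client from Python and scans its output; the host list, and the strings it checks for, are assumptions based on the interactive test, so treat it as a starting point rather than a finished scanner.

# Rough sketch: automate the manual s_client renegotiation check.
# Assumes the openssl binary is on the PATH; the host list is an example.
import subprocess

HOSTS = ["www.yourhosthere.com"]

def allows_renegotiation(host, port=443):
    """Return True if the server appears to accept client-initiated renegotiation."""
    proc = subprocess.Popen(
        ["openssl", "s_client", "-connect", "%s:%d" % (host, port)],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    # Same keystrokes as the manual test: start a request, then a line with R.
    output, _ = proc.communicate(b"HEAD / HTTP/1.0\nR\n")
    text = output.decode("utf-8", "replace")
    if "handshake failure" in text:
        return False                      # server refused the renegotiation
    return "RENEGOTIATING" in text

for host in HOSTS:
    print(host, "renegotiates" if allows_renegotiation(host) else "refuses renegotiation")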

(Update 2009-12-18: You can use the Matriux distro to perform the above steps.)

Building our own cloud

I have been thinking a lot about IT service architecture. After all, my theme this year is “Security is Design”. How can we maximize the benefits of new technologies while minimizing the security risks?

Take cloud computing. The buzz is that cloud computing reduces costs and increases scalability. Cloud computing, specifically cloud hosting, does this by putting our servers in a multi-tenant environment and charging based on utilization. Organizations get pay-as-you-go pricing, with infrastructure costs shared across scores of customers (tenants). Add self-service and rapid provisioning, and you get a fast and flexible solution.

That makes the IT operations side of my brain happy. But then my IT security side chirps up.

Multi-tenancy increases security risks, as we no longer have end-to-end visibility and control coverage. Think of the property security of an apartment versus a private home. Multi-tenancy decreases responsiveness, too, as the service provider must balance the needs of its own organization against the needs of yours. Think of the customer service you get from your telephone utility versus your in-house telecommunications specialist. Above and beyond that, simply by being a new architecture, cloud computing will bring an entirely new set of risks that can only be identified with time.

So how can we balance the benefits and risks of cloud computing? One way is to bring the cloud computing technologies in-house. The basics are readily available: virtualization, rapid provisioning, self-service, resource pooling, and chargeback. The result is a data center built on the cloud computing model, but leveraging the best of an internal IT team: responsiveness, responsibility, and business domain knowledge.

My team has been using the terms "in-house cloud" or "private cloud" to describe our efforts to achieve this balance. This week, vendors led by EMC launched www.privatecloud.com as a resource for building such beasts. Check out their definition of private cloud. While the blog is VMware and EMC based, I wager it is only a matter of time before Microsoft and Compellent come out with comparable information.

Done right, private clouds (cloud computing built in-house) will give organizations a smooth transition to the benefits of this new architecture.