TLS proxies: Insecure by design, say boffins

Home antivirus is rubbish, so you don't use it on your work PCs. Do you?


Have you ever suspected filters that decrypt traffic of being insecure? Canadian boffins agree with you, and have said that TLS proxies – commonly deployed in both business and home networks for traffic inspection – open up cans of worms.

“Not a single TLS proxy implementation is secure with respect to all of our tests, sometimes leading to trivial server impersonation under an active man-in-the-middle attack, as soon as the product is installed on a system," wrote Xavier de Carné de Carnavalet and Mohammad Mannan of the Concordia Institute for Information Systems Engineering in Montreal.

The pair's paper (PDF) goes on to say that users could be exposed to man-in-the-middle attacks or other CA-based impersonations.

While the researchers focused their attention on consumer anti-virus and Web filtering products, The Register reckons at least one of their warnings is borne out in the enterprise space.

It's not enough that vulnerabilities like POODLE and FREAK are patched by project maintainers, for example: customers of downstream products that happen to use the buggy software have to wait until their specific patches are available.

As de Carnavalet and Mannan write, outdated proxies might “lack support for safe protocol versions and cipher suites, undermining the significant effort spent on securing web browsers.”
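One way to see whether a filter in the path is quietly downgrading connections is to refuse legacy protocol versions on the client side and report what actually gets negotiated. A minimal sketch using Python's standard ssl module (the function name is ours, not the researchers'):

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443, timeout: float = 5.0) -> str:
    """Connect to host and report the TLS version actually negotiated.

    Behind a filtering proxy this shows the proxy-to-client leg -- the one
    an outdated proxy may quietly downgrade.
    """
    ctx = ssl.create_default_context()
    # Refuse legacy protocols outright; if the handshake then fails, whatever
    # terminated TLS on our side could not speak TLS 1.2 or better.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    with socket.create_connection((host, port), timeout=timeout) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3"
```

Run it against the same HTTPS site from inside and outside the filtered network: a lower version (or a failed handshake) on the inside suggests the proxy, not the server, is the weak link.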

The researchers tested eight antivirus and four parental-control products (all for Windows), plus two other products that import a CA root certificate, checking each for vulnerability to MITM attacks.

The depressing assertion:

We found that four products are vulnerable to full server impersonation under an active man-in-the-middle (MITM) attack out-of-the-box, and two more if TLS filtering is enabled. Several of these tools also mislead browsers into believing that a TLS connection is more secure than it actually is, by e.g., artificially upgrading a server’s TLS version at the client.

There's also the matter of how products protect their root certificates' private key. It's not pretty, as the table below shows.

Product         Location      Protection              Access
Avast           CAPI          Exportable key          Admin
AVG             Config file   Obfuscation             Unknown
BitDefender     DER file      Hardcoded passphrase    User
BullGuard AV    DER reg key   Hardcoded passphrase    User
BullGuard IS    DER reg key   Hardcoded passphrase    User
CYBERsitter     CER file      Plaintext               User
Dr Web          CAPI-cert     Exportable key          Admin
ESET            CAPI          Non-exportable key      Admin
G DATA          Registry      Obfuscated encryption   User
Kaspersky       DER file      Plaintext               User
KinderGate      CER file      Plaintext               User
Net Nanny       Database      Modified SQLCipher      User
PC Pandora      CAPI-cert     Non-exportable key      Admin
ZoneAlarm       DER file      Plaintext               User

Although key recovery was non-trivial, the authors note “we retrieved four passphrase-protected private keys and a key stored in a custom encrypted SQLCipher database”.

Here's another treat: “The version of Kaspersky we analyzed in March 2015 continues to act as a TLS proxy when a 30-day trial period is expired; however, after the license expiration, it accepts all certificates, including the invalid ones”. Although the flaw was later fixed, people who installed the pre-fix product and never uninstalled it therefore remain vulnerable to MITM attacks.
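A proxy that accepts all certificates can be spotted from the client side by connecting to a deliberately broken test host and seeing whether the handshake still fails as it should. A minimal sketch, again with Python's standard ssl module (expired.badssl.com is a public test endpoint of our choosing, not one the researchers used):

```python
import socket
import ssl

def rejects_bad_cert(host: str = "expired.badssl.com", port: int = 443) -> bool:
    """Return True if the TLS stack in the path (including any interception
    proxy) refuses a deliberately broken certificate; return False if the
    handshake succeeds, meaning the bad certificate was swallowed en route.
    """
    ctx = ssl.create_default_context()  # normal verification rules
    try:
        with socket.create_connection((host, port), timeout=5.0) as raw:
            with ctx.wrap_socket(raw, server_hostname=host):
                return False  # handshake completed: invalid cert accepted
    except ssl.SSLCertVerificationError:
        return True  # verification failed, as it should
```

The trick works because an interception proxy re-signs the server's certificate with its own root: if the proxy accepts the expired upstream certificate, the client sees a freshly minted "valid" one and the handshake succeeds, which is exactly the failure mode described above.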

For the home user, de Carnavalet reckons filters that simply block domain names are probably effective enough.

The Register keenly hopes de Carnavalet and Mannan get the chance to repeat their tests against corporate proxies, and will keep some popcorn for just such an event. ®
