Apple's privacy pledges: We sent dev checks over plain HTTP, logged IP addresses. We bypass firewall apps

Big Sur highlights shortcomings in OCSP comms, APIs

Analysis Apple plans to revise the way it checks the trustworthiness of Mac applications when they're run – after server problems last week during the launch of macOS Big Sur prevented people's desktop apps from starting.

On Monday, Apple modified its Gatekeeper support page to address privacy concerns raised in the wake of the breakdown.

Gatekeeper is the system utility that checks that an application's developer certificate is valid before it allows the user to run the program. This verification process involves contacting Apple's servers to check the status of the certificates involved, and if those servers go down, users can find themselves unable to launch their software.

Before the weekend, it became widely known that macOS's cert-checking code effectively sends a digital fingerprint – a hash – identifying the developer certificate of the app under scrutiny to Apple's back-end servers via plain-text HTTP. That means Apple, as well as anyone eavesdropping on the network path, can at least link you by your public IP address to the kinds of application you use.
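To make the privacy issue concrete, here is a hedged sketch of what a passive observer can recover from such traffic. The host path and payload below are illustrative stand-ins, not Apple's actual format; the point is that over plain HTTP the entire request URL, including the encoded blob naming the developer certificate, crosses the network readable to anyone on the path.

```python
import base64

def decode_ocsp_path(url_path: str) -> bytes:
    """Recover the raw encoded OCSP request from an observed URL path.

    OCSP GET requests embed the request body, base64-encoded, as the
    last path component -- so a plain-HTTP request leaks it verbatim.
    """
    encoded = url_path.rsplit("/", 1)[-1]
    return base64.b64decode(encoded)

# Toy stand-in for a DER-encoded OCSP request (a real one carries the
# issuer-name hash and serial number identifying the developer's cert).
fake_der = b"0\x0adeveloper-cert-id"
observed_path = "/ocsp-devid01/" + base64.b64encode(fake_der).decode()
assert decode_ocsp_path(observed_path) == fake_der
```

Anyone who can see the packets – an ISP, a coffee-shop Wi-Fi operator – can run the equivalent of this decode step; no key material is required.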

Now Apple has stressed that this app security check does not send anyone's Apple IDs nor device identifiers over the 'net, though it did log people's public IP addresses. The tech giant promised to no longer retain this information, and said it will implement additional privacy improvements.

"To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs," Apple said.

The Silicon Valley titan also said it plans to implement an encrypted protocol for developer ID certificate revocation checks, to take steps to make its servers more resilient, and to provide users with an opt-out mechanism. The Register understands that the certificate checks are cryptographically signed by Apple, so they cannot be tampered with in transit without detection, though they can be observed, and so now Apple will wrap that communication channel in encryption to shield it from prying eyes.
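The distinction between signing and encryption that The Register describes can be sketched as follows. This uses an HMAC purely as a stand-in for Apple's real (public-key) signature scheme, which is not public in detail: a signature stops tampering in transit, but without encryption the response stays readable to any on-path observer.

```python
import hashlib
import hmac

key = b"responder-signing-key"   # hypothetical key, illustration only
response = b"cert-status: good"
tag = hmac.new(key, response, hashlib.sha256).digest()

# An attacker who flips the status cannot produce a matching tag,
# so tampering is detectable...
forged = hmac.new(b"wrong-key", b"cert-status: revoked",
                  hashlib.sha256).digest()
assert not hmac.compare_digest(forged, tag)

# ...yet the plaintext itself is still visible on the wire.
assert b"cert-status" in response
```

Wrapping the channel in encryption, as Apple has promised, addresses the second property; the signature already covered the first.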

Various Apple services failed last week. Though the Mac goliath has yet to publish a post-mortem of the incident – as is common practice among cloud service providers – Apple's system status page showed service degradation and outages. These issues are separate from the problems reported by people who have apparently bricked older MacBook Pro models (2013-2014) by applying the Big Sur update.

One of the services that fell over was Apple's Online Certificate Status Protocol (OCSP) responder, which performs the aforementioned developer certification checks: it reveals if a cert is valid or has been revoked. The purpose of these certificates is ostensibly to limit the spread of malware. In a blog post, Jeff Johnson, who runs app development biz Lapcat Software, explains that if Apple finds a developer has distributed malware, it can revoke the developer's code-signing certificate and prevent macOS from launching all programs signed by that certificate.
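The revocation policy Johnson describes boils down to a simple rule: one revoked Developer ID certificate blocks every app signed with it. A minimal sketch, with entirely made-up identifiers:

```python
# Hypothetical revocation list -- the cert IDs here are invented.
revoked_certs = {"DEVID-MALWARE-CO"}

def may_launch(app: dict) -> bool:
    """Gatekeeper-style decision: refuse any app signed by a revoked cert."""
    return app["signing_cert"] not in revoked_certs

assert may_launch({"name": "GoodApp", "signing_cert": "DEVID-ACME"})
assert not may_launch({"name": "OtherApp", "signing_cert": "DEVID-MALWARE-CO"})
```

The flip side of this design is the coupling Johnson complains about: the launch decision depends on an answer from a remote revocation service.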

"Unfortunately, if there's an internet connection problem involving the Developer ID OCSP, that can also prevent Mac apps from launching," said Johnson. "For several hours on Thursday, Mac users around the world experienced extreme slowness when launching their installed apps."

In other words, if a Mac loses its internet connection, the operating system's cert-checking code is supposed to fail open and allow applications to run anyway. But because people's Macs could still reach the internet, and it was Apple's servers that fell over, that fail-open path was never triggered: the checks neither completed nor errored out quickly, and app launches stalled.
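The failure mode above can be sketched as a "soft fail" check. This is an assumed model of the behavior, not Apple's code: an unreachable responder raises a connection error and the app is allowed to run, while a reachable-but-degraded server raises nothing, so the caller sits inside the request until its timeout expires.

```python
import socket

def check_developer_cert(do_request, timeout: float = 5.0) -> str:
    """Soft-fail revocation check (illustrative policy, not Apple's code)."""
    try:
        return do_request(timeout=timeout)   # "good" or "revoked"
    except (socket.gaierror, ConnectionError):
        return "good"                        # offline: fail open, allow launch

# Offline Mac: no route to the responder, so the check fails open.
def offline(timeout):
    raise ConnectionError("no route to host")
assert check_developer_cert(offline) == "good"

# Degraded-but-reachable server: no exception fires, so a real caller
# would block inside do_request until the timeout -- the stall users saw.
def healthy(timeout):
    return "good"
assert check_developer_cert(healthy) == "good"
```

The fix users improvised during the outage – blocking the responder entirely – works precisely because it converts the degraded-server case into the offline case.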

Hanlon's razor

Complicating matters further is the use of plain-text HTTP for the certificate checks, which means anyone with access to that network traffic, like an ISP, can determine the developer certificate involved and at least guess the associated app. That's less than ideal from a privacy perspective, particularly given Apple's efforts to market itself as a corporation that cares about privacy, for users outside China.

Developers who have looked into the transmitted data argue it's not particularly sensitive. However, Apple's decision to stop logging IP addresses associated with developer ID certificate checks demonstrates that the privacy concerns aren't entirely imagined.

After various technically inclined types looked into ways to prevent the OCSP slowdown and perhaps block it persistently, word spread that network filtering apps weren't up to the task, thanks to API changes in macOS Big Sur.

Patrick Wardle, principal security researcher at Jamf and founder of Objective-See, noted that in Big Sur, Apple requires third-party firewall apps and app-based VPNs to use network monitoring and proxy software interfaces that are sidestepped by traffic from Apple's own apps and operating system processes. Thus, data packets from Apple's own programs and OS code go straight out to the network, and aren't funneled through these firewall and VPN apps.

In a phone interview with The Register, Wardle explained that the various kernel programming interfaces (KPIs) previously available to developers for network monitoring are no longer allowed. Officially endorsed APIs, NEFilterDataProvider and NEAppProxyProvider, don't allow third-party firewall apps like Objective Development's Little Snitch or Objective-See's LuLu, or app-based VPNs, to block OCSP requests or other Apple-exempted processes. (System-wide VPNs implementing NETunnelProviderManager reportedly do cover Apple traffic.)

"There are legitimate processes on macOS that need to talk to various Apple endpoints for the system to function properly," Wardle explained. "Apple decided to make sure these could always talk to the internet, even with a third-party firewall installed."
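A toy model of the dispatch behavior Wardle describes – the process names and exemption list here are illustrative, not Apple's implementation – shows why a "deny all" firewall rule can't touch exempted traffic:

```python
# Illustrative exemption list; the real set lives inside macOS.
APPLE_EXEMPT = {"trustd", "nsurlsessiond", "App Store"}

def deliver(packet: dict, content_filter) -> str:
    """Route a packet either past or through a third-party content filter."""
    if packet["process"] in APPLE_EXEMPT:
        return "sent"                        # never reaches NEFilterDataProvider
    verdict = content_filter(packet)         # everything else is filtered
    return "sent" if verdict == "allow" else "dropped"

block_everything = lambda pkt: "deny"        # a 'deny all' firewall policy
assert deliver({"process": "curl"}, block_everything) == "dropped"
assert deliver({"process": "trustd"}, block_everything) == "sent"
```

That second assertion is exactly the hole Wardle's proof-of-concept abuses: data smuggled out via an exempted process leaves the machine no matter what the firewall's rules say.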

Wardle said while he understands Apple's reasoning for doing so and doesn't believe there's any malice or conspiracy involved, he still has privacy and security concerns.

"There are users out there who probably don't want their traffic going to Apple's servers," he said, pointing to past reports about how intelligence agencies monitor network traffic. He also noted that the inability to block Apple network traffic might not be appreciated by macOS users tethered to a cellular device since they would have to bear the cost of Apple data – an issue that recently led to a lawsuit against Google.

Wardle said he developed a proof-of-concept file exfiltration exploit that takes advantage of the exemption Apple gives to its own network traffic to get past firewall applications.

"That's problematic," he said. "I think a firewall should be able to do its job and comprehensively analyze traffic."

Wardle repeatedly expressed sympathy for the challenge Apple faces balancing usability and security. But he argues the Cupertino titan hasn't thought through the repercussions of its approach. He pointed to recent changes Apple made to prevent scans of remote process memory space as a defense against code injection. Hackers known as the Lazarus Group responded by creating memory-only payloads, knowing that macOS security tools could no longer scan or capture the malicious code.

"It just seems that as Apple locks down the operating system, that lockdown process also locks out some of the capability of third-party security tools," said Wardle.

However, he endorsed Apple's decision to revise its OCSP process and said he was happily running Big Sur, despite the rough start. "From a security and privacy point of view, it's still a no-brainer," he said. "Launching a new operating system is always difficult." ®
