
OpenAI Limits Data Inspection for (Most) Organizations

Lumia Security Labs

August 7, 2025 | 5 min read

To examine OpenAI’s privacy practices, we analyzed the ChatGPT Mac desktop application. Our findings raise important questions about fairness and the ability of smaller organizations to protect themselves.

Introduction

Nowadays, it’s hard to find anyone who hasn’t interacted with ChatGPT. Thanks to its convenience and versatility, many users choose to engage through OpenAI’s dedicated apps – whether on their phones or computers. Given the sensitive nature of conversations people have with ChatGPT, OpenAI’s user and enterprise privacy is of utmost importance. To explore this, we analyzed ChatGPT’s Mac desktop application. What we found raises important questions about fairness and the ability of smaller organizations to protect themselves.

Certificate Pinning: a Double-Edged Sword

Normally, a client validates the server by checking that its SSL/TLS certificate was issued by a trusted certificate authority. With certificate pinning (Figure 1), the client further restricts which certificates it accepts, typically to a specific known certificate or issuer. Certificate pinning blocks man-in-the-middle interception – whether by an external adversary or for research purposes – for example by preventing the use of a locally added trusted root certificate.

Figure 1: SSL/TLS Certificate Pinning
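Conceptually, the check such a client performs can be sketched as follows. This is an illustrative sketch under our assumptions, not OpenAI's actual implementation; the allowlist entry and the `enforce_pinning` helper are hypothetical, and the pin format (base64-encoded SPKI SHA-256) matches the fingerprints described later in this post.

```python
# Illustrative sketch of SPKI-based certificate pinning -- NOT OpenAI's
# actual code. After the TLS handshake, the client hashes the DER-encoded
# public key (SPKI) of each certificate in the server's chain and refuses
# the connection unless one of the hashes appears on its allowlist.
import base64
import hashlib

# Hypothetical allowlist of base64-encoded SPKI SHA-256 fingerprints.
ALLOWED_SPKI_PINS = {
    "x+C0kJ2uYxDLS5lLqDkAFQRmwWLeak0Kk1WsiuDRnZ4=",  # example entry
}

class PinningError(Exception):
    """Raised when no certificate in the chain matches a pinned key."""

def spki_pin(spki_der: bytes) -> str:
    """Base64 SHA-256 fingerprint of a DER-encoded public key."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode()

def enforce_pinning(chain_spki_der: list[bytes]) -> None:
    """Abort the connection unless some key in the chain is pinned."""
    if not any(spki_pin(key) in ALLOWED_SPKI_PINS for key in chain_spki_der):
        raise PinningError("no pinned public key in server chain")
```

Under this model, a locally added enterprise root CA fails the check – its key's fingerprint is not on the list – so the client refuses to connect rather than trust the interception proxy.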

The macOS version of the ChatGPT app implements an SSL/TLS certificate pinning mechanism to prevent third parties from peeking into its communications. This means that if the ChatGPT app detects a root certificate that is not in its list of allowed issuers, it displays a message and will not start (see Figure 2).

Figure 2: ChatGPT refuses to start if the presented server certificate is not trusted

While certificate pinning is a powerful technique for strengthening the TLS trust model, it doesn’t just block malicious actors – it also prevents organizations from inspecting their own outbound traffic, since they cannot insert a custom root certificate to decrypt TLS sessions.

This presents a significant challenge for companies focused on preventing data leakage since security teams typically rely on SSL inspection – a capability built into most Secure Web Gateways (SWGs), next-generation firewalls, and other security solutions – to monitor and detect sensitive information leaving the organization.

When certificate pinning is in place, SSL inspection becomes impractical, limiting the organization’s visibility and undermining the ability to enforce its security policies effectively.

Uncovering ChatGPT's Trusted Certificates

Though the mechanism is not documented, we recently uncovered how ChatGPT’s certificate pinning works.

Buried within the macOS ChatGPT binary, we found a list of trusted root certificates that ChatGPT uses to authenticate servers. Specifically, the list consists of the hash of each certificate’s public key (SPKI). Fingerprinting a certificate by its key information is a common and effective practice, as it allows the certificate’s validity to be extended without changing its identifier.
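The practical benefit of keying the fingerprint to the SPKI can be shown with a minimal sketch (the certificate fields below are simulated placeholders, not real certificate data): renewing a certificate with the same key pair changes its serial number and validity dates, but not its pin.

```python
# Illustrative only: a certificate renewed with the same key pair gets a
# new serial number and validity period, but its SPKI SHA-256 fingerprint
# -- the identifier the app pins -- stays the same.
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Base64 SHA-256 fingerprint of a DER-encoded public key (SPKI)."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode()

# Two simulated certificates sharing one key pair (placeholder bytes).
same_key_der = b"simulated-der-encoded-public-key"
cert_old = {"serial": 1, "not_after": "2025-01-01", "spki": same_key_der}
cert_new = {"serial": 2, "not_after": "2026-01-01", "spki": same_key_der}

# Renewal does not change the pinned identifier.
assert spki_pin(cert_old["spki"]) == spki_pin(cert_new["spki"])
```

With a real certificate, the same base64 pin can be computed from the PEM file with the OpenSSL CLI: `openssl x509 -pubkey -noout | openssl pkey -pubin -outform DER | openssl dgst -sha256 -binary | openssl enc -base64`.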

The trusted root certificate list contains the standard root certificates – those you’d expect to find in the Windows or macOS trust stores (such as DigiCert, Let’s Encrypt, and GlobalSign) – which are easy to identify. Beyond these, ChatGPT’s allowlist contains ten additional certificates that do not typically appear in standard certificate stores. Because the list stores only SPKI hashes, identifying these allowed certificates is challenging.

However, Lumia researchers were able to identify half of these certificates and are still searching for the rest (see Table 3). We have been tracking changes to the list over time; the latest app version at the time of writing is 1.2025.210.


Cloudflare ZTNA

The first non-standard root certificate was added to the ChatGPT macOS allowlist on September 3, 2024 (app version 1.2024.240). We identified it as belonging to Cloudflare’s ZTNA solution, likely added to let Cloudflare perform SSL inspection of ChatGPT traffic. However, the certificate expired on February 2, 2025, and was not replaced (expect a blog post on this soon).

As a result, Cloudflare’s ZTNA can no longer perform SSL inspection on ChatGPT for macOS.

Zscaler Internet Access

Three days later, on September 6, 2024, another version of ChatGPT for macOS was released (version 1.2024.247). This version added a new certificate to the list – this time we identified it as belonging to Zscaler and used for SSL inspection in Zscaler Internet Access. Thanks to this inclusion in the ChatGPT macOS allowlist, and as far as we can tell, Zscaler is currently the only Secure Web Gateway solution that can intercept ChatGPT SSL traffic on macOS.

By integrating with various SWG solutions, including Zscaler, Lumia is able to provide visibility into encrypted ChatGPT traffic on macOS as well.

Enterprise-Owned Certificates

On March 14, 2025 (version 1.2025.070), we first observed that OpenAI had added several enterprise-owned (rather than vendor-owned) certificates to the ChatGPT macOS allowlist.

We identified one certificate as belonging to WellPoint (“WellPoint Internal Root CA”). Three months later, on June 10, 2025 (version 1.2025.154), Walmart’s root certificate (“WalmartRootCA-SHA256”) was added, followed by Lowe’s certificate (“MSCAROOT-LOWES”) on June 20, 2025.

Our working assumption is that OpenAI adds enterprise-owned certificates to allow individual organizations, using their own root CAs, to perform SSL inspection on encrypted ChatGPT traffic on macOS.

Five Unidentified Certificates

Five certificates on the allowlist (marked as Unknown in Table 3) remain unidentified, even after a thorough search across various certificate transparency logs. We assume these also belong to individual enterprises and were added through a mechanism similar to the one used for the enterprise-owned certificates we did identify.

If you manage to identify any of them, feel free to drop us a note at info@lumia.security.

Bottom Line: Certificate Pinning Hurts Organizations

This is not a vulnerability or security issue – we haven’t found any way for an attacker to exploit OpenAI’s pinning exemption mechanism to gain access or bypass validation.

As demonstrated, however, certificate pinning is a poor practice, especially in environments where data leakage is a concern and monitoring or governance is necessary. Furthermore, allowing certificate pinning exceptions for large enterprises while offering no option for smaller organizations to monitor traffic is unfair and widens the privacy gap. OpenAI should either remove certificate pinning from the native Mac app entirely or avoid allowing exceptions of any kind. Even the Compliance API – OpenAI’s alternative monitoring mechanism – is limited to the costly Enterprise tier, accessible only to large organizations.

| SPKI SHA-256 | Certificate Name | Time Added |
| --- | --- | --- |
| x+C0kJ2uYxDLS5lLqDkAFQRmwWLeak0Kk1WsiuDRnZ4= | Cloudflare for Teams ECC Certificate Authority | September 3, 2024 (v1.2024.240) |
| o9Q4YKkmWqA5TM5bF6b06LPxPfv2Oa4PVY5VtTVJJ3k= | Zscaler Root CA | September 6, 2024 (v1.2024.247) |
| VyPDgz7oetjwXvN4hSK/SKdsSMiuDw9rnZlej7yiD2o= | Unknown | September 27, 2024 (v1.2024.268) |
| Xxhhbjq3NEImgVIfGzx+Ubl6ebJS1kwRjV/2YlHJ3Zs= | Unknown | September 27, 2024 (v1.2024.268) |
| 7LXacUIOxheqMpwcwCIc2JZSVNjHLEo2Avbn1i5AQ8w= | Unknown | September 27, 2024 (v1.2024.268) |
| 6NxkZR5oFt9XLCvgX4pz9b8b/YgzH3ZodIjE3LFGGGc= | Unknown | February 19, 2025 (v1.2025.043) |
| LdSH9FFXssRXCky9fBfZkelEXY55eYitehEwbnZjGmE | WellPoint Internal Root CA | March 14, 2025 (v1.2025.070) |
| uhnd8BgEkdZwZAKq01QRZuBNn4zMVS+oA9i0HdWd0CA= | WalmartRootCA-SHA256 | June 10, 2025 (v1.2025.154) |
| T3fJD2sDHyhSUzz6sseYEd/Ix4lWXky6FC+3jU3n/HQ= | Unknown | June 10, 2025 (v1.2025.154) |
| k90HHLJe0oFAgIXcv7O8tX6T6ffbR46Bn67hHvQkVlg | MSCAROOT-LOWES | June 20, 2025 (v1.2025.168) |

Table 3: ChatGPT Mac app allows ten non-standard root certificates to bypass pinning

Blocking AI apps is not an option anymore. Adopt AI. Safely. Reach out today to learn more.
