Third-party scanner: Protect AI
Protect AI’s Guardian catches pickle, Keras, and other exploits, as detailed on their Knowledge Base page. Guardian also benefits from reports submitted by Protect AI’s community of bounty hunters on huntr.
Example of a report for danger.dat
We have partnered with Protect AI to provide scanning and make the Hub safer. Files in public repositories are scanned by Guardian in the same way they are scanned by our internal scanning system.
Our frontend has been redesigned specifically to accommodate new scanners:
Here is an example repository you can check out to see the feature in action: mcpotato/42-eicar-street.
Model security refresher
To share models, we serialize the data structures used to interact with them, which facilitates storage and transport. Some serialization formats are vulnerable to nasty exploits, such as arbitrary code execution (looking at you, pickle), making sharing models potentially dangerous.
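To make the pickle risk concrete, here is a minimal sketch of why loading an untrusted pickle file is dangerous. The `Malicious` class name is illustrative; the mechanism (`__reduce__` returning a callable that pickle invokes at load time) is standard pickle behavior:

```python
import pickle

# pickle calls __reduce__ when serializing; the callable it returns is
# executed at *load* time, on the machine that opens the file.
class Malicious:
    def __reduce__(self):
        # eval is used here as a harmless stand-in; an attacker could just
        # as easily return (os.system, ("...",)).
        return (eval, ("40 + 2",))

payload = pickle.dumps(Malicious())

# Merely loading the blob runs the attacker-chosen callable:
result = pickle.loads(payload)
print(result)  # 42
```

This is why scanners inspect pickle opcodes for suspicious imports and calls rather than trusting the file's contents, and why formats like safetensors, which store only raw tensor data, avoid the problem entirely.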
As Hugging Face has become a popular platform for model sharing, we’d like to protect the community from this, which is why we have developed tools like picklescan and why we integrate third-party scanners.
Pickle is not the only exploitable format out there; see, for reference, how Keras Lambda layers can be abused to achieve arbitrary code execution.