Reverse Engineering the AI Supply Chain: Why Regex Won't Save Your PyTorch Models

Written by arseniibr | Published 2026/01/14
Tech Story Tags: ai | ai-in-cybersecurity | devops-tools | machine-learning | malware-threat | pytorch | open-source | reverse-engineering

TL;DR: Veritensor is an open-source tool that secures the entire lifecycle of an AI model. It detects RCE malware in Pickle files using opcode-level stack emulation, verifies hashes against Hugging Face to prevent tampering, checks for restrictive licenses (such as Non-Commercial), and cryptographically signs your containers. Here is how to use it.

We treat AI models like data assets. We version them, we store them in S3, and we cache them. But technically, a PyTorch model (.pt) or a Pickle file (.pkl) is not data. It is a program.

And right now, MLOps pipelines are blindly executing these programs with full privileges.

I built Veritensor, an open-source security scanner, to solve this. Here is a deep dive into why simple scanning fails and how we implemented a proper defense using Abstract Interpretation.

The Attack Vector: Pickle is a VM

The pickle protocol is a stack-based virtual machine. It has opcodes to push data onto a stack, call functions (REDUCE), and manage a memo table (PUT, GET, MEMOIZE).
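This is not a metaphor: the standard library ships a disassembler for this VM. Dumping even a harmless object reveals the opcode stream that the loader will execute:

```python
import pickle
import pickletools

# Disassemble a harmless dict pickle: every .pkl file is a stream of
# opcodes run by a stack machine, not a passive data blob.
pickletools.dis(pickle.dumps({"weights": [1.0, 2.0]}))
```

The output lists opcodes such as PROTO, EMPTY_DICT, and SHORT_BINUNICODE, each with its position in the byte stream.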

A naive attacker writes this:

import os

class Virus:
    # __reduce__ tells pickle how to "reconstruct" this object.
    # Here it instructs the loader to call os.system("rm -rf /").
    def __reduce__(self):
        return (os.system, ("rm -rf /",))

A naive defender writes a Regex scanner:

if "os.system" in file_content:
    alert("Virus!")

Why Regex Fails (The Obfuscation Problem)

A sophisticated attacker knows you are grepping for os and system. So they use the STACK_GLOBAL opcode to assemble the function name dynamically at runtime.

Instead of importing os, they do this (conceptually):

  1. Push string "o"
  2. Push string "s"
  3. Concatenate -> "os"
  4. Import module by name from stack.

The string "os" never appears in the file as a contiguous block. Your Regex scanner sees nothing. The model loads, the VM executes the assembly, and you get pwned.
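To make this concrete, here is a hand-assembled payload sketch (the bytes and the "echo pwned" command are illustrative). In this simplified version the module and attribute still appear as separate short strings, but the dotted name os.system is only resolved on the stack at load time, so a scanner grepping for the contiguous token "os.system" sees nothing:

```python
import pickletools

# Hand-assembled pickle that calls os.system("echo pwned") via STACK_GLOBAL.
# The module ("os") and attribute ("system") travel as separate stack
# strings; the dotted name "os.system" never appears contiguously.
payload = (
    b"\x80\x04"             # PROTO 4
    b"\x8c\x02os"           # SHORT_BINUNICODE "os"
    b"\x8c\x06system"       # SHORT_BINUNICODE "system"
    b"\x93"                 # STACK_GLOBAL -> resolves os.system at load time
    b"\x8c\x0aecho pwned"   # SHORT_BINUNICODE "echo pwned"
    b"\x85"                 # TUPLE1 -> ("echo pwned",)
    b"R"                    # REDUCE -> calls os.system("echo pwned") on load
    b"."                    # STOP
)

assert b"os.system" not in payload  # the naive scanner's pattern misses it
pickletools.dis(payload)            # disassemble safely, without loading
```

pickletools.dis walks the opcodes without ever executing REDUCE, which is exactly the property a static scanner needs.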

The Solution: Static Analysis via Stack Emulation

To catch this, Veritensor doesn't just read the file. It emulates the Pickle VM.

We wrote an engine that iterates through the opcodes (PROTO, BINUNICODE, STACK_GLOBAL, etc.) and maintains a virtual stack. We don't execute the functions, but we track what is being called.

When the scanner sees STACK_GLOBAL, it looks at the virtual stack to see what module and function are being requested. Even if the strings were constructed dynamically, the emulator sees the final result: os.system.

This allows us to enforce a Strict Allowlist policy. If a model tries to import anything outside of torch, numpy, or collections, Veritensor kills it before it executes.
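A toy version of the idea can be built on the standard library's pickletools. This is an illustrative sketch, not Veritensor's actual engine: it models only the handful of opcodes needed for this one check, whereas a real emulator handles the full instruction set, and the allowlist here is just an example policy:

```python
import pickletools

ALLOWED_MODULES = {"torch", "numpy", "collections"}  # example allowlist policy

def find_disallowed_imports(data: bytes) -> list[str]:
    """Walk the opcode stream with a virtual stack and resolve every
    global lookup, even when the name was assembled dynamically."""
    stack: list[str] = []
    findings: list[str] = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            stack.append(arg)                        # track pushed strings
        elif opcode.name == "STACK_GLOBAL" and len(stack) >= 2:
            attr, module = stack.pop(), stack.pop()  # the VM pops name, module
            qualname = f"{module}.{attr}"
            if module.split(".")[0] not in ALLOWED_MODULES:
                findings.append(qualname)
            stack.append(qualname)
        elif opcode.name == "GLOBAL":                # older protocols: "module name"
            module, attr = arg.split(" ", 1)
            if module.split(".")[0] not in ALLOWED_MODULES:
                findings.append(f"{module}.{attr}")
            stack.append(f"{module}.{attr}")
    return findings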

Beyond Malware: The Integrity Problem

Scanning for malware is step one. Step two is ensuring the file hasn't been tampered with (MITM attacks) or corrupted.

Veritensor implements a Hash-to-API verification.

  1. It calculates the SHA256 of your local artifact.
  2. It queries the Hugging Face Hub API for the official manifest of the repository you think you are using.
  3. It compares the hashes.

If you downloaded bert-base-uncased but the hash doesn't match Google's official release, Veritensor blocks the deployment. This protects against "Typosquatting" models that mimic popular architectures but contain backdoors.
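The local half of that check is plain hashlib. Here is a sketch in which expected_sha256 stands in for the digest fetched from the Hub's file manifest; the API call itself is elided:

```python
import hashlib

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so multi-GB checkpoints fit in RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verify_artifact(path: str, expected_sha256: str) -> None:
    """Block deployment when the local artifact diverges from the
    officially published digest (tampering, typosquat, or corruption)."""
    actual = sha256_file(path)
    if actual != expected_sha256:
        raise RuntimeError(
            f"hash mismatch for {path}: expected {expected_sha256}, got {actual}"
        )
```

Streaming matters in practice: loading a 10 GB checkpoint into memory just to hash it would defeat the purpose of a lightweight pipeline gate.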

Supply Chain Trust (Sigstore)

Finally, once a model is scanned and verified, we need to ensure it stays that way. Veritensor integrates with Sigstore Cosign.

If the scan passes (PASS), the tool uses your private key to sign the Docker container containing the model. The signature includes metadata:

{ 
  "scanned_by": "veritensor", 
  "scan_date": "2025-01-14T12:00:00Z", 
  "status": "clean" 
}

Your Kubernetes Admission Controller can then verify this signature and reject any unsigned or "stale" images.
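With the cosign CLI, that flow looks roughly like the following. The image name, key paths, and annotation keys are illustrative; cosign's -a flag attaches annotations at signing time and requires them at verification time:

```
# Sign the image and attach scan metadata as annotations
cosign sign --key cosign.key \
  -a scanned_by=veritensor \
  -a status=clean \
  registry.example.com/ml/model-server@sha256:<digest>

# Admission-time check: verify the signature AND require the annotation
cosign verify --key cosign.pub \
  -a status=clean \
  registry.example.com/ml/model-server@sha256:<digest>
```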

Try it out

Veritensor is fully open source (Apache 2.0). It supports PyTorch, Keras (detects Lambda layer injections), Safetensors, and GGUF.

pip install veritensor

GitHub: https://github.com/ArseniiBrazhnyk/Veritensor

I’d love to hear your feedback on the detection logic or edge cases you've encountered with Pickle files.


Written by arseniibr | Finance student with an interest in AI, BigData and security
Published by HackerNoon on 2026/01/14