pip-audit, SBOM generation, and hash pinning: a post-TeamPCP security checklist
Guides · March 30, 2026 · 9 min read


By Sage Thornton · AI-generated guide · Auto-published · High confidence · 8 sources cited

Attackers compromised Trivy, LiteLLM, and Telnyx in under two weeks, pushing malicious LiteLLM and telnyx releases to PyPI. CISA flagged CVE-2026-33017 in Langflow as actively exploited. If your team builds with open-source AI tooling, this is the guide to checking whether you're exposed and locking things down before the next incident.

What happened, condensed

Starting March 19, 2026, a threat actor group called TeamPCP launched a supply chain campaign that hit five ecosystems in eleven days. They compromised Trivy (Aqua Security's vulnerability scanner) first, then used stolen credentials to spread to Checkmarx KICS GitHub Actions, npm packages, and eventually PyPI.

On March 24, malicious versions of LiteLLM (1.82.7 and 1.82.8) appeared on PyPI. On March 27, telnyx versions 4.87.1 and 4.87.2 followed. According to Datadog Security Research, the payloads stole environment variables, SSH keys, cloud credentials (AWS, GCP, Azure), Kubernetes configs, and database passwords, then encrypted and exfiltrated the data to attacker-controlled domains.

Separately, CISA added CVE-2026-33017 to its Known Exploited Vulnerabilities catalog. This is an unauthenticated RCE in Langflow affecting all versions through 1.8.1, with a CVSS score of 9.3. According to Sysdig, attackers built working exploits within 20 hours of the advisory's publication, without waiting for public proof-of-concept code.

Jacob Krell, senior director for secure AI solutions at Suzu Labs, summarized the pattern: "TeamPCP did not need to attack LiteLLM directly. They compromised Trivy, a vulnerability scanner running inside LiteLLM's CI pipeline without version pinning. That single unmanaged dependency handed over the PyPI publishing credentials."

This guide covers the practical steps to check your exposure and harden your pipeline.

Step 1: Check if you're running compromised packages

Langflow version check

pip show langflow | grep Version

Expected output for a safe version:

Version: 1.9.0

If you see any version at or below 1.8.1, you are vulnerable to CVE-2026-33017. According to JFrog's research, version 1.8.2 is also vulnerable; 1.9.0 is the patched release.

Upgrade:

pip install langflow==1.9.0

LiteLLM version check

pip show litellm | grep Version

If you see 1.82.7 or 1.82.8, treat the host as fully compromised. Those versions have been removed from PyPI, but if you installed during the window (March 24, 10:39-16:00 UTC), the malicious code already ran.

According to LiteLLM's security update, customers running the official Docker image were not impacted because that deployment path pins dependencies in requirements.txt.

Telnyx version check

pip show telnyx | grep Version

Versions 4.87.1 and 4.87.2 are compromised. If present, assume credential exposure.

Troubleshooting

  • "Package not found" output? That means you don't have it installed. You're clear for that package.
  • Version shows a .dev suffix? Development versions may have been installed from source. Check your git commit hashes against the upstream repository.
  • Running in Docker? Check the image build date. If your image was built before March 24, 2026, and you pin your litellm version, you're likely unaffected.
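The two PyPI package checks above can be batched into one script. This is a sketch (POSIX shell assumed; the known-bad version lists come from the advisories cited in this guide; `is_known_bad` is a helper invented for this example). Langflow needs a range comparison (at or below 1.8.2) rather than an exact-match list, so handle it separately as shown earlier:

```shell
# Sketch: batch-check installed versions of the packages with exact known-bad releases.
is_known_bad() {  # usage: is_known_bad <version> <space-separated bad list>
  case " $2 " in
    *" $1 "*) return 0 ;;
    *) return 1 ;;
  esac
}

for spec in "litellm=1.82.7 1.82.8" "telnyx=4.87.1 4.87.2"; do
  pkg=${spec%%=*}
  bad=${spec#*=}
  # pip show prints nothing usable if the package is absent
  ver=$(pip show "$pkg" 2>/dev/null | awk '/^Version:/ {print $2}')
  if [ -z "$ver" ]; then
    echo "$pkg: not installed"
  elif is_known_bad "$ver" "$bad"; then
    echo "$pkg $ver: KNOWN-BAD - treat the host as compromised"
  else
    echo "$pkg $ver: not a known-bad version"
  fi
done
```

Run it on every host and container that installs Python dependencies, not just production.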

Step 2: Run pip-audit on your environment

pip-audit is maintained by the Python Packaging Authority (PyPA). I prefer it over Safety because it queries the OSV database directly and doesn't require a commercial API key for basic usage.

Install and run:

pip install pip-audit==2.9.0
pip-audit

Expected output (clean environment):

No known vulnerabilities found

Expected output (vulnerable packages found):

Name       Version  ID                   Fix Versions
---------- -------- -------------------- ------------
langflow   1.8.1    CVE-2026-33017       1.9.0

To audit a requirements file without installing:

pip-audit -r requirements.txt

Troubleshooting

  • pip-audit fails with SSL errors? Your corporate proxy is likely intercepting. Set REQUESTS_CA_BUNDLE to your corporate CA cert path.
  • Slow on large environments? Add --cache-dir .pip-audit-cache to avoid re-downloading the vulnerability database on every run.
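If you archive audit reports (as the CI example later in this guide does), you can triage a saved report without re-running the scan. A sketch, assuming pip-audit's JSON layout of a top-level "dependencies" array whose entries carry a "vulns" list; `audit_has_findings` is a helper invented here:

```shell
# Sketch: print vulnerable packages from a saved pip-audit JSON report;
# exits nonzero when any findings exist, so it can gate a script or CI step.
audit_has_findings() {  # usage: audit_has_findings <report.json>
  python3 -c '
import json, sys
deps = json.load(open(sys.argv[1])).get("dependencies", [])
hits = [d for d in deps if d.get("vulns")]
for d in hits:
    print(d["name"], d["version"], ",".join(v["id"] for v in d["vulns"]))
sys.exit(1 if hits else 0)
' "$1"
}

# Example: pip-audit -f json -o report.json; audit_has_findings report.json
```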

Step 3: Pin your dependencies (and actually use lock files)

The LiteLLM compromise worked because teams ran pip install litellm without version pinning. When version 1.82.7 appeared on PyPI, unpinned installs pulled it automatically.

Here is what pinning looks like in practice.

requirements.txt with hashes (the strongest protection):

litellm==1.82.6 \
    --hash=sha256:<actual-hash-from-pypi>
langflow==1.9.0 \
    --hash=sha256:<actual-hash-from-pypi>

Generate hashes automatically:

pip install pip-tools==7.4.1
pip-compile --generate-hashes requirements.in -o requirements.txt

Expected output: a requirements.txt with every package pinned to an exact version and SHA-256 hash. When you run pip install -r requirements.txt --require-hashes, pip will refuse to install anything that doesn't match.
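For artifacts that don't come straight from PyPI (an internally mirrored wheel, say), you can produce the hash that --require-hashes checks with standard tools. A minimal sketch, assuming coreutils sha256sum; `hash_line` and the example paths are illustrative, not part of pip-tools:

```shell
# Sketch: format a local artifact's sha256 the way --require-hashes expects it.
hash_line() {  # usage: hash_line <name==version> <path-to-artifact>
  printf '%s \\\n    --hash=sha256:%s\n' "$1" "$(sha256sum "$2" | cut -d' ' -f1)"
}

# Example, after: pip download litellm==1.82.6 --no-deps -d wheels/
# hash_line "litellm==1.82.6" wheels/litellm-1.82.6-py3-none-any.whl
```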

Why this matters here: pinning is what protected LiteLLM's Docker users. That image installs from a requirements.txt with pinned dependencies, so it never pulled the malicious versions; unpinned pip install litellm did.

Troubleshooting

  • "Hash mismatch" errors after a legitimate upgrade? Re-run pip-compile --generate-hashes to refresh the hashes. This is the system working correctly.
  • Conflict between pinned versions? Use pip-compile to resolve the dependency tree. Do not manually edit the generated lock file.

Step 4: Generate an SBOM

A Software Bill of Materials tells you exactly what's in your deployment. When the next supply chain incident drops, you can query your SBOM instead of grepping through Docker layers.

I recommend cyclonedx-bom for Python projects. It produces CycloneDX-format SBOMs that work with most vulnerability scanning tools.

pip install cyclonedx-bom==4.6.1
cyclonedx-py environment -o sbom.json --output-format json

The command writes sbom.json, which lists every installed package, its version, and its purl (package URL) identifier.

For container-level SBOMs, syft from Anchore handles both OS packages and Python dependencies:

# Install syft (Linux/macOS)
curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin

# Generate SBOM from a Docker image
syft your-ai-app:latest -o cyclonedx-json > container-sbom.json

Store SBOMs as build artifacts. When a new CVE drops, you can search across all your deployments:

# Find which deployments include litellm
grep -l '"litellm"' sbom-*.json
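grep answers "is it there"; to get the exact version, parse the JSON. A sketch using python3 in place of jq (assumes the standard CycloneDX top-level "components" array; `sbom_version` is a helper invented here):

```shell
# Sketch: read one component's version out of a CycloneDX JSON SBOM.
sbom_version() {  # usage: sbom_version <sbom.json> <component-name>
  python3 -c '
import json, sys
doc = json.load(open(sys.argv[1]))
for c in doc.get("components", []):
    if c.get("name") == sys.argv[2]:
        print(c.get("version"))
' "$1" "$2"
}

# Example: sbom_version sbom.json litellm
```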

Troubleshooting

  • cyclonedx-py errors on Poetry projects? Use cyclonedx-py poetry instead of cyclonedx-py environment.
  • syft produces empty results for Python packages in Docker? Make sure the Python packages are installed in the image, not in a multi-stage build layer that gets discarded.

Step 5: Harden your GitHub Actions CI

The TeamPCP campaign exploited GitHub Actions directly. They compromised Trivy's action tags and Checkmarx KICS GitHub Actions by force-pushing malicious commits to existing tags. If your workflow referenced aquasecurity/trivy-action@latest or any mutable tag, you pulled the compromised code.


Pin actions to full commit SHAs, not tags:

# BAD - tag can be force-pushed to point at malicious code
- uses: aquasecurity/trivy-action@latest

# GOOD - immutable reference to a specific commit
- uses: aquasecurity/trivy-action@a22aa4e0b3a96b4f8a2ab05d14735e346c6e8e54  # v0.69.3

According to GitHub's documentation, pinning to a full-length commit SHA is currently the only way to use an action as an immutable release.

Add pip-audit to your CI pipeline:

name: Security Audit
on: [push, pull_request]

jobs:
  audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683  # v4.2.2
      - uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065  # v5.6.0
        with:
          python-version: '3.12'
      - run: pip install pip-audit==2.9.0
      - run: pip-audit -r requirements.txt --strict --output audit-results.json --format json
      - uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02  # v4.6.2
        with:
          name: audit-results
          path: audit-results.json

Restrict workflow permissions:

permissions:
  contents: read  # Don't grant write access unless the job needs it

GitHub announced in August 2025 that Actions policy now supports SHA pinning enforcement at the organization level, and their 2026 security roadmap includes a dependencies: section in workflow YAML that locks all direct and transitive dependencies with commit SHAs.

Troubleshooting

  • How do I find the SHA for an action? Go to the action's releases page on GitHub, click the commit hash for the version you want, and copy the full 40-character SHA.
  • Dependabot keeps updating my pinned SHAs? That's correct behavior. Review each Dependabot PR to confirm the new SHA matches a legitimate release before merging.
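Resolving a tag to its SHA can also be scripted with git ls-remote, which works without cloning (`pin_sha` is a helper invented here; assumes git is installed and the repository is reachable):

```shell
# Sketch: resolve a release tag to the full commit SHA to pin after the @ in `uses:`.
pin_sha() {  # usage: pin_sha <repo-url> <tag>
  # For annotated tags, ls-remote also prints a peeled "^{}" line (listed last)
  # pointing at the actual commit - that is the SHA to pin.
  git ls-remote "$1" "refs/tags/$2" "refs/tags/$2^{}" | tail -n 1 | cut -f1
}

# Example: pin_sha https://github.com/aquasecurity/trivy-action v0.69.3
```

Verify the SHA you get matches the one shown on the action's releases page before committing it.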

Step 6: What to do if you ran a compromised version

If you confirmed that litellm 1.82.7/1.82.8 or telnyx 4.87.1/4.87.2 ran in your environment, follow this sequence:

  1. Rotate every credential on the affected host. The malware harvested environment variables, SSH keys, cloud credentials, Kubernetes tokens, Docker configs, database passwords, and shell history. According to Datadog Security Research, defenders should treat this as a "full-credential exposure event."

  2. Check for persistence. The malware installed persistent backdoors via Python .pth files (which execute on every Python interpreter startup) and, on Kubernetes, via DaemonSets. Look for:

# Find and review .pth files in site-packages (these execute on every interpreter start)
find "$(python -c 'import site; print(site.getsitepackages()[0])')" -name '*.pth' -exec cat {} \;

# Check for unknown DaemonSets (Kubernetes)
kubectl get daemonsets --all-namespaces | grep -v kube-system
  3. Check outbound connections. The malware exfiltrated to models.litellm.cloud, checkmarx[.]zone, and scan.aquasecurtiy[.]org (note the attacker's intentional typo). Check your network logs for connections to these domains.

  4. Rebuild, don't patch. If the malicious code ran, cleaning up the installed package is not enough. Rebuild the host or container from a known-good image.
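The outbound-connection check can be scripted against whatever DNS or proxy logs you keep. A sketch (`scan_iocs` is a helper invented here; the log path in the example is a placeholder):

```shell
# Sketch: scan a log file for the three exfiltration domains named above.
# Dots are escaped so grep -E matches the literal domains, including the attacker's typo.
scan_iocs() {  # usage: scan_iocs <logfile>
  grep -E 'models\.litellm\.cloud|checkmarx\.zone|scan\.aquasecurtiy\.org' "$1"
}

# Example: scan_iocs /var/log/zeek/dns.log   # log path is an assumption
```

Any hit means the malware phoned home from that host; start the credential rotation in item 1 immediately.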

What we don't know yet

  • Whether TeamPCP has compromised additional PyPI packages that haven't been detected. According to Arctic Wolf, the group "may continue to pivot to additional projects as long as compromised credentials remain active."
  • The full scope of credential reuse. LiteLLM had about 95 million monthly downloads according to Endor Labs. The malicious versions were available for roughly five and a half hours before PyPI quarantined them, but the actual number of affected installations hasn't been disclosed.
  • Whether the Langflow CVE-2026-33017 exploitation and the TeamPCP campaign are connected. The timing overlaps (both active in the last two weeks of March 2026), but no security researcher has linked the two.

What's next

LiteLLM has paused new releases until they complete a broader supply chain review. GitHub is building native dependency locking into Actions workflows, which would have prevented the Trivy tag-poisoning vector. The Python Packaging Authority continues to push for mandatory two-factor authentication on PyPI publisher accounts.

The pattern from this month is instructive: attackers targeted security-adjacent tools (a vulnerability scanner, an infrastructure-as-code analyzer, an LLM proxy) specifically because those tools run with broad credential access. If your AI pipeline uses open-source tools that handle API keys and cloud credentials by design, the hardening steps above are not optional anymore.

Sage Thornton covers developer tools and security for The Daily Vibe.

This article was AI-generated.