Deployment

Your Code, Your Rules

Run TraceMint self-hosted (air-gapped) or as managed SaaS, with no third-party LLM APIs either way. Choose how you want to run TraceMint while keeping full control of your code and findings.

DEPLOYMENT OPTIONS

Choose Your Deployment Model

Both options provide the same powerful detection engine with identical results. The difference is where the analysis runs and who manages the infrastructure.

🏢

Self-Hosted (Air-Gapped)

Full Control

Deploy on your infrastructure for maximum security and compliance. Perfect for regulated industries, government, and sensitive codebases.

  • Code never leaves your network
  • Offline model inference (32B LLM)
  • Zero internet requirement
  • Local rule updates
  • Full data sovereignty

☁️

Managed SaaS

Zero Ops

Let us handle the infrastructure while you focus on fixing vulnerabilities. Enterprise-grade isolation with no third-party LLM APIs.

  • No third-party LLM APIs
  • Dedicated tenant / isolated runtime
  • Private LLM inference per customer
  • Automatic updates & maintenance
  • SOC 2-ready infrastructure

Both Options Include

✓ Deterministic proofs ✓ Exportable evidence chains ✓ Same detection engine ✓ No external LLM calls

ARCHITECTURE

How It Works

Understanding the data flow and isolation model for each deployment option.

Self-Hosted Data Flow

πŸ“ Your Code
β†’
πŸ” Scanner (On-Prem)
β†’
πŸ€– Local LLM
β†’
β—ˆ Findings (Local)

πŸ”’ Everything stays within your network boundary

Managed SaaS Data Flow

πŸ“ Your Code
β†’
πŸ” Secure Upload
β†’
πŸ€– Private LLM
β†’
β—ˆ Findings

πŸ”’ Dedicated runtime per customer, no shared LLM context

Data Retention Policy

  • Source Code: deleted after scan (configurable)
  • Findings: retained until deleted by the user
  • Audit Logs: 90 days (configurable up to 2 years)
  • LLM Context: never persisted; cleared after each scan

FAQ

Common Questions

Do you use OpenAI, Anthropic, or other third-party LLM APIs?

No. We run a private, locally hosted model fine-tuned in-house specifically for vulnerability localization and ranking. No external LLM API calls are ever made. Your code stays on your infrastructure (self-hosted) or in your isolated tenant (SaaS).

Can I run TraceMint completely offline?

Yes. The self-hosted option supports full air-gapped deployment. After initial setup, no internet connection is required for scanning or analysis.

How is customer data isolated in the SaaS version?

Each customer gets a dedicated runtime environment. LLM inference is isolated per tenant, and no customer code or context is ever shared across accounts.

Can I delete my data at any time?

Yes. Both deployment options support full data deletion on request. For SaaS, you can delete projects, findings, and all associated data through the dashboard or API.
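
As a minimal sketch of what an API-driven deletion could look like: the base URL, endpoint path, and auth header below are illustrative assumptions, not TraceMint's documented API.

```python
# Hypothetical sketch of deleting a project (and all associated data) via
# a REST API. The base URL, path, and Authorization header are illustrative
# assumptions; consult your tenant's API documentation for the real ones.
from urllib.request import Request

API_BASE = "https://api.example.com/v1"  # placeholder host (assumption)

def build_delete_request(project_id: str, token: str) -> Request:
    """Construct an authenticated DELETE request for a project."""
    return Request(
        url=f"{API_BASE}/projects/{project_id}",
        method="DELETE",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_delete_request("proj-123", "YOUR_API_TOKEN")
print(req.get_method(), req.full_url)
# → DELETE https://api.example.com/v1/projects/proj-123
```

To actually send the request, you would pass it to `urllib.request.urlopen` (or any HTTP client) against your tenant's real endpoint.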

Do you offer on-premise support?

Yes. Enterprise customers with self-hosted deployments can purchase premium support packages that include on-site installation assistance and dedicated support engineers.

Ready to deploy on your terms?

Contact us to discuss which deployment option works best for your organization.