In one sentence: PCCI lets you use AI through the same API you already know — but nobody, including us, can see your data. Not our servers, not our staff, not the infrastructure provider. The hardware itself enforces this, and you can verify it cryptographically.

The Problem

When you use an AI API today, your data — prompts, files, conversations — travels through servers you don’t control. The provider can read everything. Even with HTTPS, your data sits in plaintext on their infrastructure once it arrives. For many teams, this is acceptable. But if you work with patient records, financial data, legal documents, trade secrets, or any information that regulations or common sense say should stay private — standard AI APIs require a leap of faith. PCCI removes the leap of faith.

What PCCI Does

PCCI (Prem Confidential Compute Infrastructure) gives you the same AI capabilities you’d get from any OpenAI-compatible API but with a fundamental difference: your data is encrypted from end to end, and processing happens inside tamper-proof hardware that nobody can access.

Encrypted in Transit

Data is encrypted on your device before it leaves. The network only carries scrambled bytes.

Encrypted at Rest

Files and keys are stored encrypted. No plaintext ever touches our storage.

Encrypted in Use

Processing happens inside hardware-sealed environments called Trusted Execution Environments (TEEs). Even someone with physical access to the server cannot extract your data.

How It Compares

| | Standard AI API | PCCI |
| --- | --- | --- |
| Who can see your data? | The provider, their staff, potentially their cloud host | Only you |
| What if the server is compromised? | Your data is exposed | Attacker sees only encrypted bytes |
| How do you know it’s private? | You trust the provider’s privacy policy | You verify it with hardware-signed cryptographic proof |
| What about future threats? | Vulnerable if quantum computers break current encryption | Protected today by quantum-resistant algorithms |
| What about the cloud provider? | They can inspect server memory | Hardware isolation prevents this, regardless of who owns the machine |
| API compatibility | OpenAI-compatible | Same OpenAI-compatible interface — no rewrite needed |

Who Is This For?

Regulated Industries

Healthcare, finance, legal, and government organizations that must meet HIPAA, GDPR, nFADP, SOC 2, or internal compliance mandates when using AI.

AI Application Builders

Development teams building products on LLMs who need to guarantee data privacy to their users — without building custom infrastructure.

Security-First Organizations

Any team where data privacy is a hard technical requirement, not a policy checkbox. Where “we promise we won’t look” isn’t good enough.

Enterprises with Sensitive IP

Companies that want to use AI for internal documents, proprietary code, or strategic planning — without exposing that information to third parties.

What You Can Do Today

PCCI is a drop-in replacement for OpenAI-compatible APIs. If you’ve used ChatGPT, GPT-4, or any OpenAI-compatible service, you already know the interface. Everything works the same way — just encrypted:
  • Chat with AI models — Streaming conversations, multi-step reasoning, tool use
  • Transcribe and translate audio — Upload recordings, get text back
Every one of these capabilities runs entirely inside hardware-sealed environments on our own confidential infrastructure. All models are self-hosted — your data never leaves our confidential virtual machines (CVMs), and no requests are forwarded to third-party AI providers. Your data never exists in plaintext outside your device and the sealed processing environment.
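Streaming works the way it does with any OpenAI-compatible SDK: when `stream: true` is set, the create call returns an async iterable of chunks. A minimal sketch of consuming such a stream, assuming the standard OpenAI chunk shape — the mock generator below stands in for what a real PCCI response would yield:

```typescript
// Sketch: consuming a streamed chat completion, assuming the
// OpenAI-compatible chunk shape ({ choices: [{ delta: { content } }] }).
type Chunk = { choices: { delta: { content?: string } }[] };

async function collectStream(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    // Each chunk carries a small delta of the assistant's reply.
    text += chunk.choices[0]?.delta?.content ?? "";
  }
  return text;
}

// Mock stream standing in for what
// client.chat.completions.create({ ..., stream: true }) would return.
async function* mockStream(): AsyncIterable<Chunk> {
  yield { choices: [{ delta: { content: "Hello, " } }] };
  yield { choices: [{ delta: { content: "privately." } }] };
}

collectStream(mockStream()).then((text) => console.log(text));
```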

Getting Started

Whether you’re a developer integrating PCCI into an application or evaluating it for your organization, the path is straightforward:
1. Install the SDK

npm install @premai/pcci-sdk-ts
2. Create a client with your encryption key

import { createRvencClient } from "@premai/pcci-sdk-ts";

const client = await createRvencClient({
  apiKey: "your-api-key",
  clientKEK: "your-master-key", // You generate this. We never see it.
});
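The `clientKEK` (key-encryption key) is generated by you and never transmitted. A minimal sketch of generating one locally with Node’s built-in crypto, assuming a base64-encoded 256-bit key — check the SDK reference for the exact format it expects:

```typescript
import { randomBytes } from "node:crypto";

// Generate a 256-bit master key on your own machine. Store it somewhere
// durable (e.g. a secrets manager) — losing it means losing access to
// data encrypted under it.
const clientKEK = randomBytes(32).toString("base64");
console.log(clientKEK);
```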
3. Use it exactly like OpenAI

const response = await client.chat.completions.create({
  model: "your-model",
  messages: [{ role: "user", content: "Hello, privately." }],
});
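Tool use follows the same OpenAI-compatible function-calling shape. A sketch of declaring a tool, assuming standard OpenAI semantics — the `get_weather` name and its schema are illustrative, not part of PCCI:

```typescript
// Sketch: a tool declaration in the OpenAI-compatible function-calling
// format, passed as `tools` in client.chat.completions.create({ ... }).
const tools = [
  {
    type: "function" as const,
    function: {
      name: "get_weather", // illustrative tool name
      description: "Look up current weather for a city",
      parameters: {
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    },
  },
];
console.log(tools[0].function.name);
```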
Already using Python, Go, or another language? The SDK includes a local proxy server — point your existing OpenAI client at it, and encryption happens transparently. Zero code changes.

How It Works

Understand the architecture — what each component does and how data flows through the system.

Security Model

Deep dive into the trust model — TEEs, attestation, threat model, and honest limitations.

Developer Experience

Integration guide — SDK options, capabilities, code examples, and API reference.

Platform Status

What ships today, what’s not ready yet, and what’s on the roadmap.
Ready to jump straight in? Go to the Quickstart guide.