Last updated: February 2026
Ghost AI is built on a fundamental principle: your data belongs to you. We engineered Ghost so that we cannot access your data — not because of policy, but because of architecture. All AI inference, memory storage, voice processing, and routing happen entirely on your device.
Ghost collects zero personal data. No conversations, no voice recordings, no memory contents, no usage patterns, no analytics, no device identifiers, and no location data are transmitted to any server.
All AI inference uses models running locally on your device. The language models (Gate 270M, Reflex 0.6B, Workhorse 3B) are downloaded once from our CDN and thereafter operate without any network connection. Whisper speech-to-text and TTS run entirely on-device.
All data stored by Ghost — conversations, memories, embeddings, session state — is encrypted using AES-256. Encryption keys are stored in the iOS Keychain (backed by Secure Enclave) or Android Keystore and never leave the device. See our Security Architecture for full details.
You have complete control over your data at all times.
Model packs are downloaded from our CDN on first launch. These downloads carry no personal data, user identifiers, or device identifiers. The CDN may log standard HTTP access data (IP address, timestamp, file requested) for operational purposes.
If you explicitly opt in to cloud escalation for tasks exceeding on-device capability, your prompt is sent to an approved third-party AI provider. This is always explicit, never automatic, and clearly indicated in the Proof Drawer. Ghost does not enable cloud features by default.
During cloud escalation, context is encrypted in transit via TLS 1.3; the cloud provider's endpoint terminates TLS for processing. The provider contractually agrees not to retain context or output after inference. This is a contractual guarantee, not a cryptographic one, and we are transparent about that distinction. Every escalation is logged in the AI Evidence Log with: reason, encrypted payload size, provider identity, transit encryption method, and contractual non-retention status.
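As a concrete sketch, an escalation log entry with the fields listed above might look like the following. The field names and structure are illustrative assumptions, not Ghost's actual schema; the key property is that only the payload's size is recorded, never its content.

```python
import json
import time

def make_escalation_record(reason: str, payload: bytes, provider: str) -> dict:
    """Illustrative AI Evidence Log entry for a cloud escalation.

    Field names are assumptions for this sketch, not Ghost's real format.
    """
    return {
        "type": "cloud_escalation",
        "timestamp": time.time(),
        "reason": reason,                          # why on-device models were insufficient
        "encrypted_payload_size": len(payload),    # size only, never the content itself
        "provider": provider,
        "transit_encryption": "TLS 1.3",
        "non_retention": "contractual",            # explicitly not a cryptographic guarantee
    }

record = make_escalation_record("context_window_exceeded", b"\x00" * 4096, "provider-a")
print(json.dumps(record, indent=2))
```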
Even without true zero-knowledge, Ghost's cloud escalation is designed to be more transparent than typical cloud AI services: every escalation is explicitly user-approved, logged, and auditable.
When you use Ghost across multiple devices with sync enabled, all synced data is end-to-end encrypted. Our relay server sees only encrypted blobs and cannot read your conversations, memories, or preferences. Key exchange and encryption are performed device-to-device (or via a key you control); we do not hold decryption keys.
For Team and Organization tiers, shared memories and knowledge graphs are controlled by team administrators. Visibility levels (who can see or edit which memories) are set by your organization. Ghost does not use team data for any purpose other than delivering the service you configure.
For organizations using fleet management, device identity is used only for cohort assignment (e.g., staged rollouts or policy groups). We use a privacy-preserving hash for device identification — not tracking or behavioral profiling. Fleet data is not used for advertising or sold to third parties.
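A common way to derive a non-reversible device identifier for cohort assignment is a keyed one-way hash. The sketch below illustrates the general technique; the key name, derivation, and cohort count are assumptions, not Ghost's actual scheme.

```python
import hashlib
import hmac

def cohort_id(device_id: str, fleet_key: bytes, num_cohorts: int = 10) -> int:
    """Map a device to a rollout cohort via a keyed, one-way hash.

    The raw device_id is never stored or transmitted; only the irreversible
    digest is used, and only to pick a staged-rollout cohort.
    """
    digest = hmac.new(fleet_key, device_id.encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % num_cohorts

# Same device always lands in the same cohort; the id cannot be recovered
# from the digest, so the value is useless for tracking or profiling.
key = b"example-fleet-key"
print(cohort_id("device-1234", key))
```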
Ghost's Proof Layer generates cryptographic proof records for AI actions. These records are stored locally in a tamper-evident, hash-chained AI Evidence Log on your device. For a full overview of capabilities, see Features. Each record contains cryptographic metadata about the action, never raw content.
Proof records never leave your device unless you explicitly export them. Exported proof bundles are independently verifiable by anyone with your device's public key. The AI Evidence Log is not used for analytics, marketing, or behavioral profiling.
Proof records protect privacy in several ways: context fingerprints use SHA-256 hashes, not raw text, so the AI Evidence Log never stores private content. Transparency modes let you control visibility depth — Simple shows plain-language summaries, Advanced shows metadata, Paranoid shows raw data. Exported proof bundles do not include raw context unless you explicitly include it.
Proof records prove that privacy was maintained without exposing private content.
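A minimal sketch of how a hash-chained log with SHA-256 content fingerprints can achieve this. The record fields are illustrative, not Ghost's actual format; the point is that the log stores only fingerprints of private content, and each record commits to its predecessor so any tampering is detectable.

```python
import hashlib
import json

def fingerprint(context: str) -> str:
    """SHA-256 fingerprint: the log stores this hash, never the raw text."""
    return hashlib.sha256(context.encode()).hexdigest()

def append_record(chain: list, action: str, context: str) -> None:
    """Append a hash-chained record; each entry commits to its predecessor."""
    prev = chain[-1]["record_hash"] if chain else "0" * 64
    body = {"action": action, "context_fp": fingerprint(context), "prev": prev}
    body["record_hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    chain.append(body)

def verify_chain(chain: list) -> bool:
    """Recompute every link; any edit to any record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {k: rec[k] for k in ("action", "context_fp", "prev")}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["record_hash"] != expected:
            return False
        prev = rec["record_hash"]
    return True

log = []
append_record(log, "memory_read", "private note contents")
append_record(log, "policy_change", "allow cloud escalation")
assert verify_chain(log)
log[0]["action"] = "tampered"   # any modification is detectable
assert not verify_chain(log)
```

Note that verifying the chain requires no access to the original contexts: the fingerprints alone prove which content was involved without revealing it.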
When deterministic mode is active, Ghost retains additional data to support exact replay of AI interactions: the input, model version, context, random seed, and output hash. Retained context is compressed with zstd for storage efficiency and encrypted at rest with your device key, so it is accessible only with that key. The retention period is configurable in Settings, per user or per organization. Replay data is used exclusively for replay verification and is never transmitted externally; exported proof bundles do not include raw context unless you explicitly include it.
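The replay mechanism can be sketched as follows. Here `run_model` is a deterministic stand-in for on-device inference, and the record fields are assumptions for illustration: given the same input, model version, and seed, re-running inference must reproduce an output whose hash matches the stored one.

```python
import hashlib
import random

def run_model(prompt: str, seed: int) -> str:
    """Stand-in for on-device inference: deterministic given (prompt, seed)."""
    rng = random.Random(seed)
    return f"{prompt}:{rng.randrange(10**6)}"

def make_replay_record(prompt: str, model_version: str, seed: int) -> dict:
    """Record everything needed to reproduce this exact inference later."""
    output = run_model(prompt, seed)
    return {
        "input": prompt,
        "model_version": model_version,
        "seed": seed,
        "output_hash": hashlib.sha256(output.encode()).hexdigest(),
    }

def verify_replay(record: dict) -> bool:
    """Re-run with the stored seed and compare output hashes."""
    output = run_model(record["input"], record["seed"])
    return hashlib.sha256(output.encode()).hexdigest() == record["output_hash"]

rec = make_replay_record("summarize my notes", "workhorse-3b", seed=42)
assert verify_replay(rec)
```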
You can export proof records as signed JSON bundles at any time. These exports contain: proof record metadata, hash chain integrity data, and your device's digital signature. In Advanced and Paranoid transparency modes, exports may include model identifiers, context hashes, memory reference hashes, and policy identifiers. Raw memory content is only included if you explicitly select Paranoid mode with content inclusion enabled. We have no access to your exports.
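A sketch of export and verification of a signed bundle. Real exports use a public-key signature so anyone holding the device's public key can verify them; here an HMAC "signature" stands in only so the example runs with the standard library alone, and the bundle layout is an assumption, not Ghost's actual format.

```python
import hashlib
import hmac
import json

# Stand-in for the device signing key. In a real public-key scheme the
# verifier would need only the public half; HMAC is symmetric and used
# here purely to keep the sketch self-contained.
DEVICE_KEY = b"example-device-key"

def export_bundle(records: list) -> str:
    """Serialize proof records and attach a signature over the payload."""
    payload = json.dumps({"records": records}, sort_keys=True)
    sig = hmac.new(DEVICE_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return json.dumps({"payload": payload, "signature": sig})

def verify_bundle(bundle: str) -> bool:
    """Recompute the signature; any change to the payload invalidates it."""
    data = json.loads(bundle)
    expected = hmac.new(DEVICE_KEY, data["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, data["signature"])

bundle = export_bundle([{"action": "memory_read", "context_fp": "ab12cd"}])
assert verify_bundle(bundle)
```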
Sensitive operations (e.g., memory access, policy changes, admin actions) are recorded in the tamper-evident AI Evidence Log with hash chain integrity. These records are designed for accountability and compliance; they are exportable by authorized users and are not used for marketing or analytics.
Ghost does not integrate with any third-party analytics, advertising, or data processing services. The only external communication is model pack downloads (CDN) and optional platform-level purchase verification.
Ghost does not knowingly collect data from children under 13. Since Ghost collects no personal data from any user, this requirement is satisfied by design.
For privacy inquiries: privacy@ghostfied.com