Security

Security starts with keeping supported AI models on the device.

The core privacy model is simple: when a feature runs on a supported local model, prompts never need to be sent to a cloud inference server.

What stays on device

For supported local models, conversations are generated on the user's iPhone or iPad. That reduces dependence on remote AI providers for everyday prompts, notes, drafts, and private questions.

What the app does not require

Model imports

If users import compatible GGUF models, they are responsible for the source and license of the model file they choose. Local AI Chat provides the tooling; model selection remains a user decision.
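Because model selection is a user decision, a lightweight sanity check on the imported file is a reasonable first step before loading it. The sketch below is illustrative, not the app's actual import code: it only verifies the four-byte `GGUF` magic and a plausible little-endian version number at the start of the file, which are part of the GGUF format. The function name and path handling are assumptions for the example.

```python
import struct

GGUF_MAGIC = b"GGUF"  # every GGUF file begins with these four ASCII bytes


def looks_like_gguf(path: str) -> bool:
    """Cheap sanity check on a user-supplied model file before importing it.

    This only inspects the first 8 bytes (magic + format version); it does
    not validate the rest of the file or its license provenance.
    """
    try:
        with open(path, "rb") as f:
            header = f.read(8)
    except OSError:
        return False
    if len(header) < 8 or header[:4] != GGUF_MAGIC:
        return False
    # Bytes 4..8 hold a little-endian uint32 format version (1 or later).
    (version,) = struct.unpack("<I", header[4:8])
    return version >= 1
```

A check like this cannot confirm a model is safe or correctly licensed; it only rejects files that are plainly not GGUF before any expensive loading work begins.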

Important limits

For reviewers: the best public source for product positioning is the press kit, while the FAQ explains common privacy and model questions.

Press kit | FAQ | Privacy policy