Local AI Chat brings local LLM conversations to mobile devices with built-in open-source models and compatible GGUF model imports.

Built-in models
The App Store listing mentions Gemma, Qwen, and SmolLM as curated open-source model options.
Import compatible GGUF models such as Llama, Mistral, or Phi by pasting a download link.
Models are optimized for iPhone and iPad so everyday AI tasks can happen locally.
A local LLM gives you control over availability, privacy, and model choice. It is useful when you want AI help without maintaining API keys, signing into another account, or uploading every prompt to a server.
Local AI Chat is for iPhone and iPad users who want practical local AI without needing to manage a server, desktop setup, or cloud API bill.