# React Native AI

March 24, 2026


A collection of on-device AI primitives for React Native with first-class Vercel AI SDK support. Run AI models directly on users' devices for privacy-preserving, low-latency inference without server costs.

## Features

- 🚀 **Instant AI** - Use built-in system models immediately without downloads
- 🔒 **Privacy-first** - All processing happens on-device; data stays local
- 🎯 **Vercel AI SDK compatible** - Drop-in replacement with familiar APIs
- 🎨 **Complete toolkit** - Text generation, embeddings, transcription, speech synthesis

## AI SDK Compatibility

| React Native AI | AI SDK |
| --------------- | ------ |
| 0.11 and below  | v5     |
| 0.12 and above  | v6     |

## DevTools

AI SDK Profiler preview

The AI SDK Profiler plugin captures OpenTelemetry spans from Vercel AI SDK requests and surfaces them in Rozenite DevTools. DevTools are runtime-agnostic, so they work with both on-device and remote runtimes.

```bash
npm install @react-native-ai/dev-tools
```

Rozenite must be installed and enabled in your app. See the Rozenite getting started guide.

## Available Providers

| Provider | Built-in | Platforms    | Runtime  | Description                                                |
| -------- | -------- | ------------ | -------- | ---------------------------------------------------------- |
| Apple    | ✅ Yes   | iOS          | Apple    | Apple Foundation Models, embeddings, transcription, speech |
| Llama    | ❌ No    | iOS, Android | llama.rn | Run GGUF models via llama.rn                               |
| MLC      | ❌ No    | iOS, Android | MLC LLM  | Run open-source LLMs via the MLC runtime                   |

## Apple

Native integration with Apple's on-device AI capabilities. Built-in: uses system models, so no downloads are required.

- **Text Generation** - Apple Foundation Models for chat and completion
- **Embeddings** - NLContextualEmbedding for 512-dimensional semantic vectors
- **Transcription** - SpeechAnalyzer for fast, accurate speech-to-text
- **Speech Synthesis** - AVSpeechSynthesizer for natural text-to-speech with system voices

### Installation

```bash
npm install @react-native-ai/apple
```

No additional linking is needed; the package is autolinked and works immediately on iOS devices.

### Usage

```typescript
import { apple } from '@react-native-ai/apple'
import {
  generateText,
  embed,
  experimental_transcribe as transcribe,
  experimental_generateSpeech as speech,
} from 'ai'

// Text generation with Apple Intelligence
const { text } = await generateText({
  model: apple(),
  prompt: 'Explain quantum computing',
})

// Generate embeddings
const { embedding } = await embed({
  model: apple.textEmbeddingModel(),
  value: 'Hello world',
})

// Transcribe audio (destructured under a new name to avoid redeclaring `text`)
const { text: transcript } = await transcribe({
  model: apple.transcriptionModel(),
  audio: audioBuffer,
})

// Text-to-speech
const { audio } = await speech({
  model: apple.speechModel(),
  text: 'Hello from Apple!',
})
```
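Because these providers implement the AI SDK model interface, streaming should also work through the SDK's standard `streamText` helper. A minimal sketch (the README does not explicitly confirm streaming for the Apple provider, so treat this as an assumption):

```typescript
import { apple } from '@react-native-ai/apple'
import { streamText } from 'ai'

// Stream tokens as they are generated on-device
const result = streamText({
  model: apple(),
  prompt: 'Write a short poem about the ocean',
})

let output = ''
for await (const chunk of result.textStream) {
  // In a component, push each chunk into state to render incrementally
  output += chunk
}
```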

### Availability

| Feature          | iOS Version | Additional Requirements    |
| ---------------- | ----------- | -------------------------- |
| Text Generation  | iOS 26+     | Apple Intelligence device  |
| Embeddings       | iOS 17+     | -                          |
| Transcription    | iOS 26+     | -                          |
| Speech Synthesis | iOS 13+     | iOS 17+ for Personal Voice |

See the Apple documentation for detailed setup and usage guides.


## Llama

Run any GGUF model on-device using llama.rn. Requires download - models are downloaded from HuggingFace.

### Supported Features

| Feature         | Method                      | Description                                  |
| --------------- | --------------------------- | -------------------------------------------- |
| Text Generation | `llama.languageModel()`     | Chat, completion, streaming, reasoning models |
| Embeddings      | `llama.textEmbeddingModel()` | Text embeddings for RAG and similarity search |
| Speech          | `llama.speechModel()`       | Text-to-speech with vocoder models           |

### Installation

```bash
npm install @react-native-ai/llama llama.rn react-native-blob-util
```

### Usage

```typescript
import { llama } from '@react-native-ai/llama'
import { generateText } from 'ai'

// Create model instance (model ID format: "owner/repo/filename.gguf")
const model = llama.languageModel(
  'ggml-org/SmolLM3-3B-GGUF/SmolLM3-Q4_K_M.gguf'
)

// Download from HuggingFace (with progress)
await model.download((progress) => {
  console.log(`Downloading: ${progress.percentage}%`)
})

// Initialize model (loads into memory)
await model.prepare()

// Generate text
const { text } = await generateText({
  model,
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Write a haiku about coding.' },
  ],
})

// Cleanup when done
await model.unload()
```
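The supported-features table also lists `llama.textEmbeddingModel()`. A hedged sketch of using it with the AI SDK's `embed` and `cosineSimilarity` helpers, assuming the embedding model follows the same download/prepare lifecycle as `languageModel()` (the model ID below is a hypothetical placeholder; substitute a real embedding-capable GGUF):

```typescript
import { llama } from '@react-native-ai/llama'
import { embed, cosineSimilarity } from 'ai'

// Hypothetical embedding model ID; replace with a real GGUF embedding model
const embedder = llama.textEmbeddingModel(
  'owner/embedding-model-GGUF/model-q4_k_m.gguf'
)
await embedder.download()
await embedder.prepare()

// Embed two phrases and compare them for similarity search / RAG
const { embedding: a } = await embed({ model: embedder, value: 'on-device AI' })
const { embedding: b } = await embed({ model: embedder, value: 'local inference' })

console.log(cosineSimilarity(a, b))
```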

### Model Compatibility

Any GGUF model from HuggingFace can be used. Use the format `owner/repo/filename.gguf` as the model ID. Popular choices include:

- `ggml-org/SmolLM3-3B-GGUF/SmolLM3-Q4_K_M.gguf`
- `bartowski/Llama-3.2-3B-Instruct-GGUF/Llama-3.2-3B-Instruct-Q4_K_M.gguf`
- `Qwen/Qwen2.5-1.5B-Instruct-GGUF/qwen2.5-1.5b-instruct-q4_k_m.gguf`
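The three-part ID maps naturally onto a HuggingFace download URL; `https://huggingface.co/{owner}/{repo}/resolve/main/{filename}` is the Hub's usual convention, though whether llama.rn uses exactly this path is an assumption. A small helper to validate the format before handing it to the provider:

```typescript
// Split an "owner/repo/filename.gguf" model ID into its parts and
// build the conventional HuggingFace download URL for it.
interface ModelId {
  owner: string
  repo: string
  filename: string
  url: string
}

function parseModelId(id: string): ModelId {
  const parts = id.split('/')
  if (parts.length !== 3 || !parts[2].endsWith('.gguf')) {
    throw new Error(`Expected "owner/repo/filename.gguf", got "${id}"`)
  }
  const [owner, repo, filename] = parts
  return {
    owner,
    repo,
    filename,
    url: `https://huggingface.co/${owner}/${repo}/resolve/main/${filename}`,
  }
}

console.log(parseModelId('ggml-org/SmolLM3-3B-GGUF/SmolLM3-Q4_K_M.gguf'))
```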

📚 View full Llama documentation →


## MLC

Run popular open-source LLMs directly on-device using MLC LLM's optimized runtime. Requires download - models must be downloaded before use.

### Installation

```bash
npm install @react-native-ai/mlc
```

Requires the "Increased Memory Limit" capability in Xcode. See the getting started guide for setup instructions.

### Usage

```typescript
import { mlc } from '@react-native-ai/mlc'
import { generateText } from 'ai'

// Create model instance
const model = mlc.languageModel('Llama-3.2-3B-Instruct')

// Download and prepare model (one-time setup)
await model.download()
await model.prepare()

// Generate response with Llama via MLC engine
const { text } = await generateText({
  model,
  prompt: 'Explain quantum computing',
})
```

### Available Models

| Model ID               | Size   |
| ---------------------- | ------ |
| Llama-3.2-3B-Instruct  | ~2GB   |
| Phi-3-mini-4k-instruct | ~2.5GB |
| Mistral-7B-Instruct    | ~4.5GB |
| Qwen2.5-1.5B-Instruct  | ~1GB   |

### Note

MLC requires iOS devices with sufficient memory (1-8GB depending on the model). The prebuilt runtime supports the models listed above. For other models or custom configurations, you'll need to recompile the MLC runtime from source.
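The sizes above can drive a simple model-selection heuristic. A sketch, using the approximate on-disk sizes from the table (the real memory requirement is higher, since the KV cache and runtime overhead also need headroom; the `pickModel` helper is illustrative, not part of the package):

```typescript
// Approximate on-disk sizes (GB) from the Available Models table
const MODEL_SIZES_GB: Record<string, number> = {
  'Llama-3.2-3B-Instruct': 2,
  'Phi-3-mini-4k-instruct': 2.5,
  'Mistral-7B-Instruct': 4.5,
  'Qwen2.5-1.5B-Instruct': 1,
}

// Pick the largest model that fits within a memory budget.
// Treat the budget as conservative: leave headroom for the
// KV cache and the MLC runtime itself.
function pickModel(budgetGb: number): string | undefined {
  return Object.entries(MODEL_SIZES_GB)
    .filter(([, size]) => size <= budgetGb)
    .sort((a, b) => b[1] - a[1])[0]?.[0]
}

console.log(pickModel(3)) // → 'Phi-3-mini-4k-instruct'
```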

## Documentation

Comprehensive guides and API references are available at react-native-ai.dev.

## Contributing

Read the contribution guidelines before contributing.

## Agent skills

This repository provides agent skills to help you integrate and use the packages. You can install them with:

```bash
npx skills add https://github.com/callstackincubator/react-native-ai --skill react-native-ai-skills
```

or manually, by copying the `skills/` directory into your `.cursor/` directory.

## Made with ❤️ at Callstack

react-native-ai is an open source project and will always remain free to use. If you think it's cool, please star it 🌟.

Callstack is a group of React and React Native geeks. Contact us at hello@callstack.com if you need help with these projects or just want to say hi!


Made with create-react-native-library