📅 Day 13
🛠️ tool

Vercel AI SDK: The Ultimate LLM Integration for React Native

Discover how the Vercel AI SDK makes integrating AI into your Expo apps incredibly simple with built-in streaming, hooks, and API route support

AI · API Routes

Welcome to Day 13 of the React Native Advent Calendar!

Today we’re exploring the Vercel AI SDK - a game-changing toolkit that makes integrating AI into your Expo apps as simple as fetching data. No more wrestling with OpenAI’s API, managing streaming responses, or building UI state from scratch!


Why the Vercel AI SDK is a Game-Changer

Building AI features traditionally involves:

  • Managing complex streaming responses manually
  • Handling loading states, errors, and retries
  • Parsing Server-Sent Events (SSE) or custom protocols
  • Building UI state management for conversations
  • Rewriting code whenever you switch LLM providers

The Vercel AI SDK solves all of this.

It’s a unified interface that works with OpenAI, Anthropic, Google, Mistral, and 20+ other providers. Switch from GPT-4 to Claude with one line of code. Stream responses without touching SSE. Build chat UIs with zero state management.

Why this matters: You can go from idea to working AI feature in minutes, not days.


The Power Trio: Key Functions & Hooks

The AI SDK gives you three core tools that cover 90% of AI use cases:

1. streamText() - Server-Side Streaming

What it does: Streams AI responses from your API route back to the client.

Real-world use case: Building a recipe generator that streams cooking instructions in real-time.

app/api/recipe+api.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(request: Request) {
	const { ingredients } = await request.json();

	const result = streamText({
		model: openai('gpt-4-turbo'),
		prompt: `Create a recipe using these ingredients: ${ingredients}. Include prep time, cook time, and step-by-step instructions.`
	});

	return result.toDataStreamResponse();
}

Why it’s great: You don’t handle SSE parsing, chunk management, or response formatting. It just works.


2. useChat() - Chat Interfaces Made Easy

What it does: Manages full chat conversations (messages, loading, errors) with zero state management.

Real-world use case: A travel assistant that remembers context across questions.

app/travel-assistant.tsx
import { useChat } from 'ai/react';
import { View, TextInput, FlatList, Text, Button } from 'react-native';

export default function TravelAssistant() {
	const { messages, input, setInput, handleSubmit, isLoading } = useChat({
		api: '/api/travel'
	});

	return (
		<View>
			<FlatList
				data={messages}
				renderItem={({ item }) => (
					<View>
						<Text>{item.role === 'user' ? 'You' : 'AI'}</Text>
						<Text>{item.content}</Text>
					</View>
				)}
			/>

			{/* handleInputChange expects a web event, so use setInput with onChangeText */}
			<TextInput value={input} onChangeText={setInput} placeholder="Ask about travel..." />

			<Button onPress={() => handleSubmit()} disabled={isLoading} title="Send" />
		</View>
	);
}

Why it’s great: All the chat logic (message history, streaming updates, error handling) is handled. You just build the UI.


3. useCompletion() - Single-Shot Generations

What it does: Generates one-off completions without conversation context.

Real-world use case: Auto-generating product descriptions from images or keywords.

app/product-generator.tsx
import { useCompletion } from 'ai/react';
import { View, Button, TextInput, Text } from 'react-native';

export default function ProductDescriptionGenerator() {
	const { completion, input, setInput, handleSubmit, isLoading } = useCompletion({
		api: '/api/generate-description'
	});

	return (
		<View>
			{/* handleInputChange expects a web event, so use setInput with onChangeText */}
			<TextInput
				value={input}
				onChangeText={setInput}
				placeholder="Enter product keywords..."
			/>

			<Button onPress={() => handleSubmit()} title="Generate" disabled={isLoading} />

			{completion && <Text>Generated Description: {completion}</Text>}
		</View>
	);
}

Why it’s great: Perfect for forms, auto-complete, or anywhere you need AI-generated text without a back-and-forth conversation.


Real-World Examples Beyond Chat

Here are practical use cases that go beyond the typical chatbot:

1. Smart Form Auto-Fill

Use useCompletion() to generate form data based on partial input:

  • Example: User types company name → AI fills in industry, size, location
  • Hook: useCompletion() for instant suggestions
  • API Route: Uses streamObject() with a Zod schema for structured output

2. Image-to-Text Descriptions

Combine vision models with the AI SDK:

  • Example: User uploads product photo → AI generates SEO-optimized description
  • Hook: useCompletion() with image input
  • API Route: Uses openai('gpt-4-vision-preview') with streamText()

3. Context-Aware Support Bot

Build a support assistant that knows your app’s state:

  • Example: User asks “Why can’t I save?” → AI analyzes current screen context
  • Hook: useChat() with custom context injection
  • API Route: Passes app state as system message

4. Batch Content Generation

Generate multiple items at once with streaming:

  • Example: Generate 10 social media posts from one article
  • Hook: Custom hook wrapping fetch() with streaming
  • API Route: streamObject() with array output

Pro Tips

1. Switch Models in Seconds

Want to try Claude instead of GPT-4? Change one line:

app/api/chat+api.ts
import { anthropic } from '@ai-sdk/anthropic';

const result = streamText({
	model: anthropic('claude-3-5-sonnet-20241022'), // Was: openai('gpt-4-turbo')
	messages
});

2. Add Streaming UI Updates

Show live typing indicators or partial results:

app/index.tsx
const { messages, isLoading } = useChat();

return (
	<View>
		{messages.map((msg) => (
			<Text key={msg.id}>{msg.content}</Text>
		))}
		{isLoading && <Text>AI is typing...</Text>}
	</View>
);

3. Handle Errors Gracefully

The SDK gives you error states for free:

app/index.tsx
const { error, reload } = useChat();

if (error) {
	return <Button onPress={reload} title="Retry" />;
}

Want to Dive Deeper?

Check out our full tutorial on building AI-powered features with streaming responses:

AI Streaming with Vercel AI SDK in React Native


Wrapping Up

The Vercel AI SDK removes all the complexity from AI integration. Stream responses, manage conversations, and switch providers without rewriting code.

Quick recap:

  1. Unified API - One SDK for 20+ LLM providers
  2. Perfect for Expo - Works seamlessly with API routes
  3. Zero boilerplate - useChat() and useCompletion() handle everything
  4. Real-time streaming - Built-in SSE handling
  5. Type-safe - Full TypeScript support

Ready to build?

Built something cool with the AI SDK? Share your project on Twitter!

Tomorrow we’ll explore Day 14’s topic - see you then! 🎄