Suggest follow-up questions after AI responses, shown only on the last message.
## Overview
After the AI responds, generate contextual follow-up questions using a fast model. These are streamed as `data-*` parts, which are UI-only: they're filtered out before sending context to the LLM.
File convention: `lib/ai/followup-suggestions.ts` → `components/followup-suggestions.tsx`
## How it works
- Generate suggestions after the main response using a cheap/fast model
- Stream them as a `data-followupSuggestions` part (the `data-*` prefix marks them UI-only)
- Ignore `data-*` parts via `convertToModelMessages({ convertDataPart: () => undefined })`
- Render suggestions only on the last message
## Code
### 1. Generate & Stream Suggestions

`lib/ai/followup-suggestions.ts`
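A minimal sketch of the generation step, assuming the AI SDK's `generateObject` with a Zod schema. The function name `generateFollowupSuggestions`, the model id, and the prompt wording are illustrative, not from the source:

```typescript
// lib/ai/followup-suggestions.ts
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Constrain the output shape: a short list of suggestion strings.
const suggestionsSchema = z.object({
  suggestions: z.array(z.string()).max(3),
});

export async function generateFollowupSuggestions(
  conversationText: string,
): Promise<string[]> {
  const { object } = await generateObject({
    model: openai('gpt-4o-mini'), // cheap/fast model, per the convention above
    schema: suggestionsSchema,
    prompt:
      `Suggest up to 3 short follow-up questions the user might ask next, ` +
      `given this conversation:\n\n${conversationText}`,
  });
  return object.suggestions;
}
```

Using `generateObject` (rather than free-form text) keeps the result structured, so the UI never has to parse a model's prose.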
### 2. Call After Response

`app/(chat)/api/chat/route.ts`
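A sketch of the route handler, assuming AI SDK v5 UI message streams: the main response is merged into the stream first, then the suggestions are written as a custom `data-followupSuggestions` part. The model id and the `generateFollowupSuggestions` helper are assumptions carried over from step 1:

```typescript
// app/(chat)/api/chat/route.ts
import {
  convertToModelMessages,
  createUIMessageStream,
  createUIMessageStreamResponse,
  streamText,
} from 'ai';
import { openai } from '@ai-sdk/openai';
import { generateFollowupSuggestions } from '@/lib/ai/followup-suggestions';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const stream = createUIMessageStream({
    execute: async ({ writer }) => {
      const result = streamText({
        model: openai('gpt-4o'), // illustrative main model
        messages: convertToModelMessages(messages),
      });
      // Forward the main response to the client as it streams.
      writer.merge(result.toUIMessageStream());

      // After the response completes, stream the UI-only suggestions part.
      const text = await result.text;
      const suggestions = await generateFollowupSuggestions(text);
      writer.write({
        type: 'data-followupSuggestions', // data-* prefix = UI-only
        data: suggestions,
      });
    },
  });

  return createUIMessageStreamResponse({ stream });
}
```

Because the suggestions are written after `result.text` resolves, the user sees the answer immediately and the suggestion chips arrive a moment later.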
### 3. Ignore Data Parts in Conversion

`lib/ai/core-chat-agent.ts`
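A sketch of the conversion step. The `convertDataPart: () => undefined` option is taken from this guide's own snippet; its placement in the options object here is an assumption:

```typescript
// lib/ai/core-chat-agent.ts
import { convertToModelMessages, type UIMessage } from 'ai';

// data-* parts are UI-only, so drop them before sending history to the LLM.
export function toModelMessages(messages: UIMessage[]) {
  return convertToModelMessages(messages, {
    // Returning undefined discards every data-* part,
    // so stale suggestions never pollute the model context.
    convertDataPart: () => undefined,
  });
}
```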
### 4. Render on Last Message Only

`components/followup-suggestions.tsx`
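A sketch of the client component, assuming a `useChat`-style `UIMessage` whose parts include the `data-followupSuggestions` payload from step 2. The prop names and the `onSelect` callback are illustrative:

```typescript
// components/followup-suggestions.tsx
'use client';

import type { UIMessage } from 'ai';

type Props = {
  message: UIMessage;
  isLastMessage: boolean;
  onSelect: (question: string) => void;
};

export function FollowupSuggestions({ message, isLastMessage, onSelect }: Props) {
  // Per the convention above: suggestions appear only on the last message.
  if (!isLastMessage) return null;

  // Pull the UI-only suggestion data out of the message parts.
  const suggestions = message.parts
    .filter((part) => part.type === 'data-followupSuggestions')
    .flatMap((part) => ('data' in part ? (part.data as string[]) : []));

  if (suggestions.length === 0) return null;

  return (
    <div className="followup-suggestions">
      {suggestions.map((question) => (
        <button key={question} onClick={() => onSelect(question)}>
          {question}
        </button>
      ))}
    </div>
  );
}
```

Gating on `isLastMessage` means earlier messages keep their (now stale) data parts in client state, but only the newest set is ever shown.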