Overview

After the AI responds, ChatJS can generate contextual follow-up questions using a fast, inexpensive model. These appear as clickable buttons below the last assistant message. Clicking a suggestion sends it as a new user message.

Quick Start

Enable follow-up suggestions in chat.config.ts:
```ts
ai: {
  tools: {
    followupSuggestions: {
      enabled: true,
    },
  },
}
```
No API keys or external services are required. Suggestions use one of your existing language model providers.
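Because suggestions reuse a provider you already have, model resolution can be as simple as falling back to the main chat model when no dedicated suggestion model is configured. A minimal sketch, assuming a simplified config shape; the `resolveSuggestionModel` helper is illustrative, not part of the ChatJS API:

```typescript
// Illustrative config shape; field names mirror chat.config.ts but are assumptions.
interface SuggestionConfig {
  enabled: boolean;
  default?: string; // optional dedicated model id
}

// Pick the model used for follow-up suggestions: the configured one if
// present, otherwise fall back to the main chat model already in use.
function resolveSuggestionModel(
  cfg: SuggestionConfig,
  mainModel: string,
): string | null {
  if (!cfg.enabled) return null; // feature off: no suggestions generated
  return cfg.default ?? mainModel;
}
```

The fallback keeps the feature zero-configuration: enabling it is enough, and a dedicated fast model is an optimization you opt into later.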

How It Works

  1. The main AI response completes
  2. A fast model generates 3-5 contextual follow-up questions
  3. Suggestions stream to the client as the model produces them
  4. They render as buttons below the last assistant message
  5. Clicking a suggestion sends it as a new user message
Suggestions are UI-only. They are never included in the conversation context sent to the LLM.
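The click behavior in step 5, together with the UI-only rule, can be sketched as a pure function over the message list. The `Message` shape and `clickSuggestion` helper are hypothetical, not ChatJS internals:

```typescript
// Hypothetical message shape; ChatJS's real types may differ.
type Message = { role: "user" | "assistant"; content: string };

// Step 5: clicking a suggestion appends its text as an ordinary user message.
// The suggestion strings themselves are never stored in the conversation,
// so nothing extra reaches the LLM context.
function clickSuggestion(messages: Message[], suggestion: string): Message[] {
  return [...messages, { role: "user", content: suggestion }];
}
```

Because the suggestion becomes a plain user message, the next request to the main model looks identical to one the user typed by hand.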

Configuration

Model Selection

Configure which model generates suggestions in chat.config.ts:
```ts
ai: {
  tools: {
    followupSuggestions: {
      enabled: true,
      default: "google/gemini-2.5-flash-lite",
    },
  },
}
```
Use a fast, inexpensive model here. The suggestions prompt is lightweight and does not require a powerful model.
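To illustrate why a lightweight model suffices, here is one plausible shape for the prompt and for parsing the streamed output into 3-5 suggestions. The prompt wording and both helpers are assumptions for the sketch, not the prompt ChatJS actually ships:

```typescript
// Build a short prompt asking the fast model for 3-5 follow-up questions.
function buildSuggestionsPrompt(lastAssistantMessage: string): string {
  return [
    "Suggest 3-5 short follow-up questions a user might ask next.",
    "Return one question per line, with no numbering.",
    "",
    "Assistant's last message:",
    lastAssistantMessage,
  ].join("\n");
}

// Parse the model's newline-delimited output into clickable suggestions,
// dropping blank lines and capping at 5 to stay in the expected range.
function parseSuggestions(output: string): string[] {
  return output
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .slice(0, 5);
}
```

Because the output is newline-delimited, each suggestion can be rendered as a button the moment its line arrives, which is what makes step 3's streaming cheap.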