Chat

High-level conversational primitive for multi-turn chat (a Chat class in Svelte, a useChat hook elsewhere). It manages the messages array, streams assistant replies token by token, and provides built-in helpers for editing and regenerating messages.

Why not Vercel AI SDK's useChat?

The AI SDK ships its own useChat hook. Here's what aibind's Chat adds on top:

| | Vercel useChat | aibind Chat |
| --- | --- | --- |
| Message editing | — | chat.edit(id, newText) |
| Regeneration | — | chat.regenerate() |
| Branching history | — | ReactiveChatHistory |
| Tool calling | Client-side | Server-side (toolsets) |
| Optimistic messages | Manual | chat.optimistic() |
| Multimodal attachments | — | fileToAttachment() |
| Auto title generation | — | autoTitle: true / chat.generateTitle() |
| Framework support | React only | SvelteKit, Next.js, Nuxt, SolidStart, TanStack |

The core difference: aibind treats the message list as a tree, not an array. That's what makes edit, regenerate, and branching work without custom state management.

Quickstart

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";

  const chat = new Chat({ model: "smart" });
  let input = $state("");
</script>

{#each chat.messages as msg}
  <div class={msg.role}>{msg.content}</div>
{/each}

{#if chat.loading}
  <span class="cursor"></span>
{/if}

<form
  onsubmit={(e) => {
    e.preventDefault();
    chat.send(input);
    input = "";
  }}
>
  <input bind:value={input} placeholder="Ask something…" />
  <button disabled={chat.loading}>Send</button>
</form>
tsx
"use client";
import { useChat } from "@aibind/nextjs"; // or @aibind/react
import { useState } from "react";

export default function ChatPage() {
  const { messages, send, loading } = useChat({ model: "smart" });
  const [input, setInput] = useState("");

  return (
    <>
      {messages.map((msg) => (
        <div key={msg.id} className={msg.role}>
          {msg.content}
        </div>
      ))}
      {loading && <span className="cursor"></span>}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          send(input);
          setInput("");
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} />
        <button disabled={loading}>Send</button>
      </form>
    </>
  );
}
vue
<script setup lang="ts">
import { useChat } from "@aibind/nuxt"; // or @aibind/vue
import { ref } from "vue";

const { messages, send, loading } = useChat({ model: "smart" });
const input = ref("");
</script>

<template>
  <div v-for="msg in messages" :key="msg.id" :class="msg.role">
    {{ msg.content }}
  </div>
  <span v-if="loading" class="cursor"></span>
  <form
    @submit.prevent="
      send(input);
      input = '';
    "
  >
    <input v-model="input" placeholder="Ask something…" />
    <button :disabled="loading">Send</button>
  </form>
</template>
tsx
import { useChat } from "@aibind/solidstart"; // or @aibind/solid
import { For, createSignal } from "solid-js";

export default function ChatPage() {
  const { messages, send, loading } = useChat({ model: "smart" });
  const [input, setInput] = createSignal("");

  return (
    <>
      <For each={messages()}>
        {(msg) => <div class={msg.role}>{msg.content}</div>}
      </For>
      {loading() && <span class="cursor"></span>}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          send(input());
          setInput("");
        }}
      >
        <input
          value={input()}
          onInput={(e) => setInput(e.currentTarget.value)}
        />
        <button disabled={loading()}>Send</button>
      </form>
    </>
  );
}

API

Options

ts
interface ChatOptions {
  model?: string; // model key passed to your StreamHandler
  system?: string; // system prompt sent with every request
  endpoint?: string; // defaults to /__aibind__/chat
  fetch?: typeof fetch; // custom fetch implementation
  onFinish?: (messages: ChatMessage[]) => void;
  onError?: (error: Error) => void;
  // Tool calling
  toolset?: string; // named toolset registered on the server (defaults to "default")
  maxSteps?: number; // max tool-call → result → LLM rounds per turn (default: 5)
  onToolCall?: (name: string, args: unknown) => void; // fired when a tool is invoked
}
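A plain options object matching the interface above — the model key, system prompt, and toolset name are illustrative values, not defaults:

```typescript
// Hypothetical ChatOptions value; onFinish/onError signatures follow the
// interface above. "smart" and "default" are example names.
const options = {
  model: "smart",
  system: "You are a concise assistant.",
  toolset: "default",
  maxSteps: 5,
  onFinish: (messages: { role: string; content: string }[]) =>
    console.log(`finished with ${messages.length} messages`),
  onError: (error: Error) => console.error("chat failed:", error.message),
};
console.log(options.maxSteps); // 5
```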

Reactive state

| Property | Type | Description |
| --- | --- | --- |
| messages | ChatMessage[] | Full conversation history. Each message has id, role, and content. |
| loading | boolean | true while the assistant is streaming. |
| error | Error \| null | Last error, or null. |
| status | StreamStatus | "idle" / "streaming" / "done" / "error" |
| hasOptimistic | boolean | true when any message in the array is still optimistic (unconfirmed). |

Methods

| Method | Description |
| --- | --- |
| send(text, opts?) | Append a user message and stream the assistant reply. No-op if text is empty or loading. |
| abort() | Cancel the in-flight request. The partial assistant message stays in history. |
| clear() | Reset to an empty conversation. |
| regenerate() | Remove the last assistant reply (and its user turn) and re-send the same user message (with any original attachments). |
| edit(id, text, opts?) | Truncate history from message id onwards and re-send text as a new user turn. |
| revert() | Abort the current request, remove the last user+assistant pair, and return the user's text. Returns null if nothing to revert. |
| optimistic(text, opts?) | Stage a user+assistant message pair immediately without making a request. Returns a StagedMessage handle. |

ChatMessage type

ts
interface ChatMessage {
  id: string; // stable UUID, assigned on creation
  role: "user" | "assistant" | "tool";
  content: string; // accumulates during streaming
  optimistic?: boolean; // true until the request is confirmed
  attachments?: Attachment[]; // images/files attached to this message
  toolName?: string; // present on role: "tool" messages
  toolArgs?: unknown; // present on role: "tool" messages
}

When toolset is active, chat.messages includes role: "tool" entries for each tool invocation, inserted before the final assistant response. These are UI-only — they are filtered out before the conversation history is sent to the server.

svelte
{#each chat.messages as msg}
  {#if msg.role === "tool"}
    <div class="tool-call">Called {msg.toolName}</div>
  {:else if msg.role === "assistant"}
    <div class="assistant">{msg.content}</div>
  {:else}
    <div class="user">{msg.content}</div>
  {/if}
{/each}

ChatSendOptions type

ts
interface ChatSendOptions {
  attachments?: Attachment[];
}

StagedMessage

The handle returned by chat.optimistic(text):

ts
interface StagedMessage {
  send(): void; // start streaming — commits the staged pair
  cancel(): void; // remove the staged pair from messages[]
}

Both methods are idempotent. Calling send() after cancel() (or vice versa) is a no-op.
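The idempotency contract can be modeled with a one-way state flag — a hypothetical sketch of the behavior, not aibind's implementation:

```typescript
// Model of the StagedMessage contract: once the handle leaves the "staged"
// state, both methods become no-ops.
type StagedState = "staged" | "sent" | "cancelled";

function makeStaged(onSend: () => void, onCancel: () => void) {
  let state: StagedState = "staged";
  return {
    send() {
      if (state !== "staged") return; // no-op after cancel() or a prior send()
      state = "sent";
      onSend();
    },
    cancel() {
      if (state !== "staged") return; // no-op after send() or a prior cancel()
      state = "cancelled";
      onCancel();
    },
  };
}

let sends = 0;
let cancels = 0;
const staged = makeStaged(() => sends++, () => cancels++);
staged.send();
staged.cancel(); // no-op: the pair was already committed
console.log(sends, cancels); // 1 0
```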

Server setup

Chat sends POST /__aibind__/chat with { messages, system?, model? }. Your StreamHandler handles this automatically — no extra setup needed beyond the standard handler.

ts
import { createStreamHandler } from "@aibind/sveltekit/server";
import { models } from "./models.server";

export const handle = createStreamHandler({ models });
ts
import { createStreamHandler } from "@aibind/nextjs/server";
import { models } from "@/lib/models";

const handler = createStreamHandler({ models });
export const POST = handler.handle;
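The request body described above can be built by hand — a sketch of the wire shape { messages, system?, model? }; the message contents are illustrative:

```typescript
// Sketch of the payload Chat POSTs to the endpoint. Field names come from
// the description above; values are examples only.
const body = {
  model: "smart",
  system: "You are concise.",
  messages: [{ id: "m1", role: "user", content: "Hi" }],
};
const payload = JSON.stringify(body);
// A manual client could send this itself:
// await fetch("/__aibind__/chat", { method: "POST", body: payload,
//   headers: { "content-type": "application/json" } });
console.log(JSON.parse(payload).messages[0].role); // "user"
```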

Tool calling

Chat has first-class support for server-executed tools. Register named toolsets on the server and opt in per chat instance on the client with toolset:

svelte
const chat = new Chat({
  toolset: "search",   // opt in by name — omitting this disables tools entirely
  maxSteps: 5,
  onToolCall(name) { status = `Running ${name}`; },
});

Full tool calling guide

Edit and regenerate

The two most common chat UI actions work out of the box:

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";
  import type { ChatMessage } from "@aibind/sveltekit";

  const chat = new Chat({ model: "smart" });

  let editingId = $state<string | null>(null);
  let editText = $state("");
</script>

{#each chat.messages as msg}
  <div class={msg.role}>
    {#if editingId === msg.id}
      <input bind:value={editText} />
      <button
        onclick={() => {
          chat.edit(msg.id, editText);
          editingId = null;
        }}
      >
        Save & Resend
      </button>
    {:else}
      <p>{msg.content}</p>
      {#if msg.role === "user"}
        <button
          onclick={() => {
            editingId = msg.id;
            editText = msg.content;
          }}
        >
          Edit
        </button>
      {:else}
        <button onclick={() => chat.regenerate()} disabled={chat.loading}>
          Regenerate
        </button>
      {/if}
    {/if}
  </div>
{/each}

edit(id, text) truncates history from the edited message forward and re-sends the new text. regenerate() removes the last assistant reply and its paired user message, then re-sends the same user prompt.
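The truncation behavior can be modeled with plain arrays — a sketch of the contract described above, not aibind's internal tree representation:

```typescript
// edit(id, text) drops the edited message and everything after it, then
// appends the new text as a fresh user turn.
interface Msg {
  id: string;
  role: "user" | "assistant";
  content: string;
}

function editTruncate(messages: Msg[], id: string, text: string): Msg[] {
  const i = messages.findIndex((m) => m.id === id);
  if (i === -1) return messages; // unknown id: leave history untouched
  return [...messages.slice(0, i), { id, role: "user", content: text }];
}

const history: Msg[] = [
  { id: "u1", role: "user", content: "What is Svelte?" },
  { id: "a1", role: "assistant", content: "A compiler-based UI framework." },
  { id: "u2", role: "user", content: "And React?" },
  { id: "a2", role: "assistant", content: "A virtual-DOM library." },
];

const next = editTruncate(history, "u2", "And SolidJS?");
console.log(next.length, next[2]?.content); // 3 And SolidJS?
```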

Attachments

Send images and files alongside text by passing attachments in the second argument to send(), optimistic(), or edit().

Attachment type

ts
interface Attachment {
  mimeType: string; // e.g. "image/png", "application/pdf"
  data?: string; // base64-encoded content (no data: prefix)
  url?: string; // OR a remote URL — mutually exclusive with data
}
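An Attachment can also be built by hand when the content isn't a browser File — note the data field is bare base64 with no data: prefix. Buffer is Node-only; in the browser, prefer fileToAttachment() below:

```typescript
// Hand-built Attachment for a text file. Exactly one of data/url is set.
const text = "hello aibind";
const attachment = {
  mimeType: "text/plain",
  data: Buffer.from(text, "utf8").toString("base64"), // no "data:" prefix
};
// Round-trip to confirm the encoding:
console.log(Buffer.from(attachment.data, "base64").toString("utf8")); // hello aibind
```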

fileToAttachment(file)

The fileToAttachment utility converts a browser File object to an Attachment by reading it as base64. Browser-only — uses FileReader.

ts
import { fileToAttachment } from "@aibind/core"; // or your framework package

Full example

svelte
<script lang="ts">
  import { Chat, fileToAttachment } from "@aibind/sveltekit";
  import type { Attachment } from "@aibind/sveltekit";

  const chat = new Chat({ model: "smart" });
  let input = $state("");
  let attachments: Attachment[] = $state([]);

  async function onFileChange(e: Event) {
    const files = (e.target as HTMLInputElement).files;
    if (!files) return;
    attachments = await Promise.all([...files].map(fileToAttachment));
  }

  function send() {
    if (!input.trim() && !attachments.length) return;
    chat.send(input, { attachments });
    input = "";
    attachments = [];
  }
</script>

{#each chat.messages as msg (msg.id)}
  <div class={msg.role}>
    {#if msg.attachments?.length}
      {#each msg.attachments as att}
        {#if att.mimeType.startsWith("image/")}
          <img src="data:{att.mimeType};base64,{att.data}" alt="attachment" />
        {/if}
      {/each}
    {/if}
    <p>{msg.content}</p>
  </div>
{/each}

<form
  onsubmit={(e) => {
    e.preventDefault();
    send();
  }}
>
  <input bind:value={input} placeholder="Ask something…" />
  <input type="file" accept="image/*" multiple onchange={onFileChange} />
  <button disabled={chat.loading}>Send</button>
</form>
tsx
"use client";
import { useChat } from "@aibind/nextjs";
import { fileToAttachment } from "@aibind/core";
import type { Attachment } from "@aibind/nextjs";
import { useState } from "react";

export default function ChatPage() {
  const { messages, send, loading } = useChat({ model: "smart" });
  const [input, setInput] = useState("");
  const [attachments, setAttachments] = useState<Attachment[]>([]);

  async function onFileChange(e: React.ChangeEvent<HTMLInputElement>) {
    const files = e.target.files;
    if (!files) return;
    setAttachments(await Promise.all([...files].map(fileToAttachment)));
  }

  function handleSend() {
    send(input, { attachments });
    setInput("");
    setAttachments([]);
  }

  return (
    <>
      {messages.map((msg) => (
        <div key={msg.id} className={msg.role}>
          {msg.attachments?.map((att, i) =>
            att.mimeType.startsWith("image/") ? (
              <img
                key={i}
                src={`data:${att.mimeType};base64,${att.data}`}
                alt="attachment"
              />
            ) : null,
          )}
          <p>{msg.content}</p>
        </div>
      ))}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          handleSend();
        }}
      >
        <input value={input} onChange={(e) => setInput(e.target.value)} />
        <input type="file" accept="image/*" multiple onChange={onFileChange} />
        <button disabled={loading}>Send</button>
      </form>
    </>
  );
}

The server receives attachments alongside the message and StreamHandler.chat() converts them to the AI SDK's multipart format automatically — images become ImagePart, other files become FilePart.
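The kind of mapping this implies can be sketched as a pure function — part shapes here follow the AI SDK's image/file convention but are assumptions, and the real conversion lives inside StreamHandler.chat():

```typescript
// Hypothetical sketch: route an attachment to an image part or file part
// based on its mimeType prefix.
interface Attachment {
  mimeType: string;
  data?: string; // base64
  url?: string;
}

function toPart(att: Attachment) {
  const source = att.data ?? att.url ?? "";
  return att.mimeType.startsWith("image/")
    ? { type: "image" as const, image: source, mimeType: att.mimeType }
    : { type: "file" as const, data: source, mimeType: att.mimeType };
}

console.log(toPart({ mimeType: "image/png", data: "iVBOR..." }).type); // image
console.log(toPart({ mimeType: "application/pdf", url: "https://example.com/a.pdf" }).type); // file
```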

Optimistic UI

chat.send(text) is fire-and-forget — the user message appears in messages[] instantly and streaming begins. For most apps that's enough.

For flows where you need to show the message first, then decide whether to send — e.g. uploading a file attachment before streaming, showing a confirmation step, or triggering send from a different event — use chat.optimistic(text):

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";
  import type { StagedMessage } from "@aibind/sveltekit";

  const chat = new Chat({ model: "smart" });

  let staged: StagedMessage | null = $state(null);
  let input = $state("");

  function stage() {
    const text = input.trim();
    if (!text) return;
    input = "";
    staged = chat.optimistic(text); // message appears immediately, no request yet
  }

  function confirm() {
    staged?.send(); // start streaming
    staged = null;
  }

  function cancel() {
    staged?.cancel(); // remove the message
    staged = null;
  }
</script>

{#each chat.messages as msg (msg.id)}
  <div class={msg.role} style:opacity={msg.optimistic ? 0.5 : 1}>
    {msg.content}
  </div>
{/each}

{#if staged}
  <div class="confirm-bar">
    <button onclick={confirm}>Send</button>
    <button onclick={cancel}>Discard</button>
  </div>
{:else}
  <form
    onsubmit={(e) => {
      e.preventDefault();
      stage();
    }}
  >
    <input bind:value={input} />
    <button type="submit">Stage</button>
  </form>
{/if}

Optimistic messages have msg.optimistic === true until the first streaming chunk arrives — use this to render a pending state (dimmed opacity, spinner, etc.). Once streaming starts the flag is cleared automatically.

Undo send with revert()

revert() aborts the current request, removes the last user+assistant pair from messages[], and returns the user's original text so you can put it back in the input:

svelte
{#if chat.error}
  <div class="error">
    {chat.error.message}
    <button
      onclick={() => {
        input = chat.revert() ?? input;
      }}
    >
      Undo send
    </button>
  </div>
{/if}

This is different from abort(): abort() stops streaming but leaves the messages in place. revert() removes them entirely.
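The contract can again be modeled on a plain array — a sketch of what revert() promises, not the implementation:

```typescript
// revert(): drop the trailing user+assistant pair and hand back the user's
// text; return null text when there is no such pair to remove.
interface Msg {
  role: "user" | "assistant";
  content: string;
}

function revert(history: Msg[]): { messages: Msg[]; text: string | null } {
  const last = history[history.length - 1];
  const prev = history[history.length - 2];
  if (!last || !prev || prev.role !== "user" || last.role !== "assistant")
    return { messages: history, text: null }; // nothing to revert
  return { messages: history.slice(0, -2), text: prev.content };
}

const { messages, text } = revert([
  { role: "user", content: "Hi" },
  { role: "assistant", content: "Hel" }, // partial reply from an aborted stream
]);
console.log(messages.length, text); // 0 Hi
```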

System prompt per session

Pass system to set a persistent instruction for the whole conversation:

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";

  const chat = new Chat({
    model: "smart",
    system: "You are a concise technical assistant. Reply in plain text only.",
  });
</script>

Framework access patterns

ts
import { Chat, fileToAttachment } from "@aibind/sveltekit";
import type {
  Attachment,
  ChatMessage,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/sveltekit";
// Instantiate in <script> — lifecycle tied to component
const chat = new Chat({ model: "smart" });
ts
import { useChat } from "@aibind/nextjs";
import { fileToAttachment } from "@aibind/core";
import type {
  Attachment,
  ChatMessage,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/nextjs";
ts
import { useChat } from "@aibind/react-router";
import { fileToAttachment } from "@aibind/core";
import type {
  Attachment,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/react-router";
ts
import { useChat } from "@aibind/tanstack-start";
import { fileToAttachment } from "@aibind/core";
import type {
  Attachment,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/tanstack-start";
ts
import { useChat } from "@aibind/nuxt";
import { fileToAttachment } from "@aibind/core";
import type {
  Attachment,
  ChatMessage,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/nuxt";
ts
import { useChat } from "@aibind/solidstart";
import { fileToAttachment } from "@aibind/core";
import type {
  Attachment,
  ChatMessage,
  ChatSendOptions,
  StagedMessage,
} from "@aibind/solidstart";

Conversation title generation

Generate a short title from the conversation — like ChatGPT or Claude do automatically — with generateTitle(). The title streams in live, character by character.

Auto-generate after the first turn

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";

  const chat = new Chat({
    model: "smart",
    autoTitle: true, // fires automatically after the first completed response
  });
</script>

<!-- title streams in live; falls back to static heading until generated -->
<h2>
  {chat.title ?? "New conversation"}
  {#if chat.titleLoading}<span class="typing-cursor">|</span>{/if}
</h2>

Manually at any point

svelte
<script lang="ts">
  import { Chat } from "@aibind/sveltekit";

  const chat = new Chat({ model: "smart" });
</script>

<button onclick={() => chat.generateTitle()}>
  {chat.titleLoading ? "Generating…" : "Generate title"}
</button>
<h2>{chat.title ?? "Untitled"}</h2>

generateTitle() can be called at any point — after the first message, mid-conversation, or to refresh a stale title.

API

| Property / Method | Type | Description |
| --- | --- | --- |
| chat.title | string \| null | The generated title. null until first generation. Streams in live. |
| chat.titleLoading | boolean | true while the title is generating. |
| chat.generateTitle(opts?) | Promise<void> | Generate (or regenerate) the title from the current messages. |
| autoTitle option | boolean | Auto-call generateTitle() after the first completed turn. |
| titleEndpoint option | string | Custom endpoint. Defaults to /__aibind__/title. |

How it works

The /__aibind__/title endpoint is registered automatically by createStreamHandler — no extra setup needed. It sends up to the first 6 messages to the model with a terse system prompt ("2–6 words, no punctuation, no quotes") and streams the result back as plain text. generateTitle() consumes the stream and writes each chunk into chat.title reactively.
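The request a title generation builds can be sketched from that description — the prompt wording and the six-message cap come from the paragraph above, and the exact server behavior may differ:

```typescript
// Sketch: cap the history at the first six messages and pair it with a
// terse title-writing system prompt.
interface Msg {
  role: string;
  content: string;
}

function titleRequest(messages: Msg[]) {
  return {
    system: "Summarize the conversation in 2-6 words, no punctuation, no quotes.",
    messages: messages.slice(0, 6), // only the first six messages are sent
  };
}

const long = Array.from({ length: 10 }, (_, i) => ({
  role: "user",
  content: `message ${i}`,
}));
console.log(titleRequest(long).messages.length); // 6
```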

autoTitle: true fires once — after the first completed turn only. Calling generateTitle() manually can refresh the title at any time.

Released under the MIT License.