Build an AI App with Ease on TypeScript

ai, frontend, javascript, nextjs, typescript

December 7, 2024  |  3 min read

Found a gem, AI SDK, from Vercel's workshop "Building AI apps with the AI SDK and Next.js" a couple of days ago. It has a unified provider API, built-in streaming[1], and, most important of all, the ability to transform model output into type-safe objects. Better yet, we can easily iterate on the output by injecting prompts through the schema:

// app/api/notifications/schema.ts; from https://sdk.vercel.ai/docs/ai-sdk-ui/object-generation#schema
import { z } from 'zod';
 
export const notificationSchema = z.object({
  notifications: z.array(
    // Use describe() to give the model a prompt for a specific field
    z.object({
      name: z.string().describe('Name of a fictional person.'),
      message: z.string().describe('Message. Do not use emojis or links.'),
    }),
  ),
});
// app/page.tsx; from https://sdk.vercel.ai/docs/ai-sdk-ui/object-generation#schema
'use client';
 
import { experimental_useObject as useObject } from 'ai/react';
import { notificationSchema } from './api/notifications/schema';
 
export default function Page() {
  const { object, submit } = useObject({
    api: '/api/notifications',
    schema: notificationSchema,
  });
 
  return (
    <>
      {/* Use submit() to provide the main prompt. */}
      <button onClick={() => submit('Messages during finals week.')}>
        Generate notifications
      </button>
 
      {object?.notifications?.map((notification, index) => (
        <div key={index}>
          <p>{notification?.name}</p>
          <p>{notification?.message}</p>
        </div>
      ))}
    </>
  );
}
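The docs pair this client with an API route. For completeness, here is a sketch of the server side, adapted from the same docs page (the model choice and prompt wording below follow that page; adjust to taste):

```typescript
// app/api/notifications/route.ts; adapted from the AI SDK object-generation docs
import { openai } from '@ai-sdk/openai';
import { streamObject } from 'ai';
import { notificationSchema } from './schema';

// Allow streaming responses up to 30 seconds.
export const maxDuration = 30;

export async function POST(req: Request) {
  const context = await req.json();

  // Stream partial objects that conform to notificationSchema.
  const result = streamObject({
    model: openai('gpt-4-turbo'),
    schema: notificationSchema,
    prompt: `Generate 3 notifications for a messages app in this context: ${context}`,
  });

  return result.toTextStreamResponse();
}
```

`useObject` on the client consumes this text stream and re-validates it against the shared schema, which is what makes the `object` value type-safe.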

Self-Paced Workshop

The workshop material is available at Build an AI App[2]. It covers the common uses for AI: extracting data, classifying data, and building a chatbot.

Bonus - Developing An AI App

Tool Calling

No language model is one-size-fits-all (for now, at least) or suitable for every use case. This is where language model tool calls[3] come in.

Take getting real-time weather, for example: we can "tell" the model to fetch weather data from an endpoint and then report back to us[4]. If we want the tool call result sent back to the model for further processing, we can enable that by setting maxSteps.
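The tool itself is just a typed function. Here is a minimal sketch (the function name, argument shape, and stubbed return value are all hypothetical; a real version would call an actual weather endpoint):

```typescript
// Sketch of a weather tool for AI SDK tool calling. In the real SDK you would
// declare it with tool() and a zod schema for its parameters, then hand it to
// streamText/generateText.
type WeatherArgs = { location: string };

export async function getWeather({ location }: WeatherArgs) {
  // A real implementation would fetch from a weather endpoint; stubbed here
  // so the sketch stays self-contained.
  return { location, temperatureC: 21, condition: 'partly cloudy' };
}
```

Wired into the SDK, this becomes roughly `streamText({ model, tools: { weather: tool({ parameters, execute: getWeather }) }, maxSteps: 2, ... })`, where `maxSteps: 2` is what lets the tool result flow back to the model for a final answer.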

Debug a Language Model Call, Implement Caching, or Sanitize the Generated Text

See AI SDK's Language Model Middleware.
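As a taste of what that middleware hook looks like, here is a minimal caching sketch shaped like the `wrapGenerate` hook (the real type is `LanguageModelV1Middleware` from the `ai` package; it is left inlined here so the sketch has no dependencies):

```typescript
// Minimal caching middleware sketch, shaped like the AI SDK's wrapGenerate
// hook: `params` identifies the call, `doGenerate` performs the actual
// model call.
const cache = new Map<string, unknown>();

export const cacheMiddleware = {
  wrapGenerate: async ({
    doGenerate,
    params,
  }: {
    doGenerate: () => Promise<unknown>;
    params: unknown;
  }) => {
    const key = JSON.stringify(params);
    if (cache.has(key)) return cache.get(key); // cache hit: skip the model call
    const result = await doGenerate();
    cache.set(key, result);
    return result;
  },
};
```

In the SDK this would be attached with `wrapLanguageModel({ model, middleware: cacheMiddleware })` (exported as `experimental_wrapLanguageModel` in some versions), and the same hook is where you would log calls for debugging or sanitize generated text.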

Feeling Thrifty? How About Hooking the AI SDK to a Local Language Model?

The Ollama Provider is here to help. See another post, Set up a Local LLM for Neovim on Mac, for setting up a local language model.
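Swapping the provider is a small change; a sketch, assuming the community `ollama-ai-provider` package and a local Ollama server with a model already pulled:

```typescript
// Sketch: point the AI SDK at a local Ollama model instead of a hosted one.
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('llama3.1'), // any model you have pulled with `ollama pull`
  prompt: 'Why is the sky blue?',
});
```

Everything else in the app, such as schemas, tools, and middleware, stays the same; that is the payoff of the unified provider API.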

Pour Closure

An LLM (large language model) is just an API.

I now see why it was said so in the workshop. With the AI SDK, developing an AI app and prompt engineering become a clean and elegant process. And in the foreseeable future, English may well become the new programming language.


Footnotes

  1. Not all models support streaming. See AI SDK Provider support section for more.

  2. In case the site is not available, its Next repo fork can be found here.

  3. Not all models support tool calls. See AI SDK Provider support section for more.

  4. In case the site is not available, its Next repo fork can be found here.