Found a gem, AI SDK, from Vercel's workshop "Building AI apps with the AI SDK and Next.js" a couple of days ago. It has a unified provider API, built-in streaming[^1], and, most important of all, the ability to transform model output into type-safe objects. Better yet, we can easily iterate on the output by injecting prompts through the schema:
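For instance, a minimal sketch of AI SDK's `generateObject` (the model name and schema fields here are illustrative), which validates model output against a Zod schema and hands back a typed object:

```typescript
import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Describe the shape we want back; .describe() doubles as an inline prompt
// hint, so iterating on the output is as easy as editing the schema.
const { object } = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(z.string()).describe('ingredients with quantities'),
      steps: z.array(z.string()),
    }),
  }),
  prompt: 'Generate a lasagna recipe.',
});

// `object` is fully typed: object.recipe.ingredients is string[]
console.log(object.recipe.name);
```

Because the schema is the contract, tweaking a `.describe()` hint is often enough to reshape the output without rewriting the prompt itself.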
Self-Paced Workshop
The workshop material is available at Build an AI App[^2]. It covers the common uses for AI: extracting data, classifying data, and building a chatbot.
Bonus - Developing An AI App
Tool Calling
No language model is one-size-fits-all (for now, at least) or applicable to every use case. This is where language model tool calls[^3] come in.
Take fetching real-time weather for example: we can "tell" the model to fetch weather data from an endpoint, then get back to us[^4]. If we want the tool call result to be sent back to the model for further processing, we can enable that by setting maxSteps.
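A minimal sketch of such a weather tool, assuming AI SDK's `tool` helper and `generateText` (the endpoint URL and model are placeholders):

```typescript
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools: {
    weather: tool({
      description: 'Get the current weather for a city',
      parameters: z.object({ city: z.string() }),
      // Hypothetical endpoint -- swap in a real weather API.
      execute: async ({ city }) => {
        const res = await fetch(
          `https://api.example.com/weather?city=${encodeURIComponent(city)}`,
        );
        return res.json();
      },
    }),
  },
  // Allow a second step so the tool result is fed back to the model
  // and turned into a natural-language answer.
  maxSteps: 2,
  prompt: 'What is the weather in Taipei right now?',
});

console.log(text);
```

With `maxSteps: 1` (the default) the call would stop at the raw tool result; raising it lets the model finish the conversation with the fetched data in context.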
Debug a Language Model Call, Implement Caching, or Sanitize the Generated Text
See AI SDK's Language Model Middleware.
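As a sketch, a logging middleware might look like the following, assuming `wrapLanguageModel` from the AI SDK (exact names vary between SDK versions; older releases exposed it as `experimental_wrapLanguageModel`):

```typescript
import { wrapLanguageModel, type LanguageModelV1Middleware } from 'ai';
import { openai } from '@ai-sdk/openai';

// Log every generate call. Caching or output sanitization would hook in
// the same way, by short-circuiting or rewriting the result here.
const logging: LanguageModelV1Middleware = {
  wrapGenerate: async ({ doGenerate, params }) => {
    console.log('LLM call params:', JSON.stringify(params));
    const result = await doGenerate();
    console.log('LLM call result:', result.text);
    return result;
  },
};

const model = wrapLanguageModel({
  model: openai('gpt-4o'),
  middleware: logging,
});

// `model` drops in anywhere a plain model is accepted,
// e.g. generateText({ model, prompt: '...' }).
```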
Feeling Thrifty? How About Hooking AI SDK up to a Local Language Model?
The Ollama Provider is here to help. See another post, Set up a Local LLM for Neovim on Mac, for setting up a local language model.
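Hooking it up is just a matter of swapping the provider; a sketch assuming the community `ollama-ai-provider` package and a locally pulled model:

```typescript
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// Assumes `ollama serve` is running locally and the model has been
// pulled beforehand, e.g. `ollama pull llama3.1`.
const { text } = await generateText({
  model: ollama('llama3.1'),
  prompt: 'Why is the sky blue?',
});

console.log(text);
```

Everything else — schemas, tools, middleware — stays the same, which is the payoff of the unified provider API.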
Pour Closure
LLM (large language model) is just an API.
I now see why it is said so in the workshop. With AI SDK, developing an AI app and prompt engineering become a clean and elegant process. And in the foreseeable future, English will become the new programming language.
Footnotes

[^1]: Not all models support streaming. See the AI SDK Provider support section for more.
[^2]: In case the site is not available, its Next repo fork can be found here.
[^3]: Not all models support tool calls. See the AI SDK Provider support section for more.
[^4]: In case the site is not available, its Next repo fork can be found here.