Few companies have had a bigger impact on the frontend developer ecosystem in the 2020s than Vercel, steward of the popular React framework, Next.js. When I first wrote about Vercel, in July 2020, the company had just embraced the Jamstack trend and was liberally using the term “serverless” in its marketing. But with Jamstack on the decline and serverless less of a buzzword now, it’s no surprise that Vercel has latched onto the latest Next Big Thing: generative AI.
Vercel’s relatively new AI SDK has quickly gained traction among JavaScript developers — it’s currently seeing 40,000 weekly downloads on npm. The reason, of course, is the incredible popularity of AI applications in 2023. Vercel’s CEO Guillermo Rauch tweeted last week that “building AI apps is the #2 reason folks are signing up to @vercel these days, ahead of social/marketing & e-commerce, based on signup surveys.” (While he didn’t specify what was #1, a commenter said it was easy-to-deploy Next.js projects.)
What Is the Vercel AI SDK?
Vercel defines the SDK as an “interoperable, streaming-enabled, edge-ready software development kit for AI apps built with React and Svelte.” It supports React/Next.js and Svelte/SvelteKit, with support for Nuxt/Vue “coming soon.” [Update: Vercel has advised that Nuxt and Solid.js frameworks are both now supported.] On the LLM side of things, the SDK “includes first-class support for OpenAI, LangChain, and Hugging Face Inference.” To complement the SDK, Vercel also offers a playground that has over twenty LLMs on tap.
The appeal of the Vercel AI SDK is similar to what made Vercel so popular with JavaScript developers in the first place: it abstracts away the infrastructure piece of an application.
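To give a sense of how little plumbing is involved, here is a minimal sketch based on the pattern in the SDK's Next.js documentation at the time of writing: a streaming route handler on the server, paired with the SDK's useChat hook on the client. The route path, model name and component below are illustrative placeholders, not code from any app mentioned in this article.

```ts
// app/api/chat/route.ts -- a streaming chat endpoint (Next.js App Router)
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

export const runtime = 'edge'; // the SDK is designed to run in edge runtimes

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Ask OpenAI for a streaming chat completion.
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo', // placeholder model
    stream: true,
    messages,
  });

  // Pipe the tokens back to the browser as they arrive.
  return new StreamingTextResponse(OpenAIStream(response));
}
```

On the client, the SDK's React hook handles message state, form handling and incremental rendering of the streamed response:

```tsx
// app/page.tsx -- a chat UI wired to the /api/chat route above
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>{m.role}: {m.content}</p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Ask something" />
    </form>
  );
}
```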
So how does the SDK compare to existing LLM app stack tools, like LangChain? I checked with Rauch, who said that the Vercel AI SDK is “focused on helping devs build full, rich streaming user interfaces and applications with deep integration/support for frontend frameworks,” whereas “LangChain is focused on ETL [Extract, transform, and load] and prompt engineering.”
Rauch added that the AI SDK has an integration with LangChain. “Devs can use LangChain for prompt engineering and then use the AI SDK for streaming and rendering output in their applications,” he said, via X/Twitter direct message. He pointed me to the LangChain page in its documentation for further reference.
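The LangChain integration follows a similar shape. The sketch below is based on the pattern in the SDK's LangChain documentation; the import paths and helper names reflect the library versions current at the time of writing and may differ in newer releases.

```ts
// app/api/completion/route.ts -- LangChain handles the model call,
// the AI SDK handles streaming the output to the browser
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { HumanMessage } from 'langchain/schema';
import { LangChainStream, StreamingTextResponse } from 'ai';

export const runtime = 'edge';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // LangChainStream returns a readable stream plus callback handlers
  // that forward LangChain's token events into that stream.
  const { stream, handlers } = LangChainStream();

  const llm = new ChatOpenAI({ streaming: true });

  // Kick off the call without awaiting it, so tokens stream as they are generated.
  llm.call([new HumanMessage(prompt)], {}, [handlers]).catch(console.error);

  return new StreamingTextResponse(stream);
}
```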
Example AI App: Memorang
To show off its newfound AI prowess, Vercel held an AI Accelerator Demo Day this month. The overall winner was a startup called Memorang, an ed-tech platform described by Vercel as “a complete platform for building AI-powered courses & study apps for any subject.”
Memorang is currently in private beta, but its quick introduction on Demo Day gave us a glimpse of what an AI-based application looks like nowadays. The founder and CEO, Dr. Yermie Cohen, explained that Memorang was “built on the modern and evolving AI stack, including Vercel, much of which didn’t exist months ago.”
Memorang platform
The first part of Memorang is an “AI-powered headless CMS” called EdWrite, which makes heavy use of generative AI for content generation — in this case, for educational material. Cohen pointed out the scaling benefits of using AI for this type of content. “Your custom workflows are effectively a content cannon that you can aim and fire to build thousands of assessments,” he said.
Using this content, Memorang is able to provide customers (presumably education organizations) with “AI-powered web and mobile study apps that are composable and white labeled.” He then discussed some of the benefits to users of this approach. “When a user completes a study session they get a personalized AI analysis of the performance behavior and tips to improve,” he noted. “Then when reviewing their answers, our AI learning assistant helps them learn more and dig deeper into each practice question.”
Memorang EdWrite
The AI Engineer Stack
While Cohen didn’t discuss the tech stack Memorang is using to create its platform, you can get a clue from looking at the company’s current job vacancies. Specifically, check out these requirements for the job of Full-Stack AI Engineer:
Expertise in TypeScript/JavaScript
Advanced knowledge of best practices in prompt engineering
Completed projects using OpenAI +/- Langchain
Experience with vector databases and semantic search
Expertise in the serverless stack, including GraphQL
Deep understanding of NoSQL database design and access patterns
Frontend skills involving React (understanding of hooks, components)
University Degree (technical field)
The list of tools, libraries and frameworks for the role is as follows:
Langchain.js
AWS Lambda
Pinecone / Weaviate
DynamoDB / MongoDB
Neptune / Neo4j
React + React Native
GraphQL
Next.js
Clearly, React is a big part of building Memorang’s user interface and hooking into AI stack components like LLMs, vector databases and LangChain.
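Those requirements map onto a now-familiar pattern: embed your content, then rank it by similarity to a learner's query. The sketch below is purely illustrative, not Memorang's code; it uses OpenAI embeddings with an in-memory cosine-similarity search, which a production system would replace with a vector database such as Pinecone or Weaviate.

```ts
// A hypothetical illustration of the "embeddings + semantic search" piece of
// such a stack. Not Memorang's code; the in-memory search stands in for a
// vector database like Pinecone or Weaviate.
import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Embed a piece of text with an OpenAI embedding model.
async function embed(text: string): Promise<number[]> {
  const res = await openai.embeddings.create({
    model: 'text-embedding-ada-002',
    input: text,
  });
  return res.data[0].embedding;
}

// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank study items by semantic similarity to a learner's query.
export async function semanticSearch(query: string, items: string[]) {
  const [queryVec, itemVecs] = await Promise.all([
    embed(query),
    Promise.all(items.map(embed)),
  ]);
  return items
    .map((text, i) => ({ text, score: cosine(queryVec, itemVecs[i]) }))
    .sort((a, b) => b.score - a.score);
}
```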
Next Big Thing
For those developers wanting to check out a publicly available AI app, Vercel has a Pokedex template that uses the following tools:
Postgres on Vercel
Prisma as the ORM [Object-relational mapping]
pgvector for vector similarity search
OpenAI embeddings
Built with Next.js App Router
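The interesting piece of that template is how the vector search is wired up. At the time of writing, Prisma had no native vector column type, so templates like this typically drop down to a raw SQL query against pgvector. The sketch below only illustrates that pattern; the table and column names are placeholders rather than the template's actual schema.

```ts
// A sketch of a pgvector similarity lookup via Prisma's raw SQL escape hatch.
// Table and column names are placeholders, not the template's actual schema.
import { PrismaClient } from '@prisma/client';
import OpenAI from 'openai';

const prisma = new PrismaClient();
const openai = new OpenAI();

export async function searchPokedex(query: string) {
  // Embed the search phrase with the same model used to embed the rows.
  const res = await openai.embeddings.create({
    model: 'text-embedding-ada-002',
    input: query,
  });
  const vector = `[${res.data[0].embedding.join(',')}]`;

  // "<=>" is pgvector's cosine-distance operator: smaller means more similar.
  return prisma.$queryRaw`
    SELECT id, name
    FROM pokemon
    ORDER BY embedding <=> ${vector}::vector
    LIMIT 5;
  `;
}
```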
But probably the best place to get started with the Vercel AI SDK is with Vercel’s quickstart documentation. It has instructions for both Next.js and SvelteKit. If you’re still looking for ideas, check out Vercel’s AI app templates and examples.
One final note: Vercel clearly isn’t finished rolling out its AI features yet, judging by a recent comment on X from its VP of Developer Experience, Lee Robinson.