LanceDB Chatbot - Vercel Next.js Template

Use an AI chatbot with website context retrieved from a vector store like LanceDB. LanceDB is lightweight and can be embedded directly into Next.js, with data stored on-prem.
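To illustrate what "embedded directly into Next.js" means, here is a rough sketch of a server-side retrieval step using the vectordb npm package. The table name, data path, and the buildContext helper are illustrative placeholders, not the template's actual code:

```javascript
// Sketch of a server-side retrieval step (hypothetical names throughout).
// buildContext is an illustrative helper, not part of the template.

// Format retrieved rows into a context block for the chat prompt.
function buildContext(rows) {
  return rows.map((row, i) => `[${i + 1}] ${row.text}`).join('\n')
}

// Query LanceDB for the chunks nearest to a question's embedding.
async function retrieveContext(queryEmbedding) {
  const lancedb = require('vectordb') // loaded lazily, server-side only
  const db = await lancedb.connect('data/lancedb') // on-prem data directory
  const table = await db.openTable('website') // hypothetical table name
  const rows = await table.search(queryEmbedding).limit(3).execute()
  return buildContext(rows)
}

module.exports = { buildContext, retrieveContext }
```

The retrieved context is then prepended to the chat prompt before it is sent to the OpenAI API.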

One-click deploy on Vercel is available via the Deploy with Vercel button.

[Screenshot: demo website landing page]


First, rename .env.example to .env.local and fill in OPENAI_API_KEY with your OpenAI API key. You can create one from your OpenAI account dashboard.
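For example, from the project root (the key value shown is a placeholder; the fallback to touch is only there so the commands also work if the example file is absent):

```shell
# Create .env.local from the example file (or from scratch if it is missing)
cp .env.example .env.local 2>/dev/null || touch .env.local
# Add your OpenAI key; the value below is a placeholder
grep -q '^OPENAI_API_KEY=' .env.local || echo 'OPENAI_API_KEY=sk-your-key' >> .env.local
```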

Run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev

Open http://localhost:3000 with your browser to see the result.

This project uses next/font to automatically optimize and load Inter, a custom Google Font.

Learn More

To learn more about LanceDB or Next.js, take a look at their official documentation.

LanceDB on Next.js and Vercel

FYI: these configurations have been pre-implemented in this template.

Since LanceDB ships with a prebuilt native Node binary, you must configure next.config.js to exclude it from the webpack bundle. This is required both for running the app with Next.js and for deploying on Vercel.

/** @type {import('next').NextConfig} */
module.exports = {
  webpack(config) {
    config.externals.push({ vectordb: 'vectordb' })
    return config
  },
}

To deploy on Vercel, we need to make sure that Vercel's Node.js runtime static file analysis can find the native binary, since LanceDB resolves it through a dynamic import by default. We can do this by adding a vercel-build script to the scripts section of package.json.

  "scripts": {
    "vercel-build": "sed -i 's/nativeLib = require(`@lancedb\\/vectordb-\\${currentTarget()}`);/nativeLib = require(`@lancedb\\/vectordb-linux-x64-gnu`);/' node_modules/vectordb/native.js && next build",