I'm encountering an issue where the path
module is not being found in my Node.js environment. The problem occurs when I try to send a message in the chatbox.
Module not found: Can't resolve 'path'
https://nextjs.org/docs/messages/module-not-found
Import trace for requested module:
./node_modules/dotenv/config.js
./src/lib/db/index.ts
./src/app/api/chat/route.ts
./node_modules/next/dist/build/webpack/loaders/next-edge-app-route-loader/index.js?absolutePagePath=C%3A%5CUsers%5CDell%5COneDrive%5CDesktop%5C100xDevs%5Csummarize-my-pdf-ai%5Csrc%5Capp%5Capi%5Cchat%5Croute.ts&page=%2Fapi%2Fchat%2Froute&appDirLoader=bmV4dC1hcHAtbG9hZGVyP25hbWU9YXBwJTJGYXBpJTJGY2hhdCUyRnJvdXRlJnBhZ2U9JTJGYXBpJTJGY2hhdCUyRnJvdXRlJmFwcFBhdGhzPSZwYWdlUGF0aD1wcml2YXRlLW5leHQtYXBwLWRpciUyRmFwaSUyRmNoYXQlMkZyb3V0ZS50cyZhcHBEaXI9QyUzQSU1Q1VzZXJzJTVDRGVsbCU1Q09uZURyaXZlJTVDRGVza3RvcCU1QzEwMHhEZXZzJTVDc3VtbWFyaXplLW15LXBkZi1haSU1Q3NyYyU1Q2FwcCZwYWdlRXh0ZW5zaW9ucz10c3gmcGFnZUV4dGVuc2lvbnM9dHMmcGFnZUV4dGVuc2lvbnM9anN4JnBhZ2VFeHRlbnNpb25zPWpzJnJvb3REaXI9QyUzQSU1Q1VzZXJzJTVDRGVsbCU1Q09uZURyaXZlJTVDRGVza3RvcCU1QzEwMHhEZXZzJTVDc3VtbWFyaXplLW15LXBkZi1haSZpc0Rldj10cnVlJnRzY29uZmlnUGF0aD10c2NvbmZpZy5qc29uJmJhc2VQYXRoPSZhc3NldFByZWZpeD0mbmV4dENvbmZpZ091dHB1dD0mcHJlZmVycmVkUmVnaW9uPSZtaWRkbGV3YXJlQ29uZmlnPWUzMCUzRCE%3D&nextConfigOutput=&preferredRegion=&middlewareConfig=e30%3D!
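From the trace, the failing import is dotenv/lib/main.js, which resolves and reads the .env file from disk using Node built-ins. A simplified sketch of that dependency (my assumption for illustration — this is not dotenv's actual source):

```typescript
// Simplified sketch of what dotenv does at import time (an illustration,
// not dotenv's actual source). It relies on Node's "path" and "fs"
// built-ins, which the Edge runtime does not provide.
import * as path from "path";
import * as fs from "fs";

function resolveEnvPath(cwd: string): string {
  // path.resolve is the kind of call that fails to bundle under runtime = "edge"
  return path.resolve(cwd, ".env");
}

const envPath = resolveEnvPath(process.cwd());
console.log(envPath.endsWith(".env")); // true
console.log(typeof fs.existsSync(envPath)); // "boolean"
```

Under the Node.js runtime this sketch runs fine; under the Edge runtime the "path" import itself cannot be resolved, which matches the error above.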
route.ts
import { getContext } from "@/lib/context";
import { db } from "@/lib/db";
import { chats, messages as _messages } from "@/lib/db/schema";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { eq } from "drizzle-orm";
import { NextResponse } from "next/server";
import { Message } from "ai/react";
export const runtime = "edge";
export async function POST(req: Request) {
  console.log("Called api");
  try {
    const { messages, chatId } = await req.json();
    console.log("Messages:", messages);
    console.log("Chat ID:", chatId);
    const _chats = await db.select().from(chats).where(eq(chats.id, chatId));
    // Log retrieved chats
    console.log("_chats:", _chats);
    if (_chats.length !== 1) {
      return NextResponse.json({ error: "Chat not found" }, { status: 404 });
    }
    const fileKey = _chats[0].fileKey;
    const lastMessage = messages[messages.length - 1];
    // Log fileKey and lastMessage
    console.log("File Key:", fileKey);
    console.log("Last Message:", lastMessage);
    const context = await getContext(lastMessage.content, fileKey);
    console.log("Context:", context);
    const prompt = {
      role: "system",
      content: `AI assistant is a brand new, powerful, human-like artificial intelligence.
The traits of AI include expert knowledge, helpfulness, cleverness, and articulateness.
AI is a well-behaved and well-mannered individual.
AI is always friendly, kind, and inspiring, and is eager to provide vivid and thoughtful responses to the user.
AI has the sum of all knowledge in its brain, and is able to accurately answer nearly any question about any topic in conversation.
AI assistant is a big fan of Pinecone and Vercel.
START CONTEXT BLOCK
${context}
END OF CONTEXT BLOCK
AI assistant will take into account any CONTEXT BLOCK that is provided in a conversation.
If the context does not provide the answer to the question, the AI assistant will say, "I'm sorry, but I don't know the answer to that question".
AI assistant will not apologize for previous responses, but instead will indicate that new information was gained.
AI assistant will not invent anything that is not drawn directly from the context.
`,
    };
    const response = await streamText({
      model: openai("gpt-4o-mini"),
      messages: [
        prompt,
        ...messages.filter((message: Message) => message.role === "user"),
      ],
    });
    return response.toDataStreamResponse();
  } catch (error) {
    console.log(error);
    return NextResponse.json(
      { error: "Internal Server Error" },
      { status: 500 }
    );
  }
}
ChatComponent.tsx
"use client";
import React from "react";
import { Input } from "./ui/input";
import { useChat } from "ai/react";
import { Button } from "./ui/button";
import { SendIcon } from "lucide-react";
import MessageList from "./MessageList";
type Props = { chatId: number };
const ChatComponent = ({ chatId }: Props) => {
  console.log("Chat ID in ChatComponent:", chatId);
  const { input, handleInputChange, handleSubmit, messages } = useChat({
    api: "/api/chat",
    body: {
      chatId,
    },
  });
  // React.useEffect(() => {
  //   const messageContainer = document.getElementById("message-container");
  //   if (messageContainer) {
  //     messageContainer.scrollTo({
  //       top: messageContainer.scrollHeight,
  //       behavior: "smooth",
  //     });
  //   }
  // }, [messages]);
  return (
    <div
      className="relative max-h-screen overflow-scroll"
      id="message-container"
    >
      {/* Header */}
      <div className="sticky top-0 inset-x-0 p-2 bg-white h-fit">
        <h3 className="text-xl font-bold">Chat</h3>
      </div>
      {/* Message List */}
      <MessageList messages={messages} />
      <form
        onSubmit={handleSubmit}
        className="sticky bottom-0 px-2 py-4 inset-x-0 bg-white"
      >
        <div className="flex">
          <Input
            value={input}
            onChange={handleInputChange}
            placeholder="Ask any question..."
            className="w-full"
          />
          <Button className="bg-gradient-to-r from-sky-400 to-blue-500 ml-2">
            <SendIcon className="h-4 w-4" />
          </Button>
        </div>
      </form>
    </div>
  );
};
export default ChatComponent;
PS C:\Users\Dell\OneDrive\Desktop\100xDevs\summarize-my-pdf-ai> node -v
v18.17.1
PS C:\Users\Dell\OneDrive\Desktop\100xDevs\summarize-my-pdf-ai>
This is the console output:
GET /chat/9?_rsc=a12k2 200 in 227ms
GET /chat/8?_rsc=18zah 200 in 383ms
○ Compiling /api/chat ...
⨯ ./node_modules/dotenv/lib/main.js:2:1
Module not found: Can't resolve 'path'
https://nextjs.org/docs/messages/module-not-found
Import trace for requested module:
./node_modules/dotenv/config.js
./src/lib/db/index.ts
./src/app/api/chat/route.ts
When using the following simplified code, I am able to get a response from the chatbox:
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { NextResponse } from "next/server";
export const runtime = "edge";
export async function POST(req: Request) {
  try {
    const { messages } = await req.json();
    const response = await streamText({
      model: openai("gpt-4o-mini"),
      messages,
    });
    return response.toDataStreamResponse();
  } catch (error) {
    console.log(error);
    return NextResponse.json(
      { error: "Internal Server Error" },
      { status: 500 }
    );
  }
}
This should give clear context for the issue and suggests that the problem is related to configuration or dependencies rather than to the core functionality of fetching chat responses.
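One workaround I considered (an assumption on my part: Next.js loads .env and .env.local into process.env automatically, so dotenv is typically only needed for standalone scripts such as drizzle-kit) is dropping the dotenv/config import from src/lib/db/index.ts entirely and reading process.env directly, which would avoid the Node-only "path" dependency:

```typescript
// Hypothetical sketch for src/lib/db/index.ts without "dotenv/config".
// Inside Next.js, .env/.env.local are loaded automatically, so process.env
// can be read directly; requireEnv is a hypothetical helper that fails
// loudly when a variable is missing.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

process.env.EXAMPLE_KEY = "demo"; // hypothetical key for illustration
console.log(requireEnv("EXAMPLE_KEY")); // demo
// e.g. const db = drizzle(requireEnv("DATABASE_URL")); // hypothetical usage
```

I have not verified this against my actual db setup, but it would remove dotenv from the Edge bundle's import trace.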
The full route.ts and ChatComponent.tsx code is shown above. I attempted to implement chat functionality in my Next.js application using the streamText function from the ai package and the openai model. Specifically, I used the following code to handle POST requests and fetch chat responses:
(This is the same simplified route handler shown above.)
I expected this implementation to correctly process incoming messages and return a valid response from the chatbox.
What actually resulted?
Although the code runs without errors and returns a response in the simplified setup, I ran into problems with the more complex implementation, which adds logic such as database interactions and context handling. In that case I receive a 500 Internal Server Error and the response is not as expected. The simplified code works as intended and returns the expected chat responses.
Update: I removed this line and it worked!
export const runtime = "edge";
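For anyone hitting the same thing: dotenv needs Node built-ins like "path" and "fs", so it cannot be bundled for the Edge runtime. Staying on the Node.js runtime (the default for App Router route handlers) is the simplest fix; a sketch of the relevant line in route.ts:

```typescript
// route.ts — use the Node.js runtime so the dotenv -> "path" import chain
// can resolve. Either declare it explicitly:
export const runtime = "nodejs";
// ...or omit the runtime export entirely, since "nodejs" is the default
// for App Router route handlers.
```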