Integration with the Ollama SDK.
Example

```typescript
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  model: "llama3", // Default model.
});

const result = await model.invoke([
  "human",
  "What is a good name for a company that makes colorful socks?",
]);
console.log(result);
```
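The same model instance also supports streaming, so tokens can be consumed as they are generated. The sketch below is a minimal illustration rather than part of the example above: it assumes a local Ollama server at its default address with the `llama3` model already pulled, and the prompt and `temperature` value are placeholders.

```typescript
import { ChatOllama } from "@langchain/ollama";

// Minimal streaming sketch. Assumes a local Ollama server (default
// http://localhost:11434) with the "llama3" model already pulled.
const model = new ChatOllama({
  model: "llama3",
  temperature: 0, // Placeholder value; adjust as needed.
});

// `stream()` yields AIMessageChunk objects as tokens arrive.
const stream = await model.stream(
  "What is a good name for a company that makes colorful socks?"
);

let output = "";
for await (const chunk of stream) {
  output += chunk.content; // Chunk content is plain text for this model.
}
console.log(output);
```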