Wrapper around Minimax large language models that use the Chat endpoint.
To use, you should have the MINIMAX_GROUP_ID and MINIMAX_API_KEY
environment variables set.
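A minimal sketch of setting those credentials in a shell session before running your program (the values shown are placeholders, not real credentials):

```shell
# Hypothetical placeholder values; substitute your actual Minimax credentials.
export MINIMAX_GROUP_ID="your-group-id"
export MINIMAX_API_KEY="your-api-key"
```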
Example
// Note: import paths may vary by LangChain version.
import {
  ChatPromptTemplate,
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
} from "langchain/prompts";
import { LLMChain } from "langchain/chains";
import { ChatMinimax } from "langchain/chat_models/minimax";

// Define a chat prompt with a system message setting the context for translation
const chatPrompt = ChatPromptTemplate.fromMessages([
  SystemMessagePromptTemplate.fromTemplate(
    "You are a helpful assistant that translates {input_language} to {output_language}.",
  ),
  HumanMessagePromptTemplate.fromTemplate("{text}"),
]);

// Create a new LLMChain with the chat model and the defined prompt
const chainB = new LLMChain({
  prompt: chatPrompt,
  llm: new ChatMinimax({ temperature: 0.01 }),
});

// Call the chain with the input language, output language, and the text to translate
const resB = await chainB.call({
  input_language: "English",
  output_language: "Chinese",
  text: "I love programming.",
});