Interface: ModelParams

llms.internal.ModelParams

Properties

frequencyPenalty

frequencyPenalty: number

Penalizes repeated tokens according to how frequently they have already appeared in the generated text

Defined in

langchain/src/llms/openai-chat.ts:22


logitBias

Optional logitBias: Record<string, number>

Dictionary mapping token IDs to bias values, used to adjust the probability of specific tokens being generated

Defined in

langchain/src/llms/openai-chat.ts:31
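As an illustration of the `logitBias` shape (the token IDs and bias values below are made-up examples; in the OpenAI API, values typically range from -100, which effectively bans a token, to 100, which strongly encourages it):

```typescript
// Hypothetical example values: keys are token IDs as strings,
// values are bias amounts added to the token's logits.
const logitBias: Record<string, number> = {
  "50256": -100, // effectively ban this token ID
  "1234": 5,     // mildly encourage this token ID
};
```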


n

n: number

Number of chat completions to generate for each prompt

Defined in

langchain/src/llms/openai-chat.ts:28


presencePenalty

presencePenalty: number

Penalizes tokens that have already appeared in the generated text, regardless of how often

Defined in

langchain/src/llms/openai-chat.ts:25


streaming

streaming: boolean

Whether to stream partial results back as they are generated

Defined in

langchain/src/llms/openai-chat.ts:34


temperature

temperature: number

Sampling temperature to use, between 0 and 2; higher values produce more random output. Defaults to 1

Defined in

langchain/src/llms/openai-chat.ts:16


topP

topP: number

Total probability mass of tokens to consider at each step, between 0 and 1, defaults to 1

Defined in

langchain/src/llms/openai-chat.ts:19
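Putting the properties above together, a minimal sketch of an object satisfying this interface (the field values shown are illustrative defaults, not taken from the source):

```typescript
// Interface shape reconstructed from the property list above.
interface ModelParams {
  frequencyPenalty: number;
  logitBias?: Record<string, number>; // optional
  n: number;
  presencePenalty: number;
  streaming: boolean;
  temperature: number;
  topP: number;
}

// Example values; 0 penalties, one completion, no streaming,
// temperature and topP at their documented defaults of 1.
const params: ModelParams = {
  frequencyPenalty: 0,
  n: 1,
  presencePenalty: 0,
  streaming: false,
  temperature: 1,
  topP: 1,
};
```

Note that `logitBias` is the only optional property; all others must be present for an object to satisfy the interface.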