Interface: AnthropicInput

Input to the ChatAnthropic class.
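
A sketch of the interface shape, reconstructed from the member list on this page; the authoritative definition lives in langchain/src/chat_models/anthropic.ts. Kwargs is assumed here to be a plain record of extra parameters.

```typescript
// Reconstructed sketch, not the library source. `Kwargs` as a plain
// record is an assumption.
type Kwargs = Record<string, unknown>;

// Base parameters this interface inherits (see "Inherited from
// ModelParams.*" on the entries below).
interface ModelParams {
  maxTokensToSample: number;
  stopSequences?: string[];
  streaming?: boolean;
  temperature?: number;
  topK?: number;
  topP?: number;
}

interface AnthropicInput extends ModelParams {
  apiKey?: string;
  modelName: string;
  invocationKwargs?: Kwargs;
}
```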

Hierarchy

ModelParams
  ↳ AnthropicInput

Implemented by

ChatAnthropic

Properties

apiKey

Optional apiKey: string

Anthropic API key

Defined in

langchain/src/chat_models/anthropic.ts:75
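
A minimal sketch of supplying the key directly, using the AnthropicInput shape sketched above. The fallback to the ANTHROPIC_API_KEY environment variable when this field is omitted is an assumption about the implementing class, not something stated on this page.

```typescript
// Hypothetical values throughout. When `apiKey` is omitted, implementations
// typically read ANTHROPIC_API_KEY from the environment (an assumption).
const input: AnthropicInput = {
  apiKey: process.env.ANTHROPIC_API_KEY,
  modelName: "claude-v1", // example model name
  maxTokensToSample: 256,
};
```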


invocationKwargs

Optional invocationKwargs: Kwargs

Holds any additional parameters that are valid to pass to anthropic.complete and that are not explicitly specified on this class.

Defined in

langchain/src/chat_models/anthropic.ts:84
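
A sketch of passing an extra anthropic.complete parameter through invocationKwargs, using the AnthropicInput shape sketched above. That these values are forwarded alongside the explicitly modeled fields follows from the description; the specific metadata parameter shown is only an illustration.

```typescript
// `metadata` is an illustrative extra parameter, not one documented on
// this page; it is assumed to be forwarded verbatim to anthropic.complete.
const input: AnthropicInput = {
  modelName: "claude-v1",
  maxTokensToSample: 256,
  invocationKwargs: { metadata: { user_id: "example-user" } },
};
```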


maxTokensToSample

maxTokensToSample: number

The maximum number of tokens to generate before stopping.

Inherited from

ModelParams.maxTokensToSample

Defined in

langchain/src/chat_models/anthropic.ts:57


modelName

modelName: string

Model name to use

Defined in

langchain/src/chat_models/anthropic.ts:78


stopSequences

Optional stopSequences: string[]

A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog.

Inherited from

ModelParams.stopSequences

Defined in

langchain/src/chat_models/anthropic.ts:63
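
The stop sequence suggested in the description above, spelled out using the AnthropicInput shape sketched earlier; generation halts when the model would begin the next Human turn.

```typescript
const input: AnthropicInput = {
  modelName: "claude-v1",
  maxTokensToSample: 256,
  stopSequences: ["\n\nHuman:"], // stop before the next "Human:" turn
};
```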


streaming

Optional streaming: boolean

Whether to stream the results or not

Inherited from

ModelParams.streaming

Defined in

langchain/src/chat_models/anthropic.ts:66
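
A sketch with streaming enabled, again using the AnthropicInput shape above. How streamed tokens are surfaced (for example via callbacks on the implementing chat model) is outside this interface, so no handler is shown here.

```typescript
const input: AnthropicInput = {
  modelName: "claude-v1",
  maxTokensToSample: 256,
  streaming: true, // results arrive incrementally rather than all at once
};
```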


temperature

Optional temperature: number

Amount of randomness injected into the response. Ranges from 0 to 1. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks.

Inherited from

ModelParams.temperature

Defined in

langchain/src/chat_models/anthropic.ts:38


topK

Optional topK: number

Only sample from the top K options for each subsequent token. Used to remove "long tail" low-probability responses. Defaults to -1, which disables it.

Inherited from

ModelParams.topK

Defined in

langchain/src/chat_models/anthropic.ts:44


topP

Optional topP: number

Does nucleus sampling, in which we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. Defaults to -1, which disables it. Note that you should either alter temperature or top_p, but not both.

Inherited from

ModelParams.topP

Defined in

langchain/src/chat_models/anthropic.ts:54
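
Two hypothetical configurations following the guidance above, using the AnthropicInput shape sketched earlier: adjust temperature for most use cases, or topP, but not both at once.

```typescript
const analytical: AnthropicInput = {
  modelName: "claude-v1",
  maxTokensToSample: 256,
  temperature: 0.1, // near 0 for analytical / multiple-choice tasks
};

const creative: AnthropicInput = {
  modelName: "claude-v1",
  maxTokensToSample: 1024,
  topP: 0.95, // nucleus sampling in place of temperature, per the note above
  topK: 40,   // additionally trim the long tail; -1 would disable this
};
```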