
Class: ChatAnthropic

chat_models.ChatAnthropic

Wrapper around Anthropic large language models.

To use this class, you should have the @anthropic-ai/sdk package installed and the ANTHROPIC_API_KEY environment variable set.

Remarks

Any parameters that are valid to be passed to anthropic.complete can be passed through invocationKwargs, even if not explicitly available on this class.
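As a sketch of this pass-through (an assumption about how the merge works, not library source), fields exposed on the class and any extra invocationKwargs entries end up in the same request body sent to anthropic.complete:

```typescript
// Sketch (assumption, not library source): extra anthropic.complete
// parameters supplied via invocationKwargs are merged into the request body
// alongside the fields this class exposes explicitly.
const fields = {
  modelName: "claude-v1",
  temperature: 1,
  maxTokensToSample: 2048,
  invocationKwargs: { top_k: 5 }, // any extra anthropic.complete parameter
};

const requestBody = {
  model: fields.modelName,
  temperature: fields.temperature,
  max_tokens_to_sample: fields.maxTokensToSample,
  ...fields.invocationKwargs,
};
// requestBody now carries top_k even though ChatAnthropic has no top_k field
```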

Hierarchy

BaseChatModel
  ↳ ChatAnthropic

Implements

AnthropicInput

Constructors

constructor

new ChatAnthropic(fields?)

Parameters

fields?: Partial<AnthropicInput> & BaseLanguageModelParams & { anthropicApiKey?: string }

Overrides

BaseChatModel.constructor

Defined in

langchain/src/chat_models/anthropic.ts:130

Properties

apiKey

Optional apiKey: string

Anthropic API key

Implementation of

AnthropicInput.apiKey

Defined in

langchain/src/chat_models/anthropic.ts:106


callbackManager

callbackManager: CallbackManager

Inherited from

BaseChatModel.callbackManager

Defined in

langchain/src/base_language/index.ts:34


caller

Protected caller: AsyncCaller

The async caller should be used by subclasses to make any async calls, which will thus benefit from the concurrency and retry logic.

Inherited from

BaseChatModel.caller

Defined in

langchain/src/base_language/index.ts:40


invocationKwargs

Optional invocationKwargs: Kwargs

Holds any additional parameters that are valid to pass to anthropic.complete that are not explicitly specified on this class.

Implementation of

AnthropicInput.invocationKwargs

Defined in

langchain/src/chat_models/anthropic.ts:118


maxTokensToSample

maxTokensToSample: number = 2048

The maximum number of tokens to generate before stopping.

Implementation of

AnthropicInput.maxTokensToSample

Defined in

langchain/src/chat_models/anthropic.ts:114


modelName

modelName: string = "claude-v1"

Model name to use

Implementation of

AnthropicInput.modelName

Defined in

langchain/src/chat_models/anthropic.ts:116


stopSequences

Optional stopSequences: string[]

A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that is the cue for the next human turn in the dialogue.

Implementation of

AnthropicInput.stopSequences

Defined in

langchain/src/chat_models/anthropic.ts:120
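To see why "\n\nHuman:" is the natural stop sequence, here is a sketch of how a chat transcript is flattened into Anthropic's Human/Assistant completion format (the helper below is hypothetical, not part of the library):

```typescript
// Hypothetical helper: flatten chat turns into Anthropic's Human/Assistant
// prompt format. Once the model starts emitting a new "\n\nHuman:" turn,
// generation should stop; hence stopSequences: ["\n\nHuman:"].
type Turn = { role: "human" | "ai"; text: string };

function toAnthropicPrompt(turns: Turn[]): string {
  const body = turns
    .map((t) =>
      t.role === "human" ? `\n\nHuman: ${t.text}` : `\n\nAssistant: ${t.text}`
    )
    .join("");
  return `${body}\n\nAssistant:`; // trailing cue for the model's reply
}

const prompt = toAnthropicPrompt([{ role: "human", text: "Tell me a joke." }]);
// prompt === "\n\nHuman: Tell me a joke.\n\nAssistant:"
```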


streaming

streaming: boolean = false

Whether to stream the results or not

Implementation of

AnthropicInput.streaming

Defined in

langchain/src/chat_models/anthropic.ts:122


temperature

temperature: number = 1

Amount of randomness injected into the response. Ranges from 0 to 1. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks.

Implementation of

AnthropicInput.temperature

Defined in

langchain/src/chat_models/anthropic.ts:108


topK

topK: number = -1

Only sample from the top K options for each subsequent token. Used to remove "long tail" low probability responses. Defaults to -1, which disables it.

Implementation of

AnthropicInput.topK

Defined in

langchain/src/chat_models/anthropic.ts:110


topP

topP: number = -1

Performs nucleus sampling: the cumulative distribution over all options for each subsequent token is computed in decreasing probability order and cut off once it reaches the probability specified by top_p. Defaults to -1, which disables it. Note that you should alter either temperature or top_p, but not both.

Implementation of

AnthropicInput.topP

Defined in

langchain/src/chat_models/anthropic.ts:112
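A small sketch of the -1 sentinel: values left at -1 can be treated as absent when assembling the sampling parameters. The filtering below is an illustration of the idea, not the library's actual code:

```typescript
// Illustration only: drop sampling knobs left at their -1 "disabled" sentinel
// before building the request, so neither top_k nor top_p constrains sampling.
const sampling: Record<string, number> = { temperature: 1, top_k: -1, top_p: -1 };

const active = Object.fromEntries(
  Object.entries(sampling).filter(([, value]) => value !== -1)
);
// active is { temperature: 1 }; top_k and top_p are omitted entirely
```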


verbose

verbose: boolean

Whether to print out response text.

Inherited from

BaseChatModel.verbose

Defined in

langchain/src/base_language/index.ts:32

Methods

_combineLLMOutput

_combineLLMOutput(): never[]

Returns

never[]

Overrides

BaseChatModel._combineLLMOutput

Defined in

langchain/src/chat_models/anthropic.ts:296


_generate

_generate(messages, stopSequences?): Promise<ChatResult>

Calls out to Anthropic's completion endpoint with the provided messages.

Example

import { ChatAnthropic } from "langchain/chat_models";
import { HumanChatMessage } from "langchain/schema";

const model = new ChatAnthropic();
const response = await model.call([new HumanChatMessage("Tell me a joke.")]);

Parameters

messages: BaseChatMessage[]
The messages to pass into the model.

stopSequences?: string[]
Optional list of stop sequences to use when generating.

Returns

Promise<ChatResult>

The full LLM output.

Overrides

BaseChatModel._generate

Defined in

langchain/src/chat_models/anthropic.ts:222


_identifyingParams

_identifyingParams(): Object

Get the identifying parameters of the LLM.

Returns

Object

model_name: string

Overrides

BaseChatModel._identifyingParams

Defined in

langchain/src/chat_models/anthropic.ts:177


_llmType

_llmType(): string

Returns

string

Overrides

BaseChatModel._llmType

Defined in

langchain/src/chat_models/anthropic.ts:292


_modelType

_modelType(): string

Returns

string

Inherited from

BaseChatModel._modelType

Defined in

langchain/src/chat_models/base.ts:76


call

call(messages, stop?): Promise<BaseChatMessage>

Parameters

messages: BaseChatMessage[]
stop?: string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.call

Defined in

langchain/src/chat_models/base.ts:97


callPrompt

callPrompt(promptValue, stop?): Promise<BaseChatMessage>

Parameters

promptValue: BasePromptValue
stop?: string[]

Returns

Promise<BaseChatMessage>

Inherited from

BaseChatModel.callPrompt

Defined in

langchain/src/chat_models/base.ts:106


generate

generate(messages, stop?): Promise<LLMResult>

Parameters

messages: BaseChatMessage[][]
stop?: string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generate

Defined in

langchain/src/chat_models/base.ts:39


generatePrompt

generatePrompt(promptValues, stop?): Promise<LLMResult>

Parameters

promptValues: BasePromptValue[]
stop?: string[]

Returns

Promise<LLMResult>

Inherited from

BaseChatModel.generatePrompt

Defined in

langchain/src/chat_models/base.ts:82


getNumTokens

getNumTokens(text): Promise<number>

Parameters

text: string

Returns

Promise<number>

Inherited from

BaseChatModel.getNumTokens

Defined in

langchain/src/base_language/index.ts:62


identifyingParams

identifyingParams(): Object

Get the identifying parameters for the model

Returns

Object

model_name: string

Defined in

langchain/src/chat_models/anthropic.ts:187


invocationParams

invocationParams(): Omit<SamplingParameters, "prompt"> & Kwargs

Get the parameters used to invoke the model

Returns

Omit<SamplingParameters, "prompt"> & Kwargs

Defined in

langchain/src/chat_models/anthropic.ts:164


serialize

serialize(): SerializedLLM

Return a JSON-like object representing this LLM.

Returns

SerializedLLM

Inherited from

BaseChatModel.serialize

Defined in

langchain/src/base_language/index.ts:92


deserialize

Static deserialize(data): Promise<BaseLanguageModel>

Load an LLM from a JSON-like object describing it.

Parameters

data: SerializedLLM

Returns

Promise<BaseLanguageModel>

Inherited from

BaseChatModel.deserialize

Defined in

langchain/src/base_language/index.ts:103