Class: ChatAnthropic
Wrapper around Anthropic large language models.
To use, you should have the `@anthropic-ai/sdk` package installed and the `ANTHROPIC_API_KEY` environment variable set.
Remarks
Any parameters that are valid to be passed to `anthropic.complete` can be passed through `invocationKwargs`, even if not explicitly available on this class.
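For instance, extra request parameters can be forwarded without a dedicated field on the class. A minimal sketch (the `metadata` key is illustrative, and the `langchain/chat_models` import path is assumed from the source layout):

```typescript
import { ChatAnthropic } from "langchain/chat_models";

// Keys in invocationKwargs are merged into the parameters
// sent to anthropic.complete on every request.
const model = new ChatAnthropic({
  invocationKwargs: {
    // Illustrative pass-through parameter, not a field of this class.
    metadata: { user_id: "example-user" },
  },
});
```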
Hierarchy
BaseChatModel
↳ ChatAnthropic
Implements
AnthropicInput
Constructors
constructor
• new ChatAnthropic(fields?)
Parameters
Name | Type |
---|---|
fields? | Partial<AnthropicInput> & BaseLanguageModelParams & { anthropicApiKey?: string } |
Overrides
BaseChatModel.constructor
Defined in
langchain/src/chat_models/anthropic.ts:130
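As a sketch of typical construction (field values are illustrative; the import path assumes the `langchain/chat_models` entrypoint):

```typescript
import { ChatAnthropic } from "langchain/chat_models";

// Reads ANTHROPIC_API_KEY from the environment unless
// anthropicApiKey is passed explicitly in the fields.
const model = new ChatAnthropic({
  modelName: "claude-v1",  // the default model
  temperature: 0.2,        // closer to 0 for analytical tasks
  maxTokensToSample: 1024, // default is 2048
});
```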
Properties
apiKey
• Optional apiKey: string
Anthropic API key
Implementation of
AnthropicInput.apiKey
Defined in
langchain/src/chat_models/anthropic.ts:106
callbackManager
• callbackManager: CallbackManager
Inherited from
BaseLanguageModel.callbackManager
Defined in
langchain/src/base_language/index.ts:34
caller
• Protected caller: AsyncCaller
The async caller should be used by subclasses to make any async calls, so that those calls benefit from the built-in concurrency and retry logic.
Inherited from
BaseLanguageModel.caller
Defined in
langchain/src/base_language/index.ts:40
invocationKwargs
• Optional invocationKwargs: Kwargs
Holds any additional parameters that are valid to pass to `anthropic.complete` and are not explicitly specified on this class.
Implementation of
AnthropicInput.invocationKwargs
Defined in
langchain/src/chat_models/anthropic.ts:118
maxTokensToSample
• maxTokensToSample: number = 2048
A maximum number of tokens to generate before stopping.
Implementation of
AnthropicInput.maxTokensToSample
Defined in
langchain/src/chat_models/anthropic.ts:114
modelName
• modelName: string = "claude-v1"
Model name to use
Implementation of
AnthropicInput.modelName
Defined in
langchain/src/chat_models/anthropic.ts:116
stopSequences
• Optional stopSequences: string[]
A list of strings upon which to stop generating. You probably want ["\n\nHuman:"], as that's the cue for the next turn in the dialog agent.
Implementation of
AnthropicInput.stopSequences
Defined in
langchain/src/chat_models/anthropic.ts:120
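For example, to stop generation at the cue for the next human turn, as the description suggests (a sketch; import path assumed as above):

```typescript
import { ChatAnthropic } from "langchain/chat_models";

// Stop generating as soon as the model begins the next human turn.
const model = new ChatAnthropic({
  stopSequences: ["\n\nHuman:"],
});
```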
streaming
• streaming: boolean = false
Whether to stream the results or not
Implementation of
AnthropicInput.streaming
Defined in
langchain/src/chat_models/anthropic.ts:122
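When streaming is enabled, tokens are delivered incrementally through the callback manager rather than in a single response. A sketch, assuming `CallbackManager.fromHandlers` is exported from `langchain/callbacks`:

```typescript
import { ChatAnthropic } from "langchain/chat_models";
import { CallbackManager } from "langchain/callbacks";

const model = new ChatAnthropic({
  streaming: true,
  callbackManager: CallbackManager.fromHandlers({
    // Invoked once per token as it arrives from the API.
    async handleLLMNewToken(token: string) {
      process.stdout.write(token);
    },
  }),
});
```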
temperature
• temperature: number = 1
Amount of randomness injected into the response. Ranges from 0 to 1. Use a temperature closer to 0 for analytical / multiple-choice tasks, and closer to 1 for creative and generative tasks.
Implementation of
AnthropicInput.temperature
Defined in
langchain/src/chat_models/anthropic.ts:108
topK
• topK: number = -1
Only sample from the top K options for each subsequent token. Used to remove "long tail" low probability responses. Defaults to -1, which disables it.
Implementation of
AnthropicInput.topK
Defined in
langchain/src/chat_models/anthropic.ts:110
topP
• topP: number = -1
Does nucleus sampling, in which we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. Defaults to -1, which disables it. Note that you should either alter temperature or top_p, but not both.
Implementation of
AnthropicInput.topP
Defined in
langchain/src/chat_models/anthropic.ts:112
verbose
• verbose: boolean
Whether to print out response text.
Inherited from
BaseLanguageModel.verbose
Defined in
langchain/src/base_language/index.ts:32
Methods
_combineLLMOutput
▸ _combineLLMOutput(): never[]
Returns
never[]
Overrides
BaseChatModel._combineLLMOutput
Defined in
langchain/src/chat_models/anthropic.ts:296
_generate
▸ _generate(messages, stopSequences?): Promise<ChatResult>
Call out to Anthropic's endpoint with k unique prompts.
Example
```typescript
import { ChatAnthropic } from "langchain/chat_models";
import { HumanChatMessage } from "langchain/schema";

const anthropic = new ChatAnthropic();
const response = await anthropic.generate([[new HumanChatMessage("Tell me a joke.")]]);
```
Parameters
Name | Type | Description |
---|---|---|
messages | BaseChatMessage[] | The messages to pass into the model. |
stopSequences? | string[] | Optional list of stop sequences to use when generating. |
Returns
Promise<ChatResult>
The full LLM output.
Overrides
BaseChatModel._generate
Defined in
langchain/src/chat_models/anthropic.ts:222
_identifyingParams
▸ _identifyingParams(): Object
Get the identifying parameters of the LLM.
Returns
Object
Name | Type |
---|---|
model_name | string |
Overrides
BaseChatModel._identifyingParams
Defined in
langchain/src/chat_models/anthropic.ts:177
_llmType
▸ _llmType(): string
Returns
string
Overrides
BaseChatModel._llmType
Defined in
langchain/src/chat_models/anthropic.ts:292
_modelType
▸ _modelType(): string
Returns
string
Inherited from
BaseChatModel._modelType
Defined in
langchain/src/chat_models/base.ts:76
call
▸ call(messages, stop?): Promise<BaseChatMessage>
Parameters
Name | Type |
---|---|
messages | BaseChatMessage[] |
stop? | string[] |
Returns
Promise<BaseChatMessage>
Inherited from
BaseChatModel.call
Defined in
langchain/src/chat_models/base.ts:97
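A sketch of a single-turn call; `HumanChatMessage` is assumed to come from `langchain/schema`:

```typescript
import { ChatAnthropic } from "langchain/chat_models";
import { HumanChatMessage } from "langchain/schema";

const model = new ChatAnthropic();
// call takes one conversation (an array of messages) and
// resolves to the assistant's reply message.
const reply = await model.call(
  [new HumanChatMessage("What is the capital of France?")],
  ["\n\nHuman:"] // optional stop sequences
);
console.log(reply.text);
```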
callPrompt
▸ callPrompt(promptValue, stop?): Promise<BaseChatMessage>
Parameters
Name | Type |
---|---|
promptValue | BasePromptValue |
stop? | string [] |
Returns
Promise<BaseChatMessage>
Inherited from
BaseChatModel.callPrompt
Defined in
langchain/src/chat_models/base.ts:106
generate
▸ generate(messages, stop?): Promise<LLMResult>
Parameters
Name | Type |
---|---|
messages | BaseChatMessage[][] |
stop? | string[] |
Returns
Promise<LLMResult>
Inherited from
BaseChatModel.generate
Defined in
langchain/src/chat_models/base.ts:39
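Note that generate takes an array of conversations (BaseChatMessage[][]), one inner array per prompt. A sketch, assuming the message classes come from `langchain/schema`:

```typescript
import { ChatAnthropic } from "langchain/chat_models";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const model = new ChatAnthropic();
// Two independent conversations generated in one call.
const result = await model.generate([
  [new HumanChatMessage("Tell me a joke.")],
  [
    new SystemChatMessage("You answer in French."),
    new HumanChatMessage("Tell me a joke."),
  ],
]);
console.log(result.generations.length); // 2, one per conversation
```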
generatePrompt
▸ generatePrompt(promptValues, stop?): Promise<LLMResult>
Parameters
Name | Type |
---|---|
promptValues | BasePromptValue [] |
stop? | string [] |
Returns
Promise<LLMResult>
Inherited from
BaseChatModel.generatePrompt
Defined in
langchain/src/chat_models/base.ts:82
getNumTokens
▸ getNumTokens(text): Promise<number>
Parameters
Name | Type |
---|---|
text | string |
Returns
Promise<number>
Inherited from
BaseLanguageModel.getNumTokens
Defined in
langchain/src/base_language/index.ts:62
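A sketch of estimating prompt size before sending a request:

```typescript
import { ChatAnthropic } from "langchain/chat_models";

const model = new ChatAnthropic();
// Returns an approximate token count for the given text.
const count = await model.getNumTokens("How many tokens is this sentence?");
console.log(count);
```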
identifyingParams
▸ identifyingParams(): Object
Get the identifying parameters for the model.
Returns
Object
Name | Type |
---|---|
model_name | string |
Defined in
langchain/src/chat_models/anthropic.ts:187
invocationParams
▸ invocationParams(): Omit<SamplingParameters, "prompt"> & Kwargs
Get the parameters used to invoke the model.
Returns
Omit<SamplingParameters, "prompt"> & Kwargs
Defined in
langchain/src/chat_models/anthropic.ts:164
serialize
▸ serialize(): SerializedLLM
Return a JSON-like object representing this LLM.
Returns
SerializedLLM
Inherited from
BaseLanguageModel.serialize
Defined in
langchain/src/base_language/index.ts:92
deserialize
▸ Static deserialize(data): Promise<BaseLanguageModel>
Load an LLM from a JSON-like object describing it.
Parameters
Name | Type |
---|---|
data | SerializedLLM |
Returns
Promise<BaseLanguageModel>
Inherited from
BaseLanguageModel.deserialize
Defined in
langchain/src/base_language/index.ts:103
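Together, serialize and the static deserialize allow round-tripping a configured model through a JSON-like object. A sketch:

```typescript
import { ChatAnthropic } from "langchain/chat_models";

const model = new ChatAnthropic({ temperature: 0.5 });

// Produce a JSON-like description of the model...
const serialized = model.serialize();

// ...and reconstruct an equivalent instance from it.
const restored = await ChatAnthropic.deserialize(serialized);
```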