# Class: FireworksLLM

## Hierarchy

- `OpenAI`

  ↳ **`FireworksLLM`**
## Constructors

### constructor

• **new FireworksLLM**(`init?`): `FireworksLLM`

#### Parameters

| Name | Type |
| :--- | :--- |
| `init?` | `Partial`<`OpenAI`> |

#### Returns

`FireworksLLM`

#### Overrides

OpenAI.constructor

#### Defined in

packages/core/src/llm/fireworks.ts:5
## Properties

### additionalChatOptions

• `Optional` **additionalChatOptions**: `OpenAIAdditionalChatOptions`

#### Inherited from

OpenAI.additionalChatOptions

#### Defined in

packages/core/src/llm/open_ai.ts:176

### additionalSessionOptions

• `Optional` **additionalSessionOptions**: `Omit`<`Partial`<`ClientOptions`>, `"apiKey"` \| `"timeout"` \| `"maxRetries"`>

#### Inherited from

OpenAI.additionalSessionOptions

#### Defined in

packages/core/src/llm/open_ai.ts:183

### apiKey

• `Optional` **apiKey**: `string` = `undefined`

#### Inherited from

OpenAI.apiKey

#### Defined in

packages/core/src/llm/open_ai.ts:179

### maxRetries

• **maxRetries**: `number`

#### Inherited from

OpenAI.maxRetries

#### Defined in

packages/core/src/llm/open_ai.ts:180

### maxTokens

• `Optional` **maxTokens**: `number`

#### Inherited from

OpenAI.maxTokens

#### Defined in

packages/core/src/llm/open_ai.ts:175

### model

• **model**: `string`

#### Inherited from

OpenAI.model

#### Defined in

packages/core/src/llm/open_ai.ts:172

### session

• **session**: `OpenAISession`

#### Inherited from

OpenAI.session

#### Defined in

packages/core/src/llm/open_ai.ts:182

### temperature

• **temperature**: `number`

#### Inherited from

OpenAI.temperature

#### Defined in

packages/core/src/llm/open_ai.ts:173

### timeout

• `Optional` **timeout**: `number`

#### Inherited from

OpenAI.timeout

#### Defined in

packages/core/src/llm/open_ai.ts:181

### topP

• **topP**: `number`

#### Inherited from

OpenAI.topP

#### Defined in

packages/core/src/llm/open_ai.ts:174
## Accessors

### metadata

• `get` **metadata**(): `LLMMetadata` & `OpenAIAdditionalMetadata`

#### Returns

`LLMMetadata` & `OpenAIAdditionalMetadata`

#### Inherited from

OpenAI.metadata

#### Defined in

packages/core/src/llm/open_ai.ts:241
## Methods

### chat

▸ **chat**(`params`): `Promise`<`AsyncIterable`<{ `delta`: `string` ; `options?`: `OpenAIAdditionalMessageOptions` }>>

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsStreaming`<`OpenAIAdditionalChatOptions`> |

#### Returns

`Promise`<`AsyncIterable`<{ `delta`: `string` ; `options?`: `OpenAIAdditionalMessageOptions` }>>

#### Inherited from

OpenAI.chat

#### Defined in

packages/core/src/llm/open_ai.ts:321

▸ **chat**(`params`): `Promise`<`ChatResponse`<`OpenAIAdditionalMessageOptions`>>

#### Parameters

| Name | Type |
| :--- | :--- |
| `params` | `LLMChatParamsNonStreaming`<`OpenAIAdditionalChatOptions`> |

#### Returns

`Promise`<`ChatResponse`<`OpenAIAdditionalMessageOptions`>>