Class: Ollama
This class implements both the LLM and Embedding interfaces.
Hierarchy
- ↳ Ollama
Implements
- LLM
- Omit<OllamaBase, "chat">
Constructors
constructor
• new Ollama(params): Ollama
Parameters
| Name | Type |
| --- | --- |
| params | OllamaParams |
Returns
Ollama
Overrides
Defined in
packages/core/src/llm/ollama.ts:75
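For example, a minimal construction sketch (the import path and the exact OllamaParams fields, such as model and options, are assumptions not confirmed by this page):

```ts
import { Ollama } from "llamaindex";

// Hypothetical params: OllamaParams is assumed to accept a model name
// and optional Ollama request options such as temperature.
const llm = new Ollama({
  model: "llama3",
  options: { temperature: 0.2 },
});
```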
Properties
embedBatchSize
• embedBatchSize: number = DEFAULT_EMBED_BATCH_SIZE
Inherited from
Defined in
packages/core/src/embeddings/types.ts:11
hasStreaming
• Readonly hasStreaming: true
Defined in
packages/core/src/llm/ollama.ts:61
model
• model: string
Defined in
packages/core/src/llm/ollama.ts:66
ollama
• ollama: Ollama
Defined in
packages/core/src/llm/ollama.ts:63
• options: Partial<Omit<Options, "temperature" | "top_p" | "num_ctx">> & Pick<Options, "temperature" | "top_p" | "num_ctx">

That is, temperature, top_p, and num_ctx are carried with their original Options typing, while every other Options field is made optional.
Defined in
packages/core/src/llm/ollama.ts:68
Accessors
metadata
• get metadata(): LLMMetadata
Returns
LLMMetadata
Implementation of
Defined in
packages/core/src/llm/ollama.ts:87
Methods
abort
▸ abort(): void
Returns
void
Implementation of
Omit.abort
Defined in
packages/core/src/llm/ollama.ts:209
chat
▸ chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM
Parameters
| Name | Type |
| --- | --- |
| params | LLMChatParamsStreaming<object, object> |
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Implementation of
Defined in
packages/core/src/llm/ollama.ts:99
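A usage sketch for the streaming overload; the stream: true discriminator, the message shape, and the delta field on ChatResponseChunk are assumptions based on the type names above:

```ts
const stream = await llm.chat({
  stream: true, // assumed discriminator selecting LLMChatParamsStreaming
  messages: [{ role: "user", content: "Why is the sky blue?" }],
});

for await (const chunk of stream) {
  // `delta` is assumed to carry the incremental text of each chunk
  process.stdout.write(chunk.delta);
}
```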
▸ chat(params): Promise<ChatResponse<object>>
Parameters
| Name | Type |
| --- | --- |
| params | LLMChatParamsNonStreaming<object, object> |
Returns
Promise<ChatResponse<object>>
Implementation of
Defined in
packages/core/src/llm/ollama.ts:102
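A sketch of the non-streaming overload; the message.content shape on ChatResponse is an assumption:

```ts
const response = await llm.chat({
  messages: [{ role: "user", content: "Name three uses of embeddings." }],
});

// ChatResponse is assumed to expose the assistant reply as `message.content`.
console.log(response.message.content);
```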
complete
▸ complete(params): Promise<AsyncIterable<CompletionResponse>>
Get a prompt completion from the LLM
Parameters
| Name | Type |
| --- | --- |
| params | LLMCompletionParamsStreaming |
Returns
Promise<AsyncIterable<CompletionResponse>>
Implementation of
Defined in
packages/core/src/llm/ollama.ts:140
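A sketch for the streaming completion overload, assuming LLMCompletionParamsStreaming takes a prompt string plus stream: true, and that each CompletionResponse chunk exposes a text field:

```ts
const stream = await llm.complete({
  stream: true, // assumed discriminator selecting LLMCompletionParamsStreaming
  prompt: "The three primary colors are",
});

for await (const chunk of stream) {
  process.stdout.write(chunk.text); // `text` field assumed
}
```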
▸ complete(params): Promise<CompletionResponse>
Parameters
| Name | Type |
| --- | --- |
| params | LLMCompletionParamsNonStreaming |
Returns
Promise<CompletionResponse>
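And the non-streaming counterpart, under the same assumed field names:

```ts
const response = await llm.complete({
  prompt: "The three primary colors are",
});
console.log(response.text); // `text` field assumed
```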