Class: OllamaEmbedding
OllamaEmbedding is an alias for Ollama that implements the BaseEmbedding interface.
Hierarchy
- Ollama
  ↳ OllamaEmbedding
Implements
- BaseEmbedding
Constructors
constructor
• new OllamaEmbedding(params): OllamaEmbedding
Parameters
Name | Type |
---|---|
params | OllamaParams |
Returns
OllamaEmbedding
Inherited from
Ollama.constructor
Defined in
packages/core/src/llm/ollama.ts:75
Properties
embedBatchSize
• embedBatchSize: number = DEFAULT_EMBED_BATCH_SIZE
Implementation of
BaseEmbedding.embedBatchSize
Inherited from
Defined in
packages/core/src/embeddings/types.ts:11
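The embedBatchSize property controls how many texts are embedded per request. A minimal sketch of the batching split this implies; `chunk` is an illustrative helper, not part of the documented API:

```typescript
// Illustrative batching helper (not part of the llamaindex API): splits an
// input list into groups of at most `size`, mirroring how a batch size
// limits the number of texts sent per embedding request.
const DEFAULT_EMBED_BATCH_SIZE = 10;

function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Five texts with a batch size of 2 yield three batches of sizes 2, 2, 1.
const batches = chunk(["a", "b", "c", "d", "e"], 2);
console.log(batches.map((b) => b.length)); // [ 2, 2, 1 ]
```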
hasStreaming
• Readonly hasStreaming: true
Inherited from
Ollama.hasStreaming
Defined in
packages/core/src/llm/ollama.ts:61
model
• model: string
Inherited from
Ollama.model
Defined in
packages/core/src/llm/ollama.ts:66
ollama
• ollama: Ollama
Inherited from
Ollama.ollama
Defined in
packages/core/src/llm/ollama.ts:63
options
• options: Partial<Omit<Options, "temperature" | "top_p" | "num_ctx">> & Pick<Options, "temperature" | "top_p" | "num_ctx">
Inherited from
Ollama.options
Defined in
packages/core/src/llm/ollama.ts:68
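This intersection type makes temperature, top_p, and num_ctx required while leaving every other option field optional. A sketch with a stand-in Options shape (the real Options type comes from the ollama package, so its actual fields may differ):

```typescript
// Stand-in Options shape; the real type is imported from the ollama package.
type Options = {
  temperature: number;
  top_p: number;
  num_ctx: number;
  seed: number;
};

// Same construction as the documented `options` property: the three picked
// keys are required, all remaining keys (here, `seed`) stay optional.
type OllamaOptions = Partial<Omit<Options, "temperature" | "top_p" | "num_ctx">> &
  Pick<Options, "temperature" | "top_p" | "num_ctx">;

// Valid without `seed`; omitting `temperature` would be a compile error.
const opts: OllamaOptions = { temperature: 0.7, top_p: 0.9, num_ctx: 4096 };
console.log(opts.temperature); // 0.7
```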
Accessors
metadata
• get metadata(): LLMMetadata
Returns
LLMMetadata
Inherited from
Ollama.metadata
Defined in
packages/core/src/llm/ollama.ts:87
Methods
abort
▸ abort(): void
Returns
void
Inherited from
Ollama.abort
Defined in
packages/core/src/llm/ollama.ts:209
chat
▸ chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM
Parameters
Name | Type |
---|---|
params | LLMChatParamsStreaming<object, object> |
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Inherited from
Ollama.chat
Defined in
packages/core/src/llm/ollama.ts:99
▸ chat(params): Promise<ChatResponse<object>>
Parameters
Name | Type |
---|---|
params | LLMChatParamsNonStreaming<object, object> |
Returns
Promise<ChatResponse<object>>
Inherited from
Ollama.chat
Defined in
packages/core/src/llm/ollama.ts:102
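The streaming overload resolves to an AsyncIterable of chunks. A sketch of how a caller might consume that shape; the chunk type and generator here are stand-ins (real chunks come from the Ollama server, and the real ChatResponseChunk type comes from llamaindex):

```typescript
// Stand-in chunk type; the real ChatResponseChunk is exported by llamaindex.
type ChatResponseChunk = { delta: string };

// Fake stream standing in for the Promise<AsyncIterable<ChatResponseChunk>>
// that the streaming `chat` overload resolves to.
async function* fakeStream(): AsyncIterable<ChatResponseChunk> {
  yield { delta: "Hello" };
  yield { delta: ", world" };
}

// Accumulate deltas exactly as a caller of the streaming overload would.
async function consume(stream: AsyncIterable<ChatResponseChunk>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.delta;
  }
  return text;
}

consume(fakeStream()).then((text) => console.log(text)); // Hello, world
```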