Class: OllamaEmbedding
OllamaEmbedding is an alias for Ollama that implements the BaseEmbedding interface.
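A minimal usage sketch (assuming the class is exported from the llamaindex package, a local Ollama server is running, top-level await is available, and OllamaParams accepts at least a model name; the model name is illustrative):

```ts
import { OllamaEmbedding } from "llamaindex";

// Hypothetical embedding-capable model pulled into the local Ollama server.
const embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });

// Embed a single string; resolves to a number[] vector.
const vector = await embedModel.getTextEmbedding("Hello, world!");
console.log(vector.length); // dimensionality of the embedding
```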
Hierarchy
- Ollama
  ↳ OllamaEmbedding
Implements
- BaseEmbedding
Constructors
constructor
• new OllamaEmbedding(params): OllamaEmbedding
Parameters
Name | Type |
---|---|
params | OllamaParams |
Returns
OllamaEmbedding
Inherited from
Defined in
packages/core/src/llm/ollama.ts:75
Properties
embedBatchSize
• embedBatchSize: number = DEFAULT_EMBED_BATCH_SIZE
Implementation of
Inherited from
Defined in
packages/core/src/embeddings/types.ts:11
hasStreaming
• Readonly hasStreaming: true
Inherited from
Defined in
packages/core/src/llm/ollama.ts:61
model
• model: string
Inherited from
Defined in
packages/core/src/llm/ollama.ts:66
ollama
• ollama: Ollama
Inherited from
Defined in
packages/core/src/llm/ollama.ts:63
options
• options: Partial<Omit<Options, "temperature" | "top_p" | "num_ctx">> & Pick<Options, "temperature" | "top_p" | "num_ctx">
Inherited from
Defined in
packages/core/src/llm/ollama.ts:68
Accessors
metadata
• get metadata(): LLMMetadata
Returns
LLMMetadata
Inherited from
Ollama.metadata
Defined in
packages/core/src/llm/ollama.ts:87
Methods
abort
▸ abort(): void
Returns
void
Inherited from
Defined in
packages/core/src/llm/ollama.ts:209
chat
▸ chat(params): Promise<AsyncIterable<ChatResponseChunk>>
Get a chat response from the LLM
Parameters
Name | Type |
---|---|
params | LLMChatParamsStreaming<object, object> |
Returns
Promise<AsyncIterable<ChatResponseChunk>>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:99
▸ chat(params): Promise<ChatResponse<object>>
Parameters
Name | Type |
---|---|
params | LLMChatParamsNonStreaming<object, object> |
Returns
Promise<ChatResponse<object>>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:102
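A hedged sketch of the two chat() overloads: with stream: true the call resolves to an AsyncIterable of ChatResponseChunk (read via chunk.delta), otherwise to a single ChatResponse (read via response.message.content). The model name and messages are illustrative.

```ts
import { OllamaEmbedding } from "llamaindex";

const llm = new OllamaEmbedding({ model: "llama2" });

// Non-streaming overload: one ChatResponse.
const response = await llm.chat({
  messages: [{ role: "user", content: "What is an embedding?" }],
});
console.log(response.message.content);

// Streaming overload: iterate over ChatResponseChunk deltas.
const stream = await llm.chat({
  messages: [{ role: "user", content: "What is an embedding?" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.delta);
}
```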
complete
▸ complete(params): Promise<AsyncIterable<CompletionResponse>>
Get a prompt completion from the LLM
Parameters
Name | Type |
---|---|
params | LLMCompletionParamsStreaming |
Returns
Promise<AsyncIterable<CompletionResponse>>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:140
▸ complete(params): Promise<CompletionResponse>
Parameters
Name | Type |
---|---|
params | LLMCompletionParamsNonStreaming |
Returns
Promise<CompletionResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:143
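Similarly for complete(): a sketch of both overloads, assuming CompletionResponse exposes a text field (and streamed chunks do as well).

```ts
import { OllamaEmbedding } from "llamaindex";

const llm = new OllamaEmbedding({ model: "llama2" });

// Non-streaming overload: one CompletionResponse.
const completion = await llm.complete({ prompt: "Embeddings are" });
console.log(completion.text);

// Streaming overload: an AsyncIterable of CompletionResponse chunks.
const stream = await llm.complete({ prompt: "Embeddings are", stream: true });
for await (const chunk of stream) {
  process.stdout.write(chunk.text);
}
```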
copy
▸ copy(request): Promise<StatusResponse>
Parameters
Name | Type |
---|---|
request | CopyRequest |
Returns
Promise<StatusResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:245
create
▸ create(request): Promise<AsyncGenerator<ProgressResponse, any, unknown>>
Parameters
Name | Type |
---|---|
request | CreateRequest & { stream: true } |
Returns
Promise<AsyncGenerator<ProgressResponse, any, unknown>>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:224
▸ create(request): Promise<ProgressResponse>
Parameters
Name | Type |
---|---|
request | CreateRequest & { stream?: false } |
Returns
Promise<ProgressResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:227
delete
▸ delete(request): Promise<StatusResponse>
Parameters
Name | Type |
---|---|
request | DeleteRequest |
Returns
Promise<StatusResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:242
embeddings
▸ embeddings(request): Promise<EmbeddingsResponse>
Parameters
Name | Type |
---|---|
request | EmbeddingsRequest |
Returns
Promise<EmbeddingsResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:254
encodeImage
▸ encodeImage(image): Promise<string>
Parameters
Name | Type |
---|---|
image | string | Uint8Array |
Returns
Promise<string>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:212
generate
▸ generate(request): Promise<AsyncGenerator<GenerateResponse, any, unknown>>
Parameters
Name | Type |
---|---|
request | GenerateRequest & { stream: true } |
Returns
Promise<AsyncGenerator<GenerateResponse, any, unknown>>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:215
▸ generate(request): Promise<GenerateResponse>
Parameters
Name | Type |
---|---|
request | GenerateRequest & { stream?: false } |
Returns
Promise<GenerateResponse>
Inherited from
Defined in
packages/core/src/llm/ollama.ts:218
getQueryEmbedding
▸ getQueryEmbedding(query): Promise<number[]>
Parameters
Name | Type |
---|---|
query | string |
Returns
Promise<number[]>
Implementation of
BaseEmbedding.getQueryEmbedding
Inherited from
Defined in
packages/core/src/llm/ollama.ts:194
getTextEmbedding
▸ getTextEmbedding(text): Promise<number[]>
Parameters
Name | Type |
---|---|
text | string |
Returns
Promise<number[]>
Implementation of
BaseEmbedding.getTextEmbedding
Inherited from
Defined in
packages/core/src/llm/ollama.ts:190
getTextEmbeddings
▸ getTextEmbeddings(texts): Promise<number[][]>
Optionally override this method to retrieve multiple embeddings in a single request
Parameters
Name | Type |
---|---|
texts | string[] |
Returns
Promise<number[][]>
Implementation of
BaseEmbedding.getTextEmbeddings
Inherited from
Defined in
packages/core/src/embeddings/types.ts:28
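A purely illustrative subclass showing what such an override could look like; the concurrent mapping below is an assumption made for the sketch, not the library's actual batching strategy.

```ts
import { OllamaEmbedding } from "llamaindex";

// Hypothetical subclass: fire the per-text requests concurrently instead
// of awaiting them one at a time.
class ConcurrentOllamaEmbedding extends OllamaEmbedding {
  async getTextEmbeddings(texts: string[]): Promise<number[][]> {
    return Promise.all(texts.map((text) => this.getTextEmbedding(text)));
  }
}
```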
getTextEmbeddingsBatch
▸ getTextEmbeddingsBatch(texts, options?): Promise<number[][]>
Get embeddings for a batch of texts
Parameters
Name | Type |
---|---|
texts | string[] |
options? | Object |
options.logProgress? | boolean |
Returns
Promise<number[][]>
Implementation of
BaseEmbedding.getTextEmbeddingsBatch
Inherited from
Defined in
packages/core/src/embeddings/types.ts:44
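A sketch of embedding several texts at once; inputs are processed in batches of embedBatchSize (DEFAULT_EMBED_BATCH_SIZE unless overridden), and logProgress is assumed to log per-batch progress.

```ts
import { OllamaEmbedding } from "llamaindex";

const embedModel = new OllamaEmbedding({ model: "nomic-embed-text" });

const texts = ["first document", "second document", "third document"];

// Resolves to one number[] embedding per input text, in the same order.
const embeddings = await embedModel.getTextEmbeddingsBatch(texts, {
  logProgress: true,
});
console.log(embeddings.length); // 3
```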
list
▸ list(): Promise<ListResponse>
Returns
Promise<ListResponse>