Class SpringAIOllamaService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaService
- All Implemented Interfaces:
SpringAIService
-
Nested Class Summary
Nested classes/interfaces inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
BaseSpringAIService.PromptResult
-
Field Summary
Fields
llmProviderRestService
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
applicationEventPublisher, chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, messagePersistCache, messageRestService, messageSendService, robotMessageCache, robotRestService, textElasticService, textVectorService, threadRestService, uidUtils
-
Constructor Summary
Constructors
SpringAIOllamaService()
-
Method Summary
Modifier and Type / Method / Description
org.springframework.ai.ollama.api.OllamaApi
createOllamaApi(String apiUrl)
private org.springframework.ai.ollama.OllamaChatModel
createOllamaChatModel(llm)
Creates a dynamic OllamaChatModel from the robot configuration.
private org.springframework.ai.ollama.api.OllamaOptions
createOllamaOptions(llm)
Creates dynamic OllamaOptions from the robot configuration.
isModelExists(OllamaRequest request)
Checks whether a model exists on the Ollama server.
protected void
processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
protected String
processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
protected void
processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
createDynamicOptions, extractFullPromptContent, extractTextFromResponse, extractTokenUsage, getTokenUnitPrice, handleSseError, isEmitterCompleted, persistMessage, persistMessage, persistMessage, persistMessage, processDirectLlmRequest, processLlmResponseWebsocket, publishAiTokenUsageEvent, recordAiTokenUsage, searchKnowledgeBase, sendMessageWebsocket, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendSyncMessage, sendWebsocketMessage
-
Field Details
-
llmProviderRestService
-
-
Constructor Details
-
SpringAIOllamaService
public SpringAIOllamaService()
-
-
Method Details
-
createOllamaOptions
Creates dynamic OllamaOptions from the robot configuration.
Parameters:
llm - the robot's LLM configuration
Returns:
options built from the robot configuration
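For illustration, a minimal sketch of what such a factory might look like. The RobotLlmConfig record and its fields are hypothetical stand-ins for the actual configuration type, and the builder method names (model, temperature, topP) follow recent Spring AI releases; older milestones use withModel(...)-style setters.

```java
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaOptionsSketch {

    /** Hypothetical stand-in for the robot's LLM configuration. */
    record RobotLlmConfig(String model, Double temperature, Double topP) {}

    /** Sketch: map per-robot settings onto OllamaOptions. */
    static OllamaOptions createOllamaOptions(RobotLlmConfig llm) {
        return OllamaOptions.builder()
                .model(llm.model())             // e.g. "llama3"
                .temperature(llm.temperature()) // sampling temperature
                .topP(llm.topP())               // nucleus sampling cutoff
                .build();
    }
}
```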
-
createOllamaApi
org.springframework.ai.ollama.api.OllamaApi createOllamaApi(String apiUrl)
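This presumably wraps construction of the low-level Ollama client for a configurable endpoint. A sketch, assuming a recent Spring AI version where OllamaApi exposes a builder (earlier releases used a plain new OllamaApi(baseUrl) constructor); the default URL fallback is an assumption.

```java
import org.springframework.ai.ollama.api.OllamaApi;

public class OllamaApiSketch {

    /** Assumed default local Ollama endpoint when no URL is configured. */
    private static final String DEFAULT_API_URL = "http://localhost:11434";

    /** Sketch: build an OllamaApi client for the given base URL. */
    static OllamaApi createOllamaApi(String apiUrl) {
        String baseUrl = (apiUrl == null || apiUrl.isBlank()) ? DEFAULT_API_URL : apiUrl;
        return OllamaApi.builder()
                .baseUrl(baseUrl) // e.g. http://localhost:11434
                .build();
    }
}
```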
-
createOllamaChatModel
Creates a dynamic OllamaChatModel from the robot configuration.
Parameters:
llm - the robot's LLM configuration
Returns:
an OllamaChatModel configured with the requested model
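Combining the two helpers above, a per-robot chat model can be assembled roughly as follows. This is a sketch under the same assumptions: a recent Spring AI builder API, and plain strings standing in for values that the real service would read from the robot's LLM configuration.

```java
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

public class OllamaChatModelSketch {

    /** Sketch: build a per-robot OllamaChatModel from an endpoint URL and model name. */
    static OllamaChatModel createOllamaChatModel(String apiUrl, String modelName) {
        OllamaApi api = OllamaApi.builder().baseUrl(apiUrl).build();
        OllamaOptions options = OllamaOptions.builder().model(modelName).build();
        return OllamaChatModel.builder()
                .ollamaApi(api)          // low-level HTTP client
                .defaultOptions(options) // default model/options for this robot
                .build();
    }
}
```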
-
processPromptWebsocket
protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Specified by:
processPromptWebsocket in class BaseSpringAIService
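The websocket variant presumably streams the model's response and forwards chunks through the inherited send helpers. A sketch of the streaming part only, using Spring AI's reactive stream(Prompt) API; the sendChunk callback is a hypothetical stand-in for whatever BaseSpringAIService uses to push websocket messages, and getText() reflects recent Spring AI versions (older ones expose getContent()).

```java
import java.util.function.Consumer;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.Prompt;

public class WebsocketStreamingSketch {

    /** Sketch: stream a prompt and hand each text chunk to a callback as it arrives. */
    static void streamToWebsocket(ChatModel chatModel, Prompt prompt, Consumer<String> sendChunk) {
        chatModel.stream(prompt)
                .mapNotNull(response -> response.getResult() == null ? null
                        : response.getResult().getOutput().getText())
                .filter(text -> !text.isEmpty())
                .subscribe(
                        sendChunk,                                          // forward each chunk
                        e -> sendChunk.accept("[error] " + e.getMessage()) // surface failures
                );
    }
}
```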
-
processPromptSync
protected String processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
Specified by:
processPromptSync in class BaseSpringAIService
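The synchronous variant presumably performs a single blocking call and returns the full completion text. A sketch assuming the standard ChatModel API; the fallback string on failure is an assumption, not the service's actual error handling.

```java
import org.springframework.ai.chat.model.ChatModel;

public class SyncPromptSketch {

    /** Sketch: blocking call that returns the complete answer, with a fallback on failure. */
    static String processPromptSync(ChatModel chatModel, String message) {
        try {
            // ChatModel#call(String) wraps the message in a Prompt and
            // returns the text of the first generation.
            return chatModel.call(message);
        } catch (Exception e) {
            return "LLM call failed: " + e.getMessage();
        }
    }
}
```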
-
processPromptSse
protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
Specified by:
processPromptSse in class BaseSpringAIService
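For SSE, each streamed chunk is presumably written to the provided SseEmitter and the emitter is completed when the stream ends. A sketch assuming Spring AI's stream(Prompt) and Spring MVC's SseEmitter; it only loosely mirrors the inherited handleSseError/isEmitterCompleted helpers.

```java
import java.io.IOException;

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

public class SsePromptSketch {

    /** Sketch: pipe a streamed completion into an SseEmitter. */
    static void processPromptSse(ChatModel chatModel, Prompt prompt, SseEmitter emitter) {
        chatModel.stream(prompt).subscribe(
                response -> {
                    try {
                        String text = response.getResult().getOutput().getText();
                        if (text != null && !text.isEmpty()) {
                            emitter.send(SseEmitter.event().data(text)); // one SSE event per chunk
                        }
                    } catch (IOException e) {
                        emitter.completeWithError(e); // client disconnected or write failed
                    }
                },
                emitter::completeWithError, // propagate stream errors
                emitter::complete);         // close the SSE stream when done
    }
}
```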
-
isModelExists
Checks whether a model exists on the Ollama server.
Parameters:
model - the model name
Returns:
true if the model exists, false otherwise
-
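One way to implement such a check is to list the models installed on the Ollama server via its documented GET /api/tags endpoint and look for the requested name; the actual service may instead go through the OllamaApi client. A sketch using only the JDK HTTP client, with the string-matching approach and URL handling as assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ModelExistsSketch {

    /**
     * Sketch: check whether a model is present by listing local models via
     * GET /api/tags and searching the raw JSON for the model name.
     * A production version would parse the JSON properly.
     */
    static boolean isModelExists(String apiUrl, String model) {
        try {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(apiUrl + "/api/tags")) // lists locally available models
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            // Ollama lists entries like "llama3:latest"; match with and without a tag suffix.
            return response.statusCode() == 200
                    && (response.body().contains("\"name\":\"" + model + "\"")
                        || response.body().contains("\"name\":\"" + model + ":"));
        } catch (Exception e) {
            return false; // treat an unreachable server as "model not available"
        }
    }
}
```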