Class SpringAIOllamaService

java.lang.Object
  com.bytedesk.ai.service.BaseSpringAIService
    com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaService

All Implemented Interfaces:
  SpringAIService
Field Summary

Fields
  private org.springframework.ai.ollama.OllamaChatModel  defaultChatModel
  private LlmProviderRestService                          llmProviderRestService
  private TokenUsageHelper                                tokenUsageHelper

Fields inherited from class com.bytedesk.ai.service.BaseSpringAIService
  applicationEventPublisher, chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, knowledgeBaseSearchHelper, messagePersistCache, messagePersistenceHelper, messageRestService, messageSendService, promptHelper, robotMessageCache, robotRestService, sseMessageHelper, textElasticService, textVectorService, threadRestService, uidUtils, webpageElasticService, webpageVectorService
Constructor Summary

Constructors
  SpringAIOllamaService()
Method Summary

org.springframework.ai.ollama.api.OllamaApi
  createOllamaApi(String apiUrl)
private org.springframework.ai.ollama.OllamaChatModel
  createOllamaChatModel(…)
  Creates a dynamic OllamaChatModel based on the robot configuration.
private org.springframework.ai.ollama.api.OllamaChatOptions
  createOllamaChatOptions(…)
  Creates dynamic OllamaChatOptions based on the robot configuration.
isModelExists(OllamaRequest request)
  Checks whether the model exists.
protected void
  processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, List<RobotContent.SourceReference> sourceReferences, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
protected String
  processPromptSync(String message, RobotProtobuf robot)
protected void
  processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)

Methods inherited from class com.bytedesk.ai.service.BaseSpringAIService
  processSyncRequest, sendSseMessage, sendSyncMessage, sendWebsocketMessage
Field Details

llmProviderRestService
  private LlmProviderRestService llmProviderRestService

defaultChatModel
  @Autowired(required=false)
  @Qualifier("bytedeskOllamaChatModel")
  private org.springframework.ai.ollama.OllamaChatModel defaultChatModel

tokenUsageHelper
  private TokenUsageHelper tokenUsageHelper
Constructor Details

SpringAIOllamaService
  public SpringAIOllamaService()
Method Details

createOllamaChatOptions
  Creates dynamic OllamaChatOptions based on the robot configuration.
  Parameters:
    llm - the robot's LLM configuration
  Returns:
    the options built from the robot configuration
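The per-robot options pattern can be sketched in plain Java. The `RobotLlm` record and its fields below are stand-ins for illustration (the real class lives in the Bytedesk codebase), and the default values are assumptions, not the service's actual defaults:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OllamaOptionsSketch {

    // Stand-in for the Bytedesk RobotLlm config class (hypothetical fields).
    record RobotLlm(String model, Double temperature, Double topP) {}

    // Assumed defaults mirroring a typical Ollama setup.
    static final String DEFAULT_MODEL = "llama3";
    static final double DEFAULT_TEMPERATURE = 0.7;
    static final double DEFAULT_TOP_P = 0.9;

    // Derive chat options from the robot's LLM config, falling back to
    // service-wide defaults whenever a field is unset.
    static Map<String, Object> createChatOptions(RobotLlm llm) {
        var options = new LinkedHashMap<String, Object>();
        options.put("model", llm.model() != null ? llm.model() : DEFAULT_MODEL);
        options.put("temperature", llm.temperature() != null ? llm.temperature() : DEFAULT_TEMPERATURE);
        options.put("top_p", llm.topP() != null ? llm.topP() : DEFAULT_TOP_P);
        return options;
    }
}
```

In the real service, the same decision logic would feed a Spring AI `OllamaChatOptions` builder rather than a plain map.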
createOllamaApi
  org.springframework.ai.ollama.api.OllamaApi createOllamaApi(String apiUrl)
createOllamaChatModel
  Creates a dynamic OllamaChatModel based on the robot configuration.
  Parameters:
    llm - the robot's LLM configuration
  Returns:
    an OllamaChatModel configured with the specified model
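A key step behind `createOllamaApi` is resolving which Ollama endpoint a given robot should talk to. This minimal sketch shows one plausible resolution rule (the fallback URL is Ollama's well-known default port; whether the service uses exactly this rule is an assumption):

```java
public class OllamaApiUrlSketch {

    // Ollama's documented default local endpoint.
    static final String DEFAULT_BASE_URL = "http://localhost:11434";

    // Use the robot-specific endpoint when one is configured,
    // otherwise fall back to the default. In the real service the
    // resolved URL would be handed to the Spring AI OllamaApi client,
    // which is then wrapped in an OllamaChatModel.
    static String resolveBaseUrl(String apiUrl) {
        return (apiUrl == null || apiUrl.isBlank()) ? DEFAULT_BASE_URL : apiUrl.trim();
    }
}
```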
processPromptWebsocket
  protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)
  Specified by:
    processPromptWebsocket in class BaseSpringAIService
processPromptSync
  protected String processPromptSync(String message, RobotProtobuf robot)
  Specified by:
    processPromptSync in class BaseSpringAIService
processPromptSse
  protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, List<RobotContent.SourceReference> sourceReferences, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
  Specified by:
    processPromptSse in class BaseSpringAIService
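The general SSE streaming pattern behind `processPromptSse` can be sketched without Spring dependencies: each chunk emitted by the model stream is forwarded to the client as it arrives, while the full reply is accumulated so it can be persisted when the stream completes. The helper below is a hypothetical illustration of that pattern, not the service's actual code:

```java
import java.util.function.Consumer;

public class SsePromptSketch {

    // Forward each streamed token chunk to the client (in the real service,
    // SseEmitter.send(...)) and accumulate the full answer for persistence.
    static String streamTokens(Iterable<String> chunks, Consumer<String> emitter) {
        StringBuilder full = new StringBuilder();
        for (String chunk : chunks) {
            emitter.accept(chunk);   // push this chunk to the SSE client
            full.append(chunk);      // keep the complete reply
        }
        return full.toString();      // stored as the final reply message
    }
}
```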
isModelExists
  isModelExists(OllamaRequest request)
  Checks whether the model exists.
  Parameters:
    model - the model name
  Returns:
    true if the model exists, false otherwise
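Ollama exposes its installed models via `GET /api/tags`, which returns a JSON document listing each model's `"name"`. A model-existence check can be sketched as a scan of that response; the naive string containment below is an illustrative assumption (a real implementation would use the OllamaApi client or a proper JSON parser):

```java
public class OllamaModelCheckSketch {

    // Naive check: look for the model's "name" entry in the JSON body
    // returned by Ollama's GET /api/tags endpoint.
    static boolean modelExists(String tagsJson, String modelName) {
        if (tagsJson == null || modelName == null) {
            return false;
        }
        return tagsJson.contains("\"name\":\"" + modelName + "\"");
    }
}
```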