Class SpringAIOllamaService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaService
- All Implemented Interfaces:
SpringAIService
@Service
@ConditionalOnProperty(name="spring.ai.ollama.chat.enabled",
havingValue="true",
matchIfMissing=false)
public class SpringAIOllamaService
extends BaseSpringAIService
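Because of the @ConditionalOnProperty annotation above, this bean is only registered when the enabling property is explicitly set to true. A minimal application.properties fragment might look like the following (the enabled property name comes from the annotation; the base-url line is an assumption about a typical local Ollama setup, not taken from this class):

```
spring.ai.ollama.chat.enabled=true
spring.ai.ollama.base-url=http://localhost:11434
```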
Field Summary

Fields
Modifier and Type  Field
private org.springframework.ai.ollama.OllamaChatModel  bytedeskOllamaChatModel
private org.springframework.ai.ollama.api.OllamaApi  ollamaApi
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, messagePersistCache, messageRestService, messageSendService, robotMessageCache, robotRestService, textElasticService, textVectorService, threadRestService, uidUtils
Constructor Summary

Constructors
SpringAIOllamaService()
Method Summary

Modifier and Type  Method  Description
private org.springframework.ai.ollama.OllamaChatModel  configureModelWithTimeout(org.springframework.ai.ollama.OllamaChatModel model, long timeoutMillis)
private org.springframework.ai.ollama.OllamaChatModel  createDynamicChatModel(llm)  Creates a dynamic OllamaChatModel based on the robot configuration
private org.springframework.ai.ollama.api.OllamaOptions  createDynamicOptions(llm)  Creates dynamic OllamaOptions based on the robot configuration
public org.springframework.ai.ollama.OllamaChatModel  getChatModel()
protected void  processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
protected String  processPromptSync(String message, RobotProtobuf robot)
protected void  processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)

Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
createDynamicOptions, extractTextFromResponse, handleSseError, isEmitterCompleted, persistMessage, processDirectLlmRequest, processLlmResponseWebsocket, searchKnowledgeBase, sendMessageWebsocket, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendSyncMessage, sendWebsocketMessage
Field Details
bytedeskOllamaChatModel
@Autowired(required=false) @Qualifier("bytedeskOllamaChatModel")
private org.springframework.ai.ollama.OllamaChatModel bytedeskOllamaChatModel

ollamaApi
@Autowired @Qualifier("bytedeskOllamaApi")
private org.springframework.ai.ollama.api.OllamaApi ollamaApi
Constructor Details

SpringAIOllamaService
public SpringAIOllamaService()
Method Details
createDynamicOptions
Creates dynamic OllamaOptions based on the robot configuration.
Parameters:
llm - the robot LLM configuration
Returns:
options created from the robot configuration
createDynamicChatModel
Creates a dynamic OllamaChatModel based on the robot configuration.
Parameters:
llm - the robot LLM configuration
Returns:
an OllamaChatModel configured with the specified model
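The bodies of the createDynamic* methods are not shown on this page. As a rough illustration of the pattern they describe (per-robot settings overriding service-wide defaults), here is a simplified, self-contained sketch. The RobotLlm record, the default values, and the plain-map result are all hypothetical stand-ins; the real service works with Bytedesk's own LLM config type and builds an org.springframework.ai.ollama.api.OllamaOptions instance instead.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Simplified sketch of the createDynamicOptions pattern: settings from the
// robot's LLM configuration override defaults, and absent values fall back.
public class DynamicOptionsSketch {

    // Hypothetical robot LLM configuration: null fields mean "use the default".
    record RobotLlm(String model, Double temperature, Double topP) {}

    static final String DEFAULT_MODEL = "llama3";   // assumed default, for illustration
    static final double DEFAULT_TEMPERATURE = 0.7;  // assumed default, for illustration

    static Map<String, Object> createDynamicOptions(RobotLlm llm) {
        Map<String, Object> options = new LinkedHashMap<>();
        options.put("model",
                llm != null && llm.model() != null ? llm.model() : DEFAULT_MODEL);
        options.put("temperature",
                llm != null && llm.temperature() != null ? llm.temperature() : DEFAULT_TEMPERATURE);
        if (llm != null && llm.topP() != null) {
            options.put("top_p", llm.topP()); // only set when explicitly configured
        }
        return options;
    }

    public static void main(String[] args) {
        // Robot overrides the model but keeps the default temperature.
        System.out.println(createDynamicOptions(new RobotLlm("qwen2.5:7b", null, null)));
        // No robot config at all: every default applies.
        System.out.println(createDynamicOptions(null));
    }
}
```

createDynamicChatModel would then wrap the resolved options in a chat model bound to the shared OllamaApi bean, so each robot can talk to a different Ollama model through one API client.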
processPromptWebsocket
protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)
Specified by:
processPromptWebsocket in class BaseSpringAIService
processPromptSync
protected String processPromptSync(String message, RobotProtobuf robot)
Specified by:
processPromptSync in class BaseSpringAIService
processPromptSse
protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
Specified by:
processPromptSse in class BaseSpringAIService
configureModelWithTimeout
private org.springframework.ai.ollama.OllamaChatModel configureModelWithTimeout(org.springframework.ai.ollama.OllamaChatModel model, long timeoutMillis)
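The implementation of configureModelWithTimeout is not shown here; presumably it rebuilds or rewraps the chat model so that calls are bounded by timeoutMillis. The behavioral contract can be illustrated with a generic, self-contained sketch (not the Bytedesk implementation) that bounds a blocking call and falls back instead of hanging:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.function.Supplier;

// Generic sketch of bounding a blocking "model call" by a timeout, analogous
// in spirit to configureModelWithTimeout. The real method operates on an
// OllamaChatModel; here a Supplier stands in for the model invocation.
public class TimeoutSketch {

    static <T> T callWithTimeout(Supplier<T> call, long timeoutMillis, T fallback) {
        CompletableFuture<T> future = CompletableFuture.supplyAsync(call);
        try {
            // Wait at most timeoutMillis for the call to finish.
            return future.get(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (TimeoutException e) {
            future.cancel(true); // give up on the slow call
            return fallback;     // degrade gracefully instead of hanging
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return fallback;
        } catch (ExecutionException e) {
            return fallback;     // the call itself failed
        }
    }

    public static void main(String[] args) {
        // A fast call completes well within the 1 s budget.
        System.out.println(callWithTimeout(() -> "ok", 1000, "timed out"));
        // A call that sleeps 500 ms exceeds the 100 ms budget and falls back.
        System.out.println(callWithTimeout(() -> {
            try { Thread.sleep(500); } catch (InterruptedException e) { }
            return "slow";
        }, 100, "timed out"));
    }
}
```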
isServiceHealthy
getChatModel
public org.springframework.ai.ollama.OllamaChatModel getChatModel()