Class SpringAIOllamaChatService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaChatService
- All Implemented Interfaces:
SpringAIService
@Service
@ConditionalOnProperty(prefix="spring.ai.ollama.chat",
name="enabled",
havingValue="true",
matchIfMissing=false)
public class SpringAIOllamaChatService
extends BaseSpringAIService
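The `@ConditionalOnProperty` gate above (with `matchIfMissing=false`) means this service is only registered when the property is set explicitly. A minimal configuration fragment that enables it might look like this; the `base-url` value is an assumption (Ollama's default local endpoint), not taken from this page:

```properties
# Register SpringAIOllamaChatService (matchIfMissing=false, so this must be explicit)
spring.ai.ollama.chat.enabled=true
# Assumed: Ollama's default local endpoint
spring.ai.ollama.base-url=http://localhost:11434
```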
-
Nested Class Summary
Nested classes/interfaces inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
BaseSpringAIService.PromptResult
-
Field Summary
Fields
Modifier and Type	Field
private org.springframework.ai.ollama.api.OllamaApi	bytedeskOllamaApi
private org.springframework.ai.ollama.OllamaChatModel	bytedeskOllamaChatModel
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
applicationEventPublisher, chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, messagePersistCache, messageRestService, messageSendService, robotMessageCache, robotRestService, textElasticService, textVectorService, threadRestService, uidUtils
-
Constructor Summary
Constructors
SpringAIOllamaChatService()
-
Method Summary
Modifier and Type	Method	Description
org.springframework.ai.ollama.api.OllamaApi	createOllamaApi(String apiUrl)
private org.springframework.ai.ollama.api.OllamaOptions	createOllamaOptions(RobotLlm llm)	Creates dynamic OllamaOptions from the robot configuration
boolean	isModelExists(OllamaRequest request)	Checks whether the model exists
protected void	processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
protected String	processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
protected void	processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
createDynamicOptions, extractFullPromptContent, extractTextFromResponse, extractTokenUsage, getTokenUnitPrice, handleSseError, isEmitterCompleted, persistMessage, persistMessage, persistMessage, persistMessage, processDirectLlmRequest, processLlmResponseWebsocket, publishAiTokenUsageEvent, recordAiTokenUsage, searchKnowledgeBase, sendMessageWebsocket, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendSyncMessage, sendWebsocketMessage
-
Field Details
-
bytedeskOllamaApi
@Autowired @Qualifier("bytedeskOllamaApi") private org.springframework.ai.ollama.api.OllamaApi bytedeskOllamaApi
-
bytedeskOllamaChatModel
@Autowired(required=false) @Qualifier("bytedeskOllamaChatModel") private org.springframework.ai.ollama.OllamaChatModel bytedeskOllamaChatModel
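Because `bytedeskOllamaChatModel` is injected with `required=false`, the field may be null when no Ollama chat model bean exists, so callers need a guard before using it. A minimal sketch of that pattern (the helper class, method name, and message are hypothetical, not from this class):

```java
// Hypothetical helper illustrating the required=false null-guard pattern.
public class BeanGuard {
    public static <T> T requireBean(T bean, String name) {
        if (bean == null) {
            // Mirrors the @ConditionalOnProperty gate: the bean only exists
            // when spring.ai.ollama.chat.enabled=true.
            throw new IllegalStateException(name + " is not configured");
        }
        return bean;
    }
}
```

The guard keeps the failure explicit and close to the call site instead of surfacing later as a NullPointerException deep in request handling.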
-
-
Constructor Details
-
SpringAIOllamaChatService
public SpringAIOllamaChatService()
-
-
Method Details
-
createOllamaOptions
Creates dynamic OllamaOptions from the robot configuration.
- Parameters:
llm - the robot's LLM configuration
- Returns:
options built from the robot configuration
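The dynamic-options idea can be sketched without Spring AI types: copy only the settings the robot actually specifies into the request options, so unset values fall back to Ollama's defaults. The `RobotLlm` record, its accessors, and the option keys below are assumptions for illustration, not this service's actual API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OllamaOptionsSketch {
    // Hypothetical stand-in for the robot's LLM configuration.
    public record RobotLlm(String textModel, Double temperature, Double topP) {}

    // Builds an options map from the robot config, skipping unset values
    // so Ollama falls back to its own defaults.
    public static Map<String, Object> createOllamaOptions(RobotLlm llm) {
        Map<String, Object> opts = new LinkedHashMap<>();
        if (llm.textModel() != null) opts.put("model", llm.textModel());
        if (llm.temperature() != null) opts.put("temperature", llm.temperature());
        if (llm.topP() != null) opts.put("top_p", llm.topP());
        return opts;
    }
}
```

In the real service the same per-robot values would be carried by the returned `org.springframework.ai.ollama.api.OllamaOptions` instead of a plain map.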
-
createOllamaApi
-
isModelExists
Checks whether a model exists.
- Parameters:
model - the model name
- Returns:
true if the model exists, false otherwise
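Ollama reports locally installed models at `GET /api/tags`, so an existence check can fetch that listing and scan the model names. The sketch below is a deliberately crude string match on an already-fetched JSON response body (the HTTP call itself is omitted); treat it as an illustration of the check, not this service's implementation:

```java
public class ModelCheckSketch {
    // Returns true if the given model name appears in an /api/tags response body.
    // Ollama names local models "name:tag" and defaults the tag to ":latest".
    public static boolean isModelExists(String tagsJson, String model) {
        return tagsJson.contains("\"name\":\"" + model + "\"")
            || tagsJson.contains("\"name\":\"" + model + ":latest\"");
    }
}
```

A production version would parse the JSON properly and handle connection failures to the Ollama endpoint.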
-
processPromptWebsocket
protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
- Specified by:
processPromptWebsocket
in classBaseSpringAIService
-
processPromptSync
- Specified by:
processPromptSync
in classBaseSpringAIService
-
processPromptSse
protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
- Specified by:
processPromptSse
in classBaseSpringAIService
-