Class SpringAIOllamaChatService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaChatService
- All Implemented Interfaces:
SpringAIService
@Service
@ConditionalOnProperty(prefix="spring.ai.ollama.chat",
name="enabled",
havingValue="true",
matchIfMissing=false)
public class SpringAIOllamaChatService
extends BaseSpringAIService
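Per the @ConditionalOnProperty annotation above (with matchIfMissing=false), this service is only registered when the enabling property is set explicitly. A minimal application.properties fragment might look like the following; the base-url and model values are illustrative, not taken from this class:

```properties
# Required: registers SpringAIOllamaChatService (matchIfMissing=false, so it must be set)
spring.ai.ollama.chat.enabled=true
# Illustrative Spring AI Ollama settings (assumed defaults, adjust to your deployment)
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.options.model=llama3
```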
-
Nested Class Summary
Nested classes/interfaces inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
BaseSpringAIService.PromptResult -
Field Summary
Fields
Modifier and Type: private org.springframework.ai.ollama.api.OllamaApi — Field: bytedeskOllamaApi
Modifier and Type: private org.springframework.ai.ollama.OllamaChatModel — Field: bytedeskOllamaChatModel
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
applicationEventPublisher, chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, messagePersistCache, messageRestService, messageSendService, robotMessageCache, robotRestService, textElasticService, textVectorService, threadRestService, uidUtils -
Constructor Summary
Constructors
SpringAIOllamaChatService() -
Method Summary
- org.springframework.ai.ollama.api.OllamaApi createOllamaApi(String apiUrl)
- private org.springframework.ai.ollama.api.OllamaOptions createOllamaOptions(llm): creates dynamic OllamaOptions based on the robot configuration
- isModelExists(OllamaRequest request): checks whether the model exists
- protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
- protected String processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
- protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent): Token usage data class
Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
createDynamicOptions, extractFullPromptContent, extractTextFromResponse, extractTokenUsage, getTokenUnitPrice, handleSseError, isEmitterCompleted, persistMessage, persistMessage, persistMessage, persistMessage, processDirectLlmRequest, processLlmResponseWebsocket, publishAiTokenUsageEvent, recordAiTokenUsage, searchKnowledgeBase, sendMessageWebsocket, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendSyncMessage, sendWebsocketMessage
-
Field Details
-
bytedeskOllamaApi
@Autowired @Qualifier("bytedeskOllamaApi") private org.springframework.ai.ollama.api.OllamaApi bytedeskOllamaApi
bytedeskOllamaChatModel
@Autowired(required=false) @Qualifier("bytedeskOllamaChatModel") private org.springframework.ai.ollama.OllamaChatModel bytedeskOllamaChatModel
-
-
Constructor Details
-
SpringAIOllamaChatService
public SpringAIOllamaChatService()
-
-
Method Details
-
createOllamaOptions
Creates dynamic OllamaOptions based on the robot configuration.
- Parameters:
llm - the robot's LLM configuration
- Returns:
the options created from the robot configuration
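The method body is not shown in this reference. As a rough illustration of the mapping it performs, the sketch below builds an options object from a robot LLM configuration, falling back to defaults for unset values. The field names (model, temperature, topP), the default values, and both record types are hypothetical stand-ins, not the actual RobotLlm or OllamaOptions API:

```java
// Illustrative sketch only: the real method builds
// org.springframework.ai.ollama.api.OllamaOptions from the robot's LLM config.
public class OllamaOptionsSketch {
    // Hypothetical stand-ins for the robot config and the resulting options
    record RobotLlmConfig(String model, Double temperature, Double topP) {}
    record Options(String model, double temperature, double topP) {}

    static Options createOptions(RobotLlmConfig llm) {
        // Fall back to assumed defaults when the robot configuration leaves a value unset
        String model = llm.model() != null ? llm.model() : "llama3";
        double temperature = llm.temperature() != null ? llm.temperature() : 0.7;
        double topP = llm.topP() != null ? llm.topP() : 0.9;
        return new Options(model, temperature, topP);
    }

    public static void main(String[] args) {
        Options opts = createOptions(new RobotLlmConfig("qwen2", null, 0.8));
        System.out.println(opts);
    }
}
```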
-
createOllamaApi
-
isModelExists
Checks whether the model exists.
- Parameters:
model - the model name
- Returns:
true if the model exists, false otherwise
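The implementation is not shown here. One plausible approach (a sketch under assumptions, not the actual method body) is to compare the requested name against the locally installed models that Ollama reports via its /api/tags endpoint, treating a missing tag as :latest since Ollama names models as name:tag:

```java
import java.util.List;

public class ModelExistsSketch {
    // Ollama reports installed models with an explicit tag, e.g. "llama3:latest";
    // a request may omit the tag, so normalize both sides before comparing.
    static String normalize(String name) {
        return name.contains(":") ? name : name + ":latest";
    }

    /** Returns true if the requested model matches any installed model name. */
    static boolean modelExists(String requested, List<String> installed) {
        String want = normalize(requested);
        return installed.stream().map(ModelExistsSketch::normalize).anyMatch(want::equals);
    }

    public static void main(String[] args) {
        // In the real service the installed list would come from the Ollama API
        List<String> installed = List.of("llama3:latest", "qwen2:7b");
        System.out.println(modelExists("llama3", installed));  // true
        System.out.println(modelExists("mistral", installed)); // false
    }
}
```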
-
processPromptWebsocket
protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Description copied from class: BaseSpringAIService
Token usage data class
- Specified by:
processPromptWebsocket in class BaseSpringAIService
-
processPromptSync
protected String processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
- Specified by:
processPromptSync in class BaseSpringAIService
-
processPromptSse
protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
- Specified by:
processPromptSse in class BaseSpringAIService
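The actual method streams the model's reply to the given SseEmitter. The pattern it follows, forwarding each streamed chunk to the client while also aggregating the full text for later persistence, can be sketched in isolation; the ChunkSink interface below is a hypothetical stand-in for SseEmitter.send, not the real Spring API:

```java
import java.util.List;

public class SsePumpSketch {
    // Hypothetical stand-in for SseEmitter::send
    interface ChunkSink { void send(String chunk); }

    /** Forwards each streamed chunk to the sink and returns the aggregated reply. */
    static String pump(List<String> chunks, ChunkSink sink) {
        StringBuilder full = new StringBuilder();
        for (String chunk : chunks) {
            sink.send(chunk);   // incremental delivery, as the SSE emitter would do
            full.append(chunk); // keep the full text for persistence after completion
        }
        return full.toString();
    }

    public static void main(String[] args) {
        StringBuilder seen = new StringBuilder();
        String reply = pump(List.of("Hel", "lo", "!"), seen::append);
        System.out.println(reply); // Hello!
    }
}
```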
-