Class SpringAICustomService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.custom.SpringAICustomService
- All Implemented Interfaces:
SpringAIService
Unified OpenAI-compatible service.
Supports any LLM provider that exposes an OpenAI-compatible API, including:
- OpenAI
- DeepSeek
- GiteeAI
- Tencent HunYuan
- Baidu Qianfan
- Volcengine
- OpenRouter
- SiliconFlow
- Custom providers
and more.
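One client type can serve every provider above because they all speak the OpenAI wire protocol; only the base URL, API key, and model name change per robot configuration. A minimal wiring sketch, assuming the Spring AI 1.x builder APIs (the method name `buildChatModel` and the example URL/model strings are illustrative, not taken from this class):

```java
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

public class OpenAiCompatibleClientSketch {

    // One code path for every OpenAI-compatible provider:
    // swap the base URL, key, and model name from the robot configuration.
    public static OpenAiChatModel buildChatModel(String apiUrl, String apiKey, String model) {
        OpenAiApi api = OpenAiApi.builder()
                .baseUrl(apiUrl)   // e.g. "https://api.deepseek.com" for DeepSeek
                .apiKey(apiKey)
                .build();
        OpenAiChatOptions options = OpenAiChatOptions.builder()
                .model(model)      // e.g. "deepseek-chat"
                .build();
        return OpenAiChatModel.builder()
                .openAiApi(api)
                .defaultOptions(options)
                .build();
    }
}
```

This mirrors the division of labor in the method summary below: an `OpenAiApi` per endpoint, `OpenAiChatOptions` per model, and a chat model combining both.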
-
Nested Class Summary
Nested classes/interfaces inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
BaseSpringAIService.PromptResult
-
Field Summary
Fields
llmProviderRestService
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
applicationEventPublisher, chunkElasticService, chunkVectorService, faqElasticService, faqVectorService, messagePersistCache, messageRestService, messageSendService, robotMessageCache, robotRestService, textElasticService, textVectorService, threadRestService, uidUtils
-
Constructor Summary
Constructors
SpringAICustomService()
Method Summary
Modifier and Type | Method | Description
private org.springframework.ai.openai.OpenAiChatModel
createChatModel(RobotLlm llm)
Creates a dynamic OpenAiChatModel from the robot configuration
private org.springframework.ai.openai.OpenAiChatOptions
createDynamicOptions(RobotLlm llm)
Creates dynamic OpenAiChatOptions from the robot configuration
org.springframework.ai.openai.api.OpenAiApi
createOpenAiApi(String apiUrl, String apiKey)
Creates a dynamic OpenAiApi from the robot configuration
private long
estimateTokens(String text)
Estimates the token count of a text
private ChatTokenUsage
estimateTokenUsageFromText(String outputText)
Estimates token usage from the full response text
private String
getProviderConstant(RobotLlm llm)
Gets the provider constant from the LLM configuration (for statistics)
private String
getProviderName(RobotLlm llm)
Gets the provider name from the LLM configuration (for logging)
private String
mapProviderNameToConstant(String providerName)
Maps a provider name to its corresponding constant
protected void
processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
protected String
processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
protected void
processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
createDynamicOptions, extractFullPromptContent, extractTextFromResponse, extractTokenUsage, getTokenUnitPrice, handleSseError, isEmitterCompleted, persistMessage, persistMessage, persistMessage, persistMessage, processDirectLlmRequest, processLlmResponseWebsocket, publishAiTokenUsageEvent, recordAiTokenUsage, searchKnowledgeBase, sendMessageWebsocket, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendSyncMessage, sendWebsocketMessage
-
Field Details
-
llmProviderRestService
-
-
Constructor Details
-
SpringAICustomService
public SpringAICustomService()
-
-
Method Details
-
createDynamicOptions
Creates dynamic OpenAiChatOptions from the robot configuration.
- Parameters:
llm - the robot's LLM configuration
- Returns:
options created from the robot configuration
-
createOpenAiApi
Creates a dynamic OpenAiApi from the robot configuration.
- Parameters:
apiUrl - the API URL
apiKey - the API key
- Returns:
the configured OpenAiApi instance
-
createChatModel
Creates a dynamic OpenAiChatModel from the robot configuration.
- Parameters:
llm - the robot's LLM configuration
- Returns:
an OpenAiChatModel configured with the specific model
-
processPromptWebsocket
protected void processPromptWebsocket(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, String fullPromptContent)
Description copied from class: BaseSpringAIService
Token usage data class
- Specified by:
processPromptWebsocket
in class BaseSpringAIService
-
processPromptSync
protected String processPromptSync(String message, RobotProtobuf robot, String fullPromptContent)
- Specified by:
processPromptSync
in class BaseSpringAIService
-
processPromptSse
protected void processPromptSse(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter, String fullPromptContent)
- Specified by:
processPromptSse
in class BaseSpringAIService
-
estimateTokenUsageFromText
Estimates token usage from the full response text.
- Parameters:
outputText - the complete output text
- Returns:
the estimated TokenUsage object
-
estimateTokens
Estimates the token count of a text.
- Parameters:
text - the input text
- Returns:
the estimated token count
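The class does not document its estimation formula. A common heuristic for OpenAI-style tokenizers is roughly one token per CJK character plus one token per four other characters; the runnable sketch below uses that assumption, and its formula is illustrative rather than the service's actual implementation:

```java
// Heuristic token estimator: ~1 token per Han character,
// ~1 token per 4 characters of everything else.
public class TokenEstimator {

    public static long estimateTokens(String text) {
        if (text == null || text.isEmpty()) {
            return 0L;
        }
        long cjkChars = 0;
        long otherChars = 0;
        for (int i = 0; i < text.length(); ) {
            int cp = text.codePointAt(i);
            if (Character.UnicodeScript.of(cp) == Character.UnicodeScript.HAN) {
                cjkChars++;   // CJK text is roughly one token per character
            } else {
                otherChars++; // Latin text averages ~4 characters per token
            }
            i += Character.charCount(cp);
        }
        return cjkChars + (otherChars + 3) / 4; // round the Latin share up
    }

    public static void main(String[] args) {
        System.out.println(estimateTokens("Hello, how are you today?"));
        System.out.println(estimateTokens("你好世界"));
    }
}
```

Real tokenizers (e.g. tiktoken's `cl100k_base`) give exact counts; a heuristic like this is only useful when the provider's usage payload is missing, which is exactly the situation `estimateTokenUsageFromText` handles.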
-
getProviderName
Gets the provider name from the LLM configuration (for logging)
-
getProviderConstant
Gets the provider constant from the LLM configuration (for statistics)
-
mapProviderNameToConstant
Maps a provider name to its corresponding constant
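A plausible shape for such a mapping is a lookup table keyed on the normalized provider name, falling back to a generic custom constant for unknown providers. The constant names below are hypothetical placeholders, not bytedesk's real constants:

```java
import java.util.Locale;
import java.util.Map;

// Sketch of a provider-name-to-constant mapping with a safe fallback.
public class ProviderMapper {

    // Hypothetical constants; the real ones live elsewhere in the codebase.
    private static final Map<String, String> PROVIDER_CONSTANTS = Map.of(
            "openai", "LLM_PROVIDER_OPENAI",
            "deepseek", "LLM_PROVIDER_DEEPSEEK",
            "siliconflow", "LLM_PROVIDER_SILICONFLOW",
            "openrouter", "LLM_PROVIDER_OPENROUTER");

    public static String mapProviderNameToConstant(String providerName) {
        if (providerName == null || providerName.isBlank()) {
            return "LLM_PROVIDER_CUSTOM"; // unknown input maps to the generic custom provider
        }
        return PROVIDER_CONSTANTS.getOrDefault(
                providerName.trim().toLowerCase(Locale.ROOT), "LLM_PROVIDER_CUSTOM");
    }
}
```

Normalizing case before the lookup keeps "OpenAI", "openai", and "OPENAI" on the same statistics bucket, which matters when provider names come from user-edited robot configurations.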
-