Uses of Class
com.bytedesk.ai.springai.providers.ollama.OllamaRequest
Packages that use OllamaRequest
Package: com.bytedesk.ai.springai.providers.ollama
Description: Ollama AI service provider integration package, containing all classes and configuration for integration with the Spring AI framework.
Uses of OllamaRequest in com.bytedesk.ai.springai.providers.ollama
Methods in com.bytedesk.ai.springai.providers.ollama with parameters of type OllamaRequest

org.springframework.http.ResponseEntity<JsonResult<?>>  SpringAIOllamaChatController.chatCustom(OllamaRequest request)
    Example invocation with custom model parameters: http://127.0.0.1:9003/api/v1/ollama/chat/custom?
org.springframework.web.servlet.mvc.method.annotation.SseEmitter  SpringAIOllamaChatController.chatSSE(OllamaRequest request)
    Approach 3: SSE invocation: http://127.0.0.1:9003/api/v1/ollama/chat/sse?
reactor.core.publisher.Flux<org.springframework.ai.chat.model.ChatResponse>  SpringAIOllamaChatController.chatStream(OllamaRequest request)
    Approach 2: asynchronous streaming invocation: http://127.0.0.1:9003/api/v1/ollama/chat/stream?
org.springframework.http.ResponseEntity<JsonResult<?>>  SpringAIOllamaChatController.chatSync(OllamaRequest request)
    Approach 1: synchronous invocation: http://127.0.0.1:9003/api/v1/ollama/chat/sync?
io.github.ollama4j.OllamaAPI  Ollama4jService.createOllamaAPI(OllamaRequest request)
    Creates an OllamaAPI instance from the apiUrl carried in the request.
org.springframework.http.ResponseEntity<?>  Ollama4jRestController.deleteModel(OllamaRequest request)
void  Ollama4jService.deleteModel(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  Ollama4jChatController.getAsyncAnswer(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  Ollama4jChatController.getChatWithContext(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<io.github.ollama4j.models.response.LibraryModelDetail>>  Ollama4jRestController.getLibraryModelDetails(OllamaRequest request)
io.github.ollama4j.models.response.LibraryModelDetail  Ollama4jService.getLibraryModelDetails(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<List<io.github.ollama4j.models.response.Model>>>  Ollama4jRestController.getLocalModels(OllamaRequest request)
List<io.github.ollama4j.models.response.Model>  Ollama4jService.getLocalModels(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<io.github.ollama4j.models.response.ModelDetail>>  Ollama4jRestController.getModelDetails(OllamaRequest request)
io.github.ollama4j.models.response.ModelDetail  Ollama4jService.getModelDetails(OllamaRequest request)
    https://ollama4j.github.io/ollama4j/apis-model-management/get-model-details
org.springframework.http.ResponseEntity<JsonResult<List<io.github.ollama4j.models.response.LibraryModel>>>  Ollama4jRestController.getModels(OllamaRequest request)
List<io.github.ollama4j.models.response.LibraryModel>  Ollama4jService.getModels(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<io.github.ollama4j.models.response.LibraryModelTag>>  Ollama4jRestController.getModelTag(OllamaRequest request)
io.github.ollama4j.models.response.LibraryModelTag  Ollama4jService.getModelTag(OllamaRequest request)
io.github.ollama4j.models.ps.ModelsProcessResponse  Ollama4jService.getPs(OllamaRequest request)
org.springframework.web.servlet.mvc.method.annotation.SseEmitter  Ollama4jChatController.getStreamAnswerSse(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  Ollama4jChatController.getSyncAnswer(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  Ollama4jChatController.getSyncAnswerStream(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  SpringAIOllamaChatController.isEmbeddingModelExists(OllamaRequest request)
SpringAIOllamaChatService.isModelExists(OllamaRequest request)
    Checks whether the model exists.
SpringAIOllamaService.isModelExists(OllamaRequest request)
    Checks whether the model exists.
Ollama4jService.isOllama4jReachable(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<Boolean>>  Ollama4jRestController.ping(OllamaRequest request)
org.springframework.http.ResponseEntity<JsonResult<io.github.ollama4j.models.ps.ModelsProcessResponse>>  Ollama4jRestController.processModelsResponse(OllamaRequest request)
org.springframework.http.ResponseEntity<?>  Ollama4jRestController.pullModel(OllamaRequest request)
void  Ollama4jService.pullModel(OllamaRequest request)
    https://ollama4j.github.io/ollama4j/apis-model-management/pull-model
    ollamaAPI.pullModel(OllamaModelType.LLAMA2);
void  Ollama4jService.pullModel(OllamaRequest request, io.github.ollama4j.models.response.LibraryModelTag libraryModelTag)
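The methods above fall into three groups: chat invocation (SpringAIOllamaChatController, Ollama4jChatController), model management (pull, delete, list, details), and reachability checks (ping, isModelExists, isOllama4jReachable). The sketch below is illustrative only: it exercises the same ollama4j entry points that the Ollama4jService methods are documented to wrap. Because this page does not show the accessor names or full field set of OllamaRequest, the sketch takes plain strings where the real methods take an OllamaRequest; the apiUrl handling mirrors what createOllamaAPI is described as doing.

import io.github.ollama4j.OllamaAPI;
import io.github.ollama4j.models.response.Model;

import java.util.List;

// Illustrative sketch only, not the project's actual implementation.
public class OllamaRequestUsageSketch {

    // Stand-in for Ollama4jService.createOllamaAPI: the real method takes an
    // OllamaRequest; a plain String is used here because the request's
    // accessor names are not shown on this page.
    static OllamaAPI createOllamaAPI(String apiUrl) {
        return new OllamaAPI(apiUrl);
    }

    public static void main(String[] args) throws Exception {
        String apiUrl = "http://127.0.0.1:11434"; // assumed local Ollama endpoint
        OllamaAPI ollamaAPI = createOllamaAPI(apiUrl);

        // Reachability check, analogous to isOllama4jReachable / ping.
        System.out.println("Ollama reachable: " + ollamaAPI.ping());

        // Local model listing, analogous to getLocalModels.
        List<Model> localModels = ollamaAPI.listModels();
        localModels.forEach(System.out::println);

        // Model pull, analogous to pullModel (the ollama4j page linked above
        // shows the same call with OllamaModelType.LLAMA2).
        ollamaAPI.pullModel("llama2");
    }
}

The controller endpoints listed in the table (for example http://127.0.0.1:9003/api/v1/ollama/chat/sync?) expose the same operations over HTTP; their query parameters are not shown on this page and are therefore not reproduced here.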