diff --git a/docs/advanced-features/imgs/ai-assistant-enable-mcp-tools.png b/docs/advanced-features/imgs/ai-assistant-enable-mcp-tools.png
new file mode 100644
index 0000000000..5b99aa1285
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-enable-mcp-tools.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-fullscreen.png b/docs/advanced-features/imgs/ai-assistant-fullscreen.png
new file mode 100644
index 0000000000..77798c8738
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-fullscreen.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-interface.png b/docs/advanced-features/imgs/ai-assistant-interface.png
new file mode 100644
index 0000000000..ab5b8ab932
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-interface.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-mcp-tools-management.png b/docs/advanced-features/imgs/ai-assistant-mcp-tools-management.png
new file mode 100644
index 0000000000..a1afaaabd5
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-mcp-tools-management.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-open.png b/docs/advanced-features/imgs/ai-assistant-open.png
new file mode 100644
index 0000000000..7d0169cfaf
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-open.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-prompts.png b/docs/advanced-features/imgs/ai-assistant-prompts.png
new file mode 100644
index 0000000000..5ad998edbd
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-prompts.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-settings.png b/docs/advanced-features/imgs/ai-assistant-settings.png
new file mode 100644
index 0000000000..3b635634e0
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-settings.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-tool-execution1.png
b/docs/advanced-features/imgs/ai-assistant-tool-execution1.png
new file mode 100644
index 0000000000..68a9ed28ad
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-tool-execution1.png differ
diff --git a/docs/advanced-features/imgs/ai-assistant-tool-execution2.png b/docs/advanced-features/imgs/ai-assistant-tool-execution2.png
new file mode 100644
index 0000000000..d41191b125
Binary files /dev/null and b/docs/advanced-features/imgs/ai-assistant-tool-execution2.png differ
diff --git a/docs/advanced-features/new-ai-plugin-usage.md b/docs/advanced-features/new-ai-plugin-usage.md
new file mode 100644
index 0000000000..cf0f532e63
--- /dev/null
+++ b/docs/advanced-features/new-ai-plugin-usage.md
@@ -0,0 +1,225 @@
+# Guide to the New AI Plugin
+
+As the TinyEngine low-code platform keeps evolving, the AI plugin has received a major upgrade. The new AI plugin (v2.8 and later) adopts a brand-new UI based on the TinyRobot component library and, through the OpenTiny Next SDK, integrates MCP (Model Context Protocol) tool calling, so the AI can invoke the tools exposed by the platform's plugins and perform more powerful automated operations.
+
+## 1. Feature Overview
+
+The new AI plugin provides the following core features:
+
+1. **Brand-new UI**: built on the TinyRobot component library for a more modern chat experience, with Markdown rendering, full-screen mode, and more
+2. **Smart conversation**: interact with the AI in natural language to complete complex tasks
+3. **Next SDK and MCP tool integration**: the AI can call the tools provided by platform plugins, such as modifying properties, modifying styles, and creating pages
+
+## 2. Interface Overview
+
+### 2.1 Main interface
+
+In the plugin bar at the bottom left of the editor you will find the AI assistant icon. Click it to open the main chat window.
+
+![Open the AI assistant](./imgs/ai-assistant-open.png)
+
+The main interface contains the following elements:
+- **Welcome area**: shows the AI assistant's welcome message
+- **Prompt suggestions**: sample questions for a quick start
+- **Conversation area**: shows the chat history between you and the AI
+- **Input area**: where you type your questions
+- **MCP tools button**: the entry point for managing and configuring MCP tools
+
+![AI plugin main interface](./imgs/ai-assistant-interface.png)
+
+### 2.2 Settings
+
+Click the settings icon at the top to configure:
+- AI model: multiple large language models are supported
+- API Token: the credentials required to access the AI service
+
+![AI settings panel](./imgs/ai-assistant-settings.png)
+
+Note: switching the AI model starts a new session.
+
+### 2.3 MCP tool management
+
+![MCP tool management](./imgs/ai-assistant-mcp-tools-management.png)
+
+The MCP tool management panel lets you:
+- View the enabled MCP servers
+- Enable or disable individual MCP tools
+- Add new MCP servers (coming soon)
+
+## 3. Basic Usage
+
+### 3.1 Set up the model API
+
+1. [Optional] Configure a custom OpenAI-compatible large-model API
+
+The AI plugin's `customCompatibleAIModels` option lets you register additional OpenAI-compatible models:
+
+```javascript
+// registry.js
+[META_APP.Robot]: {
+  options: {
+    customCompatibleAIModels: [
+      { label: 'SiliconFlow:DeepSeek-V3', value: 'deepseek-ai/DeepSeek-V3', manufacturer: 'siliconflow' },
+      { label: 'Qwen:qwen-max', value: 'qwen-max', manufacturer: 'qwen' },
+    ]
+  }
+},
+```
+
+2. Configure a local proxy for the AI model API used by MCP tool calls to make local debugging easier. Using Bailian (DashScope) as an example:
+```javascript
+// vite.config.js
+const originProxyConfig = baseConfig.server.proxy
+baseConfig.server.proxy = {
+  '/app-center/api/chat/completions': {
+    target: 'https://dashscope.aliyuncs.com',
+    changeOrigin: true,
+    rewrite: path => path.replace('/app-center/api/', '/compatible-mode/v1/'),
+  },
+  ...originProxyConfig,
+}
+```
+
+### 3.2 Open the AI plugin
+
+Click the AI icon at the bottom left of the designer to open the AI plugin dialog.
+
+### 3.3 Configure the AI model
+
+On first use, or when switching models, configuration is required:
+
+1. Click the settings button at the top right
+2. Choose a suitable AI model
+3. Enter the corresponding API Token
+
+### 3.4 Start a conversation
+
+Type a question in the input box and press Enter or click the send button to start chatting.
+
+## 4. Using MCP Tools
+
+### 4.1 MCP tools overview
+
+MCP (Model Context Protocol) support is one of the core features of the new AI plugin. Through MCP, the AI can call the tools provided by the platform's plugins, for example:
+- Create a new page
+- Modify component properties
+- Modify style settings
+- Query the page list
+- Add internationalization content
+
+### 4.2 Enable MCP tools
+
+1. Click the "MCP" button on the left of the input box to open the tool management panel
+2. Review the available MCP servers and tools
+3. Enable the tools you need
+
+![Enable MCP tools](./imgs/ai-assistant-enable-mcp-tools.png)
+
+### 4.3 Use MCP tools
+
+Once MCP tools are enabled, the AI automatically decides during a conversation whether a tool call is needed. For example:
+
+1. The user asks: "Create a user list page for me"
+2. The AI recognizes that the "create page" tool is required
+3. The create-page operation is executed automatically
+4. The result is returned to the user
+
+A tool call in progress (using a mock weather-query tool as an example):
+![MCP tool call in progress](./imgs/ai-assistant-tool-execution1.png)
+When the call completes, the final result is returned; you can also expand the entry to inspect the call parameters and result:
+![MCP tool call result](./imgs/ai-assistant-tool-execution2.png)
+
+## 5. Typical Scenarios
+
+### 5.1 Page building (coming soon)
+
+Pages can be generated from a natural-language description:
+
+```
+User: Create a page with a user information form containing name, email, and phone number fields
+AI: Sure, I will create a page with a user information form...
+[create-page operation executed]
+AI: The page has been created; you can find the new form page in the page list
+```
+
+### 5.2 Modifying component properties
+
+Properties of components on the canvas can be changed through conversation:
+
+```
+User: Change the text of the selected button to "Submit" and its color to blue
+AI: Sure, I will modify the button properties...
+[modify-properties operation executed]
+AI: The button properties have been updated
+```
+
+### 5.3 Adjusting styles
+
+Component styles can be adjusted in natural language:
+
+```
+User: Set the page title's font size to 24px and center it
+AI: Sure, I will adjust the title style...
+[modify-style operation executed]
+AI: The title style has been updated
+```
+
+## 6. Other Features
+
+### 6.1 Session management
+
+- **New session**: click the "New session" button at the top right to start a new conversation
+
+### 6.2 Prompt suggestions for a quick start
+
+The interface offers several prompt suggestions to help you get started quickly:
+- Using MCP tools
+- Page-building scenarios
+- Learning / knowledge scenarios
+
+Click any suggestion to send the corresponding sample question.
+
+### 6.3 Markdown support
+
+The AI assistant renders messages as Markdown, which is well suited to presenting:
+- Code snippets
+- Tables
+- Lists
+- Other rich-text content
+
+### 6.4 Full-screen mode
+
+Click the expand button at the top right to enter full-screen mode and get a larger chat area.
+
+![Full-screen mode](./imgs/ai-assistant-fullscreen.png)
+
+## 7. Notes
+
+1. **Save first**: make sure the current page or block is saved before using the AI plugin
+2. **Network**: AI features require a stable network connection
+3. **API Token**: a valid API Token must be configured for the AI features to work
+4. **MCP tool dependencies**: some MCP tools require the corresponding plugins to be installed and enabled
+
+## 8. Troubleshooting
+
+### 8.1 Cannot connect to the AI service
+
+- Check that the network connection is working
+- Confirm the API Token is configured correctly
+- Check that the AI service is up and running
+
+### 8.2 MCP tools do not work
+
+- Confirm the relevant plugins are installed and enabled
+- Check that the MCP tools are enabled
+- Look for error messages in the console
+
+### 8.3 Other problems
+
+- Try refreshing the page
+- Clear the browser cache
+- Contact technical support
+
+With the above, you should be able to use all the features of the new AI plugin with confidence. As the platform evolves, the AI plugin will gain even more powerful capabilities to help you build with low code more efficiently.
diff --git a/docs/catalog.json b/docs/catalog.json
index 3066d29117..d2ff885ed9 100644
--- a/docs/catalog.json
+++ b/docs/catalog.json
@@ -41,6 +41,7 @@
       { "title": "循环渲染", "name": "loop-rendering.md" },
       { "title": "条件渲染", "name": "conditional-rendering.md" },
       { "title": "集成ChatGPT搭建简单页面能力", "name": "integrating-chatgpt-for-simple-pages.md" },
+      { "title": "新版AI插件使用", "name": "new-ai-plugin-usage.md" },
       { "title": "数据源和Collection—远程字段", "name": "data-source-and-collection-remote-fields.md" },
       { "title": "数据源和Collection—mock数据", "name": "data-source-and-collection-mock-data.md" },
       { "title": "数据源和Collection—使用数据源", "name": "data-source-and-collection-usage.md" },
diff --git a/packages/plugins/robot/index.ts b/packages/plugins/robot/index.ts
index f84692b4e6..e4ce8a93bd 100644
--- a/packages/plugins/robot/index.ts
+++ b/packages/plugins/robot/index.ts
@@ -13,6 +13,7 @@ import RobotIcon from './src/Main.vue'
 import metaData from './meta'
 import './src/styles/vars.less'
+import '@opentiny/tiny-robot/dist/style.css'
 
 export default {
   ...metaData,
diff --git a/packages/plugins/robot/package.json b/packages/plugins/robot/package.json
index 
0c2185f383..8ef6b469ef 100644
--- a/packages/plugins/robot/package.json
+++ b/packages/plugins/robot/package.json
@@ -25,10 +25,17 @@
   "license": "MIT",
   "homepage": "https://opentiny.design/tiny-engine",
   "dependencies": {
-    "@opentiny/tiny-engine-meta-register": "workspace:*"
+    "@opentiny/tiny-engine-meta-register": "workspace:*",
+    "@opentiny/tiny-robot": "0.3.0-alpha.14",
+    "@opentiny/tiny-robot-kit": "0.3.0-alpha.14",
+    "@opentiny/tiny-robot-svgs": "0.3.0-alpha.14",
+    "dompurify": "^3.0.1",
+    "highlight.js": "^11.11.1",
+    "markdown-it": "^14.1.0"
   },
   "devDependencies": {
     "@opentiny/tiny-engine-vite-plugin-meta-comments": "workspace:*",
+    "@types/markdown-it": "^14.1.2",
     "@vitejs/plugin-vue": "^5.1.2",
     "@vitejs/plugin-vue-jsx": "^4.0.1",
     "vite": "^5.4.2"
diff --git a/packages/plugins/robot/src/Main.vue b/packages/plugins/robot/src/Main.vue
index 511909ba80..fb9326ccf2 100644
--- a/packages/plugins/robot/src/Main.vue
+++ b/packages/plugins/robot/src/Main.vue
@@ -4,19 +4,14 @@
[hunk body elided: the Vue template markup was stripped during text extraction and cannot be recovered]
@@ -113,26 +73,37 @@
[hunk body elided: the Vue script markup was stripped during text extraction and cannot be recovered]
diff --git a/packages/plugins/robot/src/mcp/McpServer.vue b/packages/plugins/robot/src/mcp/McpServer.vue
new file mode 100644
index 0000000000..28d403a041
--- /dev/null
+++ b/packages/plugins/robot/src/mcp/McpServer.vue
@@ -0,0 +1,179 @@
[hunk body elided: the Vue component markup was stripped during text extraction and cannot be recovered]
diff --git a/packages/plugins/robot/src/mcp/types.ts b/packages/plugins/robot/src/mcp/types.ts
new file mode 100644
index 0000000000..6304732af3
--- /dev/null
+++ b/packages/plugins/robot/src/mcp/types.ts
@@ -0,0 +1,87 @@
+import type { BubbleContentItem } from '@opentiny/tiny-robot'
+
+export interface RequestOptions {
+  url?: string
+  model?: string
+  headers?: Record<string, string>
+}
+
+export interface RequestTool {
+  type: 'function'
+  function: {
+    name: string
+    description: string
+    parameters: {
+      type: 'object'
+      required?: string[]
+      properties: Record<
+        string,
+        {
+          type: string
+          description: string
+          [prop: string]: unknown
+        }
+      >
+    }
+  }
+}
+
+export interface LLMMessage {
+  role: string
+  content: string
+  [prop: string]: unknown
+}
+
+export interface RobotMessage {
+  role: string
+  content: string | BubbleContentItem[]
+  [prop: string]: unknown
+}
+
+export interface LLMRequestBody {
+  model?: string
+  stream: boolean
+  messages: LLMMessage[]
+  tools?: RequestTool[]
+}
+
+export interface ReponseToolCall {
+  id: string
+  function: {
+    name: string
+    arguments: string
+  }
+}
+
+export interface LLMResponse {
+  choices: Array<{
+    message: {
+      role?: string
+      content: string
+      tool_calls?: Array<ReponseToolCall>
+      [prop: string]: unknown
+    }
+  }>
+}
+
+export interface McpTool {
+  name: string
+  description: string
+  inputSchema?: {
+    type: 'object'
+    properties: Record<
+      string,
+      {
+        type: string
+        description: string
+        [prop: string]: unknown
+      }
+    >
+    [prop: string]: unknown
+  }
+  [prop: string]: unknown
+}
+
+export interface McpListToolsResponse {
+  tools: Array<McpTool>
+}
diff --git a/packages/plugins/robot/src/mcp/useMcp.ts b/packages/plugins/robot/src/mcp/useMcp.ts
new file mode 100644
index 
0000000000..e88ea9b7af 100644
--- /dev/null
+++ b/packages/plugins/robot/src/mcp/useMcp.ts
@@ -0,0 +1,185 @@
+import { computed, ref } from 'vue'
+import type { PluginInfo, PluginTool } from '@opentiny/tiny-robot'
+import { getMetaApi, META_SERVICE } from '@opentiny/tiny-engine-meta-register'
+import type { McpListToolsResponse, McpTool, RequestTool } from './types'
+
+const ENGINE_MCP_SERVER: PluginInfo = {
+  id: 'tiny-engine-mcp-server',
+  name: 'Tiny Engine MCP 服务器',
+  icon: 'https://res.hc-cdn.com/lowcode-portal/1.1.80.20250515160330/assets/opentiny-tinyengine-logo-4f8a3801.svg',
+  description: '使用TinyEngine设计器能力,如添加国际化',
+  added: true
+}
+
+const MOCK_SERVERS: PluginInfo[] = [
+  {
+    id: 'plugin-1',
+    name: 'Jira 集成 (Mock)',
+    icon: 'https://ts3.tc.mm.bing.net/th/id/ODLS.2a97aa8b-50c6-4e00-af97-3b563dfa07f4',
+    description: 'Jira 任务管理',
+    enabled: true,
+    added: false,
+    tools: [
+      { id: 'tool-5', name: '创建任务', description: '创建 Jira 任务', enabled: true },
+      { id: 'tool-6', name: '查询任务', description: '查询 Jira 任务', enabled: true }
+    ]
+  },
+  {
+    id: 'plugin-2',
+    name: 'Notion 集成 (Mock)',
+    icon: 'https://www.notion.so/front-static/favicon.ico',
+    description: 'Notion 文档管理和协作',
+    enabled: false,
+    added: false,
+    tools: [
+      { id: 'tool-7', name: '创建页面', description: '创建 Notion 页面', enabled: false },
+      { id: 'tool-8', name: '查询数据库', description: '查询 Notion 数据库', enabled: false }
+    ]
+  },
+  {
+    id: 'plugin-3',
+    name: 'Telegram 机器人 (Mock)',
+    icon: 'https://telegram.org/favicon.ico',
+    description: 'Telegram 消息推送和自动化',
+    enabled: false,
+    added: false,
+    tools: [{ id: 'tool-9', name: '发送消息', description: '发送 Telegram 消息', enabled: false }]
+  }
+]
+
+const mcpServers = ref([ENGINE_MCP_SERVER, ...MOCK_SERVERS])
+
+const inUseMcpServers = ref([
+  { ...ENGINE_MCP_SERVER, enabled: true, expanded: false, tools: [], toolCount: 0 }
+])
+
+const updateServerTools = (serverId: string, tools: PluginTool[]) => {
+  const mcpServer = inUseMcpServers.value.find((item) => item.id === serverId)
+  if (mcpServer) {
+    mcpServer.tools = tools
+    mcpServer.toolCount = tools.length
+  }
+}
+
+const updateEngineTools = async () => {
+  const tools: Array<{ name: string; description: string; status: string }> =
+    (await getMetaApi(META_SERVICE.McpService)?.getToolList?.()) || []
+  const engineTools = tools.map((tool) => ({
+    id: tool.name,
+    name: tool.name,
+    description: tool.description,
+    enabled: tool.status === 'enabled'
+  }))
+  updateServerTools(ENGINE_MCP_SERVER.id, engineTools)
+}
+
+const convertMCPToOpenAITools = (mcpTools: McpTool[]): RequestTool[] => {
+  return mcpTools.map((tool: McpTool) => ({
+    type: 'function',
+    function: {
+      name: tool.name,
+      description: tool.description || '',
+      parameters: {
+        type: 'object',
+        properties: Object.fromEntries(
+          Object.entries(tool.inputSchema?.properties || {}).map(([key, prop]: [string, any]) => [key, { ...prop }])
+        ),
+        required: tool.inputSchema?.required || []
+      }
+    }
+  })) as RequestTool[]
+}
+
+const getEngineServer = () => {
+  return inUseMcpServers.value.find((item) => item.id === ENGINE_MCP_SERVER.id)
+}
+
+const isToolsEnabled = computed(() => getEngineServer()?.tools?.some((tool) => tool.enabled))
+
+const updateEngineServerToolStatus = (toolId: string, enabled: boolean) => {
+  getMetaApi(META_SERVICE.McpService)?.updateTool?.(toolId, { enabled })
+}
+
+const updateEngineServer = (engineServer: PluginInfo, enabled: boolean) => {
+  engineServer?.tools?.forEach((tool) => {
+    tool.enabled = enabled
+    updateEngineServerToolStatus(tool.id, enabled)
+  })
+}
+
+// TODO: connect to the MCP server
+const connectMcpServer = (_server: PluginInfo) => {}
+
+// TODO: disconnect from the MCP server
+const disconnectMcpServer = (_server: PluginInfo) => {}
+
+const updateMcpServerStatus = async (server: PluginInfo, added: boolean) => {
+  if (added) {
+    const newServer: PluginInfo = {
+      ...server,
+      id: server.id || `mcp-server-${Date.now()}`,
+      enabled: true,
+      added: true,
+      expanded: false,
+      tools: server.tools || [],
+      toolCount: server.tools?.length || 0
+    }
+    inUseMcpServers.value.push(newServer)
+    if (server.id === ENGINE_MCP_SERVER.id) {
+      await updateEngineTools()
+      updateEngineServer(newServer, added)
+    }
+    // TODO: connect to the MCP server
+    connectMcpServer(newServer)
+  } else {
+    const index = inUseMcpServers.value.findIndex((p) => p.id === server.id)
+    if (index > -1) {
+      updateEngineServer(inUseMcpServers.value[index], added)
+      inUseMcpServers.value.splice(index, 1)
+      // TODO: disconnect from the MCP server
+      disconnectMcpServer(server)
+    }
+  }
+}
+
+const updateMcpServerToolStatus = (currentServer: PluginInfo, toolId: string, enabled: boolean) => {
+  const tool = currentServer.tools?.find((t: PluginTool) => t.id === toolId)
+  if (tool) {
+    tool.enabled = enabled
+    if (currentServer.id === ENGINE_MCP_SERVER.id) {
+      updateEngineServerToolStatus(toolId, enabled)
+    } else {
+      // TODO: update the tool status on the remote MCP server
+      // (get the tool instance and call enableTool or disableTool)
+    }
+  }
+}
+
+const refreshMcpServerTools = () => {
+  updateEngineTools()
+}
+
+const listTools = async (): Promise<McpListToolsResponse | undefined> =>
+  getMetaApi(META_SERVICE.McpService)?.getMcpClient()?.listTools()
+
+const callTool = async (toolId: string, args: Record<string, unknown>) =>
+  getMetaApi(META_SERVICE.McpService)?.getMcpClient()?.callTool({ name: toolId, arguments: args }) || {}
+
+const getLLMTools = async () => {
+  const mcpTools = await listTools()
+  return convertMCPToOpenAITools(mcpTools?.tools || [])
+}
+
+export default function useMcpServer() {
+  return {
+    mcpServers,
+    inUseMcpServers,
+    refreshMcpServerTools,
+    updateMcpServerStatus,
+    updateMcpServerToolStatus,
+    listTools,
+    callTool,
+    getLLMTools,
+    isToolsEnabled
+  }
+}
diff --git a/packages/plugins/robot/src/mcp/utils.ts b/packages/plugins/robot/src/mcp/utils.ts
new file mode 100644
index 0000000000..5ef759a8d2
--- /dev/null
+++ b/packages/plugins/robot/src/mcp/utils.ts
@@ -0,0 +1,113 @@
+import { toRaw } from 'vue'
+import useMcpServer from './useMcp'
+import type { LLMMessage, RobotMessage } from './types'
+import type { LLMRequestBody, LLMResponse, ReponseToolCall, RequestOptions, RequestTool } from './types'
+
+let requestOptions: RequestOptions = {}
+
+const fetchLLM = async (messages: LLMMessage[], tools: RequestTool[], options: RequestOptions = requestOptions) => {
+  const bodyObj: LLMRequestBody = {
+    model: options?.model || 'deepseek-chat',
+    stream: false,
+    messages: toRaw(messages)
+  }
+  if (tools.length > 0) {
+    bodyObj.tools = toRaw(tools)
+  }
+  return fetch(options?.url || '/app-center/api/chat/completions', {
+    method: 'POST',
+    headers: {
+      'Content-Type': 'application/json',
+      ...options?.headers
+    },
+    body: JSON.stringify(bodyObj)
+  })
+}
+
+const parseArgs = (args: string) => {
+  try {
+    return JSON.parse(args)
+  } catch (error) {
+    return args
+  }
+}
+
+const handleToolCall = async (
+  res: LLMResponse,
+  tools: RequestTool[],
+  messages: RobotMessage[],
+  contextMessages?: RobotMessage[]
+) => {
+  if (messages.length < 1) {
+    return
+  }
+  const currentMessage = messages.at(-1)!
+  if (typeof currentMessage.content === 'string' || !currentMessage.content) {
+    currentMessage.content = []
+  }
+  if (res.choices[0].message.content) {
+    currentMessage.content.push({
+      type: 'markdown',
+      content: res.choices[0].message.content
+    })
+  }
+  const tool_calls: ReponseToolCall[] | undefined = res.choices[0].message.tool_calls
+  if (tool_calls && tool_calls.length) {
+    const historyMessages = contextMessages?.length ? contextMessages : toRaw(messages.slice(0, -1))
+    const toolMessages: LLMMessage[] = [...historyMessages, res.choices[0].message] as LLMMessage[]
+    for (const tool of tool_calls) {
+      const { name, arguments: args } = tool.function
+      const parsedArgs = parseArgs(args)
+      const currentToolMessage = {
+        type: 'tool',
+        name,
+        status: 'running',
+        content: {
+          params: parsedArgs
+        },
+        formatPretty: true
+      }
+      currentMessage.content.push(currentToolMessage)
+      const toolCallResult = await useMcpServer().callTool(name, parsedArgs)
+      toolMessages.push({
+        type: 'text',
+        content: toolCallResult.content,
+        role: 'tool',
+        tool_call_id: tool.id
+      })
+
+      currentMessage.content.at(-1)!.status = 'success'
+      currentMessage.content.at(-1)!.content = {
+        params: parsedArgs,
+        result: toolCallResult.content
+      }
+    }
+    const newResp = await fetchLLM(toolMessages, tools).then((res) => res.json())
+    const hasToolCall = newResp.choices[0].message.tool_calls?.length > 0
+    if (hasToolCall) {
+      await handleToolCall(newResp, tools, messages, toolMessages)
+    } else {
+      if (newResp.choices[0].message.content) {
+        currentMessage.content.push({
+          type: 'markdown',
+          content: newResp.choices[0].message.content
+        })
+      }
+    }
+  }
+}
+
+export const sendMcpRequest = async (messages: LLMMessage[], options: RequestOptions = {}) => {
+  if (messages.length < 1) {
+    return
+  }
+  const tools = await useMcpServer().getLLMTools()
+  requestOptions = options
+  const res = await fetchLLM(messages.slice(0, -1), tools, options).then((res) => res.json())
+  const hasToolCall = res.choices[0].message.tool_calls?.length > 0
+  if (hasToolCall) {
+    await handleToolCall(res, tools, messages)
+  } else {
+    messages.at(-1)!.content = res.choices[0].message.content
+  }
+}
diff --git a/packages/register/src/constants.ts b/packages/register/src/constants.ts
index d200a3d724..d85e3ff2be 100644
--- a/packages/register/src/constants.ts
+++ b/packages/register/src/constants.ts
@@ -18,7 +18,8 @@ export const META_SERVICE = {
   Property: 'engine.service.property',
   Properties: 'engine.service.properties',
   ThemeSwitch: 'engine.service.themeSwitch',
-  Style: 'engine.service.style'
+  Style: 'engine.service.style',
+  McpService: 'engine.service.mcpService'
 }
 
 export const META_APP = {
diff --git a/packages/register/src/service.ts b/packages/register/src/service.ts
index 0161c56652..54b5bcb8b3 100644
--- a/packages/register/src/service.ts
+++ b/packages/register/src/service.ts
@@ -8,7 +8,7 @@ interface Context {
   options: K
 }
 
-interface ServiceOptions<T> {
+export interface ServiceOptions<T> {
   id: string
   type: 'MetaService'
   initialState: T
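
Reviewer note: `sendMcpRequest`/`handleToolCall` in `utils.ts` implement the standard OpenAI-style function-calling round trip: the model proposes a `tool_call`, the client executes the MCP tool, a `role: 'tool'` message carries the result back, and the model is queried again until no further tool calls are requested. A minimal self-contained sketch of that loop, with a fake in-process model and tool (all names here are illustrative, not part of the plugin's API):

```typescript
// Hedged sketch of the tool-call loop; runLLM/callTool are fakes, not plugin APIs.
interface ToolCall {
  id: string
  function: { name: string; arguments: string }
}
interface ChatMessage {
  role: string
  content: string
  tool_call_id?: string
  tool_calls?: ToolCall[]
}

// Fake model: asks for a tool on the first turn, summarizes once a tool result exists.
function runLLM(messages: ChatMessage[]): ChatMessage {
  const toolResult = messages.find((m) => m.role === 'tool')
  if (!toolResult) {
    return {
      role: 'assistant',
      content: '',
      tool_calls: [{ id: 'call-1', function: { name: 'getWeather', arguments: '{"city":"Shenzhen"}' } }]
    }
  }
  return { role: 'assistant', content: `The weather is ${toolResult.content}.` }
}

// Stand-in for useMcpServer().callTool().
function callTool(name: string, args: Record<string, unknown>): string {
  return name === 'getWeather' ? `sunny in ${String(args.city)}` : 'unknown tool'
}

function chat(userText: string): string {
  const messages: ChatMessage[] = [{ role: 'user', content: userText }]
  let reply = runLLM(messages)
  // Keep looping while the model requests tools, mirroring handleToolCall's recursion.
  while (reply.tool_calls?.length) {
    messages.push(reply)
    for (const call of reply.tool_calls) {
      const args = JSON.parse(call.function.arguments)
      messages.push({ role: 'tool', tool_call_id: call.id, content: callTool(call.function.name, args) })
    }
    reply = runLLM(messages)
  }
  return reply.content
}

console.log(chat('What is the weather in Shenzhen?'))
// -> The weather is sunny in Shenzhen.
```

In the real plugin the two `runLLM` turns are `fetchLLM` network calls and `callTool` goes through the MCP client; the loop shape is the same.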