Compare commits

...

22 Commits
v1.1 ... main

Author SHA1 Message Date
Akarin~
dc6786deab
Initial argot support (#27) 2025-06-10 13:30:54 +08:00
6bfa2c39a1
Merge pull request #28 from LiteyukiStudio/snowykami-patch-3
📝 new page
2025-06-10 13:02:32 +08:00
2ce29e45e7
📝 new page 2025-06-10 12:58:52 +08:00
55f9c427b7 🗑️ Mark MarshoTools as deprecated 2025-04-05 01:31:35 +08:00
Akarin~
5768b95b09
[WIP] Emoji reaction support (#26)
* Initial support & utils refactor

* Streaming request support for poke

* Remove unused imports

* Fix type issues
2025-04-04 23:01:01 +08:00
c9d2ef7885 Make the praise-list and nickname getter functions asynchronous 2025-03-29 12:53:20 +08:00
Akarin~
ff6369c1a5
Update README_EN.md 2025-03-25 23:06:52 +08:00
Akarin~
c00cb19e9e
Update README.md 2025-03-25 23:05:58 +08:00
e4490334fa Change the SSL issue fix approach 2025-03-23 22:58:10 +08:00
fce3152e17 Update documentation URL 2025-03-17 23:13:47 +08:00
9878114376 Fix praise list error 2025-03-17 05:25:15 +08:00
21b695f2d4
Merge pull request #22 from LiteyukiStudio/snowykami-patch-2
📝 Add PR preview
2025-03-11 00:05:16 +08:00
02d465112f
📝 Add PR preview 2025-03-11 00:03:54 +08:00
d95928cab7 Merge branch 'main' of https://github.com/LiteyukiStudio/nonebot-plugin-marshoai 2025-03-10 23:57:16 +08:00
41cb287a84 Fix reasoning chain not being included in the response struct for streaming requests 2025-03-10 23:56:13 +08:00
a0f2b52e59 📝 Update GitHub Actions workflow to support push and pull requests 2025-03-10 23:38:42 +08:00
75d173bed7 Update reference links 2025-03-10 23:24:19 +08:00
f39f5cc1be
Merge pull request #20 from LiteyukiStudio/snowykami-patch-1
📝 Update pages deployment address
2025-03-10 23:13:32 +08:00
70fd176904
📝 Update pages deployment address 2025-03-10 23:08:57 +08:00
57ea4fc10b 📝 Introduce a mysterious little JS 2025-03-08 23:31:59 +08:00
a1ddf40610 Merge branch 'main' of https://github.com/LiteyukiStudio/nonebot-plugin-marshoai 2025-03-07 21:34:22 +08:00
dc294a257d 📝 Disable clean URLs setting 2025-03-07 21:34:19 +08:00
17 changed files with 234 additions and 122 deletions

View File

@@ -1,9 +1,6 @@
name: Deploy VitePress site to Liteyuki PaaS
on:
push:
branches: [main]
workflow_dispatch:
on: ["push", "pull_request_target"]
permissions:
contents: write
@@ -28,7 +25,7 @@ jobs:
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.11'
python-version: "3.11"
- name: Setup API markdown
run: |-
@@ -52,11 +49,10 @@ jobs:
run: |-
pnpm run docs:build
- name: "发布"
run: |
npx -p "@getmeli/cli" meli upload docs/.vitepress/dist \
--url "https://meli.liteyuki.icu" \
--url "https://dash.apage.dev" \
--site "$MELI_SITE" \
--token "$MELI_TOKEN" \
--release "$GITHUB_SHA"

CNAME
View File

@@ -1 +1 @@
marshoai-docs.meli.liteyuki.icu
marshoai-docs.pages.liteyuki.icu

View File

@@ -1,6 +1,6 @@
<!--suppress LongLine -->
<div align="center">
<a href="https://marshoai-docs.meli.liteyuki.icu"><img src="https://marshoai-docs.meli.liteyuki.icu/marsho-full.svg" width="800" height="430" alt="MarshoLogo"></a>
<a href="https://marshoai-docs.pages.liteyuki.icu"><img src="https://marshoai-docs.pages.liteyuki.icu/marsho-full.svg" width="800" height="430" alt="MarshoLogo"></a>
<br>
</div>
@@ -48,7 +48,7 @@ _谁不喜欢回复消息快又可爱的猫娘呢_
## 😼 使用
请查看[使用文档](https://marshoai-docs.meli.liteyuki.icu/start/use)
请查看[使用文档](https://marshoai-docs.pages.liteyuki.icu/start/use.html)
## ❤ 鸣谢&版权说明

View File

@@ -1,6 +1,6 @@
<!--suppress LongLine -->
<div align="center">
<a href="https://marshoai-docs.meli.liteyuki.icu"><img src="https://marshoai-docs.meli.liteyuki.icu/marsho-full.svg" width="800" height="430" alt="MarshoLogo"></a>
<a href="https://marshoai-docs.pages.liteyuki.icu"><img src="https://marshoai-docs.pages.liteyuki.icu/marsho-full.svg" width="800" height="430" alt="MarshoLogo"></a>
<br>
</div>
@@ -48,7 +48,7 @@ Plugin internally installed the catgirl character of Marsho, is able to have a c
- 🐾 Play! I like play with friends!
## 😼 Usage
Please read [Documentation](https://marshoai-docs.meli.liteyuki.icu/start/install)
Please read [Documentation](https://marshoai-docs.pages.liteyuki.icu/start/use.html)
## ❤ Thanks&Copyright
This project uses the following code from other projects:

View File

@@ -8,12 +8,13 @@ import { generateSidebar } from 'vitepress-sidebar'
// https://vitepress.dev/reference/site-config
export default defineConfig({
head: [
["script", { src: "https://cdn.liteyuki.icu/js/liteyuki_footer.js" }],
['link', { rel: 'icon', type: 'image/x-icon', href: '/favicon.ico' }],
],
rewrites: {
[`${defaultLang}/:rest*`]: ":rest*",
},
cleanUrls: true,
cleanUrls: false,
themeConfig: {
// https://vitepress.dev/reference/default-theme-config
logo: {

View File

@@ -65,7 +65,7 @@ When nonebot linked to OneBot v11 adapter, can recieve double click and response
MarshoTools is a feature added in `v0.5.0`, support loading external function library to provide Function Call for Marsho.
## 🧩 Marsho Plugin
Marsho Plugin is a feature added in `v1.0.0`, replacing the old MarshoTools feature. [Documentation](https://marshoai-docs.meli.liteyuki.icu/dev/extension)
Marsho Plugin is a feature added in `v1.0.0`, replacing the old MarshoTools feature. [Documentation](https://marshoai-docs.pages.liteyuki.icu/dev/extension)
## 👍 Praise list

View File

@@ -68,7 +68,7 @@ GitHub Models API 的限制较多,不建议使用,建议通过修改`MARSHOA
## 🧩 小棉插件
小棉插件是`v1.0.0`的新增功能,替代旧的小棉工具功能。[使用文档](https://marshoai-docs.meli.liteyuki.icu/dev/extension)
小棉插件是`v1.0.0`的新增功能,替代旧的小棉工具功能。[使用文档](https://marshoai-docs.pages.liteyuki.icu/dev/extension)
## 👍 夸赞名单

View File

@@ -26,17 +26,19 @@ from nonebot.plugin import require
require("nonebot_plugin_alconna")
require("nonebot_plugin_localstore")
require("nonebot_plugin_argot")
import nonebot_plugin_localstore as store # type: ignore
from nonebot import get_driver, logger # type: ignore
from .config import config
# from .hunyuan import *
from .dev import *
from .marsho import *
from .metadata import metadata
# from .hunyuan import *
__author__ = "Asankilp"
__plugin_meta__ = metadata

View File

@@ -37,7 +37,7 @@ OPENAI_NEW_MODELS: list = [
INTRODUCTION: str = f"""MarshoAI-NoneBot by LiteyukiStudio
你好喵~我是一只可爱的猫娘AI名叫小棉~🐾
我的主页在这里哦~
https://marshoai-docs.meli.liteyuki.icu
https://marshoai-docs.pages.liteyuki.icu
使用 {config.marshoai_default_name}.status命令获取状态信息
使用{config.marshoai_default_name}.help命令获取使用说明"""

View File

@@ -1,15 +1,16 @@
import os
from pathlib import Path
from nonebot import get_driver, logger, require
from nonebot import get_driver, logger, on_command, require
from nonebot.adapters import Bot, Event
from nonebot.matcher import Matcher
from nonebot.typing import T_State
from nonebot_plugin_argot import add_argot, get_message_id
from nonebot_plugin_marshoai.plugin.load import reload_plugin
from .config import config
from .marsho import context
from .instances import context
from .plugin.func_call.models import SessionContext
require("nonebot_plugin_alconna")
@@ -48,6 +49,21 @@ function_call = on_alconna(
permission=SUPERUSER,
)
argot_test = on_command("argot", permission=SUPERUSER)
@argot_test.handle()
async def _():
await argot_test.send(
"aa",
argot={
"name": "test",
"command": "test",
"segment": f"{os.getcwd()}",
"expired_at": 1000,
},
)
@function_call.assign("list")
async def list_functions():

View File

@@ -1,4 +1,5 @@
import json
from datetime import timedelta
from typing import Optional, Tuple, Union
from azure.ai.inference.models import (
@@ -17,10 +18,15 @@ from nonebot.matcher import (
current_event,
current_matcher,
)
from nonebot_plugin_alconna.uniseg import UniMessage, UniMsg
from nonebot_plugin_alconna.uniseg import (
Text,
UniMessage,
UniMsg,
get_target,
)
from nonebot_plugin_argot import Argot # type: ignore
from openai import AsyncOpenAI, AsyncStream
from openai.types.chat import ChatCompletion, ChatCompletionChunk, ChatCompletionMessage
from openai.types.chat.chat_completion import Choice
from .config import config
from .constants import SUPPORT_IMAGE_MODELS
@@ -36,6 +42,7 @@ from .util import (
make_chat_openai,
parse_richtext,
)
from .utils.processor import process_chat_stream, process_completion_to_details
class MarshoHandler:
@@ -51,7 +58,7 @@ class MarshoHandler:
# self.state: T_State = current_handler.get().state
self.matcher: Matcher = current_matcher.get()
self.message_id: str = UniMessage.get_message_id(self.event)
self.target = UniMessage.get_target(self.event)
self.target = get_target(self.event)
async def process_user_input(
self, user_input: UniMsg, model_name: str
@@ -103,7 +110,7 @@
处理单条聊天
"""
context_msg = get_prompt(model_name) + (
context_msg = await get_prompt(model_name) + (
self.context.build(self.target.id, self.target.private)
)
response = await make_chat_openai(
@@ -117,10 +124,10 @@
async def handle_function_call(
self,
completion: Union[ChatCompletion, AsyncStream[ChatCompletionChunk]],
completion: Union[ChatCompletion],
user_message: Union[str, list],
model_name: str,
tools_list: list,
tools_list: list | None = None,
):
# function call
# 需要获取额外信息,调用函数工具
@@ -188,7 +195,7 @@ class MarshoHandler:
self,
user_message: Union[str, list],
model_name: str,
tools_list: list,
tools_list: list | None = None,
stream: bool = False,
tool_message: Optional[list] = None,
) -> Optional[Tuple[UserMessage, ChatCompletionMessage]]:
@@ -210,10 +217,7 @@
tools_list=tools_list,
tool_message=tool_message,
)
if isinstance(response, ChatCompletion):
choice = response.choices[0]
else:
raise ValueError("Unexpected response type")
choice = response.choices[0] # type: ignore
# Sprint(choice)
# 当tool_calls非空时将finish_reason设置为TOOL_CALLS
if choice.message.tool_calls is not None and config.marshoai_fix_toolcalls:
@@ -233,12 +237,28 @@
target_list.append([self.target.id, self.target.private])
# 对话成功发送消息
send_message = UniMessage()
if config.marshoai_enable_richtext_parse:
await (await parse_richtext(str(choice_msg_content))).send(
reply_to=True
)
send_message = await parse_richtext(str(choice_msg_content))
else:
await UniMessage(str(choice_msg_content)).send(reply_to=True)
send_message = UniMessage(str(choice_msg_content))
send_message.append(
Argot(
"detail",
Text(await process_completion_to_details(response)),
command="detail",
expired_at=timedelta(minutes=5),
)
)
# send_message.append(
# Argot(
# "debug",
# Text(str(response)),
# command=f"debug",
# expired_at=timedelta(minutes=5),
# )
# )
await send_message.send(reply_to=True)
return UserMessage(content=user_message), choice_msg_after
elif choice.finish_reason == CompletionsFinishReason.CONTENT_FILTERED:
@@ -260,9 +280,9 @@
self,
user_message: Union[str, list],
model_name: str,
tools_list: list,
tools_list: list | None = None,
tools_message: Optional[list] = None,
) -> Union[ChatCompletion, None]:
) -> ChatCompletion:
"""
处理流式请求
"""
@@ -275,54 +295,6 @@
)
if isinstance(response, AsyncStream):
reasoning_contents = ""
answer_contents = ""
last_chunk = None
is_first_token_appeared = False
is_answering = False
async for chunk in response:
last_chunk = chunk
# print(chunk)
if not is_first_token_appeared:
logger.debug(f"{chunk.id}: 第一个 token 已出现")
is_first_token_appeared = True
if not chunk.choices:
logger.info("Usage:", chunk.usage)
else:
delta = chunk.choices[0].delta
if (
hasattr(delta, "reasoning_content")
and delta.reasoning_content is not None
):
reasoning_contents += delta.reasoning_content
else:
if not is_answering:
logger.debug(
f"{chunk.id}: 思维链已输出完毕或无 reasoning_content 字段输出"
)
is_answering = True
if delta.content is not None:
answer_contents += delta.content
# print(last_chunk)
# 创建新的 ChatCompletion 对象
if last_chunk and last_chunk.choices:
message = ChatCompletionMessage(
content=answer_contents,
role="assistant",
tool_calls=last_chunk.choices[0].delta.tool_calls, # type: ignore
)
choice = Choice(
finish_reason=last_chunk.choices[0].finish_reason, # type: ignore
index=last_chunk.choices[0].index,
message=message,
)
return ChatCompletion(
id=last_chunk.id,
choices=[choice],
created=last_chunk.created,
model=last_chunk.model,
system_fingerprint=last_chunk.system_fingerprint,
object="chat.completion",
usage=last_chunk.usage,
)
return None
return await process_chat_stream(response)
else:
raise TypeError("Unexpected response type for stream request")

View File

@@ -15,7 +15,15 @@ from nonebot.params import CommandArg
from nonebot.permission import SUPERUSER
from nonebot.rule import to_me
from nonebot.typing import T_State
from nonebot_plugin_alconna import MsgTarget, UniMessage, UniMsg, on_alconna
from nonebot_plugin_alconna import (
Emoji,
MsgTarget,
UniMessage,
UniMsg,
message_reaction,
on_alconna,
)
from nonebot_plugin_argot.extension import ArgotExtension # type: ignore
from .config import config
from .constants import INTRODUCTION, SUPPORT_IMAGE_MODELS
@@ -25,6 +33,7 @@ from .instances import client, context, model_name, target_list, tools
from .metadata import metadata
from .plugin.func_call.caller import get_function_calls
from .util import *
from .utils.processor import process_chat_stream
async def at_enable():
@ -55,6 +64,7 @@ marsho_cmd = on_alconna(
aliases=tuple(config.marshoai_aliases),
priority=96,
block=True,
extensions=[ArgotExtension()],
)
resetmem_cmd = on_alconna(
Alconna(
@@ -226,6 +236,7 @@ async def marsho(
if not text:
# 发送说明
# await UniMessage(metadata.usage + "\n当前使用的模型" + model_name).send()
await message_reaction(Emoji("38"))
await marsho_cmd.finish(INTRODUCTION)
backup_context = await get_backup_context(target.id, target.private)
if backup_context:
@@ -256,6 +267,7 @@ async def marsho(
map(lambda v: v.data(), get_function_calls().values())
)
logger.info(f"正在获取回答,模型:{model_name}")
await message_reaction(Emoji("66"))
# logger.info(f"上下文:{context_msg}")
response = await handler.handle_common_chat(
usermsg, model_name, tools_lists, config.marshoai_stream
@@ -282,19 +294,23 @@ with contextlib.suppress(ImportError): # 优化先不做()
async def poke(event: Event):
user_nickname = await get_nickname_by_user_id(event.get_user_id())
usermsg = await get_prompt(model_name) + [
UserMessage(content=f"*{user_nickname}{config.marshoai_poke_suffix}"),
]
try:
if config.marshoai_poke_suffix != "":
logger.info(f"收到戳一戳,用户昵称:{user_nickname}")
response = await make_chat_openai(
pre_response = await make_chat_openai(
client=client,
model_name=model_name,
msg=get_prompt(model_name)
+ [
UserMessage(
content=f"*{user_nickname}{config.marshoai_poke_suffix}"
),
],
msg=usermsg,
stream=config.marshoai_stream,
)
if isinstance(pre_response, AsyncStream):
response = await process_chat_stream(pre_response)
else:
response = pre_response
choice = response.choices[0] # type: ignore
if choice.finish_reason == CompletionsFinishReason.STOPPED:
content = extract_content_and_think(choice.message)[0]

View File

@@ -7,6 +7,7 @@ import sys
import traceback
from nonebot import logger
from typing_extensions import deprecated
from .config import config
@@ -73,6 +74,7 @@ class MarshoContext:
return self._get_target_dict(is_private).setdefault(target_id, [])
@deprecated("小棉工具已弃用,无法正常调用")
class MarshoTools:
"""
Marsho 的工具类

View File

@@ -2,6 +2,7 @@ import base64
import json
import mimetypes
import re
import ssl
import uuid
from typing import Any, Dict, List, Optional, Union
@@ -17,7 +18,7 @@ from nonebot_plugin_alconna import Text as TextMsg
from nonebot_plugin_alconna import UniMessage
from openai import AsyncOpenAI, AsyncStream, NotGiven
from openai.types.chat import ChatCompletion, ChatCompletionChunk, ChatCompletionMessage
from zhDateTime import DateTime
from zhDateTime import DateTime # type: ignore
from ._types import DeveloperMessage
from .cache.decos import *
@@ -58,6 +59,8 @@ _praises_init_data = {
"""
初始夸赞名单之数据
"""
_ssl_context = ssl.create_default_context()
_ssl_context.set_ciphers("DEFAULT")
async def get_image_raw_and_type(
@@ -74,7 +77,7 @@ async def get_image_raw_and_type(
tuple[bytes, str]: 图片二进制数据, 图片MIME格式
"""
async with httpx.AsyncClient() as client:
async with httpx.AsyncClient(verify=_ssl_context) as client:
response = await client.get(url, headers=_browser_headers, timeout=timeout)
if response.status_code == 200:
# 获取图片数据
@@ -98,9 +101,7 @@ async def get_image_b64(url: str, timeout: int = 10) -> Optional[str]:
return: 图片base64编码
"""
if data_type := await get_image_raw_and_type(
url.replace("https://", "http://"), timeout
):
if data_type := await get_image_raw_and_type(url, timeout):
# image_format = content_type.split("/")[1] if content_type else "jpeg"
base64_image = base64.b64encode(data_type[0]).decode("utf-8")
data_url = "data:{};base64,{}".format(data_type[1], base64_image)
@@ -136,15 +137,15 @@ async def make_chat_openai(
@from_cache("praises")
def get_praises():
async def get_praises():
praises_file = store.get_plugin_data_file(
"praises.json"
) # 夸赞名单文件使用localstore存储
if not praises_file.exists():
with open(praises_file, "w", encoding="utf-8") as f:
json.dump(_praises_init_data, f, ensure_ascii=False, indent=4)
with open(praises_file, "r", encoding="utf-8") as f:
data = json.load(f)
async with aiofiles.open(praises_file, "w", encoding="utf-8") as f:
await f.write(json.dumps(_praises_init_data, ensure_ascii=False, indent=4))
async with aiofiles.open(praises_file, "r", encoding="utf-8") as f:
data = json.loads(await f.read())
praises_json = data
return praises_json
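`get_praises` above becomes a coroutine that reads and writes the JSON through `aiofiles` so the event loop is never blocked. A rough stdlib-only sketch of the same first-run-initialize-then-read pattern, using `asyncio.to_thread` in place of `aiofiles` (the function and file names here are illustrative, not from the project):

```python
import asyncio
import json
from pathlib import Path

async def read_json_async(path: Path, default: dict) -> dict:
    # On first access, seed the file with the default data; the blocking
    # file I/O is pushed onto a worker thread so the loop stays responsive.
    if not path.exists():
        await asyncio.to_thread(
            path.write_text,
            json.dumps(default, ensure_ascii=False, indent=4),
            "utf-8",
        )
    raw = await asyncio.to_thread(path.read_text, "utf-8")
    return json.loads(raw)
```

The project uses `aiofiles` directly, which achieves the same effect with a file-like async API.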
@@ -160,8 +161,8 @@ async def refresh_praises_json():
return data
def build_praises() -> str:
praises = get_praises()
async def build_praises() -> str:
praises = await get_praises()
result = ["你喜欢以下几个人物,他们有各自的优点:"]
for item in praises["like"]:
result.append(f"名字:{item['name']},优点:{item['advantages']}")
@@ -213,8 +214,8 @@ async def set_nickname(user_id: str, name: str):
data[user_id] = name
if name == "" and user_id in data:
del data[user_id]
with open(filename, "w", encoding="utf-8") as f:
json.dump(data, f, ensure_ascii=False, indent=4)
async with aiofiles.open(filename, "w", encoding="utf-8") as f:
await f.write(json.dumps(data, ensure_ascii=False, indent=4))
return data
@@ -237,11 +238,11 @@
logger.error("刷新 nickname_json 表错误:无法载入 nickname.json 文件")
def get_prompt(model: str) -> List[Dict[str, Any]]:
async def get_prompt(model: str) -> List[Dict[str, Any]]:
"""获取系统提示词"""
prompts = config.marshoai_additional_prompt
if config.marshoai_enable_praises:
praises_prompt = build_praises()
praises_prompt = await build_praises()
prompts += praises_prompt
if config.marshoai_enable_time_prompt:

View File

@@ -0,0 +1,87 @@
from nonebot.log import logger
from openai import AsyncStream
from openai.types.chat import ChatCompletion, ChatCompletionChunk, ChatCompletionMessage
from openai.types.chat.chat_completion import Choice
async def process_chat_stream(
stream: AsyncStream[ChatCompletionChunk],
) -> ChatCompletion:
reasoning_contents = ""
answer_contents = ""
last_chunk = None
is_first_token_appeared = False
is_answering = False
async for chunk in stream:
last_chunk = chunk
# print(chunk)
if not is_first_token_appeared:
logger.info(f"{chunk.id}: 第一个 token 已出现")
is_first_token_appeared = True
if not chunk.choices:
logger.info("Usage:", chunk.usage)
else:
delta = chunk.choices[0].delta
if (
hasattr(delta, "reasoning_content")
and delta.reasoning_content is not None
):
reasoning_contents += delta.reasoning_content
else:
if not is_answering:
logger.info(
f"{chunk.id}: 思维链已输出完毕或无 reasoning_content 字段输出"
)
is_answering = True
if delta.content is not None:
answer_contents += delta.content
# print(last_chunk)
# 创建新的 ChatCompletion 对象
if last_chunk and last_chunk.choices:
message = ChatCompletionMessage(
content=answer_contents,
role="assistant",
tool_calls=last_chunk.choices[0].delta.tool_calls, # type: ignore
)
if reasoning_contents != "":
setattr(message, "reasoning_content", reasoning_contents)
choice = Choice(
finish_reason=last_chunk.choices[0].finish_reason, # type: ignore
index=last_chunk.choices[0].index,
message=message,
)
return ChatCompletion(
id=last_chunk.id,
choices=[choice],
created=last_chunk.created,
model=last_chunk.model,
system_fingerprint=last_chunk.system_fingerprint,
object="chat.completion",
usage=last_chunk.usage,
)
else:
return ChatCompletion(
id="",
choices=[],
created=0,
model="",
system_fingerprint="",
object="chat.completion",
usage=None,
)
async def process_completion_to_details(completion: ChatCompletion) -> str:
usage_text = ""
usage = completion.usage
if usage is None:
usage_text = ""
else:
usage_text = str(usage)
details_text = f"""=========消息详情=========
模型: {completion.model}
消息 ID: {completion.id}
用量信息: {usage_text}"""
# print(details_text)
return details_text
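The new `utils/processor.py` above rebuilds a single `ChatCompletion` from a stream by accumulating `reasoning_content` and `content` deltas into separate buffers. The core accumulation logic can be sketched without the OpenAI types (the chunk class here is a simplified stand-in, not the real `ChatCompletionChunk`):

```python
import asyncio
from dataclasses import dataclass
from typing import Optional

@dataclass
class FakeDelta:
    # Simplified stand-in for a streamed chunk's choice delta.
    content: Optional[str] = None
    reasoning_content: Optional[str] = None

async def collect_stream(chunks):
    # Accumulate reasoning tokens and answer tokens separately, mirroring
    # how process_chat_stream rebuilds one final message from many deltas.
    reasoning, answer = "", ""
    async for delta in chunks:
        if delta.reasoning_content is not None:
            reasoning += delta.reasoning_content
        elif delta.content is not None:
            answer += delta.content
    return reasoning, answer

async def fake_stream():
    yield FakeDelta(reasoning_content="thinking ")
    yield FakeDelta(content="Hello")
    yield FakeDelta(content=" world")
```

In the real function, the last chunk's `finish_reason`, `tool_calls`, and `usage` are then copied into a synthetic `ChatCompletion` so downstream code can treat streamed and non-streamed responses uniformly.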

pdm.lock generated
View File

@@ -5,7 +5,7 @@
groups = ["default", "dev", "test"]
strategy = ["inherit_metadata"]
lock_version = "4.5.0"
content_hash = "sha256:d7ab3d9ca825de512d4f87ec846f7fddcf3d5796a7c9562e60c8c7d39c058817"
content_hash = "sha256:6aa043fb1d2d4d384e0d0c698c02a27f22e099828d2973a4baef05c5316f4ee0"
[[metadata.targets]]
requires_python = "~=3.10"
@@ -1485,7 +1485,7 @@ files = [
[[package]]
name = "nonebot-plugin-alconna"
version = "0.54.1"
version = "0.57.0"
requires_python = ">=3.9"
summary = "Alconna Adapter for Nonebot"
groups = ["default"]
@@ -1499,8 +1499,8 @@ dependencies = [
"tarina<0.7,>=0.6.8",
]
files = [
{file = "nonebot_plugin_alconna-0.54.1-py3-none-any.whl", hash = "sha256:4edb4b081cd64ce37717c7a92d31aadd2cf287a5a0adc2ac86ed82d9bcad5048"},
{file = "nonebot_plugin_alconna-0.54.1.tar.gz", hash = "sha256:66fae03120b8eff25bb0027d65f149e399aa6f73c7585ebdd388d1904cecdeee"},
{file = "nonebot_plugin_alconna-0.57.0-py3-none-any.whl", hash = "sha256:6c4bcce1a9aa176244b4c011b19b1cea00269c4c6794cd4e90d8dd7990ec3ec9"},
{file = "nonebot_plugin_alconna-0.57.0.tar.gz", hash = "sha256:7a9a4bf373f3f6836611dbde1a0917b84441a534dd6f2b20dae3ba6fff142858"},
]
[[package]]
@@ -1519,9 +1519,27 @@ files = [
{file = "nonebot_plugin_apscheduler-0.5.0.tar.gz", hash = "sha256:6c0230e99765f275dc83d6639ff33bd6f71203fa10cd1b8a204b0f95530cda86"},
]
[[package]]
name = "nonebot-plugin-argot"
version = "0.1.7"
requires_python = ">=3.10"
summary = "NoneBot 暗语"
groups = ["default"]
dependencies = [
"aiofiles>=24.1.0",
"nonebot-plugin-alconna>=0.51.1",
"nonebot-plugin-apscheduler>=0.5.0",
"nonebot-plugin-localstore>=0.7.4",
"nonebot2>=2.3.2",
]
files = [
{file = "nonebot_plugin_argot-0.1.7-py3-none-any.whl", hash = "sha256:1af939a60967e27aff6f7ce97150d26cba8f1ef0cf216b44372cc0d8e5937204"},
{file = "nonebot_plugin_argot-0.1.7.tar.gz", hash = "sha256:f76c2139c9af1e2de6efdc487b728fbad84737d272bf1f600d085bbe6ed79094"},
]
[[package]]
name = "nonebot-plugin-localstore"
version = "0.7.3"
version = "0.7.4"
requires_python = "<4.0,>=3.9"
summary = "Local Storage Support for NoneBot2"
groups = ["default"]
@@ -1532,8 +1550,8 @@ dependencies = [
"typing-extensions<5.0.0,>=4.0.0",
]
files = [
{file = "nonebot_plugin_localstore-0.7.3-py3-none-any.whl", hash = "sha256:1bc239b4b5320df0dc08eada7c4f8ba4cb92d4dc3134bf4646ab5e297bd7e575"},
{file = "nonebot_plugin_localstore-0.7.3.tar.gz", hash = "sha256:1aff10e2dacfc5bc9ce239fd34849f8d7172a118135dbc5aeba1c97605d9959d"},
{file = "nonebot_plugin_localstore-0.7.4-py3-none-any.whl", hash = "sha256:3b08030878eadcdd8b9ce3d079da0dc2d0e41dc91f0b2d8cf7fa862a27de9090"},
{file = "nonebot_plugin_localstore-0.7.4.tar.gz", hash = "sha256:85ddc13814bfcd484ab311306823651390020bf44f4fb4733b343a58e72723ce"},
]
[[package]]

View File

@@ -10,7 +10,7 @@ authors = [
]
dependencies = [
"nonebot2>=2.4.0",
"nonebot-plugin-alconna>=0.48.0",
"nonebot-plugin-alconna>=0.57.1",
"nonebot-plugin-localstore>=0.7.1",
"zhDatetime>=2.0.0",
"aiohttp>=3.9",
@@ -28,13 +28,14 @@ dependencies = [
"azure-ai-inference>=1.0.0b6",
"watchdog>=6.0.0",
"nonebot-plugin-apscheduler>=0.5.0",
"openai>=1.58.1"
"openai>=1.58.1",
"nonebot-plugin-argot>=0.1.7"
]
license = { text = "MIT, Mulan PSL v2" }
[project.urls]
Homepage = "https://marshoai-docs.meli.liteyuki.icu/"
Homepage = "https://marshoai-docs.pages.liteyuki.icu/"
[tool.nonebot]