Commit Graph

35 Commits

SHA1 Message Date
41262b008b feat(chat): update metrics name 2025-07-30 17:55:02 +02:00
0791506124 Fix some proposals 2025-07-15 17:10:45 +02:00
d76dcc8998 Make clippy happy 2025-07-15 11:49:48 +02:00
e654f66223 Support filtering 2025-07-15 11:49:47 +02:00
34f2ab7093 WIP report search errors to the LLM 2025-07-15 11:49:46 +02:00
1a9dbd364e Fix some issues 2025-07-15 11:49:46 +02:00
662c5d9871 Introduce filters in the chat completions 2025-07-15 11:49:45 +02:00
a76a3e8f11 Change the metric name for the search to use a label 2025-07-03 16:01:31 +02:00
6397ef12a0 Use three metrics for the three different tokens 2025-07-03 15:56:56 +02:00
b5e41f0e46 Fix the Mistral incompatibility with the usage of OpenAI 2025-07-03 15:21:40 +02:00
9f0d33ec99 Expose the number of tokens on the chat completions routes 2025-07-03 15:05:15 +02:00
2b75072b09 Expose the number of internal chat searches on the /metrics route 2025-07-03 14:04:27 +02:00
adc9976615 Simplify the analytics chat completions aggregator 2025-06-25 11:50:26 +02:00
5f50fc9464 Add new analytics to the chat completions route 2025-06-24 17:05:49 +02:00
9ae73e3c05 Better support for Mistral errors 2025-06-12 15:18:37 +02:00
7533a11143 Make sure to send the tool response before the error message 2025-06-11 10:49:21 +02:00
77cc3678b5 Make sure template errors are reported to the LLM and front-end without panicking 2025-06-11 09:27:14 +02:00
506ee40dc5 Improve errors and other stuff 2025-06-10 17:52:35 +02:00
e9d547556d Better error reporting when multiple choices are used 2025-06-10 16:41:02 +02:00
bbe802c656 Remove the write txn method from the index scheduler 2025-06-10 14:03:05 +02:00
ae115cee78 Make clippy happy 2025-06-10 13:51:04 +02:00
605dea4f85 Do not leak the chat "workspace" term 2025-06-10 10:34:30 +02:00
95d4775d4a Remove the preQuery chat setting 2025-06-10 10:32:58 +02:00
48e8356a16 Mark the non-streaming chat completions route unimplemented 2025-06-10 09:18:36 +02:00
717a026fdd Make sure to use the system prompt 2025-06-06 12:32:40 +02:00
70670c3be4 Introduce the support of Azure, Gemini, vLLM 2025-06-06 12:08:37 +02:00
28dc7b836b Fix the chat completions feature gate 2025-06-03 17:10:53 +02:00
c4e1407e77 Fix the chat, chats, and chatsSettings actions 2025-06-03 16:11:54 +02:00
8fdcdee0cc Do a first clippy pass 2025-06-03 15:39:26 +02:00
7d574433b6 Clean up chat completions modules a bit 2025-06-03 15:39:26 +02:00
201a808fe2 Better report errors happening with the underlying LLM 2025-06-03 15:39:26 +02:00
f827c2442c Mark tool calls to be implemented later for non-streaming 2025-06-03 15:36:35 +02:00
3b931e75d9 Make the chats settings and chat completions route experimental 2025-06-03 15:36:35 +02:00
02cbcea3db Better chat completions settings management 2025-06-03 15:31:28 +02:00
0f7f5fa104 Introduce listing/getting/deleting/updating chat workspace settings 2025-06-03 15:31:28 +02:00
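
Several commits above (9f0d33ec99, 2b75072b09, 6397ef12a0, a76a3e8f11) revolve around exposing chat token usage and internal search counts on the /metrics route. The sketch below is not the repository's code; it only illustrates, with the Rust `prometheus` and `once_cell` crates, the pattern the commit messages describe: three separate counters for prompt, completion, and total tokens, plus one counter vector that uses a label instead of a metric name per value. All metric, label, and function names here are hypothetical.

```rust
use once_cell::sync::Lazy;
use prometheus::{
    register_int_counter, register_int_counter_vec, IntCounter, IntCounterVec,
};

// Three separate counters, one per token kind (hypothetical names).
static CHAT_PROMPT_TOKENS: Lazy<IntCounter> = Lazy::new(|| {
    register_int_counter!(
        "chat_prompt_tokens_total",
        "Prompt tokens consumed by the chat completions routes"
    )
    .unwrap()
});

static CHAT_COMPLETION_TOKENS: Lazy<IntCounter> = Lazy::new(|| {
    register_int_counter!(
        "chat_completion_tokens_total",
        "Completion tokens produced by the chat completions routes"
    )
    .unwrap()
});

static CHAT_TOTAL_TOKENS: Lazy<IntCounter> = Lazy::new(|| {
    register_int_counter!(
        "chat_total_tokens_total",
        "Total tokens used by the chat completions routes"
    )
    .unwrap()
});

// One counter family distinguished by a label rather than by metric name.
static CHAT_INTERNAL_SEARCHES: Lazy<IntCounterVec> = Lazy::new(|| {
    register_int_counter_vec!(
        "chat_internal_searches_total",
        "Internal searches triggered while answering a chat completion",
        &["workspace"] // hypothetical label
    )
    .unwrap()
});

// Hypothetical helper: record usage after one chat completion finishes.
fn record_usage(workspace: &str, prompt: u64, completion: u64) {
    CHAT_PROMPT_TOKENS.inc_by(prompt);
    CHAT_COMPLETION_TOKENS.inc_by(completion);
    CHAT_TOTAL_TOKENS.inc_by(prompt + completion);
    CHAT_INTERNAL_SEARCHES.with_label_values(&[workspace]).inc();
}
```

The listing itself shows both shapes being exercised: per-kind counters for tokens (6397ef12a0) and a labeled counter for the search count (a76a3e8f11), with a later rename of the metric names (41262b008b).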