Compare commits

..

99 Commits

Author SHA1 Message Date
Elian Doran
a2b6bc0493 chore(llm): address requested changes 2026-03-30 22:20:44 +03:00
Elian Doran
841c58ca8c chore: fix type errors 2026-03-30 20:23:00 +03:00
Elian Doran
41164add15 chore(deps): fix OOM caused by Zod
See https://github.com/vercel/ai/issues/7351
2026-03-30 20:17:37 +03:00
Elian Doran
f4858d3684 refactor(llm): simplify the saving process 2026-03-30 19:40:38 +03:00
Elian Doran
be60479122 fix(llm): XSS risk when displaying the message 2026-03-30 19:36:22 +03:00
Elian Doran
948f160d14 fix(llm): XSS risk when displaying the message 2026-03-30 19:31:56 +03:00
Elian Doran
768c733f92 fix(llm): missing translation for name 2026-03-30 19:31:44 +03:00
Elian Doran
1a02be7c91 fix(llm): usage not reset when opening an empty chat 2026-03-30 19:23:42 +03:00
Elian Doran
ac75f6f7a6 feat(llm): hide the feature behind an experimental flag 2026-03-30 19:19:04 +03:00
Elian Doran
b2befb4feb feat(llm): automatic refresh of note title 2026-03-30 19:08:54 +03:00
Elian Doran
3e49399f82 fix(llm): automatic title not working for standalone chats 2026-03-30 19:03:17 +03:00
Elian Doran
eaaaf3effd fix(llm): automatic title not persisted 2026-03-30 18:59:49 +03:00
Elian Doran
f2cd1be3af fix(llm): history doesn't show last notes correctly 2026-03-30 18:55:41 +03:00
Elian Doran
b4fcf41420 feat(llm): basic auto-title 2026-03-30 18:52:22 +03:00
Elian Doran
5feccae2a0 feat(llm): enable cache control in Anthropic 2026-03-30 18:26:49 +03:00
Elian Doran
d28318005d feat(llm): basic support for attributes 2026-03-30 18:26:23 +03:00
Elian Doran
fcf39d7786 feat(llm): show footer only on hover 2026-03-30 18:14:23 +03:00
Elian Doran
5e9fc614d7 feat(llm): display message time 2026-03-30 18:08:20 +03:00
Elian Doran
a860803cc4 feat(llm): add usage underneath the message 2026-03-30 18:02:06 +03:00
Elian Doran
c40f5953fa feat(llm): make the prompt usage more compact 2026-03-30 17:56:07 +03:00
Elian Doran
241282296e fix(llm): report append to note not supporting all string content types 2026-03-30 17:50:28 +03:00
Elian Doran
8a8143167f feat(llm): report tool call errors 2026-03-30 17:45:58 +03:00
Elian Doran
12797293f0 feat(llm): improve model name display 2026-03-30 17:40:57 +03:00
Elian Doran
af0eb9551a feat(llm): save revision before changing content 2026-03-30 17:32:40 +03:00
Elian Doran
8a492450da feat(llm): render tools inline 2026-03-30 17:29:25 +03:00
Elian Doran
f3cb356b2b chore(llm): allow editing all string note types 2026-03-30 17:20:18 +03:00
Elian Doran
8ea1b7afba chore(llm): always mention note type 2026-03-30 17:16:49 +03:00
Elian Doran
911c1bdd0c feat(llm): use Markdown instead of HTML 2026-03-30 17:13:20 +03:00
Elian Doran
41f3274c7e feat(llm): use tool-based approach for reading current note 2026-03-30 17:08:47 +03:00
Elian Doran
0fc62dda78 chore(llm): styling of history menu 2026-03-30 16:38:11 +03:00
Elian Doran
e482c911c4 chore(desktop): add script to start prod with no dir 2026-03-30 12:45:30 +03:00
Elian Doran
abbe6437a9 chore(llm): use NoItems for type widget as well 2026-03-29 23:58:30 +03:00
Elian Doran
f2d67d4128 fix(desktop): stream not working on Electron 2026-03-29 23:50:23 +03:00
Elian Doran
7c9e02996e fix(desktop): unable to list providers 2026-03-29 23:47:37 +03:00
Elian Doran
c43e10c4af feat(llm): add tool to create note 2026-03-29 23:01:05 +03:00
Elian Doran
25037324ab feat(llm): improve handling when there is no provider set 2026-03-29 22:55:28 +03:00
Elian Doran
b8f9916d13 feat(llm): add tools to append or replace note content 2026-03-29 22:53:06 +03:00
Elian Doran
ed8b9cc943 feat(llm): integrate API keys with provider settings 2026-03-29 22:46:07 +03:00
Elian Doran
efbe7e0a21 feat(llm): add provider config in options 2026-03-29 22:42:05 +03:00
Elian Doran
46dd500d37 chore(llm): improve button for note access 2026-03-29 22:21:42 +03:00
Elian Doran
261c95fb06 feat(llm): add button to toggle access to the note 2026-03-29 22:20:26 +03:00
Elian Doran
41a122f722 feat(llm): allow the sidebar chat access to the note content 2026-03-29 22:09:29 +03:00
Elian Doran
490406e12a feat(llm): create empty settings page 2026-03-29 22:03:52 +03:00
Elian Doran
d12677094d chore(llm): improve chat bar size in sidebar 2026-03-29 21:54:50 +03:00
Elian Doran
3c69792744 feat(llm): improve layout with send button & context window 2026-03-29 21:52:35 +03:00
Elian Doran
395e79adbf fix(llm): sidebar chat box required scrolling to reach 2026-03-29 21:46:04 +03:00
Elian Doran
d5e56d8e29 feat(llm): integrate chat options into model selector 2026-03-29 21:43:27 +03:00
Elian Doran
e4c4873aa7 feat(llm): group legacy models into submenu 2026-03-29 21:35:33 +03:00
Elian Doran
293da1d4ef feat(llm): display cost next to the title 2026-03-29 21:29:59 +03:00
Elian Doran
d1c206a05a feat(llm): add same selectors in sidebar 2026-03-29 21:22:54 +03:00
Elian Doran
37b370511f chore(llm): get rid of different chat bar for sidebar 2026-03-29 21:14:09 +03:00
Elian Doran
734ef5533a refactor(llm): extract chat input bar into separate component 2026-03-29 21:11:51 +03:00
Elian Doran
0eb9b9fdac fix(llm): wrong icon size 2026-03-29 21:05:58 +03:00
Elian Doran
7817890cfe feat(llm): history button 2026-03-29 21:00:43 +03:00
Elian Doran
23dbedd139 refactor(llm): deduplicate LLM chat widgets 2026-03-29 20:28:19 +03:00
Elian Doran
2c8e2251fa feat(llm): use a better placeholder 2026-03-29 20:13:11 +03:00
Elian Doran
4c27ed9997 fix(sidebar): pressing a sidebar button would collapse the section 2026-03-29 20:11:16 +03:00
Elian Doran
d2fd1362c0 feat(llm): redesign sidebar to work on a single conversation 2026-03-29 20:09:00 +03:00
Elian Doran
45e57f0d5e chore(llm): always show AI chat sidebar 2026-03-29 20:00:22 +03:00
Elian Doran
660facea96 fix(llm): hide sidebar item if already in a chat 2026-03-29 19:52:44 +03:00
Elian Doran
9fa2e940d6 fix(llm): chat note created for every note navigated to 2026-03-29 19:49:13 +03:00
Elian Doran
0ffcfb8f43 feat(llm): identify sidebar chat notes by note ID 2026-03-29 19:45:45 +03:00
Elian Doran
ad1b3df74e fix(llm): sidebar not collapsing properly 2026-03-29 19:36:58 +03:00
Elian Doran
0ccf10bbbb feat(llm): basic sidebar implementation 2026-03-29 19:35:33 +03:00
Elian Doran
59c007e801 feat(llm): API to create LLM notes similar to search 2026-03-29 18:55:43 +03:00
Elian Doran
0654bc1049 fix(llm): wrong context window 2026-03-29 15:20:08 +03:00
Elian Doran
9fabefc847 feat(llm): minimize context window indicator 2026-03-29 15:17:27 +03:00
Elian Doran
e70ded0be1 fix(llm): content window progress bar not shown at startup 2026-03-29 15:12:18 +03:00
Elian Doran
16806275e0 feat(llm): basic context window progress bar 2026-03-29 15:10:49 +03:00
Elian Doran
e8214c3aae chore(llm): update list of models 2026-03-29 15:03:53 +03:00
Elian Doran
3a8e148301 chore(llm): correct pricing 2026-03-29 14:54:51 +03:00
Elian Doran
a0b546614f chore(llm): make multiplier relative to default 2026-03-29 14:47:41 +03:00
Elian Doran
5fcea86b94 feat(llm): basic cost multiplier 2026-03-29 14:44:40 +03:00
Elian Doran
d8c00ed6c0 chore(llm): use FormDropdownList 2026-03-29 14:39:53 +03:00
Elian Doran
863e68ec88 feat(llm): add model switcher 2026-03-29 14:34:31 +03:00
Elian Doran
046ee343dc feat(llm): display the model that was used 2026-03-29 14:06:23 +03:00
Elian Doran
2db9e376d5 refactor(llm): delegate pricings to provider 2026-03-29 14:02:33 +03:00
Elian Doran
9458128ad6 feat(llm): display estimated cost 2026-03-29 13:57:25 +03:00
Elian Doran
89638e3f56 feat(llm): display usage info (prompt + completion) 2026-03-29 13:53:13 +03:00
Elian Doran
8d492d7d4b feat(llm): show tool calls as references 2026-03-29 13:37:35 +03:00
Elian Doran
246c561b64 feat(llm): basic tool use 2026-03-29 13:30:04 +03:00
Elian Doran
88295f2462 refactor(llm): use vercel/AI instead 2026-03-29 13:07:21 +03:00
Elian Doran
d2d4e1cbac refactor(llm): use vercel/AI instead 2026-03-29 13:03:05 +03:00
Elian Doran
261e5b59e0 refactor(llm): use shared types in commons 2026-03-29 12:44:53 +03:00
Elian Doran
fa7ec01329 fix(llm): use of crypto.randomUUID 2026-03-29 12:27:18 +03:00
Elian Doran
4c4a29f9cf chore(llm): fix type issues 2026-03-29 12:24:13 +03:00
Elian Doran
9ddcaf4552 refactor(server): add triliumResponseHandled to typings 2026-03-29 12:01:06 +03:00
Elian Doran
c806a99fbc feat(llm): display thinking process 2026-03-29 11:51:39 +03:00
Elian Doran
ad91d360ce fix(llm): thinking budget mismatch 2026-03-29 11:41:28 +03:00
Elian Doran
cf8d7cd71f feat(llm): persist errors 2026-03-29 11:37:12 +03:00
Elian Doran
f370799b1d chore(llm): start working on extended thinking 2026-03-29 11:26:10 +03:00
Elian Doran
f8655b5de4 fix(llm): errors not selectable 2026-03-29 11:25:54 +03:00
Elian Doran
b551f0fe2d feat(llm): basic Markdown rendering 2026-03-28 21:19:59 +02:00
Elian Doran
f6e8bdb0fd fix(llm): text not selectable 2026-03-28 21:07:54 +02:00
Elian Doran
9029ea8085 fix(llm): last response not saved 2026-03-28 21:06:20 +02:00
Elian Doran
d61ade9fe9 feat(llm): add basic web search support 2026-03-28 21:00:53 +02:00
Elian Doran
aa1fe549c7 feat(llm): make source viewable 2026-03-28 20:52:40 +02:00
Elian Doran
e3701bbcb4 fix(llm): streaming not working due to compression 2026-03-28 20:45:35 +02:00
Elian Doran
fb7fc4bf0c feat(llm): basic chat interface 2026-03-28 20:39:09 +02:00
72 changed files with 4361 additions and 149 deletions

View File

@@ -125,6 +125,15 @@ Trilium provides powerful user scripting capabilities:
- OpenID and TOTP authentication support
- Sanitization of user-generated content
### Client-Side API Restrictions
- **Do not use `crypto.randomUUID()`** or other Web Crypto APIs that require secure contexts - Trilium can run over HTTP, not just HTTPS
- Use `randomString()` from `apps/client/src/services/utils.ts` for generating IDs instead
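As an illustration of why this works over plain HTTP, here is a hypothetical sketch of a `randomString()`-style helper (the actual implementation in `apps/client/src/services/utils.ts` may differ):

```typescript
// Hypothetical sketch, not the real utils.ts implementation. Math.random()
// is available everywhere, whereas crypto.randomUUID() throws when the page
// is served outside a secure context (e.g. over plain HTTP).
function randomString(length: number): string {
    const chars = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
    let result = "";
    for (let i = 0; i < length; i++) {
        result += chars.charAt(Math.floor(Math.random() * chars.length));
    }
    return result;
}
```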
### Shared Types Policy
- Types shared between client and server belong in `@triliumnext/commons` (`packages/commons/src/lib/`)
- Import shared types directly from `@triliumnext/commons` - do not re-export them from app-specific modules
- Keep app-specific types (e.g., `LlmProvider` for server, `StreamCallbacks` for client) in their respective apps
## Common Development Tasks
### Adding New Note Types

View File

@@ -43,13 +43,14 @@
"@univerjs/preset-sheets-note": "0.18.0",
"@univerjs/preset-sheets-sort": "0.18.0",
"@univerjs/presets": "0.18.0",
- "@zumer/snapdom": "2.7.0",
+ "@zumer/snapdom": "2.6.0",
"autocomplete.js": "0.38.1",
"bootstrap": "5.3.8",
"boxicons": "2.1.4",
"clsx": "2.1.1",
"color": "5.0.3",
"debounce": "3.0.0",
"dompurify": "3.3.3",
"draggabilly": "3.0.0",
"force-graph": "1.51.2",
"globals": "17.4.0",
@@ -64,11 +65,11 @@
"mark.js": "8.11.1",
"marked": "17.0.5",
"mermaid": "11.13.0",
- "mind-elixir": "5.10.0",
+ "mind-elixir": "5.9.3",
"normalize.css": "8.0.1",
"panzoom": "9.4.4",
"preact": "10.29.0",
- "react-i18next": "17.0.1",
+ "react-i18next": "17.0.0",
"react-window": "2.2.7",
"reveal.js": "6.0.0",
"rrule": "2.8.1",

View File

@@ -508,7 +508,7 @@ type EventMappings = {
contentSafeMarginChanged: {
top: number;
noteContext: NoteContext;
- }
+ };
};
export type EventListener<T extends EventNames> = {

View File

@@ -18,7 +18,7 @@ const RELATION = "relation";
* end user. Those types should be used only for checking against, they are
* not for direct use.
*/
- export type NoteType = "file" | "image" | "search" | "noteMap" | "launcher" | "doc" | "contentWidget" | "text" | "relationMap" | "render" | "canvas" | "mermaid" | "book" | "webView" | "code" | "mindMap" | "spreadsheet";
+ export type NoteType = "file" | "image" | "search" | "noteMap" | "launcher" | "doc" | "contentWidget" | "text" | "relationMap" | "render" | "canvas" | "mermaid" | "book" | "webView" | "code" | "mindMap" | "spreadsheet" | "llmChat";
export interface NotePathRecord {
isArchived: boolean;

View File

@@ -84,6 +84,55 @@ async function createSearchNote(opts = {}) {
return await froca.getNote(note.noteId);
}
async function createLlmChat() {
const note = await server.post<FNoteRow>("special-notes/llm-chat");
await ws.waitForMaxKnownEntityChangeId();
return await froca.getNote(note.noteId);
}
/**
* Gets the most recently modified LLM chat.
* Returns null if no chat exists.
*/
async function getMostRecentLlmChat() {
const note = await server.get<FNoteRow | null>("special-notes/most-recent-llm-chat");
if (!note) {
return null;
}
await ws.waitForMaxKnownEntityChangeId();
return await froca.getNote(note.noteId);
}
/**
* Gets the most recent LLM chat, or creates a new one if none exists.
* Used by sidebar chat for persistent conversations across page refreshes.
*/
async function getOrCreateLlmChat() {
const note = await server.get<FNoteRow>("special-notes/get-or-create-llm-chat");
await ws.waitForMaxKnownEntityChangeId();
return await froca.getNote(note.noteId);
}
export interface RecentLlmChat {
noteId: string;
title: string;
dateModified: string;
}
/**
* Gets a list of recent LLM chats for the history popup.
*/
async function getRecentLlmChats(limit: number = 10): Promise<RecentLlmChat[]> {
return await server.get<RecentLlmChat[]>(`special-notes/recent-llm-chats?limit=${limit}`);
}
export default {
getInboxNote,
getTodayNote,
@@ -94,5 +143,9 @@ export default {
getMonthNote,
getYearNote,
createSqlConsole,
- createSearchNote
+ createSearchNote,
+ createLlmChat,
+ getMostRecentLlmChat,
+ getOrCreateLlmChat,
+ getRecentLlmChats
};
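The get-or-create flow above delegates to the server's special-notes endpoints; its selection logic amounts to "reuse the most recently modified chat, else create one". A standalone sketch under assumed types (not the server implementation):

```typescript
// Hypothetical illustration of the get-or-create selection logic; the real
// work happens server-side in the special-notes endpoints.
interface ChatSummary {
    noteId: string;
    dateModified: string; // ISO-8601, so lexicographic comparison orders by time
}

function pickOrCreateChat(chats: ChatSummary[], create: () => ChatSummary): ChatSummary {
    if (chats.length === 0) {
        return create();
    }
    // Reuse the most recently modified chat.
    return chats.reduce((a, b) => (a.dateModified >= b.dateModified ? a : b));
}
```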

View File

@@ -13,6 +13,11 @@ export const experimentalFeatures = [
id: "new-layout",
name: t("experimental_features.new_layout_name"),
description: t("experimental_features.new_layout_description"),
},
{
id: "llm",
name: t("experimental_features.llm_name"),
description: t("experimental_features.llm_description"),
}
] as const satisfies ExperimentalFeature[];

View File

@@ -19,7 +19,8 @@ export const byNoteType: Record<Exclude<NoteType, "book">, string | null> = {
search: null,
text: null,
webView: null,
- spreadsheet: null
+ spreadsheet: null,
+ llmChat: null
};
export const byBookType: Record<ViewTypeOptions, string | null> = {

View File

@@ -0,0 +1,110 @@
import type { LlmChatConfig, LlmCitation, LlmMessage, LlmModelInfo, LlmUsage } from "@triliumnext/commons";
import server from "./server.js";
/**
* Fetch available models for a provider.
*/
export async function getAvailableModels(provider: string = "anthropic"): Promise<LlmModelInfo[]> {
const response = await server.get<{ models?: LlmModelInfo[] }>(`llm-chat/models?provider=${encodeURIComponent(provider)}`);
return response.models ?? [];
}
export interface StreamCallbacks {
onChunk: (text: string) => void;
onThinking?: (text: string) => void;
onToolUse?: (toolName: string, input: Record<string, unknown>) => void;
onToolResult?: (toolName: string, result: string, isError?: boolean) => void;
onCitation?: (citation: LlmCitation) => void;
onUsage?: (usage: LlmUsage) => void;
onError: (error: string) => void;
onDone: () => void;
}
/**
* Stream a chat completion from the LLM API using Server-Sent Events.
*/
export async function streamChatCompletion(
messages: LlmMessage[],
config: LlmChatConfig,
callbacks: StreamCallbacks
): Promise<void> {
const headers = await server.getHeaders();
const response = await fetch(`${window.glob.baseApiUrl}llm-chat/stream`, {
method: "POST",
headers: {
...headers,
"Content-Type": "application/json"
} as HeadersInit,
body: JSON.stringify({ messages, config })
});
if (!response.ok) {
callbacks.onError(`HTTP ${response.status}: ${response.statusText}`);
return;
}
const reader = response.body?.getReader();
if (!reader) {
callbacks.onError("No response body");
return;
}
const decoder = new TextDecoder();
let buffer = "";
try {
while (true) {
const { done, value } = await reader.read();
if (done) break;
buffer += decoder.decode(value, { stream: true });
const lines = buffer.split("\n");
buffer = lines.pop() || "";
for (const line of lines) {
if (line.startsWith("data: ")) {
try {
const data = JSON.parse(line.slice(6));
switch (data.type) {
case "text":
callbacks.onChunk(data.content);
break;
case "thinking":
callbacks.onThinking?.(data.content);
break;
case "tool_use":
callbacks.onToolUse?.(data.toolName, data.toolInput);
break;
case "tool_result":
callbacks.onToolResult?.(data.toolName, data.result, data.isError);
break;
case "citation":
if (data.citation) {
callbacks.onCitation?.(data.citation);
}
break;
case "usage":
if (data.usage) {
callbacks.onUsage?.(data.usage);
}
break;
case "error":
callbacks.onError(data.error);
break;
case "done":
callbacks.onDone();
break;
}
} catch (e) {
console.error("Failed to parse SSE data line:", line, e);
}
}
}
}
} finally {
reader.releaseLock();
}
}
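The buffering in the reader loop above can be isolated into a pure helper for clarity (a sketch for illustration; the widget keeps this logic inline):

```typescript
// Sketch of the SSE framing used by streamChatCompletion: complete lines are
// parsed as "data: " JSON events, and the trailing partial line is carried
// over into the next chunk.
function parseSseChunk(carry: string, chunk: string): { carry: string; events: unknown[] } {
    const lines = (carry + chunk).split("\n");
    const rest = lines.pop() ?? "";
    const events: unknown[] = [];
    for (const line of lines) {
        if (!line.startsWith("data: ")) continue;
        try {
            events.push(JSON.parse(line.slice(6)));
        } catch {
            // Malformed frames are skipped, mirroring the console.error above.
        }
    }
    return { carry: rest, events };
}
```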

View File

@@ -1,6 +1,7 @@
import type { NoteType } from "../entities/fnote.js";
import type { MenuCommandItem, MenuItem, MenuItemBadge, MenuSeparatorItem } from "../menus/context_menu.js";
import type { TreeCommandNames } from "../menus/tree_context_menu.js";
import { isExperimentalFeatureEnabled } from "./experimental_features.js";
import froca from "./froca.js";
import { t } from "./i18n.js";
import server from "./server.js";
@@ -41,6 +42,7 @@ export const NOTE_TYPES: NoteTypeMapping[] = [
{ type: "relationMap", mime: "application/json", title: t("note_types.relation-map"), icon: "bxs-network-chart" },
// Misc note types
{ type: "llmChat", mime: "application/json", title: t("note_types.llm-chat"), icon: "bx-message-square-dots", isBeta: true },
{ type: "render", mime: "", title: t("note_types.render-note"), icon: "bx-extension" },
{ type: "search", title: t("note_types.saved-search"), icon: "bx-file-find", static: true },
{ type: "webView", mime: "", title: t("note_types.web-view"), icon: "bx-globe-alt" },
@@ -92,6 +94,7 @@ async function getNoteTypeItems(command?: TreeCommandNames) {
function getBlankNoteTypes(command?: TreeCommandNames): MenuItem<TreeCommandNames>[] {
return NOTE_TYPES
.filter((nt) => !nt.reserved && nt.type !== "book")
.filter((nt) => nt.type !== "llmChat" || isExperimentalFeatureEnabled("llm"))
.map((nt) => {
const menuItem: MenuCommandItem<TreeCommandNames> = {
title: nt.title,

View File

@@ -922,6 +922,7 @@ export default {
parseDate,
formatDateISO,
formatDateTime,
formatTime,
formatTimeInterval,
formatSize,
localNowDateTime,

View File

@@ -1750,10 +1750,13 @@ body:not(.mobile) #launcher-pane.horizontal .dropdown-submenu > .dropdown-menu {
justify-content: space-between;
align-items: baseline;
font-weight: bold;
text-transform: uppercase;
color: var(--muted-text-color) !important;
}
#right-pane .card-header-title {
text-transform: uppercase;
}
#right-pane .card-header-buttons {
display: flex;
transform: scale(0.9);

View File

@@ -1157,7 +1157,9 @@
"title": "Experimental Options",
"disclaimer": "These options are experimental and may cause instability. Use with caution.",
"new_layout_name": "New Layout",
- "new_layout_description": "Try out the new layout for a more modern look and improved usability. Subject to heavy change in the upcoming releases."
+ "new_layout_description": "Try out the new layout for a more modern look and improved usability. Subject to heavy change in the upcoming releases.",
+ "llm_name": "AI / LLM Chat",
+ "llm_description": "Enable the AI chat sidebar and LLM chat notes powered by large language models."
},
"fonts": {
"theme_defined": "Theme defined",
@@ -1599,6 +1601,7 @@
"geo-map": "Geo Map",
"beta-feature": "Beta",
"ai-chat": "AI Chat",
"llm-chat": "AI Chat",
"task-list": "Task List",
"new-feature": "New",
"collections": "Collections",
@@ -1610,6 +1613,49 @@
"toggle-on-hint": "Note is not protected, click to make it protected",
"toggle-off-hint": "Note is protected, click to make it unprotected"
},
"llm_chat": {
"placeholder": "Type a message...",
"send": "Send",
"sending": "Sending...",
"empty_state": "Start a conversation by typing a message below.",
"searching_web": "Searching the web...",
"web_search": "Web search",
"note_tools": "Note access",
"sources": "Sources",
"extended_thinking": "Extended thinking",
"legacy_models": "Legacy models",
"thinking": "Thinking...",
"thought_process": "Thought process",
"tool_calls": "{{count}} tool call(s)",
"input": "Input",
"result": "Result",
"error": "Error",
"tool_error": "failed",
"total_tokens": "{{total}} tokens",
"tokens_detail": "{{prompt}} prompt + {{completion}} completion",
"tokens_used": "{{prompt}} prompt + {{completion}} completion = {{total}} tokens",
"tokens_used_with_cost": "{{prompt}} prompt + {{completion}} completion = {{total}} tokens (~${{cost}})",
"tokens_used_with_model": "{{model}}: {{prompt}} prompt + {{completion}} completion = {{total}} tokens",
"tokens_used_with_model_and_cost": "{{model}}: {{prompt}} prompt + {{completion}} completion = {{total}} tokens (~${{cost}})",
"tokens": "tokens",
"context_used": "{{percentage}}% used",
"note_context_enabled": "Click to disable note context: {{title}}",
"note_context_disabled": "Click to include current note in context",
"no_provider_message": "No AI provider configured. Add one to start chatting.",
"add_provider": "Add AI Provider",
"role_user": "You",
"role_assistant": "Assistant"
},
"sidebar_chat": {
"title": "AI Chat",
"launcher_title": "Open AI Chat",
"new_chat": "Start new chat",
"save_chat": "Save chat to notes",
"empty_state": "Start a conversation",
"history": "Chat history",
"recent_chats": "Recent chats",
"no_chats": "No previous chats"
},
"shared_switch": {
"shared": "Shared",
"toggle-on-title": "Share the note",
@@ -2230,5 +2276,21 @@
"sample_xy": "XY",
"sample_venn": "Venn",
"sample_ishikawa": "Ishikawa"
},
"llm": {
"settings_title": "AI / LLM",
"settings_description": "Configure AI and Large Language Model integrations.",
"add_provider": "Add Provider",
"add_provider_title": "Add AI Provider",
"configured_providers": "Configured Providers",
"no_providers_configured": "No providers configured yet.",
"provider_name": "Name",
"provider_type": "Provider",
"actions": "Actions",
"delete_provider": "Delete",
"delete_provider_confirmation": "Are you sure you want to delete the provider \"{{name}}\"?",
"api_key": "API Key",
"api_key_placeholder": "Enter your API key",
"cancel": "Cancel"
}
}

View File

@@ -1,6 +1,7 @@
import { useCallback, useLayoutEffect, useState } from "preact/hooks";
import FNote from "../../entities/fnote";
import { isExperimentalFeatureEnabled } from "../../services/experimental_features";
import froca from "../../services/froca";
import { isDesktop, isMobile } from "../../services/utils";
import TabSwitcher from "../mobile_widgets/TabSwitcher";
@@ -12,6 +13,7 @@ import HistoryNavigationButton from "./HistoryNavigation";
import { LaunchBarContext } from "./launch_bar_widgets";
import { CommandButton, CustomWidget, NoteLauncher, QuickSearchLauncherWidget, ScriptLauncher, TodayLauncher } from "./LauncherDefinitions";
import ProtectedSessionStatusWidget from "./ProtectedSessionStatusWidget";
import SidebarChatButton from "./SidebarChatButton";
import SpacerWidget from "./SpacerWidget";
import SyncStatus from "./SyncStatus";
@@ -98,6 +100,8 @@ function initBuiltinWidget(note: FNote, isHorizontalLayout: boolean) {
return <QuickSearchLauncherWidget />;
case "mobileTabSwitcher":
return <TabSwitcher />;
case "sidebarChat":
return isExperimentalFeatureEnabled("llm") ? <SidebarChatButton /> : undefined;
default:
console.warn(`Unrecognized builtin widget ${builtinWidget} for launcher ${note.noteId} "${note.title}"`);
}

View File

@@ -0,0 +1,24 @@
import { useCallback } from "preact/hooks";
import appContext from "../../components/app_context";
import { t } from "../../services/i18n";
import { LaunchBarActionButton } from "./launch_bar_widgets";
/**
* Launcher button to open the sidebar (which contains the chat).
* The chat widget is always visible in the sidebar for non-chat notes.
*/
export default function SidebarChatButton() {
const handleClick = useCallback(() => {
// Open right pane if hidden, or toggle it if visible
appContext.triggerEvent("toggleRightPane", {});
}, []);
return (
<LaunchBarActionButton
icon="bx bx-message-square-dots"
text={t("sidebar_chat.launcher_title")}
onClick={handleClick}
/>
);
}

View File

@@ -5,6 +5,7 @@ import { useEffect, useMemo, useState } from "preact/hooks";
import FNote from "../../entities/fnote";
import attributes from "../../services/attributes";
import { isExperimentalFeatureEnabled } from "../../services/experimental_features";
import froca from "../../services/froca";
import { t } from "../../services/i18n";
import { NOTE_TYPES, NoteTypeMapping } from "../../services/note_types";
@@ -28,6 +29,7 @@ export default function NoteTypeSwitcher() {
const restNoteTypes: NoteTypeMapping[] = [];
for (const noteType of NOTE_TYPES) {
if (noteType.reserved || noteType.static || noteType.type === "book") continue;
if (noteType.type === "llmChat" && !isExperimentalFeatureEnabled("llm")) continue;
if (SWITCHER_PINNED_NOTE_TYPES.has(noteType.type)) {
pinnedNoteTypes.push(noteType);
} else {

View File

@@ -12,7 +12,7 @@ import { TypeWidgetProps } from "./type_widgets/type_widget";
* A `NoteType` altered by the note detail widget, taking into consideration whether the note is editable or not and adding special note types such as an empty one,
* for protected session or attachment information.
*/
- export type ExtendedNoteType = Exclude<NoteType, "launcher" | "text" | "code"> | "empty" | "readOnlyCode" | "readOnlyText" | "editableText" | "editableCode" | "attachmentDetail" | "attachmentList" | "protectedSession" | "sqlConsole";
+ export type ExtendedNoteType = Exclude<NoteType, "launcher" | "text" | "code" | "llmChat"> | "empty" | "readOnlyCode" | "readOnlyText" | "editableText" | "editableCode" | "attachmentDetail" | "attachmentList" | "protectedSession" | "sqlConsole" | "llmChat";
export type TypeWidget = ((props: TypeWidgetProps) => VNode | JSX.Element | undefined);
type NoteTypeView = () => (Promise<{ default: TypeWidget } | TypeWidget> | TypeWidget);
@@ -147,5 +147,11 @@ export const TYPE_MAPPINGS: Record<ExtendedNoteType, NoteTypeMapping> = {
className: "note-detail-spreadsheet",
printable: true,
isFullHeight: true
},
llmChat: {
view: () => import("./type_widgets/llm_chat/LlmChat"),
className: "note-detail-llm-chat",
printable: true,
isFullHeight: true
}
};

View File

@@ -5,16 +5,27 @@ interface FormDropdownList<T> extends Omit<DropdownProps, "children"> {
values: T[];
keyProperty: keyof T;
titleProperty: keyof T;
/** Property to show as a small suffix next to the title */
titleSuffixProperty?: keyof T;
descriptionProperty?: keyof T;
currentValue: string;
onChange(newValue: string): void;
}
- export default function FormDropdownList<T>({ values, keyProperty, titleProperty, descriptionProperty, currentValue, onChange, ...restProps }: FormDropdownList<T>) {
+ export default function FormDropdownList<T>({ values, keyProperty, titleProperty, titleSuffixProperty, descriptionProperty, currentValue, onChange, ...restProps }: FormDropdownList<T>) {
const currentValueData = values.find(value => value[keyProperty] === currentValue);
const renderTitle = (item: T) => {
const title = item[titleProperty] as string;
const suffix = titleSuffixProperty ? item[titleSuffixProperty] as string : null;
if (suffix) {
return <>{title} <small>{suffix}</small></>;
}
return title;
};
return (
- <Dropdown text={currentValueData?.[titleProperty] ?? ""} {...restProps}>
+ <Dropdown text={currentValueData ? renderTitle(currentValueData) : ""} {...restProps}>
{values.map(item => (
<FormListItem
onClick={() => onChange(item[keyProperty] as string)}
@@ -22,9 +33,9 @@ export default function FormDropdownList<T>({ values, keyProperty, titleProperty
description={descriptionProperty && item[descriptionProperty] as string}
selected={currentValue === item[keyProperty]}
>
- {item[titleProperty] as string}
+ {renderTitle(item)}
</FormListItem>
))}
</Dropdown>
)
}
}

View File

@@ -1,3 +1,4 @@
import DOMPurify from "dompurify";
import type { CSSProperties, HTMLProps, RefObject } from "preact/compat";
type HTMLElementLike = string | HTMLElement | JQuery<HTMLElement>;
@@ -14,16 +15,16 @@ export default function RawHtml({containerRef, ...props}: RawHtmlProps & { conta
}
export function RawHtmlBlock({containerRef, ...props}: RawHtmlProps & { containerRef?: RefObject<HTMLDivElement>}) {
- return <div ref={containerRef} {...getProps(props)} />
+ return <div ref={containerRef} {...getProps(props)} />;
}
function getProps({ className, html, style, onClick }: RawHtmlProps) {
return {
- className: className,
+ className,
dangerouslySetInnerHTML: getHtml(html ?? ""),
style,
onClick
- }
+ };
}
export function getHtml(html: string | HTMLElement | JQuery<HTMLElement>) {
@@ -39,3 +40,19 @@ export function getHtml(html: string | HTMLElement | JQuery<HTMLElement>) {
__html: html as string
};
}
/**
* Renders HTML content sanitized via DOMPurify to prevent XSS.
* Use this instead of {@link RawHtml} when the HTML originates from
* untrusted sources (e.g. LLM responses, user-generated markdown).
*/
export function SanitizedHtml({ className, html, style }: { className?: string; html: string; style?: CSSProperties }) {
return (
<div
className={className}
style={style}
// eslint-disable-next-line react/no-danger
dangerouslySetInnerHTML={{ __html: DOMPurify.sanitize(html) }}
/>
);
}
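For contrast with the DOMPurify-based `SanitizedHtml` above: plain escaping (a minimal sketch, not used by the widget) neutralizes all markup, which would also destroy the legitimate tags produced by Markdown rendering of LLM responses. Sanitizing instead keeps safe tags while stripping script-bearing ones.

```typescript
// Minimal escaper shown for contrast only; the component above relies on
// DOMPurify.sanitize, which preserves safe markup instead of flattening it.
function escapeHtml(html: string): string {
    return html
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;");
}
```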

View File

@@ -7,6 +7,7 @@ import branches from "../../services/branches";
import dialog from "../../services/dialog";
import { getAvailableLocales, t } from "../../services/i18n";
import mime_types from "../../services/mime_types";
import { isExperimentalFeatureEnabled } from "../../services/experimental_features";
import { NOTE_TYPES } from "../../services/note_types";
import protected_session from "../../services/protected_session";
import server from "../../services/server";
@@ -72,7 +73,7 @@ export function NoteTypeDropdownContent({ currentNoteType, currentNoteMime, note
noCodeNotes?: boolean;
}) {
const mimeTypes = useMimeTypes();
- const noteTypes = useMemo(() => NOTE_TYPES.filter((nt) => !nt.reserved && !nt.static), []);
+ const noteTypes = useMemo(() => NOTE_TYPES.filter((nt) => !nt.reserved && !nt.static && (nt.type !== "llmChat" || isExperimentalFeatureEnabled("llm"))), []);
const changeNoteType = useCallback(async (type: NoteType, mime?: string) => {
if (!note || (type === currentNoteType && mime === currentNoteMime)) {
return;

View File

@@ -85,7 +85,7 @@ export function NoteContextMenu({ note, noteContext, itemsAtStart, itemsNearNote
);
const isElectron = getIsElectron();
const isMac = getIsMac();
- const hasSource = ["text", "code", "relationMap", "mermaid", "canvas", "mindMap", "spreadsheet"].includes(noteType);
+ const hasSource = ["text", "code", "relationMap", "mermaid", "canvas", "mindMap", "spreadsheet", "llmChat"].includes(noteType);
const isSearchOrBook = ["search", "book"].includes(noteType);
const isHelpPage = note.noteId.startsWith("_help");
const [syncServerHost] = useTriliumOption("syncServerHost");

View File

@@ -7,6 +7,7 @@ import { useCallback, useEffect, useRef, useState } from "preact/hooks";
import appContext from "../../components/app_context";
import { WidgetsByParent } from "../../services/bundle";
import { isExperimentalFeatureEnabled } from "../../services/experimental_features";
import { t } from "../../services/i18n";
import options from "../../services/options";
import { DEFAULT_GUTTER_SIZE } from "../../services/resizer";
@@ -19,6 +20,7 @@ import PdfAttachments from "./pdf/PdfAttachments";
import PdfLayers from "./pdf/PdfLayers";
import PdfPages from "./pdf/PdfPages";
import RightPanelWidget from "./RightPanelWidget";
import SidebarChat from "./SidebarChat";
import TableOfContents from "./TableOfContents";
const MIN_WIDTH_PERCENT = 5;
@@ -91,6 +93,11 @@ function useItems(rightPaneVisible: boolean, widgetsByParent: WidgetsByParent) {
el: <HighlightsList />,
enabled: noteType === "text" && highlightsList.length > 0,
},
{
el: <SidebarChat />,
enabled: noteType !== "llmChat" && isExperimentalFeatureEnabled("llm"),
position: 1000
},
...widgetsByParent.getLegacyWidgets("right-pane").map((widget) => ({
el: <CustomLegacyWidget key={widget._noteId} originalWidget={widget as LegacyRightPanelWidget} />,
enabled: true,

View File

@@ -51,7 +51,7 @@ export default function RightPanelWidget({ id, title, buttons, children, contain
>
<ActionButton icon="bx bx-chevron-down" text="" />
<div class="card-header-title">{title}</div>
-<div class="card-header-buttons">
+<div class="card-header-buttons" onClick={e => e.stopPropagation()}>
{buttons}
{contextMenuItems && (
<ActionButton

View File

@@ -0,0 +1,113 @@
/* Sidebar Chat Widget Styles */
.sidebar-chat-container {
display: flex;
flex-direction: column;
flex: 1;
min-height: 0; /* Allow shrinking in flex context */
overflow: hidden; /* Contain children within available space */
}
.sidebar-chat-container .llm-chat-input-form {
flex-shrink: 0; /* Keep input bar from shrinking */
.llm-chat-input {
font-size: 0.9em;
padding: 0.5em;
}
}
.sidebar-chat-messages {
flex: 1;
min-height: 0; /* Allow flex shrinking for scroll containment */
overflow-y: auto;
padding: 0.5rem;
display: flex;
flex-direction: column;
gap: 0.5rem;
}
/* Reuse llm-chat-message styles but make them more compact */
.sidebar-chat-messages .llm-chat-message-wrapper {
margin-top: 0;
max-width: 100%;
}
.sidebar-chat-messages .llm-chat-message {
padding: 0.5rem 0.75rem;
font-size: 0.9rem;
}
.sidebar-chat-messages .llm-chat-message-role {
font-size: 0.75rem;
}
.sidebar-chat-messages .llm-chat-tool-activity {
font-size: 0.85rem;
padding: 0.375rem 0.75rem;
margin-bottom: 0;
max-width: 100%;
}
/* Make the sidebar chat widget grow to fill available space when expanded */
#right-pane .widget.grow:not(.collapsed) {
flex: 1;
flex-shrink: 1; /* Override flex-shrink: 0 from main styles */
min-height: 0;
display: flex;
flex-direction: column;
}
#right-pane .widget.grow:not(.collapsed) .body-wrapper {
flex: 1;
min-height: 0;
display: flex;
flex-direction: column;
overflow: hidden; /* Override overflow: auto from main styles */
}
#right-pane .widget.grow:not(.collapsed) .card-body {
flex: 1;
min-height: 0;
overflow: hidden; /* Override overflow: auto - let child handle scrolling */
display: flex;
flex-direction: column;
}
/* Compact markdown in sidebar */
.sidebar-chat-messages .llm-chat-markdown {
font-size: 0.9rem;
line-height: 1.5;
}
.sidebar-chat-messages .llm-chat-markdown p {
margin: 0 0 0.5em 0;
}
.sidebar-chat-messages .llm-chat-markdown pre {
padding: 0.5rem;
font-size: 0.8rem;
}
.sidebar-chat-messages .llm-chat-markdown code {
font-size: 0.85em;
}
.sidebar-chat-history-item-content {
display: flex;
flex-direction: column;
min-width: 0;
}
.sidebar-chat-history-item-content span,
.sidebar-chat-history-item-content strong {
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.sidebar-chat-history-date {
font-size: 0.75rem;
color: var(--muted-text-color);
margin-top: 0.125rem;
}

View File

@@ -0,0 +1,335 @@
import "./SidebarChat.css";
import type { Dropdown as BootstrapDropdown } from "bootstrap";
import { useCallback, useEffect, useRef, useState } from "preact/hooks";
import dateNoteService, { type RecentLlmChat } from "../../services/date_notes.js";
import { t } from "../../services/i18n.js";
import server from "../../services/server.js";
import { formatDateTime } from "../../utils/formatters";
import ActionButton from "../react/ActionButton.js";
import Dropdown from "../react/Dropdown.js";
import { FormListItem } from "../react/FormList.js";
import { useActiveNoteContext, useNote, useNoteProperty } from "../react/hooks.js";
import NoItems from "../react/NoItems.js";
import ChatInputBar from "../type_widgets/llm_chat/ChatInputBar.js";
import ChatMessage from "../type_widgets/llm_chat/ChatMessage.js";
import type { LlmChatContent } from "../type_widgets/llm_chat/llm_chat_types.js";
import { useLlmChat } from "../type_widgets/llm_chat/useLlmChat.js";
import RightPanelWidget from "./RightPanelWidget.js";
/**
* Sidebar chat widget that appears in the right panel.
* Uses a hidden LLM chat note for persistence across all notes.
* The same chat persists when switching between notes.
*/
export default function SidebarChat() {
const [chatNoteId, setChatNoteId] = useState<string | null>(null);
const [shouldSave, setShouldSave] = useState(false);
const [recentChats, setRecentChats] = useState<RecentLlmChat[]>([]);
const saveTimeoutRef = useRef<ReturnType<typeof setTimeout>>();
const historyDropdownRef = useRef<BootstrapDropdown | null>(null);
// Get the current active note context
const { noteId: activeNoteId, note: activeNote } = useActiveNoteContext();
// Reactively watch the chat note's title (updates via WebSocket sync after auto-rename)
const chatNote = useNote(chatNoteId);
const chatTitle = useNoteProperty(chatNote, "title") || t("sidebar_chat.title");
// Use shared chat hook with sidebar-specific options
const chat = useLlmChat(
// onMessagesChange - trigger save
() => setShouldSave(true),
{ defaultEnableNoteTools: true, supportsExtendedThinking: true }
);
// Update chat context when active note changes
useEffect(() => {
chat.setContextNoteId(activeNoteId ?? undefined);
}, [activeNoteId, chat.setContextNoteId]);
// Sync chatNoteId into the hook for auto-title generation
useEffect(() => {
chat.setChatNoteId(chatNoteId ?? undefined);
}, [chatNoteId, chat.setChatNoteId]);
// Ref to access chat methods in effects without triggering re-runs
const chatRef = useRef(chat);
chatRef.current = chat;
// Handle debounced save when shouldSave is triggered
useEffect(() => {
if (!shouldSave || !chatNoteId) {
setShouldSave(false);
return;
}
setShouldSave(false);
if (saveTimeoutRef.current) {
clearTimeout(saveTimeoutRef.current);
}
saveTimeoutRef.current = setTimeout(async () => {
// Read the latest state via chatRef so an unstable `chat` object doesn't
// re-run this effect and cancel the pending debounce on every render.
const content = chatRef.current.getContent();
try {
await server.put(`notes/${chatNoteId}/data`, {
content: JSON.stringify(content)
});
} catch (err) {
console.error("Failed to save chat:", err);
}
}, 500);
return () => {
if (saveTimeoutRef.current) {
clearTimeout(saveTimeoutRef.current);
saveTimeoutRef.current = undefined;
}
};
}, [shouldSave, chatNoteId]);
// Load the most recent chat on mount (runs once)
useEffect(() => {
let cancelled = false;
const loadMostRecentChat = async () => {
try {
const existingChat = await dateNoteService.getMostRecentLlmChat();
if (cancelled) return;
if (existingChat) {
setChatNoteId(existingChat.noteId);
// Load content inline to avoid dependency issues
try {
const blob = await server.get<{ content: string }>(`notes/${existingChat.noteId}/blob`);
if (!cancelled && blob?.content) {
const parsed: LlmChatContent = JSON.parse(blob.content);
chatRef.current.loadFromContent(parsed);
}
} catch (err) {
console.error("Failed to load chat content:", err);
}
} else {
// No existing chat - will create on first message
setChatNoteId(null);
chatRef.current.clearMessages();
}
} catch (err) {
console.error("Failed to load sidebar chat:", err);
}
};
loadMostRecentChat();
return () => {
cancelled = true;
};
}, []);
// Custom submit handler that ensures chat note exists first
const handleSubmit = useCallback(async (e: Event) => {
e.preventDefault();
if (!chat.input.trim() || chat.isStreaming) return;
// Ensure chat note exists before sending (lazy creation)
let noteId = chatNoteId;
if (!noteId) {
try {
const note = await dateNoteService.getOrCreateLlmChat();
if (note) {
setChatNoteId(note.noteId);
noteId = note.noteId;
}
} catch (err) {
console.error("Failed to create sidebar chat:", err);
return;
}
}
if (!noteId) {
console.error("Cannot send message: no chat note available");
return;
}
// Ensure the hook has the chatNoteId before submitting (state update from
// setChatNoteId above won't be visible until next render)
chat.setChatNoteId(noteId);
// Delegate to shared handler
await chat.handleSubmit(e);
}, [chatNoteId, chat]);
const handleKeyDown = useCallback((e: KeyboardEvent) => {
if (e.key === "Enter" && !e.shiftKey) {
e.preventDefault();
handleSubmit(e);
}
}, [handleSubmit]);
const handleNewChat = useCallback(async () => {
try {
const note = await dateNoteService.createLlmChat();
if (note) {
setChatNoteId(note.noteId);
chat.clearMessages();
}
} catch (err) {
console.error("Failed to create new chat:", err);
}
}, [chat]);
const handleSaveChat = useCallback(async () => {
if (!chatNoteId) return;
try {
await server.post("special-notes/save-llm-chat", { llmChatNoteId: chatNoteId });
// Create a new empty chat after saving
const note = await dateNoteService.createLlmChat();
if (note) {
setChatNoteId(note.noteId);
chat.clearMessages();
}
} catch (err) {
console.error("Failed to save chat to permanent location:", err);
}
}, [chatNoteId, chat]);
const loadRecentChats = useCallback(async () => {
try {
const chats = await dateNoteService.getRecentLlmChats(10);
setRecentChats(chats);
} catch (err) {
console.error("Failed to load recent chats:", err);
}
}, []);
const handleSelectChat = useCallback(async (noteId: string) => {
historyDropdownRef.current?.hide();
if (noteId === chatNoteId) return;
try {
const blob = await server.get<{ content: string }>(`notes/${noteId}/blob`);
if (blob?.content) {
const parsed: LlmChatContent = JSON.parse(blob.content);
setChatNoteId(noteId);
chat.loadFromContent(parsed);
}
} catch (err) {
console.error("Failed to load selected chat:", err);
}
}, [chatNoteId, chat]);
return (
<RightPanelWidget
id="sidebar-chat"
title={chatTitle}
grow
buttons={
<>
<ActionButton
icon="bx bx-plus"
text={t("sidebar_chat.new_chat")}
onClick={handleNewChat}
/>
<Dropdown
text=""
buttonClassName="bx bx-history"
title={t("sidebar_chat.history")}
iconAction
hideToggleArrow
dropdownContainerClassName="tn-dropdown-menu-scrollable"
dropdownOptions={{ popperConfig: { strategy: "fixed" } }}
dropdownRef={historyDropdownRef}
onShown={loadRecentChats}
>
{recentChats.length === 0 ? (
<FormListItem disabled>
{t("sidebar_chat.no_chats")}
</FormListItem>
) : (
recentChats.map(chatItem => (
<FormListItem
key={chatItem.noteId}
icon="bx bx-message-square-dots"
className={chatItem.noteId === chatNoteId ? "active" : ""}
onClick={() => handleSelectChat(chatItem.noteId)}
>
<div className="sidebar-chat-history-item-content">
{chatItem.noteId === chatNoteId
? <strong>{chatItem.title}</strong>
: <span>{chatItem.title}</span>}
<span className="sidebar-chat-history-date">
{formatDateTime(new Date(chatItem.dateModified), "short", "short")}
</span>
</div>
</FormListItem>
))
)}
</Dropdown>
<ActionButton
icon="bx bx-save"
text={t("sidebar_chat.save_chat")}
onClick={handleSaveChat}
disabled={chat.messages.length === 0}
/>
</>
}
>
<div className="sidebar-chat-container">
<div className="sidebar-chat-messages">
{chat.messages.length === 0 && !chat.isStreaming && (
<NoItems
icon="bx bx-conversation"
text={t("sidebar_chat.empty_state")}
/>
)}
{chat.messages.map(msg => (
<ChatMessage key={msg.id} message={msg} />
))}
{chat.toolActivity && !chat.streamingThinking && (
<div className="llm-chat-tool-activity">
<span className="llm-chat-tool-spinner" />
{chat.toolActivity}
</div>
)}
{chat.isStreaming && chat.streamingThinking && (
<ChatMessage
message={{
id: "streaming-thinking",
role: "assistant",
content: chat.streamingThinking,
createdAt: new Date().toISOString(),
type: "thinking"
}}
isStreaming
/>
)}
{chat.isStreaming && chat.streamingContent && (
<ChatMessage
message={{
id: "streaming",
role: "assistant",
content: chat.streamingContent,
createdAt: new Date().toISOString(),
citations: chat.pendingCitations.length > 0 ? chat.pendingCitations : undefined
}}
isStreaming
/>
)}
<div ref={chat.messagesEndRef} />
</div>
<ChatInputBar
chat={chat}
rows={2}
activeNoteId={activeNoteId ?? undefined}
activeNoteTitle={activeNote?.title}
onSubmit={handleSubmit}
onKeyDown={handleKeyDown}
/>
</div>
</RightPanelWidget>
);
}
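
The component above debounces persistence with a raw `setTimeout`/`clearTimeout` pair held in a ref. The same trailing-edge pattern as a standalone helper (a sketch; `debounce` is a hypothetical name, not a function in this codebase):

```typescript
// Trailing-edge debounce: only the last call within `ms` actually invokes `fn`.
function debounce<T extends unknown[]>(fn: (...args: T) => void, ms: number): (...args: T) => void {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (...args: T) => {
        if (timer !== undefined) clearTimeout(timer);
        timer = setTimeout(() => fn(...args), ms);
    };
}
```

Wrapping the `server.put` call in such a helper would collapse a burst of `shouldSave` triggers into a single write, which is the behavior the effect above aims for.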

View File

@@ -14,11 +14,12 @@ import SyncOptions from "./options/sync";
import OtherSettings from "./options/other";
import InternationalizationOptions from "./options/i18n";
import AdvancedSettings from "./options/advanced";
import LlmSettings from "./options/llm";
import "./ContentWidget.css";
import { t } from "../../services/i18n";
import BackendLog from "./code/BackendLog";
-export type OptionPages = "_optionsAppearance" | "_optionsShortcuts" | "_optionsTextNotes" | "_optionsCodeNotes" | "_optionsImages" | "_optionsSpellcheck" | "_optionsPassword" | "_optionsMFA" | "_optionsEtapi" | "_optionsBackup" | "_optionsSync" | "_optionsOther" | "_optionsLocalization" | "_optionsAdvanced";
+export type OptionPages = "_optionsAppearance" | "_optionsShortcuts" | "_optionsTextNotes" | "_optionsCodeNotes" | "_optionsImages" | "_optionsSpellcheck" | "_optionsPassword" | "_optionsMFA" | "_optionsEtapi" | "_optionsBackup" | "_optionsSync" | "_optionsOther" | "_optionsLocalization" | "_optionsAdvanced" | "_optionsLlm";
const CONTENT_WIDGETS: Record<OptionPages | "_backendLog", (props: TypeWidgetProps) => JSX.Element> = {
_optionsAppearance: AppearanceSettings,
@@ -35,6 +36,7 @@ const CONTENT_WIDGETS: Record<OptionPages | "_backendLog", (props: TypeWidgetPro
_optionsOther: OtherSettings,
_optionsLocalization: InternationalizationOptions,
_optionsAdvanced: AdvancedSettings,
_optionsLlm: LlmSettings,
_backendLog: BackendLog
}

View File

@@ -0,0 +1,238 @@
import type { RefObject } from "preact";
import { useState, useCallback } from "preact/hooks";
import { t } from "../../../services/i18n.js";
import ActionButton from "../../react/ActionButton.js";
import Button from "../../react/Button.js";
import Dropdown from "../../react/Dropdown.js";
import { FormDropdownDivider, FormDropdownSubmenu, FormListItem, FormListToggleableItem } from "../../react/FormList.js";
import type { UseLlmChatReturn } from "./useLlmChat.js";
import AddProviderModal, { type LlmProviderConfig } from "../options/llm/AddProviderModal.js";
import options from "../../../services/options.js";
/** Format token count with thousands separators */
function formatTokenCount(tokens: number): string {
return tokens.toLocaleString();
}
interface ChatInputBarProps {
/** The chat hook result */
chat: UseLlmChatReturn;
/** Number of rows for the textarea (default: 3) */
rows?: number;
/** Current active note ID (for note context toggle) */
activeNoteId?: string;
/** Current active note title (for note context toggle) */
activeNoteTitle?: string;
/** Custom submit handler (overrides chat.handleSubmit) */
onSubmit?: (e: Event) => void;
/** Custom key down handler (overrides chat.handleKeyDown) */
onKeyDown?: (e: KeyboardEvent) => void;
/** Callback when web search toggle changes */
onWebSearchChange?: () => void;
/** Callback when note tools toggle changes */
onNoteToolsChange?: () => void;
/** Callback when extended thinking toggle changes */
onExtendedThinkingChange?: () => void;
/** Callback when model changes */
onModelChange?: (model: string) => void;
}
export default function ChatInputBar({
chat,
rows = 3,
activeNoteId,
activeNoteTitle,
onSubmit,
onKeyDown,
onWebSearchChange,
onNoteToolsChange,
onExtendedThinkingChange,
onModelChange
}: ChatInputBarProps) {
const [showAddProviderModal, setShowAddProviderModal] = useState(false);
const handleSubmit = onSubmit ?? chat.handleSubmit;
const handleKeyDown = onKeyDown ?? chat.handleKeyDown;
const handleWebSearchToggle = (newValue: boolean) => {
chat.setEnableWebSearch(newValue);
onWebSearchChange?.();
};
const handleNoteToolsToggle = (newValue: boolean) => {
chat.setEnableNoteTools(newValue);
onNoteToolsChange?.();
};
const handleExtendedThinkingToggle = (newValue: boolean) => {
chat.setEnableExtendedThinking(newValue);
onExtendedThinkingChange?.();
};
const handleModelSelect = (model: string) => {
chat.setSelectedModel(model);
onModelChange?.(model);
};
const handleNoteContextToggle = () => {
if (chat.contextNoteId) {
chat.setContextNoteId(undefined);
} else if (activeNoteId) {
chat.setContextNoteId(activeNoteId);
}
};
const handleAddProvider = useCallback(async (provider: LlmProviderConfig) => {
// Get current providers and add the new one
const currentProviders = options.getJson("llmProviders") || [];
const newProviders = [...currentProviders, provider];
await options.save("llmProviders", JSON.stringify(newProviders));
// Refresh models to pick up the new provider
chat.refreshModels();
}, [chat]);
const isNoteContextEnabled = !!chat.contextNoteId && !!activeNoteId;
const currentModel = chat.availableModels.find(m => m.id === chat.selectedModel);
const currentModels = chat.availableModels.filter(m => !m.isLegacy);
const legacyModels = chat.availableModels.filter(m => m.isLegacy);
const contextWindow = currentModel?.contextWindow || 200000;
const percentage = Math.min((chat.lastPromptTokens / contextWindow) * 100, 100);
const isWarning = percentage > 75;
const isCritical = percentage > 90;
const pieColor = isCritical ? "var(--danger-color, #d9534f)" : isWarning ? "var(--warning-color, #f0ad4e)" : "var(--main-selection-color, #007bff)";
// Show setup prompt if no provider is configured
if (!chat.isCheckingProvider && !chat.hasProvider) {
return (
<div className="llm-chat-no-provider">
<div className="llm-chat-no-provider-content">
<span className="bx bx-bot llm-chat-no-provider-icon" />
<p>{t("llm_chat.no_provider_message")}</p>
<Button
text={t("llm_chat.add_provider")}
icon="bx bx-plus"
onClick={() => setShowAddProviderModal(true)}
/>
</div>
<AddProviderModal
show={showAddProviderModal}
onHidden={() => setShowAddProviderModal(false)}
onSave={handleAddProvider}
/>
</div>
);
}
return (
<form className="llm-chat-input-form" onSubmit={handleSubmit}>
<textarea
ref={chat.textareaRef as RefObject<HTMLTextAreaElement>}
className="llm-chat-input"
value={chat.input}
onInput={(e) => chat.setInput((e.target as HTMLTextAreaElement).value)}
placeholder={t("llm_chat.placeholder")}
disabled={chat.isStreaming}
onKeyDown={handleKeyDown}
rows={rows}
/>
<div className="llm-chat-options">
<div className="llm-chat-model-selector">
<span className="bx bx-chip" />
<Dropdown
text={<>{currentModel?.name}</>}
disabled={chat.isStreaming}
buttonClassName="llm-chat-model-select"
>
{currentModels.map(model => (
<FormListItem
key={model.id}
onClick={() => handleModelSelect(model.id)}
checked={chat.selectedModel === model.id}
>
{model.name} <small>({model.costDescription})</small>
</FormListItem>
))}
{legacyModels.length > 0 && (
<>
<FormDropdownDivider />
<FormDropdownSubmenu
icon="bx bx-history"
title={t("llm_chat.legacy_models")}
>
{legacyModels.map(model => (
<FormListItem
key={model.id}
onClick={() => handleModelSelect(model.id)}
checked={chat.selectedModel === model.id}
>
{model.name} <small>({model.costDescription})</small>
</FormListItem>
))}
</FormDropdownSubmenu>
</>
)}
<FormDropdownDivider />
<FormListToggleableItem
icon="bx bx-globe"
title={t("llm_chat.web_search")}
currentValue={chat.enableWebSearch}
onChange={handleWebSearchToggle}
disabled={chat.isStreaming}
/>
<FormListToggleableItem
icon="bx bx-note"
title={t("llm_chat.note_tools")}
currentValue={chat.enableNoteTools}
onChange={handleNoteToolsToggle}
disabled={chat.isStreaming}
/>
<FormListToggleableItem
icon="bx bx-brain"
title={t("llm_chat.extended_thinking")}
currentValue={chat.enableExtendedThinking}
onChange={handleExtendedThinkingToggle}
disabled={chat.isStreaming}
/>
</Dropdown>
{activeNoteId && activeNoteTitle && (
<Button
text={activeNoteTitle}
icon={isNoteContextEnabled ? "bx-file" : "bx-hide"}
kind="lowProfile"
size="micro"
className={`llm-chat-note-context ${isNoteContextEnabled ? "active" : ""}`}
onClick={handleNoteContextToggle}
disabled={chat.isStreaming}
title={isNoteContextEnabled
? t("llm_chat.note_context_enabled", { title: activeNoteTitle })
: t("llm_chat.note_context_disabled")}
/>
)}
{chat.lastPromptTokens > 0 && (
<div
className="llm-chat-context-indicator"
title={`${formatTokenCount(chat.lastPromptTokens)} / ${formatTokenCount(contextWindow)} ${t("llm_chat.tokens")}`}
>
<div
className="llm-chat-context-pie"
style={{
background: `conic-gradient(${pieColor} ${percentage}%, var(--accented-background-color) ${percentage}%)`
}}
/>
<span className="llm-chat-context-text">{t("llm_chat.context_used", { percentage: percentage.toFixed(0) })}</span>
</div>
)}
</div>
<ActionButton
icon={chat.isStreaming ? "bx bx-loader-alt bx-spin" : "bx bx-send"}
text={chat.isStreaming ? t("llm_chat.sending") : t("llm_chat.send")}
onClick={handleSubmit}
disabled={chat.isStreaming || !chat.input.trim()}
className="llm-chat-send-btn"
/>
</div>
</form>
);
}
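
The context-window indicator above derives its pie fill and warning colors from the prompt-token count. That arithmetic, extracted as a pure helper (a sketch; `contextUsage` is a hypothetical name mirroring the `percentage`/`isWarning`/`isCritical` locals above, with the file's 75%/90% thresholds):

```typescript
// Compute context-window usage as the component above does:
// percentage is capped at 100, with warning above 75% and critical above 90%.
function contextUsage(promptTokens: number, contextWindow: number) {
    const percentage = Math.min((promptTokens / contextWindow) * 100, 100);
    return {
        percentage,
        isWarning: percentage > 75,
        isCritical: percentage > 90,
    };
}
```

Keeping this pure makes the thresholds easy to unit-test independently of the `conic-gradient` rendering.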

View File

@@ -0,0 +1,244 @@
import "./LlmChat.css";
import { Marked } from "marked";
import { useMemo } from "preact/hooks";
import { t } from "../../../services/i18n.js";
import utils from "../../../services/utils.js";
import { SanitizedHtml } from "../../react/RawHtml.js";
import { type ContentBlock, getMessageText, type StoredMessage, type ToolCall } from "./llm_chat_types.js";
function shortenNumber(n: number): string {
if (n >= 1_000_000) return `${(n / 1_000_000).toFixed(1)}M`;
if (n >= 1_000) return `${(n / 1_000).toFixed(n >= 10_000 ? 0 : 1)}k`;
return n.toString();
}
// Configure marked rendering options (sanitization is handled separately by SanitizedHtml)
const markedInstance = new Marked({
breaks: true, // Convert \n to <br>
gfm: true // GitHub Flavored Markdown
});
/** Parse markdown to HTML. Sanitization is handled by SanitizedHtml. */
function renderMarkdown(markdown: string): string {
return markedInstance.parse(markdown) as string;
}
interface Props {
message: StoredMessage;
isStreaming?: boolean;
}
function ToolCallCard({ toolCall }: { toolCall: ToolCall }) {
const classes = [
"llm-chat-tool-call-inline",
toolCall.isError && "llm-chat-tool-call-error"
].filter(Boolean).join(" ");
return (
<details className={classes}>
<summary className="llm-chat-tool-call-inline-summary">
<span className={toolCall.isError ? "bx bx-error-circle" : "bx bx-wrench"} />
{toolCall.toolName}
{toolCall.isError && <span className="llm-chat-tool-call-error-badge">{t("llm_chat.tool_error")}</span>}
</summary>
<div className="llm-chat-tool-call-inline-body">
<div className="llm-chat-tool-call-input">
<strong>{t("llm_chat.input")}:</strong>
<pre>{JSON.stringify(toolCall.input, null, 2)}</pre>
</div>
{toolCall.result && (
<div className={`llm-chat-tool-call-result ${toolCall.isError ? "llm-chat-tool-call-result-error" : ""}`}>
<strong>{toolCall.isError ? t("llm_chat.error") : t("llm_chat.result")}:</strong>
<pre>{(() => {
if (typeof toolCall.result === "string" && (toolCall.result.startsWith("{") || toolCall.result.startsWith("["))) {
try {
return JSON.stringify(JSON.parse(toolCall.result), null, 2);
} catch {
return toolCall.result;
}
}
return toolCall.result;
})()}</pre>
</div>
)}
</div>
</details>
);
}
function renderContentBlocks(blocks: ContentBlock[], isStreaming?: boolean) {
return blocks.map((block, idx) => {
if (block.type === "text") {
const html = renderMarkdown(block.content);
return (
<div key={idx}>
<SanitizedHtml className="llm-chat-markdown" html={html} />
{isStreaming && idx === blocks.length - 1 && <span className="llm-chat-cursor" />}
</div>
);
}
if (block.type === "tool_call") {
return <ToolCallCard key={idx} toolCall={block.toolCall} />;
}
return null;
});
}
export default function ChatMessage({ message, isStreaming }: Props) {
const roleLabel = message.role === "user" ? t("llm_chat.role_user") : t("llm_chat.role_assistant");
const isError = message.type === "error";
const isThinking = message.type === "thinking";
const textContent = typeof message.content === "string" ? message.content : getMessageText(message.content);
// Render markdown for assistant messages with legacy string content
const renderedContent = useMemo(() => {
if (message.role === "assistant" && !isError && !isThinking && typeof message.content === "string") {
return renderMarkdown(message.content);
}
return null;
}, [message.content, message.role, isError, isThinking]);
const messageClasses = [
"llm-chat-message",
`llm-chat-message-${message.role}`,
isError && "llm-chat-message-error",
isThinking && "llm-chat-message-thinking"
].filter(Boolean).join(" ");
// Render thinking messages in a collapsible details element
if (isThinking) {
return (
<details className={messageClasses}>
<summary className="llm-chat-thinking-summary">
<span className="bx bx-brain" />
{t("llm_chat.thought_process")}
</summary>
<div className="llm-chat-message-content llm-chat-thinking-content">
{textContent}
{isStreaming && <span className="llm-chat-cursor" />}
</div>
</details>
);
}
// Legacy tool calls (from old format stored as separate field)
const legacyToolCalls = message.toolCalls;
const hasBlockContent = Array.isArray(message.content);
return (
<div className={`llm-chat-message-wrapper llm-chat-message-wrapper-${message.role}`}>
<div className={messageClasses}>
<div className="llm-chat-message-role">
{isError ? t("llm_chat.error") : roleLabel}
</div>
<div className="llm-chat-message-content">
{message.role === "assistant" && !isError ? (
hasBlockContent ? (
renderContentBlocks(message.content as ContentBlock[], isStreaming)
) : (
<>
<SanitizedHtml className="llm-chat-markdown" html={renderedContent || ""} />
{isStreaming && <span className="llm-chat-cursor" />}
</>
)
) : (
textContent
)}
</div>
{legacyToolCalls && legacyToolCalls.length > 0 && (
<details className="llm-chat-tool-calls">
<summary className="llm-chat-tool-calls-summary">
<span className="bx bx-wrench" />
{t("llm_chat.tool_calls", { count: legacyToolCalls.length })}
</summary>
<div className="llm-chat-tool-calls-list">
{legacyToolCalls.map((tool) => (
<ToolCallCard key={tool.id} toolCall={tool} />
))}
</div>
</details>
)}
{message.citations && message.citations.length > 0 && (
<div className="llm-chat-citations">
<div className="llm-chat-citations-label">
<span className="bx bx-link" />
{t("llm_chat.sources")}
</div>
<ul className="llm-chat-citations-list">
{message.citations.map((citation, idx) => {
// Determine display text: title, URL hostname, or cited text
let displayText = citation.title;
if (!displayText && citation.url) {
try {
displayText = new URL(citation.url).hostname;
} catch {
displayText = citation.url;
}
}
if (!displayText) {
displayText = citation.citedText?.slice(0, 50) || `Source ${idx + 1}`;
}
return (
<li key={idx}>
{citation.url ? (
<a
href={citation.url}
target="_blank"
rel="noopener noreferrer"
title={citation.citedText || citation.url}
>
{displayText}
</a>
) : (
<span title={citation.citedText}>
{displayText}
</span>
)}
</li>
);
})}
</ul>
</div>
)}
</div>
<div className={`llm-chat-footer llm-chat-footer-${message.role}`}>
<span
className="llm-chat-footer-time"
title={utils.formatDateTime(new Date(message.createdAt))}
>
{utils.formatTime(new Date(message.createdAt))}
</span>
{message.usage && typeof message.usage.promptTokens === "number" && (
<>
{message.usage.model && (
<>
<span className="llm-chat-usage-separator">·</span>
<span className="llm-chat-usage-model">{message.usage.model}</span>
</>
)}
<span className="llm-chat-usage-separator">·</span>
<span
className="llm-chat-usage-tokens"
title={t("llm_chat.tokens_detail", {
prompt: message.usage.promptTokens.toLocaleString(),
completion: message.usage.completionTokens.toLocaleString()
})}
>
<span className="bx bx-chip" />{" "}
{t("llm_chat.total_tokens", { total: shortenNumber(message.usage.totalTokens) })}
</span>
{message.usage.cost != null && (
<>
<span className="llm-chat-usage-separator">·</span>
<span className="llm-chat-usage-cost">~${message.usage.cost.toFixed(4)}</span>
</>
)}
</>
)}
</div>
</div>
);
}
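
`ToolCallCard` above pretty-prints string results that look like JSON and falls back to the raw string on parse failure. The same logic as a standalone helper (a sketch; `formatToolResult` is a hypothetical name for the inline IIFE in the component):

```typescript
// Pretty-print a tool result if it parses as JSON; otherwise return it unchanged.
function formatToolResult(result: string): string {
    if (result.startsWith("{") || result.startsWith("[")) {
        try {
            return JSON.stringify(JSON.parse(result), null, 2);
        } catch {
            return result; // looked like JSON but wasn't — show it verbatim
        }
    }
    return result;
}
```

The prefix check avoids paying the `JSON.parse` cost for plain-text results, and the catch keeps malformed payloads visible rather than swallowed.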

View File

@@ -0,0 +1,725 @@
.llm-chat-container {
display: flex;
flex-direction: column;
height: 100%;
padding: 1rem;
box-sizing: border-box;
}
.llm-chat-messages {
flex: 1;
overflow-y: auto;
padding-bottom: 1rem;
}
.llm-chat-message-wrapper {
position: relative;
margin-top: 1rem;
padding-bottom: 1.25rem;
max-width: 85%;
}
.llm-chat-message-wrapper:first-child {
margin-top: 0;
}
.llm-chat-message-wrapper-user {
margin-left: auto;
}
.llm-chat-message-wrapper-assistant {
margin-right: auto;
}
/* Show footer only on hover */
.llm-chat-message-wrapper:hover .llm-chat-footer {
opacity: 1;
}
.llm-chat-message {
padding: 0.75rem 1rem;
border-radius: 8px;
user-select: text;
}
.llm-chat-message-user {
background: var(--accented-background-color);
}
.llm-chat-message-assistant {
background: var(--main-background-color);
border: 1px solid var(--main-border-color);
}
.llm-chat-message-role {
font-weight: 600;
margin-bottom: 0.25rem;
font-size: 0.8rem;
color: var(--muted-text-color);
}
.llm-chat-message-content {
word-wrap: break-word;
line-height: 1.5;
}
/* Preserve whitespace only for user messages (plain text) */
.llm-chat-message-user .llm-chat-message-content {
white-space: pre-wrap;
}
.llm-chat-cursor {
display: inline-block;
width: 8px;
height: 1.1em;
background: currentColor;
margin-left: 2px;
vertical-align: text-bottom;
animation: llm-chat-blink 1s infinite;
}
@keyframes llm-chat-blink {
0%, 50% { opacity: 1; }
51%, 100% { opacity: 0; }
}
/* Tool activity indicator */
.llm-chat-tool-activity {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 1rem;
margin-bottom: 1rem;
border-radius: 8px;
background: var(--accented-background-color);
color: var(--muted-text-color);
font-size: 0.9rem;
max-width: 85%;
}
.llm-chat-tool-spinner {
width: 16px;
height: 16px;
border: 2px solid var(--muted-text-color);
border-top-color: transparent;
border-radius: 50%;
animation: llm-chat-spin 0.8s linear infinite;
}
@keyframes llm-chat-spin {
to { transform: rotate(360deg); }
}
/* Citations */
.llm-chat-citations {
margin-top: 0.75rem;
padding-top: 0.75rem;
border-top: 1px solid var(--main-border-color);
}
.llm-chat-citations-label {
display: flex;
align-items: center;
gap: 0.25rem;
font-size: 0.8rem;
font-weight: 600;
color: var(--muted-text-color);
margin-bottom: 0.25rem;
}
.llm-chat-citations-list {
margin: 0;
padding: 0;
list-style: none;
display: flex;
flex-wrap: wrap;
gap: 0.5rem;
}
.llm-chat-citations-list li {
font-size: 0.8rem;
}
.llm-chat-citations-list a {
color: var(--link-color, #007bff);
text-decoration: none;
padding: 0.125rem 0.5rem;
background: var(--accented-background-color);
border-radius: 4px;
display: inline-block;
}
.llm-chat-citations-list a:hover {
text-decoration: underline;
}
/* Error */
.llm-chat-error {
padding: 0.75rem 1rem;
margin-bottom: 1rem;
border-radius: 8px;
background: var(--danger-background-color, #fee);
border: 1px solid var(--danger-border-color, #fcc);
color: var(--danger-text-color, #c00);
user-select: text;
}
/* Error message (persisted in conversation) */
.llm-chat-message-error {
background: var(--danger-background-color, #fee);
border: 1px solid var(--danger-border-color, #fcc);
color: var(--danger-text-color, #c00);
}
.llm-chat-message-error .llm-chat-message-role {
color: var(--danger-text-color, #c00);
}
/* Thinking message (collapsible) */
.llm-chat-message-thinking {
background: var(--accented-background-color);
border: 1px dashed var(--main-border-color);
cursor: pointer;
}
.llm-chat-thinking-summary {
display: flex;
align-items: center;
gap: 0.5rem;
font-size: 0.85rem;
font-weight: 500;
color: var(--muted-text-color);
padding: 0.25rem 0;
list-style: none;
}
.llm-chat-thinking-summary::-webkit-details-marker {
display: none;
}
.llm-chat-thinking-summary::before {
content: "▶";
font-size: 0.7em;
transition: transform 0.2s ease;
}
.llm-chat-message-thinking[open] .llm-chat-thinking-summary::before {
transform: rotate(90deg);
}
.llm-chat-thinking-summary .bx {
font-size: 1rem;
}
.llm-chat-thinking-content {
margin-top: 0.5rem;
padding-top: 0.5rem;
border-top: 1px solid var(--main-border-color);
font-size: 0.9rem;
color: var(--muted-text-color);
white-space: pre-wrap;
}
/* Input form */
.llm-chat-input-form {
display: flex;
flex-direction: column;
gap: 0.5rem;
padding-top: 1rem;
border-top: 1px solid var(--main-border-color);
}
.llm-chat-input {
flex: 1;
min-height: 60px;
max-height: 200px;
resize: vertical;
padding: 0.75rem;
border: 1px solid var(--main-border-color);
border-radius: 8px;
font-family: inherit;
font-size: inherit;
background: var(--main-background-color);
color: var(--main-text-color);
}
.llm-chat-input:focus {
outline: none;
border-color: var(--main-selection-color);
box-shadow: 0 0 0 2px var(--main-selection-color-soft, rgba(0, 123, 255, 0.25));
}
.llm-chat-input:disabled {
opacity: 0.6;
cursor: not-allowed;
}
/* Options row */
.llm-chat-options {
display: flex;
align-items: center;
gap: 0.75rem;
}
.llm-chat-send-btn {
margin-left: auto;
font-size: 1.25rem;
}
.llm-chat-send-btn.disabled {
opacity: 0.4;
}
/* Model selector */
.llm-chat-model-selector {
display: flex;
align-items: center;
gap: 0.375rem;
font-size: 0.85rem;
color: var(--muted-text-color);
}
.llm-chat-model-selector .bx {
font-size: 1rem;
}
.llm-chat-model-selector .dropdown {
display: flex;
small {
margin-left: 0.5em;
color: var(--muted-text-color);
}
/* Position legacy models submenu to open upward */
.dropdown-submenu .dropdown-menu {
bottom: 0;
top: auto;
}
}
.llm-chat-model-select.select-button {
padding: 0.25rem 0.5rem;
border: 1px solid var(--main-border-color);
border-radius: 4px;
background: var(--main-background-color);
color: var(--main-text-color);
font-family: inherit;
font-size: 0.85rem;
cursor: pointer;
min-width: 140px;
text-align: left;
}
.llm-chat-model-select.select-button:focus {
outline: none;
border-color: var(--main-selection-color);
}
.llm-chat-model-select.select-button:disabled {
opacity: 0.5;
cursor: not-allowed;
}
/* Note context toggle */
.llm-chat-note-context.tn-low-profile {
max-width: 150px;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
opacity: 0.5;
background: none;
border: none;
}
.llm-chat-note-context.tn-low-profile:hover:not(:disabled) {
opacity: 0.8;
background: none;
}
.llm-chat-note-context.tn-low-profile.active {
opacity: 1;
}
/* Markdown styles */
.llm-chat-markdown {
line-height: 1.6;
}
.llm-chat-markdown p {
margin: 0 0 0.75em 0;
}
.llm-chat-markdown p:last-child {
margin-bottom: 0;
}
.llm-chat-markdown h1,
.llm-chat-markdown h2,
.llm-chat-markdown h3,
.llm-chat-markdown h4,
.llm-chat-markdown h5,
.llm-chat-markdown h6 {
margin: 1em 0 0.5em 0;
font-weight: 600;
line-height: 1.3;
}
.llm-chat-markdown h1:first-child,
.llm-chat-markdown h2:first-child,
.llm-chat-markdown h3:first-child {
margin-top: 0;
}
.llm-chat-markdown h1 { font-size: 1.4em; }
.llm-chat-markdown h2 { font-size: 1.25em; }
.llm-chat-markdown h3 { font-size: 1.1em; }
.llm-chat-markdown ul,
.llm-chat-markdown ol {
margin: 0.5em 0;
padding-left: 1.5em;
}
.llm-chat-markdown li {
margin: 0.25em 0;
}
.llm-chat-markdown code {
background: var(--accented-background-color);
padding: 0.15em 0.4em;
border-radius: 4px;
font-family: var(--monospace-font-family, monospace);
font-size: 0.9em;
}
.llm-chat-markdown pre {
background: var(--accented-background-color);
padding: 0.75em 1em;
border-radius: 6px;
overflow-x: auto;
margin: 0.75em 0;
}
.llm-chat-markdown pre code {
background: none;
padding: 0;
font-size: 0.85em;
}
.llm-chat-markdown blockquote {
margin: 0.75em 0;
padding: 0.5em 1em;
border-left: 3px solid var(--main-border-color);
background: var(--accented-background-color);
}
.llm-chat-markdown blockquote p {
margin: 0;
}
.llm-chat-markdown a {
color: var(--link-color, #007bff);
text-decoration: none;
}
.llm-chat-markdown a:hover {
text-decoration: underline;
}
.llm-chat-markdown hr {
border: none;
border-top: 1px solid var(--main-border-color);
margin: 1em 0;
}
.llm-chat-markdown table {
border-collapse: collapse;
width: 100%;
margin: 0.75em 0;
}
.llm-chat-markdown th,
.llm-chat-markdown td {
border: 1px solid var(--main-border-color);
padding: 0.5em 0.75em;
text-align: left;
}
.llm-chat-markdown th {
background: var(--accented-background-color);
font-weight: 600;
}
.llm-chat-markdown strong {
font-weight: 600;
}
.llm-chat-markdown em {
font-style: italic;
}
/* Tool calls display */
.llm-chat-tool-calls {
margin-top: 0.75rem;
padding-top: 0.75rem;
border-top: 1px solid var(--main-border-color);
}
.llm-chat-tool-calls-summary {
display: flex;
align-items: center;
gap: 0.5rem;
font-size: 0.85rem;
font-weight: 500;
color: var(--muted-text-color);
padding: 0.25rem 0;
cursor: pointer;
list-style: none;
}
.llm-chat-tool-calls-summary::-webkit-details-marker {
display: none;
}
.llm-chat-tool-calls-summary::before {
content: "▶";
font-size: 0.7em;
transition: transform 0.2s ease;
}
.llm-chat-tool-calls[open] .llm-chat-tool-calls-summary::before {
transform: rotate(90deg);
}
.llm-chat-tool-calls-summary .bx {
font-size: 1rem;
}
.llm-chat-tool-calls-list {
margin-top: 0.5rem;
display: flex;
flex-direction: column;
gap: 0.75rem;
}
.llm-chat-tool-call {
background: var(--accented-background-color);
border-radius: 6px;
padding: 0.75rem;
font-size: 0.85rem;
}
.llm-chat-tool-call-name {
font-weight: 600;
margin-bottom: 0.5rem;
color: var(--main-text-color);
font-family: var(--monospace-font-family, monospace);
}
.llm-chat-tool-call-input,
.llm-chat-tool-call-result {
margin-top: 0.5rem;
}
.llm-chat-tool-call-input strong,
.llm-chat-tool-call-result strong {
display: block;
font-size: 0.75rem;
color: var(--muted-text-color);
margin-bottom: 0.25rem;
}
.llm-chat-tool-call pre {
margin: 0;
padding: 0.5rem;
background: var(--main-background-color);
border-radius: 4px;
overflow-x: auto;
font-size: 0.8rem;
font-family: var(--monospace-font-family, monospace);
max-height: 200px;
overflow-y: auto;
}
/* Inline tool call cards (timeline style) */
.llm-chat-tool-call-inline {
margin: 0.5rem 0;
background: var(--accented-background-color);
border-radius: 6px;
border-left: 3px solid var(--muted-text-color);
font-size: 0.85rem;
}
.llm-chat-tool-call-inline-summary {
display: flex;
align-items: center;
gap: 0.5rem;
padding: 0.5rem 0.75rem;
cursor: pointer;
list-style: none;
font-weight: 500;
color: var(--muted-text-color);
font-family: var(--monospace-font-family, monospace);
}
.llm-chat-tool-call-inline-summary::-webkit-details-marker {
display: none;
}
.llm-chat-tool-call-inline-summary::before {
content: "▶";
font-size: 0.7em;
transition: transform 0.2s ease;
}
.llm-chat-tool-call-inline[open] .llm-chat-tool-call-inline-summary::before {
transform: rotate(90deg);
}
.llm-chat-tool-call-inline-summary .bx {
font-size: 1rem;
}
.llm-chat-tool-call-inline-body {
padding: 0 0.75rem 0.75rem;
}
.llm-chat-tool-call-inline-body pre {
margin: 0;
padding: 0.5rem;
background: var(--main-background-color);
border-radius: 4px;
overflow-x: auto;
font-size: 0.8rem;
font-family: var(--monospace-font-family, monospace);
max-height: 200px;
overflow-y: auto;
}
.llm-chat-tool-call-inline-body strong {
display: block;
font-size: 0.75rem;
color: var(--muted-text-color);
margin-bottom: 0.25rem;
}
.llm-chat-tool-call-inline-body .llm-chat-tool-call-result {
margin-top: 0.5rem;
}
/* Tool call error styling */
.llm-chat-tool-call-error {
border-left-color: var(--danger-color, #dc3545);
}
.llm-chat-tool-call-error .llm-chat-tool-call-inline-summary {
color: var(--danger-color, #dc3545);
}
.llm-chat-tool-call-error-badge {
font-size: 0.75rem;
font-weight: 400;
font-family: var(--main-font-family);
color: var(--danger-color, #dc3545);
opacity: 0.8;
}
.llm-chat-tool-call-result-error pre {
color: var(--danger-color, #dc3545);
}
/* Message footer (timestamp + token usage, sits below the bubble) */
.llm-chat-footer {
position: absolute;
bottom: 0;
left: 0;
right: 0;
display: flex;
align-items: center;
gap: 0.375rem;
padding: 0.125rem 0.5rem;
font-size: 0.7rem;
color: var(--muted-text-color);
cursor: default;
opacity: 0;
transition: opacity 0.15s ease;
}
.llm-chat-footer-user {
justify-content: flex-end;
}
.llm-chat-footer .bx {
font-size: 0.875rem;
}
.llm-chat-footer-time {
cursor: help;
}
.llm-chat-usage-model {
font-weight: 500;
}
.llm-chat-usage-separator {
opacity: 0.5;
}
.llm-chat-usage-tokens {
cursor: help;
font-family: var(--monospace-font-family, monospace);
}
.llm-chat-usage-cost {
font-family: var(--monospace-font-family, monospace);
}
/* Context window indicator */
.llm-chat-context-indicator {
display: flex;
align-items: center;
gap: 0.375rem;
margin-left: 0.5rem;
cursor: help;
}
.llm-chat-context-pie {
width: 14px;
height: 14px;
border-radius: 50%;
flex-shrink: 0;
}
.llm-chat-context-text {
font-size: 0.75rem;
color: var(--muted-text-color);
}
/* No provider state */
.llm-chat-no-provider {
display: flex;
align-items: center;
justify-content: center;
padding: 1rem;
border-top: 1px solid var(--main-border-color);
}
.llm-chat-no-provider-content {
display: flex;
flex-direction: column;
align-items: center;
gap: 0.75rem;
text-align: center;
color: var(--muted-text-color);
}
.llm-chat-no-provider-icon {
font-size: 2rem;
opacity: 0.5;
}
.llm-chat-no-provider-content p {
margin: 0;
font-size: 0.9rem;
}


@@ -0,0 +1,109 @@
import "./LlmChat.css";
import { useCallback, useEffect, useRef } from "preact/hooks";
import { t } from "../../../services/i18n.js";
import { useEditorSpacedUpdate } from "../../react/hooks.js";
import NoItems from "../../react/NoItems.js";
import { TypeWidgetProps } from "../type_widget.js";
import ChatInputBar from "./ChatInputBar.js";
import ChatMessage from "./ChatMessage.js";
import type { LlmChatContent } from "./llm_chat_types.js";
import { useLlmChat } from "./useLlmChat.js";
export default function LlmChat({ note, ntxId, noteContext }: TypeWidgetProps) {
const spacedUpdateRef = useRef<{ scheduleUpdate: () => void } | null>(null);
const chat = useLlmChat(
// onMessagesChange - trigger save
() => spacedUpdateRef.current?.scheduleUpdate(),
{ defaultEnableNoteTools: false, supportsExtendedThinking: true, chatNoteId: note?.noteId }
);
// Keep chatNoteId in sync when the note changes
useEffect(() => {
chat.setChatNoteId(note?.noteId);
}, [note?.noteId, chat.setChatNoteId]);
const spacedUpdate = useEditorSpacedUpdate({
note,
noteType: "llmChat",
noteContext,
getData: () => {
const content = chat.getContent();
return { content: JSON.stringify(content) };
},
onContentChange: (content) => {
if (!content) {
chat.clearMessages();
return;
}
try {
const parsed: LlmChatContent = JSON.parse(content);
chat.loadFromContent(parsed);
} catch (e) {
console.error("Failed to parse LLM chat content:", e);
chat.clearMessages();
}
}
});
spacedUpdateRef.current = spacedUpdate;
const triggerSave = useCallback(() => {
spacedUpdateRef.current?.scheduleUpdate();
}, []);
return (
<div className="llm-chat-container">
<div className="llm-chat-messages">
{chat.messages.length === 0 && !chat.isStreaming && (
<NoItems
icon="bx bx-conversation"
text={t("llm_chat.empty_state")}
/>
)}
{chat.messages.map(msg => (
<ChatMessage key={msg.id} message={msg} />
))}
{chat.toolActivity && !chat.streamingThinking && (
<div className="llm-chat-tool-activity">
<span className="llm-chat-tool-spinner" />
{chat.toolActivity}
</div>
)}
{chat.isStreaming && chat.streamingThinking && (
<ChatMessage
message={{
id: "streaming-thinking",
role: "assistant",
content: chat.streamingThinking,
createdAt: new Date().toISOString(),
type: "thinking"
}}
isStreaming
/>
)}
{chat.isStreaming && chat.streamingContent && (
<ChatMessage
message={{
id: "streaming",
role: "assistant",
content: chat.streamingContent,
createdAt: new Date().toISOString(),
citations: chat.pendingCitations.length > 0 ? chat.pendingCitations : undefined
}}
isStreaming
/>
)}
<div ref={chat.messagesEndRef} />
</div>
<ChatInputBar
chat={chat}
onWebSearchChange={triggerSave}
onNoteToolsChange={triggerSave}
onExtendedThinkingChange={triggerSave}
onModelChange={triggerSave}
/>
</div>
);
}
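The `getData`/`onContentChange` pair above persists the chat note as a JSON blob. The save/load round trip, including the fall-back to an empty chat when the stored content is corrupt, can be sketched framework-free (`LlmChatContent` trimmed to the fields exercised here; the message values are made up):

```typescript
// Trimmed mirror of LlmChatContent: version + messages is all we exercise.
interface StoredMessage { id: string; role: "user" | "assistant"; content: string; createdAt: string }
interface LlmChatContent { version: 1; messages: StoredMessage[] }

// Serialize as getData does...
const saved: LlmChatContent = {
    version: 1,
    messages: [{ id: "m1", role: "user", content: "hi", createdAt: "2026-03-30T18:00:00Z" }]
};
const blob = JSON.stringify(saved);

// ...and restore as onContentChange does, degrading to an empty chat on bad input
// instead of letting a parse error break note loading.
function load(content: string): LlmChatContent {
    try {
        return JSON.parse(content) as LlmChatContent;
    } catch {
        return { version: 1, messages: [] };
    }
}

const restored = load(blob);
const fallback = load("not json");
```

The same shape is what `loadFromContent` later consumes, so a corrupt note simply renders as an empty conversation.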


@@ -0,0 +1,80 @@
import type { LlmCitation, LlmUsage } from "@triliumnext/commons";
export type MessageType = "message" | "error" | "thinking";
export interface ToolCall {
id: string;
toolName: string;
input: Record<string, unknown>;
result?: string;
isError?: boolean;
}
/** A block of text content (rendered as Markdown for assistant messages). */
export interface TextBlock {
type: "text";
content: string;
}
/** A tool invocation block shown inline in the message timeline. */
export interface ToolCallBlock {
type: "tool_call";
toolCall: ToolCall;
}
/** An ordered content block in an assistant message. */
export type ContentBlock = TextBlock | ToolCallBlock;
/**
* Extract the plain text from message content (works for both legacy string and block formats).
*/
export function getMessageText(content: string | ContentBlock[]): string {
if (typeof content === "string") {
return content;
}
return content
.filter((b): b is TextBlock => b.type === "text")
.map(b => b.content)
.join("");
}
/**
* Extract tool calls from message content blocks.
*/
export function getMessageToolCalls(message: StoredMessage): ToolCall[] {
// Legacy format: tool calls stored in separate field
if (message.toolCalls) {
return message.toolCalls;
}
// Block format: extract from content blocks
if (Array.isArray(message.content)) {
return message.content
.filter((b): b is ToolCallBlock => b.type === "tool_call")
.map(b => b.toolCall);
}
return [];
}
export interface StoredMessage {
id: string;
role: "user" | "assistant" | "system";
/** Message content: plain string (user messages, legacy) or ordered content blocks (assistant). */
content: string | ContentBlock[];
createdAt: string;
citations?: LlmCitation[];
/** Message type for special rendering. Defaults to "message" if omitted. */
type?: MessageType;
/** @deprecated Tool calls are now inline in content blocks. Kept for backward compatibility. */
toolCalls?: ToolCall[];
/** Token usage for this response */
usage?: LlmUsage;
}
export interface LlmChatContent {
version: 1;
messages: StoredMessage[];
selectedModel?: string;
enableWebSearch?: boolean;
enableNoteTools?: boolean;
enableExtendedThinking?: boolean;
}
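The string-vs-blocks duality that `getMessageText` handles above can be exercised directly; this sketch re-declares the types trimmed to the fields used, and the sample messages are illustrative:

```typescript
// Minimal mirrors of the module's types, trimmed to what this example needs.
type ToolCall = { id: string; toolName: string; input: Record<string, unknown>; result?: string };
type TextBlock = { type: "text"; content: string };
type ToolCallBlock = { type: "tool_call"; toolCall: ToolCall };
type ContentBlock = TextBlock | ToolCallBlock;

// Same logic as getMessageText: legacy strings pass through, block arrays
// are filtered down to text blocks and concatenated in order.
function getMessageText(content: string | ContentBlock[]): string {
    if (typeof content === "string") return content;
    return content.filter((b): b is TextBlock => b.type === "text").map(b => b.content).join("");
}

// Legacy user message: plain string.
const legacy = getMessageText("Hello");

// Assistant message: ordered blocks with a tool call interleaved mid-answer.
const blocks: ContentBlock[] = [
    { type: "text", content: "Searching... " },
    { type: "tool_call", toolCall: { id: "t1", toolName: "web_search", input: { q: "trilium" } } },
    { type: "text", content: "Found it." }
];
const flattened = getMessageText(blocks);
```

Note the tool-call block simply disappears from the flattened text, which is what lets block-format messages be replayed to the API as plain strings.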


@@ -0,0 +1,404 @@
import type { LlmCitation, LlmMessage, LlmModelInfo, LlmUsage } from "@triliumnext/commons";
import { RefObject } from "preact";
import { useCallback, useEffect, useRef, useState } from "preact/hooks";
import { t } from "../../../services/i18n.js";
import { getAvailableModels, streamChatCompletion } from "../../../services/llm_chat.js";
import { randomString } from "../../../services/utils.js";
import type { ContentBlock, LlmChatContent, StoredMessage } from "./llm_chat_types.js";
export interface ModelOption extends LlmModelInfo {
costDescription?: string;
}
export interface LlmChatOptions {
/** Default value for enableNoteTools */
defaultEnableNoteTools?: boolean;
/** Whether extended thinking is supported */
supportsExtendedThinking?: boolean;
/** Initial context note ID (the note the user is viewing) */
contextNoteId?: string;
/** The chat note ID (used for auto-renaming on first message) */
chatNoteId?: string;
}
export interface UseLlmChatReturn {
// State
messages: StoredMessage[];
input: string;
isStreaming: boolean;
streamingContent: string;
streamingThinking: string;
toolActivity: string | null;
pendingCitations: LlmCitation[];
availableModels: ModelOption[];
selectedModel: string;
enableWebSearch: boolean;
enableNoteTools: boolean;
enableExtendedThinking: boolean;
contextNoteId: string | undefined;
lastPromptTokens: number;
messagesEndRef: RefObject<HTMLDivElement>;
textareaRef: RefObject<HTMLTextAreaElement>;
/** Whether a provider is configured and available */
hasProvider: boolean;
/** Whether we're still checking for providers */
isCheckingProvider: boolean;
// Setters
setInput: (value: string) => void;
setMessages: (messages: StoredMessage[]) => void;
setSelectedModel: (model: string) => void;
setEnableWebSearch: (value: boolean) => void;
setEnableNoteTools: (value: boolean) => void;
setEnableExtendedThinking: (value: boolean) => void;
setContextNoteId: (noteId: string | undefined) => void;
setChatNoteId: (noteId: string | undefined) => void;
// Actions
handleSubmit: (e: Event) => Promise<void>;
handleKeyDown: (e: KeyboardEvent) => void;
loadFromContent: (content: LlmChatContent) => void;
getContent: () => LlmChatContent;
clearMessages: () => void;
/** Refresh the provider/models list */
refreshModels: () => void;
}
export function useLlmChat(
onMessagesChange?: (messages: StoredMessage[]) => void,
options: LlmChatOptions = {}
): UseLlmChatReturn {
const { defaultEnableNoteTools = false, supportsExtendedThinking = false, contextNoteId: initialContextNoteId, chatNoteId: initialChatNoteId } = options;
const [messages, setMessagesInternal] = useState<StoredMessage[]>([]);
const [input, setInput] = useState("");
const [isStreaming, setIsStreaming] = useState(false);
const [streamingContent, setStreamingContent] = useState("");
const [streamingThinking, setStreamingThinking] = useState("");
const [toolActivity, setToolActivity] = useState<string | null>(null);
const [pendingCitations, setPendingCitations] = useState<LlmCitation[]>([]);
const [availableModels, setAvailableModels] = useState<ModelOption[]>([]);
const [selectedModel, setSelectedModel] = useState<string>("");
const [enableWebSearch, setEnableWebSearch] = useState(true);
const [enableNoteTools, setEnableNoteTools] = useState(defaultEnableNoteTools);
const [enableExtendedThinking, setEnableExtendedThinking] = useState(false);
const [contextNoteId, setContextNoteId] = useState<string | undefined>(initialContextNoteId);
const [chatNoteId, setChatNoteIdState] = useState<string | undefined>(initialChatNoteId);
const [lastPromptTokens, setLastPromptTokens] = useState<number>(0);
const [hasProvider, setHasProvider] = useState<boolean>(true); // Assume true initially
const [isCheckingProvider, setIsCheckingProvider] = useState<boolean>(true);
const messagesEndRef = useRef<HTMLDivElement>(null);
const textareaRef = useRef<HTMLTextAreaElement>(null);
// Refs to get fresh values in getContent (avoids stale closures)
const messagesRef = useRef(messages);
messagesRef.current = messages;
const selectedModelRef = useRef(selectedModel);
selectedModelRef.current = selectedModel;
const enableWebSearchRef = useRef(enableWebSearch);
enableWebSearchRef.current = enableWebSearch;
const enableNoteToolsRef = useRef(enableNoteTools);
enableNoteToolsRef.current = enableNoteTools;
const enableExtendedThinkingRef = useRef(enableExtendedThinking);
enableExtendedThinkingRef.current = enableExtendedThinking;
const chatNoteIdRef = useRef(chatNoteId);
chatNoteIdRef.current = chatNoteId;
const setChatNoteId = useCallback((noteId: string | undefined) => {
chatNoteIdRef.current = noteId;
setChatNoteIdState(noteId);
}, []);
const contextNoteIdRef = useRef(contextNoteId);
contextNoteIdRef.current = contextNoteId;
// Wrapper to call onMessagesChange when messages update
const setMessages = useCallback((newMessages: StoredMessage[]) => {
setMessagesInternal(newMessages);
onMessagesChange?.(newMessages);
}, [onMessagesChange]);
// Fetch available models on mount
const refreshModels = useCallback(() => {
setIsCheckingProvider(true);
getAvailableModels().then(models => {
const modelsWithDescription = models.map(m => ({
...m,
costDescription: m.costMultiplier ? `${m.costMultiplier}x` : undefined
}));
setAvailableModels(modelsWithDescription);
setHasProvider(models.length > 0);
setIsCheckingProvider(false);
if (!selectedModel) {
const defaultModel = models.find(m => m.isDefault) || models[0];
if (defaultModel) {
setSelectedModel(defaultModel.id);
}
}
}).catch(err => {
console.error("Failed to fetch available models:", err);
setHasProvider(false);
setIsCheckingProvider(false);
});
}, [selectedModel]);
useEffect(() => {
refreshModels();
}, []);
// Scroll to bottom when content changes
const scrollToBottom = useCallback(() => {
messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
}, []);
useEffect(() => {
scrollToBottom();
}, [messages, streamingContent, streamingThinking, toolActivity, scrollToBottom]);
// Load state from content object
const loadFromContent = useCallback((content: LlmChatContent) => {
setMessagesInternal(content.messages || []);
if (content.selectedModel) {
setSelectedModel(content.selectedModel);
}
if (typeof content.enableWebSearch === "boolean") {
setEnableWebSearch(content.enableWebSearch);
}
if (typeof content.enableNoteTools === "boolean") {
setEnableNoteTools(content.enableNoteTools);
}
if (supportsExtendedThinking && typeof content.enableExtendedThinking === "boolean") {
setEnableExtendedThinking(content.enableExtendedThinking);
}
// Restore last prompt tokens from the most recent message with usage
const lastUsage = [...(content.messages || [])].reverse().find(m => m.usage)?.usage;
setLastPromptTokens(lastUsage?.promptTokens ?? 0);
}, [supportsExtendedThinking]);
// Get current state as content object (uses refs to avoid stale closures)
const getContent = useCallback((): LlmChatContent => {
const content: LlmChatContent = {
version: 1,
messages: messagesRef.current,
selectedModel: selectedModelRef.current || undefined,
enableWebSearch: enableWebSearchRef.current,
enableNoteTools: enableNoteToolsRef.current
};
if (supportsExtendedThinking) {
content.enableExtendedThinking = enableExtendedThinkingRef.current;
}
return content;
}, [supportsExtendedThinking]);
const clearMessages = useCallback(() => {
setMessages([]);
setLastPromptTokens(0);
}, [setMessages]);
const handleSubmit = useCallback(async (e: Event) => {
e.preventDefault();
if (!input.trim() || isStreaming) return;
setToolActivity(null);
setPendingCitations([]);
const userMessage: StoredMessage = {
id: randomString(),
role: "user",
content: input.trim(),
createdAt: new Date().toISOString()
};
const newMessages = [...messages, userMessage];
setMessagesInternal(newMessages);
setInput("");
setIsStreaming(true);
setStreamingContent("");
setStreamingThinking("");
let thinkingContent = "";
const contentBlocks: ContentBlock[] = [];
const citations: LlmCitation[] = [];
let usage: LlmUsage | undefined;
/** Get or create the last text block to append streaming text to. */
function lastTextBlock(): ContentBlock & { type: "text" } {
const last = contentBlocks[contentBlocks.length - 1];
if (last?.type === "text") {
return last;
}
const block: ContentBlock = { type: "text", content: "" };
contentBlocks.push(block);
return block as ContentBlock & { type: "text" };
}
const apiMessages: LlmMessage[] = newMessages.map(m => ({
role: m.role,
content: typeof m.content === "string" ? m.content : m.content
.filter((b): b is ContentBlock & { type: "text" } => b.type === "text")
.map(b => b.content)
.join("")
}));
const streamOptions: Parameters<typeof streamChatCompletion>[1] = {
model: selectedModel || undefined,
enableWebSearch,
enableNoteTools,
contextNoteId,
chatNoteId: chatNoteIdRef.current
};
if (supportsExtendedThinking) {
streamOptions.enableExtendedThinking = enableExtendedThinking;
}
await streamChatCompletion(
apiMessages,
streamOptions,
{
onChunk: (text) => {
lastTextBlock().content += text;
setStreamingContent(contentBlocks
.filter((b): b is ContentBlock & { type: "text" } => b.type === "text")
.map(b => b.content)
.join(""));
setToolActivity(null);
},
onThinking: (text) => {
thinkingContent += text;
setStreamingThinking(thinkingContent);
setToolActivity(t("llm_chat.thinking"));
},
onToolUse: (toolName, toolInput) => {
const toolLabel = toolName === "web_search"
? t("llm_chat.searching_web")
: `Using ${toolName}...`;
setToolActivity(toolLabel);
contentBlocks.push({
type: "tool_call",
toolCall: {
id: randomString(),
toolName,
input: toolInput
}
});
},
onToolResult: (toolName, result, isError) => {
// Find the most recent tool_call block for this tool without a result
for (let i = contentBlocks.length - 1; i >= 0; i--) {
const block = contentBlocks[i];
if (block.type === "tool_call" && block.toolCall.toolName === toolName && !block.toolCall.result) {
block.toolCall.result = result;
block.toolCall.isError = isError;
break;
}
}
},
onCitation: (citation) => {
citations.push(citation);
setPendingCitations([...citations]);
},
onUsage: (u) => {
usage = u;
setLastPromptTokens(u.promptTokens);
},
onError: (errorMsg) => {
console.error("Chat error:", errorMsg);
const errorMessage: StoredMessage = {
id: randomString(),
role: "assistant",
content: errorMsg,
createdAt: new Date().toISOString(),
type: "error"
};
const finalMessages = [...newMessages, errorMessage];
setMessages(finalMessages);
setStreamingContent("");
setStreamingThinking("");
setIsStreaming(false);
setToolActivity(null);
},
onDone: () => {
const finalNewMessages: StoredMessage[] = [];
if (thinkingContent) {
finalNewMessages.push({
id: randomString(),
role: "assistant",
content: thinkingContent,
createdAt: new Date().toISOString(),
type: "thinking"
});
}
if (contentBlocks.length > 0) {
finalNewMessages.push({
id: randomString(),
role: "assistant",
content: contentBlocks,
createdAt: new Date().toISOString(),
citations: citations.length > 0 ? citations : undefined,
usage
});
}
if (finalNewMessages.length > 0) {
const allMessages = [...newMessages, ...finalNewMessages];
setMessages(allMessages);
}
setStreamingContent("");
setStreamingThinking("");
setPendingCitations([]);
setIsStreaming(false);
setToolActivity(null);
}
}
);
}, [input, isStreaming, messages, selectedModel, enableWebSearch, enableNoteTools, enableExtendedThinking, contextNoteId, supportsExtendedThinking, setMessages]);
const handleKeyDown = useCallback((e: KeyboardEvent) => {
if (e.key === "Enter" && !e.shiftKey) {
e.preventDefault();
handleSubmit(e);
}
}, [handleSubmit]);
return {
// State
messages,
input,
isStreaming,
streamingContent,
streamingThinking,
toolActivity,
pendingCitations,
availableModels,
selectedModel,
enableWebSearch,
enableNoteTools,
enableExtendedThinking,
contextNoteId,
lastPromptTokens,
messagesEndRef,
textareaRef,
hasProvider,
isCheckingProvider,
// Setters
setInput,
setMessages,
setSelectedModel,
setEnableWebSearch,
setEnableNoteTools,
setEnableExtendedThinking,
setContextNoteId,
setChatNoteId,
// Actions
handleSubmit,
handleKeyDown,
loadFromContent,
getContent,
clearMessages,
refreshModels
};
}
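The ref-mirroring in the hook above (`messagesRef.current = messages` on every render, so `getContent` reads fresh values) guards against stale closures. The underlying mechanism can be shown without Preact; the names here are stand-ins, not the real hooks API:

```typescript
// A closure captures a binding's value at creation time; a mutable ref
// object always exposes the latest value. This mimics the messagesRef pattern.
type Ref<T> = { current: T };

let messages: string[] = [];
const messagesRef: Ref<string[]> = { current: messages };

// Created once (like a useCallback with stable deps): the snapshot is stale,
// the ref read happens at call time and stays fresh.
const staleSnapshot = messages;              // what a stale closure would see
const getViaRef = () => messagesRef.current; // what the ref pattern sees

// Simulate a state update plus the per-render ref sync.
messages = [...messages, "hello"];
messagesRef.current = messages;
```

This is why `getContent` can keep a stable identity (only `supportsExtendedThinking` in its dependency list) yet still serialize the latest messages and toggles.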


@@ -0,0 +1,104 @@
import { useCallback, useMemo, useState } from "preact/hooks";
import { t } from "../../../services/i18n";
import Button from "../../react/Button";
import OptionsSection from "./components/OptionsSection";
import AddProviderModal, { type LlmProviderConfig, PROVIDER_TYPES } from "./llm/AddProviderModal";
import ActionButton from "../../react/ActionButton";
import dialog from "../../../services/dialog";
import { useTriliumOption } from "../../react/hooks";
export default function LlmSettings() {
const [providersJson, setProvidersJson] = useTriliumOption("llmProviders");
const providers = useMemo<LlmProviderConfig[]>(() => {
try {
return providersJson ? JSON.parse(providersJson) : [];
} catch {
return [];
}
}, [providersJson]);
const setProviders = useCallback((newProviders: LlmProviderConfig[]) => {
setProvidersJson(JSON.stringify(newProviders));
}, [setProvidersJson]);
const [showAddModal, setShowAddModal] = useState(false);
const handleAddProvider = useCallback((newProvider: LlmProviderConfig) => {
setProviders([...providers, newProvider]);
}, [providers, setProviders]);
const handleDeleteProvider = useCallback(async (providerId: string, providerName: string) => {
if (!(await dialog.confirm(t("llm.delete_provider_confirmation", { name: providerName })))) {
return;
}
setProviders(providers.filter(p => p.id !== providerId));
}, [providers, setProviders]);
return (
<OptionsSection title={t("llm.settings_title")}>
<p>{t("llm.settings_description")}</p>
<Button
size="small"
icon="bx bx-plus"
text={t("llm.add_provider")}
onClick={() => setShowAddModal(true)}
/>
<hr />
<h5>{t("llm.configured_providers")}</h5>
<ProviderList
providers={providers}
onDelete={handleDeleteProvider}
/>
<AddProviderModal
show={showAddModal}
onHidden={() => setShowAddModal(false)}
onSave={handleAddProvider}
/>
</OptionsSection>
);
}
interface ProviderListProps {
providers: LlmProviderConfig[];
onDelete: (providerId: string, providerName: string) => Promise<void>;
}
function ProviderList({ providers, onDelete }: ProviderListProps) {
if (!providers.length) {
return <div>{t("llm.no_providers_configured")}</div>;
}
return (
<div style={{ overflow: "auto" }}>
<table className="table table-striped">
<thead>
<tr>
<th>{t("llm.provider_name")}</th>
<th>{t("llm.provider_type")}</th>
<th>{t("llm.actions")}</th>
</tr>
</thead>
<tbody>
{providers.map((provider) => {
const providerType = PROVIDER_TYPES.find(p => p.id === provider.provider);
return (
<tr key={provider.id}>
<td>{provider.name}</td>
<td>{providerType?.name || provider.provider}</td>
<td>
<ActionButton
icon="bx bx-trash"
text={t("llm.delete_provider")}
onClick={() => onDelete(provider.id, provider.name)}
/>
</td>
</tr>
);
})}
</tbody>
</table>
</div>
);
}
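The settings component above stores the provider list as a JSON string option and parses it defensively. That parse-with-fallback can be isolated as a small helper (the sample provider entry is a dummy, including the placeholder API key):

```typescript
// Mirrors the useMemo in LlmSettings: a missing or corrupt option value
// degrades to an empty provider list instead of throwing during render.
interface LlmProviderConfig { id: string; name: string; provider: string; apiKey: string }

function parseProviders(json: string | null): LlmProviderConfig[] {
    try {
        return json ? JSON.parse(json) : [];
    } catch {
        return [];
    }
}

const ok = parseProviders('[{"id":"anthropic_1","name":"Anthropic","provider":"anthropic","apiKey":"sk-placeholder"}]');
const broken = parseProviders("{not json");
const empty = parseProviders(null);
```

Writes go through the symmetric `JSON.stringify` in `setProviders`, so the option always holds either valid JSON or nothing.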


@@ -0,0 +1,106 @@
import { createPortal } from "preact/compat";
import { useState, useRef } from "preact/hooks";
import Modal from "../../../react/Modal";
import FormGroup from "../../../react/FormGroup";
import FormSelect from "../../../react/FormSelect";
import FormTextBox from "../../../react/FormTextBox";
import { t } from "../../../../services/i18n";
export interface LlmProviderConfig {
id: string;
name: string;
provider: string;
apiKey: string;
}
export interface ProviderType {
id: string;
name: string;
}
export const PROVIDER_TYPES: ProviderType[] = [
{ id: "anthropic", name: "Anthropic" }
];
interface AddProviderModalProps {
show: boolean;
onHidden: () => void;
onSave: (provider: LlmProviderConfig) => void;
}
export default function AddProviderModal({ show, onHidden, onSave }: AddProviderModalProps) {
const [selectedProvider, setSelectedProvider] = useState(PROVIDER_TYPES[0].id);
const [apiKey, setApiKey] = useState("");
const formRef = useRef<HTMLFormElement>(null);
function handleSubmit() {
if (!apiKey.trim()) {
return;
}
const providerType = PROVIDER_TYPES.find(p => p.id === selectedProvider);
const newProvider: LlmProviderConfig = {
id: `${selectedProvider}_${Date.now()}`,
name: providerType?.name || selectedProvider,
provider: selectedProvider,
apiKey: apiKey.trim()
};
onSave(newProvider);
resetForm();
onHidden();
}
function resetForm() {
setSelectedProvider(PROVIDER_TYPES[0].id);
setApiKey("");
}
function handleCancel() {
resetForm();
onHidden();
}
return createPortal(
<Modal
show={show}
onHidden={handleCancel}
onSubmit={handleSubmit}
formRef={formRef}
title={t("llm.add_provider_title")}
className="add-provider-modal"
size="md"
footer={
<>
<button type="button" className="btn btn-secondary" onClick={handleCancel}>
{t("llm.cancel")}
</button>
<button type="submit" className="btn btn-primary" disabled={!apiKey.trim()}>
{t("llm.add_provider")}
</button>
</>
}
>
<FormGroup name="provider-type" label={t("llm.provider_type")}>
<FormSelect
values={PROVIDER_TYPES}
keyProperty="id"
titleProperty="name"
currentValue={selectedProvider}
onChange={setSelectedProvider}
/>
</FormGroup>
<FormGroup name="api-key" label={t("llm.api_key")}>
<FormTextBox
type="password"
currentValue={apiKey}
onChange={setApiKey}
placeholder={t("llm.api_key_placeholder")}
autoFocus
/>
</FormGroup>
</Modal>,
document.body
);
}


@@ -15,6 +15,7 @@
"start-no-dir": "cross-env TRILIUM_PORT=37743 tsx ../../scripts/electron-start.mts src/main.ts",
"build": "tsx scripts/build.ts",
"start-prod": "pnpm build && cross-env TRILIUM_DATA_DIR=data TRILIUM_PORT=37841 ELECTRON_IS_DEV=0 electron dist",
"start-prod-no-dir": "pnpm build && cross-env TRILIUM_PORT=37841 ELECTRON_IS_DEV=0 electron dist",
"electron-forge:make": "pnpm build && electron-forge make dist",
"electron-forge:make-flatpak": "pnpm build && DEBUG=* electron-forge make dist --targets=@electron-forge/maker-flatpak",
"electron-forge:package": "pnpm build && electron-forge package dist",


@@ -30,6 +30,8 @@
"proxy-nginx-subdir": "docker run --name trilium-nginx-subdir --rm --network=host -v ./docker/nginx.conf:/etc/nginx/conf.d/default.conf:ro nginx:latest"
},
"dependencies": {
"@ai-sdk/anthropic": "^2.0.0",
"ai": "^5.0.0",
"better-sqlite3": "12.8.0",
"html-to-text": "9.0.5",
"node-html-parser": "7.1.0",


@@ -55,7 +55,16 @@ export default async function buildApp() {
});
if (!utils.isElectron) {
- app.use(compression()); // HTTP compression
+ app.use(compression({
+ // Skip compression for SSE endpoints to enable real-time streaming
+ filter: (req, res) => {
+ // Skip compression for LLM chat streaming endpoint
+ if (req.path === "/api/llm-chat/stream") {
+ return false;
+ }
+ return compression.filter(req, res);
+ }
+ }));
}
let resourcePolicy = config["Network"]["corsResourcePolicy"] as 'same-origin' | 'same-site' | 'cross-origin' | undefined;
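The filter in that hunk matters because `compression()` buffers response bodies, which would hold SSE chunks back until the stream ends. The routing decision reduces to a predicate on the request path; the endpoint string is from the hunk, the standalone function is illustrative:

```typescript
// compression() buffers output to compress it in batches; for Server-Sent
// Events that defeats the point, since each chunk must flush immediately.
function shouldCompress(path: string): boolean {
    if (path === "/api/llm-chat/stream") {
        return false; // leave the SSE stream uncompressed so chunks flush as produced
    }
    return true; // the real filter defers to compression.filter(req, res) here
}
```

Everything else keeps HTTP compression exactly as before; only the one streaming route opts out.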

View File

@@ -297,7 +297,8 @@
},
"quarterNumber": "Quarter {quarterNumber}",
"special_notes": {
"search_prefix": "Search:"
"search_prefix": "Search:",
"llm_chat_prefix": "Chat:"
},
"test_sync": {
"not-configured": "Sync server host is not configured. Please configure sync first.",
@@ -308,6 +309,7 @@
"search-history-title": "Search History",
"note-map-title": "Note Map",
"sql-console-history-title": "SQL Console History",
"llm-chat-history-title": "AI Chat History",
"shared-notes-title": "Shared Notes",
"bulk-action-title": "Bulk Action",
"backend-log-title": "Backend Log",
@@ -351,11 +353,13 @@
"sync-title": "Sync",
"other": "Other",
"advanced-title": "Advanced",
"llm-title": "AI / LLM",
"visible-launchers-title": "Visible Launchers",
"user-guide": "User Guide",
"localization": "Language & Region",
"inbox-title": "Inbox",
"tab-switcher-title": "Tab Switcher"
"tab-switcher-title": "Tab Switcher",
"sidebar-chat-title": "AI Chat"
},
"notes": {
"new-note": "New note",

View File

@@ -17,6 +17,11 @@ export declare module "express-serve-static-core" {
"user-agent"?: string;
};
}
interface Response {
/** Set to true to prevent apiResultHandler from double-handling the response (e.g., for SSE streams) */
triliumResponseHandled?: boolean;
}
}
export declare module "express-session" {

View File
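The `triliumResponseHandled` flag declared above lets a route that streams its own body (such as the SSE chat endpoint) opt out of the generic result handler. A minimal sketch of that guard, with simplified types rather than the actual Express/Trilium ones:

```typescript
// Simplified sketch: a route marks the response as handled (e.g. after an SSE
// stream), and the generic result handler then skips writing a second body.
interface SketchResponse {
    triliumResponseHandled?: boolean;
    body?: string;
}

function handleResult(res: SketchResponse, result: unknown): void {
    if (res.triliumResponseHandled) {
        return; // the route already streamed its own response
    }
    res.body = JSON.stringify(result);
}

const sse: SketchResponse = { triliumResponseHandled: true };
handleResult(sse, { ok: true });
// sse.body stays undefined — no double write

const plain: SketchResponse = {};
handleResult(plain, { ok: true });
// plain.body === '{"ok":true}'
```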

@@ -0,0 +1,109 @@
import type { LlmMessage } from "@triliumnext/commons";
import type { Request, Response } from "express";
import { generateChatTitle } from "../../services/llm/chat_title.js";
import { getProviderByType, hasConfiguredProviders, type LlmProviderConfig } from "../../services/llm/index.js";
import { streamToChunks } from "../../services/llm/stream.js";
import log from "../../services/log.js";
import { safeExtractMessageAndStackFromError } from "../../services/utils.js";
interface ChatRequest {
messages: LlmMessage[];
config?: LlmProviderConfig;
}
/**
* SSE endpoint for streaming chat completions.
*
* Response format (Server-Sent Events):
* data: {"type":"text","content":"Hello"}
* data: {"type":"text","content":" world"}
* data: {"type":"done"}
*
* On error:
* data: {"type":"error","error":"Error message"}
*/
async function streamChat(req: Request, res: Response) {
const { messages, config = {} } = req.body as ChatRequest;
if (!messages || !Array.isArray(messages) || messages.length === 0) {
res.status(400).json({ error: "messages array is required" });
return;
}
// Set up SSE headers - disable compression and buffering for real-time streaming
res.setHeader("Content-Type", "text/event-stream");
res.setHeader("Cache-Control", "no-cache, no-transform");
res.setHeader("Connection", "keep-alive");
res.setHeader("X-Accel-Buffering", "no"); // Disable nginx buffering
res.flushHeaders();
// Mark response as handled to prevent double-handling by apiResultHandler
res.triliumResponseHandled = true;
// Type assertion for flush method (available when compression is used)
const flushableRes = res as Response & { flush?: () => void };
try {
if (!hasConfiguredProviders()) {
res.write(`data: ${JSON.stringify({ type: "error", error: "No LLM providers configured. Please add a provider in Options → AI / LLM." })}\n\n`);
return;
}
const provider = getProviderByType(config.provider || "anthropic");
const result = provider.chat(messages, config);
// Get pricing and display name for the model
const modelId = config.model || provider.getAvailableModels().find(m => m.isDefault)?.id;
if (!modelId) {
res.write(`data: ${JSON.stringify({ type: "error", error: "No model specified and no default model available for the provider." })}\n\n`);
return;
}
const pricing = provider.getModelPricing(modelId);
const modelDisplayName = provider.getAvailableModels().find(m => m.id === modelId)?.name || modelId;
for await (const chunk of streamToChunks(result, { model: modelDisplayName, pricing })) {
res.write(`data: ${JSON.stringify(chunk)}\n\n`);
// Flush immediately to ensure real-time streaming
if (typeof flushableRes.flush === "function") {
flushableRes.flush();
}
}
// Auto-generate a title for the chat note on the first user message
const userMessages = messages.filter(m => m.role === "user");
if (userMessages.length === 1 && config.chatNoteId) {
try {
await generateChatTitle(config.chatNoteId, userMessages[0].content);
} catch (err) {
// Title generation is best-effort; don't fail the chat
log.error(`Failed to generate chat title: ${safeExtractMessageAndStackFromError(err)}`);
}
}
} catch (error) {
const errorMessage = error instanceof Error ? error.message : "Unknown error";
res.write(`data: ${JSON.stringify({ type: "error", error: errorMessage })}\n\n`);
} finally {
res.end();
}
}
/**
* Get available models for a provider.
*/
function getModels(req: Request, _res: Response) {
const providerType = req.query.provider as string || "anthropic";
// Return empty array when no providers configured - client handles this gracefully
if (!hasConfiguredProviders()) {
return { models: [] };
}
const llmProvider = getProviderByType(providerType);
const models = llmProvider.getAvailableModels();
return { models };
}
export default {
streamChat,
getModels
};

View File
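The SSE wire format documented in `streamChat` above can be consumed with a small parser. This sketch assumes complete events have been buffered as `data: <json>` lines separated by blank lines, and shows only the split-and-parse step, without any network code:

```typescript
// Parse a buffered SSE payload in the format emitted by streamChat above:
// one "data: <json>" line per event, events separated by blank lines.
interface ChatChunk {
    type: string;
    content?: string;
    error?: string;
}

function parseSseEvents(payload: string): ChatChunk[] {
    return payload
        .split("\n\n")
        .map(block => block.trim())
        .filter(block => block.startsWith("data: "))
        .map(block => JSON.parse(block.slice("data: ".length)) as ChatChunk);
}

const chunks = parseSseEvents(
    'data: {"type":"text","content":"Hello"}\n\n' +
    'data: {"type":"text","content":" world"}\n\n' +
    'data: {"type":"done"}\n\n'
);
const text = chunks.filter(c => c.type === "text").map(c => c.content).join("");
// text === "Hello world"; the final chunk has type "done"
```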

@@ -104,7 +104,8 @@ const ALLOWED_OPTIONS = new Set<OptionNames>([
"experimentalFeatures",
"newLayout",
"mfaEnabled",
"mfaMethod"
"mfaMethod",
"llmProviders"
]);
function getOptions() {

View File

@@ -86,6 +86,29 @@ function createSearchNote(req: Request) {
return specialNotesService.createSearchNote(searchString, ancestorNoteId);
}
function createLlmChat() {
return specialNotesService.createLlmChat();
}
function getMostRecentLlmChat() {
const chat = specialNotesService.getMostRecentLlmChat();
// Return null explicitly if no chat found (not undefined)
return chat || null;
}
function getOrCreateLlmChat() {
return specialNotesService.getOrCreateLlmChat();
}
function getRecentLlmChats(req: Request) {
const limit = parseInt(req.query.limit as string, 10) || 10;
return specialNotesService.getRecentLlmChats(limit);
}
function saveLlmChat(req: Request) {
return specialNotesService.saveLlmChat(req.body.llmChatNoteId);
}
function getHoistedNote() {
return becca.getNote(cls.getHoistedNoteId());
}
@@ -119,6 +142,11 @@ export default {
saveSqlConsole,
createSearchNote,
saveSearchNote,
createLlmChat,
getMostRecentLlmChat,
getOrCreateLlmChat,
getRecentLlmChats,
saveLlmChat,
createLauncher,
resetLauncher,
createOrUpdateScriptLauncherFromApi

View File
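One detail worth noting in `getRecentLlmChats` above: `parseInt(...) || 10` falls back to 10 for a missing, non-numeric, or zero limit, because both `NaN` and `0` are falsy. A standalone sketch of that behaviour:

```typescript
// Mirrors the limit parsing in getRecentLlmChats above.
const parseLimit = (raw?: string): number => parseInt(raw ?? "", 10) || 10;

parseLimit("5");       // 5
parseLimit("abc");     // 10 (parseInt yields NaN, which is falsy)
parseLimit(undefined); // 10
parseLimit("0");       // 10 (0 is falsy too, so an explicit limit of 0 is ignored)
```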

@@ -115,6 +115,7 @@ class FakeResponse extends EventEmitter implements Pick<Response<any, Record<str
}
json(obj) {
this.respHeaders["Content-Type"] = "application/json";
this.send(JSON.stringify(obj));
return this as unknown as MockedResponse;
}

View File

@@ -145,7 +145,7 @@ function internalRoute<P extends ParamsDictionary>(method: HttpMethod, path: str
function handleResponse(resultHandler: ApiResultHandler, req: express.Request, res: express.Response, result: unknown, start: number) {
// Skip result handling if the response has already been handled
if ((res as any).triliumResponseHandled) {
if (res.triliumResponseHandled) {
// Just log the request without additional processing
log.request(req, res, Date.now() - start, 0);
return;
@@ -161,7 +161,7 @@ function handleException(e: unknown | Error, method: HttpMethod, path: string, r
log.error(`${method} ${path} threw exception: '${errMessage}', stack: ${errStack}`);
// Skip sending response if it's already been handled by the route handler
if ((res as unknown as { triliumResponseHandled?: boolean }).triliumResponseHandled || res.headersSent) {
if (res.triliumResponseHandled || res.headersSent) {
return;
}

View File

@@ -34,6 +34,7 @@ import fontsRoute from "./api/fonts.js";
import imageRoute from "./api/image.js";
import importRoute from "./api/import.js";
import keysRoute from "./api/keys.js";
import llmChatRoute from "./api/llm_chat.js";
import loginApiRoute from "./api/login.js";
import metricsRoute from "./api/metrics.js";
import noteMapRoute from "./api/note_map.js";
@@ -291,6 +292,11 @@ function register(app: express.Application) {
asyncApiRoute(PST, "/api/special-notes/save-sql-console", specialNotesRoute.saveSqlConsole);
apiRoute(PST, "/api/special-notes/search-note", specialNotesRoute.createSearchNote);
apiRoute(PST, "/api/special-notes/save-search-note", specialNotesRoute.saveSearchNote);
apiRoute(PST, "/api/special-notes/llm-chat", specialNotesRoute.createLlmChat);
apiRoute(GET, "/api/special-notes/most-recent-llm-chat", specialNotesRoute.getMostRecentLlmChat);
apiRoute(GET, "/api/special-notes/get-or-create-llm-chat", specialNotesRoute.getOrCreateLlmChat);
apiRoute(GET, "/api/special-notes/recent-llm-chats", specialNotesRoute.getRecentLlmChats);
apiRoute(PST, "/api/special-notes/save-llm-chat", specialNotesRoute.saveLlmChat);
apiRoute(PST, "/api/special-notes/launchers/:noteId/reset", specialNotesRoute.resetLauncher);
apiRoute(PST, "/api/special-notes/launchers/:parentNoteId/:launcherType", specialNotesRoute.createLauncher);
apiRoute(PUT, "/api/special-notes/api-script-launcher", specialNotesRoute.createOrUpdateScriptLauncherFromApi);
@@ -323,6 +329,10 @@ function register(app: express.Application) {
apiRoute(PST, "/api/script/bundle/:noteId", scriptRoute.getBundle);
apiRoute(GET, "/api/script/relation/:noteId/:relationName", scriptRoute.getRelationBundles);
// LLM chat endpoints
asyncRoute(PST, "/api/llm-chat/stream", [auth.checkApiAuthOrElectron, csrfMiddleware], llmChatRoute.streamChat, null);
apiRoute(GET, "/api/llm-chat/models", llmChatRoute.getModels);
// no CSRF since this is called from the Android app
route(PST, "/api/sender/login", [loginRateLimiter], loginApiRoute.token, apiResultHandler);
asyncRoute(PST, "/api/sender/image", [auth.checkEtapiToken, uploadMiddlewareWithErrorHandling], senderRoute.uploadImage, apiResultHandler);

View File

@@ -66,6 +66,12 @@ function buildHiddenSubtreeDefinition(helpSubtree: HiddenSubtreeItem[]): HiddenS
type: "doc",
icon: "bx-data"
},
{
id: "_llmChat",
title: t("hidden-subtree.llm-chat-history-title"),
type: "doc",
icon: "bx-message-square-dots"
},
{
id: "_share",
title: t("hidden-subtree.shared-notes-title"),
@@ -247,6 +253,7 @@ function buildHiddenSubtreeDefinition(helpSubtree: HiddenSubtreeItem[]): HiddenS
{ id: "_optionsEtapi", title: t("hidden-subtree.etapi-title"), type: "contentWidget", icon: "bx-extension" },
{ id: "_optionsBackup", title: t("hidden-subtree.backup-title"), type: "contentWidget", icon: "bx-data" },
{ id: "_optionsSync", title: t("hidden-subtree.sync-title"), type: "contentWidget", icon: "bx-wifi" },
{ id: "_optionsLlm", title: t("hidden-subtree.llm-title"), type: "contentWidget", icon: "bx-bot" },
{ id: "_optionsAi", title: "AI Chat", type: "contentWidget", enforceDeleted: true },
{ id: "_optionsOther", title: t("hidden-subtree.other"), type: "contentWidget", icon: "bx-dots-horizontal" },
{ id: "_optionsLocalization", title: t("hidden-subtree.localization"), type: "contentWidget", icon: "bx-world" },

View File

@@ -78,6 +78,13 @@ export default function buildLaunchBarConfig() {
type: "launcher",
command: "toggleZenMode",
icon: "bx bxs-yin-yang"
},
{
id: "_lbSidebarChat",
title: t("hidden-subtree.sidebar-chat-title"),
type: "launcher",
builtinWidget: "sidebarChat",
icon: "bx bx-message-square-dots"
}
];

View File

@@ -0,0 +1,37 @@
import becca from "../../becca/becca.js";
import { getProvider } from "./index.js";
import log from "../log.js";
import { t } from "i18next";
/** Whether the title is still a default one, i.e. the note hasn't been manually renamed. */
function hasDefaultTitle(title: string): boolean {
// "Chat: <timestamp>" from sidebar/API-created chats
const chatPrefix = t("special_notes.llm_chat_prefix");
// "New note" from manually created chats
const newNoteTitle = t("notes.new-note");
return title.startsWith(chatPrefix) || title === newNoteTitle;
}
/**
* Generate a short descriptive title for a chat note based on the first user message,
* then rename the note. Only renames if the note still has a default title.
*/
export async function generateChatTitle(chatNoteId: string, firstMessage: string): Promise<void> {
const note = becca.getNote(chatNoteId);
if (!note) {
return;
}
if (!hasDefaultTitle(note.title)) {
return;
}
const provider = getProvider();
const title = await provider.generateTitle(firstMessage);
if (title) {
note.title = title;
note.save();
log.info(`Auto-renamed chat note ${chatNoteId} to "${title}"`);
}
}

View File
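The rename guard above depends only on two strings. With the prefixes hardcoded to their English values ("Chat:" and "New note" — the real code resolves them through i18n), it behaves like:

```typescript
// Hardcoded sketch of hasDefaultTitle above; the real code reads these
// strings from the i18n keys special_notes.llm_chat_prefix and notes.new-note.
function hasDefaultTitleSketch(title: string): boolean {
    return title.startsWith("Chat:") || title === "New note";
}

hasDefaultTitleSketch("Chat: 2026-03-30 18:52"); // true → safe to auto-rename
hasDefaultTitleSketch("New note");               // true
hasDefaultTitleSketch("Groceries plan");         // false → user renamed it, keep as-is
```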

@@ -0,0 +1,105 @@
import type { LlmProvider } from "./types.js";
import { AnthropicProvider } from "./providers/anthropic.js";
import optionService from "../options.js";
import log from "../log.js";
/**
* Configuration for a single LLM provider instance.
* This matches the structure stored in the llmProviders option.
*/
export interface LlmProviderSetup {
id: string;
name: string;
provider: string;
apiKey: string;
}
/** Factory functions for creating provider instances */
const providerFactories: Record<string, (apiKey: string) => LlmProvider> = {
anthropic: (apiKey) => new AnthropicProvider(apiKey)
};
/** Cache of instantiated providers by their config ID */
let cachedProviders: Record<string, LlmProvider> = {};
/**
* Get configured providers from the options.
*/
function getConfiguredProviders(): LlmProviderSetup[] {
try {
const providersJson = optionService.getOptionOrNull("llmProviders");
if (!providersJson) {
return [];
}
return JSON.parse(providersJson) as LlmProviderSetup[];
} catch (e) {
log.error(`Failed to parse llmProviders option: ${e}`);
return [];
}
}
/**
* Get a provider instance by its configuration ID.
* If no ID is provided, returns the first configured provider.
*/
export function getProvider(providerId?: string): LlmProvider {
const configs = getConfiguredProviders();
if (configs.length === 0) {
throw new Error("No LLM providers configured. Please add a provider in Options → AI / LLM.");
}
// Find the requested provider or use the first one
const config = providerId
? configs.find(c => c.id === providerId)
: configs[0];
if (!config) {
throw new Error(`LLM provider not found: ${providerId}`);
}
// Check cache
if (cachedProviders[config.id]) {
return cachedProviders[config.id];
}
// Create new provider instance
const factory = providerFactories[config.provider];
if (!factory) {
throw new Error(`Unknown LLM provider type: ${config.provider}. Available: ${Object.keys(providerFactories).join(", ")}`);
}
const provider = factory(config.apiKey);
cachedProviders[config.id] = provider;
return provider;
}
/**
* Get the first configured provider of a specific type (e.g., "anthropic").
*/
export function getProviderByType(providerType: string): LlmProvider {
const configs = getConfiguredProviders();
const config = configs.find(c => c.provider === providerType);
if (!config) {
throw new Error(`No ${providerType} provider configured. Please add one in Options → AI / LLM.`);
}
return getProvider(config.id);
}
/**
* Check if any providers are configured.
*/
export function hasConfiguredProviders(): boolean {
return getConfiguredProviders().length > 0;
}
/**
* Clear the provider cache. Call this when provider configurations change.
*/
export function clearProviderCache(): void {
cachedProviders = {};
}
export type { LlmProvider, LlmProviderConfig, ModelInfo, ModelPricing } from "./types.js";

View File
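The `llmProviders` option parsed above is a JSON array of `LlmProviderSetup` objects. A minimal sketch of the lookup `getProviderByType` performs over that array (the sample data is illustrative; the `apiKey` is a placeholder, not a real key):

```typescript
interface LlmProviderSetup {
    id: string;
    name: string;
    provider: string;
    apiKey: string;
}

// Example value of the llmProviders option (placeholder key, illustrative only).
const raw = '[{"id":"p1","name":"Work","provider":"anthropic","apiKey":"sk-placeholder"}]';
const configs = JSON.parse(raw) as LlmProviderSetup[];

// Same lookup as getProviderByType: first config matching the provider type.
const anthropic = configs.find(c => c.provider === "anthropic");
// anthropic?.id === "p1"
const openai = configs.find(c => c.provider === "openai");
// openai === undefined → the real code throws "No openai provider configured..."
```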

@@ -0,0 +1,244 @@
import { createAnthropic, type AnthropicProvider as AnthropicSDKProvider } from "@ai-sdk/anthropic";
import { generateText, streamText, stepCountIs, type CoreMessage, type ToolSet } from "ai";
import type { LlmMessage } from "@triliumnext/commons";
import becca from "../../../becca/becca.js";
import { noteTools, attributeTools, currentNoteTools } from "../tools/index.js";
import type { LlmProvider, LlmProviderConfig, ModelInfo, ModelPricing, StreamResult } from "../types.js";
const DEFAULT_MODEL = "claude-sonnet-4-6";
const DEFAULT_MAX_TOKENS = 8096;
const TITLE_MODEL = "claude-haiku-4-5-20251001";
const TITLE_MAX_TOKENS = 30;
/**
* Calculate effective cost for comparison (weighted average: 1 input + 3 output).
* Output is weighted more heavily as it's typically the dominant cost factor.
*/
function effectiveCost(pricing: ModelPricing): number {
return (pricing.input + 3 * pricing.output) / 4;
}
/**
* Available Anthropic models with pricing (USD per million tokens).
* Source: https://docs.anthropic.com/en/docs/about-claude/models
*/
const BASE_MODELS: Omit<ModelInfo, "costMultiplier">[] = [
// ===== Current Models =====
{
id: "claude-sonnet-4-6",
name: "Claude Sonnet 4.6",
pricing: { input: 3, output: 15 },
contextWindow: 1000000,
isDefault: true
},
{
id: "claude-opus-4-6",
name: "Claude Opus 4.6",
pricing: { input: 5, output: 25 },
contextWindow: 1000000
},
{
id: "claude-haiku-4-5-20251001",
name: "Claude Haiku 4.5",
pricing: { input: 1, output: 5 },
contextWindow: 200000
},
// ===== Legacy Models =====
{
id: "claude-sonnet-4-5-20250929",
name: "Claude Sonnet 4.5",
pricing: { input: 3, output: 15 },
contextWindow: 200000, // 1M available with beta header
isLegacy: true
},
{
id: "claude-opus-4-5-20251101",
name: "Claude Opus 4.5",
pricing: { input: 5, output: 25 },
contextWindow: 200000,
isLegacy: true
},
{
id: "claude-opus-4-1-20250805",
name: "Claude Opus 4.1",
pricing: { input: 15, output: 75 },
contextWindow: 200000,
isLegacy: true
},
{
id: "claude-sonnet-4-20250514",
name: "Claude Sonnet 4.0",
pricing: { input: 3, output: 15 },
contextWindow: 200000, // 1M available with beta header
isLegacy: true
},
{
id: "claude-opus-4-20250514",
name: "Claude Opus 4.0",
pricing: { input: 15, output: 75 },
contextWindow: 200000,
isLegacy: true
}
];
// Use default model (Sonnet) as baseline for cost multiplier
const baselineModel = BASE_MODELS.find(m => m.isDefault) || BASE_MODELS[0];
const baselineCost = effectiveCost(baselineModel.pricing);
// Build models with cost multipliers
const AVAILABLE_MODELS: ModelInfo[] = BASE_MODELS.map(m => ({
...m,
costMultiplier: Math.round((effectiveCost(m.pricing) / baselineCost) * 10) / 10
}));
// Build pricing lookup from available models
const MODEL_PRICING: Record<string, ModelPricing> = Object.fromEntries(
AVAILABLE_MODELS.map(m => [m.id, m.pricing])
);
/**
* Build a lightweight context hint about the current note (title + type only, no content).
* The full content is available via the get_current_note tool.
*/
function buildNoteHint(noteId: string): string | null {
const note = becca.getNote(noteId);
if (!note) {
return null;
}
return `The user is currently viewing a ${note.type} note titled "${note.title}". Use the get_current_note tool to read its content if needed.`;
}
export class AnthropicProvider implements LlmProvider {
name = "anthropic";
private anthropic: AnthropicSDKProvider;
constructor(apiKey: string) {
if (!apiKey) {
throw new Error("API key is required for Anthropic provider");
}
this.anthropic = createAnthropic({ apiKey });
}
chat(messages: LlmMessage[], config: LlmProviderConfig): StreamResult {
let systemPrompt = config.systemPrompt || messages.find(m => m.role === "system")?.content;
const chatMessages = messages.filter(m => m.role !== "system");
// Add a lightweight hint about the current note (content available via tool)
if (config.contextNoteId) {
const noteHint = buildNoteHint(config.contextNoteId);
if (noteHint) {
systemPrompt = systemPrompt
? `${systemPrompt}\n\n${noteHint}`
: noteHint;
}
}
// Convert to AI SDK message format with cache control breakpoints.
// The system prompt and conversation history (all but the last user message)
// are stable across turns, so we mark them for caching to reduce costs.
const CACHE_CONTROL = { anthropic: { cacheControl: { type: "ephemeral" as const } } };
const coreMessages: CoreMessage[] = [];
// System prompt as a cacheable message
if (systemPrompt) {
coreMessages.push({
role: "system",
content: systemPrompt,
providerOptions: CACHE_CONTROL
});
}
// Conversation messages
for (let i = 0; i < chatMessages.length; i++) {
const m = chatMessages[i];
const isLastBeforeNewTurn = i === chatMessages.length - 2;
coreMessages.push({
role: m.role as "user" | "assistant",
content: m.content,
// Cache breakpoint on the second-to-last message:
// everything up to here is identical across consecutive turns.
...(isLastBeforeNewTurn && { providerOptions: CACHE_CONTROL })
});
}
const model = this.anthropic(config.model || DEFAULT_MODEL);
// Build options for streamText
const streamOptions: Parameters<typeof streamText>[0] = {
model,
messages: coreMessages,
maxOutputTokens: config.maxTokens || DEFAULT_MAX_TOKENS
};
// Enable extended thinking for deeper reasoning
if (config.enableExtendedThinking) {
const thinkingBudget = config.thinkingBudget || 10000;
streamOptions.providerOptions = {
anthropic: {
thinking: {
type: "enabled",
budgetTokens: thinkingBudget
}
}
};
streamOptions.maxOutputTokens = Math.max(
streamOptions.maxOutputTokens || DEFAULT_MAX_TOKENS,
thinkingBudget + 4000
);
}
// Build tools object
const tools: ToolSet = {};
if (config.enableWebSearch) {
tools.web_search = this.anthropic.tools.webSearch_20250305({
maxUses: 5
});
}
if (config.contextNoteId) {
Object.assign(tools, currentNoteTools(config.contextNoteId));
}
if (config.enableNoteTools) {
Object.assign(tools, noteTools);
Object.assign(tools, attributeTools);
}
if (Object.keys(tools).length > 0) {
streamOptions.tools = tools;
// Allow multiple tool use cycles before final response
streamOptions.stopWhen = stepCountIs(5);
// Let model decide when to use tools vs respond with text
streamOptions.toolChoice = "auto";
}
return streamText(streamOptions);
}
getModelPricing(model: string): ModelPricing | undefined {
return MODEL_PRICING[model];
}
getAvailableModels(): ModelInfo[] {
return AVAILABLE_MODELS;
}
async generateTitle(firstMessage: string): Promise<string> {
const { text } = await generateText({
model: this.anthropic(TITLE_MODEL),
maxOutputTokens: TITLE_MAX_TOKENS,
messages: [
{
role: "user",
content: `Summarize the following message as a very short chat title (max 6 words). Reply with ONLY the title, no quotes or punctuation at the end.\n\nMessage: ${firstMessage}`
}
]
});
return text.trim();
}
}

View File
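The `costMultiplier` derivation above works out as follows; reproducing the arithmetic for a few of the listed models, with pricing taken from the table above:

```typescript
// effectiveCost and the multiplier rounding, copied from the logic above.
const effectiveCost = (p: { input: number; output: number }): number =>
    (p.input + 3 * p.output) / 4;

const baseline = effectiveCost({ input: 3, output: 15 }); // Sonnet 4.6 → (3 + 45) / 4 = 12
const multiplier = (p: { input: number; output: number }): number =>
    Math.round((effectiveCost(p) / baseline) * 10) / 10;

multiplier({ input: 3, output: 15 });  // Sonnet 4.6 → 1.0 (the baseline)
multiplier({ input: 1, output: 5 });   // Haiku 4.5  → 0.3
multiplier({ input: 5, output: 25 });  // Opus 4.6   → 1.7
multiplier({ input: 15, output: 75 }); // Opus 4.1   → 5.0
```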

@@ -0,0 +1,106 @@
/**
* Shared streaming utilities for converting AI SDK streams to SSE chunks.
*/
import type { LlmStreamChunk } from "@triliumnext/commons";
import type { ModelPricing, StreamResult } from "./types.js";
/**
* Calculate estimated cost in USD based on token usage and pricing.
*/
function calculateCost(inputTokens: number, outputTokens: number, pricing?: ModelPricing): number | undefined {
if (!pricing) return undefined;
const inputCost = (inputTokens / 1_000_000) * pricing.input;
const outputCost = (outputTokens / 1_000_000) * pricing.output;
return inputCost + outputCost;
}
export interface StreamOptions {
/** Model identifier for display */
model?: string;
/** Model pricing for cost calculation (from provider) */
pricing?: ModelPricing;
}
/**
* Convert an AI SDK StreamResult to an async iterable of LlmStreamChunk.
* This is provider-agnostic and works with any AI SDK provider.
*/
export async function* streamToChunks(result: StreamResult, options: StreamOptions = {}): AsyncIterable<LlmStreamChunk> {
try {
for await (const part of result.fullStream) {
switch (part.type) {
case "text-delta":
yield { type: "text", content: part.text };
break;
case "reasoning-delta":
yield { type: "thinking", content: part.text };
break;
case "tool-call":
yield {
type: "tool_use",
toolName: part.toolName,
toolInput: part.input as Record<string, unknown>
};
break;
case "tool-result": {
const output = part.output;
const isError = typeof output === "object" && output !== null && "error" in output;
yield {
type: "tool_result",
toolName: part.toolName,
result: typeof output === "string"
? output
: JSON.stringify(output),
isError
};
break;
}
case "source":
// Citation from web search (only URL sources have url property)
if (part.sourceType === "url") {
yield {
type: "citation",
citation: {
url: part.url,
title: part.title
}
};
}
break;
case "error":
yield { type: "error", error: String(part.error) };
break;
}
}
// Get usage information after stream completes
const usage = await result.usage;
if (usage && typeof usage.inputTokens === "number" && typeof usage.outputTokens === "number") {
const cost = calculateCost(usage.inputTokens, usage.outputTokens, options.pricing);
yield {
type: "usage",
usage: {
promptTokens: usage.inputTokens,
completionTokens: usage.outputTokens,
totalTokens: usage.inputTokens + usage.outputTokens,
cost,
model: options.model
}
};
}
yield { type: "done" };
} catch (error) {
const message = error instanceof Error ? error.message : "Unknown error";
yield { type: "error", error: message };
}
}

View File
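The estimate in `calculateCost` above is linear in both token counts. For example, with Sonnet pricing (USD 3 per million input tokens, 15 per million output tokens):

```typescript
// Same arithmetic as calculateCost above, without the optional-pricing handling.
const cost = (
    inputTokens: number,
    outputTokens: number,
    pricing: { input: number; output: number }
): number =>
    (inputTokens / 1_000_000) * pricing.input +
    (outputTokens / 1_000_000) * pricing.output;

// 1k input + 2k output tokens on Sonnet pricing {input: 3, output: 15}:
cost(1_000, 2_000, { input: 3, output: 15 });
// ≈ 0.003 + 0.03 ≈ 0.033 USD
```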

@@ -0,0 +1,137 @@
/**
* LLM tools for attribute operations (get, set, delete labels/relations).
*/
import { tool } from "ai";
import { z } from "zod";
import becca from "../../../becca/becca.js";
import attributeService from "../../attributes.js";
/**
* Get all owned attributes (labels/relations) of a note.
*/
export const getAttributes = tool({
description: "Get all attributes (labels and relations) of a note. Labels store text values; relations link to other notes by ID.",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note")
}),
execute: async ({ noteId }) => {
const note = becca.getNote(noteId);
if (!note) {
return { error: "Note not found" };
}
return note.getOwnedAttributes()
.filter((attr) => !attr.isAutoLink())
.map((attr) => ({
attributeId: attr.attributeId,
type: attr.type,
name: attr.name,
value: attr.value,
isInheritable: attr.isInheritable
}));
}
});
/**
* Get a single attribute by its ID.
*/
export const getAttribute = tool({
description: "Get a single attribute by its ID.",
inputSchema: z.object({
attributeId: z.string().describe("The ID of the attribute")
}),
execute: async ({ attributeId }) => {
const attribute = becca.getAttribute(attributeId);
if (!attribute) {
return { error: "Attribute not found" };
}
return {
attributeId: attribute.attributeId,
noteId: attribute.noteId,
type: attribute.type,
name: attribute.name,
value: attribute.value,
isInheritable: attribute.isInheritable
};
}
});
/**
* Add or update an attribute on a note.
*/
export const setAttribute = tool({
description: "Add or update an attribute on a note. If an attribute with the same type and name exists, it is updated; otherwise a new one is created. Use type 'label' for text values, 'relation' for linking to another note (value must be a noteId).",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note"),
type: z.enum(["label", "relation"]).describe("The attribute type"),
name: z.string().describe("The attribute name"),
value: z.string().optional().describe("The attribute value (for relations, this must be a target noteId)")
}),
execute: async ({ noteId, type, name, value = "" }) => {
const note = becca.getNote(noteId);
if (!note) {
return { error: "Note not found" };
}
if (note.isProtected) {
return { error: "Note is protected and cannot be modified" };
}
if (attributeService.isAttributeDangerous(type, name)) {
return { error: `Attribute '${name}' is potentially dangerous and cannot be set by the LLM` };
}
if (type === "relation" && value && !becca.getNote(value)) {
return { error: "Target note not found for relation" };
}
note.setAttribute(type, name, value);
return {
success: true,
noteId: note.noteId,
type,
name,
value
};
}
});
/**
* Remove an attribute from a note.
*/
export const deleteAttribute = tool({
description: "Remove an attribute from a note by its attribute ID.",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note that owns the attribute"),
attributeId: z.string().describe("The ID of the attribute to delete")
}),
execute: async ({ noteId, attributeId }) => {
const attribute = becca.getAttribute(attributeId);
if (!attribute) {
return { error: "Attribute not found" };
}
if (attribute.noteId !== noteId) {
return { error: "Attribute does not belong to the specified note" };
}
const note = becca.getNote(noteId);
if (note?.isProtected) {
return { error: "Note is protected and cannot be modified" };
}
attribute.markAsDeleted();
return {
success: true,
attributeId
};
}
});
export const attributeTools = {
get_attributes: getAttributes,
get_attribute: getAttribute,
set_attribute: setAttribute,
delete_attribute: deleteAttribute
};

View File

@@ -0,0 +1,7 @@
/**
* LLM tools that wrap existing Trilium services.
* These reuse the same logic as ETAPI without any HTTP overhead.
*/
export { noteTools, currentNoteTools } from "./note_tools.js";
export { attributeTools } from "./attribute_tools.js";

View File

@@ -0,0 +1,246 @@
/**
* LLM tools for note operations (search, read, create, update, append).
*/
import { tool } from "ai";
import { z } from "zod";
import becca from "../../../becca/becca.js";
import markdownExport from "../../export/markdown.js";
import markdownImport from "../../import/markdown.js";
import noteService from "../../notes.js";
import SearchContext from "../../search/search_context.js";
import searchService from "../../search/services/search.js";
/**
* Convert note content to a format suitable for LLM consumption.
* Text notes are converted from HTML to Markdown to reduce token usage.
*/
export function getNoteContentForLlm(note: { type: string; getContent: () => string | Buffer }) {
const content = note.getContent();
if (typeof content !== "string") {
return "[binary content]";
}
if (note.type === "text") {
return markdownExport.toMarkdown(content);
}
return content;
}
/**
* Convert LLM-provided content to a format suitable for storage.
* For text notes, converts Markdown to HTML.
*/
function setNoteContentFromLlm(note: { type: string; title: string; setContent: (content: string) => void }, content: string) {
if (note.type === "text") {
note.setContent(markdownImport.renderToHtml(content, note.title));
} else {
note.setContent(content);
}
}
/**
* Search for notes in the knowledge base.
*/
export const searchNotes = tool({
description: "Search for notes in the user's knowledge base. Returns note metadata including title, type, and IDs.",
inputSchema: z.object({
query: z.string().describe("Search query (supports Trilium search syntax)")
}),
execute: async ({ query }) => {
const searchContext = new SearchContext({});
const results = searchService.findResultsWithQuery(query, searchContext);
return results.slice(0, 10).map(sr => {
const note = becca.notes[sr.noteId];
if (!note) return null;
return {
noteId: note.noteId,
title: note.getTitleOrProtected(),
type: note.type
};
}).filter(Boolean);
}
});
/**
* Read the content of a specific note.
*/
export const readNote = tool({
description: "Read the full content of a note by its ID. Use search_notes first to find relevant note IDs. Text notes are returned as Markdown.",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note to read")
}),
execute: async ({ noteId }) => {
const note = becca.getNote(noteId);
if (!note) {
return { error: "Note not found" };
}
if (!note.isContentAvailable()) {
return { error: "Note is protected" };
}
return {
noteId: note.noteId,
title: note.getTitleOrProtected(),
type: note.type,
content: getNoteContentForLlm(note)
};
}
});
/**
* Update the content of a note.
*/
export const updateNoteContent = tool({
description: "Replace the entire content of a note. Use this to completely rewrite a note's content. For text notes, provide Markdown content.",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note to update"),
content: z.string().describe("The new content for the note (Markdown for text notes, plain text for code notes)")
}),
execute: async ({ noteId, content }) => {
const note = becca.getNote(noteId);
if (!note) {
return { error: "Note not found" };
}
if (!note.isContentAvailable()) {
return { error: "Note is protected and cannot be modified" };
}
if (!note.hasStringContent()) {
return { error: `Cannot update content for note type: ${note.type}` };
}
note.saveRevision();
setNoteContentFromLlm(note, content);
return {
success: true,
noteId: note.noteId,
title: note.getTitleOrProtected()
};
}
});
/**
* Append content to a note.
*/
export const appendToNote = tool({
description: "Append content to the end of an existing note. For text notes, provide Markdown content.",
inputSchema: z.object({
noteId: z.string().describe("The ID of the note to append to"),
content: z.string().describe("The content to append (Markdown for text notes, plain text for code notes)")
}),
execute: async ({ noteId, content }) => {
const note = becca.getNote(noteId);
if (!note) {
return { error: "Note not found" };
}
if (!note.isContentAvailable()) {
return { error: "Note is protected and cannot be modified" };
}
if (!note.hasStringContent()) {
return { error: `Cannot append content for note type: ${note.type}` };
}
const existingContent = note.getContent();
if (typeof existingContent !== "string") {
return { error: "Note has binary content" };
}
let newContent: string;
if (note.type === "text") {
const htmlToAppend = markdownImport.renderToHtml(content, note.getTitleOrProtected());
newContent = existingContent + htmlToAppend;
} else {
newContent = existingContent + (existingContent.endsWith("\n") ? "" : "\n") + content;
}
note.saveRevision();
note.setContent(newContent);
return {
success: true,
noteId: note.noteId,
title: note.getTitleOrProtected()
};
}
});
/**
* Create a new note.
*/
export const createNote = tool({
description: "Create a new note in the user's knowledge base. Returns the created note's ID and title.",
inputSchema: z.object({
parentNoteId: z.string().describe("The ID of the parent note where the new note will be created. Use 'root' for top-level notes."),
title: z.string().describe("The title of the new note"),
content: z.string().describe("The content of the note (Markdown for text notes, plain text for code notes)"),
type: z.enum(["text", "code"]).optional().describe("The type of note to create. Defaults to 'text'.")
}),
execute: async ({ parentNoteId, title, content, type = "text" }) => {
const parentNote = becca.getNote(parentNoteId);
if (!parentNote) {
return { error: "Parent note not found" };
}
if (!parentNote.isContentAvailable()) {
return { error: "Cannot create note under a protected parent" };
}
const htmlContent = type === "text"
? markdownImport.renderToHtml(content, title)
: content;
try {
const { note } = noteService.createNewNote({
parentNoteId,
title,
content: htmlContent,
type
});
return {
success: true,
noteId: note.noteId,
title: note.getTitleOrProtected(),
type: note.type
};
} catch (err) {
return { error: err instanceof Error ? err.message : "Failed to create note" };
}
}
});
/**
* Read the content of the note the user is currently viewing.
* Created dynamically so it captures the contextNoteId.
*/
export function currentNoteTools(contextNoteId: string) {
return {
get_current_note: tool({
description: "Read the content of the note the user is currently viewing. Call this when the user asks about or refers to their current note.",
inputSchema: z.object({}),
execute: async () => {
const note = becca.getNote(contextNoteId);
if (!note) {
return { error: "Note not found" };
}
if (!note.isContentAvailable()) {
return { error: "Note is protected" };
}
return {
noteId: note.noteId,
title: note.getTitleOrProtected(),
type: note.type,
content: getNoteContentForLlm(note)
};
}
})
};
}
export const noteTools = {
search_notes: searchNotes,
read_note: readNote,
update_note_content: updateNoteContent,
append_to_note: appendToNote,
create_note: createNote
};
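Assembled per request, the static `noteTools` map can be merged with the context-scoped tools from `currentNoteTools` via object spread. The sketch below illustrates that merge with stub tool objects; `buildToolSet` is a hypothetical helper name, not part of this diff:

```typescript
// Stubs stand in for the real tool() objects from the file above.
type ToolSet = Record<string, { description: string }>;

const noteTools: ToolSet = {
    search_notes: { description: "search" },
    read_note: { description: "read" }
};

function currentNoteTools(contextNoteId: string): ToolSet {
    return {
        get_current_note: { description: `read note ${contextNoteId}` }
    };
}

// Hypothetical helper: only offer the context tool when the user is
// actually viewing a note, otherwise expose the static tools alone.
function buildToolSet(contextNoteId?: string): ToolSet {
    return {
        ...noteTools,
        ...(contextNoteId ? currentNoteTools(contextNoteId) : {})
    };
}

console.log(Object.keys(buildToolSet("abc123")));
// -> ["search_notes", "read_note", "get_current_note"]
```

Since `currentNoteTools` closes over the note ID, the merged set stays stateless between requests.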

View File

@@ -0,0 +1,80 @@
/**
* Server-specific LLM Provider types.
* Shared types (LlmMessage, LlmCitation, LlmStreamChunk, LlmChatConfig)
* should be imported from @triliumnext/commons.
*/
import type { LlmChatConfig, LlmMessage } from "@triliumnext/commons";
import type { streamText } from "ai";
/**
* Extended provider config with server-specific options.
*/
export interface LlmProviderConfig extends LlmChatConfig {
maxTokens?: number;
temperature?: number;
}
/**
* Result type from streamText - the AI SDK's unified streaming interface.
*/
export type StreamResult = ReturnType<typeof streamText>;
/**
* Pricing per million tokens for a model.
*/
export interface ModelPricing {
/** Cost per million input tokens in USD */
input: number;
/** Cost per million output tokens in USD */
output: number;
}
/**
* Information about an available model.
*/
export interface ModelInfo {
/** Model identifier (e.g., "claude-sonnet-4-20250514") */
id: string;
/** Human-readable name (e.g., "Claude Sonnet 4") */
name: string;
/** Pricing per million tokens */
pricing: ModelPricing;
/** Whether this is the default model */
isDefault?: boolean;
/** Cost multiplier relative to the cheapest model (1x = cheapest) */
costMultiplier?: number;
/** Maximum context window size in tokens */
contextWindow?: number;
/** Whether this is a legacy/older model */
isLegacy?: boolean;
}
export interface LlmProvider {
name: string;
/**
* Create a streaming chat completion.
* Returns the AI SDK StreamResult which is provider-agnostic.
*/
chat(
messages: LlmMessage[],
config: LlmProviderConfig
): StreamResult;
/**
* Get pricing for a model. Returns undefined if pricing is not available.
*/
getModelPricing(model: string): ModelPricing | undefined;
/**
* Get list of available models for this provider.
*/
getAvailableModels(): ModelInfo[];
/**
* Generate a short title summarizing a message.
* Used for auto-renaming chat notes. Should use a fast, cheap model.
*/
generateTitle(firstMessage: string): Promise<string>;
}
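Since `ModelPricing` is expressed per million tokens, a response's dollar cost follows directly from the token counts. A minimal sketch; `estimateCost` is an illustrative helper, not part of the interface above:

```typescript
interface ModelPricing {
    /** Cost per million input tokens in USD */
    input: number;
    /** Cost per million output tokens in USD */
    output: number;
}

// Illustrative helper: convert token counts into an estimated USD cost.
function estimateCost(pricing: ModelPricing, promptTokens: number, completionTokens: number): number {
    return (promptTokens / 1_000_000) * pricing.input
         + (completionTokens / 1_000_000) * pricing.output;
}

// e.g. a model priced at $3 input / $15 output per million tokens:
const cost = estimateCost({ input: 3, output: 15 }, 2_000, 500);
console.log(cost.toFixed(4)); // -> "0.0135"
```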

View File

@@ -15,7 +15,8 @@ const noteTypes = [
{ type: "doc", defaultMime: "" },
{ type: "contentWidget", defaultMime: "" },
{ type: "mindMap", defaultMime: "application/json" },
-    { type: "spreadsheet", defaultMime: "application/json" }
+    { type: "spreadsheet", defaultMime: "application/json" },
+    { type: "llmChat", defaultMime: "application/json" }
];
function getDefaultMimeForNoteType(typeName: string) {

View File

@@ -209,7 +209,10 @@ const defaultOptions: DefaultOption[] = [
]),
isSynced: true
},
-    { name: "experimentalFeatures", value: "[]", isSynced: true }
+    { name: "experimentalFeatures", value: "[]", isSynced: true },
+    // AI / LLM
+    { name: "llmProviders", value: "[]", isSynced: false }
];
/**

View File

@@ -10,7 +10,7 @@ import SearchContext from "./search/search_context.js";
import { LBTPL_NOTE_LAUNCHER, LBTPL_CUSTOM_WIDGET, LBTPL_SPACER, LBTPL_SCRIPT } from "./hidden_subtree.js";
import { t } from "i18next";
import BNote from '../becca/entities/bnote.js';
-import { SaveSearchNoteResponse, SaveSqlConsoleResponse } from "@triliumnext/commons";
+import { SaveSearchNoteResponse, SaveSqlConsoleResponse, SaveLlmChatResponse } from "@triliumnext/commons";
function getInboxNote(date: string) {
const workspaceNote = hoistedNoteService.getWorkspaceNote();
@@ -123,6 +123,114 @@ function saveSearchNote(searchNoteId: string) {
return result satisfies SaveSearchNoteResponse;
}
function createLlmChat() {
const { note } = noteService.createNewNote({
parentNoteId: getMonthlyParentNoteId("_llmChat", "llmChat"),
title: `${t("special_notes.llm_chat_prefix")} ${dateUtils.localNowDateTime()}`,
content: JSON.stringify({
version: 1,
messages: []
}),
type: "llmChat",
mime: "application/json"
});
note.setLabel("iconClass", "bx bx-message-square-dots");
note.setLabel("keepCurrentHoisting");
return note;
}
/**
* Gets the most recently modified LLM chat note.
* Used by sidebar chat to persist conversation across page refreshes.
* Returns null if no chat exists.
*/
function getMostRecentLlmChat() {
// Search for all llmChat notes and return the most recently modified
const results = searchService.searchNotes(
"note.type = llmChat",
new SearchContext({
ancestorNoteId: "_llmChat",
limit: 1,
orderBy: "utcDateModified",
orderDirection: "desc"
})
);
return results.length > 0 ? results[0] : null;
}
/**
* Gets the most recent LLM chat or creates a new one if none exists.
* Used by sidebar chat for persistent conversations.
*/
function getOrCreateLlmChat() {
const existingChat = getMostRecentLlmChat();
if (existingChat) {
return existingChat;
}
return createLlmChat();
}
/**
* Gets a list of recent LLM chat notes.
* Used by sidebar chat history popup.
*/
function getRecentLlmChats(limit: number = 10) {
const results = searchService.searchNotes(
"note.type = llmChat",
new SearchContext({
ancestorNoteId: "_llmChat",
limit,
orderBy: "utcDateModified",
orderDirection: "desc"
})
);
return results.map(note => ({
noteId: note.noteId,
title: note.title,
dateModified: note.utcDateModified
}));
}
function getLlmChatHome() {
const workspaceNote = hoistedNoteService.getWorkspaceNote();
if (!workspaceNote) {
throw new Error("Unable to find workspace note");
}
if (!workspaceNote.isRoot()) {
return workspaceNote.searchNoteInSubtree("#workspaceLlmChatHome") || workspaceNote.searchNoteInSubtree("#llmChatHome") || workspaceNote;
} else {
const today = dateUtils.localNowDate();
return workspaceNote.searchNoteInSubtree("#llmChatHome") || dateNoteService.getDayNote(today);
}
}
function saveLlmChat(llmChatNoteId: string) {
const llmChatNote = becca.getNote(llmChatNoteId);
if (!llmChatNote) {
throw new Error(`Unable to find LLM chat note ID: ${llmChatNoteId}`);
}
const llmChatHome = getLlmChatHome();
const result = llmChatNote.cloneTo(llmChatHome.noteId);
for (const parentBranch of llmChatNote.getParentBranches()) {
if (parentBranch.parentNote?.hasAncestor("_hidden")) {
parentBranch.markAsDeleted();
}
}
return result satisfies SaveLlmChatResponse;
}
function getMonthlyParentNoteId(rootNoteId: string, prefix: string) {
const month = dateUtils.localNowDate().substring(0, 7);
const labelName = `${prefix}MonthNote`;
@@ -282,6 +390,11 @@ export default {
saveSqlConsole,
createSearchNote,
saveSearchNote,
createLlmChat,
getMostRecentLlmChat,
getOrCreateLlmChat,
getRecentLlmChats,
saveLlmChat,
createLauncher,
resetLauncher,
createOrUpdateScriptLauncherFromApi

View File

@@ -13,8 +13,8 @@
"i18next-http-backend": "3.0.2",
"preact": "10.29.0",
"preact-iso": "2.11.1",
-    "preact-render-to-string": "6.6.7",
-    "react-i18next": "17.0.1"
+    "preact-render-to-string": "6.6.6",
+    "react-i18next": "17.0.0"
},
"devDependencies": {
"@preact/preset-vite": "2.10.5",

View File

@@ -18,6 +18,7 @@
"desktop:start": "pnpm run --filter desktop dev",
"desktop:build": "pnpm run --filter desktop build",
"desktop:start-prod": "pnpm run --filter desktop start-prod",
"desktop:start-prod-no-dir": "pnpm run --filter desktop start-prod-no-dir",
"edit-docs:edit-docs": "pnpm run --filter edit-docs edit-docs",
"edit-docs:build": "pnpm run --filter edit-docs build",
"website:start": "pnpm run --filter website dev",
@@ -135,7 +136,8 @@
"lodash@>=4.0.0 <=4.17.22": ">=4.17.23",
"diff@<4.0.4": ">=4.0.4",
"diff@>=6.0.0 <8.0.3": ">=8.0.3",
-      "tar@<7.5.7": ">=7.5.7"
+      "tar@<7.5.7": ">=7.5.7",
+      "zod@<3.25.76": ">=4.0.0"
},
"ignoredBuiltDependencies": [
"sqlite3"

View File

@@ -21,7 +21,7 @@
"ckeditor5-metadata.json"
],
"devDependencies": {
-    "@ckeditor/ckeditor5-dev-build-tools": "55.3.0",
+    "@ckeditor/ckeditor5-dev-build-tools": "55.2.0",
"@ckeditor/ckeditor5-inspector": ">=4.1.0",
"@ckeditor/ckeditor5-package-tools": "5.1.0",
"@typescript-eslint/eslint-plugin": "8.57.2",

View File

@@ -22,7 +22,7 @@
"ckeditor5-metadata.json"
],
"devDependencies": {
-    "@ckeditor/ckeditor5-dev-build-tools": "55.3.0",
+    "@ckeditor/ckeditor5-dev-build-tools": "55.2.0",
"@ckeditor/ckeditor5-inspector": ">=4.1.0",
"@ckeditor/ckeditor5-package-tools": "5.1.0",
"@typescript-eslint/eslint-plugin": "8.57.2",

View File

@@ -24,7 +24,7 @@
"ckeditor5-metadata.json"
],
"devDependencies": {
-    "@ckeditor/ckeditor5-dev-build-tools": "55.3.0",
+    "@ckeditor/ckeditor5-dev-build-tools": "55.2.0",
"@ckeditor/ckeditor5-inspector": ">=4.1.0",
"@ckeditor/ckeditor5-package-tools": "5.1.0",
"@typescript-eslint/eslint-plugin": "8.57.2",

View File

@@ -24,7 +24,7 @@
"ckeditor5-metadata.json"
],
"devDependencies": {
-    "@ckeditor/ckeditor5-dev-build-tools": "55.3.0",
+    "@ckeditor/ckeditor5-dev-build-tools": "55.2.0",
"@ckeditor/ckeditor5-inspector": ">=4.1.0",
"@ckeditor/ckeditor5-package-tools": "5.1.0",
"@typescript-eslint/eslint-plugin": "8.57.2",

View File

@@ -24,7 +24,7 @@
"ckeditor5-metadata.json"
],
"devDependencies": {
-    "@ckeditor/ckeditor5-dev-build-tools": "55.3.0",
+    "@ckeditor/ckeditor5-dev-build-tools": "55.2.0",
"@ckeditor/ckeditor5-inspector": ">=4.1.0",
"@ckeditor/ckeditor5-package-tools": "5.1.0",
"@typescript-eslint/eslint-plugin": "8.57.2",

View File

@@ -16,3 +16,4 @@ export * from "./lib/notes.js";
export * from "./lib/week_utils.js";
export { default as BUILTIN_ATTRIBUTES } from "./lib/builtin_attributes.js";
export * from "./lib/spreadsheet/render_to_html.js";
export * from "./lib/llm_api.js";

View File

@@ -44,7 +44,8 @@ export interface HiddenSubtreeItem {
| "quickSearch"
| "commandPalette"
| "toggleZenMode"
-    | "mobileTabSwitcher";
+    | "mobileTabSwitcher"
+    | "sidebarChat";
command?: keyof typeof Command;
/**
* If set to true, then branches will be enforced to be in the correct place.

View File

@@ -0,0 +1,103 @@
/**
* Shared LLM types for chat integration.
* Used by both client and server for API communication.
*/
/**
* A chat message in the conversation.
*/
export interface LlmMessage {
role: "user" | "assistant" | "system";
content: string;
}
/**
* Citation information extracted from LLM responses.
* May include URL (for web search) or document metadata (for document citations).
*/
export interface LlmCitation {
/** Source URL (typically from web search) */
url?: string;
/** Document or page title */
title?: string;
/** The text that was cited */
citedText?: string;
}
/**
* Configuration for LLM chat requests.
*/
export interface LlmChatConfig {
provider?: string;
model?: string;
systemPrompt?: string;
/** Enable web search tool */
enableWebSearch?: boolean;
/** Enable note tools (search and read notes) */
enableNoteTools?: boolean;
/** Enable extended thinking for deeper reasoning */
enableExtendedThinking?: boolean;
/** Token budget for extended thinking (default: 10000) */
thinkingBudget?: number;
/** Current note context (note ID the user is viewing) */
contextNoteId?: string;
/** The note ID of the chat note (used for auto-renaming on first message) */
chatNoteId?: string;
}
/**
* Pricing per million tokens for a model.
*/
export interface LlmModelPricing {
/** Cost per million input tokens in USD */
input: number;
/** Cost per million output tokens in USD */
output: number;
}
/**
* Information about an available LLM model.
*/
export interface LlmModelInfo {
/** Model identifier (e.g., "claude-sonnet-4-20250514") */
id: string;
/** Human-readable name (e.g., "Claude Sonnet 4") */
name: string;
/** Pricing per million tokens */
pricing: LlmModelPricing;
/** Whether this is the default model */
isDefault?: boolean;
/** Whether this is a legacy/older model */
isLegacy?: boolean;
/** Cost multiplier relative to the cheapest model (1x = cheapest) */
costMultiplier?: number;
/** Maximum context window size in tokens */
contextWindow?: number;
}
/**
* Token usage information from the LLM response.
*/
export interface LlmUsage {
promptTokens: number;
completionTokens: number;
totalTokens: number;
/** Estimated cost in USD (if available) */
cost?: number;
/** Model identifier used for this response */
model?: string;
}
/**
* Stream chunk types for real-time SSE updates.
* Defines the protocol between server and client.
*/
export type LlmStreamChunk =
| { type: "text"; content: string }
| { type: "thinking"; content: string }
| { type: "tool_use"; toolName: string; toolInput: Record<string, unknown> }
| { type: "tool_result"; toolName: string; result: string; isError?: boolean }
| { type: "citation"; citation: LlmCitation }
| { type: "usage"; usage: LlmUsage }
| { type: "error"; error: string }
| { type: "done" };
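A consumer of this protocol typically switches on `chunk.type` and folds chunks into display state. The reducer below is a hypothetical client-side sketch over a subset of the union (four variants shown for brevity); the `ChatState` shape is illustrative:

```typescript
// Subset of LlmStreamChunk, redefined here so the sketch is self-contained.
type LlmStreamChunk =
    | { type: "text"; content: string }
    | { type: "thinking"; content: string }
    | { type: "error"; error: string }
    | { type: "done" };

interface ChatState {
    text: string;
    thinking: string;
    error?: string;
    done: boolean;
}

// Fold one SSE chunk into the accumulated chat state.
function applyChunk(state: ChatState, chunk: LlmStreamChunk): ChatState {
    switch (chunk.type) {
        case "text":     return { ...state, text: state.text + chunk.content };
        case "thinking": return { ...state, thinking: state.thinking + chunk.content };
        case "error":    return { ...state, error: chunk.error };
        case "done":     return { ...state, done: true };
    }
}

const chunks: LlmStreamChunk[] = [
    { type: "thinking", content: "planning…" },
    { type: "text", content: "Hello" },
    { type: "text", content: ", world" },
    { type: "done" }
];
const final = chunks.reduce(applyChunk, { text: "", thinking: "", done: false });
console.log(final.text); // -> "Hello, world"
```

Because the union is discriminated on `type`, the switch narrows each case and the compiler flags any unhandled variant.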

View File

@@ -21,7 +21,8 @@ export const NOTE_TYPE_ICONS = {
doc: "bx bxs-file-doc",
contentWidget: "bx bxs-widget",
mindMap: "bx bx-sitemap",
-    spreadsheet: "bx bx-table"
+    spreadsheet: "bx bx-table",
+    llmChat: "bx bx-message-square-dots"
};
const FILE_MIME_MAPPINGS = {

View File

@@ -140,6 +140,10 @@ export interface OptionDefinitions extends KeyboardShortcutsOptions<KeyboardActi
seenCallToActions: string;
experimentalFeatures: string;
// AI / LLM
/** JSON array of configured LLM providers with their API keys */
llmProviders: string;
}
export type OptionNames = keyof OptionDefinitions;

View File

@@ -122,7 +122,8 @@ export const ALLOWED_NOTE_TYPES = [
"webView",
"code",
"mindMap",
-    "spreadsheet"
+    "spreadsheet",
+    "llmChat"
] as const;
export type NoteType = (typeof ALLOWED_NOTE_TYPES)[number];

View File

@@ -214,6 +214,8 @@ export interface ConvertAttachmentToNoteResponse {
export type SaveSqlConsoleResponse = CloneResponse;
export type SaveLlmChatResponse = CloneResponse;
export interface BacklinkCountResponse {
count: number;
}

pnpm-lock.yaml generated
View File

@@ -40,6 +40,7 @@ overrides:
diff@<4.0.4: '>=4.0.4'
diff@>=6.0.0 <8.0.3: '>=8.0.3'
tar@<7.5.7: '>=7.5.7'
zod@<3.25.76: '>=4.0.0'
patchedDependencies:
'@ckeditor/ckeditor5-code-block':
@@ -217,7 +218,7 @@ importers:
version: 0.2.1(mermaid@11.13.0)
'@mind-elixir/node-menu':
specifier: 5.0.1
version: 5.0.1(mind-elixir@5.10.0)
version: 5.0.1(mind-elixir@5.9.3)
'@popperjs/core':
specifier: 2.11.8
version: 2.11.8
@@ -267,8 +268,8 @@ importers:
specifier: 0.18.0
version: 0.18.0(@types/react-dom@19.1.6(@types/react@19.1.7))(@types/react@19.1.7)(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(rxjs@7.8.2)
'@zumer/snapdom':
specifier: 2.7.0
version: 2.7.0
specifier: 2.6.0
version: 2.6.0
autocomplete.js:
specifier: 0.38.1
version: 0.38.1
@@ -287,6 +288,9 @@ importers:
debounce:
specifier: 3.0.0
version: 3.0.0
dompurify:
specifier: 3.3.3
version: 3.3.3
draggabilly:
specifier: 3.0.0
version: 3.0.0
@@ -330,8 +334,8 @@ importers:
specifier: 11.13.0
version: 11.13.0
mind-elixir:
specifier: 5.10.0
version: 5.10.0
specifier: 5.9.3
version: 5.9.3
normalize.css:
specifier: 8.0.1
version: 8.0.1
@@ -342,8 +346,8 @@ importers:
specifier: 10.29.0
version: 10.29.0
react-i18next:
specifier: 17.0.1
version: 17.0.1(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3)
specifier: 17.0.0
version: 17.0.0(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3)
react-window:
specifier: 2.2.7
version: 2.2.7(react-dom@19.2.4(react@19.2.4))(react@19.2.4)
@@ -552,6 +556,12 @@ importers:
apps/server:
dependencies:
'@ai-sdk/anthropic':
specifier: ^2.0.0
version: 2.0.71(zod@4.3.6)
ai:
specifier: ^5.0.0
version: 5.0.161(zod@4.3.6)
better-sqlite3:
specifier: 12.8.0
version: 12.8.0
@@ -880,13 +890,13 @@ importers:
version: 10.29.0
preact-iso:
specifier: 2.11.1
version: 2.11.1(preact-render-to-string@6.6.7(preact@10.29.0))(preact@10.29.0)
version: 2.11.1(preact-render-to-string@6.6.6(preact@10.29.0))(preact@10.29.0)
preact-render-to-string:
specifier: 6.6.7
version: 6.6.7(preact@10.29.0)
specifier: 6.6.6
version: 6.6.6(preact@10.29.0)
react-i18next:
specifier: 17.0.1
version: 17.0.1(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3)
specifier: 17.0.0
version: 17.0.0(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3)
devDependencies:
'@preact/preset-vite':
specifier: 2.10.5
@@ -947,8 +957,8 @@ importers:
packages/ckeditor5-admonition:
devDependencies:
'@ckeditor/ckeditor5-dev-build-tools':
specifier: 55.3.0
version: 55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
specifier: 55.2.0
version: 55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
'@ckeditor/ckeditor5-inspector':
specifier: '>=4.1.0'
version: 5.0.0
@@ -1007,8 +1017,8 @@ importers:
packages/ckeditor5-footnotes:
devDependencies:
'@ckeditor/ckeditor5-dev-build-tools':
specifier: 55.3.0
version: 55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
specifier: 55.2.0
version: 55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
'@ckeditor/ckeditor5-inspector':
specifier: '>=4.1.0'
version: 5.0.0
@@ -1067,8 +1077,8 @@ importers:
packages/ckeditor5-keyboard-marker:
devDependencies:
'@ckeditor/ckeditor5-dev-build-tools':
specifier: 55.3.0
version: 55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
specifier: 55.2.0
version: 55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
'@ckeditor/ckeditor5-inspector':
specifier: '>=4.1.0'
version: 5.0.0
@@ -1134,8 +1144,8 @@ importers:
version: 0.109.0
devDependencies:
'@ckeditor/ckeditor5-dev-build-tools':
specifier: 55.3.0
version: 55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
specifier: 55.2.0
version: 55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
'@ckeditor/ckeditor5-inspector':
specifier: '>=4.1.0'
version: 5.0.0
@@ -1201,8 +1211,8 @@ importers:
version: 4.17.23
devDependencies:
'@ckeditor/ckeditor5-dev-build-tools':
specifier: 55.3.0
version: 55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
specifier: 55.2.0
version: 55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)
'@ckeditor/ckeditor5-inspector':
specifier: '>=4.1.0'
version: 5.0.0
@@ -1526,6 +1536,28 @@ packages:
'@adobe/css-tools@4.4.4':
resolution: {integrity: sha512-Elp+iwUx5rN5+Y8xLt5/GRoG20WGoDCQ/1Fb+1LiGtvwbDavuSk0jhD/eZdckHAuzcDzccnkv+rEjyWfRx18gg==}
'@ai-sdk/anthropic@2.0.71':
resolution: {integrity: sha512-JXTtAwlyxGzzRtpiAXk/O93aOTgdfoVX28EoUuRNVqZRgtkoniLQTtqeb8uZ4oXljNJlXzaJLNasS/U90w/wjw==}
engines: {node: '>=18'}
peerDependencies:
zod: ^3.25.76 || ^4.1.8
'@ai-sdk/gateway@2.0.65':
resolution: {integrity: sha512-yaWzvQQWgAzV0m3eidfpRub1+PggDOr2hLnSOI+L2ZispyJ/7EoSzhjKzNCADj6PHnnPaOMH933Xhl1Z/NSxJw==}
engines: {node: '>=18'}
peerDependencies:
zod: ^3.25.76 || ^4.1.8
'@ai-sdk/provider-utils@3.0.22':
resolution: {integrity: sha512-fFT1KfUUKktfAFm5mClJhS1oux9tP2qgzmEZVl5UdwltQ1LO/s8hd7znVrgKzivwv1s1FIPza0s9OpJaNB/vHw==}
engines: {node: '>=18'}
peerDependencies:
zod: ^3.25.76 || ^4.1.8
'@ai-sdk/provider@2.0.1':
resolution: {integrity: sha512-KCUwswvsC5VsW2PWFqF8eJgSCu5Ysj7m1TxiHTVA6g7k360bk0RNQENT8KTMAYEs+8fWPD3Uu4dEmzGHc+jGng==}
engines: {node: '>=18'}
'@aklinker1/rollup-plugin-visualizer@5.12.0':
resolution: {integrity: sha512-X24LvEGw6UFmy0lpGJDmXsMyBD58XmX1bbwsaMLhNoM+UMQfQ3b2RtC+nz4b/NoRK5r6QJSKJHBNVeUdwqybaQ==}
engines: {node: '>=14'}
@@ -1972,8 +2004,8 @@ packages:
'@ckeditor/ckeditor5-core@47.6.1':
resolution: {integrity: sha512-6dtnquhjymLkNhdC9T6gk/Mf2bDnHSTZrhkByaXC96CbmQDriCgfcaAVY6pQgDNxBQ6fZrev0TnKBLfTItrMsg==}
'@ckeditor/ckeditor5-dev-build-tools@55.3.0':
resolution: {integrity: sha512-87WlVerNpgc0xlnnPTKX+1Z/LrTWeueaOQK/XWns/AKJDoGbwUyQo6rhlRsEvDGKdKXOdHXgQijxgh9Yo1I9KQ==}
'@ckeditor/ckeditor5-dev-build-tools@55.2.0':
resolution: {integrity: sha512-pUa3GqCOEb7m5xhbUPV6gKLIgsX/TI3MXT51u0wa+A822ZFVbaXoGd2LissPkuK9WcGfmgU1gT8TzcyFTCTYig==}
engines: {node: '>=24.11.0', npm: '>=5.7.1'}
hasBin: true
@@ -7003,6 +7035,10 @@ packages:
'@upsetjs/venn.js@2.0.0':
resolution: {integrity: sha512-WbBhLrooyePuQ1VZxrJjtLvTc4NVfpOyKx0sKqioq9bX1C1m7Jgykkn8gLrtwumBioXIqam8DLxp88Adbue6Hw==}
'@vercel/oidc@3.1.0':
resolution: {integrity: sha512-Fw28YZpRnA3cAHHDlkt7xQHiJ0fcL+NRcIqsocZQUSmbzeIKRpwttJjik5ZGanXP+vlA4SbTg+AbA3bP363l+w==}
engines: {node: '>= 20'}
'@vitest/browser-webdriverio@4.1.2':
resolution: {integrity: sha512-5VKfMSq6ZoEAmvVu3sJGkDjEjGuxwk72tOgoNJfJYv+c+UQX1D4UqSdL8kXUMJcTQx1tKeWwQ9Zym0gRdMfyrA==}
peerDependencies:
@@ -7196,7 +7232,6 @@ packages:
'@xmldom/xmldom@0.8.10':
resolution: {integrity: sha512-2WALfTl4xo2SkGCYRt6rDTFfk9R1czmBvUQy12gK2KuRKIpWEhcbbzy8EZXtz/jkRqHX8bFEc6FC1HjX4TUWYw==}
engines: {node: '>=10.0.0'}
deprecated: this version has critical issues, please update to the latest version
'@xtuc/ieee754@1.2.0':
resolution: {integrity: sha512-DX8nKgqcGwsc0eJSqYt5lwP4DH5FlHnmuWWBRy7X0NcaGR0ZtuyeESgMwTYVEtxmsNGY+qit4QYT/MIYTOTPeA==}
@@ -7208,8 +7243,8 @@ packages:
resolution: {integrity: sha512-0fztsk/0ryJ+2PPr9EyXS5/Co7OK8q3zY/xOoozEWaUsL5x+C0cyZ4YyMuUffOO2Dx/rAdq4JMPqW0VUtm+vzA==}
engines: {bun: '>=0.7.0', deno: '>=1.0.0', node: '>=18.0.0'}
'@zumer/snapdom@2.7.0':
resolution: {integrity: sha512-ZiELKzDszeFOazPQ/ExXzgtdoW9jADVjDjInr5XDAlVdCx0RbNsFiG7RLyM48XnA7EyCA9yTvmXSc3ElDrTRqA==}
'@zumer/snapdom@2.6.0':
resolution: {integrity: sha512-JpPPkuMzozRVX6KArgCiMgLpgVW82kWgyoFk5DWGKE5msWGEshXEUdQHLLEyZRO7qioI1pI+yaBJz81tEP9gPg==}
abab@2.0.6:
resolution: {integrity: sha512-j2afSsaIENvHZN2B8GOpF566vZ5WVk5opAiMTvWgaQT8DkbOqsTfvNAvHoRGU2zzP8cPoqys+xHTRDWW8L+/BA==}
@@ -7314,6 +7349,12 @@ packages:
resolution: {integrity: sha512-4I7Td01quW/RpocfNayFdFVk1qSuoh0E7JrbRJ16nH01HhKFQ88INq9Sd+nd72zqRySlr9BmDA8xlEJ6vJMrYA==}
engines: {node: '>=8'}
ai@5.0.161:
resolution: {integrity: sha512-CVANs7auUNEi/hRhdJDKcPYaCLWXveIfmoiekNSRel3i8WUieB6iEncDS5smcubWsx7hGtTgXxNRTg0YG0ljtA==}
engines: {node: '>=18'}
peerDependencies:
zod: ^3.25.76 || ^4.1.8
ajv-draft-04@1.0.0:
resolution: {integrity: sha512-mv00Te6nmYbRp5DCwclxtt7yV/joXJPGS7nM+97GdxvuttCOfgI3K4U25zboyeX0O+myI8ERluxQe5wljMmVIw==}
peerDependencies:
@@ -9466,6 +9507,10 @@ packages:
resolution: {integrity: sha512-6RxOBZ/cYgd8usLwsEl+EC09Au/9BcmCKYF2/xbml6DNczf7nv0MQb+7BA2F+li6//I+28VNlQR37XfQtcAJuA==}
engines: {node: '>=18.0.0'}
eventsource-parser@3.0.6:
resolution: {integrity: sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg==}
engines: {node: '>=18.0.0'}
execa@1.0.0:
resolution: {integrity: sha512-adbxcyWV46qiHyvSp50TKt05tB4tK3HcmF7/nxfAdhnox83seTDbwnaqKO4sXRy7roHAIFqJP/Rw/AuEbX61LA==}
engines: {node: '>=6'}
@@ -11002,6 +11047,9 @@ packages:
json-schema-traverse@1.0.0:
resolution: {integrity: sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==}
json-schema@0.4.0:
resolution: {integrity: sha512-es94M3nTIfsEPisRafak+HDLfHXnKBhV3vU5eqPcS3flIWqcxJWgXHXiey3YrpaNsanY5ei1VoYEbOzijuq9BA==}
json-stable-stringify-without-jsonify@1.0.1:
resolution: {integrity: sha512-Bdboy+l7tA3OGW6FjyFHWkP5LuByj1Tk33Ljyq0axyzdk9//JSi2u3fP1QSmd1KNwq6VOKYGlAu87CisVir6Pw==}
@@ -11781,8 +11829,8 @@ packages:
resolution: {integrity: sha512-z0yWI+4FDrrweS8Zmt4Ej5HdJmky15+L2e6Wgn3+iK5fWzb6T3fhNFq2+MeTRb064c6Wr4N/wv0DzQTjNzHNGQ==}
engines: {node: '>=10'}
mind-elixir@5.10.0:
resolution: {integrity: sha512-AY/tDXz8stMbx0MIutdn63Dz0uwY1VVMKIxCqOOA2hg5WGdCGm2qqEZF498deLDxoZbL+hDf1SwBWzvWADBoPA==}
mind-elixir@5.9.3:
resolution: {integrity: sha512-OTTO6ofvDuzN4fxuBngqhQLJmIqModr2NgQb4OY+5DGRt54B+YNAvNnlspYwUXXGq2Rbht1DhXgeU4dr4CUy6Q==}
mini-css-extract-plugin@2.9.4:
resolution: {integrity: sha512-ZWYT7ln73Hptxqxk2DxPU9MmapXRhxkJD6tkSR04dnQxm8BGu2hzgKLugK5yySD97u/8yy7Ma7E76k9ZdvtjkQ==}
@@ -13063,8 +13111,8 @@ packages:
preact: 10.29.0
preact-render-to-string: '>=6.4.0'
preact-render-to-string@6.6.7:
resolution: {integrity: sha512-3XdbsX3+vn9dQW+jJI/FsI9rlkgl6dbeUpqLsChak6jp3j3auFqBCkno7VChbMFs5Q8ylBj6DrUkKRwtVN3nvw==}
preact-render-to-string@6.6.6:
resolution: {integrity: sha512-EfqZJytnjJldV+YaaqhthU2oXsEf5e+6rDv957p+zxAvNfFLQOPfvBOTncscQ+akzu6Wrl7s3Pa0LjUQmWJsGQ==}
peerDependencies:
preact: 10.29.0
@@ -13289,10 +13337,10 @@ packages:
peerDependencies:
react: ^19.2.4
react-i18next@17.0.1:
resolution: {integrity: sha512-iG65FGnFHcYyHNuT01ukffYWCOBFTWSdVD8EZd/dCVWgtjFPObcSsvYYNwcsokO/rDcTb5d6D8Acv8MrOdm6Hw==}
react-i18next@17.0.0:
resolution: {integrity: sha512-L7aqwOePCExt6nlF7000lN2YKWnR7IpSpQId9sj01798Xn3LAncBdTHKl9lA/nr+YrG78BTqWPJxq9mlrrmH7Q==}
peerDependencies:
i18next: '>= 26.0.1'
i18next: '>= 25.10.10'
react: '>= 16.8.0'
react-dom: '*'
react-native: '*'
@@ -15945,12 +15993,12 @@ packages:
resolution: {integrity: sha512-zK7YHHz4ZXpW89AHXUPbQVGKI7uvkd3hzusTdotCg1UxyaVtg0zFJSTfW/Dq5f7OBBVnq6cZIaC8Ti4hb6dtCA==}
engines: {node: '>= 14'}
zod@3.24.4:
resolution: {integrity: sha512-OdqJE9UDRPwWsrHjLN2F8bPxvwJBK22EHLWtanu0LSYr5YqzsaaW3RMgmjwr8Rypg5k+meEJdSPXJZXE/yqOMg==}
zod@4.1.12:
resolution: {integrity: sha512-JInaHOamG8pt5+Ey8kGmdcAcg3OL9reK8ltczgHTAwNhMys/6ThXHityHxVV2p3fkw/c+MAvBHFVYHFZDmjMCQ==}
zod@4.3.6:
resolution: {integrity: sha512-rftlrkhHZOcjDwkGlnUtZZkvaPHCsDATp4pGpuOOMDaTdDDXF91wuVDJoWoPsKX/3YPQ5fHuF3STjcYyKr+Qhg==}
zustand@4.5.6:
resolution: {integrity: sha512-ibr/n1hBzLLj5Y+yUcU7dYw8p6WnIVzdJbnX+1YpaScvZVF2ziugqHs+LAmHw4lWO9c/zRj+K1ncgWDQuthEdQ==}
engines: {node: '>=12.7.0'}
@@ -15978,6 +16026,30 @@ snapshots:
'@adobe/css-tools@4.4.4': {}
'@ai-sdk/anthropic@2.0.71(zod@4.3.6)':
dependencies:
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.22(zod@4.3.6)
zod: 4.3.6
'@ai-sdk/gateway@2.0.65(zod@4.3.6)':
dependencies:
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.22(zod@4.3.6)
'@vercel/oidc': 3.1.0
zod: 4.3.6
'@ai-sdk/provider-utils@3.0.22(zod@4.3.6)':
dependencies:
'@ai-sdk/provider': 2.0.1
'@standard-schema/spec': 1.1.0
eventsource-parser: 3.0.6
zod: 4.3.6
'@ai-sdk/provider@2.0.1':
dependencies:
json-schema: 0.4.0
'@aklinker1/rollup-plugin-visualizer@5.12.0(rollup@4.52.0)':
dependencies:
open: 8.4.2
@@ -16782,6 +16854,8 @@ snapshots:
'@ckeditor/ckeditor5-core': 47.6.1
'@ckeditor/ckeditor5-upload': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-ai@47.6.1(bufferutil@4.0.9)(utf-8-validate@6.0.5)':
dependencies:
@@ -16929,6 +17003,8 @@ snapshots:
'@ckeditor/ckeditor5-core': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-code-block@47.6.1(patch_hash=2361d8caad7d6b5bddacc3a3b4aa37dbfba260b1c1b22a450413a79c1bb1ce95)':
dependencies:
@@ -16940,8 +17016,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-collaboration-core@47.6.1':
dependencies:
@@ -16997,7 +17071,7 @@ snapshots:
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-dev-build-tools@55.3.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)':
'@ckeditor/ckeditor5-dev-build-tools@55.2.0(@swc/helpers@0.5.17)(postcss@8.5.8)(tslib@2.8.1)(typescript@5.9.3)':
dependencies:
'@rollup/plugin-commonjs': 28.0.9(rollup@4.52.0)
'@rollup/plugin-json': 6.1.0(rollup@4.52.0)
@@ -17115,6 +17189,8 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-editor-classic@47.6.1':
dependencies:
@@ -17124,6 +17200,8 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-editor-decoupled@47.6.1':
dependencies:
@@ -17133,6 +17211,8 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-editor-inline@47.6.1':
dependencies:
@@ -17222,8 +17302,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-export-word@47.6.1':
dependencies:
@@ -17248,6 +17326,8 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-font@47.6.1':
dependencies:
@@ -17301,8 +17381,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-highlight@47.6.1':
dependencies:
@@ -17312,8 +17390,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-horizontal-line@47.6.1':
dependencies:
@@ -17332,8 +17408,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-html-support@47.6.1':
dependencies:
@@ -17366,8 +17440,6 @@ snapshots:
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-import-word@47.6.1':
dependencies:
@@ -17380,8 +17452,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-indent@47.6.1':
dependencies:
@@ -17393,8 +17463,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-inspector@5.0.0': {}
@@ -17405,8 +17473,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-line-height@47.6.1':
dependencies:
@@ -17431,8 +17497,6 @@ snapshots:
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-list-multi-level@47.6.1':
dependencies:
@@ -17457,8 +17521,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-markdown-gfm@47.6.1':
dependencies:
@@ -17496,8 +17558,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-mention@47.6.1(patch_hash=5981fb59ba35829e4dff1d39cf771000f8a8fdfa7a34b51d8af9549541f2d62d)':
dependencies:
@@ -17507,8 +17567,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-merge-fields@47.6.1':
dependencies:
@@ -17521,8 +17579,6 @@ snapshots:
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-minimap@47.6.1':
dependencies:
@@ -17585,8 +17641,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-pagination@47.6.1':
dependencies:
@@ -17694,8 +17748,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-slash-command@47.6.1':
dependencies:
@@ -17708,8 +17760,6 @@ snapshots:
'@ckeditor/ckeditor5-ui': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-source-editing-enhanced@47.6.1':
dependencies:
@@ -17757,8 +17807,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-table@47.6.1':
dependencies:
@@ -17771,8 +17819,6 @@ snapshots:
'@ckeditor/ckeditor5-widget': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-template@47.6.1':
dependencies:
@@ -17882,8 +17928,6 @@ snapshots:
'@ckeditor/ckeditor5-engine': 47.6.1
'@ckeditor/ckeditor5-utils': 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@ckeditor/ckeditor5-widget@47.6.1':
dependencies:
@@ -17903,8 +17947,6 @@ snapshots:
'@ckeditor/ckeditor5-utils': 47.6.1
ckeditor5: 47.6.1
es-toolkit: 1.39.5
transitivePeerDependencies:
- supports-color
'@codemirror/autocomplete@6.18.6':
dependencies:
@@ -19507,7 +19549,7 @@ snapshots:
dependencies:
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-blur@1.6.0':
dependencies:
@@ -19517,7 +19559,7 @@ snapshots:
'@jimp/plugin-circle@1.6.0':
dependencies:
'@jimp/types': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-color@1.6.0':
dependencies:
@@ -19525,7 +19567,7 @@ snapshots:
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
tinycolor2: 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-contain@1.6.0':
dependencies:
@@ -19534,7 +19576,7 @@ snapshots:
'@jimp/plugin-resize': 1.6.0
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-cover@1.6.0':
dependencies:
@@ -19542,20 +19584,20 @@ snapshots:
'@jimp/plugin-crop': 1.6.0
'@jimp/plugin-resize': 1.6.0
'@jimp/types': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-crop@1.6.0':
dependencies:
'@jimp/core': 1.6.0
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-displace@1.6.0':
dependencies:
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-dither@1.6.0':
dependencies:
@@ -19565,12 +19607,12 @@ snapshots:
dependencies:
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-flip@1.6.0':
dependencies:
'@jimp/types': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-hash@1.6.0':
dependencies:
@@ -19588,7 +19630,7 @@ snapshots:
'@jimp/plugin-mask@1.6.0':
dependencies:
'@jimp/types': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-print@1.6.0':
dependencies:
@@ -19601,18 +19643,18 @@ snapshots:
parse-bmfont-binary: 1.0.6
parse-bmfont-xml: 1.1.6
simple-xml-to-json: 1.2.3
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-quantize@1.6.0':
dependencies:
image-q: 4.0.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-resize@1.6.0':
dependencies:
'@jimp/core': 1.6.0
'@jimp/types': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-rotate@1.6.0':
dependencies:
@@ -19621,7 +19663,7 @@ snapshots:
'@jimp/plugin-resize': 1.6.0
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/plugin-threshold@1.6.0':
dependencies:
@@ -19630,11 +19672,11 @@ snapshots:
'@jimp/plugin-hash': 1.6.0
'@jimp/types': 1.6.0
'@jimp/utils': 1.6.0
zod: 3.24.4
zod: 4.3.6
'@jimp/types@1.6.0':
dependencies:
zod: 3.24.4
zod: 4.3.6
'@jimp/utils@1.6.0':
dependencies:
@@ -19906,9 +19948,9 @@ snapshots:
'@microsoft/tsdoc@0.15.1': {}
'@mind-elixir/node-menu@5.0.1(mind-elixir@5.10.0)':
'@mind-elixir/node-menu@5.0.1(mind-elixir@5.9.3)':
dependencies:
mind-elixir: 5.10.0
mind-elixir: 5.9.3
'@mixmark-io/domino@2.2.0': {}
@@ -24046,6 +24088,8 @@ snapshots:
d3-selection: 3.0.0
d3-transition: 3.0.1(d3-selection@3.0.0)
'@vercel/oidc@3.1.0': {}
'@vitest/browser-webdriverio@4.1.2(bufferutil@4.0.9)(msw@2.7.5(@types/node@24.12.0)(typescript@5.9.3))(utf-8-validate@6.0.5)(vite@8.0.3(@types/node@24.12.0)(esbuild@0.27.4)(jiti@2.6.1)(less@4.1.3)(sass-embedded@1.91.0)(sass@1.91.0)(terser@5.44.0)(tsx@4.21.0)(yaml@2.8.2))(vitest@4.1.2)(webdriverio@9.27.0(bufferutil@4.0.9)(utf-8-validate@6.0.5))':
dependencies:
'@vitest/browser': 4.1.2(bufferutil@4.0.9)(msw@2.7.5(@types/node@24.12.0)(typescript@5.9.3))(utf-8-validate@6.0.5)(vite@8.0.3(@types/node@24.12.0)(esbuild@0.27.4)(jiti@2.6.1)(less@4.1.3)(sass-embedded@1.91.0)(sass@1.91.0)(terser@5.44.0)(tsx@4.21.0)(yaml@2.8.2))(vitest@4.1.2)
@@ -24375,7 +24419,7 @@ snapshots:
'@zip.js/zip.js@2.8.11': {}
'@zumer/snapdom@2.7.0': {}
'@zumer/snapdom@2.6.0': {}
abab@2.0.6: {}
@@ -24458,6 +24502,14 @@ snapshots:
clean-stack: 2.2.0
indent-string: 4.0.0
ai@5.0.161(zod@4.3.6):
dependencies:
'@ai-sdk/gateway': 2.0.65(zod@4.3.6)
'@ai-sdk/provider': 2.0.1
'@ai-sdk/provider-utils': 3.0.22(zod@4.3.6)
'@opentelemetry/api': 1.9.0
zod: 4.3.6
ajv-draft-04@1.0.0(ajv@8.13.0):
optionalDependencies:
ajv: 8.13.0
@@ -25297,8 +25349,6 @@ snapshots:
ckeditor5-collaboration@47.6.1:
dependencies:
'@ckeditor/ckeditor5-collaboration-core': 47.6.1
transitivePeerDependencies:
- supports-color
ckeditor5-premium-features@47.6.1(bufferutil@4.0.9)(ckeditor5@47.6.1)(utf-8-validate@6.0.5):
dependencies:
@@ -27196,6 +27246,8 @@ snapshots:
eventsource-parser@3.0.2: {}
eventsource-parser@3.0.6: {}
execa@1.0.0:
dependencies:
cross-spawn: 6.0.6
@@ -29010,6 +29062,8 @@ snapshots:
json-schema-traverse@1.0.0: {}
json-schema@0.4.0: {}
json-stable-stringify-without-jsonify@1.0.1: {}
json-stringify-pretty-compact@4.0.0: {}
@@ -30083,7 +30137,7 @@ snapshots:
mimic-response@3.1.0: {}
mind-elixir@5.10.0: {}
mind-elixir@5.9.3: {}
mini-css-extract-plugin@2.9.4(webpack@5.101.3(@swc/core@1.11.29(@swc/helpers@0.5.17))(esbuild@0.27.4)):
dependencies:
@@ -31464,12 +31518,12 @@ snapshots:
powershell-utils@0.1.0: {}
preact-iso@2.11.1(preact-render-to-string@6.6.7(preact@10.29.0))(preact@10.29.0):
preact-iso@2.11.1(preact-render-to-string@6.6.6(preact@10.29.0))(preact@10.29.0):
dependencies:
preact: 10.29.0
preact-render-to-string: 6.6.7(preact@10.29.0)
preact-render-to-string: 6.6.6(preact@10.29.0)
preact-render-to-string@6.6.7(preact@10.29.0):
preact-render-to-string@6.6.6(preact@10.29.0):
dependencies:
preact: 10.29.0
@@ -31710,7 +31764,7 @@ snapshots:
react: 19.2.4
scheduler: 0.27.0
react-i18next@17.0.1(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3):
react-i18next@17.0.0(i18next@25.10.10(typescript@5.9.3))(react-dom@19.2.4(react@19.2.4))(react@19.2.4)(typescript@5.9.3):
dependencies:
'@babel/runtime': 7.29.2
html-parse-stringify: 3.0.1
@@ -34951,10 +35005,10 @@ snapshots:
compress-commons: 6.0.2
readable-stream: 4.7.0
zod@3.24.4: {}
zod@4.1.12: {}
zod@4.3.6: {}
zustand@4.5.6(@types/react@19.1.7)(react@19.2.4):
dependencies:
use-sync-external-store: 1.6.0(react@19.2.4)