
server : (webui) migrate project to ReactJS with typescript #11688

Merged
merged 12 commits into ggerganov:master on Feb 6, 2025

Conversation

ngxson
Collaborator

@ngxson ngxson commented Feb 5, 2025

Close #11663 #10915 #9608

Alongside the move to ReactJS, this PR also adds some subtle changes that would be (almost) impossible to do with VueJS:

  1. Markdown is now rendered as DOM (via react-markdown) instead of as innerHTML (via markdown-it). This improves rendering performance on long markdown input, while also enabling selection of the text content while it's being generated, as described in Bug: llama-server web UI resets the text selection during inference on every token update #9608
  2. Each conversation now has its own address, for example http://localhost:5173/#/chat/conv-1738775907156
  3. The "Copy" button on code blocks now changes its text to "Copied" when clicked. A very small change, but now possible thanks to react-markdown
  4. Users can switch between conversations while text generation is still in progress. This mostly matches the behavior of claude / chatgpt / deepseek / etc., but also demonstrates that the application's data flow is now decoupled.

This change will be transparent for end-users and requires no migrations.
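As a rough illustration of the per-conversation addresses mentioned above, a conversation id can be recovered from a hash route with a small parser along these lines (the function name and regex are illustrative, not the actual webui code, which uses a React router):

```typescript
// Hypothetical sketch: extract the conversation id from a hash route of
// the form "#/chat/<conversation-id>", as used by the new webui URLs.
export function parseChatRoute(hash: string): string | null {
  const match = hash.match(/^#\/chat\/([\w-]+)$/);
  return match ? match[1] : null;
}
```

With such a scheme, `#/chat/conv-1738775907156` resolves to `conv-1738775907156`, and any non-chat hash yields `null`.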

@ngxson ngxson requested a review from ggerganov February 5, 2025 22:40
@github-actions github-actions bot added the examples, server, and devops (improvements to build systems and github actions) labels Feb 5, 2025
@woof-dog
Contributor

woof-dog commented Feb 6, 2025

I like that I can copy anytime now! But please be very careful!

This change breaks LaTeX formatting for some math equations!!

Existing:
(screenshot)

This branch:
(screenshot)

It's very odd, sometimes it works:
(screenshot)

Maybe it's only broken when the math starts with a number!

@woof-dog
Contributor

woof-dog commented Feb 6, 2025

My only other comment: even with the data flow no longer being coupled, the textarea itself is still not decoupled. I cannot send two prompts in two conversations in parallel, which would be nice when I want to ask two separate long questions and have both answers by the time I am back from my coffee break. With this PR I still have to use two tabs, even though as a user I would expect to send the second prompt from the same tab, just in a different conversation. If I start a new conversation, the "Stop" button remains the same.

This is a nitpick, not as important as the other issues, but something for your future consideration.

Comment on lines 301 to 329
export function preprocessLaTeX(content: string): string {
  // Step 1: Protect code blocks
  const codeBlocks: string[] = [];
  content = content.replace(/(```[\s\S]*?```|`[^`\n]+`)/g, (_, code) => {
    codeBlocks.push(code);
    return `<<CODE_BLOCK_${codeBlocks.length - 1}>>`;
  });

  // Step 2: Protect existing LaTeX expressions
  const latexExpressions: string[] = [];
  content = content.replace(
    /(\$\$[\s\S]*?\$\$|\\\[[\s\S]*?\\\]|\\\(.*?\\\))/g,
    (match) => {
      latexExpressions.push(match);
      return `<<LATEX_${latexExpressions.length - 1}>>`;
    }
  );

  // Step 3: Escape dollar signs that are likely currency indicators
  content = content.replace(/\$(?=\d)/g, '\\$');

  // Step 4: Restore LaTeX expressions
  content = content.replace(
    /<<LATEX_(\d+)>>/g,
    (_, index) => latexExpressions[parseInt(index)]
  );

  // Step 5: Restore code blocks (the original excerpt is truncated here;
  // this restore is reconstructed symmetrically with Step 4)
  content = content.replace(
    /<<CODE_BLOCK_(\d+)>>/g,
    (_, index) => codeBlocks[parseInt(index)]
  );

  return content;
}
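To illustrate woof-dog's report above ("broken when the math starts with a number"): single-`$` inline math is not captured by Step 2's protection regex, so Step 3's currency heuristic escapes its opening delimiter. A minimal sketch of just that step (the function name is illustrative):

```typescript
// Isolated reproduction of the Step 3 currency heuristic: escape "$" only
// when it is immediately followed by a digit. Single-$ inline math is never
// protected beforehand, so math that starts with a number gets mangled.
function escapeCurrency(content: string): string {
  return content.replace(/\$(?=\d)/g, '\\$');
}

console.log(escapeCurrency('Price: $5'));    // Price: \$5  (intended)
console.log(escapeCurrency('$2x + y = z$')); // \$2x + y = z$  (math broken)
console.log(escapeCurrency('$x + y = 2$'));  // unchanged: math starting with a letter survives
```

This matches the observed behavior: `$2x + y = z$` loses its opening delimiter, while equations that begin with a letter render fine.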
Contributor
This is like going back to the old version of the UI and almost as hacky as my pull request 😢

Collaborator Author

Agreed that this is not the best solution; admittedly I copied it from the internet, see the comment above this function.

The proper way would be to implement a custom remark/rehype plugin for this; I'll see if I can make one quickly.

@ggerganov
Owner

while also enabling selection of the text content while it's being generated,

Hm, this still does not work for me - the selection is being cleared on each new token. Tried Vivaldi and Safari on macOS.

@ggerganov ggerganov left a comment (Owner)

I'm not familiar enough with web technologies to do a proper review. Running some tests, everything seems to work, so for me it's good enough 👍

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

With this PR I still have to use two tabs, even though as a user I would expect to send the second prompt from the same tab, just in a different conversation. If I start a new conversation, the "Stop" button remains the same.

OK, I tested again on claude and it works as you described. I'll see if I can do a quick hack to make it work that way.

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

@woof-dog Multiple conversations can now be generated at the same time, remember to start the server with -np ... to enable multiple slots. Can you please test it?
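For reference, a minimal launch fragment for that setup (the model path is a placeholder; `-np` is the short form of `--parallel`):

```shell
# Start llama-server with 4 parallel slots so several conversations
# can generate at the same time. The model path below is illustrative.
./llama-server -m models/your-model.gguf -np 4
```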

@woof-dog
Contributor

woof-dog commented Feb 6, 2025

@woof-dog Multiple conversations can now be generated at the same time, remember to start the server with -np ... to enable multiple slots. Can you please test it?

Yes, this is excellent 😄 works just as expected. This is going to make using the llama.cpp server so much better for me. Thank you

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

Hm, this still does not work for me - the selection is being cleared on each new token. Tried Vivaldi and Safari on macOS.

OK, it seems like only the code block is affected because it's rendered by highlight.js; I'll see if there are any alternatives.

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

LaTeX display should be OK now:

This is the formula:

$\frac{e^{x_i}}{\sum_{j=1}^{n}e^{x_j}}$

Given an input vector \(\mathbf{x} = [x_1, x_2, \ldots, x_n]\)

\[
y_i = \frac{e^{x_i}}{\sum_{j=1}^n e^{x_j}}
\]

$2x + y = z$

Code block latex:
```latex
\frac{e^{x_i}}{\sum_{j=1}^{n}e^{x_j}}
```

Test dollar sign: $1234 $4567

Invalid latex syntax: $E = mc^$ and $$E = mc^$$

will be rendered as:

(screenshot of the rendered output)

@ngxson
Collaborator Author

ngxson commented Feb 6, 2025

Hm, this still does not work for me - the selection is being cleared on each new token. Tried Vivaldi and Safari on macOS.

OK, it seems like only the code block is affected because it's rendered by highlight.js; I'll see if there are any alternatives.

Should be fixed with the latest commit

@ngxson ngxson requested a review from Copilot February 6, 2025 16:10


Copilot reviewed 16 out of 31 changed files in this pull request and generated no comments.

Files not reviewed (15)
  • examples/server/webui/.gitignore: Language not supported
  • examples/server/webui/.prettierignore: Language not supported
  • examples/server/webui/index.html: Language not supported
  • examples/server/webui/package.json: Language not supported
  • examples/server/webui/public/demo-conversation.json: Language not supported
  • examples/server/webui/src/index.scss: Language not supported
  • examples/server/webui/src/katex-gpt.js: Evaluated as low risk
  • examples/server/webui/src/highlight-config.js: Evaluated as low risk
  • examples/server/webui/src/App.tsx: Evaluated as low risk
  • examples/server/webui/src/Config.ts: Evaluated as low risk
  • .github/workflows/server.yml: Evaluated as low risk
  • examples/server/webui/src/components/Sidebar.tsx: Evaluated as low risk
  • examples/server/webui/src/components/SettingDialog.tsx: Evaluated as low risk
  • examples/server/webui/src/utils/app.context.tsx: Evaluated as low risk
  • examples/server/webui/src/components/MarkdownDisplay.tsx: Evaluated as low risk
@ngxson ngxson merged commit 2fb3c32 into ggerganov:master Feb 6, 2025
6 checks passed
apicalshark added a commit to apicalshark/llama.cpp that referenced this pull request Feb 6, 2025
server : (webui) migrate project to ReactJS with typescript (ggerganov#11688)
@brucepro
Contributor

brucepro commented Feb 6, 2025

Just an FYI. I am adding MCP SSE support to the webui. Will test this and migrate my code then do a bit of testing.

@MoonRide303

@ngxson I noticed one regression: after changing the temperature in settings, it's now saved as a string, causing a fallback to the default value during generation:

Wrong type supplied for parameter 'temperature'. Expected 'number', using default value
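The regression reported above comes down to HTML inputs producing strings, so numeric settings need to be coerced back to numbers before they are sent to the server. A minimal sketch of such a sanitizer (the names `sanitizeSettings` and `NUMERIC_KEYS` are illustrative, not the actual webui code):

```typescript
// Hypothetical sanitizer: coerce known-numeric settings back from the
// strings that <input> elements produce, leaving other fields untouched.
const NUMERIC_KEYS = ['temperature', 'top_k', 'top_p'];

function sanitizeSettings(
  raw: Record<string, string | number>
): Record<string, string | number> {
  const out: Record<string, string | number> = { ...raw };
  for (const key of NUMERIC_KEYS) {
    const value = out[key];
    if (typeof value === 'string' && value !== '') {
      const n = Number(value);
      if (!Number.isNaN(n)) out[key] = n; // keep the string if it isn't numeric
    }
  }
  return out;
}
```

With this in place, `{ temperature: '0.8' }` is sent as `{ temperature: 0.8 }`, satisfying the server's type check.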

@ngxson
Collaborator Author

ngxson commented Feb 7, 2025

@MoonRide303 Ok thanks, I completely forgot about that. Working on a fix now.

@ngxson
Collaborator Author

ngxson commented Feb 7, 2025

@brucepro don't hesitate to open a PR; I'll have a look and adapt it. I'll probably do that along with #11662, so your MCP implementation can appear in the "experimental features" section

@brucepro
Contributor

@brucepro don't hesitate to open a PR; I'll have a look and adapt it. I'll probably do that along with #11662, so your MCP implementation can appear in the "experimental features" section

My thought was to make a super basic implementation of an MCP SSE React page, to make sure all the bugs are worked out without being diluted by the WebUI code.
I have a basic implementation of the MCP SSE client working with test Python SSE servers in a basic React site. Tools are currently working for any SSE endpoint, but I haven't tested much. I think using this one to work out all the kinks and then migrating it to the webui is a good way to go, but I am open to suggestions. I am going to get prompts and resources working next. Let me know, or send a PR with improvements.
https://github.com/brucepro/llamacppMCPClientDemo

@exxocism

That's nice

Labels
devops (improvements to build systems and github actions), examples, server
Development

Successfully merging this pull request may close these issues.

Feature Request: move server webui from vuejs to reactjs (with typescript)
6 participants