diff --git a/docs/knowledge_base/demos/chatbot.md b/docs/knowledge_base/demos/chatbot.md new file mode 100644 index 000000000..52a200d2a --- /dev/null +++ b/docs/knowledge_base/demos/chatbot.md @@ -0,0 +1,16 @@ +This demo showcases Taipy's ability to enable end-users to run inference using LLMs. Here, we +use GPT-3 to create a chatbot and display the conversation in an interactive chat interface. + +[Try it live](https://demo-llm-chat.taipy.cloud/){: .tp-btn target='blank' } +[Get it on GitHub](https://github.com/Avaiga/demo-llm-chat){: .tp-btn .tp-btn--accent target='blank' } + +# Understanding the Application +This application allows the user to chat with GPT-3 by sending +its input to the OpenAI API and returning the conversation in +a chat window. The user is also able to come back to a previous +conversation and continue it. + +![ChatBot](images/chatbot_meds_conv.png){width=100%} + +A tutorial on how to write this application and similar +LLM inference applications is available [here](../tutorials/chatbot/index.md). \ No newline at end of file diff --git a/docs/knowledge_base/demos/images/chatbot_meds_conv.png b/docs/knowledge_base/demos/images/chatbot_meds_conv.png new file mode 100644 index 000000000..44a921792 Binary files /dev/null and b/docs/knowledge_base/demos/images/chatbot_meds_conv.png differ diff --git a/docs/knowledge_base/demos/index.md b/docs/knowledge_base/demos/index.md index 18048666d..dd07fbf1e 100644 --- a/docs/knowledge_base/demos/index.md +++ b/docs/knowledge_base/demos/index.md @@ -228,4 +228,64 @@ Let's explore demos of applications made with Taipy. + + +
  • + +
    + +
    +
    +

    Drift Detection

    + Front-end | Back-end +

    Showcases the ability to select inputs and execute and visualize outputs of data + pipelines in a Taipy application by using the example of detecting drift on a + dataset. +

    +
    +
    +
  • + +
  • + +
    + +
    +
    +

    Realtime Pollution Dashboard

    + Front-end | Back-end +

    Displays real-time pollution data from sensors around a factory. The data is streamed + from another server and displayed in a dashboard. +

    +
    +
    +
  • + +
  • + +
    + +
    +
    +

    LLM ChatBot

    + Front-end +

A chatbot that uses OpenAI's API with GPT-3. It can be used as a template for apps that use LLM inference.
    +

    +
    +
    +
  • + +
  • + +
    + +
    +
    +

    Fraud Detection

    + Front-end +

A Taipy application that analyzes credit card transactions to detect fraud.
    +

    +
    +
    +
  • diff --git a/docs/knowledge_base/index.md b/docs/knowledge_base/index.md index 4e83b51d3..0803f355a 100644 --- a/docs/knowledge_base/index.md +++ b/docs/knowledge_base/index.md @@ -200,6 +200,19 @@ hide: +
  • + +
    + +
    +
    +

    LLM ChatBot

    +

    + Create a chatbot interface using Taipy and an LLM API. +

    +
    +
    +
  • @@ -464,6 +477,60 @@ hide:
  • + +
  • + +
    + +
    +
    +

    Drift Detection

    +

    Showcases the ability to select inputs and execute and visualize outputs of data + pipelines in a Taipy application by using the example of detecting drift on a + dataset. +

    +
    +
    +
  • +
  • + +
    + +
    +
    +

    Realtime Pollution Dashboard

    +

    Displays real-time pollution data from sensors around a factory. The data is streamed + from another server and displayed in a dashboard. +

    +
    +
    +
  • + +
  • + +
    + +
    +
    +

    LLM ChatBot

    +

A chatbot that uses OpenAI's API with GPT-3. It can be used as a template for apps that use LLM inference.
    +

    +
    +
    +
  • + +
  • + +
    + +
    +
    +

    Fraud Detection

    +

A Taipy application that analyzes credit card transactions to detect fraud.
    +

    +
    +
    +
  • # Tips & Tricks diff --git a/docs/knowledge_base/tutorials/chatbot/chatbot_cloud.png b/docs/knowledge_base/tutorials/chatbot/chatbot_cloud.png new file mode 100644 index 000000000..dfd67431b Binary files /dev/null and b/docs/knowledge_base/tutorials/chatbot/chatbot_cloud.png differ diff --git a/docs/knowledge_base/tutorials/chatbot/chatbot_env_var.png b/docs/knowledge_base/tutorials/chatbot/chatbot_env_var.png new file mode 100644 index 000000000..cc26bd0ea Binary files /dev/null and b/docs/knowledge_base/tutorials/chatbot/chatbot_env_var.png differ diff --git a/docs/knowledge_base/tutorials/chatbot/chatbot_first_result.png b/docs/knowledge_base/tutorials/chatbot/chatbot_first_result.png new file mode 100644 index 000000000..2d183163a Binary files /dev/null and b/docs/knowledge_base/tutorials/chatbot/chatbot_first_result.png differ diff --git a/docs/knowledge_base/tutorials/chatbot/chatbot_plane.png b/docs/knowledge_base/tutorials/chatbot/chatbot_plane.png new file mode 100644 index 000000000..b8b5b21fa Binary files /dev/null and b/docs/knowledge_base/tutorials/chatbot/chatbot_plane.png differ diff --git a/docs/knowledge_base/tutorials/chatbot/chatbot_roundconv.png b/docs/knowledge_base/tutorials/chatbot/chatbot_roundconv.png new file mode 100644 index 000000000..3d201bc3e Binary files /dev/null and b/docs/knowledge_base/tutorials/chatbot/chatbot_roundconv.png differ diff --git a/docs/knowledge_base/tutorials/chatbot/index.md b/docs/knowledge_base/tutorials/chatbot/index.md new file mode 100644 index 000000000..8e90f2111 --- /dev/null +++ b/docs/knowledge_base/tutorials/chatbot/index.md @@ -0,0 +1,275 @@ +In this tutorial we will create a simple chatbot website using Taipy. + +[Try it live](https://demo-llm-chat.taipy.cloud/){: .tp-btn target='blank' } +[Get it on GitHub](https://github.com/Avaiga/demo-llm-chat){: .tp-btn .tp-btn--accent target='blank' } + +

    + Render of the app +

+
+
+Here we will use OpenAI's API with GPT-3. This tutorial can easily
+be adapted to other LLMs.
+
+
+# Step 1: Install Requirements
+
+Create a `requirements.txt` file with the following content:
+
+```bash
+taipy==3.0.0
+openai==1.3.7
+```
+
+Install the requirements using pip in a terminal:
+
+```bash
+pip install -r requirements.txt
+```
+
+# Step 2: Imports
+
+Create a `main.py` file with the following imports:
+
+```python
+from taipy.gui import Gui, State, notify
+import openai
+```
+
+# Step 3: Initialize variables
+
+Initialize the following variables in the `main.py` file:
+
+```python
+context = "The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.\n\nHuman: Hello, who are you?\nAI: I am an AI created by OpenAI. How can I help you today? "
+conversation = {
+    "Conversation": ["Who are you?", "Hi! I am GPT-3. How can I help you today?"]
+}
+current_user_message = ""
+```
+
+- `context` is the initial context for the conversation; the LLM uses it to understand what behaviour is expected of it.
+- `conversation` is a dictionary that stores the conversation history to be displayed.
+- `current_user_message` is the message that the user is currently typing.
+
+# Step 4: Create a function to generate responses
+
+**This step is the one that needs to be adapted if you want to
+use a different LLM.**
+
+Initialize the OpenAI client with your API key. You can find
+your API key [here](https://platform.openai.com/api-keys).
+
+```python
+client = openai.Client(api_key="ENTER_YOUR_API_KEY_HERE")
+```
+
+Create a function that takes the user message as a string `prompt`
+and returns the LLM's response as a string.
+
+```python
+def request(state: State, prompt: str) -> str:
+    """
+    Send a prompt to the OpenAI API and return the response.
+
+    Args:
+        - state: The current state of the app.
+        - prompt: The prompt to send to the API.
+ + Returns: + The response from the API. + """ + response = state.client.chat.completions.create( + messages=[ + { + "role": "user", + "content": f"{prompt}", + } + ], + model="gpt-3.5-turbo", + ) + return response.choices[0].message.content +``` + +# Step 5: Create a function to add the new messages to the conversation + +Create a function that gets triggered when the user sends a +message. This function will add the user's message to the context, +send it to the API, get the response, add the response to the +context and to the displayed conversation. + +```python +def send_message(state: State) -> None: + """ + Send the user's message to the API and update the conversation. + + Args: + - state: The current state of the app. + """ + # Add the user's message to the context + state.context += f"Human: \n {state.current_user_message}\n\n AI:" + # Send the user's message to the API and get the response + answer = request(state, state.context).replace("\n", "") + # Add the response to the context for future messages + state.context += answer + # Update the conversation + conv = state.conversation._dict.copy() + conv["Conversation"] += [state.current_user_message, answer] + state.conversation = conv + # Clear the input field + state.current_user_message = "" +``` + +# Step 6: Create the User Interface + +In Taipy, one way to define pages is to use Markdown strings. Here we use a +[table](../../../manuals/gui/viselements/table.md_template) to display the + `conversation` dictionary and an + [input](../../../manuals/gui/viselements/input) so that the + user can type their message. When the user presses enter, + the `send_message` function is triggered. 
+ +```python +page = """ +<|{conversation}|table|show_all|width=100%|> +<|{current_user_message}|input|label=Write your message here...|on_action=send_message|class_name=fullwidth|> +""" +``` + +# Step 7: Run the application + +Finally we run the application: + +```python +if __name__ == "__main__": + Gui(page).run(dark_mode=True, title="Taipy Chat") +``` + +And here is the result: + +

    + Render of the app +

+
+# Step 8: Styling
+
+The app currently uses Taipy's default stylekit. We are going to
+make some changes so that it looks more like a chat app.
+
+First, in a `main.css` file, create styles to display user and
+AI messages differently:
+
+```css
+.gpt_message td {
+    margin-left: 30px;
+    margin-bottom: 20px;
+    margin-top: 20px;
+    position: relative;
+    display: inline-block;
+    padding: 20px;
+    background-color: #ff462b;
+    border-radius: 20px;
+    max-width: 80%;
+    box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);
+    font-size: large;
+}
+
+.user_message td {
+    margin-right: 30px;
+    margin-bottom: 20px;
+    margin-top: 20px;
+    position: relative;
+    display: inline-block;
+    padding: 20px;
+    background-color: #140a1e;
+    border-radius: 20px;
+    max-width: 80%;
+    float: right;
+    box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);
+    font-size: large;
+}
+```
+
+We now need to tell Taipy to apply these styles to the rows of
+the table. We'll first create a function that returns the
+correct class name for each row:
+
+```python
+def style_conv(state: State, idx: int, row: int) -> str:
+    """
+    Apply a style to the conversation table depending on the message's author.
+
+    Args:
+        - state: The current state of the app.
+        - idx: The index of the message in the table.
+        - row: The row of the message in the table.
+
+    Returns:
+        The style to apply to the message.
+    """
+    if idx is None:
+        return None
+    elif idx % 2 == 0:
+        return "user_message"
+    else:
+        return "gpt_message"
+```
+
+We then apply this function to the table by adding the `style` property:
+
+```
+<|{conversation}|table|show_all|style=style_conv|>
+```
+
+And voilà:
+

    + The styled application +

+
+# Step 9: More features
+
+I have added notifications, a sidebar with a button to clear the conversation,
+and a history of previous conversations. I won't go into the
+details of how to do this here, but you can find the full code
+in the [GitHub repository](https://github.com/Avaiga/demo-llm-chat).
+
+# Step 10: Deploying the app to Taipy Cloud
+
+We are now going to deploy the app to Taipy Cloud so that it is
+accessible to anyone with a link.
+
+First, we need to store the API key in an environment variable.
+Replace the line that defines `client` in [Step 4](#step-4-create-a-function-to-generate-responses) with:
+
+```python
+import os
+client = openai.Client(api_key=os.environ["OPENAI_API_KEY"])
+```
+
+Now, instead of having our API key in the code, the app will look
+for it in the environment variables.
+
+We can now deploy the app to Taipy Cloud:
+1. Connect to [Taipy Cloud](https://cloud.taipy.io/) and sign in.
+2. Click on "Add Machine" and fill in the fields.
+3. Select the created machine and click on "Add app".
+4. Zip the `main.py`, `main.css` and `requirements.txt` files and upload the zip file to the "App files" field. Fill in the other fields.
+5. In the "Environment Variables" tab, create a new environment variable called `OPENAI_API_KEY` and paste your API key as the value, as in the image below.
+6. Press "Deploy app".
+

    + Environment Variables Tab +

    + + +After a while, your app should be running and will be accessible +from the displayed link! + +

    + Taipy Cloud Interface +

    + +

    + The final application +

    diff --git a/docs/knowledge_base/tutorials/index.md b/docs/knowledge_base/tutorials/index.md index 840f1335e..88c1461fe 100644 --- a/docs/knowledge_base/tutorials/index.md +++ b/docs/knowledge_base/tutorials/index.md @@ -134,6 +134,20 @@ Follow our tutorials and get the core concepts of Taipy. +
  • + +
    + +
    +
    +

    LLM ChatBot

    +

    + Create a chatbot interface using Taipy and an LLM API. +

    +
    +
    +
  • +
  • diff --git a/mkdocs.yml_template b/mkdocs.yml_template index 7aeb2eb6f..007fe2199 100644 --- a/mkdocs.yml_template +++ b/mkdocs.yml_template @@ -34,6 +34,7 @@ nav: - "3 - Scenario configuration": knowledge_base/tutorials/complete_application/step_03/step_03.md - "4 - Scenario page": knowledge_base/tutorials/complete_application/step_04/step_04.md - "5 - Performance page": knowledge_base/tutorials/complete_application/step_05/step_05.md + - "Creating an LLM ChatBot": knowledge_base/tutorials/chatbot/index.md - "Markdown Syntax": knowledge_base/tutorials/markdown_syntax.md - "Data Dashboard": knowledge_base/tutorials/data_dashboard.md - "Changing Line Types Using Charts": knowledge_base/tutorials/charts.md @@ -55,7 +56,9 @@ nav: - "COVID Dashboard": knowledge_base/demos/covid_dashboard.md - "Movie Genre Selector": knowledge_base/demos/movie_genre_selector.md - "Realtime Pollution Dashboard" : knowledge_base/demos/pollution_sensors.md - - "Background Remover" : knowledge_base/demos/background-remover.md + - "Background Remover" : knowledge_base/demos/background_remover.md + - "LLM ChatBot" : knowledge_base/demos/chatbot.md + - "Fraud Detection": knowledge_base/demos/fraud_detection.md - "Tips and tricks": - "Tips and tricks": knowledge_base/tips/index.md - "Scenarios": knowledge_base/tips/scenarios/index.md