A package for easily creating GradioChat apps, each with its own context, system message, and Gradio theme.
Features
- Easy-to-use interface
- Supports integration with various AI models
- Real-time chat capabilities
- Open-source and customizable
Documentation
Documentation is hosted on this GitHub repository’s pages. Package-manager-specific installation guidelines can be found on PyPI.
How to use
This comprehensive guide explains how to use the GradioChat package to create customizable LLM-powered chat applications with Gradio. GradioChat provides a simple yet powerful framework for building chat interfaces that can connect to various language models.
Installation
Install the package using pip or uv.
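Assuming the package is published on PyPI under the name `gradiochat` (an assumption based on the import paths used throughout this guide), installation would look something like:

```shell
# with pip
pip install gradiochat

# or with uv, added as a project dependency
uv add gradiochat
```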
```python
# Eval is false to prevent testing when nbdev_test or nbdev_prepare is run.
# The api_key is stored in a .env file and that is not accessible at test time.
from gradiochat.config import ModelConfig, ChatAppConfig
from gradiochat.ui import create_chat_app
from pathlib import Path

# Create model configuration
model_config = ModelConfig(
    model_name="mistralai/Mistral-7B-Instruct-v0.2",
    provider="huggingface",
    api_key_env_var="HF_API_KEY"  # Optional: Set in .env file or environment
)

# Create chat application configuration
config = ChatAppConfig(
    app_name="My Chat App",
    description="A simple chat application powered by Mistral",
    system_prompt="You are a helpful assistant.",
    model=model_config
)

# Create and launch the chat application
app = create_chat_app(config)
app.build_interface().launch()
```
```
/home/jelle/code/gradiochat/src/gradiochat/ui.py:89: UserWarning: You have not specified a value for the `type` parameter. Defaulting to the 'tuples' format for chatbot messages, but this is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style dictionaries with 'role' and 'content' keys.
  chatbot = gr.Chatbot(
* Running on local URL: http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
```
Configuration
The core of GradioChat is its configuration system, which uses Pydantic for validation.
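To illustrate what that validation buys you, here is a plain-Python sketch of the kind of checks Pydantic performs at construction time. This is not GradioChat’s actual code; the field names merely mirror `ModelConfig`, and the concrete rules (allowed providers, temperature range) are assumptions for illustration.

```python
# Plain-Python sketch of Pydantic-style validation (NOT GradioChat's real code).
# Invalid values fail loudly when the config is built, not later at request time.
class ModelConfigSketch:
    def __init__(self, model_name, provider, temperature=0.7):
        if not isinstance(model_name, str) or not model_name:
            raise ValueError("model_name must be a non-empty string")
        if provider not in {"huggingface"}:
            raise ValueError(f"unknown provider: {provider!r}")
        if not 0.0 <= temperature <= 2.0:
            raise ValueError("temperature must be between 0.0 and 2.0")
        self.model_name = model_name
        self.provider = provider
        self.temperature = temperature

# A valid config constructs fine...
ok = ModelConfigSketch("mistralai/Mistral-7B-Instruct-v0.2", "huggingface")

# ...while an out-of-range temperature raises immediately:
try:
    ModelConfigSketch("some-model", "huggingface", temperature=5.0)
    raised = False
except ValueError:
    raised = True
```

With real Pydantic models you get the same fail-fast behavior plus typed error messages, without writing the checks by hand.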
ModelConfig
The ModelConfig class defines how to connect to a language model:
```python
from gradiochat.config import ModelConfig

# HuggingFace model
hf_model = ModelConfig(
    model_name="mistralai/Mistral-7B-Instruct-v0.2",
    provider="huggingface",
    api_key_env_var="HF_API_KEY",  # Will read from environment variable
    api_base_url=None,             # Optional: Custom API endpoint
    max_completion_tokens=1024,
    temperature=0.7
)
```
Message
The Message class represents a single message in a conversation:
```python
from gradiochat.config import Message

# Create a system message
system_msg = Message(
    role="system",
    content="You are a helpful assistant."
)

# Create a user message
user_msg = Message(
    role="user",
    content="Hello, can you help me with Python?"
)

# Create an assistant message
assistant_msg = Message(
    role="assistant",
    content="Of course! I'd be happy to help with Python. What would you like to know?"
)
```
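Newer Gradio versions expect chat history as openai-style dictionaries with `role` and `content` keys. A conversion from message objects is a one-liner; the sketch below uses a dataclass stand-in for `Message` and a hypothetical helper `to_openai_dicts` that is not part of GradioChat’s API.

```python
from dataclasses import dataclass

# Stand-in for gradiochat.config.Message, just for this sketch
@dataclass
class Message:
    role: str
    content: str

def to_openai_dicts(messages):
    """Hypothetical helper: convert Message objects to openai-style dicts."""
    return [{"role": m.role, "content": m.content} for m in messages]

history = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="Hello!"),
]
payload = to_openai_dicts(history)
```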
ChatAppConfig
The ChatAppConfig class is the main configuration for your chat application:
```python
from gradiochat.config import ChatAppConfig, ModelConfig
from pathlib import Path

# Create model configuration
model_config = ModelConfig(
    model_name="mistralai/Mistral-7B-Instruct-v0.2",
    provider="huggingface",
    api_key_env_var="HF_API_KEY"
)

# Create chat application configuration
config = ChatAppConfig(
    app_name="Python Helper",
    description="Get help with Python programming",
    system_prompt="You are a Python expert who helps users with programming questions.",
    starter_prompt="Hello! I'm your Python assistant. Ask me any Python-related question.",
    context_files=[Path("docs/python_tips.md")],  # Optional: Add context from files
    model=model_config,
    theme=None,                         # Optional: Custom Gradio theme
    logo_path=Path("assets/logo.png"),  # Optional: Path to logo image
    show_system_prompt=True,            # Whether to show system prompt in UI
    show_context=True                   # Whether to show context in UI
)
```
Creating a Chat Application
Using Environment Variables
For API keys, it’s recommended to use environment variables. Create a `.env` file in your project root:
```
HF_API_KEY=your_huggingface_api_key_here
```
Then load it in your application:
```python
from dotenv import load_dotenv

load_dotenv()  # Load environment variables from .env file

# Now create your ModelConfig with api_key_env_var
```
Adding Context Files
You can provide additional context to your LLM by adding markdown files:
```python
from pathlib import Path

config = ChatAppConfig(
    # ... other parameters
    context_files=[
        Path("docs/product_info.md"),
        Path("docs/faq.md")
    ],
    # ... other parameters
)
```
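Conceptually, context files are read and folded into the prompt the model sees. The sketch below shows one plausible way that could work; `build_system_message` is a hypothetical helper for illustration, not GradioChat’s actual implementation.

```python
from pathlib import Path
import tempfile

def build_system_message(system_prompt, context_files):
    """Hypothetical sketch: append each context file to the system prompt."""
    parts = [system_prompt]
    for path in context_files:
        parts.append(f"--- Context from {path.name} ---\n{path.read_text()}")
    return "\n\n".join(parts)

# Demonstrate with a temporary markdown file
with tempfile.TemporaryDirectory() as d:
    faq = Path(d) / "faq.md"
    faq.write_text("# FAQ\nQ: What is this? A: A demo.")
    combined = build_system_message("You are a helpful assistant.", [faq])
```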
Customization
Custom Themes
You can customize the appearance of your chat application using Gradio themes. You can build a theme yourself with help from the Gradio theme builder, or use one of the predefined themes in gradio_themes. The predefined themes are listed below.
- themeWDODelta
```python
import gradio as gr

my_theme = gr.themes.Base(
    primary_hue="fuchsia",
)

# Use the theme in your config
config = ChatAppConfig(
    # ... other parameters
    theme=my_theme,
    # ... other parameters
)
```
API Reference
BaseChatApp
The BaseChatApp class provides the core functionality for chat applications:
```python
from gradiochat.app import BaseChatApp
from gradiochat.config import ChatAppConfig

# Create configuration
config = ChatAppConfig(...)

# Create base app
base_app = BaseChatApp(config)

# Generate a response
response = base_app.generate_response("What is Python?")

# Generate a streaming response
# IMPORTANT: I don't actually think this already works. To be continued.
for chunk in base_app.generate_stream("Tell me about Python"):
    print(chunk, end="", flush=True)
```
GradioChat
The GradioChat class provides the Gradio UI for the chat application:
```python
from gradiochat.ui import GradioChat
from gradiochat.app import BaseChatApp

# Create base app
base_app = BaseChatApp(config)

# Create Gradio interface
gradio_app = GradioChat(base_app)

# Build and launch the interface
interface = gradio_app.build_interface()
interface.launch()
```
LLM Clients
The package currently supports HuggingFace models through the HuggingFaceClient class:
```python
from gradiochat.app import HuggingFaceClient
from gradiochat.config import ModelConfig, Message

# Create model config
model_config = ModelConfig(
    model_name="mistralai/Mistral-7B-Instruct-v0.2",
    provider="huggingface",
    api_key_env_var="HF_API_KEY"
)

# Create client
client = HuggingFaceClient(model_config)

# Generate a completion
messages = [
    Message(role="system", content="You are a helpful assistant."),
    Message(role="user", content="What is Python?")
]
response = client.chat_completion(messages)
```
Complete Example
Here’s a complete example that demonstrates most features:
```python
import gradio as gr
from gradiochat.config import ModelConfig, ChatAppConfig
from gradiochat.gradio_themes import themeWDODelta
from pathlib import Path
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Create a custom theme
theme = themeWDODelta

# Create model configuration
model_config = ModelConfig(
    model_name="mistralai/Mistral-7B-Instruct-v0.2",
    provider="huggingface",
    api_key_env_var="HF_API_KEY",
    max_completion_tokens=2048,
    temperature=0.8
)

# Create chat application configuration
config = ChatAppConfig(
    app_name="Python Expert",
    description="Get expert help with Python programming",
    system_prompt="You are a Python expert who helps users with programming questions. Provide clear, concise, and accurate information.",
    starter_prompt="Hello! I'm your Python assistant. How can I help you today?",
    context_files=[Path("docs/python_reference.md")],
    model=model_config,
    theme=theme,
    logo_path=Path("assets/python_logo.png"),
    show_system_prompt=True,
    show_context=True
)

# Create and launch the chat application
from gradiochat.ui import create_chat_app

app = create_chat_app(config)
app.build_interface().launch(share=True)
```
Officially, nbdev doesn’t support uv. It works with conda or pip, preferably with a single environment for all your Python projects, if I understand correctly. nbdev then handles the dependencies via settings.ini, setup.py, and requirements.txt. But don’t take my word for it; dive into the actual documentation.
I stumbled into some quirks trying to combine nbdev and uv. Most of those probably result from nbdev needing settings.ini and/or setup.py, while uv uses pyproject.toml. All of this is to say: you might run into issues with this package because I wanted to do something that isn’t officially supported.
While developing I learned a few things that seemed to work after some struggles.
Make sure to use Python 3.10 if you want to use Gradio. This is strange: with uv I can use 3.11 and Gradio just fine, but the moment I push to GitHub the nbdev CI action fails, stating that my app can support at most Python 3.10.16. I think that is because GitHub’s CI uses pip rather than uv.
I also think you need Python 3.10 if you want to deploy to Hugging Face Spaces. Lower is possible; higher is not yet supported by Spaces.
I thought we needed Python 3.11 to be able to combine nbdev with uv, but I haven’t noticed any issues yet.
Install gradiochat in Development mode
```sh
# make sure gradiochat package is installed in development mode
$ pip install -e .

# make changes under nbs/ directory
# ...

# compile to have changes apply to gradiochat
$ nbdev_prepare
```