A flexible, adaptive text classification system that supports dynamic addition of new classes and continuous learning from examples. Built on top of Hugging Face Transformers, this library provides an easy-to-use interface for creating and updating text classifiers.
- 🚀 Works with any transformer classifier model
- 📈 Continuous learning capabilities
- 🎯 Dynamic class addition
- 💾 Safe and efficient state persistence
- 🔄 Prototype-based learning
- 🧠 Neural adaptation layer
```bash
pip install adaptive-classifier
```
```python
from adaptive_classifier import AdaptiveClassifier

# Initialize with any HuggingFace model
classifier = AdaptiveClassifier("bert-base-uncased")

# Add some examples
texts = [
    "The product works great!",
    "Terrible experience",
    "Neutral about this purchase"
]
labels = ["positive", "negative", "neutral"]
classifier.add_examples(texts, labels)

# Make predictions
predictions = classifier.predict("This is amazing!")
print(predictions)  # [('positive', 0.85), ('neutral', 0.12), ('negative', 0.03)]

# Save the classifier
classifier.save("./my_classifier")

# Load it later
loaded_classifier = AdaptiveClassifier.load("./my_classifier")
```
The library is also integrated with the Hugging Face Hub, so you can push models to and load them from the Hub:
```python
# Save to Hub
classifier.push_to_hub("adaptive-classifier/model-name")

# Load from Hub
classifier = AdaptiveClassifier.from_pretrained("adaptive-classifier/model-name")
```
```python
# Add a completely new class
new_texts = [
    "Error code 404 appeared",
    "System crashed after update"
]
new_labels = ["technical"] * 2
classifier.add_examples(new_texts, new_labels)

# Add more examples to existing classes
more_examples = [
    "Best purchase ever!",
    "Highly recommend this"
]
more_labels = ["positive"] * 2
classifier.add_examples(more_examples, more_labels)
```
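Once a new class has examples, it is considered in subsequent predictions. Continuing the example above (exact scores will vary):

```python
# The newly added 'technical' class now competes with the existing labels
predictions = classifier.predict("The app crashes with a stack trace on startup")
print(predictions)
```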
The system combines three key components:

- Transformer Embeddings: Uses state-of-the-art language models for text representation
- Prototype Memory: Maintains class prototypes for quick adaptation to new examples (illustrated in the sketch below)
- Adaptive Neural Layer: Learns refined decision boundaries through continuous training
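The sketch below illustrates the general idea behind the prototype component: each class keeps a prototype (here, the mean of its example embeddings), and a query is scored by its similarity to every prototype. This is a simplified illustration under assumed details (the model choice, the mean-pooling `embed` helper, and cosine scoring), not the library's internal implementation, which additionally trains the adaptive neural layer on top of the prototypes.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Simplified prototype-based classification sketch (not the library's actual internals)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    # Mean-pool the last hidden state to get one vector per text
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # (batch, seq, 1)
    return (hidden * mask).sum(1) / mask.sum(1)        # (batch, dim)

# Prototype memory: one mean embedding per class
examples = {
    "positive": ["The product works great!", "Best purchase ever!"],
    "negative": ["Terrible experience", "Would not buy again"],
}
prototypes = {label: embed(texts).mean(0) for label, texts in examples.items()}

def predict(text):
    # Score the query against every class prototype by cosine similarity
    query = embed([text])[0]
    scores = {label: torch.cosine_similarity(query, proto, dim=0).item()
              for label, proto in prototypes.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(predict("This is amazing!"))
```

Adding a new class in this scheme only requires computing one more prototype, which is what makes dynamic class addition cheap; the neural adaptation layer then refines the decision boundaries as more examples arrive.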
- Python ≥ 3.8
- PyTorch ≥ 2.0
- transformers ≥ 4.30.0
- safetensors ≥ 0.3.1
- faiss-cpu ≥ 1.7.4 (or faiss-gpu for GPU support)
We evaluated the effectiveness of adaptive classification in optimizing LLM routing decisions. Using the arena-hard-auto-v0.1 dataset (500 queries), we compared routing performance with and without adaptation while holding the overall success rate constant.
Metric | Without Adaptation | With Adaptation | Impact |
---|---|---|---|
High Model Routes | 113 (22.6%) | 98 (19.6%) | 0.87x |
Low Model Routes | 387 (77.4%) | 402 (80.4%) | 1.04x |
High Model Success Rate | 40.71% | 29.59% | 0.73x |
Low Model Success Rate | 16.54% | 20.15% | 1.22x |
Overall Success Rate | 22.00% | 22.00% | 1.00x |
Cost Savings* | 25.60% | 32.40% | 1.27x |
*Cost savings calculation assumes the high-cost model is 2x the cost of the low-cost model
The results highlight several key benefits of adaptive classification:

- Improved Cost Efficiency: While maintaining the same overall success rate (22%), the adaptive classifier achieved 32.40% cost savings versus 25.60% without adaptation, a 1.27x relative improvement in cost efficiency.
- Better Resource Utilization: The adaptive system routed more queries to the low-cost model (402 vs. 387) while reducing high-cost model usage (98 vs. 113), demonstrating better resource allocation.
- Learning from Experience: Through adaptation, the system improved the success rate of low-model routes from 16.54% to 20.15% (a 1.22x increase), showing effective learning from successful cases.
- ROI on Adaptation: The system adapted to 110 new examples during evaluation, yielding a 6.80 percentage-point improvement in cost savings while maintaining quality, a significant return on the adaptation investment.
This real-world evaluation demonstrates that adaptive classification can significantly improve cost efficiency in LLM routing without compromising overall performance.
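As a concrete illustration of this setup, the sketch below shows one way an adaptive classifier could drive a two-tier router: the classifier predicts a route label for each query, and successful outcomes are fed back as new examples. The "HIGH"/"LOW" labels, the seed examples, and the `call_llm` helper are illustrative assumptions, not the exact evaluation harness used above.

```python
from adaptive_classifier import AdaptiveClassifier

# Hypothetical two-tier router sketch built on the adaptive classifier
router = AdaptiveClassifier("bert-base-uncased")
router.add_examples(
    ["Prove that the sum of two even numbers is even",
     "What time is it in Tokyo?"],
    ["HIGH", "LOW"],
)

def call_llm(query, tier):
    # Placeholder: swap in real calls to the high- or low-cost model
    return f"[{tier}-tier answer to: {query}]"

def route(query):
    # Send the query to whichever tier the classifier ranks highest
    label, score = router.predict(query)[0]
    return label, call_llm(query, tier=label)

def record_feedback(query, label, succeeded):
    # Feed successful routing decisions back as new training examples,
    # so the router keeps adapting while it operates
    if succeeded:
        router.add_examples([query], [label])
```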
- RouteLLM: Learning to Route LLMs with Preference Data
- Transformer^2: Self-adaptive LLMs
- Lamini Classifier Agent Toolkit
- Protoformer: Embedding Prototypes for Transformers
- Overcoming catastrophic forgetting in neural networks
If you use this library in your research, please cite:
```bibtex
@software{adaptive_classifier,
  title = {Adaptive Classifier: Dynamic Text Classification with Continuous Learning},
  author = {Asankhaya Sharma},
  year = {2025},
  publisher = {GitHub},
  url = {https://github.com/codelion/adaptive-classifier}
}
```