Technology

What is DistilBERT?

DistilBERT is a smaller, faster version of BERT (Bidirectional Encoder Representations from Transformers) that retains about 97% of BERT's language-understanding performance while being 40% smaller and 60% faster. We use it at Kairo to classify your tasks: when you type "finish AP Bio homework," DistilBERT understands the context and categorizes it as academic work, not just a bag of random words. This contextual understanding is what lets Kairo distinguish between "reading for fun" and "reading research papers" without you having to manually tag everything.
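
For a concrete sense of the mechanics, here's a minimal sketch of classifying a task description with a DistilBERT checkpoint through Hugging Face's pipeline API. We load a public sentiment checkpoint purely as a stand-in to show how inference works; Kairo's actual task-category model and label set aren't public.

    # A minimal sketch of inference with a fine-tuned DistilBERT classifier,
    # using the Hugging Face transformers library. The public sentiment
    # checkpoint below is a stand-in; a real deployment would swap in a
    # task-category checkpoint with its own labels.
    from transformers import pipeline

    classifier = pipeline(
        "text-classification",
        model="distilbert-base-uncased-finetuned-sst-2-english",
    )

    print(classifier("finish AP Bio homework"))
    # -> [{'label': ..., 'score': ...}]  (labels depend on the checkpoint)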

The magic happens through fine-tuning. We took the pre-trained DistilBERT model from Hugging Face and fine-tuned it on productivity-specific data: thousands of task descriptions paired with their categories and completion times. Now it can estimate whether your "client meeting prep" will take 30 minutes or 3 hours based on how similar tasks have gone in the past. It's like having an AI that actually gets what you mean, not just what you typed.
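
As an illustration of what that fine-tuning step looks like in code, here's a compressed sketch using the transformers Trainer. The three-example inline dataset, the label names, and the hyperparameters are all invented for the example; the real training data is the proprietary corpus of labeled task descriptions described above.

    # A compressed sketch of fine-tuning DistilBERT for task classification
    # with transformers and datasets. Dataset contents, labels, and
    # hyperparameters are illustrative only, not Kairo's production setup.
    from transformers import (
        AutoModelForSequenceClassification,
        AutoTokenizer,
        Trainer,
        TrainingArguments,
    )
    from datasets import Dataset

    labels = ["academic", "work", "personal"]  # hypothetical categories
    examples = Dataset.from_dict({
        "text": ["finish AP Bio homework", "client meeting prep", "grocery run"],
        "label": [0, 1, 2],
    })

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=len(labels)
    )

    def tokenize(batch):
        # Pad to a fixed length so the default collator can batch examples.
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=32)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="task-classifier", num_train_epochs=3),
        train_dataset=examples.map(tokenize, batched=True),
    )
    trainer.train()

Duration estimation follows the same recipe, except the model head is a single regression output (num_labels=1 with problem_type="regression") trained on completion times rather than category labels.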