Powered by Custom 140M Model + Groq API
Choose between two inference options based on your needs.
Recommended for general use. Retrieves RAG context from a corpus of 28,071 Q&A pairs.
Trained from scratch on OSU's HPC cluster using H100 GPUs; included as a portfolio demonstration piece.
Both models use the same RAG retrieval system. The custom model showcases end-to-end ML engineering skills, while Groq provides fast, high-quality responses for practical use.
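The shared-retrieval, pluggable-backend design described above can be sketched roughly as follows. This is an illustrative assumption, not the project's actual code: the class names (`Retriever`, `CustomModelBackend`, `GroqBackend`) and the toy word-overlap scoring are invented stand-ins, and the backends are stubs in place of the real 140M model and the Groq API.

```python
# Hypothetical sketch: one retriever feeds context to interchangeable
# generation backends. Names and scoring are illustrative assumptions.
from collections import Counter

class Retriever:
    """Tiny bag-of-words retriever over an in-memory Q&A corpus."""
    def __init__(self, docs):
        self.docs = docs
        self.doc_tokens = [Counter(d.lower().split()) for d in docs]

    def top_k(self, query, k=2):
        q = Counter(query.lower().split())
        # Score each doc by word overlap with the query (toy stand-in
        # for a real embedding-based similarity search).
        def score(tokens):
            return sum(min(q[w], tokens[w]) for w in q)
        ranked = sorted(range(len(self.docs)),
                        key=lambda i: score(self.doc_tokens[i]),
                        reverse=True)
        return [self.docs[i] for i in ranked[:k]]

class CustomModelBackend:
    """Stub standing in for local inference with the custom 140M model."""
    name = "custom-140m"
    def generate(self, prompt):
        return f"[{self.name}] answer grounded in: {prompt[:60]}"

class GroqBackend:
    """Stub standing in for a call to the Groq API."""
    name = "groq"
    def generate(self, prompt):
        return f"[{self.name}] answer grounded in: {prompt[:60]}"

def answer(query, retriever, backend):
    # Both backends receive the identical retrieved context, so switching
    # models never changes what the answer is grounded in.
    context = "\n".join(retriever.top_k(query))
    prompt = f"Context:\n{context}\n\nQuestion: {query}"
    return backend.generate(prompt)
```

Keeping retrieval separate from generation is what lets the two options share one knowledge base: the backend choice only affects speed and response quality, not the grounding context.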
A full-stack ML application showcasing end-to-end machine learning engineering, from custom model training to production deployment.