Technologies: AWS, Pulumi, serverless, React, LLMs
My Protege was an AI platform that I built and ran from 2021 to 2024. It let experts in any field train a chatbot through a normal conversation, with no document ingestion required.
I built all of the tech: the infrastructure, the backend, the multiple frontends, the internal libraries for the proprietary knowledge-graph database, and the fast CI/CD deployment system.
Ultimately the project did not grow into a sustainable business, but it was an invaluable learning experience in managing all aspects of the tech.

Infrastructure
Infrastructure gets complicated quickly, is expensive to lose, and is error-prone to rebuild. Starting from a solid infrastructure foundation lets an engineer build quickly and confidently.
I managed My Protege’s infrastructure as code (IaC) with Pulumi. Coming from a heavy Terraform background, I used this greenfield project as an opportunity to learn Pulumi’s strengths and weaknesses relative to the tools I already knew.
I designed the infrastructure along two dimensions: environment (e.g., prod/qa) and region (e.g., us-west-2). Although I only ever deployed to a single region, baking region in early was a low-effort, high-payoff decision in case I ever needed to change regions or go multi-region.
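To make the two-dimensional layout concrete, here is a minimal sketch of how it can look with Pulumi's Python SDK, where each stack carries its environment and region as config; the stack names, config keys, and resources are illustrative assumptions, not My Protege's actual code.

```python
import pulumi
import pulumi_aws as aws

# Each stack (e.g. "prod.us-west-2") supplies its own environment and region.
config = pulumi.Config()
environment = config.require("environment")  # e.g. "prod" or "qa"
region = config.require("region")            # e.g. "us-west-2"

# An explicit provider pins the stack to its region, so adding a region later
# means adding a stack rather than rewriting resource definitions.
provider = aws.Provider(f"aws-{region}", region=region)

# Every resource name and tag carries both dimensions.
data_bucket = aws.s3.Bucket(
    f"myprotege-{environment}-{region}-data",
    tags={"environment": environment, "region": region},
    opts=pulumi.ResourceOptions(provider=provider),
)
```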
I leveraged safe practices like using an IAM-secured, versioned S3 bucket for the Pulumi state backend, ensuring that any changes could only be performed by authorized users and roles, and could be rolled back if necessary.
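As a rough illustration of that setup, a small bootstrap stack for the state backend might look like the sketch below; the bucket name, account ID, and role are placeholders, and in practice the state bucket is created before any of the stacks that store their state in it.

```python
import json
import pulumi_aws as aws

# Versioning lets a bad state push be rolled back to a previous object version.
state_bucket = aws.s3.Bucket(
    "pulumi-state",
    bucket="myprotege-pulumi-state",  # placeholder name
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# A bucket policy denies state writes to everything except an authorized
# deployer role (placeholder account/role ARN).
aws.s3.BucketPolicy(
    "pulumi-state-policy",
    bucket=state_bucket.id,
    policy=state_bucket.arn.apply(lambda arn: json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DeployerOnlyWrites",
            "Effect": "Deny",
            "NotPrincipal": {"AWS": "arn:aws:iam::123456789012:role/deployer"},
            "Action": ["s3:PutObject", "s3:DeleteObject"],
            "Resource": f"{arn}/*",
        }],
    })),
)
```

The Pulumi CLI is then pointed at the bucket with `pulumi login s3://myprotege-pulumi-state`, after which stack state and its history live in the versioned bucket.
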
Frontends
My Protege has two frontends: one that lets the expert train the AI, and another that lets users chat with it.
API Backend
My Protege’s API backend is a Python-based Docker image, hosted on AWS Lambda (serverless) for cost-effective autoscaling. It has