
My Story 

    Around 2015, I was reading a World Bank report on microfinance in African agriculture when things changed. I came across a blog post about how Instagram used PostgreSQL with Redis to serve user timelines, geo-location features, and many other services. I was immediately fascinated and wanted to learn more. By 2018, I was entrenched in cloud system design, with deep fluency in Apache technologies including Spark, Kafka, Flink, and Cassandra (alongside Redis), and in how they all ran on Apache Mesos, a technology inspired by Google's Borg system. I went to MesosCon in LA to learn from the Twitter and Uber teams that were building scalable, real-time ETL pipelines and applications on Mesos and the Apache ecosystem. I was putting in seven days a week riding the learning curve, learning from the very best and sharpening my skills across the stack, from backend to front-end design.

    If you were paying attention in 2017-2018, as I surely was, you noticed when the "Attention Is All You Need" and BERT papers came out and began earning well-deserved praise. Most of that praise centered on natural language processing, but you knew something deeper had changed with those papers. I began talking to people at MIT CSAIL about the Transformer architecture built around the attention mechanism. By the end of 2019, drawing on my passion for Africa, I hired two engineers from Amsterdam and was first to train Kikuyu and Luganda LLMs for their 15 million speakers. The goal was a chatbot for African farmers, and we got pretty far. By then, Google's BERT was gaining momentum, and I learned about FAISS and Weaviate, two early tools for storing and searching embeddings for "neural vector search," which would become RAG years later. I quickly found deepset, a Berlin-based startup that was the first to do neural vector search/RAG with technology they built themselves. I reached out, presented my knowledge, and was hired as their first American employee to lead their US efforts. I spent some great time in Prague and Berlin learning from some of the best engineers in what would become AI.

    In the four years since, I've kept riding AI's steep learning curve and become highly fluent across the entire ecosystem. This year, I designed and launched my first reinforcement learning pipeline, with successful training runs, using the veRL framework with vLLM. In 2026, I've moved into the Transformer and convolutional multi-hybrid models used in science, like Evo2 from Stanford. I've built an advanced compound AI system for synthetic biology to produce mRNA vaccines for oncology using Evo2, designed AI systems around biological foundation models, and in my spare time I'm building OmicsFlow, an AI platform for biological scientific discovery using reinforcement learning. Along the way, my engineers have achieved deterministic inference using batch-invariant ops, inspired by the work of Mira Murati's Thinking Machines (formerly of OpenAI). It's been quite a journey so far, and I'm just getting started building AI systems with Transformer and neural network architectures across text, vision, audio, and now biology.

 

*I currently have 5+ years in sales across multi-cloud data infrastructure and AI platforms. I'm also certified in AI product management and am looking for an AI PM role in standard SaaS or biology.

  • Medium
  • LinkedIn