Production Inference Optimization Study
ONNX export, INT8 quantization, and adaptive batching applied to a production sentence-transformer. ~9× throughput improvement on CPU with 74% model size reduction and validated accuracy.
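The adaptive-batching idea behind the throughput gain can be sketched in a few lines of Python. The class name `AdaptiveBatcher` and the size/latency thresholds are illustrative, not the project's actual API:

```python
import time
from collections import deque

class AdaptiveBatcher:
    """Accumulate inference requests until either a size cap or a latency
    budget is hit, then flush them as one batch (illustrative sketch)."""

    def __init__(self, max_batch=32, max_wait_s=0.01):
        self.max_batch = max_batch
        self.max_wait_s = max_wait_s
        self.queue = deque()
        self.oldest = None  # arrival time of the oldest queued request

    def submit(self, request):
        if not self.queue:
            self.oldest = time.monotonic()
        self.queue.append(request)

    def maybe_flush(self):
        """Return a batch when the size cap or latency budget is reached,
        otherwise None (keep waiting for more requests)."""
        if not self.queue:
            return None
        full = len(self.queue) >= self.max_batch
        stale = (time.monotonic() - self.oldest) >= self.max_wait_s
        if full or stale:
            batch = list(self.queue)
            self.queue.clear()
            self.oldest = None
            return batch
        return None
```

The trade-off is the usual one: a larger `max_batch` amortizes per-call overhead and raises throughput, while `max_wait_s` caps the latency any single request can pay for batching.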
Machine Learning Engineer, HII Mission Technologies
Five years of professional experience across data engineering, software development, and ML engineering. Currently supporting the F-35 Joint Program Office's data modernization initiative at HII Mission Technologies on AWS GovCloud.
MS in Applied Mathematics (Towson University, 2021). Pursuing an MSE in AI Engineering at Johns Hopkins University (expected December 2028). The projects here are personal learning work and JHU coursework.
Currently exploring
Highlights from my ML/AI project portfolio — each structured as a case study with problem context, approach, and measurable outcomes.
A real-time anomaly detection pipeline over live ADS-B telemetry with no labeled training data. Fuses two commercial feeds, applies Kalman filtering to maintain state across sparse position updates, detects orbital and holding patterns with DBSCAN, and scores deviations with IsolationForest. Claude explains flagged aircraft in plain language. Deployed continuously on Fly.io with CI/CD.
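The state-maintenance step can be sketched as a minimal constant-velocity Kalman filter over 1-D positions. This is an illustrative reduction of the idea, not the project's tracker: the matrices, noise values, and function name are assumptions.

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=0.01, r=25.0):
    """Constant-velocity Kalman filter over 1-D positions (illustrative).
    Carries a [position, velocity] state between sparse measurements."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (constant velocity)
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                       # process noise
    R = np.array([[r]])                     # measurement noise
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2) * 100.0                   # large initial uncertainty
    estimates = []
    for z in measurements:
        # predict: propagate state and covariance forward one step
        x = F @ x
        P = F @ P @ F.T + Q
        # update: fold in the new position measurement
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

Because the filter predicts between observations, the track survives gaps in position reports, which is exactly what sparse ADS-B updates require.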
From-scratch implementations of batch gradient descent, SGD with momentum, and Adam in pure NumPy, plus a neural network with backpropagation trained on MNIST. Includes loss landscape visualization and initialization sensitivity experiments. 29 tests.
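The Adam loop at the core of that project fits in a few lines of NumPy. This is a minimal sketch with the standard bias-corrected moment estimates; the function name and hyperparameter defaults are illustrative:

```python
import numpy as np

def adam_minimize(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=500):
    """Minimal Adam: exponential moving averages of the gradient and its
    square, with bias correction, driving an adaptive step size."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g        # first moment (EMA of gradients)
        v = beta2 * v + (1 - beta2) * g * g    # second moment (EMA of squares)
        m_hat = m / (1 - beta1 ** t)           # bias correction for warm-up
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Usage: minimize f(x) = x^2, whose gradient is 2x.
x_min = adam_minimize(lambda x: 2 * x, np.array([5.0]))
```

The same update rule applies unchanged to every weight matrix in the MNIST network; only `grad_fn` differs.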
Systematic LoRA rank ablation on GPT-2 for dialogue summarization. Trains rank {2, 4, 8, 16} × alpha {8, 16, 32} adapters on SAMSum, evaluates with ROUGE and BERTScore, and shows diminishing returns beyond rank 8. CPU-only, fully reproducible.
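The quantity being ablated is small enough to sketch directly: LoRA leaves a frozen weight W intact and adds a rank-r update scaled by alpha/r. A pure-NumPy illustration (shapes and names are assumptions, not the project's code):

```python
import numpy as np

def lora_effective_weight(W, A, B, alpha):
    """Merge a LoRA adapter into a frozen weight:
    W_eff = W + (alpha / r) * B @ A,
    where A has shape (r, in_features) and B has shape (out_features, r)."""
    r = A.shape[0]
    return W + (alpha / r) * (B @ A)

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 8, 16
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((r, d_in))
B = np.zeros((d_out, r))  # B starts at zero, so the adapter is a no-op initially
W_eff = lora_effective_weight(W, A, B, alpha)
```

The ablation grid varies r and alpha: raising r lets the update `B @ A` have higher rank (more expressive, more trainable parameters), which is why returns diminish once r exceeds the task's effective rank.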