Medicare Part D Drug Spending Pipeline
End‑to‑end pipeline: Bash → Python → SQL → PostgreSQL → Jupyter dashboard. Airflow orchestration is in progress; the focus is reproducible ingestion and transformation (a minimal ingestion sketch follows the tags below).
- Python
- SQL
- PostgreSQL
- Airflow (in progress)
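As a rough illustration only, here is what one ingestion step of a pipeline like this might look like: loading a raw Part D spending extract into a PostgreSQL staging table with pandas and SQLAlchemy. The file path, connection string, column handling, and table name are illustrative placeholders, not the project's actual values.

```python
import pandas as pd
from sqlalchemy import create_engine

# Illustrative names only: path, connection string, and table are placeholders.
CSV_PATH = "data/raw/medicare_part_d_spending.csv"
PG_URL = "postgresql+psycopg2://localhost:5432/medicare"

def load_drug_spending(csv_path: str = CSV_PATH) -> int:
    """Read a raw Part D spending extract, normalize column names,
    and append the rows to a staging table in PostgreSQL."""
    df = pd.read_csv(csv_path)
    # Normalize headers so downstream SQL transformations see stable names.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    engine = create_engine(PG_URL)
    df.to_sql("stg_drug_spending", engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    rows = load_drug_spending()
    print(f"Loaded {rows} rows into stg_drug_spending")
```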
Python Developer · Data Engineering & Automation · Airflow · SQL · ETL
I build reproducible data pipelines with Python and SQL, orchestrated with Apache Airflow. Recent work includes a Medicare Part D drug‑spending pipeline (Bash → Python → SQL → PostgreSQL → Jupyter dashboard) and a locum‑tenens expense & timesheet ETL. I value clean structure, automation, and measurable outcomes.
Earlier in my career I worked across healthcare and software systems; today I focus on practical data engineering: ingestion, transformation, and lightweight analytics that turn raw files and APIs into decision‑ready datasets.
ETL design, batch orchestration, reliability.
Workflow scripting & repeatable processes.
From raw data to decision support.
Automated ingestion and normalization of receipts and timesheets into a Postgres model, producing monthly reporting datasets and dashboards.
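A hedged sketch of what the monthly roll‑up step could look like, assuming hypothetical `assignment_id`, `work_date`, `hours`, `expense_date`, and `amount` columns (the real receipt and timesheet schemas differ): hours and expenses are aggregated per assignment and month before being written back to Postgres for reporting.

```python
import pandas as pd

# Column names below are assumptions for illustration, not the actual schema.
def monthly_summary(timesheets: pd.DataFrame, receipts: pd.DataFrame) -> pd.DataFrame:
    """Roll daily timesheet hours and receipt amounts up to a
    per-month reporting dataset keyed by assignment."""
    ts = timesheets.assign(month=pd.to_datetime(timesheets["work_date"]).dt.to_period("M"))
    rc = receipts.assign(month=pd.to_datetime(receipts["expense_date"]).dt.to_period("M"))
    hours = ts.groupby(["assignment_id", "month"])["hours"].sum()
    expenses = rc.groupby(["assignment_id", "month"])["amount"].sum()
    # Align the two aggregates on (assignment_id, month) and fill gaps with zero.
    return (
        pd.concat({"hours": hours, "expenses": expenses}, axis=1)
        .fillna(0)
        .reset_index()
    )
```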