SENTINEL

The problem
UTSA's seismic research team was stuck in a manual workflow. They had access to live earthquake data from global sensor networks — feeds updating every few seconds — but their process was to export a CSV, load it into desktop mapping software, and repeat the whole thing by hand whenever the data changed. It was slow, it didn't scale, and it meant they were always looking at stale data.
What I built
A real-time monitoring dashboard that replaced the entire manual workflow. Earthquake events from the USGS feed appear on a 3D globe as color-coded markers scaled by magnitude, alongside a live incident feed, and researchers can monitor multiple regions simultaneously. The whole pipeline — from data arriving to pixels on screen — takes under a second.
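The write-up doesn't show the actual styling code, but the marker logic can be sketched as a pair of pure functions over USGS GeoJSON features (USGS publishes earthquake feeds in GeoJSON, with magnitude in `properties.mag`). The thresholds, colors, and radius scale below are illustrative assumptions, not the project's actual values:

```typescript
// Shape of one feature from a USGS earthquake GeoJSON feed
// (e.g. the public all_hour.geojson summary feed).
interface QuakeFeature {
  properties: { mag: number; place: string; time: number };
  geometry: { type: "Point"; coordinates: [number, number, number] };
}

// Map magnitude to a marker color (illustrative thresholds).
function magnitudeColor(mag: number): string {
  if (mag >= 6) return "#d73027";   // red: major
  if (mag >= 4.5) return "#fc8d59"; // orange: moderate
  if (mag >= 2.5) return "#fee08b"; // yellow: light
  return "#91cf60";                 // green: minor
}

// Scale marker radius with magnitude. Earthquake energy grows
// exponentially with magnitude, so a nonlinear scale keeps small
// events visible without letting large ones swallow the globe.
function magnitudeRadius(mag: number): number {
  return Math.max(4, Math.pow(2, mag) / 4);
}
```

A renderer would call these per feature when turning the feed into marker layers; the floor of 4 pixels keeps minor events clickable.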
What made it hard
The real challenge wasn't displaying data — it was making it reliable. When you have 20 researchers connected at once and someone's laptop goes to sleep, you can't just drop their messages. I built a connection manager that tracks who's connected, queues messages during disconnections, and delivers them when the client comes back. Result: zero message loss in production.
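The per-client queueing idea can be sketched independently of any particular WebSocket library: each client keeps a buffer that fills while it is disconnected and drains on reconnect. This is a minimal sketch under that assumption; the class and method names are hypothetical, not taken from the project:

```typescript
type Send = (msg: string) => void;

// Tracks connected clients and queues messages for ones that
// are temporarily away (sleeping laptop, flaky Wi-Fi).
class ConnectionManager {
  private clients = new Map<string, { send: Send | null; queue: string[] }>();

  connect(id: string, send: Send): void {
    const c = this.clients.get(id) ?? { send: null, queue: [] };
    c.send = send;
    // Flush anything that accumulated while the client was away.
    for (const msg of c.queue.splice(0)) send(msg);
    this.clients.set(id, c);
  }

  disconnect(id: string): void {
    const c = this.clients.get(id);
    if (c) c.send = null; // keep the entry so messages queue up
  }

  broadcast(msg: string): void {
    for (const c of this.clients.values()) {
      if (c.send) c.send(msg);
      else c.queue.push(msg);
    }
  }
}
```

In production you would also cap the queue length or expire long-dead clients, or the buffers grow without bound; the sketch omits that for brevity.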
Performance
Rendering thousands of earthquake markers on a 3D globe gets slow fast. The naive approach — re-rendering everything when new data arrives — dropped frame rates noticeably. The fix was to update a single data source in place and let the map engine diff the changes. Result: a stable 60fps even with dense earthquake clusters.
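The pattern looks roughly like this, assuming a Mapbox-GL-style API where a GeoJSON source is refreshed via a `setData()` call (the write-up doesn't name the map engine, so `MapSource` here is a stand-in interface, not a real library type):

```typescript
interface Feature { id: string; geometry: object; properties: object }
interface FeatureCollection { type: "FeatureCollection"; features: Feature[] }

// Stand-in for a map engine's GeoJSON source; the real engine diffs
// the new collection against what is already on the GPU.
interface MapSource { setData(data: FeatureCollection): void }

class QuakeLayer {
  private data: FeatureCollection = { type: "FeatureCollection", features: [] };
  constructor(private source: MapSource) {}

  // Append new events and push the whole collection once per batch,
  // instead of tearing down and rebuilding layers on every message.
  addEvents(events: Feature[]): void {
    this.data.features.push(...events);
    this.source.setData(this.data); // single in-place update
  }
}
```

The key design choice is that there is exactly one long-lived source and one `setData` call per batch of events; layers, styles, and GPU buffers never get recreated, so the per-frame cost stays flat as the dataset grows.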
Result
Deployed and in daily use by the UTSA research team. Replaced their manual CSV-and-desktop-GIS workflow entirely. Built solo in three weeks.