James Darpinian
Graphics & Computer Vision Engineer
Summary
I have built deep neural networks for eye tracking, VR rendering for Google Earth, and 3DOF head tracking for the Google VR SDK. I enjoy taking cutting-edge technology from concept to launch.
Experience
- See A Satellite Tonight 9/2019 — Present
- https://james.darpinian.com/satellites shows you where to see satellites in the night sky as they pass over your house. A full-stack showcase of web technologies including WebGL, Google Street View, WASM, Service Workers, Web Push, Geolocation, and much more.
- Meta 7/2021 — 4/2022
- Holograms for "Project Nazare" augmented reality glasses.
- Google 10/2016 — 5/2021
- Committer to both the Chromium and WebKit open source projects. Defined and implemented web graphics standards (WebGL, WebGPU) in collaboration with Mozilla, Microsoft, and Apple. Implemented the majority of the WebGL 2 API surface in WebKit/Safari.
- Eyefluence 1/2015 — 10/2016 (acquired by Google)
- I created Eyefluence's machine-learning-based eye tracking technology, putting the latest deep learning research into production. I contributed to Eyefluence's entire VR stack, including: UI/UX in Unity/C#, a Unity plugin for eye tracking data in C, GPU-accelerated neural nets in Python, and MIPI CSI Linux kernel drivers setting camera registers over I²C.
- The system I built received rave reviews from TechCrunch, CNET, PC World, USA Today, and more.
- Google 10/2010 — 12/2014
- I conceived and created the first version of Google Earth VR. I implemented the 3DOF head tracker used in Google Cardboard and the Google VR SDK. I shipped WebGL rendering for Google Maps, one of the first and widest deployments of WebGL in the world.
- Microsoft 4/2008 — 9/2010