Nick LaHaye is a research data scientist who specializes in developing deep learning applications for Earth remote sensing, aiming to support application-driven research for researchers and decision-makers as well as operational algorithms for current and future remote sensing campaigns and missions. Nick's current research focuses on leveraging self-supervised deep learning, open-ended algorithms, and new techniques in representation learning and analysis, alongside multiple observing modalities and multi-sensor datasets, to segment, represent, track, and further analyze (via agent/SME co-exploration) object instances (wildfire fronts, aerosol plumes, algal blooms, palm oil farms, etc.) in low- and no-label environments.
Other current and recent projects include animal movement forecasting and uncertainty quantification using multi-sensor surface parameterizations and animal tracking data; retrieval techniques and uncertainty estimation for 3D cloud tomography from multi-angle imagers; support for hydrological retrievals and validation techniques for multi-mission microwave radiometers (Sentinel-6, SWOT, and CRISTAL); water quality parameter retrievals (inverse problems) and uncertainty estimation for hyperspectral instruments; and acceleration of uncertainty quantification for the CARbon DAta MOdel fraMework (CARDAMOM).
With over a decade of experience at the Jet Propulsion Laboratory, including operational software development, geometric calibration analysis for remote sensing instruments, and deep learning applications for Earth remote sensing, Nick is passionate about applying this multidisciplinary expertise to help tackle challenges like natural hazards, biodiversity loss, and climate change.
Nick holds a PhD in Computational and Data Sciences from Chapman University, an MS in Computer Science from USC, and a BS in Computer Science and Mathematics from Chapman University.