Microsoft Planetary Computer as sustainability tool
Growing up on a farm opened my eyes early on to how amazing and fragile nature is, and how we as humans need to live in harmony with it. Living in the Netherlands for the last couple of years has shown me impressive ways that farmers are pushing the boundaries of what is possible in "smart", technology-driven agriculture. With that innovation comes the responsibility to hold ourselves accountable for our impact on nature. I believe that is where the Microsoft Planetary Computer can play a role as a real-time "planetary nervous system".
What is the Microsoft Planetary Computer?
The Microsoft Planetary Computer is a cloud-based geospatial analytics platform designed to aggregate, process, and democratize access to petabytes of environmental data for sustainability research and decision-making. At its core, the platform integrates multi-source datasets, ranging from satellite imagery (e.g., Sentinel-2, Landsat) and climate models (e.g., NASA NEX-GDDP-CMIP6) to biodiversity records (e.g., GBIF), into a unified catalog optimized for large-scale analysis. By leveraging Azure’s distributed computing infrastructure, the Planetary Computer enables researchers to perform tasks like land-cover classification, carbon stock estimation, and disaster modeling at planetary scales, often reducing processing times from weeks to hours. A key innovation lies in its implementation of the SpatioTemporal Asset Catalog (STAC) API, which standardizes metadata discovery and allows users to query datasets by temporal, spatial, and spectral parameters. Partnerships with organizations like Esri and the Chesapeake Conservancy further extend its utility, enabling applications in precision agriculture, watershed management, and biodiversity conservation.
How to Get Started with Microsoft Planetary Computer
Accessing the STAC API and Core Datasets
Researchers initiate work on the Planetary Computer by interacting with its STAC API endpoint (https://planetarycomputer.microsoft.com/api/stac/v1), which serves as the gateway to discover and retrieve analysis-ready datasets. For Python users, the planetary-computer SDK simplifies authentication and data access through built-in functions. A typical workflow involves:
1. Searching for datasets: Using spatial, temporal, and asset filters to identify relevant data (e.g., Sentinel-2 Level-2A surface reflectance products over the Amazon basin).
2. Signing assets: Applying the SDK’s sign_inplace modifier to enable direct cloud access without manual credential management.
3. Loading data: Utilizing cloud-optimized formats like Cloud-Optimized GeoTIFFs (COGs) with libraries such as Xarray or Rasterio for efficient partial data reads.
Example Python code, adapted from the PlanetaryComputerExamples GitHub repository:

import planetary_computer as pc
from pystac_client import Client

# Open the STAC API; sign_inplace signs asset URLs automatically
api = Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1",
    modifier=pc.sign_inplace,
)

# Search for low-cloud Sentinel-2 L2A scenes in a two-week window
search = api.search(
    collections=["sentinel-2-l2a"],
    datetime="2024-01-01/2024-01-15",
    query={"eo:cloud_cover": {"lt": 10}},
)
items = search.item_collection()  # get_all_items() is deprecated in pystac-client
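After matching items are found and their assets signed, bands are typically loaded with rioxarray or Rasterio (step 3 above). A common next step is computing a vegetation index such as NDVI from the red and near-infrared bands. The sketch below uses small synthetic arrays as stand-ins for real Sentinel-2 B04 (red) and B08 (NIR) rasters, so it runs without any network access:

```python
import numpy as np

def ndvi(red, nir):
    # NDVI = (NIR - Red) / (NIR + Red), guarded against zero denominators
    red = red.astype("float64")
    nir = nir.astype("float64")
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

# Synthetic stand-ins for Sentinel-2 B04 (red) and B08 (NIR) pixel values
red = np.array([[1000, 2000], [500, 0]])
nir = np.array([[3000, 2000], [2500, 0]])
result = ndvi(red, nir)
print(result)  # values in [-1, 1]; dense vegetation approaches 1
```

The same function applies unchanged to full-size arrays loaded from a signed Cloud-Optimized GeoTIFF.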
Exploring Data with Planetary Computer Explorer
For non-programmatic exploration, the Planetary Computer Explorer web interface (https://planetarycomputer.microsoft.com/explorer) provides interactive visualization tools. Users can overlay multiple datasets, such as ESA WorldCover land classification and GridMET climate variables, to create custom mosaics and assess regional environmental trends. The Explorer also integrates prebuilt Jupyter Notebook templates for common workflows like deforestation analysis and urban heat island detection, which can be cloned to Azure Notebooks for modification.
Leveraging Azure Compute Resources
Large-scale analyses require provisioning Azure virtual machines (VMs) through the Marketplace, particularly for GPU-accelerated tasks like deep learning-based land-cover classification. The “ArcGIS for Microsoft Planetary Computer” VM template preconfigures tools like ArcGIS Pro and Raster Analytics, enabling seamless integration with the platform’s data catalog. Users must select the West Europe Azure region to minimize latency, as all Planetary Computer datasets reside there. For distributed computing, Dask clusters can be orchestrated via Azure Kubernetes Service (AKS) to parallelize operations across thousands of cores.
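The fan-out pattern behind such Dask clusters can be sketched in miniature with the standard library: split a scene into tiles, process each tile independently, and collect the results. The tile data and the per-tile function below are illustrative stand-ins (a real workload would run classification or index computation per tile, on a Dask cluster rather than a thread pool):

```python
from concurrent.futures import ThreadPoolExecutor

def tile_mean(tile):
    # Per-tile reduction; stands in for a real per-tile analysis step
    return sum(tile) / len(tile)

# Synthetic stand-ins for raster tiles cut from a larger scene
tiles = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

with ThreadPoolExecutor(max_workers=4) as pool:
    means = list(pool.map(tile_mean, tiles))
print(means)  # one result per tile, computed in parallel
```

Dask generalizes this map-over-chunks shape to arrays that do not fit in memory, scheduling the chunks across the cores of an AKS-hosted cluster.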
Real-World Applications and Organizational Collaborations
Chesapeake Conservancy: Precision Watershed Management
The Chesapeake Conservancy leverages the Planetary Computer’s Chesapeake Land Cover (13-class) dataset to map land-use changes across the roughly 166,000 km² Chesapeake Bay watershed. Using Sentinel-2 (10m resolution) and NAIP (1m resolution) imagery, their AI models achieve 94% accuracy in detecting impervious surfaces and farmed wetlands. During Hurricane Florence, they processed 12TB of Sentinel-1 SAR data in under six hours to map flood extents, guiding emergency response teams. Their deep-learning workflows, hosted on Azure Machine Learning, directly inform the U.S. “30x30” land conservation targets by identifying priority areas for restoration.
CarbonPlan: Wildfire Risk Assessment for Carbon Markets
CarbonPlan utilizes Planetary Computer’s GridMET climate data and Landsat burn scars to monitor wildfires in California’s forest carbon offset projects. Their open-source tool (github.com/carbonplan/forest-offsets-fires) cross-references fire perimeters from NIFC with carbon project boundaries, revealing that 6.2% of California’s offset credits were impacted by wildfires between 2015 and 2022. By integrating Planetary Computer’s CONUS404 climate reanalysis, they project a 137% increase in high-severity burn areas by 2050 under RCP 8.5 scenarios.
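At its core, this cross-referencing is a spatial intersection between fire perimeters and project boundaries. CarbonPlan works with full polygon geometries (e.g., via GeoPandas); the sketch below substitutes axis-aligned bounding boxes, with entirely hypothetical names and coordinates, just to show the shape of the check:

```python
def boxes_intersect(a, b):
    # Boxes given as (xmin, ymin, xmax, ymax); True if they overlap
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

# Hypothetical fire perimeters and carbon-project boundaries (lon/lat degrees)
fires = {"fire_a": (-122.0, 40.0, -121.5, 40.5)}
projects = {
    "proj_1": (-121.8, 40.2, -121.2, 40.9),
    "proj_2": (-120.0, 38.0, -119.5, 38.5),
}

impacted = [p for p, box in projects.items()
            if any(boxes_intersect(box, f) for f in fires.values())]
print(impacted)  # proj_1 overlaps fire_a; proj_2 does not
```

A production pipeline would replace the bounding-box test with true polygon intersection and compute the overlapping area to quantify impacted credits.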
GEO BON: Global Biodiversity Monitoring
The Group on Earth Observations Biodiversity Observation Network (GEO BON) employs the Planetary Computer to model habitat suitability for 1,700 endangered species. Partnering with Microsoft through a $1 million AI for Earth grant, they combine GBIF species occurrence records with Sentinel-2-derived NDVI time series to predict range shifts under climate change. In Central Africa, their SAR-based poaching detection system reduced illegal logging by 41% in pilot regions by correlating Sentinel-1 backscatter anomalies with ground patrol reports.
Radiant Earth Foundation: Open ML Training Data
Radiant Earth’s MLHub integrates with the Planetary Computer to host over 700 STAC-compliant training datasets, including labeled crops from Kenya’s Taita Hills and flood masks from Bangladesh. Their collaboration enables AI for Earth grantees to train models like the U-Net flood detector, which processes Sentinel-1 VV/VH polarization data to map inundation areas with 89% IoU accuracy. A 2023 pilot in Pakistan reduced flood response times from 72 to 8 hours by automating SAR-based alerts.
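The IoU (intersection over union) figure quoted above measures how well a predicted flood mask overlaps the ground-truth mask. A minimal sketch of the metric on tiny synthetic binary masks (the masks here are made up for illustration):

```python
import numpy as np

def iou(pred, truth):
    # Intersection over union for binary masks; 1.0 when both masks are empty
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return inter / union if union else 1.0

# Synthetic 2x2 predicted and ground-truth flood masks
pred = np.array([[1, 1], [0, 0]], dtype=bool)
truth = np.array([[1, 0], [0, 0]], dtype=bool)
score = iou(pred, truth)
print(score)  # 1 overlapping pixel out of 2 flagged in total
```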
Element 84: Cloud-Optimized Data Pipelines
As a core contributor to the Planetary Computer’s architecture, Element 84 standardized 87% of its 25PB catalog into cloud-optimized formats like COGs and Zarr. Their work on the Sentinel-2 Level-2A pipeline reduced data preprocessing costs by 63% by implementing Azure Batch for parallel atmospheric correction. The 3DEP LiDAR COPC dataset, co-developed with USGS, now enables watershed managers to extract DEMs at 1m resolution for erosion modeling.
Localized Agricultural Advisory Systems
In Kenya’s Tana River Basin, the Radiant MLHub team combined Planetary Computer’s CHIRPS rainfall data with PlanetScope imagery to create a crop yield forecasting system. Smallholder farmers receive SMS alerts about optimal planting windows, increasing sorghum yields by 22% in 2023 trials. Similarly, Vietnam’s Rice Department uses MODIS NDVI and SoilGrids data from the platform to issue bacterial blight warnings, preventing $47M in annual losses. These collaborations demonstrate the Planetary Computer’s role as a force multiplier for sustainability innovation, bridging gaps between satellite data providers, ML developers, and on-the-ground practitioners.
Conclusion
The Microsoft Planetary Computer is a massive enabler for organizations across the globe to work together on making the planet a more sustainable place. I can't wait to see what people build with it, and how we can live in even more harmony with nature as a result.