FuXi Weather

A Python package for weather and climate data processing, providing utility functions for working with datasets in NetCDF, Zarr, and GRIB formats. Developed for use with FuXi weather models and climate research applications.

Features

  • Data Loading: Support for NetCDF, Zarr, and GRIB files
  • Data Processing: Normalization, coordinate transformation, and unit conversion
  • Spatial Operations: Resizing, cropping, interpolation, and area averaging
  • Temporal Operations: Aggregation, resampling, and rolling statistics
  • Climate Analysis: Climate means, anomalies, percentiles, and trend analysis
  • GPU Acceleration: Optional CuPy support for faster computations
  • Efficient I/O: Optimized saving and loading of large climate datasets

Installation

Basic Installation

pip install fuxi-weather

Development Installation

For development or to get the latest features:

git clone https://github.com/fanjiang/fuxi-weather.git
cd fuxi-weather
pip install -e .

Optional Dependencies

Install with optional dependencies:

# For GPU acceleration
pip install fuxi-weather[gpu]

# For GRIB file support
pip install fuxi-weather[grib]

# For development tools
pip install fuxi-weather[dev]

# Install all optional dependencies
pip install fuxi-weather[all]

Quick Start

import fuxi_weather as fw

# Load a weather dataset
data = fw.load_dataarray("path/to/weather_data.nc")

# Print dataset information
fw.print_dataarray(data)

# Normalize the data
mean, std = fw.calc_mean_std(data, "output_dir")
normalized_data = fw.normalize(data, mean, std)

# Resize to different resolution
resized_data = fw.resize_dataarray(data, resolution=1.0)

# Calculate climate statistics
climate_mean = fw.climate_mean(data)
climate_std = fw.climate_std(data)

# Save processed data
fw.save_zarr(normalized_data, "processed_data.zarr")

Key Functions

Data Loading and I/O

  • load_dataarray(): Load data from NetCDF, Zarr, or GRIB files
  • load_zarr(): Specifically load Zarr datasets
  • load_grib(): Load GRIB files with metadata
  • save_nc(): Save to NetCDF format
  • save_zarr(): Save to Zarr format with compression
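
A minimal sketch of round-tripping data through these helpers. The exact call signatures are assumptions inferred from the function names above, not documented API.

import fuxi_weather as fw

# Load a GRIB forecast and a Zarr archive (argument forms assumed)
grib_data = fw.load_grib("forecast.grib2")
zarr_data = fw.load_zarr("archive.zarr")

# Save back out in either format
fw.save_nc(grib_data, "forecast.nc")
fw.save_zarr(zarr_data, "archive_copy.zarr")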

Data Processing

  • normalize() / unnormalize(): Data normalization and denormalization
  • update_dims(): Standardize dimension names
  • update_coords(): Update coordinate systems
  • chunk(): Configure dask chunking for efficient processing
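
A hedged sketch of a typical preprocessing chain. The chunking and normalization calls mirror the examples elsewhere in this README; the argument-free calls to update_dims() and update_coords() are assumptions.

import fuxi_weather as fw

data = fw.load_dataarray("path/to/weather_data.nc")

# Standardize dimension names and coordinates (exact arguments assumed)
data = fw.update_dims(data)
data = fw.update_coords(data)

# Chunk for dask-backed processing, then normalize and later invert
data = fw.chunk(data, time=30, lat=100, lon=100)
mean, std = fw.calc_mean_std(data, "stats_dir")
normalized = fw.normalize(data, mean, std)
restored = fw.unnormalize(normalized, mean, std)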

Spatial Operations

  • resize_dataarray(): Change spatial resolution
  • crop_dataarray(): Crop to specific geographic bounds
  • spatial_interp(): Spatial interpolation
  • area_mean(): Calculate area-weighted averages
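
A short sketch combining these operations. The resize call follows the Quick Start; passing a regional-domain constant to crop_dataarray() and the area_mean() signature are assumptions for illustration.

import fuxi_weather as fw

data = fw.load_dataarray("path/to/weather_data.nc")

# Coarsen to a 1-degree grid (as in Quick Start)
coarse = fw.resize_dataarray(data, resolution=1.0)

# Crop to a regional domain (bounds argument assumed)
region = fw.crop_dataarray(data, fw.HUADONG_AREA)

# Area-weighted mean over the cropped region (signature assumed)
regional_mean = fw.area_mean(region)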

Temporal Operations

  • aggregate(): Temporal aggregation
  • resample_dataarray(): Resample to different time frequencies
  • process_weekly_mean(): Calculate rolling weekly means
  • filter_complete_hours(): Filter for complete temporal coverage
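
A hedged sketch of a temporal workflow; the freq keyword and the argument-free calls are illustrative assumptions based on the function names above.

import fuxi_weather as fw

data = fw.load_dataarray("path/to/weather_data.nc")

# Keep only timestamps with complete hourly coverage (signature assumed)
data = fw.filter_complete_hours(data)

# Resample to daily means, then derive rolling weekly means
daily = fw.resample_dataarray(data, freq="1D")
weekly = fw.process_weekly_mean(daily)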

Climate Analysis

  • climate_mean() / climate_std(): Calculate climatological statistics
  • get_anomaly(): Compute anomalies from climatology
  • compute_quintile_clim(): Calculate percentile-based climatologies
  • tercile_edge(): Compute tercile boundaries for categorical forecasts
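
A brief sketch of the climatology workflow; it complements the Climate Data Analysis example under Advanced Usage below. The tercile_edge() signature is an assumption.

import fuxi_weather as fw

data = fw.load_dataarray("path/to/weather_data.nc")

# Climatological mean/std and anomalies
clim_mean = fw.climate_mean(data)
clim_std = fw.climate_std(data)
anomalies = fw.get_anomaly(data, clim_mean)

# Tercile boundaries for categorical (below/near/above normal) forecasts
edges = fw.tercile_edge(data)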

Advanced Usage

Working with Large Datasets

import fuxi_weather as fw

# Load and chunk large dataset efficiently
data = fw.load_zarr("large_dataset.zarr")
data = fw.chunk(data, time=30, lat=100, lon=100)

# Calculate statistics with memory management
mean, std = fw.calc_mean_std(data, "stats_dir", downscaling=True)

# Process in chunks and save by year
fw.save_by_year(data, "output_dir", ftype='zarr')

GPU Acceleration

import fuxi_weather as fw

# Compute percentiles with GPU acceleration (requires cupy)
percentiles = fw.compute_quintile_clim(
    data, 
    channels=["temperature", "precipitation"],
    spatial_chunks=4,  # Use spatial chunking for GPU memory management
    percentiles=[10, 25, 50, 75, 90]
)

Climate Data Analysis

import fuxi_weather as fw

# Analyze climate trends
trend = fw.get_trend(data)
detrended = fw.detrend_single(data, trend)

# Calculate seasonal statistics
climate_stats = fw.climate_mean(data)
anomalies = fw.get_anomaly(data, climate_stats)

# Compute probability forecasts
edges = fw.compute_edge(data, q=[1/3, 2/3])
probabilities = fw.compute_prob(forecast_data, edges)

Constants and Configuration

The package includes predefined constants for common use cases:

import fuxi_weather as fw

# Pressure levels
print(fw.LEVELS_13)  # [50, 100, 150, ..., 1000]
print(fw.LEVELS_37)  # Full pressure level list

# Regional domains
print(fw.HUADONG_AREA)  # East China domain
print(fw.ZHEJIANG_AREA)  # Zhejiang province domain

# Variable name mappings
print(fw.SHORT_NAME_MAPPING)  # Standard name conversions
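
The constants can be combined with ordinary xarray operations. The sketch below assumes the data has a "level" coordinate and that crop_dataarray() accepts a domain constant directly; both are assumptions, not documented behavior.

import fuxi_weather as fw

data = fw.load_dataarray("path/to/weather_data.nc")

# Subset to the 13 standard pressure levels via plain xarray selection
# (assumes a "level" coordinate exists)
subset = data.sel(level=fw.LEVELS_13)

# Crop to the Zhejiang domain (passing the constant directly is assumed)
zhejiang = fw.crop_dataarray(data, fw.ZHEJIANG_AREA)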

Requirements

  • Python >= 3.8
  • numpy >= 1.20.0
  • pandas >= 1.3.0
  • xarray >= 0.20.0
  • zarr >= 2.10.0
  • dask[complete] >= 2021.6.0
  • tqdm >= 4.60.0

Optional Requirements

  • cupy >= 9.0.0 (for GPU acceleration)
  • cfgrib >= 0.9.10 (for GRIB file support)
  • eccodes >= 1.4.0 (for GRIB file support)

Contributing

We welcome contributions! Please see our contributing guidelines for details.

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Citation

If you use this package in your research, please cite:

Jiang, F. (2024). FuXi Weather: A Python package for weather and climate data processing. 
GitHub repository: https://github.com/fanjiang/fuxi-weather

Support

For questions, bug reports, or feature requests, please open an issue on the GitHub repository.
