A comprehensive Python client for all Open-Meteo weather APIs (community package, not affiliated with openmeteo-sdk)
OpenMeteo Python
The Open-Meteo SDK on steroids — but not actually the SDK, just a really enthusiastic fan.
Think of this as the unofficial remix: same great weather data, but with extra features your workflow didn't know it needed. We took the Open-Meteo API and added CLI tools, Airflow DAG generators, standalone script generators, FlatBuffers support for the speed freaks, and enough validation to make your data pipeline sleep soundly at night.
Disclaimer: This is a community package and has no affiliation with the official
openmeteo-sdk. We just really like weather data and couldn't stop adding features.
Features
- Complete API Coverage: Access all Open-Meteo APIs from a single library
- Type-Safe: Full type hints for better IDE support
- DataFrame Support: Built-in pandas DataFrame conversion
- FlatBuffers Support: High-performance binary format with zero-copy data access
- CLI Tool: Command-line interface for quick data extraction
- Script Generator: Generate standalone Python scripts for any API
- DAG Generator: Generate Airflow-ready DAG scripts
- Docker Environment: Local Airflow setup for testing DAGs without installation
- Comprehensive Validation: Input validation for coordinates, dates, and API responses
- Retry Logic: Automatic retries with configurable backoff
- Airflow Ready: Works seamlessly with Apache Airflow and other orchestration tools
Supported APIs
| API | Description |
|---|---|
| Forecast | Weather forecasts up to 16 days |
| Historical | Weather data from 1940 onwards |
| Air Quality | Pollutants, pollen, and AQI indices |
| Marine | Wave heights, ocean currents, sea temperature |
| Flood | River discharge forecasts |
| Climate | Climate model projections (1950-2050) |
| Ensemble | Probabilistic forecasts from multiple models |
| Geocoding | Location name to coordinates |
| Elevation | Terrain elevation data |
Installation
pip install openmeteo-python
For high-performance FlatBuffers support with zero-copy data access:
pip install openmeteo-python[fast]
Or install from source:
git clone https://github.com/tmmsunny012/openmeteo-python.git
cd openmeteo-python
pip install -e .
# With FlatBuffers support
pip install -e ".[fast]"
Quick Start
Python Library
from openmeteo import OpenMeteo
# Initialize client
client = OpenMeteo()
# Get weather forecast
forecast = client.forecast.get(
latitude=52.52,
longitude=13.41,
hourly=["temperature_2m", "precipitation", "wind_speed_10m"]
)
# Convert to DataFrame
df = forecast.to_dataframe()
print(df.head())
# Get current conditions
current = client.forecast.get_current(
latitude=52.52,
longitude=13.41,
variables=["temperature_2m", "weather_code", "wind_speed_10m"]
)
print(f"Temperature: {current.current.temperature_2m}°C")
Command Line Interface
# Get weather forecast
openmeteo forecast --lat 52.52 --lon 13.41 --hourly temperature_2m,precipitation
# Export to CSV
openmeteo forecast --lat 52.52 --lon 13.41 --hourly temperature_2m -o forecast.csv
# Get historical data
openmeteo historical --lat 52.52 --lon 13.41 \
--start 2024-01-01 --end 2024-01-31 \
--hourly temperature_2m,precipitation
# Search for a location
openmeteo geocoding --search "New York"
# Get air quality
openmeteo air-quality --lat 52.52 --lon 13.41 --hourly pm10,pm2_5,european_aqi
# List available APIs for script generation
openmeteo list-apis
Generate Standalone Scripts
Generate standalone Python scripts that work without installing this package. Perfect for cron jobs, one-off runs, or integration into other systems.
Generate Simple Script (Single API)
# Generate a forecast script
openmeteo generate-script --api forecast \
--locations "Berlin:52.52:13.41,Paris:48.85:2.35" \
-o forecast_script.py
# Generate historical data script
openmeteo generate-script --api historical \
--locations "NYC:40.71:-74.01" \
-o historical_script.py
# Generate with JSON output format
openmeteo generate-script --api air_quality \
--locations "London:51.51:-0.13" \
--format json \
-o air_quality_script.py
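The `--locations` argument packs each location as `Name:lat:lon`, comma-separated. As a rough illustration of parsing that format (this parser is hypothetical, not the package's actual code):

```python
# Hypothetical parser for the "Name:lat:lon,Name2:lat2:lon2" location format
# used by the generate-script and generate-dag commands.

def parse_locations(spec: str) -> list[tuple[str, float, float]]:
    """Split a location spec string into (name, latitude, longitude) tuples."""
    locations = []
    for entry in spec.split(","):
        name, lat, lon = entry.strip().split(":")
        latitude, longitude = float(lat), float(lon)
        # Basic sanity checks on coordinate ranges
        if not -90 <= latitude <= 90:
            raise ValueError(f"Invalid latitude for {name}: {latitude}")
        if not -180 <= longitude <= 180:
            raise ValueError(f"Invalid longitude for {name}: {longitude}")
        locations.append((name, latitude, longitude))
    return locations

print(parse_locations("Berlin:52.52:13.41,Paris:48.85:2.35"))
```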
Running Generated Scripts
# Run with defaults
python forecast_script.py
# Custom output directory and format
python forecast_script.py -o ./data -f json
# Historical script with date range
python historical_script.py --start-date 2024-01-01 --end-date 2024-06-30
# Show help
python forecast_script.py --help
Generated Script Features
- No dependencies on this package - runs standalone with just requests and pandas
- Built-in validation - validates coordinates, dates, and API responses
- Logging - comprehensive logging for debugging and monitoring
- CLI arguments - configurable output directory, format, and dates
- Retry logic - automatic retries with exponential backoff
- Error handling - graceful error handling with informative messages
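The exponential-backoff retry described above can be sketched roughly like this (a minimal, generic illustration; the generated scripts wrap their HTTP calls in something similar, though the exact implementation may differ):

```python
import time

def fetch_with_retry(fetch, attempts: int = 3, base_delay: float = 1.0):
    """Call fetch(), retrying on exception with exponential backoff (1s, 2s, 4s, ...)."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise  # Out of retries: surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)  # Doubles the wait each attempt
```

In a generated script, `fetch` would be the actual HTTP request (e.g. a `requests.get` call against the Open-Meteo endpoint).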
Generate Airflow DAG Scripts
Generate multi-API pipeline scripts that work with Apache Airflow or as standalone scripts.
Generate DAG Script (Multiple APIs)
# Generate DAG with multiple APIs
openmeteo generate-dag \
--apis forecast,air_quality,marine \
--locations "Berlin:52.52:13.41,Paris:48.85:2.35" \
--schedule "@daily" \
-o weather_pipeline.py
# Generate DAG with custom settings
openmeteo generate-dag \
--apis historical,climate \
--locations "NYC:40.71:-74.01" \
--schedule "0 6 * * *" \
--output-dir /data/weather \
--output-format json \
--dag-id my_weather_dag \
-o my_dag.py
# Generate without Airflow wrapper (standalone only)
openmeteo generate-dag \
--apis forecast,historical \
--locations "Tokyo:35.68:139.69" \
--no-airflow \
-o standalone_pipeline.py
Using Generated DAG
# Run standalone
python weather_pipeline.py
# Or copy to Airflow DAGs folder
cp weather_pipeline.py ~/airflow/dags/
# The DAG will be auto-detected by Airflow
DAG Script Features
- Works as standalone script - run directly with Python
- Works as Airflow DAG - auto-detected when placed in DAGs folder
- Multi-API support - fetch from multiple APIs in parallel
- Configurable schedule - cron expression or Airflow presets (@daily, @hourly, etc.)
- Built-in validation - validates all inputs and API responses
- Comprehensive logging - visible in Airflow task logs
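The dual-mode behavior above (same file runs standalone or as an Airflow DAG) is typically achieved by guarding the Airflow import. A rough sketch of the pattern, not the generator's literal output:

```python
# Sketch of the standalone-or-Airflow pattern: the DAG is only defined
# when Airflow is importable, so the same file also runs as a plain script.

def run_pipeline():
    """The actual work: fetch data, validate, write output."""
    print("fetching weather data...")

try:
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Airflow is installed: expose a DAG so the scheduler auto-detects this file
    with DAG(dag_id="example_pipeline", start_date=datetime(2024, 1, 1),
             schedule_interval="@daily") as dag:
        PythonOperator(task_id="run", python_callable=run_pipeline)
except ImportError:
    pass  # Airflow not installed: standalone mode only

if __name__ == "__main__":
    run_pipeline()
```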
API Examples
Weather Forecast
from openmeteo import OpenMeteo
client = OpenMeteo()
# Get hourly forecast
response = client.forecast.get_hourly(
latitude=52.52,
longitude=13.41,
variables=["temperature_2m", "precipitation", "wind_speed_10m"],
forecast_days=7
)
# Get daily forecast
response = client.forecast.get_daily(
latitude=52.52,
longitude=13.41,
variables=["temperature_2m_max", "temperature_2m_min", "precipitation_sum"]
)
# Get current conditions
response = client.forecast.get_current(
latitude=52.52,
longitude=13.41,
variables=["temperature_2m", "weather_code"]
)
Historical Weather
# Get historical hourly data
response = client.historical.get(
latitude=52.52,
longitude=13.41,
start_date="2023-01-01",
end_date="2023-12-31",
hourly=["temperature_2m", "precipitation"]
)
df = response.to_dataframe()
Air Quality
# Get air quality forecast
response = client.air_quality.get(
latitude=52.52,
longitude=13.41,
hourly=["pm10", "pm2_5", "european_aqi", "us_aqi"]
)
# Get pollen data (Europe only)
response = client.air_quality.get_pollen(
latitude=52.52,
longitude=13.41
)
Marine Weather
# Get marine forecast
response = client.marine.get(
latitude=54.32,
longitude=10.13,
hourly=["wave_height", "wave_direction", "sea_surface_temperature"]
)
Flood / River Discharge
# Get river discharge forecast
response = client.flood.get_forecast(
latitude=52.52,
longitude=13.41,
variables=["river_discharge", "river_discharge_max"],
forecast_days=92
)
# Get historical discharge
response = client.flood.get_historical(
latitude=52.52,
longitude=13.41,
start_date="2020-01-01",
end_date="2023-12-31"
)
Climate Projections
# Get climate model projections
response = client.climate.get(
latitude=52.52,
longitude=13.41,
start_date="2030-01-01",
end_date="2030-12-31",
models=["EC_Earth3P_HR"],
daily=["temperature_2m_max", "temperature_2m_min", "precipitation_sum"]
)
Ensemble Models
# Get ensemble forecast for probabilistic predictions
response = client.ensemble.get(
latitude=52.52,
longitude=13.41,
models=["icon_seamless", "gfs_seamless"],
hourly=["temperature_2m", "precipitation"]
)
Geocoding
# Search for a location
results = client.geocoding.search("Berlin")
print(f"{results[0].name}: {results[0].latitude}, {results[0].longitude}")
# Get coordinates directly
coords = client.geocoding.get_coordinates("New York")
print(f"Coordinates: {coords}")
Elevation
# Get elevation for a point
elevation = client.elevation.get_elevation(
latitude=52.52,
longitude=13.41
)
print(f"Elevation: {elevation}m")
# Get elevations for multiple points
elevations = client.elevation.get_batch([
(52.52, 13.41),
(48.85, 2.35),
(40.71, -74.01)
])
Configuration
API Key (Commercial Use)
from openmeteo import OpenMeteo
from openmeteo.base import APIConfig
config = APIConfig(api_key="your-api-key")
client = OpenMeteo(config=config)
Custom Timeout and Retries
config = APIConfig(
timeout=60, # Request timeout in seconds
retry_attempts=5, # Number of retry attempts
retry_delay=2.0 # Delay between retries
)
client = OpenMeteo(config=config)
FlatBuffers Support (High Performance)
For processing large datasets, this library supports FlatBuffers, a high-performance binary serialization format. This provides significant performance improvements through:
- Zero-copy data access: Numpy arrays are created directly from binary data without copying
- Faster parsing: Binary format is faster to parse than JSON
- Lower memory usage: No intermediate string representations
Installation
pip install openmeteo-python[fast]
Usage
from openmeteo import OpenMeteo, APIConfig, is_flatbuffers_available
# Check if FlatBuffers is available
print(f"FlatBuffers available: {is_flatbuffers_available()}")
# Use FlatBuffers format
config = APIConfig(format="flatbuffers")
client = OpenMeteo(config=config)
# Fetch data - returns same response format
response = client.forecast.get(
latitude=52.52,
longitude=13.41,
hourly=["temperature_2m", "precipitation", "wind_speed_10m"]
)
# Zero-copy DataFrame conversion
df = response.to_dataframe()
# Or get raw numpy arrays for maximum performance
numpy_data = response.hourly.to_numpy()
print(numpy_data["temperature_2m"]) # numpy.ndarray
Performance Comparison
| Operation | JSON | FlatBuffers |
|---|---|---|
| Parse 1000 hourly records | ~50ms | ~5ms |
| Memory allocation | High | Low (zero-copy) |
| DataFrame conversion | Copy data | Zero-copy |
When to Use FlatBuffers
- Processing large historical datasets
- Real-time data processing pipelines
- Memory-constrained environments
- High-frequency data fetching
Fallback Behavior
If the optional FlatBuffers dependencies are not installed, the library automatically falls back to JSON format:
config = APIConfig(format="flatbuffers")
client = OpenMeteo(config=config)  # Falls back to JSON if flatbuffers is not installed
Input Validation
The library includes comprehensive input validation:
from openmeteo.validators import (
CoordinateValidator,
DateValidator,
VariableValidator,
ValidationError
)
# Validate coordinates
try:
CoordinateValidator.validate_latitude(91.0) # Raises ValidationError
except ValidationError as e:
print(f"Invalid: {e}")
# Validate date format
DateValidator.parse_date("2024-01-15") # Returns date object
# Validate date range
DateValidator.validate_date_range("2024-01-01", "2024-12-31")
# Validate forecast variables
VariableValidator.validate_variables(
["temperature_2m", "precipitation"],
VariableValidator.FORECAST_HOURLY,
"hourly"
)
Using with Airflow
Direct Usage
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime
from openmeteo import OpenMeteo
def fetch_weather_data(**context):
client = OpenMeteo()
response = client.forecast.get(
latitude=52.52,
longitude=13.41,
hourly=["temperature_2m", "precipitation"]
)
df = response.to_dataframe()
df.to_csv("/tmp/weather_data.csv", index=False)
with DAG(
dag_id="weather_data_pipeline",
start_date=datetime(2024, 1, 1),
schedule_interval="@daily"
) as dag:
fetch_task = PythonOperator(
task_id="fetch_weather",
python_callable=fetch_weather_data
)
Using Generated DAG
# Generate the DAG
openmeteo generate-dag \
--apis forecast,air_quality \
--locations "Berlin:52.52:13.41" \
--schedule "@daily" \
-o ~/airflow/dags/weather_dag.py
# The DAG is automatically detected by Airflow
# View logs in Airflow UI
Docker Airflow Environment
Test your Airflow DAGs locally without installing Airflow on your machine. The Docker setup provides a complete Airflow environment with openmeteo-python pre-installed.
Quick Start
# Navigate to docker directory
cd docker
# Create environment file (Windows)
echo AIRFLOW_UID=50000 > .env
# Or on Linux/Mac
echo -e "AIRFLOW_UID=$(id -u)" > .env
# Build and start services
docker-compose build
docker-compose up -d
# Wait for services to be healthy
docker-compose ps
Access Airflow UI at http://localhost:8080 (username: airflow, password: airflow)
Service Architecture
┌─────────────────────────────────────────────────────────────┐
│ Docker Environment │
├─────────────────────────────────────────────────────────────┤
│ postgres (5432) → Airflow metadata database │
│ airflow-init → DB migrations, user creation │
│ airflow-webserver → Web UI (port 8080) │
│ airflow-scheduler → DAG scheduling │
│ airflow-triggerer → Async trigger handling │
└─────────────────────────────────────────────────────────────┘
Services start in sequence: postgres → airflow-init → webserver/scheduler/triggerer
DAG Management
# List all DAGs
docker-compose exec airflow-webserver airflow dags list
# Pause a DAG
docker-compose exec airflow-webserver airflow dags pause openmeteo_weather_pipeline
# Unpause a DAG
docker-compose exec airflow-webserver airflow dags unpause openmeteo_weather_pipeline
# Trigger a DAG manually
docker-compose exec airflow-webserver airflow dags trigger openmeteo_weather_pipeline
# Check DAG run status
docker-compose exec airflow-webserver airflow dags list-runs -d openmeteo_weather_pipeline
# Test a specific task
docker-compose exec airflow-webserver airflow tasks test \
openmeteo_weather_pipeline fetch_forecast 2025-01-15
Sample DAG
The Docker setup includes an example DAG (docker/dags/example_weather_dag.py) that:
- Fetches weather forecasts for Berlin, London, New York, and Tokyo
- Fetches air quality data for the same locations
- Generates a summary report
- Outputs CSV files to docker/output/
Directory Structure
docker/
├── Dockerfile # Airflow + openmeteo-python image
├── docker-compose.yml # Service definitions
├── README.md # Detailed documentation
├── dags/ # Place your DAGs here
│ └── example_weather_dag.py
├── logs/ # Airflow logs
├── plugins/ # Custom plugins
└── output/ # DAG output files
Generate and Test Custom DAGs
# Generate a DAG
openmeteo generate-dag \
--apis forecast,air_quality \
--locations "Paris:48.85:2.35,Madrid:40.42:-3.70" \
-o docker/dags/europe_weather.py
# Wait for Airflow to detect it (~30 seconds)
docker-compose exec airflow-webserver airflow dags list
# Trigger the new DAG
docker-compose exec airflow-webserver airflow dags trigger europe_weather_pipeline
# Check output
ls -la docker/output/
Cleanup
# Stop services
docker-compose down
# Stop and remove all data (full reset)
docker-compose down -v
For detailed documentation, troubleshooting, and advanced configuration, see docker/README.md.
CLI Reference
usage: openmeteo [-h] [--version] [--api-key API_KEY] [--timeout TIMEOUT]
{forecast,historical,air-quality,marine,flood,climate,
ensemble,geocoding,elevation,generate-dag,generate-script,list-apis}
Commands:
forecast Get weather forecast data
historical Get historical weather data
air-quality Get air quality data
marine Get marine weather data
flood Get flood/river discharge data
climate Get climate change projection data
ensemble Get ensemble model forecast data
geocoding Search for locations
elevation Get elevation data
generate-dag Generate standalone DAG script (multi-API, Airflow support)
generate-script Generate standalone Python script (single API, no Airflow)
list-apis List available APIs for generation
Global Options:
-h, --help Show help message
--version Show version
--api-key API_KEY API key for commercial use
--timeout TIMEOUT Request timeout in seconds
-o, --output FILE Output file path (.csv or .json)
--format FORMAT Output format (csv, json, table)
generate-script Options:
--api API API to use (forecast, historical, air_quality, etc.)
--locations LOCS Locations as "Name:lat:lon,Name2:lat2:lon2"
-o, --output FILE Output Python file path
--output-dir DIR Data output directory (default: ./output)
--format FORMAT Data format: csv or json (default: csv)
generate-dag Options:
--apis APIS Comma-separated APIs (forecast,historical,etc.)
--locations LOCS Locations as "Name:lat:lon,Name2:lat2:lon2"
-o, --output FILE Output Python file path
--dag-id ID Airflow DAG ID (auto-generated if not provided)
--schedule SCHEDULE Cron or Airflow preset (default: @daily)
--output-dir DIR Data output directory (default: ./output)
--output-format FMT Data format: csv or json (default: csv)
--no-airflow Generate without Airflow DAG wrapper
Available Variables
Forecast Hourly Variables
- Temperature: temperature_2m, apparent_temperature, dew_point_2m
- Precipitation: precipitation, rain, showers, snowfall, snow_depth
- Wind: wind_speed_10m, wind_speed_80m, wind_direction_10m, wind_gusts_10m
- Clouds: cloud_cover, cloud_cover_low, cloud_cover_mid, cloud_cover_high
- Solar: shortwave_radiation, direct_radiation, diffuse_radiation, uv_index
- Soil: soil_temperature_0cm, soil_moisture_0_to_1cm
- Other: weather_code, visibility, pressure_msl, cape
Air Quality Variables
- Pollutants: pm10, pm2_5, carbon_monoxide, nitrogen_dioxide, ozone
- Pollen: alder_pollen, birch_pollen, grass_pollen (Europe only)
- AQI: european_aqi, us_aqi
Marine Variables
- Waves: wave_height, wave_direction, wave_period
- Swell: swell_wave_height, swell_wave_direction
- Ocean: sea_surface_temperature, ocean_current_velocity
Logging
Generated scripts include comprehensive logging:
2025-01-15 10:30:00 - INFO - ============================================================
2025-01-15 10:30:00 - INFO - Weather Forecast Data Fetcher
2025-01-15 10:30:00 - INFO - ============================================================
2025-01-15 10:30:00 - INFO - Fetching Weather Forecast data for 2 locations
2025-01-15 10:30:00 - INFO - Fetching Berlin...
2025-01-15 10:30:01 - INFO - Fetching Paris...
2025-01-15 10:30:02 - INFO - Total records: 336
2025-01-15 10:30:02 - INFO - Saved 336 records to ./output/forecast_20250115_103002.csv
2025-01-15 10:30:02 - INFO - Done!
For cron jobs, redirect logs to a file:
python forecast.py >> /var/log/weather_fetch.log 2>&1
For Airflow, logs are automatically captured in task logs.
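The log format shown above corresponds to a standard `logging` setup along these lines (a sketch; the generated scripts' exact configuration may differ):

```python
import logging

# Configure the "timestamp - LEVEL - message" format seen in the sample output
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(levelname)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)

log = logging.getLogger(__name__)
log.info("Fetching Berlin...")
```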
Development
# Install development dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Run tests with coverage
pytest --cov=openmeteo
# Format code
black src tests
isort src tests
# Type checking
mypy src/openmeteo
Test Coverage
The library includes comprehensive test coverage with 198 tests achieving 80% code coverage:
Test Modules
| Module | Tests | Description |
|---|---|---|
| test_clients.py | 43 | API client functionality, request handling, response parsing |
| test_generator.py | 44 | DAG and script generators, validation, output formats |
| test_validators.py | 25 | Coordinate, date, variable, and response validation |
| test_flatbuffers.py | 25 | FlatBuffers parsing, zero-copy conversion, schema enums |
| test_base_client.py | 35 | Base client retry logic, HTTP methods, parameter building |
| test_cli.py | 38 | CLI commands, output formats, error handling |
Coverage by Component
| Component | Coverage |
|---|---|
| base.py | 91% |
| cli.py | 88% |
| client.py | 100% |
| generator.py | 97% |
| models.py | 80% |
| validators.py | 82% |
| flatbuffers_schema.py | 100% |
What's Tested
Base Client:
- HTTP GET and POST request handling
- Retry logic with configurable attempts and delays
- Server error recovery (5xx errors)
- API error response handling
- Parameter building (lists, booleans, API keys)
- FlatBuffers format integration
- Context manager functionality
API Clients:
- All 9 API endpoints (forecast, historical, air_quality, marine, flood, climate, ensemble, geocoding, elevation)
- Request parameter building and validation
- Response parsing and model conversion
- Error handling and retries
CLI:
- All CLI commands (forecast, historical, air-quality, marine, flood, climate, ensemble, geocoding, elevation)
- Generator commands (list-apis, generate-dag, generate-script)
- Output format handling (CSV, JSON, table)
- Location resolution from coordinates or name
- Error handling and user feedback
Validators:
- Coordinate validation (latitude/longitude ranges, list matching)
- Date format and range validation
- Variable name validation for each API
- Response structure validation
- Input sanitization (timezone, units)
FlatBuffers:
- Schema enum definitions (Variable, Unit, Aggregation)
- Parser initialization and availability checks
- Zero-copy numpy array conversion
- DataFrame conversion with/without zero-copy
- Response-to-dict conversion
Generators:
- DAG generator with/without Airflow
- Standalone script generator for all 7 APIs
- Location and coordinate validation
- Output format configuration (CSV/JSON)
- Generated code syntax validation (compile check)
- Retry logic and error handling in generated scripts
Running Tests
# Run all tests
pytest
# Run with verbose output
pytest -v
# Run specific test module
pytest tests/test_generator.py
# Run with coverage report
pytest --cov=openmeteo --cov-report=term-missing
# Run only FlatBuffers tests
pytest tests/test_flatbuffers.py -v
License
MIT License - see LICENSE file for details.
Acknowledgments
This library is built on top of the excellent Open-Meteo API created by Patrick Zippenfenig and the Open-Meteo team. Open-Meteo provides free, open-source weather data for non-commercial use, making weather information accessible to everyone.
Important Notes:
- This is an unofficial community package and is not affiliated with, endorsed by, or connected to the official Open-Meteo project or the openmeteo-sdk
- All weather data is provided by Open-Meteo's APIs - please respect their terms of use
- For commercial use, consider supporting Open-Meteo by purchasing an API subscription
We are grateful to the Open-Meteo team for providing such an amazing free weather API to the community!
Author
TM Moniruzzaman Sunny
- GitHub: @tmmsunny012
If you find this project helpful, please consider:
Give it a Star
If this library saved you time or helped your project, a GitHub star would mean a lot and helps others discover this project.
Buy Me a Coffee
Building and maintaining open-source software takes time and effort. If you'd like to support continued development, you can buy me a coffee:
PayPal: sunny.tuhh@gmail.com
Your support keeps the code flowing and the features coming. Thank you!
Tags
#python #weather #api #open-meteo #openmeteo #forecast #climate #air-quality #marine #flood #geocoding #pandas #dataframe #flatbuffers #airflow #dag #etl #data-pipeline #meteorology #weather-data #climate-data #environmental-data #open-source
Made with passion for weather data and open source
Download files
File details
Details for the file openmeteo_python-1.0.0.tar.gz.
File metadata
- Download URL: openmeteo_python-1.0.0.tar.gz
- Upload date:
- Size: 74.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0a72071fa461ee7cdf44ea831b359ce73ee585eb91c464adb0ddca0c91c47dc1 |
| MD5 | e27b70fa74a3edb32d759e75b2b2fda8 |
| BLAKE2b-256 | 3ceaea0294dd3e4fff686d3ff64ea6c379147dbe944ada0e31a8dda0c1128856 |
File details
Details for the file openmeteo_python-1.0.0-py3-none-any.whl.
File metadata
- Download URL: openmeteo_python-1.0.0-py3-none-any.whl
- Upload date:
- Size: 62.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e42fbe3ee27081bab0dcc834320acdefb653dd6434577e337f11708733db289e |
| MD5 | d745a58581ebfaae9f32ddec22949790 |
| BLAKE2b-256 | b61da0fc1e281fcb7598f89d8258db0c212936fac2725ba9cba8e96bf6af92dc |