
RankSEG: A Statistically Consistent Segmentation Prediction Solver for Dice and IoU Metrics Optimization

Project description

🧩 RankSEG

Boost Segmentation Performance Instantly via Direct Dice/IoU Post-Optimization



RankSEG is a plug-and-play post-processing module that boosts segmentation performance (Dice/IoU) during inference. It works with ANY pre-trained probabilistic segmentation model (SAM, DeepLab, SegFormer, etc.) without any retraining or fine-tuning.

Explore RankSEG by reading our documentation.

🌟 Why RankSEG?

Conventional pipelines use argmax or a fixed threshold, neither of which is optimal for non-decomposable metrics such as Dice or IoU. RankSEG bridges this gap by directly optimizing the target metric at inference time, yielding "free" performance gains.
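To see why a fixed threshold can be suboptimal, here is a toy numpy sketch (an illustration of the idea, not the library's algorithm). Under a simple plug-in approximation, the expected Dice of predicting the top-k ranked pixels with foreground probabilities p is 2 · sum(top-k p) / (k + sum(p)); sweeping k and taking the maximizer can pick a different volume than a fixed 0.5 threshold does.

```python
import numpy as np

# Toy sketch (not rankseg's implementation): rank pixels by probability,
# then choose the volume k that maximizes a plug-in expected Dice.
rng = np.random.default_rng(0)
p = np.sort(rng.beta(0.5, 0.5, size=1000))[::-1]   # probabilities, ranked high to low

cum = np.cumsum(p)
k = np.arange(1, p.size + 1)
approx_dice = 2 * cum / (k + p.sum())              # plug-in expected Dice per volume k

k_star = int(k[np.argmax(approx_dice)])            # Dice-optimal volume
k_thresh = int((p >= 0.5).sum())                   # volume from a fixed 0.5 threshold

print(f"rank-optimal k={k_star}, threshold k={k_thresh}")
print(f"approx Dice: {approx_dice[k_star - 1]:.4f} vs {approx_dice[k_thresh - 1]:.4f}")
```

By construction the rank-optimal volume scores at least as well as the threshold's volume under this approximation; the gap is the kind of "free" gain a post-hoc optimizer can recover.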

Demo: RankSEG vs. Argmax on fashn-human-parser

[Figure: side-by-side comparison of RankSEG and argmax predictions]

⚡ Quick Start

RankSEG is designed to be dropped into your existing inference pipeline with just a few lines of code.

1. Installation

pip install -U rankseg

2. Basic Usage (3 Lines of Code)

from rankseg import RankSEG
import torch.nn.functional as F

# 1. Initialize RankSEG (optimizing for Dice)
rankseg = RankSEG(metric='dice')

# 2. Get probability output from YOUR model
# model_logits: raw logits from your network, shape (Batch, Class, H, W)
probs = F.softmax(model_logits, dim=1)

# 3. Get optimized predictions (Instantly!)
preds = rankseg.predict(probs)

💡 Try it now: Open In Colab
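To make the mechanics concrete, here is a hypothetical `predict()`-shaped helper in plain numpy for the binary case (a sketch of the rank-then-select idea, not rankseg's actual implementation or API). Per image, pixels are ranked by foreground probability and the top k are kept, with k chosen to maximize the plug-in expected Dice 2 · sum(top-k p) / (k + sum(p)).

```python
import numpy as np

# Hypothetical sketch (NOT the rankseg package's code): rank-based binary
# prediction. probs: (B, H, W) foreground probabilities -> (B, H, W) masks.
def rank_predict(probs):
    B, H, W = probs.shape
    flat = probs.reshape(B, -1)
    order = np.argsort(-flat, axis=1)                 # descending rank per image
    ranked = np.take_along_axis(flat, order, axis=1)  # sorted probabilities
    cum = np.cumsum(ranked, axis=1)
    k = np.arange(1, H * W + 1)
    score = 2 * cum / (k + flat.sum(axis=1, keepdims=True))  # plug-in Dice
    k_star = score.argmax(axis=1) + 1                 # Dice-optimal volume per image
    masks = np.zeros_like(flat, dtype=bool)
    for b in range(B):
        masks[b, order[b, :k_star[b]]] = True
    return masks.reshape(B, H, W)

probs = np.random.default_rng(0).uniform(size=(2, 8, 8))
masks = rank_predict(probs)
print(masks.shape, masks.sum(axis=(1, 2)))
```

Note the predicted mask is always a superlevel set of the probability map, as with thresholding; only the cutoff is chosen per image to suit the metric rather than fixed at 0.5.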

✨ Key Features

  • 🚀 Performance Boost: Consistently improves mIoU/mDice scores over standard argmax.
  • 🔌 Zero Effort: Compatible with any PyTorch model. No retraining, no fine-tuning.
  • 🆓 Training-Free: Purely post-processing. Works with frozen weights.
  • ⚡ Real-time Inference: Efficient RMA (Reciprocal Moment Approximation) solver.
  • 🧩 Versatile: Supports semantic (multi-class) and binary (multi-label) tasks.
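The efficiency claim behind the RMA solver rests on approximating reciprocal moments instead of sampling them. As a hedged sketch of that general idea (an assumed textbook form, not rankseg's exact solver): for N = a sum of independent Bernoulli(p_i) pixel labels, a second-order Taylor expansion around E[N] gives E[1/(c + N)] ≈ 1/(c + μ) + σ²/(c + μ)³, which lets expected-Dice-style terms be evaluated in closed form.

```python
import numpy as np

# Sketch of a reciprocal moment approximation (assumed form, for intuition):
# compare a Monte Carlo estimate of E[1/(c + N)] with first- and
# second-order Taylor approximations around mu = E[N].
rng = np.random.default_rng(1)
p = rng.uniform(0, 1, size=200)            # per-pixel foreground probabilities
mu, var = p.sum(), (p * (1 - p)).sum()
c = 50.0                                   # stands in for the predicted volume |S|

samples = (rng.uniform(size=(50_000, p.size)) < p).sum(axis=1)
mc = np.mean(1.0 / (c + samples))          # Monte Carlo reference
first = 1.0 / (c + mu)                     # first-order approximation
approx = first + var / (c + mu) ** 3       # second-order (RMA-style) approximation

print(f"MC: {mc:.8f}  first-order: {first:.8f}  second-order: {approx:.8f}")
```

The second-order term costs one extra sum over pixels, which is why such approximations keep inference real-time where Monte Carlo evaluation would not.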

📊 Benchmarks

RankSEG delivers consistent gains across various architectures and datasets without touching a single weight.

| Model | Dataset | mIoU (Argmax) | mIoU (RankSEG) | Gain |
|---|---|---|---|---|
| DeepLabV3+ | PASCAL VOC | 77.25% | 78.14% | +0.89% |
| SegFormer | PASCAL VOC | 77.57% | 78.59% | +1.02% |
| UPerNet | PASCAL VOC | 79.52% | 80.31% | +0.79% |
| SegFormer | ADE20K | 40.00% | 40.82% | +0.82% |
| UPerNet | ADE20K | 42.86% | 43.84% | +0.98% |

Detailed results available in our NeurIPS 2025 paper.

🛠️ Integrations & Demos

| Framework | Task | Quick Start |
|---|---|---|
| Standard PyTorch | Semantic Segmentation | Colab |
| Segment Anything (SAM) | Zero-shot Segmentation | Colab |
| Hugging Face | Interactive Demo | Spaces |
| PaddleSeg | Docs | Docker |

🔗 Citation

If you use RankSEG in your research, please cite our papers:

  • Dai, B., & Li, C. (2023). RankSEG: A Consistent Ranking-based Framework for Segmentation. Journal of Machine Learning Research, 24(224), 1-50. [link]
  • Wang, Z., & Dai, B. (2025). RankSEG-RMA: An Efficient Segmentation Algorithm via Reciprocal Moment Approximation. Advances in Neural Information Processing Systems (NeurIPS 2025). [link]
@article{dai2023rankseg,
  title={RankSEG: A Consistent Ranking-based Framework for Segmentation},
  author={Dai, Ben and Li, Chunlin},
  journal={Journal of Machine Learning Research},
  volume={24},
  number={224},
  pages={1--50},
  url={https://www.jmlr.org/papers/v24/22-0712.html},
  year={2023}
}

@inproceedings{wang2025rankseg,
  title={RankSEG-RMA: An Efficient Segmentation Algorithm via Reciprocal Moment Approximation},
  author={Wang, Zixun and Dai, Ben},
  booktitle={Advances in Neural Information Processing Systems},
  url={https://arxiv.org/abs/2510.15362},
  year={2025}
}

Star us on GitHub if RankSEG helps your project! ⭐

Download files

Download the file for your platform.

Source Distribution

rankseg-0.0.4.tar.gz (16.5 kB)

Uploaded Source

Built Distribution


rankseg-0.0.4-py3-none-any.whl (15.1 kB)

Uploaded Python 3

File details

Details for the file rankseg-0.0.4.tar.gz.

File metadata

  • Download URL: rankseg-0.0.4.tar.gz
  • Upload date:
  • Size: 16.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for rankseg-0.0.4.tar.gz
Algorithm Hash digest
SHA256 e38ea46a0a07b5a4a6c900b67db809df8bc6643f9ccc9288714c10d245dd1596
MD5 bdb4dfbcad41dcfdf4b2e6dd609b4e32
BLAKE2b-256 5480d6db34f8969f5eb2615e8e49b7141ae6348d75663752691a4009a4c3b8bb


File details

Details for the file rankseg-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: rankseg-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 15.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.2

File hashes

Hashes for rankseg-0.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 604696ffa2cad891782da03c0c10fb7d30757cbbd068d8d58702e62e7386f096
MD5 c8075de2b1ff7c225518510b193a06a9
BLAKE2b-256 e9d4ecb8942b230e84da4101c0de994590e779254b19ee6fbb6523b597812d4a

