Run and summarize PHOTONAI analyses
Getting started
This page shows how to set up a simple PHOTONAI project using
PhotonaiProject, run analyses, perform permutation tests, and
statistically compare different analyses.
You can find the in-depth documentation here: https://wwu-mmll.github.io/photonai_projects/
Installation
Install the package (and PHOTONAI) into your environment:
pip install photonai photonai-projects
Basic concepts
A PhotonaiProject manages multiple PHOTONAI analyses in a single project folder. Each analysis has its own subfolder containing:
- a hyperpipe constructor script (hyperpipe_constructor.py)
- a metadata file (hyperpipe_meta.json)
- a data/ folder with X.npy and y.npy
- (optionally) a permutations/ folder for permutation tests
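For orientation, the resulting on-disk layout for an analysis named, say, first_feature_set would look roughly like this (illustrative sketch, exact file names taken from the list above):

```
example_project/
└── first_feature_set/
    ├── hyperpipe_constructor.py
    ├── hyperpipe_meta.json
    ├── data/
    │   ├── X.npy
    │   └── y.npy
    └── permutations/        # created once a permutation test is run
```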
The typical workflow is:
- Create a project with PhotonaiProject.
- Add analyses (data + hyperpipe constructor).
- Run analyses to train and evaluate the models.
- Run permutation tests to obtain null distributions.
- Compare analyses statistically.
Minimal example
Below is a complete example using the breast cancer dataset from scikit-learn. We create three analyses using different feature sets, run them, run permutation tests, and then compare them statistically.
from photonai_projects.project import PhotonaiProject
from sklearn.datasets import load_breast_cancer
# Load example data
X, y = load_breast_cancer(return_X_y=True)
# Split features into different sets
X_1 = X[:, :3]
X_2 = X[:, 3:6]
# Create a project
project = PhotonaiProject(project_folder="example_project")
# ---------------------------------------------------------------------
# 1) Register analyses
# ---------------------------------------------------------------------
for name, current_X in [
    ("all_features", X),
    ("first_feature_set", X_1),
    ("second_feature_set", X_2),
]:
    project.add(
        name=name,
        X=current_X,
        y=y,
        hyperpipe_script="path/to/hyperpipe_constructor.py",
        name_hyperpipe_constructor="create_hyperpipe",
    )
project.list_analyses()
# ---------------------------------------------------------------------
# 2) Run analyses
# ---------------------------------------------------------------------
for name in ["all_features", "first_feature_set", "second_feature_set"]:
    project.run(name=name)
# ---------------------------------------------------------------------
# 3) Run permutation tests (local example)
# ---------------------------------------------------------------------
# Use a small number of permutations for testing; increase for real studies.
for name in ["all_features", "first_feature_set", "second_feature_set"]:
    project.run_permutation_test(name=name, n_perms=10, overwrite=True)
# ---------------------------------------------------------------------
# 4) Statistical comparison of analyses
# ---------------------------------------------------------------------
# The Nadeau–Bengio test requires the n_train and n_test sizes used
# during cross-validation; here we approximate an 80/20 outer split.
n_samples = X.shape[0]
n_train = int(0.8 * n_samples)
n_test = n_samples - n_train
# Compare two analyses (Nadeau–Bengio corrected t-test)
project.compare_analyses(
    first_analysis="first_feature_set",
    second_analysis="second_feature_set",
    method="nadeau-bengio",
    n_train=n_train,
    n_test=n_test,
)
# Compare two analyses (permutation-based)
project.compare_analyses(
    first_analysis="all_features",
    second_analysis="second_feature_set",
    method="permutation",
    n_perms=10,
)
# Compare all pairs at once (optional)
multi_results = project.compare_multiple_analyses(
    analyses=["all_features", "first_feature_set", "second_feature_set"],
    method="permutation",
    n_perms=10,
)
print(multi_results.head())
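For context, the "nadeau-bengio" method refers to the corrected resampled t-test of Nadeau and Bengio (2003), which inflates the variance of per-fold score differences to account for overlapping training sets. A standalone sketch of the statistic (illustrative only, not photonai-projects internals; the scores below are made up):

```python
import numpy as np
from scipy import stats


def nadeau_bengio_ttest(scores_a, scores_b, n_train, n_test):
    """Corrected resampled t-test (Nadeau & Bengio, 2003).

    scores_a, scores_b: per-fold scores of two models evaluated
    on the same cross-validation splits.
    """
    d = np.asarray(scores_a) - np.asarray(scores_b)
    j = len(d)
    mean_d = d.mean()
    var_d = d.var(ddof=1)
    # The n_test / n_train term corrects for correlated training sets.
    t = mean_d / np.sqrt((1.0 / j + n_test / n_train) * var_d)
    p = 2 * stats.t.sf(abs(t), df=j - 1)
    return t, p


# Made-up per-fold accuracies for two hypothetical analyses:
t, p = nadeau_bengio_ttest(
    [0.82, 0.85, 0.80, 0.84, 0.83],
    [0.78, 0.80, 0.79, 0.81, 0.77],
    n_train=455, n_test=114,
)
```

This is why compare_analyses needs n_train and n_test: both enter the variance correction directly.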
Running permutation tests on a SLURM cluster
For large numbers of permutations, you can distribute them across a SLURM array:
project.prepare_slurm_permutation_test(
    name="second_feature_set",
    n_perms=1000,
    conda_env="my_photonai_env",
    memory_per_cpu=2,
    n_jobs=20,
    run_time="0-02:00:00",
    random_state=1,
)
This creates a slurm_job.cmd script in the analysis folder which you can submit with:
cd example_project/second_feature_set
sbatch slurm_job.cmd
Each array job will call the Typer CLI entry point run_perm_job and execute a subset of permutation runs.
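To make the mapping from the parameters above to SLURM concrete, a generated script may look roughly like the following sketch. This is an illustration only (directive values follow the example parameters; the exact CLI invocation is produced by the package and may differ):

```bash
#!/bin/bash
# Illustrative sketch of a slurm_job.cmd -- the real file is generated
# by prepare_slurm_permutation_test and should not be edited by hand.
#SBATCH --job-name=perm_second_feature_set
#SBATCH --array=0-19              # n_jobs=20 -> 20 array tasks
#SBATCH --mem-per-cpu=2G          # memory_per_cpu=2
#SBATCH --time=0-02:00:00         # run_time

source activate my_photonai_env   # conda_env

# Each array task executes its share of the 1000 permutation runs
# via the run_perm_job CLI entry point (invocation illustrative).
run_perm_job --job-id "$SLURM_ARRAY_TASK_ID"
```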
Next steps
See the Usage page for more details on:
- how to design your hyperpipe constructor,
- how metrics and scorers are handled,
- how to interpret the comparison reports.
See the API Reference for the full documentation of PhotonaiProject.