MHubAI

mhub-segmentation

# Install this skill:
npx skills add MHubAI/MHubSkill

Or install this specific skill directly from its repository: npx add-skill https://github.com/MHubAI/MHubSkill

# Description

Discover MHub medical imaging AI models, look up anatomical segment codes (SegDB/SNOMED), and generate workflow configurations for running models on NIfTI files. Use this skill when users ask about medical image segmentation models, what models can segment specific anatomy, how to run MHub models on their data, or need DICOM-SEG metadata. Works offline with cached data.

# SKILL.md


---
name: mhub-segmentation
description: Discover MHub medical imaging AI models, look up anatomical segment codes (SegDB/SNOMED), and generate workflow configurations for running models on NIfTI files. Use this skill when users ask about medical image segmentation models, what models can segment specific anatomy, how to run MHub models on their data, or need DICOM-SEG metadata. Works offline with cached data.
---

MHub Segmentation Skill

This skill provides tools for working with MHub medical imaging AI models and the SegDB anatomical segment database.

Capabilities

  1. Model Discovery - Find models by modality, anatomy, or capability
  2. Segment Lookup - Get SNOMED codes and metadata for anatomical structures
  3. Workflow Generation - Create custom configs for NIfTI/NRRD input
  4. DCMQI Config Generation - Generate metadata for DICOM-SEG conversion

Quick Reference

Find Models

# List all models
python scripts/mhub_helper.py models

# Filter by modality
python scripts/mhub_helper.py models --modality CT
python scripts/mhub_helper.py models --modality MR

# Find models that segment specific anatomy
python scripts/mhub_helper.py find liver kidney
python scripts/mhub_helper.py find heart cardiac
python scripts/mhub_helper.py find lung

# Get model details
python scripts/mhub_helper.py model totalsegmentator
python scripts/mhub_helper.py model lungmask

Look Up Segments

# Search segments
python scripts/mhub_helper.py segments --search kidney
python scripts/mhub_helper.py segments --search heart

# Get segment details (SNOMED code, color)
python scripts/mhub_helper.py segment LIVER
python scripts/mhub_helper.py segment LEFT_KIDNEY

Generate Workflow Configs

# Generate NIfTI workflow for a model
python scripts/mhub_helper.py config totalsegmentator --pattern flat --output custom.yml
python scripts/mhub_helper.py config lungmask --pattern subject_folders --output custom.yml
python scripts/mhub_helper.py config platipy --pattern bids --modality ct --output custom.yml

# Generate DCMQI config for DICOM-SEG
python scripts/mhub_helper.py dcmqi totalsegmentator --output dcmqi_meta.json

Run Models via Docker

# Run a model on your NIfTI folder with an optional workflow or config override
python scripts/mhub_helper.py run lungmask \
    --input /path/to/nifti \
    --output /path/to/results \
    --config ./custom.yml

If you prefer to use one of the built-in workflows, swap --config for --workflow default (or the workflow name from assets/workflow-templates).
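
For example, using a built-in workflow instead of a custom config looks like this (a minimal sketch; the workflow name comes from assets/workflow-templates):

```bash
# Run lungmask with a built-in workflow instead of a custom config
python scripts/mhub_helper.py run lungmask \
    --input /path/to/nifti \
    --output /path/to/results \
    --workflow default
```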

Data Cache

This skill includes cached data for offline operation:

| Cache   | Contents                                  | Location                             |
|---------|-------------------------------------------|--------------------------------------|
| Models  | 30 MHub models with metadata              | data/models_summary.json             |
| SegDB   | 155 anatomical segments with SNOMED codes | data/segdb_cache.json                |
| Configs | Default workflows for all models          | assets/workflow-templates/defaults/  |

Cache date: 2025-01-29

To refresh cache (requires network):

python scripts/mhub_helper.py refresh
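
To see what is currently cached without refreshing, you can simply inspect the bundled files (paths from the table above):

```bash
# Cached data ships with the skill and can be inspected offline
ls -l data/models_summary.json data/segdb_cache.json
ls assets/workflow-templates/defaults/
```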

Common Tasks

"What models can segment the liver?"

python scripts/mhub_helper.py find liver

Returns: totalsegmentator, nnunet_liver, bamf_nnunet_ct_liver, mrsegmentator, etc.

"How do I run TotalSegmentator on my NIfTI files?"

  1. Generate a custom workflow config:
     python scripts/mhub_helper.py config totalsegmentator --pattern flat --output custom.yml

  2. Run with Docker:
     docker run --rm --gpus all \
       -v /path/to/nifti:/app/data/input_data:ro \
       -v /path/to/output:/app/data/output_data \
       -v ./custom.yml:/app/config/custom.yml:ro \
       mhubai/totalsegmentator:latest \
       --config /app/config/custom.yml

For detailed NIfTI workflow instructions, see references/nifti-workflows.md.

"What's the SNOMED code for left kidney?"

python scripts/mhub_helper.py segment LEFT_KIDNEY

Returns: SNOMED code 64033007, color RGB(212, 126, 151)

"I need to create a DICOM-SEG from my segmentations"

  1. Generate DCMQI metadata:
     python scripts/mhub_helper.py dcmqi totalsegmentator --output meta.json

  2. Run DCMQI converter:
     itkimage2segimage \
       --inputImageList segmentation.nii.gz \
       --inputDICOMDirectory /path/to/dicom \
       --outputDICOM output.seg.dcm \
       --inputMetadata meta.json
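
As an optional sanity check (assuming DCMTK is installed; it is not part of this skill), you can confirm the output is a DICOM Segmentation object:

```bash
# Optional: verify the SOP Class of the generated DICOM-SEG (requires DCMTK's dcmdump)
dcmdump output.seg.dcm | grep -i -m1 SOPClassUID
```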

File Organization Patterns

The skill supports three common file organization patterns:

| Pattern         | Description              | Example Structure                    |
|-----------------|--------------------------|--------------------------------------|
| flat            | All NIfTI in one folder  | input_data/*.nii.gz                  |
| subject_folders | One folder per subject   | input_data/subject_id/*.nii.gz       |
| bids            | BIDS-compliant           | input_data/sub-XX/anat/*_T1w.nii.gz  |
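
If you are unsure which pattern matches your data, listing the files first usually makes it obvious (standard shell tools, independent of the skill):

```bash
# List NIfTI files under the input folder to see which layout they follow
find /path/to/nifti -name "*.nii.gz" | sort | head -20
```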

For advanced patterns (clinical trials, multi-site, custom naming), see:
- references/filestructure-patterns.md - FileStructureImporter syntax
- references/dataorganizer-patterns.md - Output organization options
- references/nifti-workflows.md - Complete workflow guide

Workflow Templates

Pre-built templates in assets/workflow-templates/:

| Template                    | Use Case                              |
|-----------------------------|---------------------------------------|
| nifti_generic.yml           | Starting point for any model          |
| bids_template.yml           | BIDS-compliant data                   |
| clinical_trial_template.yml | Multi-site with encoded filenames     |
| defaults/<model>.yml        | Original DICOM configs for reference  |
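
If you would rather start from a template than generate a config, one approach is to copy it and edit it by hand (the defaults/ filename below assumes the defaults/<model>.yml naming shown above):

```bash
# Start from the generic NIfTI template and adapt it to your model and data
cp assets/workflow-templates/nifti_generic.yml ./custom.yml
# Compare with the model's original DICOM config for module names and options
cat assets/workflow-templates/defaults/lungmask.yml
```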

Available Models (30)

CT Segmentation

  • totalsegmentator - 104 structures (organs, bones, muscles, vessels)
  • platipy - 17 cardiac structures for radiotherapy
  • lungmask - Lungs and 5 lobes
  • casust - 8 cardiac structures
  • nnunet_liver - Liver + tumor
  • nnunet_pancreas - Pancreas + tumor
  • bamf_nnunet_ct_kidney - Kidney + tumor + cyst

MR Segmentation

  • mrsegmentator - 38 structures (CT/MR compatible)
  • bamf_nnunet_mr_prostate - Prostate
  • monai_prostate158 - Prostate zones
  • gc_spider_baseline - 48 spine structures

PET/CT

  • bamf_pet_ct_lung_tumor - Lung + FDG-avid tumor
  • bamf_pet_ct_breast_tumor - Breast FDG-avid tumor

Prediction Models

  • gc_picai_baseline - Prostate cancer likelihood
  • gc_grt123_lung_cancer - Lung cancer risk
  • pyradiomics - Radiomic feature extraction

Use python scripts/mhub_helper.py models for the complete list.

Dependencies

For offline use: No dependencies (uses cached data)

For cache refresh: pip install requests

For SegDB Python API: pip install segdb

Environment Notes

  • Claude.ai (restricted network): Full functionality using cached data
  • Claude Code: Full functionality + cache refresh capability
  • Local execution: Full functionality + Docker for running models

# README.md

MHub Segmentation Skill

A Claude skill for discovering MHub medical imaging AI models, looking up anatomical segment codes, and generating workflow configurations for running models on NIfTI files.

Features

  • Model Discovery: Find models by modality (CT, MR, PET), anatomy (liver, lung, heart), or capability
  • Segment Lookup: Get SNOMED CT codes, display colors, and metadata for 155 anatomical structures
  • Workflow Generation: Create custom YAML configs to run MHub models on NIfTI/NRRD files
  • DCMQI Config Generation: Generate metadata JSON for DICOM-SEG conversion
  • Offline-First: Works without network access using bundled cached data

Installation

For Claude.ai

  1. Download or clone this skill folder

  2. Create a ZIP file of the skill:
     cd mhub-segmentation
     zip -r ../mhub-segmentation.zip .

  3. In Claude.ai:
     - Go to Settings → Features → Skills
     - Click Add Skill
     - Upload the mhub-segmentation.zip file
     - Enable the skill

  4. The skill will automatically activate when you ask about:
     - MHub models or medical image segmentation
     - Running AI models on NIfTI files
     - SNOMED codes for anatomical structures
     - DICOM-SEG conversion

For Claude Code

  1. Clone or copy the skill to your Claude Code skills directory:
     ```bash
     # Option 1: User skills (available in all projects)
     cp -r mhub-segmentation ~/.claude/skills/

     # Option 2: Project skills (available in current project)
     cp -r mhub-segmentation .claude/skills/
     ```

  2. The skill is now available. Claude Code will automatically use it when relevant, or you can invoke it directly:
     /mhub-segmentation

  3. To refresh the model cache (requires network):
     python ~/.claude/skills/mhub-segmentation/scripts/mhub_helper.py refresh

Usage Examples

Find Models

Ask Claude:
- "What MHub models can segment the liver?"
- "Show me CT segmentation models"
- "What models work with MRI data?"

Or use the helper script directly:

python scripts/mhub_helper.py find liver kidney
python scripts/mhub_helper.py models --modality CT
python scripts/mhub_helper.py model totalsegmentator

Run Models on NIfTI Files

Ask Claude:
- "How do I run TotalSegmentator on my NIfTI files?"
- "Generate a workflow config for lungmask with BIDS data"
- "Help me process my CT scans with MHub"

Or generate configs directly:

python scripts/mhub_helper.py config totalsegmentator --pattern flat --output custom.yml

docker run --rm --gpus all \
  -v /path/to/nifti:/app/data/input_data:ro \
  -v /path/to/output:/app/data/output_data \
  -v ./custom.yml:/app/config/custom.yml:ro \
  mhubai/totalsegmentator:latest \
  --config /app/config/custom.yml
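
When the container exits, results appear in the host folder mounted at /app/data/output_data; the exact layout depends on the DataOrganizer settings in your config:

```bash
# Outputs land in the folder mounted to /app/data/output_data
ls -R /path/to/output
```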

Look Up Segment Codes

Ask Claude:
- "What's the SNOMED code for left kidney?"
- "Show me segment info for LIVER"
- "What segments does TotalSegmentator output?"

Or use the script:

python scripts/mhub_helper.py segment LEFT_KIDNEY
python scripts/mhub_helper.py segments --search heart

Skill Contents

mhub-segmentation/
├── SKILL.md                 # Main skill instructions
├── README.md                # This file
├── data/
│   ├── models_cache.json    # Raw MHub API response
│   ├── models_summary.json  # Processed model index
│   └── segdb_cache.json     # SegDB segments with SNOMED codes
├── references/
│   ├── nifti-workflows.md   # Complete NIfTI workflow guide
│   ├── filestructure-patterns.md  # FileStructureImporter syntax
│   └── dataorganizer-patterns.md  # Output organization options
├── scripts/
│   └── mhub_helper.py       # CLI tool for queries and config generation
└── assets/
    └── workflow-templates/
        ├── nifti_generic.yml        # Generic NIfTI template
        ├── bids_template.yml        # BIDS-compatible template
        ├── clinical_trial_template.yml  # Multi-site template
        └── defaults/                # Original configs for 30 models

Cached Data

The skill includes cached data for offline operation:

| Data            | Count | Source                  | Cache Date |
|-----------------|-------|-------------------------|------------|
| MHub Models     | 30    | mhub.ai API             | 2025-01-29 |
| SegDB Segments  | 155   | segdb Python package    | 2025-01-29 |
| Default Configs | 30    | MHub GitHub repository  | 2025-01-29 |

To update the cache (requires network access):

python scripts/mhub_helper.py refresh

Supported Models

The skill includes data for all 30 MHub models:

CT Segmentation: totalsegmentator (104 structures), platipy (cardiac), lungmask, casust, nnunet_liver, nnunet_pancreas, bamf_nnunet_ct_kidney, gc_lunglobes, nnunet_segthor, gc_autopet_fpr, gc_nnunet_pancreas, msk_smit_lung_gtv

MR Segmentation: mrsegmentator, bamf_nnunet_mr_prostate, bamf_nnunet_mr_liver, nnunet_prostate_zonal_task05, nnunet_prostate_task24, monai_prostate158, gc_spider_baseline

PET/CT: bamf_pet_ct_lung_tumor, bamf_pet_ct_breast_tumor

Prediction: gc_picai_baseline, gc_grt123_lung_cancer, gc_stoic_baseline, fmcib_radiomics, pyradiomics, gc_node21_baseline, gc_wsi_bgseg, gc_tiger_lb2

Requirements

  • Offline use: Python 3.8+ (no additional packages needed)
  • Cache refresh: requests package
  • SegDB Python API: segdb package
  • Running models: Docker with NVIDIA GPU support
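
A quick way to confirm Docker can see the GPU before launching a model (the CUDA image tag below is only an example):

```bash
# Sanity-check GPU access inside a container (any recent CUDA base image works)
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```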

License

This skill is provided under the MIT License.

MHub models have individual licenses (typically Apache 2.0 for code, CC BY-NC 4.0 for weights). Check each model's license before use.

# Supported AI Coding Agents

This skill is compatible with the SKILL.md standard and works with all major AI coding agents.

Learn more about the SKILL.md standard and how to use these skills with your preferred AI coding agent.