
Add internal helper for batched modelzoo inference from in-memory arrays (inference runner)#3222

Merged
deruyter92 merged 16 commits into main from jaap/modelzoo_inference_api
Mar 19, 2026

Conversation

Collaborator

@deruyter92 deruyter92 commented Feb 26, 2026

Summary

  • Adds a new in-memory inference helper:
    • create_superanimal_inference_runners(...)
    • located in deeplabcut.pose_estimation_pytorch.modelzoo.inference
  • Exposes this helper from deeplabcut.pose_estimation_pytorch.modelzoo.__init__ for convenient imports.

Motivation

Currently it is fairly easy to run batched inference on our PyTorch models, but the modelzoo API is an exception: it requires saving images to disk first. This PR implements a modelzoo inference-runner factory to easily run batched inference from in-memory image arrays. This can be used internally for faster inference. (Inspired by get_inference_runners.)

Closes #3218.

Example usage

from pathlib import Path
import numpy as np
from PIL import Image
from deeplabcut.pose_estimation_pytorch.modelzoo.inference import (
    create_superanimal_inference_runners,
)

img_paths = [
    "/path/to/images/frame_0000.png",
    "/path/to/images/frame_0001.png",
    "/path/to/images/frame_0002.png",
]
images = [np.asarray(Image.open(Path(p)).convert("RGB")) for p in img_paths]

pose_runner, det_runner, model_cfg = create_superanimal_inference_runners(
    superanimal_name="superanimal_quadruped",
    model_name="hrnet_w32",
    detector_name="fasterrcnn_resnet50_fpn_v2",
    max_individuals=10,
    batch_size=1,
    detector_batch_size=1,
)

det_preds = det_runner.inference(images)         # stage 1: detect individuals
pose_inputs = list(zip(images, det_preds))       # pair each image with its detections
pose_preds = pose_runner.inference(pose_inputs)  # stage 2: estimate poses per detection

Introduce a modelzoo helper that resolves configs and checkpoints via build_weight_init, then constructs pose/detector runners for future in-memory batched APIs.
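The two-stage factory pattern described in this commit (resolve configs, then build a detector runner and a pose runner) can be sketched generically. All names below (RunnerConfig, DetectorRunner, PoseRunner, create_inference_runners) are illustrative stand-ins, not the actual deeplabcut internals.

```python
# Illustrative sketch of a runner-factory pattern: resolve per-model
# configs first, then construct detector + pose runners. All names are
# hypothetical stand-ins, not the real deeplabcut API.
from dataclasses import dataclass


@dataclass
class RunnerConfig:
    model_name: str
    batch_size: int


class DetectorRunner:
    def __init__(self, cfg: RunnerConfig):
        self.cfg = cfg

    def inference(self, images):
        # one detection dict per input image
        return [{"bboxes": []} for _ in images]


class PoseRunner:
    def __init__(self, cfg: RunnerConfig):
        self.cfg = cfg

    def inference(self, inputs):
        # one pose dict per (image, detections) pair
        return [{"bodyparts": []} for _ in inputs]


def create_inference_runners(
    model_name: str,
    detector_name: str,
    batch_size: int = 1,
    detector_batch_size: int = 1,
):
    """Resolve per-model configs, then build the two runners."""
    pose_cfg = RunnerConfig(model_name, batch_size)
    det_cfg = RunnerConfig(detector_name, detector_batch_size)
    return PoseRunner(pose_cfg), DetectorRunner(det_cfg), pose_cfg
```

The point of the factory is that callers never touch config resolution or checkpoint loading; they receive ready-to-use runners whose inference methods accept in-memory objects directly.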
Contributor

Copilot AI left a comment


Pull request overview

Adds a new Model Zoo helper to construct pose + detector inference runners for batched in-memory (array-based) SuperAnimal inference, and re-exports it for easier imports.

Changes:

  • Added create_superanimal_inference_runners(...) to build pose/detector InferenceRunners from an in-memory workflow.
  • Integrated config loading/updating + weight initialization (including optional custom checkpoints) into the runner factory.
  • Re-exported the helper from deeplabcut.pose_estimation_pytorch.modelzoo.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.

File Description
deeplabcut/pose_estimation_pytorch/modelzoo/inference.py Adds the new runner-factory helper and required imports for config/weight init + device selection.
deeplabcut/pose_estimation_pytorch/modelzoo/__init__.py Re-exports the new helper for convenient imports.


Collaborator

@C-Achard C-Achard left a comment


Looks good so far, thanks!

@deruyter92 deruyter92 added the enhancement New feature or request label Feb 28, 2026
Instead of several definitions of the person class in the COCO dataset, a single constant is placed in modelzoo.utils. This reduces code duplication, improves maintainability, and promotes centralized logic, rather than implying that the argument is intended to be tuned by callers.
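The refactor described in this commit can be sketched as follows; the constant and function names below are hypothetical, shown only to illustrate the centralization pattern.

```python
# Hypothetical sketch of centralizing the COCO "person" category: define
# it once as a module-level constant instead of repeating the definition
# at every call site. Names here are illustrative, not deeplabcut's.
COCO_PERSON_CATEGORY = {"id": 1, "name": "person", "supercategory": "person"}


def build_coco_categories():
    # callers reuse the shared constant rather than redefining it locally
    return [COCO_PERSON_CATEGORY]
```

With a single shared constant, every call site agrees on the category definition by construction, and it stops looking like a tunable parameter.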
@deruyter92 deruyter92 marked this pull request as ready for review March 12, 2026 15:13
@deruyter92 deruyter92 requested review from AlexEMG and C-Achard March 12, 2026 15:14
Collaborator

@C-Achard C-Achard left a comment


Really nice, I like the minimally invasive design.
If we're alright with making small tweaks under the hood, see my comments; otherwise all good.

(Note: I did not have the opportunity to test it yet.)

@deruyter92 deruyter92 requested a review from C-Achard March 16, 2026 15:29
@C-Achard C-Achard requested a review from MMathisLab March 19, 2026 15:16
Member

@MMathisLab MMathisLab left a comment


I did not run the PR, so maybe speed test this, but otherwise LGTM!

@deruyter92
Collaborator Author

deruyter92 commented Mar 19, 2026

This PR basically exposes the inference runner that is used in superanimal image-folder analysis, so you don't have the overhead of reading and writing images. As the backend inference runner is exactly the same (and you have to load the images at some point in your pipeline), the comparison is a bit unfair. But assuming that you already loaded the images in memory elsewhere in your script (the use case of this PR), the benefits are as follows:

(benchmark results image attached)
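For anyone reproducing such a comparison, a minimal timing harness might look like the sketch below. `run_inference` is a stand-in callable, not the real runner, and this is not a rigorous benchmark; the point is only to show where to place the timer so image loading is excluded.

```python
# Minimal timing sketch: time only the inference call, with images
# already loaded in memory. `run_inference` is a placeholder for a real
# pose/detector runner; not a rigorous benchmark.
import time


def timed(fn, *args):
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start


def run_inference(images):
    # placeholder for det_runner.inference / pose_runner.inference
    return [sum(img) for img in images]


images = [list(range(100)) for _ in range(8)]  # stand-in for loaded arrays
preds, elapsed = timed(run_inference, images)
print(f"{len(preds)} predictions in {elapsed:.4f}s")
```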

@deruyter92
Collaborator Author

Thanks for the review! I'll merge it once the tests are complete!

@deruyter92 deruyter92 merged commit 856dfac into main Mar 19, 2026
11 checks passed
@deruyter92 deruyter92 deleted the jaap/modelzoo_inference_api branch March 19, 2026 19:47

Labels

enhancement New feature or request
