Add internal helper for batched modelzoo inference from in-memory arrays (inference runner)#3222
deruyter92 merged 16 commits into main
Conversation
Introduce a modelzoo helper that resolves configs and checkpoints via build_weight_init, then constructs pose/detector runners for future in-memory batched APIs.
Pull request overview
Adds a new Model Zoo helper to construct pose + detector inference runners for batched in-memory (array-based) SuperAnimal inference, and re-exports it for easier imports.
Changes:
- Added `create_superanimal_inference_runners(...)` to build pose/detector `InferenceRunner`s from an in-memory workflow.
- Integrated config loading/updating + weight initialization (including optional custom checkpoints) into the runner factory.
- Re-exported the helper from `deeplabcut.pose_estimation_pytorch.modelzoo`.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| deeplabcut/pose_estimation_pytorch/modelzoo/inference.py | Adds the new runner-factory helper and required imports for config/weight init + device selection. |
| deeplabcut/pose_estimation_pytorch/modelzoo/__init__.py | Re-exports the new helper for convenient imports. |
C-Achard
left a comment
Looks good so far, thanks !
Instead of several definitions of the person class in the COCO dataset, a single constant is placed in modelzoo.utils. This reduces code duplication, improves maintainability, and promotes centralized logic, rather than implying that the argument is intended to be tuned by callers.
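The centralization described above can be sketched as a single module-level constant shared by all call sites. This is only an illustration: the constant name `COCO_PERSON_CLASS_ID` and the helper `filter_person_detections` are hypothetical, not the actual names in modelzoo.utils (COCO's "person" category does have id 1).

```python
# Hypothetical sketch: centralize the COCO "person" class id in one shared
# constant instead of redefining it at each call site. The names used here
# are assumptions, not the real identifiers from modelzoo.utils.
COCO_PERSON_CLASS_ID = 1  # COCO category id for "person"

def filter_person_detections(detections):
    """Keep only detections whose category matches the shared constant."""
    return [d for d in detections if d["category_id"] == COCO_PERSON_CLASS_ID]

dets = [
    {"category_id": 1, "score": 0.9},   # person
    {"category_id": 18, "score": 0.8},  # dog
]
print(filter_person_detections(dets))
```

With the id defined once, callers cannot drift out of sync, and the value no longer looks like a tunable argument.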
C-Achard
left a comment
Really nice, I like the minimally invasive design.
If we're alright with making small tweaks under the hood, see comments, otherwise all good.
(Note: I did not have the opportunity to test it yet.)
MMathisLab
left a comment
I did not run the PR, so maybe speed test this, but otherwise LGTM!
Thanks for the review! I'll merge it once the tests are complete!

Summary
- Added `create_superanimal_inference_runners(...)` to `deeplabcut.pose_estimation_pytorch.modelzoo.inference`.
- Re-exported it from `deeplabcut.pose_estimation_pytorch.modelzoo.__init__` for convenient imports.

Motivation
Currently it is fairly easy to run batched inference on our PyTorch models, but the modelzoo API is an exception that requires saving images to disk first. This PR implements a modelzoo inference runner factory to easily run batched inference from in-memory image arrays, which can be used internally for faster inference. (Inspired by `get_inference_runners`.)

This solves #3218
Example usage
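The example-usage snippet did not survive extraction, so here is a hypothetical sketch of the intended call pattern using stub classes in place of the real DeepLabCut objects. The factory name `create_superanimal_inference_runners` and the pose/detector runner pair come from the PR description; every parameter name, the `inference(...)` method, and the stub behavior are assumptions for illustration only.

```python
import numpy as np

# Stub standing in for the real pose/detector InferenceRunner objects
# that the PR's factory returns; it just emits one result per frame.
class _StubRunner:
    def __init__(self, name):
        self.name = name

    def inference(self, images):
        # The real runners would run the network on each image; here we
        # only report which runner handled which frame index.
        return [{"source": self.name, "frame": i} for i, _ in enumerate(images)]

def create_superanimal_inference_runners_stub(superanimal_name, device="cpu"):
    """Hypothetical factory mirroring the PR's helper: returns a
    (pose_runner, detector_runner) pair for in-memory batched inference."""
    return _StubRunner("pose"), _StubRunner("detector")

# Batched in-memory usage: a list of HxWx3 uint8 frames, no disk I/O.
frames = [np.zeros((128, 128, 3), dtype=np.uint8) for _ in range(4)]
pose_runner, detector_runner = create_superanimal_inference_runners_stub(
    "superanimal_quadruped"
)
detections = detector_runner.inference(frames)
poses = pose_runner.inference(frames)
print(len(detections), len(poses))  # one result per frame
```

The point of the design is visible even in the stub: callers hand frames directly to the runners as arrays, so no intermediate image files are written before inference.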