Segment Anything in 3D with NeRFs
Jiazhong Cen*1, Zanwei Zhou*1, Jiemin Fang2, Chen Yang1, Wei Shen1✉, Lingxi Xie3, Dongsheng Jiang3, Xiaopeng Zhang3, Qi Tian3
1AI Institute, SJTU 2School of EIC, HUST 3Huawei Inc.
*denotes equal contribution
Given a NeRF, just input prompts from a single view and get your 3D model.

We propose SA3D, a novel framework for segmenting anything in 3D. Given a neural radiance field (NeRF) model, SA3D allows users to obtain the 3D segmentation result of any target object via one-shot manual prompting in a single rendered view. The entire process of obtaining the target 3D model completes in approximately two minutes, without any engineering optimization. Our experiments demonstrate the effectiveness of SA3D across diverse scenes and highlight the potential of SAM for 3D scene perception.
- 2023/11/11: We release the nerfstudio version of SA3D. Currently it only supports text prompts as input.
- 2023/06/29: We now support MobileSAM as the segmentation network. Follow the installation instructions in MobileSAM, then download `mobile_sam.pt` into the folder `./dependencies/sam_ckpt`. You can use `--mobile_sam` to switch to MobileSAM.
With input prompts, SAM cuts out the target object from the corresponding view. The obtained 2D segmentation mask is projected onto 3D mask grids via density-guided inverse rendering. 2D masks rendered from other views are mostly incomplete, but they serve as cross-view self-prompts that are fed into SAM again; the completed masks are then projected onto the mask grids. This procedure runs iteratively until an accurate 3D mask is learned. SA3D adapts to various radiance fields effectively without any additional redesign.
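The iterative procedure above can be sketched as follows. This is a minimal toy illustration of the data flow only: `sam_predict`, `inverse_render`, `render_mask`, and `sample_prompts_from_mask` are simplified stand-ins (not the actual SA3D or SAM APIs); the real system uses SAM predictions and density-guided inverse rendering through the NeRF.

```python
import numpy as np

D, H, W = 8, 16, 16  # toy mask-grid resolution (depth, height, width)

def sam_predict(view, prompts):
    """Toy SAM: return the view's object mask given some point prompts."""
    return view["object_mask"]

def inverse_render(mask2d, grid):
    """Toy inverse rendering: accumulate the 2D mask into every depth
    slice of the 3D mask grid (the real method weights by NeRF density)."""
    grid += mask2d[None, :, :].astype(np.float32)
    return grid

def render_mask(grid):
    """Toy mask rendering: a pixel is masked if any voxel on its ray is."""
    return grid.max(axis=0) > 0

def sample_prompts_from_mask(mask2d):
    """Pick one foreground point as a cross-view self-prompt."""
    ys, xs = np.nonzero(mask2d)
    return [(int(ys[0]), int(xs[0]))] if len(ys) else []

def segment_3d(views, init_prompts):
    grid = np.zeros((D, H, W), dtype=np.float32)
    # 1) SAM segments the manually prompted view; project the mask to 3D.
    grid = inverse_render(sam_predict(views[0], init_prompts), grid)
    # 2) For each remaining view, render the (incomplete) 2D mask from
    #    the grid, self-prompt SAM with it, and project the completed
    #    mask back onto the grid.
    for view in views[1:]:
        prompts = sample_prompts_from_mask(render_mask(grid))
        grid = inverse_render(sam_predict(view, prompts), grid)
    return grid
```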
```bash
git clone https://github.com/Jumpat/SegmentAnythingin3D.git
cd SegmentAnythingin3D
conda create -n sa3d python=3.10
pip install -r requirements.txt
```

Follow this guidance to install nerfstudio.
Note: We developed our code under nerfstudio==0.2.0.
```bash
cd sa3d/self_prompting  # now the folder 'dependencies' is under 'sa3d/self_prompting'

# Installing SAM
mkdir dependencies; cd dependencies
mkdir sam_ckpt; cd sam_ckpt
wget https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth
git clone git@github.com:facebookresearch/segment-anything.git
cd segment-anything; pip install -e .
```
```bash
# Installing Grounding-DINO
git clone https://github.com/IDEA-Research/GroundingDINO.git
cd GroundingDINO/; pip install -e .
mkdir weights; cd weights
wget https://github.com/IDEA-Research/GroundingDINO/releases/download/v0.1.0-alpha/groundingdino_swint_ogc.pth
```

In the root directory of this repo, run
```bash
pip install -e .
```

- Train NeRF

  ```bash
  ns-train nerfacto --load-data {data-dir}
  ```

- Run SA3D

  ```bash
  ns-train sa3d --data {data-dir} \
    --load-dir {ckpt-dir} \
    --pipeline.text_prompt {text-prompt} \
    --pipeline.network.num_prompts {num-prompts}
  ```

- Render and Save Fly-through Videos

  ```bash
  ns-viewer --load-config {config-dir}
  ```
SA3D can handle various scenes for 3D segmentation. Find more demos on our project page.
Thanks to the following projects for their valuable contributions:
If you find this project helpful for your research, please consider citing our paper and giving a ⭐.
```bibtex
@inproceedings{cen2023segment,
  title={Segment Anything in 3D with NeRFs},
  author={Jiazhong Cen and Zanwei Zhou and Jiemin Fang and Chen Yang and Wei Shen and Lingxi Xie and Dongsheng Jiang and Xiaopeng Zhang and Qi Tian},
  booktitle={NeurIPS},
  year={2023},
}
```