Takeshi Noda* · Chao Chen* · Weiqi Zhang · Xinhai Liu · Yu-Shen Liu · Zhizhong Han
(* Equal Contribution)
Figure: Overview of our method: (a) Frequency Feature Transformation (FFT) module and (b) Multi-Step Pulling (MSP) module.
Please also check out the following works that greatly inspired ours:
- Junsheng Zhou et al. - CAP-UDF: Learning Unsigned Distance Functions Progressively from Raw Point Clouds with Consistency-Aware Field Optimization (TPAMI 2024)
- Chao Chen et al. - GridPull: Towards Scalability in Learning Implicit Representations from 3D Point Clouds (ICCV 2023)
- Baorui Ma et al. - Neural-Pull: Learning Signed Distance Functions from Point Clouds by Learning to Pull Space onto Surfaces (ICML 2021)
Our code is implemented in Python 3.8, PyTorch 1.11.0 and CUDA 11.3.
- Install Python dependencies

```
conda create -n mmpull python=3.8
conda activate mmpull
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
pip install tqdm pyhocon==0.3.57 trimesh PyMCubes scipy point_cloud_utils==0.29.7
```
- Compile C++ extensions

```
cd extensions/chamfer_dist
python setup.py install
```
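As an optional sanity check, you can confirm that the environment matches the versions above (a generic check, not part of our pipeline):

```python
# Optional sanity check for the environment created above.
import torch

print(torch.__version__)          # expect 1.11.0
print(torch.version.cuda)         # expect 11.3
print(torch.cuda.is_available())  # expect True on a CUDA-capable machine
```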
We provide input point cloud data and a pretrained model in ./data/dfaust as an example. The dataset is organized as follows:
```
data/
├── dfaust/
│   ├── input.txt
│   ├── input_data
│   ├── query_data
```
You can train our method to reconstruct a surface from a single point cloud as follows:
```
python run.py --gpu 0 --conf confs/dfaust.conf --filelist input.txt
```
You can find the generated mesh and the log in ./outs.
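After training, you can quickly inspect a reconstructed mesh with trimesh (installed above). This is a minimal sketch; the filename under ./outs is a placeholder, since the actual name depends on your config and data:

```python
# Minimal sketch: load and inspect a reconstructed mesh with trimesh.
# "your_mesh_name.ply" is a placeholder; check ./outs for the actual file.
import trimesh

mesh = trimesh.load("./outs/your_mesh_name.ply")
print(f"vertices: {len(mesh.vertices)}, faces: {len(mesh.faces)}")
print(f"watertight: {mesh.is_watertight}")
```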
We also provide instructions for training on your own data below.
First, put your own data in the ./data/Custom_Dataset/input_data folder and list the data names/IDs in input.txt (a helper sketch follows the directory tree below). The dataset is organized as follows:
```
data/
├── Custom_Dataset/
│   ├── input_data/
│   │   ├── (dataname).ply/xyz/npy
│   ├── input.txt
│   ├── query_data.txt
```
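As a reference, the snippet below shows one way to place a point cloud into this layout and register it in input.txt. The (N, 3) .npy convention and the exact paths are assumptions based on the tree above; adjust them to your data:

```python
# Sketch: convert a point cloud into the layout above and register its name.
# The (N, 3) float32 .npy convention is an assumption; .ply/.xyz also work.
import os
import numpy as np
import trimesh

root = "./data/Custom_Dataset"
dataname = "my_shape"  # hypothetical data name

# Load a mesh or point cloud and extract its (N, 3) points.
points = np.asarray(trimesh.load("my_shape.ply").vertices, dtype=np.float32)

os.makedirs(os.path.join(root, "input_data"), exist_ok=True)
np.save(os.path.join(root, "input_data", dataname + ".npy"), points)

# Append the data name (one per line) so --filelist input.txt can find it.
with open(os.path.join(root, "input.txt"), "a") as f:
    f.write(dataname + "\n")
```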
To train on your own data, simply run:

```
python run.py --gpu 0 --conf confs/<your config> --filelist <your filelist> --dir (dataname)
```
The scale of the sampling range has a significant impact on the quality of the final reconstruction. As a reference for generating input query points, we use 0.25 * np.sqrt(POINT_NUM_GT / 20000), where POINT_NUM_GT is the number of input points; this value works well for most object-level reconstructions.
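To make this concrete, here is a hedged sketch of how such a scale could drive Gaussian query sampling around the input points; the actual preprocessing in our code may differ in detail (e.g., neighborhood-adaptive noise):

```python
# Sketch: sample query points around the input points using the
# reference scale above; the input path is a placeholder.
import numpy as np

points = np.load("./data/dfaust/input_data/example.npy")  # (POINT_NUM_GT, 3)
POINT_NUM_GT = points.shape[0]
scale = 0.25 * np.sqrt(POINT_NUM_GT / 20000)

# Perturb each input point with isotropic Gaussian noise of std `scale`.
queries = points + np.random.normal(0.0, scale, size=points.shape)
```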
If you find our code or paper useful, please consider citing our work:
```
@inproceedings{nodamultipull,
  title={MultiPull: Detailing Signed Distance Functions by Pulling Multi-Level Queries at Multi-Step},
  author={Noda, Takeshi and Chen, Chao and Zhang, Weiqi and Liu, Xinhai and Liu, Yu-Shen and Han, Zhizhong},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024}
}
```