Implementation of the paper ESCAPE: Equivariant Shape Completion via Anchor Point Encoding.
Burak Bekci, Mahdi Saleh, Dr. Federico Tombari, Prof. Dr. Nassir Navab
ESCAPE (Equivariant Shape Completion via Anchor Point Encoding) is a novel framework designed to achieve rotation-equivariant shape completion. Our approach employs a distinctive encoding strategy: it selects anchor points from a shape and represents every point by its distances to those anchor points. This enables the model to capture a consistent, rotation-equivariant understanding of the object's geometry. ESCAPE leverages a transformer architecture to encode and decode the distance representations, ensuring that generated shape completions remain accurate and equivariant under rotational transformations. Subsequently, we perform an optimization step to recover the predicted shapes from the encodings. Experimental evaluations demonstrate that ESCAPE achieves robust, high-quality reconstructions under arbitrary rotations and translations, showcasing its effectiveness in real-world applications without additional pose estimation modules.
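To make the encoding idea concrete, below is a minimal, self-contained PyTorch sketch of the anchor-distance representation. It is only an illustration, not the repository's implementation: anchor selection here uses farthest point sampling, whereas the actual code selects anchors according to the configured parameters (see the table further down).

```python
import torch

def anchor_distance_encoding(points: torch.Tensor, k: int = 16) -> torch.Tensor:
    """points: (N, 3) point cloud -> (N, k) distances to k anchor points."""
    n = points.shape[0]
    # Illustrative anchor selection via farthest point sampling; the actual
    # repository uses its own (e.g. curvature-based) selection criteria.
    anchor_idx = torch.zeros(k, dtype=torch.long)
    min_dist = torch.full((n,), float("inf"))
    anchor_idx[0] = torch.randint(n, (1,)).item()
    for i in range(1, k):
        d = torch.norm(points - points[anchor_idx[i - 1]], dim=1)
        min_dist = torch.minimum(min_dist, d)
        anchor_idx[i] = torch.argmax(min_dist)
    anchors = points[anchor_idx]              # (k, 3)
    # Point-to-anchor distances do not change under rotation or translation
    # of the whole cloud, which is what makes the encoding pose-robust.
    return torch.cdist(points, anchors)       # (N, k)
```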
The code is tested with the following versions:
- CUDA 11.3
- Python 3.8
- PyTorch 1.11
Install required libraries with the following command:
bash scripts/install.sh
Before running any script, update the dataset paths in the pcn.yml config, namely partial_points_path, complete_points_path, partial_points_parent_path, and complete_points_parent_path.
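For orientation, a hypothetical excerpt of pcn.yml is shown below. The key names come from this README; the values are placeholders, and the exact path format depends on how your PCN download is laid out.

```yaml
# Placeholder values -- point these at your local PCN dataset.
partial_points_path: /path/to/PCN/partial
complete_points_path: /path/to/PCN/complete
partial_points_parent_path: /path/to/PCN/partial
complete_points_parent_path: /path/to/PCN/complete
```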
Once everything is installed, training can be started with:
python train.py
To evaluate, run:
python eval.py
Download the datasets from the sources below:
For OmniObject3D, a reference notebook is provided for creating the dataset of partial and complete point clouds.
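If you want a sense of what such preprocessing involves, here is a hedged sketch of one common way to derive a partial observation from a complete point cloud, using Open3D's hidden-point removal from a virtual camera. The viewpoint and radius scale are illustrative; the reference notebook may generate partial views differently.

```python
import numpy as np
import open3d as o3d

def make_partial_view(complete: o3d.geometry.PointCloud,
                      camera: np.ndarray = np.array([0.0, 0.0, 2.0]),
                      radius_scale: float = 100.0) -> o3d.geometry.PointCloud:
    """Return only the points of `complete` that are visible from `camera`."""
    diameter = np.linalg.norm(complete.get_max_bound() - complete.get_min_bound())
    # Hidden-point removal keeps the subset of points facing the camera.
    _, visible_idx = complete.hidden_point_removal(camera, diameter * radius_scale)
    return complete.select_by_index(visible_idx)
```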
Pretrained models for the PCN dataset can be downloaded from here.
For the first pretrained model, use the parameters below:
| Parameter | Value |
|---|---|
| k | 16 |
| curvature_radius | 0.075 |
| curvature_thres | 0.5 |
| neighbor_size | 16 |
For the second provided model, use the same parameters but set k to 32.
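As a reading aid only, the sketch below shows one plausible interpretation of how these parameters could interact (curvature estimated from local neighborhoods, thresholded, then k anchors kept). The actual semantics are defined by the repository's anchor-selection code and may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

def select_anchor_candidates(points: np.ndarray, k: int = 16,
                             curvature_radius: float = 0.075,
                             curvature_thres: float = 0.5,
                             neighbor_size: int = 16) -> np.ndarray:
    """points: (N, 3). Return indices of up to k high-curvature candidate anchors."""
    tree = cKDTree(points)
    curvature = np.zeros(len(points))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p, curvature_radius)
        if len(idx) < neighbor_size:          # fall back to kNN when the ball is sparse
            _, idx = tree.query(p, k=neighbor_size)
        nbrs = points[idx] - points[idx].mean(axis=0)
        eigvals = np.linalg.eigvalsh(nbrs.T @ nbrs)             # ascending eigenvalues
        curvature[i] = eigvals[0] / max(eigvals.sum(), 1e-12)   # "surface variation"
    # Keep points whose curvature (relative to the maximum) exceeds the threshold,
    # then return the k strongest candidates.
    rel = curvature / max(curvature.max(), 1e-12)
    cand = np.where(rel > curvature_thres)[0]
    return cand[np.argsort(-curvature[cand])][:k]
```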
Some parts of the code are borrowed from the repositories below:

