This project is my personal implementation of DeepSDF, focused on reconstructing 3D shapes using signed distance functions (SDFs). It supports training on .obj files, partial-view shape completion, and mesh generation via Marching Cubes.
- Trains a DeepSDF model on a set of 3D `.obj` files
- Samples 50,000 points per shape (surface + random)
- Learns per-shape latent embeddings (see the decoder sketch below)
- Reconstructs meshes from predicted SDF values
- Supports shape completion using:
  - Nearest latent match
  - Latent code optimization
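At its core this is the DeepSDF auto-decoder setup: an MLP takes a per-shape latent code concatenated with a 3D query point and predicts the signed distance, while the latent codes live in an embedding table optimized jointly with the network. Below is a minimal sketch, assuming a 128-dimensional latent (matching the vector dumped at the end of this README) and 512-wide hidden layers; the exact architecture in `DeepSDFCode.py` may differ.

```python
import torch
import torch.nn as nn

class DeepSDFDecoder(nn.Module):
    """Minimal DeepSDF-style decoder: (latent code, xyz) -> signed distance."""

    def __init__(self, latent_dim=128, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Tanh(),  # keeps the predicted SDF in [-1, 1]
        )

    def forward(self, latent, xyz):
        # latent: (N, latent_dim), xyz: (N, 3) -> (N, 1) predicted signed distance
        return self.net(torch.cat([latent, xyz], dim=-1))


# One learnable latent code per training shape, optimized jointly with the decoder.
num_shapes, latent_dim = 30, 128   # 30 training meshes, 128-dim codes (assumed)
latent_bank = nn.Embedding(num_shapes, latent_dim)
nn.init.normal_(latent_bank.weight, mean=0.0, std=0.01)
```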
The DeepSDF paper uses 500,000 points per shape. I used 50,000 due to hardware limits, but more points generally yield better reconstructions. You can change this in the code by increasing the `num_samples` argument.
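For context, here is a hedged sketch of how the sampling step can be done with `trimesh`: surface samples perturbed with a little Gaussian noise plus uniform points in the bounding cube, each paired with its signed distance. The helper name and the 50/50 split are illustrative assumptions, not necessarily what `DeepSDFCode.py` does.

```python
import numpy as np
import trimesh

def sample_sdf_points(obj_path, num_samples=50_000, noise=0.01):
    """Sample query points and signed distances for one mesh.

    Half the points hug the surface (surface samples + Gaussian jitter), the
    rest are uniform in the bounding cube; this assumes the mesh has already
    been normalized to roughly the unit cube.
    """
    mesh = trimesh.load(obj_path, force='mesh')

    n_surface = num_samples // 2
    surface_pts = mesh.sample(n_surface) + np.random.normal(scale=noise, size=(n_surface, 3))
    random_pts = np.random.uniform(-1.0, 1.0, size=(num_samples - n_surface, 3))
    points = np.vstack([surface_pts, random_pts]).astype(np.float32)

    # trimesh reports positive distance inside the mesh; negate to follow the
    # DeepSDF convention of negative inside / positive outside.
    sdf = -trimesh.proximity.signed_distance(mesh, points).astype(np.float32)
    return points, sdf
```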
The model was trained for 10,000 epochs. Here are two sample outputs comparing the original and reconstructed meshes:
Sample 0
comparison_0
Sample 1
comparison_1
View Remaining 28 Reconstructions (comparison_2 through comparison_29)
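These reconstructions are produced by evaluating the trained decoder on a dense grid for each latent code and running Marching Cubes (via scikit-image) on the predicted SDF volume. Below is a minimal sketch that reuses the illustrative `DeepSDFDecoder` from above; the grid resolution and chunk size are assumptions.

```python
import numpy as np
import torch
from skimage import measure
import trimesh

@torch.no_grad()
def reconstruct_mesh(decoder, latent, resolution=128, bound=1.0):
    """Evaluate the decoder on a dense grid and extract the zero level set."""
    coords = np.linspace(-bound, bound, resolution, dtype=np.float32)
    grid = np.stack(np.meshgrid(coords, coords, coords, indexing='ij'), axis=-1)
    xyz = torch.from_numpy(grid.reshape(-1, 3))

    sdf_chunks = []
    for chunk in torch.split(xyz, 65536):  # evaluate in chunks to limit memory use
        sdf_chunks.append(decoder(latent.expand(chunk.shape[0], -1), chunk).squeeze(-1))
    sdf = torch.cat(sdf_chunks).reshape(resolution, resolution, resolution).numpy()

    # Marching Cubes on the zero crossing of the predicted SDF.
    verts, faces, _, _ = measure.marching_cubes(sdf, level=0.0)
    verts = verts * (2 * bound / (resolution - 1)) - bound  # grid indices -> world coords
    return trimesh.Trimesh(vertices=verts, faces=faces)
```

A mesh returned this way can then be saved for side-by-side comparison, e.g. `reconstruct_mesh(decoder, latent_bank.weight[0]).export("comparison_0.obj")`.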
Given only a partial input (e.g. a front view), the model can complete the shape in two ways:
Best Match from Latent Bank
comparison_best_match
Latent Optimization
comparison_optimized
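Latent optimization keeps the trained decoder frozen and fits a fresh latent code to SDF samples taken from the partial view; the nearest-match variant instead scores every code in the trained latent bank with the same loss and picks the best one. Here is a sketch of the optimization route (learning rate, step count, clamping value, and regularization weight are assumptions):

```python
import torch

def complete_shape(decoder, partial_points, partial_sdf, latent_dim=128,
                   steps=1000, lr=1e-3, clamp_dist=0.1):
    """Optimize a latent code against SDF samples from a partial view.

    partial_points: (N, 3) float tensor, partial_sdf: (N,) float tensor.
    The decoder weights stay frozen; only the latent vector is updated.
    """
    decoder.eval()
    for p in decoder.parameters():
        p.requires_grad_(False)

    latent = torch.zeros(1, latent_dim, requires_grad=True)
    optimizer = torch.optim.Adam([latent], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        pred = decoder(latent.expand(partial_points.shape[0], -1), partial_points)
        # Clamped L1 loss on the SDF values, as in the DeepSDF paper.
        loss = torch.nn.functional.l1_loss(
            torch.clamp(pred.squeeze(-1), -clamp_dist, clamp_dist),
            torch.clamp(partial_sdf, -clamp_dist, clamp_dist),
        )
        loss = loss + 1e-4 * latent.norm()  # keep the code close to the latent prior
        loss.backward()
        optimizer.step()

    return latent.detach()
```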
This bunny was trained with 50,000 sampled points over 5,000 epochs. Since it is a geometrically simpler object than a chair, the reconstruction came out clean and accurate.
Original
image
Reconstructed
image
Testing how an LLM can modify latent codes from natural-language prompts: it interprets a phrase like "a chair with curved legs" and adjusts the corresponding latent-space dimensions.
Plotting shape embeddings to explore clustering, interpolation, and prompt-based transformations.
These tools help link semantic meaning and 3D geometry. I'm still testing accuracy and generation quality, but early results are interesting.
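For the embedding plots, a simple 2-D PCA projection of the latent bank is enough to look at clustering, and linear interpolation between two codes gives the in-between shapes. A sketch using numpy and matplotlib (the function names are my own, not from the script):

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_latent_space(latents, labels=None):
    """Project a (num_shapes, latent_dim) numpy array to 2-D with PCA and plot it."""
    z = latents - latents.mean(axis=0)
    # Principal axes via SVD; the first two right-singular vectors span the plot.
    _, _, vt = np.linalg.svd(z, full_matrices=False)
    xy = z @ vt[:2].T

    plt.figure(figsize=(6, 6))
    plt.scatter(xy[:, 0], xy[:, 1], c=labels)
    for i, (x, y) in enumerate(xy):
        plt.annotate(str(i), (x, y), fontsize=8)  # label each shape by its index
    plt.title("DeepSDF latent codes (PCA)")
    plt.show()

def interpolate_latents(z_a, z_b, steps=5):
    """Linear interpolation between two shape codes; decode each row to see the morph."""
    ts = np.linspace(0.0, 1.0, steps)[:, None]
    return (1 - ts) * z_a[None] + ts * z_b[None]

# e.g. plot_latent_space(latent_bank.weight.detach().numpy())
```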
This code was originally written just for me and is not yet modular or generalized. I’m planning to clean it up and release a version that works out-of-the-box on any dataset.
If you're interested in trying it now:
- You'll need to manually update the paths for `obj_path`, `save_dir`, etc. (see the snippet below)
- It works best on ShapeNet Core `.obj` meshes
- `NOT_RECONSTRUCTED = True` toggles training vs. test mode
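For reference, the settings to edit look roughly like this. This is a hypothetical layout that only reuses the names mentioned above, not the actual top of `DeepSDFCode.py`:

```python
# Hypothetical settings near the top of DeepSDFCode.py -- edit these before running.
obj_path = "/path/to/ShapeNetCore/chairs"   # folder of .obj meshes to train on
save_dir = "/path/to/output"                # checkpoints, latent codes, reconstructions
num_samples = 50_000                        # SDF samples per shape
NOT_RECONSTRUCTED = True                    # toggles training vs. test mode
```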
pip install torch trimesh scikit-image matplotlib numpy
python DeepSDFCode.py
- Clean up and modularize the codebase
- Add CLI for custom datasets and checkpoint control
- Finish LLM integration and latent-guided generation
- Publish code once more stable
If you have questions or want help running this, feel free to reach out. I’ll be happy to explain anything.
Reference: Park, J. J., Florence, P., Straub, J., Newcombe, R., and Lovegrove, S., "DeepSDF: Learning Continuous Signed Distance Functions for Shape Representation," CVPR 2019.
Below is an example of a trained latent vector (128 values) used to reconstruct the Stanford Bunny:
{2.571929804980754852e-02
-1.891424879431724548e-02
-2.272037602961063385e-02
-3.282329766079783440e-03
2.671889029443264008e-02
-1.568010449409484863e-02
-1.949955709278583527e-02
1.988929882645606995e-02
-2.002627216279506683e-02
-5.950008518993854523e-03
2.137670479714870453e-02
1.250777766108512878e-02
2.193937823176383972e-02
-1.024499349296092987e-02
6.561588961631059647e-03
1.607674919068813324e-02
-2.690345235168933868e-02
1.764042116701602936e-02
1.839219965040683746e-02
2.136653102934360504e-02
2.471710927784442902e-02
1.862458698451519012e-02
8.582584559917449951e-03
2.496489696204662323e-02
-4.940207581967115402e-03
1.576032303273677826e-02
-1.771047100191935897e-04
1.025762408971786499e-02
-2.179501205682754517e-02
-1.298787910491228104e-02
2.090901695191860199e-02
-1.437296066433191299e-02
2.377056516706943512e-02
4.118666984140872955e-03
-2.042565681040287018e-02
2.172596752643585205e-02
2.784682251513004303e-02
2.780549041926860809e-02
-1.262331102043390274e-02
2.527266740798950195e-02
2.769804000854492188e-02
-7.739173714071512222e-03
-2.074931748211383820e-02
-7.500117644667625427e-03
1.348469965159893036e-02
-1.210085395723581314e-02
-2.020902000367641449e-02
-8.486658334732055664e-03
-1.604524441063404083e-02
-2.227163501083850861e-02
-3.943101502954959869e-03
-2.004387788474559784e-02
-1.141452789306640625e-02
1.389472465962171555e-02
1.287480466999113560e-03
-1.063225232064723969e-02
-1.090580690652132034e-02
-1.872522383928298950e-02
-1.800170727074146271e-02
-6.147798616439104080e-03
-1.314948219805955887e-02
-1.917962916195392609e-02
-4.188995808362960815e-03
-1.953523047268390656e-02
1.575730927288532257e-02
1.653507165610790253e-02
-1.710680127143859863e-02
1.698636449873447418e-02
7.092911284416913986e-03
3.082667663693428040e-02
1.420361734926700592e-02
2.613485325127840042e-03
2.594907954335212708e-02
2.252011001110076904e-02
1.041685137897729874e-02
2.781263366341590881e-02
2.060546353459358215e-02
-1.181704830378293991e-02
-6.185079459100961685e-03
-2.402738109230995178e-02
1.151771191507577896e-02
-2.488913759589195251e-02
-1.272862404584884644e-02
-2.601240994408726692e-03
6.778646260499954224e-03
-2.375466376543045044e-02
1.616838946938514709e-02
-9.508697316050529480e-03
1.516710966825485229e-02
-2.343936264514923096e-02
-2.233266830444335938e-02
2.641224674880504608e-02
-1.049401331692934036e-02
1.826234348118305206e-02
2.033309265971183777e-02
1.292957924306392670e-02
-2.118060551583766937e-02
-3.032799577340483665e-03
1.503079291433095932e-02
-1.850883662700653076e-02
1.203035190701484680e-02
2.134085074067115784e-02
2.147503569722175598e-02
9.650061838328838348e-03
-1.575212366878986359e-02
-1.155801489949226379e-02
1.429485809057950974e-02
-2.237617783248424530e-02
-1.444547809660434723e-02
-1.785629987716674805e-02
-2.891577221453189850e-02
2.882771193981170654e-02
2.162054367363452911e-02
-2.001342736184597015e-02
-2.540876902639865875e-02
-1.157671213150024414e-02
-2.842639572918415070e-02
-1.069029141217470169e-02
1.705634035170078278e-02
2.369702793657779694e-02
8.442553691565990448e-03
1.473719719797372818e-02
3.313469886779785156e-02
1.589295081794261932e-02
2.056014165282249451e-02
-2.329988777637481689e-02
2.158742398023605347e-02
-2.093892544507980347e-02
}
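To reuse a dump like the one above, the values just need to be parsed back into a `(1, 128)` tensor and passed through the decoder. A sketch that reuses the illustrative `DeepSDFDecoder` and `reconstruct_mesh` helpers from earlier; the file and checkpoint names are hypothetical:

```python
import re
import torch

def load_latent(path):
    """Parse a '{...}' dump with one float per line into a (1, 128) tensor."""
    text = open(path).read()
    values = [float(v) for v in re.findall(r'-?\d+\.\d+e[+-]\d+', text)]
    return torch.tensor(values, dtype=torch.float32).unsqueeze(0)

decoder = DeepSDFDecoder()                              # illustrative decoder from the sketch above
decoder.load_state_dict(torch.load("decoder.pt"))       # hypothetical checkpoint path
latent = load_latent("bunny_latent.txt")                # hypothetical dump filename
reconstruct_mesh(decoder, latent).export("bunny_reconstructed.obj")
```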