
StableNormal


We propose StableNormal, which tailors the diffusion priors for monocular normal estimation. Unlike prior diffusion-based works, we focus on enhancing estimation stability by reducing the inherent stochasticity of diffusion models (i.e., Stable Diffusion). This enables "Stable-and-Sharp" normal estimation, which outperforms multiple baselines (try Compare), and improves various real-world applications (try Demo).


Installation:

Please run the following commands to install the package:

git clone https://github.com/Stable-X/StableNormal.git
cd StableNormal
pip install -r requirements.txt

or install the package directly:

pip install git+https://github.com/Stable-X/StableNormal.git

Usage

To use the StableNormal pipeline, you can instantiate the model and apply it to an image as follows:

import torch
from PIL import Image

# Load an image
input_image = Image.open("path/to/your/image.jpg")

# Create predictor instance
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True)

# Apply the model to the image
normal_image = predictor(input_image)

# Save or display the result
normal_image.save("output/normal_map.jpg")
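The predictor returns the normal map as a PIL image with the normals encoded as RGB values. If you need the raw unit vectors for downstream geometry processing, here is a minimal decoding sketch; it assumes the common convention that each channel maps [0, 255] to [-1, 1], which you should verify against the repository before relying on signs and axis layout:

import numpy as np

# Decode the RGB-encoded normal map back to unit vectors.
# ASSUMPTION: the common encoding n = 2 * (rgb / 255) - 1 per channel;
# check the repository's own conventions for sign and axis order.
rgb = np.asarray(normal_image, dtype=np.float32)  # (H, W, 3), values in [0, 255]
normals = rgb / 255.0 * 2.0 - 1.0                 # map each channel to [-1, 1]
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)  # re-normalize per pixel

If you plan to decode the map later, saving it in a lossless format such as PNG avoids JPEG compression artifacts in the recovered vectors.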

Additional Options:

  • If you need faster inference (10 times faster), use StableNormal_turbo (see the timing sketch after this list):
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal_turbo", trust_repo=True)
  • If Hugging Face is not reachable from your terminal, you can download the pretrained weights to a local weights directory and point the loader at it:
predictor = torch.hub.load("Stable-X/StableNormal", "StableNormal", trust_repo=True, local_cache_dir='./weights')
