---
base_model: stabilityai/stable-diffusion-2-1-base
tags:
- art
- controlnet
- stable-diffusion
- image-to-image
---
# Geometry- and Light-aware ControlNet
![img](./assets/pipeline_controlnet.png)
The geometry- and light-aware ControlNet uses an object's normal and depth maps as geometry conditions, and six predefined materials rendered under a given environment light as lighting conditions. The model generates images that are consistent with both the given geometry and the environment light.
![img](./assets/controlnet.png)
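Below is a minimal usage sketch with the 🧨 diffusers library. It assumes the checkpoint in this repository follows the standard `ControlNetModel` layout and that the geometry and lighting conditions are supplied as a single pre-rendered conditioning image; the repository path, file names, and prompt are placeholders, so please refer to the DreamMat GitHub repository for the exact conditioning format.

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

# NOTE: hypothetical repository id / local path; replace with this model card's id.
controlnet = ControlNetModel.from_pretrained(
    "path/to/geometry-light-aware-controlnet", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1-base",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# Assumed input: one conditioning image that encodes the normal/depth maps
# together with the pre-rendered light condition maps (see Training Dataset).
condition = load_image("condition.png")

image = pipe(
    "a ceramic vase with a glossy glaze",  # example prompt
    image=condition,
    num_inference_steps=30,
).images[0]
image.save("output.png")
```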
## Material Generation
This ControlNet can be used in a distillation process to generate PBR materials. Please refer to the paper "DreamMat: High-quality PBR Material Generation with Geometry- and Light-aware Diffusion Models" for technical details.
<p>
  <a href="https://zzzyuqing.github.io/dreammat.github.io/">Project Page</a> •
  <a href="https://arxiv.org">arXiv</a> •
  <a href="https://github.com/zzzyuqing/DreamMat">GitHub</a>
</p>
![img](./assets/teaser.png)
![img](./assets/pipeline.png)
## Training Dataset
We train the geometry- and light-aware ControlNet on images rendered from objects in the LVIS subset of Objaverse. Since the names and tags of objects in this dataset are rather noisy, we use BLIP to caption all rendered images. For every object, we render 16 random views under randomly chosen environment light maps. The light condition maps are obtained by ray tracing in Blender and represent the radiance of the predefined materials under the environment light. For normal maps, we transform the model's normal vectors into view space and flip the x-axis, following ScanNet's protocol. Depth maps are processed by inverting the real depth values and normalizing them.
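For illustration only, here is a small NumPy sketch of the geometry conditioning described above (not the authors' code): it rotates world-space normals into view space with a flipped x-axis, and converts a metric depth map into an inverted, normalized condition map. The function names, the world-to-view matrix, and the object mask are assumptions.

```python
import numpy as np

def normals_to_view_space(normals_world: np.ndarray, world_to_view: np.ndarray) -> np.ndarray:
    """Rotate world-space normals (H, W, 3) into view space and flip the x-axis
    (ScanNet-style convention, as described above)."""
    R = world_to_view[:3, :3]                        # rotation part of the view matrix
    n = normals_world.reshape(-1, 3) @ R.T           # rotate normals into view space
    n = n / (np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8)
    n = n.reshape(normals_world.shape)
    n[..., 0] *= -1.0                                # flip the x-axis
    return n

def depth_to_condition(depth: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Invert the real depth values and normalize them to [0, 1] over the object mask."""
    inv = np.zeros_like(depth, dtype=np.float32)
    inv[mask] = 1.0 / depth[mask]
    lo, hi = inv[mask].min(), inv[mask].max()
    return np.where(mask, (inv - lo) / (hi - lo + 1e-8), 0.0)
```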
## 📖 Citation
```bibtex
@inproceedings{zhang2024dreammat,
  title={DreamMat: High-quality PBR Material Generation with Geometry- and Light-aware Diffusion Models},
  author={Zhang, Yuqing and Liu, Yuan and Xie, Zhiyu and Yang, Lei and Liu, Zhongyuan and Yang, Mengzhou and Zhang, Runze and Kou, Qilong and Lin, Cheng and Wang, Wenping and Jin, Xiaogang},
  booktitle={SIGGRAPH},
  year={2024}
}
```