TY - JOUR
T1 - The Butterfly Effect: Color-Guided Image Generation from Unconditional Diffusion Models
AU - Yeo, Yunha
AU - Um, Daeho
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025
Y1 - 2025
N2 - Diffusion models, which generate images by iteratively denoising initial noise, have emerged as a new state-of-the-art family of deep generative models. Several methods have been proposed to impose user requirements on unconditional diffusion models without additional training. However, these methods rely on complex guidance functions that require gradient computation. To address this issue, we propose a new approach called Color-Guided Diffusion (CGDiff) that guides the color of images generated by unconditional diffusion models, without requiring any structural modification or additional training. Specifically, CGDiff adds color-specific noise to the image being generated, according to a guidance color, during the early denoising steps. Despite its simplicity, CGDiff provides effective color guidance to diffusion models, yielding generated images imbued with the guidance color. Furthermore, since diffusion models share a common denoising process for image generation, CGDiff can be easily applied to any pre-trained diffusion model. Extensive experimental results with various off-the-shelf diffusion models demonstrate the effectiveness and broad applicability of our CGDiff.
AB - Diffusion models, which generate images by iteratively denoising initial noise, have emerged as a new state-of-the-art family of deep generative models. Several methods have been proposed to impose user requirements on unconditional diffusion models without additional training. However, these methods rely on complex guidance functions that require gradient computation. To address this issue, we propose a new approach called Color-Guided Diffusion (CGDiff) that guides the color of images generated by unconditional diffusion models, without requiring any structural modification or additional training. Specifically, CGDiff adds color-specific noise to the image being generated, according to a guidance color, during the early denoising steps. Despite its simplicity, CGDiff provides effective color guidance to diffusion models, yielding generated images imbued with the guidance color. Furthermore, since diffusion models share a common denoising process for image generation, CGDiff can be easily applied to any pre-trained diffusion model. Extensive experimental results with various off-the-shelf diffusion models demonstrate the effectiveness and broad applicability of our CGDiff.
KW - Butterfly effect
KW - color guidance
KW - diffusion models
KW - image generation
UR - https://www.scopus.com/pages/publications/85217573384
U2 - 10.1109/ACCESS.2025.3539101
DO - 10.1109/ACCESS.2025.3539101
M3 - Article
AN - SCOPUS:85217573384
SN - 2169-3536
VL - 13
SP - 27794
EP - 27804
JO - IEEE Access
JF - IEEE Access
ER -