The Butterfly Effect: Color-Guided Image Generation from Unconditional Diffusion Models

Research output: Contribution to journal › Article › peer-review


Abstract

Diffusion models, which generate images by iteratively denoising initial noise, have emerged as a new state-of-the-art family of deep generative models. Several methods have been proposed to impose user requirements on unconditional diffusion models without an additional training process. However, these methods rely on complex guidance functions that require gradient computation during sampling. To address this issue, we propose a new approach called Color-Guided Diffusion (CGDiff) that guides the color of images generated by unconditional diffusion models, without requiring any structural modification or additional training. Specifically, CGDiff adds color-specific noise to the image being generated, according to a guidance color, during the early denoising steps. Despite its simplicity, CGDiff effectively provides color guidance to diffusion models, yielding generated images imbued with the guidance color. Furthermore, since diverse diffusion models share the same denoising process for image generation, CGDiff can easily be applied to any pre-trained diffusion model. Extensive experiments with various off-the-shelf diffusion models demonstrate the effectiveness and broad applicability of our CGDiff.
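
The mechanism described in the abstract, injecting color-shifted noise during only the early denoising steps of an otherwise unmodified sampler, can be sketched in a few lines. The following is a minimal illustration rather than the authors' implementation: the Hugging Face diffusers pipeline, the checkpoint, and the knobs `guided_steps` and `lam` are assumptions for demonstration, and the exact form of the color-specific noise is one plausible reading of the abstract.

```python
# A minimal sketch (not the authors' released code), assuming the Hugging
# Face `diffusers` API; the checkpoint, `guided_steps`, and `lam` are
# illustrative assumptions, not values from the paper.
import torch
from diffusers import DDPMPipeline

pipe = DDPMPipeline.from_pretrained("google/ddpm-cifar10-32")
unet, scheduler = pipe.unet, pipe.scheduler
scheduler.set_timesteps(1000)

guidance_rgb = (1.0, 0.3, 0.1)                 # target color in [0, 1]
color = torch.tensor(guidance_rgb).view(1, 3, 1, 1) * 2.0 - 1.0  # to [-1, 1]

x = torch.randn(1, 3, 32, 32)                  # initial noise
guided_steps, lam = 100, 0.05                  # guide only the early steps

for i, t in enumerate(scheduler.timesteps):
    with torch.no_grad():
        eps = unet(x, t).sample                # predicted noise (unconditional)
    x = scheduler.step(eps, t, x).prev_sample  # standard denoising step
    if i < guided_steps:
        # "Color-specific noise": Gaussian noise whose mean is shifted
        # toward the guidance color (an assumed reading of the method).
        x = x + lam * (color + 0.1 * torch.randn_like(x))

image = (x.clamp(-1, 1) + 1) / 2               # rescale to [0, 1] for viewing
```

Restricting the perturbation to the early steps is the key design choice: the reverse process settles global color and layout early and refines texture later, so early color-shifted noise steers the overall hue without disrupting fine detail.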

Original language: English
Pages (from-to): 27794-27804
Number of pages: 11
Journal: IEEE Access
Volume: 13
DOIs
State: Published - 2025

Keywords

  • Butterfly effect
  • color guidance
  • diffusion models
  • image generation
