arXiv:2409.15761

TFG: Unified Training-Free Guidance for Diffusion Models

Published on Sep 24, 2024
Abstract

Given an unconditional diffusion model and a predictor for a target property of interest (e.g., a classifier), the goal of training-free guidance is to generate samples with desirable target properties without additional training. Existing methods, though effective in various individual applications, often lack theoretical grounding and rigorous testing on extensive benchmarks; as a result, they can fail even on simple tasks, and applying them to a new problem is unavoidably difficult. This paper introduces a novel algorithmic framework encompassing existing methods as special cases, unifying the study of training-free guidance into the analysis of an algorithm-agnostic design space. Via theoretical and empirical investigation, we propose an efficient and effective hyper-parameter search strategy that can be readily applied to any downstream task. We systematically benchmark across 7 diffusion models on 16 tasks with 40 targets, and improve performance by 8.5% on average. Our framework and benchmark offer a solid foundation for conditional generation in a training-free manner.
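To make the setting concrete, the sketch below shows the basic idea behind training-free guidance: at each reverse-diffusion step, differentiate a property predictor through the model's denoised estimate and nudge the sample along that gradient. This is a minimal illustration of the general recipe, not the paper's TFG algorithm; `toy_denoiser`, `property_score`, and `guidance_scale` are hypothetical stand-ins, and real noise schedules are omitted.

```python
# Minimal sketch of training-free guidance (illustrative, not the paper's TFG).
import torch

def toy_denoiser(x_t, t):
    # Stand-in for a pretrained unconditional model's clean-sample (x0) prediction.
    return x_t / (1.0 + t)

def property_score(x0):
    # Stand-in for a differentiable target predictor, e.g. log p(y | x0).
    # Here: reward samples close to 1.
    return -((x0 - 1.0) ** 2).sum()

def guided_step(x_t, t, guidance_scale=0.1):
    x_t = x_t.detach().requires_grad_(True)
    x0_hat = toy_denoiser(x_t, t)  # predict the clean sample from the noisy one
    # Gradient of the property score w.r.t. the noisy sample, taken through x0_hat.
    grad = torch.autograd.grad(property_score(x0_hat), x_t)[0]
    # Move partway toward the denoised estimate (a crude stand-in for a
    # DDIM/DDPM update), then add the guidance term.
    x_prev = x_t + 0.5 * (x0_hat - x_t) + guidance_scale * grad
    return x_prev.detach()

x = torch.randn(4, 2)
for t in reversed(range(1, 11)):
    x = guided_step(x, float(t))
print(x)
```

Evaluating the predictor on the denoised estimate `x0_hat`, rather than on the noisy sample directly, is the standard trick that lets an off-the-shelf predictor guide intermediate diffusion states it was never trained on; the framework's design space covers choices such as where this gradient is applied and how strongly.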
